Ch5.1.1: Finite Differences & Derivative, Part 1

Numerical Differentiation

  • The derivative of a function is defined as

\[ f'(x) = \lim_{h \rightarrow 0}\frac{f(x+h)-f(x)}{h} \]

Numerical Differentiation

  • Evaluating this quotient with a very small \( h \) on a computer does not give a good approximation, because of cancellation error in the numerator and division by a number close to zero in the denominator.
  • However, it is a good place to start.

\[ f'(x) = \lim_{h \rightarrow 0}\frac{f(x+h)-f(x)}{h} \]

Finite Difference Formula

  • The finite difference formula is the natural numerical approximation to the derivative:

\[ f'(x) \cong \frac{f(x+h)-f(x)}{h} \]

Forward & Backward Differences

  • Forward difference
    • \( h > 0 \)
  • Backward difference
    • \( h < 0 \)

\[ \begin{aligned} f'(x) & \cong \frac{f(x+h)-f(x)}{h} \\ f'(x) & \cong \frac{f(x)-f(x-h)}{h} \end{aligned} \]

Centered (Symmetric) Difference

  • This formula looks both ahead and behind:

\[ f'(x) \cong \frac{f(x+h)-f(x-h)}{2h} \]
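
  • For a quadratic such as \( f(x) = x^2 \), the centered difference is exact (in exact arithmetic) for every \( h \):

\[ \frac{(x+h)^2 - (x-h)^2}{2h} = \frac{4xh}{2h} = 2x = f'(x) \]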

R Code for Forward/Backward Method

  • R code for findiff
# Forward (h > 0) or backward (h < 0) difference approximation to f'(x)
findiff <- function(f,x,
               h = x*sqrt(.Machine$double.eps))
  { return( ( f(x + h) - f(x) ) / h ) }

\[ f'(x) \cong \frac{f(x+h)-f(x)}{h} \]

Linear Example: Forward Difference

f<-function(x){4*x-2}
findiff(f,2,h=0.1)
[1] 4
findiff(f,2)
[1] 4

Quadratic Example: Forward Difference

f <- function(x) 
  {x^2 + 1}

findiff(f,2,h = 0.1)
[1] 4.1
findiff(f,2,h = 0.01)
[1] 4.01
findiff(f,2)
[1] 4

Quadratic Example: Backward Difference

f <- function(x) 
  {x^2 + 1}

findiff(f,2,h = -1)
[1] 3
findiff(f,2,h = -0.1)
[1] 3.9
findiff(f,2,h = -0.01)
[1] 3.99

R Code for Centered Difference

  • R code for symdiff
# Centered (symmetric) difference approximation to f'(x)
symdiff <- function(f,x,
             h = x*(.Machine$double.eps)^(1/3))
{ return( (f(x + h) - f(x - h)) / (2*h) ) }

\[ f'(x) \cong \frac{f(x+h)-f(x-h)}{2h} \]

Linear Example: Centered Difference

f<-function(x){4*x-2}
symdiff(f,2,h=0.1)
[1] 4
symdiff(f,2)
[1] 4

Quadratic Example: Symmetric Difference

f <- function(x) 
  {x^2 + 1}

symdiff(f,2,h = 1)
[1] 4
symdiff(f,2,h = 0.1)
[1] 4
symdiff(f,2,h = 0.01)
[1] 4

Numerical Analysis & Optimal h

  • How do we find the right step size \( h \)?
    • Small \( h \) gives better accuracy
    • Too small leads to cancellation error
    • Too small leads to division by (nearly) zero
  • Trial and error: try several \( h \) values and see which works best?
  • Use calculus somehow to minimize the error?

\[ \begin{aligned} f'(x) & \cong \frac{f(x+h)-f(x)}{h} \\ f'(x) &\cong \frac{f(x+h)-f(x-h)}{2h} \end{aligned} \]

Trial and Error: Forward Difference

f <- function(x){x^2 + 1}; x <- 2
h <- c(3e-4,3e-8,3e-12,3e-16)
findiff(f,2,h)
[1] 4.000300 4.000000 3.999763 5.921189
(h <- x*sqrt(.Machine$double.eps))
[1] 2.980232e-08
findiff(f,x)
[1] 4

Trial and Error: Centered Difference

f <- function(x){x^2 + 1}; x <- 2
h <- c(1e-1,1e-5,1e-12,1e-16)
symdiff(f,x,h)
[1] 4.000000 4.000000 4.000356 0.000000
(h <- x*(.Machine$double.eps)^(1/3))
[1] 1.211091e-05
symdiff(f,x)
[1] 4

Trial and Error: Centered Difference

f <- function(x){sin(x)}; x <- 0.9
h <- c(5e-2,5e-6,5e-10,5e-14)
symdiff(f,x,h)
[1] 0.6213510 0.6216100 0.6216101 0.6217249
(h <- x*(.Machine$double.eps)^(1/3))
[1] 5.449909e-06
symdiff(f,x)
[1] 0.62161

Numerical Analysis & Taylor Polynomials

  • Find an error bound for the forward (backward) difference using the Taylor expansion of \( f(x+h) \) about \( x \):

\[ f(x+h) = f(x) + f'(x)\left((x+h)-x\right) + \frac{f''(x)}{2}\left((x+h)-x\right)^2 + \ldots \]

  • It follows that

\[ \begin{aligned} f(x+h) - f(x) & = f'(x)h + \frac{f''(x)}{2}h^2 + \ldots \\ f'(x) & = \frac{f(x+h) - f(x)}{h} - \frac{f''(x)}{2}h + \ldots \\ f'(x) & = \frac{f(x+h) - f(x)}{h} - \frac{f''(\xi(x))}{2}h, \,\, \,\, \xi(x) \in [x,x+h] \\ \end{aligned} \]
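
  • As a quick numerical check (a sketch of ours, not from the text): for \( f(x) = \sin(x) \) at \( x = 0.9 \), the forward-difference error should be close to the leading term \( \frac{h}{2}f''(x) = -\frac{h}{2}\sin(x) \).

f <- function(x){sin(x)}; x <- 0.9; h <- 1e-4
fd_err <- (f(x + h) - f(x))/h - cos(x)  # actual error of the forward difference
taylor <- (h/2)*(-sin(x))               # leading Taylor term (h/2)*f''(x)
c(fd_err, taylor)

  • Both values come out to roughly \( -3.9 \times 10^{-5} \), in line with the error term.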

Forward Difference Error

  • From previous slide,

\[ f'(x) = \frac{f(x+h) - f(x)}{h} - \frac{h}{2}f''(c) \]

  • In the absence of computer error, the truncation error is defined as

\[ E_T = \frac{h}{2}f''(c) \]

  • So if we use the forward difference formula with perfect computational accuracy, then \( E_T \) is simply the error that comes from truncating the Taylor series instead of taking the limit as in calculus.

  • Next, we look at roundoff error (see next slide).

Forward Difference Error

  • For roundoff error, denote the computed values by

\[ \begin{aligned} \tilde{f}(x+h) &= f(x+h) + e_1(x) \\ \tilde{f}(x) &= f(x) + e_2(x) \end{aligned} \]

  • Then the computed forward difference formula is

\[ \begin{aligned} \tilde{f}'(x) &= \frac{\tilde{f}(x+h) - \tilde{f}(x)}{h} \\ &= \frac{f(x+h) - f(x) }{h} + \frac{e_1(x) - e_2(x)}{h} \\ &= \frac{f(x+h) - f(x) }{h} + E_R \end{aligned} \]

Total Error: Forward Difference

  • From previous slide,

\[ \begin{aligned} \tilde{f}'(x) = \frac{f(x+h) - f(x) }{h} + E_R \end{aligned} \]

  • Here, we have denoted roundoff error by \( E_R \), with

\[ E_R = \frac{e_1(x) - e_2(x)}{h} \]

  • Thus the total error \( E \) is given by

\[ E = E_R + E_T = \frac{e_1(x) - e_2(x)}{h} + \frac{h}{2}f''(c) \]

Total Error: Forward Difference

  • Our total error \( E \) is given by

\[ E = \frac{e_1(x) - e_2(x)}{h} + \frac{h}{2}f''(c) \]

  • Observe that the roundoff error grows as \( h \) gets smaller, while the truncation error shrinks as \( h \) gets smaller.

  • We will use calculus to determine the optimal value of \( h \).

Total Error: Forward Difference

  • The total error \( E \) is given by

\[ E = \frac{e_1(x) - e_2(x)}{h} + \frac{h}{2}f''(c) \]

  • First, we observe that

\[ \left|e_1 - e_2\right| \leq |e_1| + |e_2| \leq 2 \max \left\{ |e_1|, |e_2| \right\} = 2e \]

  • Also, let

\[ M = \max_{x \leq c \leq x+h} |f''(c)| \]

Total Error: Forward Difference

  • The total error \( E \) is given by

\[ E = \frac{e_1 - e_2}{h} + \frac{h}{2}f''(c) \]

  • From the previous slide, \( E \) is a function of \( h \), with

\[ E(h) \leq \frac{2e}{h} + \frac{M}{2}h \]

  • The worst-case scenario is that

\[ E(h) = \frac{2e}{h} + \frac{M}{2}h \]

Total Error: Forward Difference

  • From previous slide,

\[ E(h) = \frac{2e}{h} + \frac{M}{2}h \]

  • Take the derivative and set it equal to zero:

\[ \begin{aligned} E'(h) & = -\frac{2e}{h^2} + \frac{M}{2} \\ 0 & = -\frac{2e}{h^2} + \frac{M}{2} \\ h &= \sqrt{\frac{4e}{M}} \end{aligned} \]
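
  • Since \( E''(h) = \frac{4e}{h^3} > 0 \) for \( h > 0 \), this critical point is indeed a minimum.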

Total Error: Forward Difference

  • Worst-case scenario:

\[ E(h) = \frac{2e}{h} + \frac{M}{2}h \]

  • Minimum occurs at

\[ h = \sqrt{\frac{4e}{M}} \]

  • Book value (with \( \epsilon \) the machine epsilon):

\[ h^* = x\sqrt{\epsilon} \]
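
  • As a rough numerical comparison (our own sketch, where we assume \( e \) is about half a machine epsilon times \( |f(x)| \)): for \( f(x) = x^2 + 1 \) at \( x = 2 \), \( M = |f''| = 2 \), and the two rules agree to within an order of magnitude.

x <- 2; M <- 2
e <- (.Machine$double.eps/2)*(x^2 + 1)   # assumed rounding error in evaluating f near x
c(ours = sqrt(4*e/M), book = x*sqrt(.Machine$double.eps))

  • Both values are roughly \( 3 \times 10^{-8} \).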

Total Error: Forward Difference

  • Optimal \( h \) values: ours and the book's

\[ h = \sqrt{\frac{4e}{M}} , \,\, h^* = x\sqrt{\epsilon} \]

library(pracma)
# pracma's eps(x): machine epsilon scaled to the magnitude of x
c(eps(1), eps(8))
[1] 2.220446e-16 1.776357e-15

Total Error: Forward Difference

  • Optimal h values:

\[ h = \sqrt{\frac{4e}{M}} , \,\, h^* = x\sqrt{\epsilon} \]

c(3*sqrt(eps(1)), sqrt(eps(3)))
[1] 4.470348e-08 2.107342e-08
  • Recall

\[ \begin{aligned} M &= \max |f''(c) | \\ e_1 &= \tilde{f}(x+h) - f(x+h) \\ e_2 &= \tilde{f}(x) - f(x) \end{aligned} \]

Optimal Step Sizes in Book

\[ \begin{array}{cll} f'(x) & \cong \frac{f(x+h) - f(x)}{h} & \leftrightarrow h^* = x\sqrt{\epsilon} \\ f'(x) & \cong \frac{f(x+h) - f(x-h)}{2h} & \leftrightarrow h^* = x\sqrt[3]{\epsilon} \end{array} \]

Second Derivative & Taylor Polynomials

\[ \begin{aligned} f(x+h) = f(x) + f'(x)h & + \frac{f''(x)}{2}h^2 + \frac{f'''(x)}{6}h^3 + \frac{f^{(4)}(x)}{24}h^4 + \ldots \\ f(x-h) = f(x) - f'(x)h & + \frac{f''(x)}{2}h^2 - \frac{f'''(x)}{6}h^3 + \frac{f^{(4)}(x)}{24}h^4 + \ldots \end{aligned} \]

  • Add together and solve for \( f''(x) \):

\[ \begin{aligned} f(x+h) + f(x-h) & = 2f(x) + f''(x)h^2 + \frac{f^{(4)}(x)}{12}h^4 + \ldots \\ \\ f''(x) & = \frac{f(x+h) - 2f(x) + f(x-h)}{h^2} - \frac{h^2}{12}f^{(4)}(c) \end{aligned} \]
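
  • A minimal R sketch of this second-derivative formula (our own function, named to mirror findiff and symdiff; the default step \( h = x\sqrt[4]{\epsilon} \) is a common rule of thumb for second differences, not a value given in the text):

# Centered-difference approximation to f''(x); the default step
# h = x*eps^(1/4) is our rule-of-thumb choice, not from the text
secdiff <- function(f,x,
             h = x*(.Machine$double.eps)^(1/4))
{ return( (f(x + h) - 2*f(x) + f(x - h)) / h^2 ) }

f <- function(x){x^2 + 1}
secdiff(f,2)
[1] 2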