Centered Difference Approximation of the Derivative
Last Revised on Monday March 29
Programming Exercise
Write a (Python) function to approximate derivatives using the centered difference formula

\[D_hf(x) := \frac{f(x+h) - f(x-h)}{2h} \approx Df(x).\]
The input variables should include the node spacing \(h\); think about and discuss what all the input and output variables should be. (Also, remember that such a function should not do any interactive input, or any output to the screen or files.)
Then — as usual — add code that tests this on some examples such as \(f(x) = e^x\), \(f'(1)\) as suggested in the section on Test Cases for Differentiation.
Update: For test cases like the above where the exact result can be calculated, use this to compute and display the error.
Note: For the above function and every function below, create a Jupyter notebook to run test cases: this will handle any output to the screen.
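A minimal sketch of such a function, with a test that uses the known exact derivative to display the error. The name `centered_difference` and the particular values of \(h\) are my own choices, not prescribed by the exercise:

```python
import numpy as np

def centered_difference(f, x, h):
    """Approximate the derivative Df(x) by (f(x+h) - f(x-h)) / (2h)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Test case: f(x) = e^x, so the exact value of f'(1) is e.
for h in [0.1, 0.01, 0.001]:
    approximation = centered_difference(np.exp, 1.0, h)
    error = approximation - np.e
    print(f"{h=}: {approximation=}, {error=}")
```

Note that the errors shrink by a factor of roughly 100 each time \(h\) is reduced tenfold, consistent with second-order accuracy.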
Warm-up exercise: forward difference approximation
Let us do this for the forward difference approximation

\[D_hf(x) := \frac{f(x+h) - f(x)}{h} \approx Df(x).\]
```python
import numpy as np

def D_h(f, x, h):
    D_h_of_f_at_x = (f(x+h) - f(x))/h
    return D_h_of_f_at_x

def f(x): return np.exp(x)

for x in [0., 1., 3.]:
    print()
    print(f"For {x=}:")
    for h in [0.1, 0.01, 0.001]:
        print()
        print(f"For {h=}:")
        D_hf = D_h(f, x, h)
        print(f"{D_hf=}")
        Df_exact = f(x)  # f' = f for f = exp
        error = D_hf - Df_exact
        print(f"{error=}")
```
```
For x=0.0:

For h=0.1:
D_hf=1.0517091807564771
error=0.051709180756477124

For h=0.01:
D_hf=1.005016708416795
error=0.005016708416794913

For h=0.001:
D_hf=1.0005001667083846
error=0.0005001667083845973

For x=1.0:

For h=0.1:
D_hf=2.858841954873883
error=0.14056012641483795

For h=0.01:
D_hf=2.7319186557871245
error=0.01363682732807936

For h=0.001:
D_hf=2.7196414225332255
error=0.0013595940741804036

For x=3.0:

For h=0.1:
D_hf=21.12414358253968
error=1.0386066593520127

For h=0.01:
D_hf=20.186300205325836
error=0.10076328213816765

For h=0.001:
D_hf=20.095583040074416
error=0.01004611688674828
```
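The errors above shrink roughly in proportion to \(h\), as expected for a first-order accurate method. A quick check of the ratios makes this visible (the helper name `forward_difference` is my own):

```python
import numpy as np

def forward_difference(f, x, h):
    """Approximate the derivative Df(x) by (f(x+h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

# For f = exp, f' = exp, and error/h should approach f''(x)/2 = e^x/2 as h -> 0.
x = 1.0
for h in [0.1, 0.01, 0.001]:
    error = forward_difference(np.exp, x, h) - np.exp(x)
    print(f"{h=}: error/h = {error/h:.4f}")  # tends toward e/2, about 1.3591
```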
Mathematical Exercise
Verify the order of accuracy result

\[D_hf(x) = Df(x) + O(h^2)\]

and the (equivalent) fact that this method has degree of precision 2: it is exact for all quadratics.

One could in fact go further and derive the error formula

\[D_hf(x) - Df(x) = \frac{h^2}{6} D^3f(x) + O(h^4).\]
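The degree-of-precision claim can also be checked numerically: the \(h^2\) error term involves the third derivative, which vanishes for any quadratic, so the centered difference should reproduce the derivative exactly up to rounding. The particular quadratic below is an arbitrary choice of mine:

```python
def centered_difference(f, x, h):
    """Approximate the derivative Df(x) by (f(x+h) - f(x-h)) / (2h)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# p(x) = 3 - 2x + 5x^2 has p'(x) = -2 + 10x and p''' = 0,
# so the centered difference is exact for p, up to rounding error.
p = lambda x: 3 - 2*x + 5*x**2
Dp = lambda x: -2 + 10*x
for h in [0.5, 0.1, 0.01]:
    print(centered_difference(p, 2.0, h) - Dp(2.0))  # essentially zero for every h
```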
Improving on the Centered Difference Approximation with Richardson Extrapolation
Slightly revised on Thursday March 25
Write a function to apply Richardson extrapolation using the above centered differences function, and a file to test it and handle any interactive input and screen or file output, as noted above.
As with almost any practical implementation of a numerical approximation algorithm, the input should include an error tolerance, and the output should include an error estimate.
Further details and suggestions
It could help to break this into two “sub-tasks”:
A) Write a Python function (maybe CD_richardson_1(f, x, h)?) that does a single step of Richardson extrapolation for the centered difference approximation of the first derivative.
Then, as usual, add a code cell that tests this on some examples such as \(f(x) = e^x\), \(f'(1)\), as suggested in the section on Test Cases for Differentiation.
B) Write a second function that uses a loop to seek a result that meets the error tolerance, by halving \(h\) at each iteration.
Use either a while loop, or my preferred strategy of a for loop to impose an iteration limit, with a break inside it to stop if and when sufficient accuracy is achieved.
Also, see the updated notes on Richardson Extrapolation, where I have added a discussion of how to get error estimates.
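One way the two sub-tasks might fit together is sketched below. The function names, the default tolerance, and the particular error estimate are my choices under the usual assumption that the centered difference error behaves like \(Ch^2 + O(h^4)\); they are not specified by the exercise:

```python
import numpy as np

def centered_difference(f, x, h):
    """Approximate the derivative Df(x) by (f(x+h) - f(x-h)) / (2h)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def CD_richardson_1(f, x, h):
    """One Richardson extrapolation step for the centered difference.

    Since the centered difference error is C h^2 + O(h^4), the combination
    (4 D_{h/2} - D_h)/3 cancels the h^2 term, giving an O(h^4) approximation.
    The quantity (D_{h/2} - D_h)/3 estimates the error in D_{h/2}, and serves
    as a conservative error estimate for the (more accurate) extrapolated value.
    """
    D_h = centered_difference(f, x, h)
    D_h2 = centered_difference(f, x, h / 2)
    extrapolated = (4 * D_h2 - D_h) / 3
    error_estimate = (D_h2 - D_h) / 3
    return extrapolated, error_estimate

def CD_richardson(f, x, h, error_tolerance=1e-8, max_iterations=20):
    """Halve h until the error estimate meets the tolerance.

    A for loop imposes an iteration limit; break stops the loop early
    if and when sufficient accuracy is achieved.
    """
    for _ in range(max_iterations):
        derivative, error_estimate = CD_richardson_1(f, x, h)
        if abs(error_estimate) <= error_tolerance:
            break
        h /= 2
    return derivative, error_estimate
```

For example, `CD_richardson(np.exp, 1.0, 0.1)` should return a value very close to \(e\), along with an error estimate within the tolerance.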