Numerical Differentiation Using Local Chebyshev Approximation

Stefan H. Reiterer

In applied mathematics, and especially in optimization, functions are often available only as so-called "black boxes" provided by software packages or by very complex algorithms, which makes automatic differentiation complicated or even impossible. One therefore seeks a numerical approximation of the derivative. Unfortunately, numerical differentiation is a difficult task in itself, and it is well known to be numerically unstable. There are many works on this topic, including approaches based on (global) Chebyshev approximation. Chebyshev approximations have the great property of converging very fast when the function is smooth. Nevertheless, such approaches have several drawbacks: in practice functions are often not smooth, and a global approximation requires many function evaluations. There is hope, however, since functions in real-world applications are usually smooth except at finitely many points, corners, or edges. This motivates a local Chebyshev approach, in which the function is approximated only locally, so that the Chebyshev approximation still yields a fast approximation of the desired function. We study such an approach in this work and provide a numerical example.
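The abstract does not spell out the concrete construction, so the following is only a minimal sketch of the general idea, assuming a NumPy setting: interpolate the black-box function with Chebyshev points on a small interval around the evaluation point, and differentiate the resulting interpolant. The function name local_cheb_derivative, the window half-width h, and the polynomial degree deg are illustrative choices, not the author's.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def local_cheb_derivative(f, x0, h=1e-2, deg=8):
    """Approximate f'(x0) by interpolating f with a Chebyshev polynomial
    of degree `deg` on the local interval [x0 - h, x0 + h] and
    differentiating the interpolant (illustrative sketch)."""
    # Chebyshev points of the first kind on [-1, 1], mapped to [x0 - h, x0 + h]
    t = C.chebpts1(deg + 1)
    x = x0 + h * t
    # Coefficients of the interpolant in the Chebyshev basis of the scaled variable t
    coeffs = C.chebfit(t, f(x), deg)
    # Differentiate with respect to t, then apply the chain rule dt/dx = 1/h
    dcoeffs = C.chebder(coeffs)
    # t = 0 corresponds to x = x0
    return C.chebval(0.0, dcoeffs) / h

# Example: a function with a corner at 0, differentiated away from the corner
print(local_cheb_derivative(lambda x: np.abs(x) * np.sin(x), 1.0))
# exact value: sin(1) + cos(1) ≈ 1.3818
```

Because the interpolation window is kept small, the local interpolant only needs the function to be smooth near x0, which is the situation the abstract describes for real-world functions with isolated corners or edges.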
