numerical differentiation for autograd
11/16/2024

Numerical differentiation

Numerical differentiation approximates a derivative using finite perturbation differences. There are three basic methods: the forward, backward, and central difference methods. It is useful when you do not have the analytic derivative of a function but still need to perform gradient-based optimization.

On a computer, if the perturbation value \( h \) is too small, floating-point rounding error dominates the difference quotient and the approximation of the derivative becomes inaccurate; in practice \( h \) is typically chosen around \( 10^{-4} \sim 10^{-6} \).

Forward difference method

The forward difference method approximates the derivative of a function using the value of the function at the current point and at a small step forward:

\[
\,\\
f'(x) = \lim_{h\rightarrow0} \frac{f(x + h) - f(x)}{h}
\,\\
\]

Backward difference method

The backward difference method approximates the derivative of a function using the value of the function at the current point and at a small step backward:

\[
\,\\
f'(x) = \lim_{h\rightarrow0} \frac{f(x) - f(x - h)}{h}
\,\\
\]

Central difference method

The central difference method averages the forward and backward quotients around the point, which gives a more accurate estimate:

\[
\begin{align*}
\,\\
f'(x) &= \lim_{h \, \rightarrow \, 0} \, \frac{1}{2} \cdot \left( \frac{f(x + h) - f(x)}{h} + \frac{f(x) - f(x - h)}{h} \right) \\\,\\
&= \lim_{h \, \rightarrow \, 0} \, \frac{1}{2} \cdot \frac{f(x + h) - f(x - h)}{h} \\\,\\
&= \lim_{h \, \rightarrow \, 0} \, \frac{f(x + h) - f(x - h)}{2h}
\,\\
\end{align*}
\]

Geometric application with Shapley and torch.autograd
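As a quick sanity check, the three difference quotients above can be implemented directly. This is a minimal sketch, not part of any library: \( f(x) = \sin x \) at \( x = 1 \) is an arbitrary test function (its exact derivative is \( \cos 1 \)), and \( h = 10^{-5} \) follows the range mentioned above.

```python
import math

def forward_diff(f, x, h=1e-5):
    # Forward difference: (f(x + h) - f(x)) / h, O(h) truncation error
    return (f(x + h) - f(x)) / h

def backward_diff(f, x, h=1e-5):
    # Backward difference: (f(x) - f(x - h)) / h, O(h) truncation error
    return (f(x) - f(x - h)) / h

def central_diff(f, x, h=1e-5):
    # Central difference: (f(x + h) - f(x - h)) / (2h), O(h^2) truncation error
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.0
exact = math.cos(x)  # d/dx sin(x) = cos(x)

print("forward error :", abs(forward_diff(math.sin, x) - exact))
print("backward error:", abs(backward_diff(math.sin, x) - exact))
print("central error :", abs(central_diff(math.sin, x) - exact))

# If h is pushed far below the 1e-4 ~ 1e-6 range, rounding error dominates
# the quotient and the approximation degrades instead of improving.
print("forward error at h=1e-15:", abs(forward_diff(math.sin, x, h=1e-15) - exact))
```

With the same \( h \), the central difference is noticeably more accurate than the one-sided methods because its leading truncation error is \( O(h^2) \) rather than \( O(h) \); the last line illustrates the rounding-error blow-up when \( h \) is taken too small.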