---
title: "Computing derivatives"
bibliography: ref_optimization.bib
format:
  html:
    html-math-method: katex
    code-fold: true
execute:
  enabled: false
jupyter: julia-1.10
---
We have already argued that using derivatives gives us a huge advantage in solving optimization problems.
There are three ways to compute derivatives:
- symbolic methods,
- numerical finite-difference (FD) methods,
- algorithmic (also automatic) differentiation (AD) methods.
## Symbolic methods
These are essentially the methods that we have all learnt to apply with pen and paper: a set of differentiation rules applied to an expression, with another expression as the outcome.
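Such symbolic manipulation can also be delegated to a computer algebra system. Here is a minimal sketch, assuming the Symbolics.jl package (one possible choice in Julia, not something this text depends on):

```{julia}
using Symbolics

@variables x                      # declare a symbolic variable
expr = x^2 + sin(x)               # build a symbolic expression

Dx = Differential(x)              # the d/dx operator
expand_derivatives(Dx(expr))      # 2x + cos(x), again an expression
```

The result is itself an expression, which we can then evaluate at whatever point we like.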
## Numerical finite-difference (FD) methods
These methods approximate the derivative by computing differences between function values at nearby points, hence the name *finite-difference (FD)* methods. The simplest FD formulas follow from the definition of the derivative by omitting the limit and keeping a finite step $\alpha > 0$:
$$
\frac{\mathrm d f(x)}{\mathrm d x} \approx \frac{f(x+\alpha)-f(x)}{\alpha}\qquad\qquad \text{forward difference}
$$
or
$$
\frac{\mathrm d f(x)}{\mathrm d x} \approx \frac{f(x)-f(x-\alpha)}{\alpha}\qquad\qquad \text{backward difference}
$$
or
$$
\frac{\mathrm d f(x)}{\mathrm d x} \approx \frac{f(x+\frac{\alpha}{2})-f(x-\frac{\alpha}{2})}{\alpha}\qquad\qquad \text{central difference}
$$
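To see how the three formulas behave, here is a minimal sketch in plain Julia; the test function $\sin$, the point $x = 1$, and the step $\alpha = 10^{-6}$ are arbitrary choices:

```{julia}
# Forward, backward, and central differences for f(x) = sin(x) at x = 1,
# compared against the exact derivative cos(1).
f(x) = sin(x)
x, α = 1.0, 1e-6

forward  = (f(x + α) - f(x)) / α
backward = (f(x) - f(x - α)) / α
central  = (f(x + α/2) - f(x - α/2)) / α

exact = cos(x)
println("forward error:  ", abs(forward - exact))
println("backward error: ", abs(backward - exact))
println("central error:  ", abs(central - exact))
```

For smooth functions the central difference is typically the most accurate of the three: its error decreases as $\mathcal{O}(\alpha^2)$, while the two one-sided formulas only achieve $\mathcal{O}(\alpha)$.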
For functions of vector variables, the same idea applies, but now we perturb each component of the vector separately, obtaining one partial derivative (one entry of the gradient) at a time, as the sketch below shows.
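A minimal sketch of this component-by-component procedure using the central difference; the helper `fd_gradient`, the test function `h`, and the test point are made up for illustration:

```{julia}
# Central-difference gradient: perturb one component of x at a time.
function fd_gradient(f, x::AbstractVector; α=1e-6)
    g = zero(x)
    for i in eachindex(x)
        e = zero(x)
        e[i] = α / 2
        g[i] = (f(x + e) - f(x - e)) / α
    end
    return g
end

h(x) = x[1]^2 + 3x[1]*x[2]      # ∇h(x) = [2x₁ + 3x₂, 3x₁]
fd_gradient(h, [1.0, 2.0])      # ≈ [8.0, 3.0]
```

Note the cost: each gradient computed this way takes $2n$ evaluations of $f$ for an $n$-dimensional input ($n+1$ if we use forward differences instead).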
## Algorithmic (also automatic) differentiation (AD) methods