Optimization Methods
One-Dimensional Unconstrained Optimization
Golden-Section Search
Quadratic Interpolation
Newton's Method
Characteristics of Optima
Mathematical Background
Objective: Maximize or Minimize f(x)
subject to the constraints
    d_i(x) ≤ a_i,   i = 1, 2, ..., m
    e_i(x) = b_i,   i = 1, 2, ..., p
[Figure: a maximum of f(x) and the corresponding minimum of −f(x) occur at the same point x*.]
Bracketing Method
[Figure: f(x) on a bracketing interval [xl, xu], with two interior points xa and xb such that xl < xa < xb < xu. Comparing f(xa) and f(xb) determines which sub-interval can be discarded.]
Bracketing Method
How would you suggest we select xa and xb (with the objective of minimizing computation)?
Option 1: eliminate as much of the interval as possible in each iteration.
  Set xa and xb close to the center so that we can halve the interval in each iteration.
  Drawback: function evaluation is usually a costly operation.
Option 2: minimize the number of function evaluations.
  Select xa and xb such that one of them can be reused in the next iteration, so that f(x) needs to be evaluated only once per iteration.
How should we select such points?
Current iteration
Interval [xl, xu] of length l0, with interior points xa and xb; l1 is the distance from an endpoint to the farther interior point.

Objective: select the points so that one interior point can be reused in the next iteration, i.e., xb' = xa or xa' = xb.

Next iteration
Interval [x'l, x'u] of length l'0 = l1, with interior points x'a and x'b and corresponding sub-length l'1.

Keep the same proportion in every iteration:

    l1 / l0 = l'1 / l'0 = R

Since l'0 = l1 = R·l0 and l'1 = R·l'0 = R²·l0, and reusing a point requires l0 = l1 + l'1, substitution gives

    R²·l0 + R·l0 − l0 = 0
    R² + R − 1 = 0
    R = (−1 + √(1 − 4(−1))) / (2·1) = (√5 − 1) / 2 ≈ 0.61803

the Golden Ratio.
Golden-Section Search
Starts with two initial guesses, xl and xu.
Two interior points xa and xb are calculated based on the golden ratio as

    xa = xu − d  and  xb = xl + d,  where  d = ((√5 − 1) / 2)·(xu − xl)
In the first iteration, both xa and xb need to be calculated.
In subsequent iterations, xl and xu are updated accordingly and only one of the two interior points needs to be calculated. (The other one is inherited from the previous iteration.)
Golden-Section Search
In each iteration the interval is reduced to about 61.8%
(Golden ratio) of its previous length.
After 10 iterations, the interval is shrunk to about (0.618)^10, or 0.8%, of its initial length.
After 20 iterations, the interval is shrunk to about (0.618)^20, or 0.0066%.
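The procedure above can be sketched as follows, here written for maximization; the function name and the test function are illustrative, not from the slides. Note how each iteration reuses one interior point and evaluates f only once.

```python
import math

def golden_section_search(f, xl, xu, tol=1e-8, max_iter=100):
    """Locate a maximum of a unimodal f on [xl, xu] by golden-section search."""
    R = (math.sqrt(5) - 1) / 2           # golden ratio, ~0.61803
    d = R * (xu - xl)
    xa, xb = xu - d, xl + d              # two interior points
    fa, fb = f(xa), f(xb)
    for _ in range(max_iter):
        if fa > fb:                      # maximum lies in [xl, xb]
            xu, xb, fb = xb, xa, fa      # old xa is reused as the new xb
            xa = xu - R * (xu - xl)      # only one new evaluation per iteration
            fa = f(xa)
        else:                            # maximum lies in [xa, xu]
            xl, xa, fa = xa, xb, fb      # old xb is reused as the new xa
            xb = xl + R * (xu - xl)
            fb = f(xb)
        if xu - xl < tol:
            break
    return (xl + xu) / 2

# Illustrative test: f(x) = -(x - 2)^2 has its maximum at x = 2
x_star = golden_section_search(lambda x: -(x - 2)**2, 0.0, 5.0)
```

Each iteration shrinks the bracket by the factor R ≈ 0.618, matching the reduction rates quoted above.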
Fibonacci Search
The Fibonacci numbers are
    1, 1, 2, 3, 5, 8, 13, 21, 34, ...
that is, each number is the sum of the previous two:
    Fn = Fn−1 + Fn−2
The interval lengths in successive iterations satisfy the same relation,
    L1 = L2 + L3
so for a search planned for n steps the sub-interval lengths are
    L2 = (Fn−1 / Fn) · L1  and  L3 = (Fn−2 / Fn) · L1
[Figure: initial interval L1 with interior points x1 and x2 marking the overlapping sub-intervals L2 and L3.]
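A minimal sketch of an n-step Fibonacci search, written for maximization to match the bracketing discussion above; the function name and test function are illustrative. The interior points are placed using ratios of Fibonacci numbers so that, as in golden-section search, one point is reused each iteration.

```python
def fibonacci_search(f, a, b, n=30):
    """Maximize a unimodal f on [a, b] using an n-step Fibonacci search."""
    # Build Fibonacci numbers so that fib[i] = F_i (F_1 = F_2 = 1)
    fib = [0, 1, 1]
    while len(fib) <= n:
        fib.append(fib[-1] + fib[-2])
    # Initial interior points at distance (F_{n-2}/F_n)*(b - a) from each end
    x1 = a + (fib[n - 2] / fib[n]) * (b - a)
    x2 = a + (fib[n - 1] / fib[n]) * (b - a)
    f1, f2 = f(x1), f(x2)
    for k in range(1, n - 2):
        if f1 < f2:                      # maximum lies in [x1, b]
            a, x1, f1 = x1, x2, f2       # old x2 is reused as the new x1
            x2 = a + (fib[n - k - 1] / fib[n - k]) * (b - a)
            f2 = f(x2)
        else:                            # maximum lies in [a, x2]
            b, x2, f2 = x2, x1, f1       # old x1 is reused as the new x2
            x1 = a + (fib[n - k - 2] / fib[n - k]) * (b - a)
            f1 = f(x1)
    return (a + b) / 2

# Illustrative test: maximum of -(x - 2)^2 at x = 2
x_star = fibonacci_search(lambda x: -(x - 2)**2, 0.0, 5.0, n=30)
```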
Quadratic Interpolation
Idea:
(i) Approximate f(x) using a quadratic function g(x) = ax² + bx + c through three points x0, x1, x2.
(ii) Use the optimum of g(x), at x3, as an estimate of the optimum of f(x).
[Figure: f(x) sampled at x0, x1, x2; the fitted parabola g(x) has its optimum at x3, close to the optimum of f(x).]
Quadratic Interpolation
The shape of f(x) near an optimum typically looks like a parabola, so we can approximate the original function f(x) using a quadratic function:
    g(x) = ax² + bx + c
At the optimum of g(x), g'(x) = 2ax + b = 0.
Let x3 be the optimum point; then x3 = −b / (2a).
How do we compute a and b?
Quadratic Interpolation
a and b can be obtained by solving the system of linear
equations
    a·x0² + b·x0 + c = f(x0)
    a·x1² + b·x1 + c = f(x1)
    a·x2² + b·x2 + c = f(x2)
Quadratic Interpolation
The process can be repeated to improve the
approximation.
Next, decide which sub-interval to discard. Assuming maximization and f(x3) > f(x1):
if x3 > x1, discard the interval to the left of x1,
  i.e., set x0 = x1 and x1 = x3;
if x3 < x1, discard the interval to the right of x1,
  i.e., set x2 = x1 and x1 = x3.
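Rather than solving the 3×3 system explicitly, the sketch below uses the equivalent closed-form expression for the vertex x3 of the parabola through the three points, together with the discard rule above (maximization); the function name and test function are illustrative.

```python
def quadratic_interpolation(f, x0, x1, x2, tol=1e-8, max_iter=50):
    """Maximize f by repeatedly fitting a parabola through three points."""
    for _ in range(max_iter):
        f0, f1, f2 = f(x0), f(x1), f(x2)
        # Vertex of the parabola g(x) through (x0,f0), (x1,f1), (x2,f2);
        # equivalent to solving the linear system for a, b and taking -b/(2a)
        num = (f0 * (x1**2 - x2**2) + f1 * (x2**2 - x0**2)
               + f2 * (x0**2 - x1**2))
        den = 2 * (f0 * (x1 - x2) + f1 * (x2 - x0) + f2 * (x0 - x1))
        if den == 0:
            break                        # points are collinear; no vertex
        x3 = num / den
        if abs(x3 - x1) < tol:
            return x3
        if x3 > x1:
            x0, x1 = x1, x3              # discard the interval left of x1
        else:
            x2, x1 = x1, x3              # discard the interval right of x1
    return x1

# Illustrative test: maximum of -(x - 2)^2 at x = 2 (fit is exact here)
x_star = quadratic_interpolation(lambda x: -(x - 2)**2, 0.0, 1.0, 4.0)
```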
Newton's Method
Let g(x) = f'(x).
Then the zeros of g(x) are the optima of f(x).
Substituting g(x) into the updating formula of the Newton-Raphson method, we have

    x_{i+1} = x_i − g(x_i) / g'(x_i) = x_i − f'(x_i) / f''(x_i)
Newton's Method
Shortcomings:
  Need to derive f'(x) and f''(x).
  May diverge.
  May "jump" to another solution far away.
Advantages:
  Fast convergence rate near the solution.
Hybrid approach: use a bracketing method to find an approximation near the solution, then switch to Newton's method.
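The update formula above can be sketched as follows; f' and f'' are supplied as functions, and the test function 2 sin(x) − x²/10 (with maximum near x ≈ 1.4276) is an illustrative choice, not required by the method.

```python
import math

def newton_optimize(df, d2f, x0, tol=1e-10, max_iter=50):
    """Find a stationary point of f given its derivatives df = f', d2f = f''."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)   # Newton-Raphson step applied to g(x) = f'(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Illustrative test: f(x) = 2 sin(x) - x^2/10,
# so f'(x) = 2 cos(x) - x/5 and f''(x) = -2 sin(x) - 1/5
x_star = newton_optimize(lambda x: 2 * math.cos(x) - x / 5,
                         lambda x: -2 * math.sin(x) - 0.2,
                         x0=2.5)
```

Note the shortcomings listed above apply directly: a poor x0 can make the iterates diverge or jump to a different stationary point.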
False Position Method or Secant Method
Second-order information is expensive to calculate (especially for multivariable problems), so we approximate the second derivative instead. Replace f''(xk) in the Newton-Raphson update with the finite difference

    f''(xk) ≈ (f'(xk) − f'(xk−1)) / (xk − xk−1)

which gives the update

    xk+1 = xk − ((xk − xk−1) / (f'(xk) − f'(xk−1))) · f'(xk)
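This secant-style update needs only f', as a sketch; the function name and the test function 2 cos(x) − x/5 (i.e., f' for f(x) = 2 sin(x) − x²/10, with a root near x ≈ 1.4276) are illustrative assumptions.

```python
import math

def secant_optimize(df, x0, x1, tol=1e-10, max_iter=100):
    """Find a stationary point of f using only f', with a secant
    approximation of f'' built from the last two iterates."""
    g0, g1 = df(x0), df(x1)
    for _ in range(max_iter):
        if g1 == g0:
            break                            # flat secant; cannot proceed
        # x_{k+1} = x_k - (x_k - x_{k-1}) / (f'(x_k) - f'(x_{k-1})) * f'(x_k)
        x2 = x1 - g1 * (x1 - x0) / (g1 - g0)
        x0, g0 = x1, g1
        x1, g1 = x2, df(x2)
        if abs(x1 - x0) < tol:
            break
    return x1

# Illustrative test: f'(x) = 2 cos(x) - x/5, zero near x = 1.4276
x_star = secant_optimize(lambda x: 2 * math.cos(x) - x / 5, 0.0, 2.5)
```

The trade-off versus Newton's method is one extra starting point and a slightly slower (superlinear rather than quadratic) convergence rate, in exchange for never having to derive f''.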