Optimization
ES 204
Numerical Methods in Engineering
What is the difference between finding
roots and finding optima?
Process of creating something that is as effective as
possible
Finding the maxima and minima of a function that depends on one or more variables (the values of the variables at which the optimum occurs)
Optima are the points where the curve is flat, i.e., the x value where the derivative f'(x) is equal to zero
If f''(x) < 0, the point is a maximum
If f''(x) > 0, the point is a minimum
Usual strategy: you can differentiate the function and locate the root (i.e., the zero) of the
new function
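For instance, for an illustrative function (an assumed example, not from the course text) f(x) = x² − 4x + 5:
f'(x) = 2x − 4 = 0 gives x = 2, and f''(x) = 2 > 0, so x = 2 is a minimum, with f(2) = 1.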
Example: Determining the Optimum
Analytically by Root Location
Determine the time and magnitude of the peak elevation based on the elevation as a function of time for an object initially projected upward with an initial velocity:
z = z0 + (m/c)(v0 + mg/c)(1 − e^(−(c/m)t)) − (mg/c)t
where z = altitude (m) above the earth's surface (defined as z = 0), z0 = initial altitude (m), g = gravitational acceleration (m/s²), m = mass (kg), c = a linear drag coefficient (kg/s), v0 = initial velocity (m/s), and t = time (s)
Use the following parameter values for your calculation: g = 9.81 m/s², z0 = 100 m, v0 = 55 m/s, m = 80 kg, and c = 15 kg/s
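A sketch of the analytical root-location approach in Octave, assuming the elevation formula above: differentiate z(t) by hand, then locate the zero of dz/dt with fzero.

g = 9.81; z0 = 100; v0 = 55; m = 80; c = 15;
% derivative of z(t): dz/dt = (v0 + m*g/c)*exp(-(c/m)*t) - m*g/c
dzdt = @(t) (v0 + m*g/c)*exp(-(c/m)*t) - m*g/c;
tpeak = fzero(dzdt, 3)        % root of dz/dt, about 3.83 s
z = @(t) z0 + m/c*(v0 + m*g/c)*(1 - exp(-(c/m)*t)) - m*g/c*t;
zpeak = z(tpeak)              % peak elevation, about 192.86 m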
What are one- and multi-dimensional
optimization problems?
(Figure: one-dimensional optimization vs. two-dimensional optimization)
One-dimensional problems involve functions that depend on a single variable; the search is like climbing or descending one-dimensional peaks and valleys
Multidimensional problems involve functions that depend on two or more variables
The process of finding a maximum versus finding a minimum is identical: the same value x* that minimizes f(x) also maximizes −f(x)
What is the golden-section search
method for 1D optimization?
Suppose we are interested in determining the minimum of a one-dimensional function
We first define an interval [xl xu] containing a single minimum (i.e., the function is unimodal on the interval)
We then define two interior points x1 and x2 using the golden ratio φ = (1 + √5)/2 ≈ 1.61803:
d = (φ − 1)(xu − xl),  x1 = xl + d,  x2 = xu − d
If f(x1) < f(x2), f(x1) is the current estimate of the minimum; remove [xl x2] since it doesn't contain the minimum; in the next round x2 is the new xl
If f(x2) < f(x1), f(x2) is the current estimate of the minimum; remove [x1 xu] since it doesn't contain the minimum; in the next round x1 is the new xu
This is a bracketing method; as iterations are repeated, the bracket containing the minimum is reduced, and a new d is computed each iteration (see the short numerical sketch below)
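A minimal numerical sketch (Octave) of how the interior points are placed, using the bracket [0, 4] from the example that follows:

phi = (1 + sqrt(5))/2;       % golden ratio, about 1.61803
xl = 0; xu = 4;              % initial bracket
d  = (phi - 1)*(xu - xl);    % d = 2.4721
x1 = xl + d                  % x1 = 2.4721
x2 = xu - d                  % x2 = 1.5279
% evaluate f(x1) and f(x2), then discard the subinterval that cannot contain the minimum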
Example: Golden Section Search
Use the golden-section search to find the minimum of f(x) = x²/10 − 2 sin x within the interval from xl = 0 to xu = 4.
First iteration:
f (x2)< f (x1); estimate of minimum is x = 1.5279
The minimum is in the interval [xl x2 x1]; for the next iteration, the old x1 becomes the new xu = 2.4721, and the old x2 becomes the next iteration's x1
Second iteration, bracket [0 2.4721]
x1 is the old x2: x1 = 1.5279; the new x2 = xu − d = 0.9443
f(x1) < f(x2); the estimate of the minimum is x = 1.5279
The minimum is in the interval [x2 x1 xu]; for the next iteration, x2 becomes the new xl = 0.9443
The bracket for the next iteration is [0.9443 2.4721]; the next x2 is the old x1: x2 = 1.5279
Example: Golden Section Search
Use the golden-section search to find the minimum of f(x) = x²/10 − 2 sin x within the interval from xl = 0 to xu = 4.
(Iteration table: the current estimate of the minimum is highlighted at each iteration.)
Note the true minimum of f(x) = −1.7757 at x = 1.4276
Once an iteration is complete, the optimum will fall in one of two intervals:
[xl x2 x1] or [x2 x1 xu]; because the interior points are placed symmetrically, we can define an exact upper bound on the error in either case
Basis for terminating the iterations: ea = (2 − φ) |(xu − xl) / xopt| × 100%
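A brief sketch of where this bound comes from, using the upper interval [x2 x1 xu] and the definition of d above:
Δxa = x1 − x2 = (2φ − 3)(xu − xl) ≈ 0.2361 (xu − xl)
Δxb = xu − x1 = (2 − φ)(xu − xl) ≈ 0.3820 (xu − xl)
Since Δxb > Δxa, the worst-case distance from the estimate to the true optimum is (2 − φ)(xu − xl), which, normalized by xopt and expressed as a percentage, gives the termination criterion above.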
Here is an M-file function for the golden-section search for minimization:
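A minimal sketch of such a function (the name goldmin and the exact argument list are assumptions, not necessarily the original M-file):

function [x, fx, ea, iter] = goldmin(f, xl, xu, es, maxit, varargin)
% goldmin: minimization with the golden-section search
%   f = function handle; xl, xu = initial bracket
%   es = desired relative error, % (default 1e-4); maxit = maximum iterations (default 50)
%   x, fx = location and value of the minimum; ea = approximate error, %; iter = iterations used
if nargin < 4 || isempty(es), es = 1e-4; end
if nargin < 5 || isempty(maxit), maxit = 50; end
phi = (1 + sqrt(5))/2;                 % golden ratio
d  = (phi - 1)*(xu - xl);              % distance defining the interior points
x1 = xl + d;  f1 = f(x1, varargin{:});
x2 = xu - d;  f2 = f(x2, varargin{:});
iter = 0;  ea = 100;
while ea > es && iter < maxit
  iter = iter + 1;
  if f1 < f2                           % minimum lies in [x2 x1 xu]
    xl = x2;
    x2 = x1;  f2 = f1;
    x1 = xl + (phi - 1)*(xu - xl);  f1 = f(x1, varargin{:});
    xopt = x1;  fx = f1;
  else                                 % minimum lies in [xl x2 x1]
    xu = x1;
    x1 = x2;  f1 = f2;
    x2 = xu - (phi - 1)*(xu - xl);  f2 = f(x2, varargin{:});
    xopt = x2;  fx = f2;
  end
  if xopt ~= 0
    ea = (2 - phi)*abs((xu - xl)/xopt)*100;   % error estimate (termination criterion above)
  end
end
x = xopt;
end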
Example: Determining the optimum
using golden section search
Determine the time and magnitude of the peak elevation for the same elevation function z(t) as in the earlier analytical example.
Use the following parameter values for your calculation: g = 9.81 m/s², z0 = 100 m, v0 = 55 m/s, m = 80 kg, and c = 15 kg/s
This is a maximization problem, so we enter the negative of the elevation function
The returned fmin corresponds to a maximum height of 192.86 m (flip the sign back to recover the height)
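A sketch of how the call might look, using the goldmin sketch above (the 10-s upper bound for the bracket is an assumption):

g = 9.81; z0 = 100; v0 = 55; m = 80; c = 15;
znegative = @(t) -(z0 + m/c*(v0 + m*g/c)*(1 - exp(-(c/m)*t)) - m*g/c*t);   % negated elevation
[tpeak, fmin] = goldmin(znegative, 0, 10)
zpeak = -fmin                                % about 192.86 m at about 3.83 s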
What is parabolic interpolation for 1D
optimization?
Three points that jointly bracket an optimum
Fit a parabola to the points, differentiate it, set the result equal to zero, and solve for an estimate of the optimal x
x1, x2, and x3 are the initial guesses
x4 = the value of x that corresponds to the optimum (vertex) of the parabolic fit to the guesses, computed from the formula below
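Setting the derivative of the parabola through (x1, f(x1)), (x2, f(x2)), (x3, f(x3)) to zero gives the standard update formula for the new estimate:
x4 = x2 − (1/2) [ (x2 − x1)² (f(x2) − f(x3)) − (x2 − x3)² (f(x2) − f(x1)) ] / [ (x2 − x1)(f(x2) − f(x3)) − (x2 − x3)(f(x2) − f(x1)) ]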
Example: Parabolic Interpolation
Use parabolic interpolation to approximate the minimum of f(x) = x²/10 − 2 sin x with initial guesses of x1 = 0, x2 = 1, and x3 = 4
First iteration:
f(x4) < f(x2) < f(x1) < f(x3) and [x1 x2 x4 x3]; retain the bracket [x2 x4 x3]
Second iteration:
f(x4) < f(x2) < f(x1) < f(x3) and [x1 x4 x2 x3]; retain the bracket [x1 x4 x2]
Example: Parabolic Interpolation
Within five iterations, the result is converging rapidly on the true minimum of −1.7757 at x = 1.4276.
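As an illustration, a short Octave sketch of the first two updates for this example, using the update formula above; the printed intermediate values are computed here, not quoted from the original slides:

f = @(x) x.^2/10 - 2*sin(x);     % test function from the example
x1 = 0; x2 = 1; x3 = 4;          % initial guesses
for it = 1:2
  f1 = f(x1); f2 = f(x2); f3 = f(x3);
  x4 = x2 - 0.5*((x2-x1)^2*(f2-f3) - (x2-x3)^2*(f2-f1)) / ...
            ((x2-x1)*(f2-f3) - (x2-x3)*(f2-f1));
  fprintf('iteration %d: x4 = %.4f, f(x4) = %.4f\n', it, x4, f(x4))
  % keep the three points that bracket the minimum for the next pass
  if x4 > x2
    if f(x4) < f2, x1 = x2; x2 = x4; else, x3 = x4; end
  else
    if f(x4) < f2, x3 = x2; x2 = x4; else, x1 = x4; end
  end
end
% expected output: x4 about 1.5055 then 1.4903, consistent with the bracket updates above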
MATLAB/Octave's fminbnd function uses Brent's method, a combination of the golden-section search and parabolic interpolation
How to use the fminbnd function?
Determine the time and magnitude of the peak elevation for the same elevation function z(t) as before.
Use the following parameter values for your calculation: g = 9.81 m/s², z0 = 100 m, v0 = 55 m/s, m = 80 kg, and c = 15 kg/s
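A sketch of the call (the elevation function is the one given earlier; the 0-to-10-s search interval is an assumption):

g = 9.81; z0 = 100; v0 = 55; m = 80; c = 15;
z = @(t) z0 + m/c*(v0 + m*g/c)*(1 - exp(-(c/m)*t)) - m*g/c*t;
[tpeak, fval] = fminbnd(@(t) -z(t), 0, 10)     % minimize the negative of z(t)
zpeak = -fval                                  % about 192.86 m at t about 3.83 s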
Example: Visualizing a 2D Function for
2D Optimization Problems
Use Octave's graphical capabilities to display the following function and visually estimate its minimum in the range −2 ≤ x1 ≤ 0 and 0 ≤ x2 ≤ 3:
f(x1, x2) = 2 + x1 − x2 + 2x1² + 2x1x2 + x2²
The minimum value of about f(x1, x2) = 0 to 1 is located at about x1 = −1 and x2 = 1.5
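A sketch of how the surface could be generated in Octave (the 0.1 grid spacing is arbitrary):

[x1, x2] = meshgrid(-2:0.1:0, 0:0.1:3);              % grid over the stated ranges
f = 2 + x1 - x2 + 2*x1.^2 + 2*x1.*x2 + x2.^2;        % evaluate the function on the grid
surfc(x1, x2, f)                                      % surface plot with contours underneath
xlabel('x_1'); ylabel('x_2'); zlabel('f(x_1, x_2)')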
What is the fminsearch function?
fminsearch determines the minimum of a multidimensional function; it is a direct (Nelder-Mead simplex) method, not a gradient/descent/ascent method where derivatives are required
It can handle non-smooth objective functions; a usage sketch follows below
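A sketch of fminsearch applied to the same two-dimensional function (the initial guess [-0.5, 0.5] is an assumption):

f = @(x) 2 + x(1) - x(2) + 2*x(1)^2 + 2*x(1)*x(2) + x(2)^2;   % objective, x = [x1 x2]
[xmin, fval] = fminsearch(f, [-0.5, 0.5])
% should return xmin near [-1.0, 1.5] with fval about 0.75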