© Massachusetts Institute of Technology - Prof. de Weck and Prof. Willcox
Engineering Systems Division and Dept. of Aeronautics and Astronautics

Multidisciplinary System Design Optimization (MSDO)

Numerical Optimization I
Lecture 6
23 February 2004
Karen Willcox
Today’s Topics
Existence & Uniqueness of an Optimum Solution
Kuhn-Tucker Conditions
Convex Spaces
Unconstrained Problems
Linear Programming
Disclaimer!
This is not a classic optimization class ...
The aim is not to teach you the details of optimization
algorithms, but rather to expose you to different
methods.
We will utilize optimization techniques – the goal is to
understand enough to be able to utilize them wisely.
If you plan to use optimization extensively in your
research, you should take an optimization class, e.g.
15.093
Learning Objectives
After the next two lectures, you should:
be familiar with what gradient-based optimization
techniques are available
understand the basics of how each technique works
be able to choose which optimization technique is
appropriate for your problem
understand what to look for when the algorithm
terminates
understand why the algorithm might fail
How to Choose an Algorithm?
Number of design variables
Type of design variables (real/integer, continuous/discrete)
Linear/nonlinear
Equality/inequality constraints
Discontinuous feasible spaces
Initial solution feasible/infeasible
Simulation code runtime
Technique Overview
UNCONSTRAINED:
Steepest Descent
Conjugate Gradient
Quasi-Newton
Newton

CONSTRAINED:
Simplex – linear
SLP – linear
SQP – nonlinear, expensive, common in engineering applications
Exterior Penalty – nonlinear, discontinuous design spaces
Interior Penalty – nonlinear
Generalized Reduced Gradient – nonlinear
Method of Feasible Directions – nonlinear
Mixed Integer Programming
Standard Problem Definition
$$
\begin{aligned}
\min \quad & J(\mathbf{x}) \\
\text{s.t.} \quad & g_j(\mathbf{x}) \le 0 \qquad j = 1,\ldots,m_1 \\
& h_k(\mathbf{x}) = 0 \qquad k = 1,\ldots,m_2 \\
& x_i^{l} \le x_i \le x_i^{u} \qquad i = 1,\ldots,n
\end{aligned}
$$

For now, we consider a single objective function, J(x).
There are n design variables, and a total of m constraints (m = m_1 + m_2).
The bounds are known as side constraints.
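As a concrete illustration, the standard form maps directly onto off-the-shelf solvers. The sketch below poses J, g, h, and side constraints for SciPy's SLSQP routine (an SQP implementation of the kind listed in the technique overview); the two-variable problem itself is a made-up example, not from the lecture.

```python
from scipy.optimize import minimize

# Assumed toy problem in the standard form:
#   min J(x)  s.t.  g(x) <= 0,  h(x) = 0,  0 <= x_i <= 5
J = lambda x: (x[0] - 1)**2 + (x[1] - 2)**2    # objective
g = lambda x: x[0] + x[1] - 2                  # inequality constraint, g(x) <= 0
h = lambda x: x[0] - x[1]                      # equality constraint, h(x) = 0

# SciPy's convention for 'ineq' is fun(x) >= 0, so negate our g.
cons = [{'type': 'ineq', 'fun': lambda x: -g(x)},
        {'type': 'eq',   'fun': h}]
bounds = [(0, 5), (0, 5)]                      # side constraints

res = minimize(J, x0=[2.0, 0.0], method='SLSQP',
               bounds=bounds, constraints=cons)
print(res.x)   # optimum at [1, 1]: h forces x1 = x2, g then caps them at 1
```

On the equality line x1 = x2 = t the objective decreases until t = 1.5, but the inequality 2t ≤ 2 is active first, so the constrained optimum sits at t = 1.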
Linear vs. Nonlinear
The objective function is a linear function of the design
variables if each design variable appears only to the first
power with constant coefficients multiplying it.
$J(\mathbf{x}) = 2x_1 + 3.4x_2 + x_3$  is linear in $\mathbf{x} = [x_1\ x_2\ x_3]^T$

$J(\mathbf{x}) = 2x_1 x_2 + 3.4x_2 + x_3$  is nonlinear in $\mathbf{x}$

$J(\mathbf{x}) = 2\cos(x_1) + 3.4x_2 + x_3$  is nonlinear in $\mathbf{x}$
Linear vs. Nonlinear
A constraint is a linear function of the design variables if each
design variable appears only to the first power with constant
coefficients multiplying it.
$6x_1 + 2x_2 + x_3 \le 0$  is linear in $\mathbf{x}$

$6x_1 + 2x_2^2 + x_3 \le 0$  is nonlinear in $\mathbf{x}$

$6\sin(x_1) + 2x_2 + x_3 \le 10$  is nonlinear in $\mathbf{x}$
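A quick numerical sanity check, using illustrative functions in the spirit of the examples above (coefficients assumed): a linear function obeys superposition, J(a·x + b·y) = a·J(x) + b·J(y), while a nonlinear one generally does not.

```python
import numpy as np

# Assumed example functions: one linear, one with an x1*x2 product term.
def J_lin(x):
    return 2*x[0] + 3.4*x[1] + x[2]

def J_nonlin(x):
    return 2*x[0]*x[1] + 3.4*x[1] + x[2]   # product term breaks linearity

x = np.array([1.0, 2.0, 3.0])
y = np.array([-1.0, 0.5, 2.0])
a, b = 2.0, -3.0

# Superposition holds for the linear function only.
print(np.isclose(J_lin(a*x + b*y), a*J_lin(x) + b*J_lin(y)))         # True
print(np.isclose(J_nonlin(a*x + b*y), a*J_nonlin(x) + b*J_nonlin(y)))  # False
```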
Iterative Optimization Procedures
Most optimization algorithms are iterative:

$\mathbf{x}^q = \mathbf{x}^{q-1} + \alpha^q \mathbf{S}^q$

where
q = iteration number
S = vector search direction
α = scalar distance
and the initial solution $\mathbf{x}^0$ is given.

The algorithm determines the search direction S
according to some criteria.
Gradient-based algorithms use gradient information to
decide where to move.
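The generic update can be sketched with the simplest gradient-based choice, steepest descent (S = −∇J), using a fixed step length α; the quadratic J below is an assumed example, not from the lecture.

```python
import numpy as np

# Sketch of the iterative update x^q = x^{q-1} + alpha * S with
# steepest descent on an assumed quadratic J(x) = x1^2 + 4*x2^2.
def grad_J(x):
    return np.array([2.0*x[0], 8.0*x[1]])

x = np.array([2.0, 1.0])   # initial solution x^0 (given)
alpha = 0.1                # scalar step length (fixed here; usually set by line search)
for q in range(100):
    S = -grad_J(x)         # search direction from gradient information
    x = x + alpha * S      # the iterative update
print(x)                   # converges toward the minimum at (0, 0)
```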
Gradient Vector
Consider a function J(x), x = [x_1, x_2, ..., x_n].
The gradient of J(x) at a point $\mathbf{x}^0$ is a vector of length n:

$$
\nabla J(\mathbf{x}^0) =
\begin{bmatrix}
\dfrac{\partial J}{\partial x_1}(\mathbf{x}^0) \\
\dfrac{\partial J}{\partial x_2}(\mathbf{x}^0) \\
\vdots \\
\dfrac{\partial J}{\partial x_n}(\mathbf{x}^0)
\end{bmatrix}
$$

Each element in the vector is evaluated at the point $\mathbf{x}^0$.
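When analytic derivatives are unavailable, each element of the gradient can be approximated by finite differences. The sketch below uses central differences on an assumed quadratic whose analytic gradient is [2x_1 + 3x_2, 3x_1 + 2x_2].

```python
import numpy as np

# Assumed example function (not from the lecture).
def J(x):
    return x[0]**2 + 3*x[0]*x[1] + x[1]**2

def gradient(J, x0, h=1e-6):
    """Central-difference approximation of the gradient vector at x0."""
    g = np.zeros_like(x0)
    for i in range(len(x0)):
        e = np.zeros_like(x0)
        e[i] = h
        g[i] = (J(x0 + e) - J(x0 - e)) / (2*h)   # dJ/dx_i evaluated at x0
    return g

x0 = np.array([1.0, 2.0])
print(gradient(J, x0))   # analytic gradient at (1, 2) is [8, 7]
```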
Hessian Matrix
Consider a function J(x), x = [x_1, x_2, ..., x_n].
The second derivative of J(x) at a point $\mathbf{x}^0$ is a
matrix of size $n \times n$:

$$
\mathbf{H}(\mathbf{x}^0) \equiv \nabla^2 J(\mathbf{x}^0) =
\begin{bmatrix}
\dfrac{\partial^2 J}{\partial x_1^2} & \dfrac{\partial^2 J}{\partial x_1 \partial x_2} & \cdots & \dfrac{\partial^2 J}{\partial x_1 \partial x_n} \\
\dfrac{\partial^2 J}{\partial x_2 \partial x_1} & \dfrac{\partial^2 J}{\partial x_2^2} & & \vdots \\
\vdots & & \ddots & \\
\dfrac{\partial^2 J}{\partial x_n \partial x_1} & \cdots & & \dfrac{\partial^2 J}{\partial x_n^2}
\end{bmatrix}
$$

Each element is evaluated at the point $\mathbf{x}^0$.