2.3 Conjugate Gradient Methods
The conjugate gradient (CG) method is an iterative method, yet if rounding errors in the computation are ignored, the CG algorithm converges to the exact solution of the linear system in finitely many steps.
Outline
Background
Steepest Descent
Conjugate Gradient
1 Background
The min (max) problem:

$$\min_{x} f(x)$$

But we learned in calculus how to solve that kind of question!
“Real world” problem
Connectivity shapes (Isenburg, Gumhold, Gotsman)
What do we get only from C, without geometry?

$$\text{mesh} = \{\, C = (V, E),\ \text{geometry} \,\}$$
Motivation: “real world” problem
First we introduce error functionals and then try to minimize them:
$$E_s(x) = \sum_{(i,j)\in E} \left( \| x_i - x_j \| - 1 \right)^2, \qquad x \in \mathbb{R}^{3n}$$

$$L(x_i) = \frac{1}{d_i} \sum_{(i,j)\in E} x_j \;-\; x_i$$

$$E_r(x) = \sum_{i=1}^{n} \| L(x_i) \|^2$$
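Purely as an illustration of these two functionals (the slides give no code), a direct Python evaluation might look as follows; the function names, the `edges` list of index pairs, and the unit target edge length are assumptions for the sketch:

```python
import numpy as np

def spring_energy(x, edges):
    """E_s(x): squared deviation of each edge length from 1 (unit target length)."""
    return sum((np.linalg.norm(x[i] - x[j]) - 1.0) ** 2 for i, j in edges)

def roundness_energy(x, edges):
    """E_r(x): sum of squared norms of the Laplacian vectors L(x_i)."""
    n = len(x)
    nbr_sum = np.zeros_like(x)
    deg = np.zeros(n)
    for i, j in edges:                 # accumulate neighbor sums and valences d_i
        nbr_sum[i] += x[j]; nbr_sum[j] += x[i]
        deg[i] += 1.0;      deg[j] += 1.0
    L = nbr_sum / deg[:, None] - x     # L(x_i) = (1/d_i) sum_j x_j - x_i (no isolated vertices assumed)
    return float((L ** 2).sum())
```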
Motivation: “real world” problem
Then we minimize:

$$E(C) = \operatorname*{arg\,min}_{x \in \mathbb{R}^{3n}} \left[\, E_s(x) + E_r(x) \,\right]$$

A high-dimensional non-linear problem. The conjugate gradient method is perhaps the most popular optimization technique for problems like this, based on what we will see here.
Directional Derivatives
First, the one-dimensional derivatives along each variable:

$$\frac{\partial f(x,y)}{\partial x}, \qquad \frac{\partial f(x,y)}{\partial y}$$
Directional Derivatives
Along the axes…

$$\frac{\partial f(x,y)}{\partial v}, \qquad v \in \mathbb{R}^2, \quad \| v \| = 1$$
Directional Derivatives
In a general direction…
[figure: the partial derivatives $\partial f(x,y)/\partial x$ and $\partial f(x,y)/\partial y$]
In the plane $\mathbb{R}^2$, for $f : \mathbb{R}^2 \to \mathbb{R}$:

$$\nabla f(x,y) = \left( \frac{\partial f}{\partial x},\ \frac{\partial f}{\partial y} \right)$$
The Gradient: Definition in $\mathbb{R}^n$
For $f : \mathbb{R}^n \to \mathbb{R}$:

$$\nabla f(x_1, \ldots, x_n) = \left( \frac{\partial f}{\partial x_1},\ \ldots,\ \frac{\partial f}{\partial x_n} \right)$$
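A quick numerical check of this definition, with an illustrative helper (central differences; the step size h is an arbitrary choice):

```python
import numpy as np

def numerical_gradient(f, x, h=1e-6):
    """Approximate (df/dx_1, ..., df/dx_n) by central differences."""
    g = np.zeros_like(x, dtype=float)
    for k in range(len(x)):
        e = np.zeros_like(g)
        e[k] = h
        g[k] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

# f(x, y) = x^2 + y^2 has gradient (2x, 2y):
print(numerical_gradient(lambda v: v[0]**2 + v[1]**2, np.array([1.0, 2.0])))  # ~ [2. 4.]
```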
Basic idea
Modern optimization methods
A method to solve quadratic function minimization (A is symmetric and positive definite):

$$\min_{x \in \mathbb{R}^n} \varphi(x) \;=\; \min_{x \in \mathbb{R}^n} \left\{ \tfrac{1}{2} \langle Ax, x \rangle - \langle b, x \rangle \right\}$$
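One line of calculus links this to linear systems: since A is symmetric,

$$\nabla \varphi(x) = A x - b,$$

so the unique minimizer of $\varphi$ is exactly the solution of $A x = b$.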
2 Steepest Descent
(1) Concept: the correction direction at a point is taken to be the negative gradient direction at that point, i.e. the direction of steepest descent, which gives the method its name.
(2) Formulas: choose an arbitrary initial vector x_0, then iterate:
$$p_k = r_k = b - A x_k, \qquad \alpha_k = \frac{(r_k, p_k)}{(A p_k, p_k)}, \qquad x_{k+1} = x_k + \alpha_k p_k$$

$$p_k = r_k = -\left. \operatorname{grad} f(x) \right|_{x = x_k}$$
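A minimal Python sketch of this iteration, assuming NumPy and an SPD matrix A; the function name, tolerance, and iteration cap are illustrative choices, not from the slides:

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-10, max_iter=10000):
    """Solve A x = b (A symmetric positive definite) by steepest descent."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        r = b - A @ x                        # residual = negative gradient of phi at x
        if np.linalg.norm(r) < tol:
            break
        alpha = (r @ r) / (r @ (A @ r))      # exact line search along p_k = r_k
        x = x + alpha * r                    # x_{k+1} = x_k + alpha_k p_k
    return x
```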
3 Conjugate Gradient
Modern optimization methods: “conjugate direction” methods.
A method to solve quadratic function minimization (A is symmetric and positive definite):

$$\min_{x \in \mathbb{R}^n} \left\{ \tfrac{1}{2} \langle Ax, x \rangle - \langle b, x \rangle \right\}$$
Conjugate Gradient
Originally aimed at solving linear problems; later extended to general functions, under the rationale that a quadratic approximation to a function is quite accurate:

$$\min_{x \in \mathbb{R}^n} \| Ax - b \|^2$$
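For a general (possibly non-symmetric or rectangular) A, setting the gradient of $\|Ax - b\|^2$ to zero yields the normal equations $A^{\mathsf T} A \, x = A^{\mathsf T} b$; the matrix $A^{\mathsf T} A$ is symmetric and positive definite whenever A has full column rank, which is what makes CG applicable to this least-squares form.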
Conjugate Gradient
The basic idea: decompose the n-dimensional quadratic problem into n problems of dimension 1.
This is done by exploring the function in “conjugate directions”.
Definition (A-conjugate vectors):

$$\{ u_i \}_{i=1}^{n} \subset \mathbb{R}^n, \qquad \langle u_i, A u_j \rangle = 0 \quad \text{for } i \neq j$$
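A small sketch, not from the slides, that builds an A-conjugate basis by Gram–Schmidt in the A-inner product and verifies the definition (the matrix and helper name are illustrative):

```python
import numpy as np

def a_conjugate_basis(A):
    """A-orthogonalize the standard basis: returns u_1..u_n with <u_i, A u_j> = 0 for i != j."""
    us = []
    for v in np.eye(A.shape[0]):
        u = v.copy()
        for w in us:                                  # subtract the A-projection onto each earlier u
            u -= (v @ (A @ w)) / (w @ (A @ w)) * w
        us.append(u)
    return np.array(us)

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])          # symmetric positive definite
U = a_conjugate_basis(A)
print(U[0] @ (A @ U[1]))            # ~ 0: the two vectors are A-conjugate
```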
Conjugate Gradient
If there is an A-conjugate basis, then we get n problems in dimension 1 (each a simple “smiling” quadratic).
The global minimizer is calculated sequentially, starting from x_0:
$$x = x_0 + \sum_{j} \alpha_j p_j$$

$$\varphi(x) = \tfrac{1}{2} \langle Ax, x \rangle - \langle b, x \rangle$$

$$\varphi(x_j + \alpha_j p_j) = \tfrac{1}{2} \alpha_j^2 \langle A p_j, p_j \rangle - \alpha_j \langle b - A x_j,\ p_j \rangle + \varphi(x_j)$$

$$x_{k+1} = x_k + \alpha_k p_k \qquad (k = 0, 1, \ldots, n-1)$$
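These recurrences give the standard linear CG algorithm. A minimal Python sketch follows; the residual-ratio formula for the direction update is the textbook (Hestenes–Stiefel) choice, which the extracted slides do not show explicitly:

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10):
    """Solve A x = b (A symmetric positive definite); at most n steps in exact arithmetic."""
    x = x0.astype(float).copy()
    r = b - A @ x              # residual
    p = r.copy()               # first search direction is the residual
    rs_old = r @ r
    for _ in range(len(b)):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)          # exact 1-D minimization along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p      # new direction, A-conjugate to the old ones
        rs_old = rs_new
    return x
```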
4 Comparison of the Conjugate Gradient and Steepest Descent Methods: ignoring rounding errors, CG reaches the exact solution of the system in at most n steps, whereas steepest descent in general only converges in the limit.
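A quick demonstration of this comparison, reusing the two sketches defined above (steepest_descent and conjugate_gradient are the illustrative helpers from earlier in this section):

```python
import numpy as np

A = np.array([[10.0, 1.0],
              [1.0,  1.0]])               # SPD but poorly conditioned
b = np.array([1.0, 2.0])
x0 = np.zeros(2)

x_star = np.linalg.solve(A, b)            # reference solution
x_sd = steepest_descent(A, b, x0)         # typically needs many zigzagging steps
x_cg = conjugate_gradient(A, b, x0)       # at most n = 2 steps, up to rounding
print(np.allclose(x_sd, x_star), np.allclose(x_cg, x_star))
```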