16.322 Stochastic Estimation and Control, Fall 2004
Prof. Vander Velde
Lecture 11
Last time: Ergodic processes
An ergodic process is necessarily stationary.
Example: Binary process
At each time step the signal may switch polarity or stay the same. Both $+x_0$ and $-x_0$ are equally likely.
Is it stationary and is it ergodic?
For this distribution, we expect most of the members of the ensemble to have a
change point near t=0.
$$R_{xx}(t_1,t_2) = E[x(t_1)x(t_2)] \approx \overline{x(t_1)\,x(t_2)} = 0$$
$$R_{xx}(t_3,t_4) \approx \overline{x(t_3)^2} = x_0^2, \qquad t_4 - t_3 = t_2 - t_1$$
Chance of spanning a change point is the same over each regular interval, so the
process is stationary. Is it ergodic?
Some ensemble members possess properties which are not representative of the ensemble as a whole. Since the ensemble is an infinite set, the probability that any such member occurs is zero.
All of these exceptional members are associated with the rational points. They form a countably infinite set, which constitutes a set of zero measure. The complementary set of processes is an uncountable infinity associated with the irrational numbers, which constitutes a set of measure one.
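As a rough numerical illustration (my own sketch, not part of the notes), one can simulate an ensemble of such binary waves and compare the ensemble-average correlation at a pair of times with the time-average correlation of a single long member; for an ergodic process the two agree. The switching interval, the amplitude, and the uniformly random offset of the switching grid (used so the ensemble statistics do not depend on the time origin) are all assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
x0, dt, delta = 1.0, 0.01, 1.0      # amplitude, sample step, switching interval (assumed)
n_members, n_samples = 2000, 4000   # ensemble size and samples per member
t = np.arange(n_samples) * dt

def binary_wave():
    """One sample function: polarity may flip (probability 1/2) at each switching
    instant; the switching grid is offset by a uniform random amount (assumption)."""
    offset = rng.uniform(0.0, delta)
    k = np.floor((t + offset) / delta).astype(int)   # switching interval index of each sample
    flips = rng.choice([1, -1], size=k.max() + 1)    # +/-1 flip decision at each instant
    polarity = np.cumprod(flips)                     # polarity of each interval
    return x0 * polarity[k]

# ensemble-average correlation at the fixed pair of times (t1, t1 + tau)
tau = 0.3
i1, i2 = 100, 100 + int(round(tau / dt))
ens = np.array([binary_wave() for _ in range(n_members)])
R_ensemble = np.mean(ens[:, i1] * ens[:, i2])

# time-average correlation of a single long member at the same lag
x = binary_wave()
lag = int(round(tau / dt))
R_time = np.mean(x[:-lag] * x[lag:])

print(f"ensemble average: {R_ensemble:.3f}   time average: {R_time:.3f}")
```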
For ergodic processes:
$$\bar{x} = E[x(t)] = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} x(t)\,dt$$
$$\overline{x^2} = E\big[x(t)^2\big] = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} x(t)^2\,dt$$
$$R_{xx}(\tau) = E[x(t)\,x(t+\tau)] = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} x(t)\,x(t+\tau)\,dt$$
$$R_{xy}(\tau) = E[x(t)\,y(t+\tau)] = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} x(t)\,y(t+\tau)\,dt$$
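A minimal sketch of how these time averages are approximated from a finite record (my own illustration; the limits $T\to\infty$ are replaced by the record length, which is assumed long compared with the correlation time):

```python
import numpy as np

def time_averages(x, y, max_lag):
    """Finite-record approximations of the ergodic time averages above:
    the mean, mean square, autocorrelation R_xx, and cross-correlation R_xy
    for lags 0..max_lag samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mean = x.mean()
    mean_sq = np.mean(x**2)
    Rxx = np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(max_lag + 1)])
    Rxy = np.array([np.mean(x[:len(x) - k] * y[k:]) for k in range(max_lag + 1)])
    return mean, mean_sq, Rxx, Rxy
```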
A time invariant system may be defined as one such that any translation in time
of the input affects the output only by an equal translation in time.
This system will be considered time invariant if, whenever the input $u(t)$ produces the output $y(t)$, the input $u(t+\tau)$ produces the output $y(t+\tau)$ for every $\tau$. Note that the system may be either linear or nonlinear.
It is proved directly that if $u(t)$ is a stationary random process having the ergodic property and the system is time invariant, then $y(t)$ is a stationary random process having the ergodic property, in the steady state. This requires the system to be stable, so that a defined steady state exists, and to have been operating in the presence of the input for all past time.
Example: Calculation of an autocorrelation function
Ensemble: $x(t) = A\sin(\omega t + \theta)$
$A$, $\theta$ are independent random variables
$\theta$ is uniformly distributed over $[0, 2\pi]$
This process is stationary (the uniform distribution of $\theta$ hints at this) but not ergodic. Unless we are certain of stationarity, we should calculate:
$$R_{xx}(t_1,t_2) = \overline{x(t_1)\,x(t_2)} = \int_0^\infty dA\, f(A) \int_0^{2\pi} d\theta\, \frac{1}{2\pi}\, A^2\, \underbrace{\sin(\omega t_1 + \theta)}_{``A''}\,\underbrace{\sin(\omega t_2 + \theta)}_{``B''}$$
Using the identity
$$\sin A \sin B = \frac{1}{2}\left[\cos(A-B) - \cos(A+B)\right],$$
$$R_{xx}(t_1,t_2) = \frac{1}{2\pi}\int_0^{2\pi} \overline{A^2}\,\frac{1}{2}\Big\{\cos\omega(t_2-t_1) - \cos\big[\omega(t_1+t_2) + 2\theta\big]\Big\}\, d\theta = \frac{1}{2}\,\overline{A^2}\cos\omega\tau, \qquad \tau = t_2 - t_1$$
So the autocorrelation function is sinusoidal with the same frequency.
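As a quick Monte Carlo check of this result (my own sketch; the Rayleigh amplitude distribution, the frequency, and the sample sizes are assumptions — any amplitude distribution independent of $\theta$ gives the same form):

```python
import numpy as np

rng = np.random.default_rng(1)
omega = 2 * np.pi * 5.0                       # assumed frequency
n_members = 200_000
A = rng.rayleigh(scale=1.0, size=n_members)   # assumed amplitude distribution
theta = rng.uniform(0.0, 2 * np.pi, size=n_members)

t1, tau = 0.37, 0.05                          # arbitrary time and lag
x1 = A * np.sin(omega * t1 + theta)
x2 = A * np.sin(omega * (t1 + tau) + theta)

R_mc = np.mean(x1 * x2)                       # ensemble-average estimate of R_xx(tau)
R_theory = 0.5 * np.mean(A**2) * np.cos(omega * tau)
print(R_mc, R_theory)                         # agree to within sampling error
```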
This periodic property is true in general. If all members of a stationary ensemble are periodic, $x(t + nT) = x(t)$, then
$$R_{xx}(\tau + nT) = \overline{x(t)\,x(t + \tau + nT)} = \overline{x(t)\,x(t + \tau)} = R_{xx}(\tau)$$
Identification of a periodic signal in noise
We have recorded a signal from an experimental device which looks like just
hash.
It is of interest to know if there are periodic components contained in it.
Consider:
$$x(t) = R(t) + P(t)$$
where $P(t)$ is any periodic function of period $T$ and $R(t)$ is a random process independent of $P$.
If $R$ is a stationary random process, use $\tau = t_2 - t_1$:
$$R_{xx}(\tau) = R_{RR}(\tau) + R_{PP}(\tau) + R_{RP}(\tau) + R_{PR}(\tau) = R_{RR}(\tau) + R_{PP}(\tau) + 2\bar{R}\bar{P}$$
This usually makes the periodic component obvious. If $P$ contains more than one frequency component, $R_{PP}(\tau)$ will contain the same components.
Note that this depends on $P(t)$ being truly periodic, not just oscillatory: the coherence time (over which phase is maintained) must exceed the correlation time of the signal.
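A small numerical sketch of this idea (my own example; the amplitude, frequency, and noise level are assumed). The autocorrelation of the noise dies out after a few lags, while the contribution of the periodic component, $\tfrac{1}{2}(0.3)^2\cos\omega\tau$, persists at all lags and so stands out:

```python
import numpy as np

rng = np.random.default_rng(2)
dt, n = 0.01, 20_000
t = np.arange(n) * dt

P = 0.3 * np.sin(2 * np.pi * 2.0 * t)   # assumed weak periodic component, period 0.5 s
R_noise = rng.standard_normal(n)        # independent broadband noise (short correlation time)
x = R_noise + P

max_lag = 400
R_xx = np.array([np.mean(x[:n - k] * x[k:]) for k in range(max_lag + 1)])
# beyond the first few lags, R_xx oscillates at the period of P with amplitude ~0.045
print(R_xx[::50].round(3))
```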
Detection of a known signal in noise
Communication systems depend on this technique for the detection of very weak signals of known form in strong noise. This is how the Lincoln Laboratory radar engineers “touched” Venus by radar, and the RLE laser people touched the moon by laser.
A signal of known form, $s(t)$, is transmitted. Upon receipt it is badly corrupted with noise so that no recognizable waveform appears.
Received message:
$$m(t) = k\,s(t - t_1) + n(t)$$
The received message is cross-correlated with a signal of the known waveform.
If the time of arrival is not known, the cross-correlation is carried out for various
values of τ .
$$R_{sm}(\tau) = \overline{s(t)\,m(t+\tau)} = \overline{k\,s(t)\,s(t + \tau - t_1) + s(t)\,n(t+\tau)} = k\,\overline{s(t)\,s(t + \tau - t_1)} + \overline{s(t)\,n(t+\tau)} = k\,R_{ss}(\tau - t_1)$$
The signal is designed with zero mean to eliminate the second term.
The use of these correlation functions implies signals which continue for all time. In practice the data records are finite, and functions similar to the $R$ functions are used which involve integration without averaging; the notions, however, are analogous. This is the basis for correlation detection. The same result is obtained, starting from a different point of view, with the matched filter.
This also provides an estimate of $k$ and of $t_1$. GPS uses the $t_1$ estimate.
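A minimal sketch of correlation detection along these lines (my own example; the chip waveform, the gain $k$, the delay $t_1$, and the noise level are all assumed). Cross-correlating the received record with the known waveform over a range of trial delays produces a peak at $\tau = t_1$ whose height is proportional to $k$:

```python
import numpy as np

rng = np.random.default_rng(3)
dt, n = 0.001, 20_000
t = np.arange(n) * dt

# assumed known, zero-mean waveform: a pseudo-random +/-1 chip sequence
s = np.repeat(rng.choice([1.0, -1.0], size=200), 100)    # 100 samples per chip

k_true, t1_true = 0.05, 3.217        # assumed gain and delay of the weak return
d = int(round(t1_true / dt))
s_delayed = np.zeros(n)
s_delayed[d:] = s[:n - d]
m = k_true * s_delayed + rng.standard_normal(n)          # received message, buried in noise

# cross-correlate the received record with the known waveform for trial delays
lags = np.arange(6000)
R_sm = np.array([np.mean(s[:n - k] * m[k:]) for k in lags])

t1_hat = lags[np.argmax(R_sm)] * dt   # peak location estimates t1
k_hat = R_sm.max() / np.mean(s**2)    # peak height ~ k * R_ss(0)
print(t1_hat, k_hat)
```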
The two-sided Fourier transform is used because it defines the behavior of the signal for negative time; this matters because, for causal systems, that behavior can be set to zero. You know that the use of transforms is convenient in the analysis of time-invariant linear systems. The same is true of the study of stationary processes in time-invariant linear systems.
Using the two-sided Fourier transform, the transform of the autocorrelation
function is
$$S_{xx}(\omega) = \int_{-\infty}^{\infty} R_{xx}(\tau)\, e^{-j\omega\tau}\, d\tau$$
This is called the power spectral density function.
We omit any real part in the argument of the exponential (as would appear in a Laplace transform), since a real exponent would diverge for negative time.
Properties of $S_{xx}(\omega)$:
$$S_{xx}(\omega) = \int_{-\infty}^{\infty} R_{xx}(\tau)\left[\cos\omega\tau - j\sin\omega\tau\right] d\tau = \int_{-\infty}^{\infty} R_{xx}(\tau)\cos\omega\tau\, d\tau = 2\int_0^{\infty} R_{xx}(\tau)\cos\omega\tau\, d\tau$$
(The sine term integrates to zero because $R_{xx}(\tau)$ is an even function of $\tau$.)
We see that a PSD is a real function, even in ω , and non-negative.
$$S_{xx}(\omega) \in \mathbb{R} \quad \forall\,\omega$$
$$S_{xx}(\omega) \ge 0 \quad \forall\,\omega$$
$$S_{xx}(-\omega) = S_{xx}(\omega)$$
The inverse relation between $S_{xx}(\omega)$ and $R_{xx}(\tau)$ is
$$R_{xx}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{xx}(\omega)\, e^{j\omega\tau}\, d\omega$$
Note that
$$E\big[x(t)^2\big] = R_{xx}(0) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{xx}(\omega)\, d\omega$$
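A numerical sanity check of this relation (my own sketch, using a discrete approximation of the transform; the low-pass-filtered noise record is an assumed example):

```python
import numpy as np

rng = np.random.default_rng(4)
dt, n = 0.01, 50_000
x = rng.standard_normal(n)
x = np.convolve(x, np.ones(20) / 20, mode="same")   # low-pass so R_xx has some width

max_lag = 500
lags = np.arange(-max_lag, max_lag + 1)
R = np.array([np.mean(x[:n - abs(k)] * x[abs(k):]) for k in lags])   # R_xx(-tau) = R_xx(tau)

# discrete approximation of S_xx(w) = integral of R_xx(tau) exp(-j w tau) d tau
S = dt * np.fft.fft(np.fft.ifftshift(R))   # approximately real; truncating R may let it dip slightly negative
omega = 2 * np.pi * np.fft.fftfreq(len(R), d=dt)

# (1 / 2 pi) * integral of S_xx d omega should recover R_xx(0), the mean square
mean_sq_from_S = np.sum(S.real) * (omega[1] - omega[0]) / (2 * np.pi)
print(np.mean(x**2), R[0], mean_sq_from_S)
```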
Power here means mean squared value; the PSD gives the distribution of this power over frequency.
$$\overline{y^2(t)} = \frac{1}{\pi}\int_{\omega_1}^{\omega_2} S_{xx}(\omega)\, d\omega$$
= mean squared value of the frequency components in $x(t)$ in the range $\omega_1$ to $\omega_2$.
If $x(t)$ has a non-zero mean,
$$x(t) = \bar{x} + r(t), \qquad \overline{r(t)} = 0$$
$$R_{xx}(\tau) = \overline{x(t)\,x(t+\tau)} = \overline{\big[\bar{x} + r(t)\big]\big[\bar{x} + r(t+\tau)\big]} = \bar{x}^2 + R_{rr}(\tau)$$
Corresponding to the $\bar{x}^2$ term, the PSD for $x(t)$ will have an additive term
$$\int_{-\infty}^{\infty} \bar{x}^2\, e^{-j\omega\tau}\, d\tau = 2\pi\,\bar{x}^2\,\delta(\omega).$$