Chapter 12 Time Series Analysis
12.1 Stochastic processes
A stochastic process is a family of random variables $\{X_t, t \in T\}$.
Example 1 $\{S_t, t = 0,1,2,\dots\}$ where $S_t = \sum_{i=0}^{t} X_i$ and $X_i \sim \mathrm{iid}(0,\sigma^2)$. $S_t$ has a different distribution at each point $t$.
12.2 Stationarity and strict stationarity
If $\{X_t, t \in T\}$ is a stochastic process such that $\mathrm{Var}(X_t) < \infty$ for each $t \in T$, the autocovariance function $\gamma_x(\cdot,\cdot)$ of $\{X_t\}$ is defined by
$$\gamma_x(r,s) = \mathrm{Cov}(X_r, X_s) = E\left[(X_r - EX_r)(X_s - EX_s)\right].$$
Because $\mathrm{Var}(X_t) < \infty$ for each $t \in T$,
$$\gamma_x(r,s) \le \left[E(X_r - EX_r)^2\right]^{1/2}\left[E(X_s - EX_s)^2\right]^{1/2} < \infty$$
by the Cauchy-Schwarz inequality.
The autocorrelation function $\rho_x(r,s)$ is defined by
$$\rho_x(r,s) = \frac{\gamma_x(r,s)}{\sqrt{\gamma_x(r,r)\gamma_x(s,s)}}.$$
Example 2 Let $X_t = e_t + \theta e_{t-1}$, $e_t \sim \mathrm{iid}(0,\sigma^2)$.
$$\gamma_x(t+h,t) = \mathrm{Cov}(X_{t+h}, X_t) = \begin{cases} (1+\theta^2)\sigma^2, & h = 0 \\ \theta\sigma^2, & h = \pm 1 \\ 0, & |h| > 1 \end{cases}$$
$$\rho_x(t+h,t) = \begin{cases} 1, & h = 0 \\ \theta/(1+\theta^2), & h = \pm 1 \\ 0, & |h| > 1 \end{cases}$$
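These autocovariances can be checked by simulation. A minimal sketch, with $\theta = 0.5$, the sample size, and the seed as illustrative choices not taken from the text:

```python
import numpy as np

# Simulate X_t = e_t + theta*e_{t-1} and compare the sample ACF with the
# theoretical values rho(1) = theta/(1+theta^2) and rho(h) = 0 for |h| > 1.
rng = np.random.default_rng(0)
theta, T = 0.5, 100_000
e = rng.normal(0.0, 1.0, T + 1)
x = e[1:] + theta * e[:-1]

def sample_acf(x, h):
    """Sample autocorrelation at lag h: gamma_hat(h)/gamma_hat(0)."""
    xc = x - x.mean()
    return np.dot(xc[h:], xc[:len(xc) - h]) / np.dot(xc, xc)

rho1_theory = theta / (1 + theta**2)   # = 0.4 for theta = 0.5
print(sample_acf(x, 1), rho1_theory)   # close to each other
print(sample_acf(x, 2))                # close to 0
```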
The time series $\{X_t, t \in \mathbb{Z}\}$ with index set $\mathbb{Z} = \{0, \pm 1, \pm 2, \dots\}$ is said to be (weakly) stationary if
1. $E|X_t|^2 < \infty$ for all $t \in \mathbb{Z}$
2. $EX_t = m$ for all $t \in \mathbb{Z}$
3. $\gamma_x(r,s) = \gamma_x(r+t, s+t)$ for all $r,s,t \in \mathbb{Z}$.
Remark 1 If $\{X_t, t \in \mathbb{Z}\}$ is stationary, then
$$\gamma_x(r,s) = \gamma_x(r-s, s-s) = \gamma_x(r-s, 0).$$
Hence, we may define the autocovariance function of a stationary process as a function of just one variable, the difference of the two time indices. That is, instead of $\gamma_x(r,s)$, we may write
$$\gamma_x(r-s) = \gamma_x(h).$$
To be more precise,
$$\gamma_x(h) = \mathrm{Cov}(X_{t+h}, X_t).$$
In the same way,
$$\rho_x(h) = \gamma_x(h)/\gamma_x(0).$$
Example 3 $X_t = e_t + \theta e_{t-1}$, $e_t \sim \mathrm{iid}(0,\sigma^2)$. $X_t$ is stationary.
Example 4 $X_t = X_{t-1} + e_t$, $e_t \sim \mathrm{iid}(0,\sigma^2)$. Then
$$X_t = \sum_{i=1}^{t} e_i + X_0.$$
$X_t$ is not stationary, since $\mathrm{Var}(X_t) = t\sigma^2$ (assuming $X_0 = 0$).
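The linear growth of the variance is easy to verify numerically. A minimal sketch, with $\sigma = 1$ and the replication count as illustrative choices:

```python
import numpy as np

# Check Example 4: with X_0 = 0, X_t = sum_{i=1}^t e_i is a random walk,
# so Var(X_t) = t*sigma^2 and the process cannot be stationary.
rng = np.random.default_rng(1)
sigma, T, reps = 1.0, 200, 20_000
e = rng.normal(0.0, sigma, (reps, T))
X = np.cumsum(e, axis=1)           # X[:, t-1] holds X_t for each replication

var_at_50 = X[:, 49].var()         # should be near 50*sigma^2
var_at_200 = X[:, 199].var()       # should be near 200*sigma^2
print(var_at_50, var_at_200)
```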
Example 5 $X_t \sim N(0, \sigma^2 t)$. $X_t$ is not stationary.
The time series $\{X_t, t \in \mathbb{Z}\}$ is said to be strictly stationary if the joint distributions of $(X_{t_1}, \dots, X_{t_k})'$ and $(X_{t_1+h}, \dots, X_{t_k+h})'$ are the same for all positive integers $k$ and for all $t_1, \dots, t_k, h \in \mathbb{Z}$.
12.3 Autoregressive processes
$$y_t = \alpha_1 y_{t-1} + \dots + \alpha_p y_{t-p} + e_t \quad : \text{AR}(p) \text{ process}$$
where $Ee_t = 0$ and
$$Ee_t e_s = \begin{cases} \sigma^2, & t = s \\ 0, & t \ne s. \end{cases}$$
Or, using the lag operator,
$$(1 - \alpha_1 L - \dots - \alpha_p L^p)\, y_t = e_t \qquad (L^p y_t = y_{t-p}).$$
Asymptotic theory of the AR(1) model
$$y_t = \alpha y_{t-1} + e_t, \quad |\alpha| < 1.$$
$$\hat{\alpha}_{OLS} = \left(\sum_{t=2}^{T} y_{t-1} y_t\right) \Big/ \sum_{t=2}^{T} y_{t-1}^2.$$
We have
1. $\hat{\alpha}_{OLS} \xrightarrow{p} \alpha$
2. $\sqrt{T}(\hat{\alpha} - \alpha) \xrightarrow{d} N(0, 1 - \alpha^2)$.
Proof.
1.
$$\hat{\alpha} - \alpha = \sum y_{t-1} e_t \Big/ \sum y_{t-1}^2.$$
We may express
$$y_{t-1} = \sum_{i=0}^{\infty} \alpha^i e_{t-1-i}.$$
Then, using Chebyshev's inequality, we may obtain
$$\sum y_{t-1} e_t / T \xrightarrow{p} 0, \qquad \sum y_{t-1}^2 / T \xrightarrow{p} \frac{\sigma^2}{1-\alpha^2}.$$
2. By the central limit theorem,
$$\frac{1}{\sqrt{T}} \sum y_{t-1} e_t \xrightarrow{d} N\left(0, \ \sigma^2 \,\mathrm{plim}\, \frac{\sum y_{t-1}^2}{T}\right) = N\left(0, \ \frac{\sigma^4}{1-\alpha^2}\right).$$
Hence
$$\sqrt{T}(\hat{\alpha} - \alpha) \xrightarrow{d} N(0, 1 - \alpha^2).$$
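Both claims can be checked by Monte Carlo. A minimal sketch, where $\alpha = 0.5$, the sample size, and the replication count are illustrative choices:

```python
import numpy as np

# Monte Carlo check that sqrt(T)*(alpha_hat - alpha) has variance close
# to 1 - alpha^2 for the stationary AR(1) model estimated by OLS.
rng = np.random.default_rng(2)
alpha, T, reps = 0.5, 1_000, 2_000

stats = np.empty(reps)
for r in range(reps):
    e = rng.normal(size=T)
    y = np.empty(T)
    y[0] = e[0]
    for t in range(1, T):
        y[t] = alpha * y[t - 1] + e[t]
    # OLS slope of y_t on y_{t-1}
    a_hat = np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])
    stats[r] = np.sqrt(T) * (a_hat - alpha)

print(stats.var(), 1 - alpha**2)   # both near 0.75
```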
When $\alpha = 1$, we have
1. $\hat{\alpha} \xrightarrow{p} 1$.
But
2. $\sqrt{T}(\hat{\alpha} - 1) \not\xrightarrow{d} N(0, 1 - \alpha^2)$.
In fact,
$$T(\hat{\alpha} - 1) \xrightarrow{d} \text{a non-normal random variable}.$$
(See Fuller (1976), "Introduction to Statistical Time Series".)
As a result, for a $t$-test of $H_0: \alpha = 1$,
$$t = \frac{\hat{\alpha} - 1}{\sqrt{\hat{\sigma}^2 \left(\sum y_{t-1}^2\right)^{-1}}} \not\xrightarrow{d} N(0,1).$$
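The non-normality of the limit of $T(\hat{\alpha} - 1)$ is visible in a simple simulation. A minimal sketch, with sample size and replication count as illustrative choices:

```python
import numpy as np

# When alpha = 1, T*(alpha_hat - 1) converges to a non-normal
# (Dickey-Fuller type) limit: the OLS estimator is biased downward
# and the distribution is markedly asymmetric around 0.
rng = np.random.default_rng(3)
T, reps = 500, 4_000

stats = np.empty(reps)
for r in range(reps):
    y = np.cumsum(rng.normal(size=T))   # random walk: y_t = y_{t-1} + e_t
    a_hat = np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])
    stats[r] = T * (a_hat - 1.0)

# A normal limit would be symmetric around 0; this one is not.
print(np.mean(stats), np.mean(stats < 0))
```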
Least squares estimation of AR(p) processes
$$y_t = \alpha_1 y_{t-1} + \dots + \alpha_p y_{t-p} + e_t, \quad t = p+1, \dots, T$$
$$\begin{cases} y_{p+1} = \alpha_1 y_p + \dots + \alpha_p y_1 + e_{p+1} \\ \quad \vdots \\ y_T = \alpha_1 y_{T-1} + \dots + \alpha_p y_{T-p} + e_T \end{cases}$$
or
$$y = X\alpha + e,$$
where
$$y = \begin{pmatrix} y_{p+1} \\ \vdots \\ y_T \end{pmatrix}, \quad X = \begin{pmatrix} y_p & \cdots & y_1 \\ \vdots & & \vdots \\ y_{T-1} & \cdots & y_{T-p} \end{pmatrix}, \quad \alpha = \begin{pmatrix} \alpha_1 \\ \vdots \\ \alpha_p \end{pmatrix}, \quad e = \begin{pmatrix} e_{p+1} \\ \vdots \\ e_T \end{pmatrix}.$$
Then,
$$\hat{\alpha}_{OLS} = (X'X)^{-1} X' y, \qquad \hat{\sigma}^2 = \frac{(y - X\hat{\alpha})'(y - X\hat{\alpha})}{T - p}.$$
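The stacked regression can be implemented directly. A minimal sketch, with the AR(2) coefficients and sample size as illustrative choices:

```python
import numpy as np

# Least squares for an AR(p) model built as in the text: stack the
# lagged values into X and apply (X'X)^{-1} X'y.
rng = np.random.default_rng(4)
alphas = np.array([0.5, 0.3])    # a stationary AR(2) (roots outside unit circle)
p, T = len(alphas), 50_000

e = rng.normal(size=T)
y = np.zeros(T)
for t in range(p, T):
    y[t] = alphas @ y[t - p:t][::-1] + e[t]   # [y_{t-1}, ..., y_{t-p}]

# Rows t = p+1,...,T; column j holds y_{t-1-j}, matching the matrix X above.
X = np.column_stack([y[p - 1 - j:T - 1 - j] for j in range(p)])
yy = y[p:]
a_hat = np.linalg.solve(X.T @ X, X.T @ yy)    # (X'X)^{-1} X'y
resid = yy - X @ a_hat
sigma2_hat = resid @ resid / (T - p)          # as defined in the text
print(a_hat, sigma2_hat)                      # near [0.5, 0.3] and 1.0
```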
We can show that
1. $\hat{\alpha} \xrightarrow{p} \alpha$
2. $\sqrt{T}(\hat{\alpha} - \alpha) \xrightarrow{d} N(0, \Sigma)$, where
$$\Sigma = \sigma^2 \begin{pmatrix} \gamma_0 & \gamma_1 & \cdots & \gamma_{p-1} \\ \gamma_1 & \gamma_0 & \cdots & \gamma_{p-2} \\ \vdots & & \ddots & \vdots \\ \gamma_{p-1} & \gamma_{p-2} & \cdots & \gamma_0 \end{pmatrix}^{-1}, \qquad \gamma_h = E y_{t+h} y_t,$$
if the process $y_t$ is stationary.
Another way to state that $y_t$ is stationary is that all roots of the characteristic equation
$$1 - \alpha_1 Z - \alpha_2 Z^2 - \dots - \alpha_p Z^p = 0$$
lie outside the unit circle.
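The root condition is easy to check numerically. A minimal sketch (the helper name `ar_is_stationary` is hypothetical; the two inputs are the AR(2) processes of Examples 6 and 7 below):

```python
import numpy as np

def ar_is_stationary(alphas):
    """True if all roots of 1 - a_1*Z - ... - a_p*Z^p lie outside the unit circle."""
    # numpy.roots takes coefficients from highest degree to lowest:
    # [-a_p, ..., -a_1, 1] represents 1 - a_1*Z - ... - a_p*Z^p.
    coeffs = np.concatenate(([-a for a in alphas[::-1]], [1.0]))
    roots = np.roots(coeffs)
    # Small tolerance so a root numerically on the circle counts as inside.
    return bool(np.all(np.abs(roots) > 1.0 + 1e-8))

# Example 6: y_t = y_{t-1} - 0.16 y_{t-2} + e_t, roots 1/0.8 and 1/0.2.
print(ar_is_stationary([1.0, -0.16]))   # True
# Example 7: y_t = 1.2 y_{t-1} - 0.2 y_{t-2} + e_t, one root at Z = 1.
print(ar_is_stationary([1.2, -0.2]))    # False
```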
Example 6 Consider the AR(2) process
$$y_t - y_{t-1} + 0.16 y_{t-2} = e_t.$$
The characteristic equation for this is
$$1 - Z + 0.16 Z^2 = (1 - 0.8Z)(1 - 0.2Z),$$
which gives
$$Z = \frac{1}{0.8}, \ \frac{1}{0.2}.$$
Hence, $y_t$ is stationary. We may also express
$$(1 - 0.8L)(1 - 0.2L)\, y_t = e_t,$$
which gives
$$(1 - 0.8L)\, y_t = \sum_{i=0}^{\infty} (0.2)^i e_{t-i} = u_t$$
and
$$y_t = \sum_{i=0}^{\infty} (0.8)^i u_{t-i}.$$
(The impact of an event that happened long ago is negligible.)
Example 7 Consider the AR(2) process
$$y_t - 1.2 y_{t-1} + 0.2 y_{t-2} = e_t.$$
$$1 - 1.2Z + 0.2Z^2 = (1 - Z)(1 - 0.2Z) = 0$$
gives
$$Z = 1, \ \frac{1}{0.2}.$$
Hence, $y_t$ is not stationary. As a matter of fact,
$$y_t - y_{t-1} = (1 - 0.2L)^{-1} e_t = (1 + 0.2L + 0.2^2 L^2 + \cdots)\, e_t = \sum_{i=0}^{\infty} (0.2)^i e_{t-i} = u_t$$
and
$$y_t = \sum_{i=1}^{t} u_i + y_0.$$
12.4 Moving average processes
$$y_t = \sum_{j=-M}^{M} \theta_j e_{t-j}$$
where $e_t \sim \mathrm{iid}(0, \sigma^2)$ (finite-order MA processes).
Example 8
$$y_t = e_t + \theta_1 e_{t-1} + \theta_2 e_{t-2} \sim \text{MA}(2)$$
Consider an MA(q) process
$$y_t = e_t + \theta_1 e_{t-1} + \dots + \theta_q e_{t-q} = (1 + \theta_1 L + \dots + \theta_q L^q)\, e_t.$$
If all roots of the equation
$$1 + \theta_1 Z + \dots + \theta_q Z^q = 0$$
lie outside the unit circle, the MA process can be written as an infinite-order AR process, such that
$$e_t = \sum_{j=0}^{\infty} \psi_j y_{t-j} \quad \text{with} \quad \sum_{j=0}^{\infty} |\psi_j| < \infty.$$
When $\{e_t\}$ can be expressed in such a way, we call the MA process invertible.
Example 9
$$y_t = e_t - e_{t-1} = (1 - L)\, e_t.$$
The root of the equation $1 - Z = 0$ is $Z = 1$. Hence, this MA process is not invertible. In fact,
$$e_t = \frac{1}{1-L}\, y_t = (1 + L + L^2 + \cdots)\, y_t = \sum_{j=0}^{\infty} \psi_j y_{t-j},$$
where $\psi_j = 1$ and $\sum |\psi_j| = \infty$.
Example 10
$$y_t = e_t - 0.9 e_{t-1} = (1 - 0.9L)\, e_t.$$
$$1 - 0.9Z = 0 \ \Rightarrow \ Z = \frac{1}{0.9} > 1.$$
$$e_t = \frac{1}{1 - 0.9L}\, y_t = (1 + 0.9L + 0.9^2 L^2 + \cdots)\, y_t = \sum_{j=0}^{\infty} \psi_j y_{t-j},$$
where $\psi_j = 0.9^j$ and
$$\sum_{j=0}^{\infty} |\psi_j| = \frac{1}{1 - 0.9} < \infty.$$
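The AR($\infty$) weights of Example 10 can be checked numerically: a truncated sum $\sum_j 0.9^j y_{t-j}$ recovers $e_t$ almost exactly. A minimal sketch, with sample size, seed, and truncation length as illustrative choices:

```python
import numpy as np

# Example 10: y_t = e_t - 0.9 e_{t-1}, so e_t = sum_j 0.9^j y_{t-j}
# with absolutely summable weights psi_j = 0.9^j.
psi = 0.9 ** np.arange(200)
partial_sum = psi.sum()                  # approaches 1/(1-0.9) = 10
print(partial_sum)

# Numerical check: reconstruct e_t from y using the truncated AR(inf) form.
rng = np.random.default_rng(5)
T = 1000
e = rng.normal(size=T + 200)
y = e[200:] - 0.9 * e[199:-1]            # y_t = e_t - 0.9 e_{t-1}
t = 500
e_rec = sum(psi[j] * y[t - j] for j in range(200))
print(e_rec, e[200 + t])                 # truncated sum recovers e_t
```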
Asymptotic theory for the MA(1) model
$$y_t = e_t + \theta e_{t-1}, \quad |\theta| < 1 \ \text{(invertible)}$$
The nonlinear least squares estimator of $\theta$ has the following properties:
1. $\hat{\theta} \xrightarrow{p} \theta$
2. $\sqrt{T}(\hat{\theta} - \theta) \xrightarrow{d} N(0, 1 - \theta^2)$.
(See Fuller (1976).)
Remark 2 Estimating the coefficients of MA processes is rather complicated, since the problem is nonlinear.
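One way to see the nonlinearity concretely is a conditional-sum-of-squares criterion: given a candidate $\theta$, the residuals are recovered recursively via $e_t = y_t - \theta e_{t-1}$ (with $e_0 = 0$), and $\theta$ minimizes $\sum e_t^2$. A minimal sketch, where the grid search stands in for a proper nonlinear optimizer and all tuning values are illustrative (the text does not specify the NLLS implementation):

```python
import numpy as np

# Simulate y_t = e_t + theta*e_{t-1} and estimate theta by minimizing
# the conditional sum of squares over a grid of invertible values.
rng = np.random.default_rng(6)
theta0, T = 0.5, 10_000
eps = rng.normal(size=T + 1)
y = eps[1:] + theta0 * eps[:-1]

def css(theta, y):
    """Conditional sum of squares for an MA(1) with parameter theta."""
    e, total = 0.0, 0.0
    for yt in y:
        e = yt - theta * e          # invert the MA recursion, e_0 = 0
        total += e * e
    return total

grid = np.linspace(-0.95, 0.95, 381)    # step 0.005 over the invertible region
theta_hat = grid[np.argmin([css(th, y) for th in grid])]
print(theta_hat)                         # near the true value 0.5
```

The objective is a genuinely nonlinear function of $\theta$ because each residual depends on $\theta$ through the whole recursion, which is why no closed-form estimator exists.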
12.5 Autoregressive-moving average processes
$$y_t = \alpha_1 y_{t-1} + \dots + \alpha_p y_{t-p} + e_t + \theta_1 e_{t-1} + \dots + \theta_q e_{t-q} \sim \text{ARMA}(p,q) \text{ model}$$
$$e_t \sim \mathrm{iid}(0, \sigma^2)$$
If all roots of the equations
$$a(Z) = 1 - \alpha_1 Z - \alpha_2 Z^2 - \dots - \alpha_p Z^p = 0$$
and
$$b(Z) = 1 + \theta_1 Z + \dots + \theta_q Z^q = 0$$
lie outside the unit circle, $\{y_t\}$ is stationary and invertible. We have
$$\sqrt{T}(\hat{\eta}_{NLLS} - \eta) \xrightarrow{d} N(0, V)$$
where
$$\eta = \begin{pmatrix} \alpha_1 \\ \vdots \\ \alpha_p \\ \theta_1 \\ \vdots \\ \theta_q \end{pmatrix}, \quad V = \sigma^2 \begin{bmatrix} EU_1 U_1' & EU_1 V_1' \\ EV_1 U_1' & EV_1 V_1' \end{bmatrix}^{-1},$$
$$a(L)\, U_t = e_t, \quad b(L)\, V_t = e_t.$$
Example 11
$$y_t = \alpha_1 y_{t-1} + e_t + \theta_1 e_{t-1}, \quad e_t \sim \mathrm{iid}(0, \sigma^2)$$
$$\sqrt{T} \begin{pmatrix} \hat{\alpha}_1 - \alpha_1 \\ \hat{\theta}_1 - \theta_1 \end{pmatrix} \xrightarrow{d} N(0, V),$$
where
$$V = \begin{bmatrix} (1 - \alpha_1^2)^{-1} & (1 + \alpha_1 \theta_1)^{-1} \\ (1 + \alpha_1 \theta_1)^{-1} & (1 - \theta_1^2)^{-1} \end{bmatrix}^{-1}.$$
Here
$$U_t - \alpha_1 U_{t-1} = e_t, \qquad V_t + \theta_1 V_{t-1} = e_t,$$
so that
$$EU_t^2 = \frac{\sigma^2}{1 - \alpha_1^2}, \quad EU_t V_t = \frac{\sigma^2}{1 + \alpha_1 \theta_1}, \quad EV_t^2 = \frac{\sigma^2}{1 - \theta_1^2}$$
and
$$V = \sigma^2 \begin{bmatrix} \dfrac{\sigma^2}{1 - \alpha_1^2} & \dfrac{\sigma^2}{1 + \alpha_1 \theta_1} \\[2mm] \dfrac{\sigma^2}{1 + \alpha_1 \theta_1} & \dfrac{\sigma^2}{1 - \theta_1^2} \end{bmatrix}^{-1} = \begin{bmatrix} (1 - \alpha_1^2)^{-1} & (1 + \alpha_1 \theta_1)^{-1} \\ (1 + \alpha_1 \theta_1)^{-1} & (1 - \theta_1^2)^{-1} \end{bmatrix}^{-1}.$$
The process $\{y_t\}$ is said to be an ARIMA(p,d,q) process if $(1 - L)^d y_t$ is a stationary ARMA(p,q) process.
Example 12
$$(1 - \alpha L)(1 - L)\, y_t = e_t \quad (|\alpha| < 1)$$
$y_t$ is an ARIMA(1,1,0) process, i.e., differencing $y_t$ once yields an ARMA(1,0) process.
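Differencing an ARIMA(1,1,0) series in the spirit of Example 12 can be checked by simulation. A minimal sketch, with $\alpha = 0.5$ and the sample size as illustrative choices:

```python
import numpy as np

# Simulate (1 - alpha*L)(1 - L) y_t = e_t: the first difference
# w_t = y_t - y_{t-1} is the stationary AR(1) w_t = alpha*w_{t-1} + e_t.
rng = np.random.default_rng(8)
alpha, T = 0.5, 50_000
e = rng.normal(size=T)
w = np.zeros(T)
for t in range(1, T):
    w[t] = alpha * w[t - 1] + e[t]    # stationary AR(1) in the differences
y = np.cumsum(w)                      # integrate once: the ARIMA(1,1,0) level

d = np.diff(y)                        # differencing recovers w (up to one obs)
a_hat = np.dot(d[:-1], d[1:]) / np.dot(d[:-1], d[:-1])
print(a_hat)                          # near alpha = 0.5
```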
Remark 3 A slowly decaying positive sample autocorrelation suggests the appropriateness of an ARIMA model.
Remark 4 We need to difference an ARIMA(p,d,q) process $d$ times in order to obtain a stationary process.
Remark 5 To see whether there is a unit root or not, we perform unit root tests. That is, we test $H_0: \alpha = 1$ in the model
$$y_t = \alpha y_{t-1} + u_t,$$
where $\{u_t\}$ is an ARMA process. Under the null, differencing once yields an ARMA process.
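The $t$-statistic for $H_0: \alpha = 1$ can be simulated under the null to see that its distribution is not $N(0,1)$. A minimal sketch with iid errors (a special case of the ARMA $\{u_t\}$ above); sample size and replication count are illustrative:

```python
import numpy as np

# Under H0 the t-statistic follows a Dickey-Fuller type distribution,
# shifted well to the left of N(0,1) rather than centered at 0.
rng = np.random.default_rng(7)
T, reps = 500, 4_000

tstats = np.empty(reps)
for r in range(reps):
    y = np.cumsum(rng.normal(size=T))            # random walk under H0
    a_hat = np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])
    resid = y[1:] - a_hat * y[:-1]
    s2 = resid @ resid / (T - 2)                 # residual variance estimate
    tstats[r] = (a_hat - 1.0) / np.sqrt(s2 / np.dot(y[:-1], y[:-1]))

print(np.mean(tstats))   # noticeably below 0, unlike a N(0,1) sample mean
```

This is why unit root tests use Dickey-Fuller critical values rather than standard normal ones.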