
4.3 Moving Average Process MA(q)
Definition 4.5. $\{X_t\}$ is a moving-average process of order $q$ if
\[
X_t = Z_t + \theta_1 Z_{t-1} + \ldots + \theta_q Z_{t-q}, \tag{4.9}
\]
where $Z_t \sim WN(0, \sigma^2)$ and $\theta_1, \ldots, \theta_q$ are constants.
Remark 4.6. $X_t$ is a linear combination of $q + 1$ white noise variables and we say that it is $q$-correlated, that is, $X_t$ and $X_{t+\tau}$ are uncorrelated for all lags $\tau > q$.
Remark 4.7. If $Z_t$ is an i.i.d. process then $X_t$ is a strictly stationary TS since
\[
(Z_t, \ldots, Z_{t-q})^T \stackrel{d}{=} (Z_{t+\tau}, \ldots, Z_{t-q+\tau})^T
\]
for all $\tau$. Then it is called $q$-dependent, that is, $X_t$ and $X_{t+\tau}$ are independent for all lags $\tau > q$.
Remark 4.8. Obviously,
- IID noise is a 0-dependent TS.
- White noise is a 0-correlated TS.
- MA(1) is a 1-correlated TS if it is a combination of WN r.vs, and 1-dependent if it is a combination of IID r.vs.
Remark 4.9. The MA(q) process can also be written in the following equivalent form
\[
X_t = \theta(B) Z_t, \tag{4.10}
\]
where the moving average operator
\[
\theta(B) = 1 + \theta_1 B + \theta_2 B^2 + \ldots + \theta_q B^q \tag{4.11}
\]
defines a linear combination of values in the shift operator $B^k Z_t = Z_{t-k}$.
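To make the operator notation concrete, here is a minimal Python sketch (not part of the original notes) that applies $\theta(B)$ to a white-noise sample; the function name, seed and sample size are illustrative assumptions.

```python
import numpy as np

def apply_ma_operator(z, theta):
    """Compute x_t = z_t + theta_1 z_{t-1} + ... + theta_q z_{t-q},
    i.e. apply theta(B) = 1 + theta_1 B + ... + theta_q B^q, using B^j z_t = z_{t-j}."""
    theta_full = np.concatenate(([1.0], np.asarray(theta, dtype=float)))  # theta_0 = 1
    q = len(theta_full) - 1
    x = np.zeros_like(z)
    for t in range(len(z)):
        for j in range(min(t, q) + 1):
            x[t] += theta_full[j] * z[t - j]   # B^j z_t = z_{t-j}
    return x[q:]                               # drop the first q values (incomplete sums)

rng = np.random.default_rng(0)                 # illustrative seed
z = rng.normal(0.0, 1.0, size=200)             # Z_t ~ WN(0, 1)
x = apply_ma_operator(z, [0.5, 0.5])           # a simulated MA(2) path
```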
Example 4.4. MA(2) process.
This process is written as
\[
X_t = Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2} = (1 + \theta_1 B + \theta_2 B^2) Z_t. \tag{4.12}
\]
What are the properties of MA(2)? As it is a combination of zero-mean white noise variables, it also has zero mean, i.e.,
\[
EX_t = E(Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2}) = 0.
\]
It is easy to calculate the covariance of $X_t$ and $X_{t+\tau}$. We get
\[
\gamma(\tau) = \mathrm{cov}(X_t, X_{t+\tau}) =
\begin{cases}
(1 + \theta_1^2 + \theta_2^2)\,\sigma^2 & \text{for } \tau = 0,\\
(\theta_1 + \theta_1\theta_2)\,\sigma^2 & \text{for } \tau = \pm 1,\\
\theta_2\,\sigma^2 & \text{for } \tau = \pm 2,\\
0 & \text{for } |\tau| > 2,
\end{cases}
\]
which shows that the autocovariances depend on lag, but not on time. Dividing $\gamma(\tau)$ by $\gamma(0)$ we obtain the autocorrelation function,
\[
\rho(\tau) =
\begin{cases}
1 & \text{for } \tau = 0,\\[4pt]
\dfrac{\theta_1 + \theta_1\theta_2}{1 + \theta_1^2 + \theta_2^2} & \text{for } \tau = \pm 1,\\[8pt]
\dfrac{\theta_2}{1 + \theta_1^2 + \theta_2^2} & \text{for } \tau = \pm 2,\\[4pt]
0 & \text{for } |\tau| > 2.
\end{cases}
\]
The MA(2) process is a weakly stationary, 2-correlated TS. □
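As a quick numerical illustration (a sketch of ours, not from the original text), the formulas above can be evaluated for the two parameter sets used in the figures below; the helper name is an assumption.

```python
def ma2_acf(theta1, theta2):
    """Theoretical ACF of an MA(2) process, from the formulas above."""
    denom = 1.0 + theta1**2 + theta2**2
    return {0: 1.0,
            1: (theta1 + theta1 * theta2) / denom,   # rho(+-1)
            2: theta2 / denom}                       # rho(+-2); rho(tau) = 0 for |tau| > 2

print(ma2_acf(0.5, 0.5))   # {0: 1.0, 1: 0.5, 2: 0.333...}
print(ma2_acf(5.0, 5.0))   # {0: 1.0, 1: 0.588..., 2: 0.098...}
```

Both parameter sets give an ACF that cuts off after lag 2, but with rather different values at lags 1 and 2.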
Figure 4.5 shows MA(2) processes obtained from the simulated Gaussian white
noise shown in Figure 4.1 for various values of the parameters (
1
,
2
).
The blue series is
x
t
= z
t
+ 0.5z
t1
+ 0.5z
t2
,
while the purple series is
x
t
= z
t
+ 5z
t1
+ 5z
t2
,
where z
t
are realizations of an i.i.d. Gaussian noise.
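A small Python sketch of how such paths and their sample ACFs (compare Figure 4.6) can be generated; the original figures appear to be R output, so the seed, sample size, and function names here are illustrative assumptions rather than the authors' code.

```python
import numpy as np

rng = np.random.default_rng(1)                 # assumed seed; the noise of Figure 4.1 is not available
z = rng.normal(size=100)

def ma2_path(z, t1, t2):
    """x_t = z_t + t1*z_{t-1} + t2*z_{t-2}; the first two values use fewer lags."""
    x = z.copy()
    x[1:] += t1 * z[:-1]
    x[2:] += t2 * z[:-2]
    return x

def sample_acf(x, max_lag=20):
    """Standard sample ACF estimator (dividing by n)."""
    x = x - x.mean()
    n, c0 = len(x), np.dot(x, x) / len(x)
    return np.array([np.dot(x[:n - h], x[h:]) / (n * c0) for h in range(max_lag + 1)])

x_blue   = ma2_path(z, 0.5, 0.5)
x_purple = ma2_path(z, 5.0, 5.0)
print(np.round(sample_acf(x_blue)[:4], 2))     # should be near 1, 0.5, 0.33, 0 (sampling error aside)
print(np.round(sample_acf(x_purple)[:4], 2))   # should be near 1, 0.59, 0.10, 0
```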
As you can see, very different processes can be obtained for different sets of the parameters. This is an important property of MA(q) processes, which form a very large family of models. This property is reinforced by the following proposition.

Proposition 4.2. If $\{X_t\}$ is a stationary $q$-correlated time series with mean zero, then it can be represented as an MA(q) process. □
Figure 4.5: Two simulated MA(2) processes (time index vs. simulated MA(2)), both obtained from the white noise shown in Figure 4.1, but for different sets of parameters: $(\theta_1, \theta_2) = (0.5, 0.5)$ and $(\theta_1, \theta_2) = (5, 5)$.
Figure 4.6: Sample ACF (lag vs. ACF) for (a) $x_t = z_t + 0.5 z_{t-1} + 0.5 z_{t-2}$ and (b) $x_t = z_t + 5 z_{t-1} + 5 z_{t-2}$.
Also, the following theorem gives the form of the ACF for a general MA(q).

Theorem 4.2. An MA(q) process (as in Definition 4.5) is a weakly stationary TS with the ACVF
\[
\gamma(\tau) =
\begin{cases}
\sigma^2 \sum_{j=0}^{q-|\tau|} \theta_j \theta_{j+|\tau|}, & \text{if } |\tau| \le q,\\
0, & \text{if } |\tau| > q,
\end{cases} \tag{4.13}
\]
where $\theta_0$ is defined to be 1. □
The ACF of an MA(q) has a distinct cut-off at lag $\tau = q$. Furthermore, if $q$ is small the maximum value of $|\rho(1)|$ is well below unity. It can be shown that
\[
|\rho(1)| \le \cos\left(\frac{\pi}{q + 2}\right). \tag{4.14}
\]
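Formula (4.13) is easy to implement directly; the sketch below (our own illustration, with assumed helper names) computes the ACVF of an MA(q) and checks the cut-off and the bound (4.14) for a small example.

```python
import numpy as np

def ma_acvf(theta, sigma2, tau):
    """ACVF of an MA(q) by formula (4.13); theta = [theta_1, ..., theta_q], theta_0 = 1."""
    th = np.concatenate(([1.0], np.asarray(theta, dtype=float)))
    q, tau = len(th) - 1, abs(tau)
    if tau > q:
        return 0.0                                   # cut-off after lag q
    return sigma2 * np.sum(th[:q - tau + 1] * th[tau:q + 1])

theta = [0.5, 0.5]                                   # illustrative MA(2)
rho1 = ma_acvf(theta, 1.0, 1) / ma_acvf(theta, 1.0, 0)
print(rho1, np.cos(np.pi / (len(theta) + 2)))        # 0.5 <= cos(pi/4) ~ 0.707, as (4.14) requires
```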
4.3.1 Non-uniqueness of MA Models

Consider an example of MA(1)
\[
X_t = Z_t + \theta Z_{t-1},
\]
whose ACF is
\[
\rho(\tau) =
\begin{cases}
1, & \text{if } \tau = 0,\\[4pt]
\dfrac{\theta}{1 + \theta^2}, & \text{if } \tau = \pm 1,\\[4pt]
0, & \text{if } |\tau| > 1.
\end{cases}
\]
For $q = 1$, formula (4.14) means that the maximum value of $|\rho(1)|$ is 0.5. It can be verified directly from the formula for the ACF above. Treating $\rho(1)$ as a function of $\theta$ we can calculate its extrema. Denote
\[
f(\theta) = \frac{\theta}{1 + \theta^2}.
\]
Then
\[
f'(\theta) = \frac{1 - \theta^2}{(1 + \theta^2)^2}.
\]
The derivative is equal to 0 at $\theta = \pm 1$ and the function $f$ attains its maximum at $\theta = 1$ and its minimum at $\theta = -1$. We have
\[
f(1) = \frac{1}{2}, \qquad f(-1) = -\frac{1}{2}.
\]
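This small calculus step can also be checked symbolically; the following sketch (ours, using sympy) confirms the derivative and the extrema.

```python
import sympy as sp

theta = sp.symbols('theta', real=True)
f = theta / (1 + theta**2)
fprime = sp.simplify(sp.diff(f, theta))       # equals (1 - theta^2) / (1 + theta^2)^2
critical = sp.solve(sp.Eq(fprime, 0), theta)  # [-1, 1]
print(fprime, critical, [f.subs(theta, c) for c in critical])  # extreme values -1/2 and 1/2
```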
This fact can be helpful in recognizing MA(1) processes. In fact, an MA(1) with $|\theta| = 1$ may be uniquely identified from the autocorrelation function.
However, it is easy to see that the form of the ACF stays the same for $\theta$ and for $\frac{1}{\theta}$. Take for example $5$ and $\frac{1}{5}$. In both cases
\[
\rho(\tau) =
\begin{cases}
1, & \text{if } \tau = 0,\\[4pt]
\dfrac{5}{26}, & \text{if } \tau = \pm 1,\\[4pt]
0, & \text{if } |\tau| > 1.
\end{cases}
\]
Also, the pair $\sigma^2 = 1$, $\theta = 5$ gives the same ACVF as the pair $\sigma^2 = 25$, $\theta = \frac{1}{5}$, namely
\[
\gamma(\tau) =
\begin{cases}
(1 + \theta^2)\,\sigma^2 = 26, & \text{if } \tau = 0,\\
\theta\,\sigma^2 = 5, & \text{if } \tau = \pm 1,\\
0, & \text{if } |\tau| > 1.
\end{cases}
\]
Hence, the MA(1) processes
\[
X_t = Z_t + \tfrac{1}{5} Z_{t-1}, \qquad Z_t \sim \text{iid } N(0, 25),
\]
and
\[
X_t = Y_t + 5 Y_{t-1}, \qquad Y_t \sim \text{iid } N(0, 1),
\]
are the same. We can observe the variable $X_t$, not the noise variable, so we cannot distinguish between these two models. Except for the case $|\theta| = 1$, a particular autocorrelation function will be compatible with two models.

To which of the two models should we restrict our attention?
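Before answering, here is a quick numerical confirmation of the equivalence (a sketch of ours; the helper name is illustrative).

```python
def ma1_acvf(theta, sigma2):
    """ACVF of X_t = Z_t + theta Z_{t-1}, Z_t ~ WN(0, sigma2); zero for |tau| > 1."""
    return {0: (1 + theta**2) * sigma2, 1: theta * sigma2}

print(ma1_acvf(5.0, 1.0))    # {0: 26.0, 1: 5.0}
print(ma1_acvf(0.2, 25.0))   # {0: 26.0, 1: 5.0} -- identical, hence indistinguishable from data
```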
4.3.2 Invertibility of MA Processes

The MA(1) process can be expressed in terms of lagged values of $X_t$ by substituting repeatedly for lagged values of $Z_t$. We have
\[
Z_t = X_t - \theta Z_{t-1}.
\]
The substitution yields
\begin{align*}
Z_t &= X_t - \theta Z_{t-1} = X_t - \theta (X_{t-1} - \theta Z_{t-2})\\
&= X_t - \theta X_{t-1} + \theta^2 Z_{t-2}\\
&= X_t - \theta X_{t-1} + \theta^2 (X_{t-2} - \theta Z_{t-3})\\
&= X_t - \theta X_{t-1} + \theta^2 X_{t-2} - \theta^3 Z_{t-3}\\
&= \ldots\\
&= X_t - \theta X_{t-1} + \theta^2 X_{t-2} - \theta^3 X_{t-3} + \theta^4 X_{t-4} - \ldots + (-\theta)^n Z_{t-n}.
\end{align*}
This can be rewritten as
\[
(-\theta)^n Z_{t-n} = Z_t - \sum_{j=0}^{n-1} (-\theta)^j X_{t-j}.
\]
However, if $|\theta| < 1$, then
\[
E\Big[ Z_t - \sum_{j=0}^{n-1} (-\theta)^j X_{t-j} \Big]^2 = E\big[ \theta^{2n} Z_{t-n}^2 \big] \xrightarrow{\; n \to \infty \;} 0
\]
and we say that the sum is convergent in the mean square sense. Hence, we obtain another representation of the model
\[
Z_t = \sum_{j=0}^{\infty} (-\theta)^j X_{t-j}.
\]
This is a representation of another class of models, called infinite autoregressive (AR) models. So we inverted the MA(1) to an infinite AR. It was possible due to the assumption that $|\theta| < 1$. Such a process is called an invertible process. This is a desired property of TS, so in the example we would choose the model with $\sigma^2 = 25$, $\theta = \frac{1}{5}$.
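A short numerical sketch (ours, with an assumed seed) illustrates the truncated inversion: for $|\theta| < 1$ the partial sum $\sum_{j=0}^{m-1} (-\theta)^j X_{t-j}$ recovers $Z_t$ up to an error of order $|\theta|^m$.

```python
import numpy as np

rng = np.random.default_rng(2)                # illustrative seed
theta, n = 0.2, 500
z = rng.normal(size=n)
x = z.copy()
x[1:] += theta * z[:-1]                       # MA(1): X_t = Z_t + theta Z_{t-1}

m = 20                                        # truncation level of the AR(infinity) representation
t = np.arange(m, n)
z_hat = np.zeros(n - m)
for j in range(m):
    z_hat += (-theta) ** j * x[t - j]         # sum_{j=0}^{m-1} (-theta)^j X_{t-j}
print(np.max(np.abs(z_hat - z[m:])))          # tiny: the truncation error is O(|theta|^m)
```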
4.4 Linear Processes

Definition 4.6. The TS $\{X_t\}$ is called a linear process if it has the representation
\[
X_t = \sum_{j=-\infty}^{\infty} \psi_j Z_{t-j}, \tag{4.15}
\]
for all $t$, where $Z_t \sim WN(0, \sigma^2)$ and $\{\psi_j\}$ is a sequence of constants such that $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$. □
Remark 4.10. The condition $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$ ensures that the process converges in the mean square sense, that is
\[
E\Big[ X_t - \sum_{j=-n}^{n} \psi_j Z_{t-j} \Big]^2 \to 0 \quad \text{as } n \to \infty.
\]
Remark 4.11. MA($\infty$) is a linear process with $\psi_j = 0$ for $j < 0$, that is, MA($\infty$) has the representation
\[
X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j},
\]
where $\psi_0 = 1$.
Note that the formula (4.15) can be written using the backward shift operator $B$. We have
\[
Z_{t-j} = B^j Z_t.
\]
Hence
\[
X_t = \sum_{j=-\infty}^{\infty} \psi_j Z_{t-j} = \sum_{j=-\infty}^{\infty} \psi_j B^j Z_t.
\]
Denoting
\[
\psi(B) = \sum_{j=-\infty}^{\infty} \psi_j B^j, \tag{4.16}
\]
we can write the linear process in a neat way
\[
X_t = \psi(B) Z_t.
\]
The operator $\psi(B)$ is a linear filter, which, when applied to a stationary process, produces a stationary process. This fact is proved in the following proposition.
Proposition 4.3. Let $\{Y_t\}$ be a stationary TS with mean zero and autocovariance function $\gamma_Y$. If $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$, then the process
\[
X_t = \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} = \psi(B) Y_t \tag{4.17}
\]
is stationary with mean zero and autocovariance function
\[
\gamma_X(\tau) = \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k \gamma_Y(\tau - k + j). \tag{4.18}
\]
Proof. The assumption $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$ assures convergence of the series. Now, since $EY_t = 0$, we have
\[
EX_t = E\Big[ \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} \Big] = \sum_{j=-\infty}^{\infty} \psi_j E(Y_{t-j}) = 0
\]
and
\begin{align*}
E(X_t X_{t+\tau}) &= E\Big[ \Big( \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} \Big) \Big( \sum_{k=-\infty}^{\infty} \psi_k Y_{t+\tau-k} \Big) \Big]\\
&= \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k E(Y_{t-j} Y_{t+\tau-k})\\
&= \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k \gamma_Y(\tau - k + j).
\end{align*}
It means that $\{X_t\}$ is a stationary TS with the autocovariance function given by formula (4.18). □
Corollary 4.1. If $\{Y_t\}$ is a white noise process, then $\{X_t\}$ given by (4.17) is a stationary linear process with zero mean and the ACVF
\[
\gamma_X(\tau) = \sigma^2 \sum_{j=-\infty}^{\infty} \psi_j \psi_{j+\tau}. \tag{4.19}
\]
□
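Formula (4.19) is straightforward to evaluate for a filter with finitely many nonzero coefficients; the sketch below (ours, with assumed names) does so and, with $\psi = (1, \theta_1, \ldots, \theta_q)$, reproduces the MA(q) ACVF (4.13).

```python
import numpy as np

def filter_acvf(psi, sigma2, tau):
    """ACVF (4.19) of X_t = sum_j psi_j Z_{t-j}, Z_t ~ WN(0, sigma2);
    psi is a finite array standing in for an absolutely summable sequence."""
    psi = np.asarray(psi, dtype=float)
    tau = abs(tau)
    if tau >= len(psi):
        return 0.0
    return sigma2 * np.sum(psi[:len(psi) - tau] * psi[tau:])

# with psi = (1, 0.5, 0.5) this is exactly the MA(2) ACVF of Example 4.4
print([filter_acvf([1.0, 0.5, 0.5], 1.0, tau) for tau in range(4)])   # [1.5, 0.75, 0.5, 0.0]
```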