
Econ 415                                                            Dr. D. Tran

CHAPTER 10

STATIONARITY AND INVERTIBILITY CONDITIONS

1. STATIONARITY CONDITIONS
The BJ method applies only to stationary realizations, or to those which can be made
stationary by suitable transformation.

The following are the conditions that AR coefficients must satisfy for an ARIMA model to
be stationary:

SUMMARY OF STATIONARITY CONDITIONS FOR AR COEFFICIENTS

______________________________________________________________________________

MODEL TYPE                    STATIONARITY CONDITIONS

______________________________________________________________________________

ARMA(0,q)                     Always stationary

AR(1) or ARMA(1,q)            |φ1| < 1

AR(2) or ARMA(2,q)            |φ2| < 1
                              φ1 + φ2 < 1
                              φ2 - φ1 < 1

AR(p) or ARMA(p,q)            Complicated, but at a minimum the following
                              necessary (not sufficient) condition must hold:
                              φ1 + φ2 + ... + φp < 1
______________________________________________________________________________

For p > 2, rely primarily on visual inspection of the data and the behavior
of the estimated ACF to check for stationarity. If the estimated ACF does not
fall rapidly toward zero at lower lags, we suspect nonstationarity.
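The three AR(2) inequalities in the table can be checked directly. A minimal sketch (the function name is mine, not from the text):

```python
def ar2_stationary(phi1, phi2):
    """Check the three AR(2)/ARMA(2,q) stationarity conditions from the table:
    |phi2| < 1, phi1 + phi2 < 1, and phi2 - phi1 < 1."""
    return abs(phi2) < 1 and phi1 + phi2 < 1 and phi2 - phi1 < 1

print(ar2_stationary(0.5, 0.3))   # True: all three conditions hold
print(ar2_stationary(0.5, 0.6))   # False: phi1 + phi2 = 1.1 >= 1
```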

2. INVERTIBILITY CONDITIONS

The following summarizes the conditions that an MA process or an ARMA process
must satisfy to be invertible.

SUMMARY OF INVERTIBILITY CONDITIONS FOR MA COEFFICIENTS

______________________________________________________________________________

MODEL TYPE                    INVERTIBILITY CONDITIONS

______________________________________________________________________________

ARMA(p,0)                     Pure AR process or white noise process: always
                              invertible

MA(1) or ARMA(p,1)            |θ1| < 1

MA(2) or ARMA(p,2)            |θ2| < 1
                              θ1 + θ2 < 1
                              θ2 - θ1 < 1

MA(q) or ARMA(p,q)            Complicated, but one necessary condition is
                              θ1 + θ2 + ... + θq < 1
______________________________________________________________________________

STATIONARITY AND INVERTIBILITY CONDITIONS FOR MIXED MODELS

ARMA(1,1)                     |φ1| < 1
                              |θ1| < 1

ARMA(2,2)                     ? (exercise!)

______________________________________________________________________________
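Both sets of conditions amount to the same root criterion (all roots of the AR or MA polynomial outside the unit circle, discussed later in this chapter), which can be checked numerically. A sketch using numpy; the helper name is mine:

```python
import numpy as np

def roots_outside_unit_circle(coeffs):
    """True if all roots of 1 - c1*B - c2*B**2 - ... - ck*B**k lie outside
    the unit circle. Pass AR coefficients to check stationarity, or MA
    coefficients to check invertibility."""
    c = np.asarray(coeffs, dtype=float)
    poly = np.r_[-c[::-1], 1.0]   # highest power of B first, as np.roots expects
    return bool(np.all(np.abs(np.roots(poly)) > 1.0))

print(roots_outside_unit_circle([0.5]))        # AR(1), phi1 = 0.5: stationary
print(roots_outside_unit_circle([1.2]))        # AR(1), phi1 = 1.2: nonstationary
print(roots_outside_unit_circle([0.5, 0.3]))   # AR(2): stationary
```

For an AR(1) the single root is 1/φ1, so |root| > 1 is exactly the table's condition |φ1| < 1.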

3. REASONS FOR STATIONARITY CONDITIONS

Let Zt = Yt - μ, where μ = E(Yt).

Stationarity implies that Var(Zt) = Var(Zt-1) = ... = Var(Zt-k) = γ0

Then γ1 = Cov(Zt, Zt-1) = E(ZtZt-1)

[Recall Cov(x,y) = E[(x - Ex)(y - Ey)]; here Ex = 0 and Ey = 0, i.e. E(Zt) = 0,
E(Zt-1) = 0. Can you prove them?]

Now consider the AR(1)

(1) Yt = C + φ1Yt-1 + at

which can be rewritten as

(2) Zt = φ1Zt-1 + at (can you prove it?)

where at is white noise with E(at) = 0, E(atas) = σa² for t = s and = 0 for t ≠ s,

and E(atZt-1) = 0.


Multiply (2) by Zt-1:

(3) ZtZt-1 = φ1(Zt-1)² + atZt-1

Taking expectations of (3) gives

E(ZtZt-1) = φ1E(Zt-1)² + E(atZt-1)

= φ1Var(Zt-1) + Cov(at, Zt-1)

= φ1γ0, i.e. γ1 = φ1γ0 (by the definition of γ1)

Consider γ2 = Cov(Zt, Zt-2) = E(ZtZt-2).

Multiplying (2) by Zt-2, we get

ZtZt-2 = φ1(Zt-1Zt-2) + atZt-2

= φ1(φ1Zt-2 + at-1)Zt-2 + atZt-2

= φ1²(Zt-2)² + φ1at-1Zt-2 + atZt-2

E(ZtZt-2) = φ1²E(Zt-2)² + φ1E(at-1Zt-2) + E(atZt-2)

= φ1²Var(Zt-2)

γ2 = φ1²γ0.

Thus γk = φ1^k γ0.

Now if |φ1| > 1, γk → ∞ as k → ∞.

This means that an event 10 years ago has a much stronger effect on the current value of the data than
an event that happened just last year. An event 100 years ago would have an even more dramatic
effect on today's observation than an event that occurred 10 years ago. This is not very reasonable.

In the above, since ρk = γk/γ0 = Cov(Zt, Zt-k)/Var(Zt), the ACF will also explode as k → ∞. Thus,
if the ACF does not damp out, or dies out only slowly at higher lags, it indicates that the data are
nonstationary. The situation is even worse if the variance is not constant through time, i.e.
Var(Zt) ≠ Var(Zt-1). In this case, we would have to estimate up to 2n parameters (n means + n
variances) with only n observations.
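The result γk = φ1^k γ0 (equivalently ρk = φ1^k) can be verified by simulation. A sketch for a stationary AR(1); the seed, φ1, and sample size are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)   # arbitrary seed, for reproducibility
phi1 = 0.7
n = 5000

# Simulate the stationary AR(1): Zt = phi1 * Zt-1 + at
z = np.zeros(n)
for t in range(1, n):
    z[t] = phi1 * z[t - 1] + rng.standard_normal()

def sample_acf(x, k):
    """Sample lag-k autocorrelation."""
    x = x - x.mean()
    return float(np.dot(x[:-k], x[k:]) / np.dot(x, x))

# Sample ACF should be close to the theoretical rho_k = phi1**k
for k in (1, 2, 3):
    print(k, round(sample_acf(z, k), 3), round(phi1 ** k, 3))
```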

It can also be shown that a model which violates the stationarity restrictions will produce
forecasts whose variance increases without limit, an undesirable result.

Also, if any root of the AR polynomial φ(B) has modulus less than 1, the ACF will explode. Thus the
roots must all lie outside the unit circle.

4. CHECKING FOR STATIONARITY IN PRACTICE


(1) Examine the realization visually to see if either the mean or the variance appears to be
changing over time.

(2) Examine the estimated ACF to see if the ACF coefficients move rapidly toward zero.
"Rapidly" means the absolute value of the t-statistic is < 1.6 by about lag 5 or 6. If not, suspect
nonstationarity.

(3) See if

|φ1| < 1 for AR(1) or ARMA(1,q)

|φ2| < 1 for AR(2) or ARMA(2,q)
φ1 + φ2 < 1
φ2 - φ1 < 1

φ1 + φ2 + ... + φp < 1 (necessary condition) for AR(p) or ARMA(p,q)
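Step (2) can be automated with the usual large-sample approximation SE(rk) ≈ 1/√n, so t ≈ rk·√n. A sketch (the helper name and the random-walk example are mine):

```python
import numpy as np

def acf_tstats(x, max_lag=6):
    """Sample ACF r_k with approximate t-statistics t = r_k * sqrt(n),
    using the large-sample standard error SE(r_k) ~ 1/sqrt(n)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    denom = float(np.dot(x, x))
    out = []
    for k in range(1, max_lag + 1):
        r = float(np.dot(x[:-k], x[k:])) / denom
        out.append((k, r, r * n ** 0.5))
    return out

rng = np.random.default_rng(1)
walk = np.cumsum(rng.standard_normal(500))   # random walk: nonstationary

# |t| stays large well past lag 5 or 6: suspect nonstationarity
for k, r, t in acf_tstats(walk):
    print(f"lag {k}: r = {r:+.3f}, t = {t:+.1f}")
```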

5. REASONS FOR INVERTIBILITY


It is easy to show that the mean and the variance of an MA model are constant over time
without imposing any restrictions on the values of the MA parameters (see page 42 of Walter
Vandaele, Applied Time Series and Box-Jenkins Models, Academic Press, 1983).

Thus, the stationarity condition does not impose restrictions on the value of θ1.

A. Now consider the MA(1)

(1) Zt = at - θ1at-1, or

(2) at = Zt + θ1at-1

which yields

(3) at-1 = Zt-1 + θ1at-2

Substituting (3) into (2), we obtain

(4) at = Zt + θ1Zt-1 + θ1²at-2

Continuing to substitute for at-2, at-3, ..., the MA(1) process can be expressed as

(5) at = Zt + θ1Zt-1 + θ1²Zt-2 + θ1³Zt-3 + ...

or equivalently Zt = -θ1Zt-1 - θ1²Zt-2 - ... + at.

Thus if |θ1| > 1 then the distant past observations (Zt-k) have greater influence on the present
(Zt) than the more recent past observations. Thus, it is more reasonable to assume that |θ1| < 1.
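The weight on Zt-k in (5) is θ1^k, which makes the point numerically; for example:

```python
# Weight of Zt-k in the AR(infinity) representation of an MA(1) is theta1**k:
# it decays for |theta1| < 1 and explodes for |theta1| > 1.
for theta1 in (0.5, 2.0):
    weights = [round(theta1 ** k, 3) for k in range(8)]
    print(theta1, weights)
```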

B.

For the MA(1) in (1), define

γ0 = Var(Zt) = (1 + θ1²)σa² and γ1 = Cov(Zt, Zt-1) = -θ1σa²

Thus the ACF is

ρ1 = γ1/γ0 = -θ1/(1 + θ1²), with ρk = 0 for k ≥ 2.

Now suppose we have two MA(1) processes:

(1) Zt = at - θ1at-1

(2) Zt = at - (1/θ1)at-1

For process (1): ρ1 = -θ1/(1 + θ1²)

For process (2): ρ1 = -(1/θ1)/(1 + 1/θ1²) = -θ1/(1 + θ1²)

and ρk = 0 for k ≥ 2 for both processes.

Thus the ACFs are the same, and we are unable to go back, i.e. to invert, uniquely from the ACF to
process (1) or to process (2).

By imposing |θ1| < 1 we can exclude model (2).
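The shared ACF is easy to verify numerically for a reciprocal pair, say θ1 = 0.5 and 1/θ1 = 2 (values chosen for illustration):

```python
def rho1(theta1):
    """Lag-1 autocorrelation of an MA(1): rho1 = -theta1 / (1 + theta1**2)."""
    return -theta1 / (1 + theta1 ** 2)

print(rho1(0.5))   # -0.4
print(rho1(2.0))   # -0.4: same ACF, so the two models are indistinguishable
```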

C. The BJ models are invertible if all of the roots of the MA polynomial θ(B) lie outside the unit
circle.

Suppose

Zt = θ(B)at

then if the process is invertible we can solve for at with

at = θ(B)⁻¹Zt; for the MA(1), at = Zt + θ1Zt-1 + θ1²Zt-2 + ... as in (5)

(at is an infinite weighted sum of current and past Zt). Otherwise, at cannot be computed.
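Invertibility in action: with |θ1| < 1 the recursion at = Zt + θ1at-1 recovers the shocks even from a wrong starting value, because the initial error decays like θ1^t. A sketch with arbitrary seed and parameters of my choosing:

```python
import numpy as np

rng = np.random.default_rng(2)   # arbitrary seed
theta1 = 0.6
n = 2000

# Simulate an invertible MA(1): Zt = at - theta1 * at-1
a = rng.standard_normal(n)
z = a.copy()
z[1:] -= theta1 * a[:-1]

# Recover the shocks recursively, deliberately starting from a_hat[0] = 0
a_hat = np.zeros(n)
for t in range(1, n):
    a_hat[t] = z[t] + theta1 * a_hat[t - 1]

# The initial error propagates as theta1**t and dies out quickly
print(float(np.max(np.abs(a_hat[100:] - a[100:]))))
```

With |θ1| > 1 the same recursion would amplify the initial error instead, which is why at cannot be computed for a noninvertible model.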