Modeling stationary time series

Universidad Privada Boliviana

Prof. J. Dávalos (Ph.D.)

Modeling stationary time series - ARMA models

Introduction

  • We need to model/create/represent/mimic a stationary TS
    • We start by building the simplest possible TS from our main building block, the WN.
    • Then, we build bigger models.
    • We study the conditions that make our TS stationary.
      • Why? Because stationary processes are analytically tractable, and statistical inference on their estimators is standard.
  • We will denote our white noise by \(\varepsilon_t\) for any \(t\). It is an iid random variable with zero mean and constant variance, i.e. \(\varepsilon_t \sim WN(0,\sigma^2)\)
  • Thus, we get a white noise time series of size \(T\): \(\{\varepsilon_1,\varepsilon_2,...,\varepsilon_T\}\)
    • Naive example:
      • Imagine that you plan to roll a fair die 10 times; each roll is independent.
      • Now subtract 3.5 from each roll, so that above-average outcomes (wins) get a positive sign.
      • This is a white noise time series of size 10 (a uniformly distributed one); a simulation sketch follows below.
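
A minimal sketch of this die example in Stata (the seed is an assumption, added only for reproducibility):

clear all
set obs 10                      // 10 independent rolls
set seed 42                     // assumed seed, for reproducibility
gen roll = ceil(6*runiform())   // a fair die: integers 1 to 6
gen E    = roll - 3.5           // centered: a zero-mean, uniform WN
li roll E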

Moving average process (MA)

Let’s model/create/build a TS denoted \(\{Y_t\}\) using our building block \(\{\varepsilon_t\}\). Is \(\{Y_t\}\) WN? Stationary?

  • Naive model
    • \(Y_t = \varepsilon_t\)
    • \(Y_t\) is obviously WN and stationary
  • Model 1
    • \(Y_t = a + \varepsilon_t\) where \(a\) is a fixed value (a constant, non random)
    • \(E(Y_t) = a\); so it is not white anymore.
    • Yet, \(V(Y_t) = V(\varepsilon_t) = \sigma^2\)
    • The autocovariance (or covariance) of \(Y_t\) is the same as that of \(\varepsilon_t\), so it is zero across time \(t\): \(\gamma_Y(h) = \gamma_{\varepsilon}(h) = 0\) for \(h \geq 1\)
  • Note that adding a constant to \(\varepsilon_t\) preserves the normality assumption. So \(Y_t \sim N(a,\sigma^2)\)
  • Is Model 1 (\(Y_t\)) stationary?
  • Indeed: its expectation is constant across \(t\), and so are the variance and the (auto)covariance.
  • This is just noise around a mean \(a\):

Simulation

clear all
set obs 100          // Time periods 
scalar sigma2  = 2   // Sigma2
scalar a       = 5   // a


gen E          = rnormal(0,sqrt(sigma2)) // our Gaussian WN, we assumed normality
gen y          = a + E
gen time       = _n

li * in 1/10         // list the first 10 observations

** Plot
tsset time
tsline y 
Number of observations (_N) was 0, now 100.

     +-----------------------------+
     |         E          y   time |
     |-----------------------------|
  1. |  2.864614   7.864614      1 |
  2. |  1.474503   6.474503      2 |
  3. |  .4210289   5.421029      3 |
  4. | -2.435462   2.564538      4 |
  5. | -1.031244   3.968756      5 |
     |-----------------------------|
  6. |  1.218806   6.218806      6 |
  7. | -.3384977   4.661502      7 |
  8. |  .7305106   5.730511      8 |
  9. | -2.562577   2.437423      9 |
 10. | -1.435603   3.564397     10 |
     +-----------------------------+


Time variable: time, 1 to 100
        Delta: 1 unit

\(Y_t = 5 + \varepsilon_t\) where \(\varepsilon_t \sim N(0,2)\), stationary process

  • \(Y_t\) is independently and identically normally distributed (iid)
  • Model 2
    • Now let’s add one WN term from the immediate, most relevant past, \(\varepsilon_{t-1}\), weighted by \(\theta_1\):
    • \[Y_t = a + \varepsilon_t + \theta_1 \varepsilon_{t-1}\]
    • Is it WN?
      • \(E(Y_t) = a\). If \(a = 0\) this could possibly be WN, given that \(V(Y_t)= (1+\theta_1^2)\sigma^2\) is constant.
      • But, since \(Y_t\) and \(Y_{t-1}\) share \(\varepsilon_{t-1}\), they are correlated: \(\gamma(1)\) cannot be 0. Note however that \(\gamma(h) = 0\) for \(h>1\).
      • So this cannot be WN even if \(a\) is 0.
  • Is it stationary?
    • The mean and variance are fixed/constant (across t).
    • What about time dependence \(\gamma(h)\)? \[\gamma(1) = E[(Y_t - a)(Y_{t-1} - a)]\]
      • \(= E[(\varepsilon_t + \theta_1 \varepsilon_{t-1})(\varepsilon_{t-1} + \theta_1 \varepsilon_{t-2})]\)

      • \(= E(\varepsilon_{t}\varepsilon_{t-1} + \theta_1 \varepsilon_{t}\varepsilon_{t-2} + \theta_1\varepsilon_{t-1}^2 + \theta_1^2\varepsilon_{t-1}\varepsilon_{t-2})\)

      • \(= \theta_1\sigma^2\)

      • \(\gamma(1) = \theta_1\sigma^2\); this does not depend on \(t\)

  • Further covariances \(\gamma(h)\) for lags \(h\) larger than 1 must be 0, since \(Y_t\) and \(Y_{t-h}\) do not share any WN term \(\varepsilon\). Thus, the process is stationary, since 0 does not depend on \(t\).

  • As you can see, the (auto)covariance function \(\gamma_Y(h)\) is key:

  • \[ \gamma_{Y_t}(h) = \begin{cases} (1+\theta_1^2)\sigma^2 & \text{if $h=0$ this yields $V(Y_t)$}\\ \theta_1\sigma^2 & \text{if $h=1$ }\\ 0 & h>1 \end{cases} \]

  • No covariance term \(\gamma(h)\) depends on \(t\). Given a constant mean, (weak) stationarity rests solely on \(\gamma(h)\). This gave rise to the term ‘covariance stationarity’.

Simulation

clear all
set obs 100          // Time periods 
scalar sigma2  = 2   // Sigma2
scalar a       = 5   // a
scalar theta1 = 3

gen E          = rnormal(0,sqrt(sigma2)) // Gaussian WN
gen time       = _n

tsset time
gen y = .   // Just to initialize our variable

* Simulation
dyngen{
update y          = a + E + theta1*l.E  if time > 1
}

li *  in 1/10

** Plot
tsset time
tsline y 
Number of observations (_N) was 0, now 100.

Time variable: time, 1 to 100
        Delta: 1 unit

(100 missing values generated)

     +------------------------------+
     |         E   time           y |
     |------------------------------|
  1. |  2.942765      1           . |
  2. | -.9301596      2    12.89813 |
  3. | -.0054639      3    2.204057 |
  4. | -.4752548      4    4.508354 |
  5. | -3.196665      5    .3775709 |
     |------------------------------|
  6. | -.0644086      6   -4.654403 |
  7. |  .6207599      7    5.427534 |
  8. | -1.469483      8    5.392797 |
  9. | -.9865208      9   -.3949695 |
 10. |   2.38448     10    4.424917 |
     +------------------------------+


Time variable: time, 1 to 100
        Delta: 1 unit

\(Y_t = 5 + \varepsilon_t + 3\varepsilon_{t-1}\) where \(\varepsilon_t \sim N(0,2)\), covariance stationary process
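
As a quick empirical check (a sketch, assuming the simulated data above are still in memory): with \(\theta_1 = 3\) and \(\sigma^2 = 2\), the formulas above give \(\gamma(0) = 20\), \(\gamma(1) = 6\) and \(\gamma(h) = 0\) beyond lag 1; the sample autocovariances should be roughly of that order, given only 100 draws.

* Sketch: sample autocovariances of the simulated MA(1)
gen ylag1 = l.y                  // first lag
gen ylag2 = l2.y                 // second lag
corr y ylag1 ylag2, covariance   // compare with gamma(0)=20, gamma(1)=6, gamma(2)=0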

  • \(Y_t\) is not independently distributed, yet it is still normally distributed (the sum of two normal WN terms)
  • Model 3

    • Now let’s add many WN terms from the past, each weighted by a \(\theta_j\) coefficient:
      • \[Y_t = a + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \theta_3 \varepsilon_{t-3} + ... +\theta_q \varepsilon_{t-q}\]
  • Recall the definition of a moving average: it is a rolling-window average with arbitrary weights.

  • The terms \(\theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \theta_3 \varepsilon_{t-3} + ... +\theta_q \varepsilon_{t-q}\) are a \(\theta\)-weighted sum of past information (WN) over a \(q\)-sized rolling window, from \(t-1\) to \(t-q\). Thus, it is a (weighted) moving average of past WN.

Exercise

  1. Identify the \(\gamma(h)\) function for a MA(2) process: \(Y_t = a + \varepsilon_t + \theta_1\varepsilon_{t-1} + \theta_2 \varepsilon_{t-2}\)
  • Tip. See Hamilton, Ch.3

MA(q)

  • \(Y_t = a + \varepsilon_t + \sum_{j=1}^{q} \theta_j \varepsilon_{t-j}\)

  • A general finite order (\(q\)) moving average model shares the previous properties

  • \(\gamma(h) = \begin{cases} (1+\sum_{j=1}^q\theta_j^2) \sigma^2 & \text{if $h=0$}\\ (\theta_h + \theta_{h+1}\theta_1+ \theta_{h+2}\theta_2 + ...+\theta_q\theta_{q-h})\sigma^2 & \text{if $h=1,2,...,q$ }\\ 0 & h>q \end{cases}\)

  • Is it stationary? …

  • Indeed, it has constant mean (\(a\)), and \(\gamma(h)\) does not depend on \(t\).

  • Thus, any MA(q) process is stationary
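
As an illustration, here is a sketch simulating an MA(2) (the coefficients and seed are assumptions): since the \(\varepsilon_t\) series is fully generated beforehand, a direct gen with lag operators suffices.

clear all
set obs 200
set seed 7                      // assumed seed
scalar a      = 5
scalar theta1 = 0.6             // assumed coefficients
scalar theta2 = -0.3
gen E    = rnormal(0, sqrt(2))  // Gaussian WN
gen time = _n
tsset time
gen y = a + E + theta1*l.E + theta2*l2.E   // missing for the first 2 periods
tsline y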

Exercise

  1. Given \(\gamma(h)\), the (auto)covariance function of a MA(q) process, provide the definition of the autocorrelation function \(\rho(h)\)

  2. Draw the autocorrelation function for an MA(3) where \(\theta_1 =3\) , \(\theta_2=-2\), \(\theta_3=0.1\) and \(\sigma^2 = 2\).

Autoregressions (AR)

  • Many time series in the social sciences exhibit some inertia: past behavior \(Y_{t-h}\) may affect today’s value, i.e. \(Y_{t}\). Many non-systematic environmental factors may also alter \(Y_{t}\). Simply put, using our main building block \(\varepsilon_t\):

    • \(Y_t = a + \delta Y_{t-1} + \varepsilon_t\)
  • This is an AUTOregression, or first-order autoregressive model: AR(1)

  • Since there is a constant \(a\), this will not be WN even if we cancel our newly introduced lagged term by setting \(\delta = 0\). So what matters is to identify whether this is stationary.

  • To identify its expectation and \(\gamma(h)\) we need to rewrite it as a function of our building blocks \(\{\varepsilon_t\}\)

AR stationarity

  • We can forward or backward replace the lagged dependent variable to write it as a function of our WN process.

  • Backward replacement along the stochastic series \(\{Y_1, Y_2,...,Y_t\}\) means substituting \(Y_{t-1} = a + \delta Y_{t-2} + \varepsilon_{t-1}\) into \(Y_t\), and so on, replacing everything down to \(Y_1\) as a function of an initial condition \(Y_0\):

  • \(Y_t = a + a\delta + a\delta^2 + ... + a\delta^{t-1} + \delta^t Y_0 + \varepsilon_t + \delta \varepsilon_{t-1} + \delta^2 \varepsilon_{t-2} + ... + \delta^{t-1} \varepsilon_1\)

  • \(Y_t = a \sum_{j=0}^{t-1}\delta^j + \delta^t Y_0 + \sum_{j=0}^{t-1}\delta^j\varepsilon_{t-j}\)

  • Applying the expectation operator to identify the population mean : \(E (Y_t) = a \sum_{j=0}^{t-1}\delta^j + \delta^t Y_0 + \sum_{j=0}^{t-1}\delta^j\underbrace{E(\varepsilon_{t-j})}_0\)
    • This depends on \(t\) through the first and second terms.
  • If our time series is long, there is hope. Assume \(t \rightarrow \infty\):
    • The first term is the sum of a geometric series. If \(|\delta| < 1\), then: \(\lim_{t\rightarrow \infty} a \sum_{j=0}^{t-1}\delta^{j} = a (\frac{1}{1-\delta})\)
    • Clearly, the second term will tend to zero if \(|\delta| < 1\) as \(t \rightarrow \infty\)
  • Hence, a long TS that follows an AR(1) process with \(|\delta| < 1\) has a constant (time-invariant) mean.
  • Short processes, or those with \(|\delta| \geq 1\), are doomed to be non-stationary.
  • Let’s introduce some intuition regarding stationarity by running a simulation
clear all
set obs 1000         // Long Time Series
scalar sigma2  = 30  // Sigma2
scalar a       = 5   // a
scalar y0      = 100  // initial condition in Y_t
scalar delta_I    = 0.7
scalar delta_II   = 1
scalar delta_III  = 1.3
scalar lrm =  a*(1/(1-delta_I)) // Long run mean I
di lrm

gen E          = rnormal(0,sqrt(sigma2)) // Gaussian WN
gen time       = _n

tsset time
gen yI   = y0 in 1  //
gen yII  = y0 in 1  // Just to initialize our AR process
gen yIII = y0 in 1  //
  
* Simulation
dyngen{
update yI          = a + E + delta_I*l.yI       if time > 1
update yII         = a + E + delta_II*l.yII     if time > 1
update yIII        = a + E + delta_III*l.yIII   if time > 1
}


li *  in 950/1000

** Plot
tsset time
tsline yI in 950/1000 // Long ts
Number of observations (_N) was 0, now 1,000.

16.666667

Time variable: time, 1 to 1000
        Delta: 1 unit

(999 missing values generated)
(999 missing values generated)
(999 missing values generated)

      +-----------------------------------------------+
      |         E   time         yI        yII   yIII |
      |-----------------------------------------------|
 950. |  7.816747    950   35.15678   4693.931      . |
 951. |  5.871238    951   35.48099   4704.802      . |
 952. | -5.816751    952   24.01994   4703.985      . |
 953. |  7.427537    953    29.2415   4716.413      . |
 954. |  1.598201    954   27.06725   4723.011      . |
      |-----------------------------------------------|
 955. | -7.585894    955   16.36118   4720.425      . |
 956. | -4.136673    956   12.31615   4721.288      . |
 957. |  5.517688    957   19.13899   4731.806      . |
 958. |  9.490705    958     27.888   4746.296      . |
 959. | -2.827493    959   21.69411   4748.469      . |
      |-----------------------------------------------|
 960. |  1.446644    960   21.63252   4754.916      . |
 961. |  7.161493    961   27.30426   4767.077      . |
 962. |  4.269646    962   28.38263   4776.347      . |
 963. |  .0351305    963   24.90297   4781.382      . |
 964. |  1.880224    964    24.3123   4788.262      . |
      |-----------------------------------------------|
 965. |  1.011597    965   23.03021   4794.274      . |
 966. |  8.340838    966   29.46198   4807.615      . |
 967. |  1.541983    967   27.16537   4814.157      . |
 968. |  1.698817    968   25.71458   4820.855      . |
 969. | -.7546824    969   22.24552   4825.101      . |
      |-----------------------------------------------|
 970. | -4.848135    970   15.72373   4825.252      . |
 971. |  8.690218    971   24.69683   4838.943      . |
 972. |  .3269342    972   22.61472    4844.27      . |
 973. |   -6.3697    973    14.4606     4842.9      . |
 974. |  -6.62886    974   8.493561   4841.271      . |
      |-----------------------------------------------|
 975. | -9.420105    975   1.525388   4836.852      . |
 976. |  6.309196    976   12.37697   4848.161      . |
 977. | -7.326552    977   6.337325   4845.834      . |
 978. | -1.545349    978   7.890779   4849.289      . |
 979. | -5.470868    979   5.052677   4848.818      . |
      |-----------------------------------------------|
 980. | -.4659393    980   8.070934   4853.352      . |
 981. |  8.296366    981   18.94602   4866.648      . |
 982. | -.8155763    982   17.44664   4870.833      . |
 983. | -.8798494    983    16.3328   4874.953      . |
 984. |  8.523033    984   24.95599   4888.476      . |
      |-----------------------------------------------|
 985. | -2.717906    985   19.75129   4890.758      . |
 986. |  6.893144    986   25.71905   4902.651      . |
 987. |   10.9484    987   33.95173     4918.6      . |
 988. | -9.461301    988   19.30491   4914.138      . |
 989. |  3.767349    989   22.28078   4922.906      . |
      |-----------------------------------------------|
 990. |  8.361107    990   28.95766   4936.267      . |
 991. | -2.321234    991   22.94913   4938.946      . |
 992. | -15.17878    992   5.885604   4928.767      . |
 993. | -6.736923    993      2.383    4927.03      . |
 994. |  2.204907    994   8.873008   4934.235      . |
      |-----------------------------------------------|
 995. |   14.5068    995    25.7179   4953.742      . |
 996. | -.3824061    996   22.62013    4958.36      . |
 997. |  6.361985    997   27.19607   4969.722      . |
 998. |  3.286541    998   27.32379   4978.008      . |
 999. |  4.825727    999   28.95238   4987.834      . |
      |-----------------------------------------------|
1000. |  .6276147   1000   25.89428   4993.461      . |
      +-----------------------------------------------+


Time variable: time, 1 to 1000
        Delta: 1 unit
  • A TS that had a long past behind it…
** Plot
tsset time
tsline yI in 1/50 // short TS
Time variable: time, 1 to 1000
        Delta: 1 unit
  • A TS that had a short past…

** Plot
tsset time
tsline yI   // everything
Time variable: time, 1 to 1000
        Delta: 1 unit
  • The whole TS…

  • From our formulas, a process with \(|\delta| \geq 1\) cannot have a constant mean, not even with high \(t\). It makes things worse:
** Plot
tsset time
tsline yII   // everything
Time variable: time, 1 to 1000
        Delta: 1 unit
  • The whole TS for \(\delta = 1\)

  • Clearly time dependent. Indeed, with \(\delta = 1\) the recursion gives \(Y_t = Y_0 + at + \sum_{j=1}^{t}\varepsilon_j\), a random walk with drift, so it even seems to have a (deterministic) trend: \(Y_t = \beta_0 + \beta_1 t + \varepsilon_t\)

** Plot
tsset time
tsline yIII   // everything
Time variable: time, 1 to 1000
        Delta: 1 unit
  • The whole TS for \(\delta > 1\)

AR(1) and MA relationship

  • For a long TS with \(|\delta|<1\), the second term of \(Y_t = a \sum_{j=0}^{t-1}\delta^j + \delta^t Y_0 + \sum_{j=0}^{t-1}\delta^j\varepsilon_{t-j}\) vanishes and the first converges to the constant \(a/(1-\delta)\). Then what? Taking \(a = 0\) for simplicity:
    • \(Y_t =\sum_{j=0}^{t-1}\delta^j\varepsilon_{t-j}\)
    • \(Y_t = \varepsilon_t + \delta \varepsilon_{t-1} + \delta^2 \varepsilon_{t-2} +...+ \delta^{t-1} \varepsilon_1\)
  • So given a long TS, an AR(1) \(\equiv\) MA(\(\infty\)) !
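
A numerical illustration of this equivalence (a sketch; \(\delta\), the truncation lag \(q=20\) and the seed are assumptions): build an AR(1) recursively, build its truncated MA representation from the same shocks, and compare.

* Sketch: AR(1) versus its truncated MA(infinity) representation
clear all
set obs 200
set seed 123                    // assumed seed
scalar delta = 0.7
gen E    = rnormal(0,1)
gen time = _n
tsset time
gen yAR = E in 1                // initial condition Y_1 = e_1
dyngen{
update yAR = E + delta*l.yAR if time > 1
}
gen yMA = E                     // truncated MA: 20 weighted lags of E
forvalues j = 1/20 {
	replace yMA = yMA + delta^`j'*l`j'.E if time > `j'
}
corr yAR yMA                    // correlation should be near 1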

AR(1) Covariance function

  • Mean stationarity is only available given a long TS and \(|\delta| < 1\), so there is no point in studying the \(\gamma(h)\) function without these assumptions.

  • So instead of this solution for \(Y_t\): \(Y_t = a \sum_{j=0}^{t-1}\delta^j + \delta^t Y_0 + \sum_{j=0}^{t-1}\delta^j\varepsilon_{t-j}\)

  • We should use:

  • \(Y_t = \varepsilon_t + \delta \varepsilon_{t-1} + \delta^2 \varepsilon_{t-2} +...+ \delta^{t-1} \varepsilon_1\)

    • \(\gamma(0) = \frac{\sigma^2}{1-\delta^2}\)
    • \(\gamma(1) = \frac{\sigma^2 \delta}{1-\delta^2}\)
    • \(\gamma(j) = \frac{\sigma^2 \delta^j}{1-\delta^2}\)
  • Is this AR(1) process covariance stationary?

    • Indeed, there is no time dependence; an empirical check is sketched below.
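
A quick empirical check of these formulas (a sketch; the sample size and seed are assumptions): with \(\delta = 0.7\) and \(\sigma^2 = 2\), theory gives \(\gamma(0) = 2/0.51 \approx 3.92\) and \(\gamma(1) \approx 2.75\).

* Sketch: sample versus theoretical autocovariances of an AR(1)
clear all
set obs 100000                  // a long TS so sample moments settle
set seed 456                    // assumed seed
scalar delta  = 0.7
scalar sigma2 = 2
gen E    = rnormal(0, sqrt(sigma2))
gen time = _n
tsset time
gen y = E in 1
dyngen{
update y = E + delta*l.y if time > 1
}
gen ylag = l.y
corr y ylag, covariance         // sample gamma(0) and gamma(1)
di sigma2/(1-delta^2) _n sigma2*delta/(1-delta^2)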

AR(p) process

  • Stationarity of a more general autoregressive process AR(p) could be cumbersome to check using the previous approach

  • \(Y_t = a + \sum_{j=1}^p \alpha_j Y_{t-j} + \varepsilon_t\)

  • An alternative way to check for stationarity (in mean and covariance) is:

    • \(\sum_j \alpha_j < 1\); this is a necessary (not sufficient) condition
    • \(\sum_j |\alpha_j| < 1\); the signs of the coefficients may cancel out in the previous expression, so this one is a sufficient condition.
  • If \(\sum_j \alpha_j = 1\), there will not be stationarity: the implied difference equation has a unit root solution.

  • The caveat of this approach is that it dismisses the informative \(\gamma(h)\), but we will come back to this while studying the Box-Jenkins specification approach. A root-based check is sketched below.
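
A more general check uses the roots of the lag polynomial \(1 - \alpha_1 z - ... - \alpha_p z^p\): the AR(p) is stationary when every root lies outside the unit circle. A sketch with Mata’s polyroots() (the AR(2) coefficients are assumptions):

* Sketch: root check of an AR(2) lag polynomial via Mata
mata:
alpha = (0.7, 0.1)          // assumed AR(2) coefficients
c     = (1, -alpha)         // coefficients of 1 - a1*z - a2*z^2, ascending powers
r     = polyroots(c)        // roots of the lag polynomial
abs(r)                      // stationary if every modulus exceeds 1
end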

ARMA(p,q)

  • Stationarity of the more general ARMA(p,q) process depends solely on its AR(p) part

  • \(Y_t = a + \sum_{j=1}^{p} \alpha_j Y_{t-j} + \varepsilon_t + \sum_{j=1}^{q}\theta_j\varepsilon_{t-j}\)
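
As a simulation sketch of this model (the coefficients and seed are assumptions, with \(|\alpha_1| < 1\) so the AR part is stationary):

* Sketch: simulating an ARMA(1,1)
clear all
set obs 300
set seed 11                 // assumed seed
scalar a      = 1
scalar alpha1 = 0.5         // assumed AR coefficient, |alpha1| < 1
scalar theta1 = 0.4         // assumed MA coefficient
gen E    = rnormal(0,1)
gen time = _n
tsset time
gen y = a in 1              // initial condition
dyngen{
update y = a + alpha1*l.y + E + theta1*l.E if time > 1
}
tsline y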

Exercise

  1. Use Stata to simulate:

\(Y_t = 0.4 + 0.7Y_{t-1} + 0.1Y_{t-2} + \varepsilon_t + 2\varepsilon_{t-1} - 0.5\varepsilon_{t-2}\)

where the WN follows a N(0,4) distribution for T = 200. Look at the first 20 periods: does it look stationary? Use \(y_0 = 30\). Explain.