VAR

VAR(\(p\)) Model

The multivariate series \(\boldsymbol{z}_t\) follows a VAR model of order \(p\), denoted VAR(\(p\)), if

\[\boldsymbol{z}_{t}=\boldsymbol{\phi}_{0}+\sum_{i=1}^{p} \boldsymbol{\phi}_{i} \boldsymbol{z}_{t-i}+\boldsymbol{a}_{t}\]

VAR(\(1\))

\[ \boldsymbol{z}_{t}=\boldsymbol{\phi}_{0}+\boldsymbol{\phi}_{1} \boldsymbol{z}_{t-1}+\boldsymbol{a}_{t} \]

Writing the model out explicitly, we have:

\[ \left[\begin{array}{l} z_{1 t} \\ z_{2 t} \end{array}\right]=\left[\begin{array}{l} \phi_{10} \\ \phi_{20} \end{array}\right]+\left[\begin{array}{ll} \phi_{1,11} & \phi_{1,12} \\ \phi_{1,21} & \phi_{1,22} \end{array}\right]\left[\begin{array}{l} z_{1, t-1} \\ z_{2, t-1} \end{array}\right]+\left[\begin{array}{l} a_{1 t} \\ a_{2 t} \end{array}\right] \]

\[\begin{array}{l} z_{1 t}=\phi_{10}+\phi_{1,11} z_{1, t-1}+\phi_{1,12} z_{2, t-1}+a_{1 t} \\ z_{2 t}=\phi_{20}+\phi_{1,21} z_{1, t-1}+\phi_{1,22} z_{2, t-1}+a_{2 t} \end{array}\]
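
As an illustration, the two equations above can be simulated directly in R. This is a minimal sketch with hypothetical coefficient values (chosen only so that the process is stationary); none of these numbers come from the example below.

set.seed(123)
n    <- 200
phi0 <- c(0.5, 1.0)                                   # intercepts phi_10, phi_20 (hypothetical)
Phi1 <- matrix(c(0.5, 0.2,
                 0.1, 0.4), nrow = 2, byrow = TRUE)   # coefficient matrix phi_1 (hypothetical)
z <- matrix(0, nrow = n, ncol = 2)
for (t in 2:n) {
  a      <- rnorm(2)                                  # white-noise innovations a_t
  z[t, ] <- phi0 + Phi1 %*% z[t - 1, ] + a
}
plot.ts(z, main = "Simulated bivariate VAR(1)")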

Notes

Building the Model

Follow the iterative Box-Jenkins procedure.

Once the VAR(p) has been estimated, the user will typically be interested in one of the following elements:

Lag Order Selection

\[\begin{array}{l} \operatorname{AIC}(\ell)=\ln \left|\hat{\mathbf{\Sigma}}_{a, \ell}\right|+\frac{2}{T} \ell k^{2} \\ \operatorname{BIC}(\ell)=\ln \left|\hat{\mathbf{\Sigma}}_{a, \ell}\right|+\frac{\ln (T)}{T} \ell k^{2} \\ \operatorname{HQ}(\ell)=\ln \left|\hat{\mathbf{\Sigma}}_{a, \ell}\right|+\frac{2 \ln [\ln (T)]}{T} \ell k^{2} \end{array}\]
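
These criteria can also be computed by hand to see what the formulas are doing. A minimal sketch, assuming \(\hat{\mathbf{\Sigma}}_{a,\ell}\) is estimated with the divide-by-\(T\) (maximum-likelihood) convention; the values may differ slightly from those reported by VARselect (used in the example below) because of how the effective sample size is handled.

library(vars)
data(Canada)

k <- ncol(Canada)                                     # number of series
for (l in 1:4) {
  fit   <- VAR(Canada, p = l, type = "const")
  res   <- resid(fit)
  T_eff <- nrow(res)                                  # effective sample size
  Sigma <- crossprod(res) / T_eff                     # ML estimate of Sigma_a
  cat(sprintf("p=%d  AIC=%.3f  BIC=%.3f  HQ=%.3f\n",
              l,
              log(det(Sigma)) + 2 * l * k^2 / T_eff,
              log(det(Sigma)) + log(T_eff) * l * k^2 / T_eff,
              log(det(Sigma)) + 2 * log(log(T_eff)) * l * k^2 / T_eff))
}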

Serial Correlation Test

Causality

\[H_0: \ X_t \ \ \text{does not Granger-cause} \ \ Y_t \]
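
In the vars package this hypothesis can be tested with the causality() function on a fitted VAR. A minimal sketch, using the modelo_var object estimated in the example further below and testing whether employment (e) Granger-causes the remaining variables:

causality(modelo_var, cause = "e")$Granger   # small p-value => reject H0 of no Granger causality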

Example

Load the libraries

library(vars)        # VAR estimation and diagnostics (VAR, VARselect, serial.test, ...)
library(tseries)     # adf.test, pp.test
library(forecast)
library(urca)        # unit-root and cointegration tests
library(highcharter) # interactive charts (hchart)
data(Canada)         # Canadian labour-market data: e, prod, rw, U

Canada <- as.data.frame(Canada)

layout(matrix(1:4, nrow = 2, ncol = 2))
plot.ts(Canada$e, main = "Employment", ylab = "", xlab = "")
plot.ts(Canada$prod, main = "Productivity", ylab = "", xlab = "")
plot.ts(Canada$rw, main = "Real Wage", ylab = "", xlab = "")
plot.ts(Canada$U, main = "Unemployment Rate", ylab = "", xlab = "")


var_canada <- ts(Canada[, c(1, 2, 3, 4)], frequency = 4)  # the Canada series are quarterly

plot.ts(var_canada)

Stationarity tests

pp.test(var_canada[,1])
## 
##  Phillips-Perron Unit Root Test
## 
## data:  var_canada[, 1]
## Dickey-Fuller Z(alpha) = -5.8701, Truncation lag parameter = 3, p-value
## = 0.7735
## alternative hypothesis: stationary
adf.test(var_canada[,1]) # not stationary
## 
##  Augmented Dickey-Fuller Test
## 
## data:  var_canada[, 1]
## Dickey-Fuller = -2.148, Lag order = 4, p-value = 0.5152
## alternative hypothesis: stationary
pp.test(var_canada[,2])
## 
##  Phillips-Perron Unit Root Test
## 
## data:  var_canada[, 2]
## Dickey-Fuller Z(alpha) = -7.4247, Truncation lag parameter = 3, p-value
## = 0.6816
## alternative hypothesis: stationary
adf.test(var_canada[,2]) # not stationary
## 
##  Augmented Dickey-Fuller Test
## 
## data:  var_canada[, 2]
## Dickey-Fuller = -2.8725, Lag order = 4, p-value = 0.218
## alternative hypothesis: stationary

Stationarity tests: wage

pp.test(var_canada[,3])
## 
##  Phillips-Perron Unit Root Test
## 
## data:  var_canada[, 3]
## Dickey-Fuller Z(alpha) = -4.6296, Truncation lag parameter = 3, p-value
## = 0.8468
## alternative hypothesis: stationary
adf.test(var_canada[,3]) # not stationary
## 
##  Augmented Dickey-Fuller Test
## 
## data:  var_canada[, 3]
## Dickey-Fuller = -2.0558, Lag order = 4, p-value = 0.553
## alternative hypothesis: stationary

Stationarity tests: unemployment

pp.test(var_canada[,4])
## 
##  Phillips-Perron Unit Root Test
## 
## data:  var_canada[, 4]
## Dickey-Fuller Z(alpha) = -7.1861, Truncation lag parameter = 3, p-value
## = 0.6957
## alternative hypothesis: stationary
adf.test(var_canada[,4]) # not stationary
## 
##  Augmented Dickey-Fuller Test
## 
## data:  var_canada[, 4]
## Dickey-Fuller = -2.5988, Lag order = 4, p-value = 0.3303
## alternative hypothesis: stationary

Employment with one difference

adf.test(diff(var_canada[,1]))
## 
##  Augmented Dickey-Fuller Test
## 
## data:  diff(var_canada[, 1])
## Dickey-Fuller = -3.2668, Lag order = 4, p-value = 0.08274
## alternative hypothesis: stationary
pp.test(diff(var_canada[,1])) # stationary according to Phillips-Perron
## 
##  Phillips-Perron Unit Root Test
## 
## data:  diff(var_canada[, 1])
## Dickey-Fuller Z(alpha) = -26.227, Truncation lag parameter = 3, p-value
## = 0.01233
## alternative hypothesis: stationary

Productivity with one difference

adf.test(diff(var_canada[,2]))
## 
##  Augmented Dickey-Fuller Test
## 
## data:  diff(var_canada[, 2])
## Dickey-Fuller = -3.2361, Lag order = 4, p-value = 0.08775
## alternative hypothesis: stationary
pp.test(diff(var_canada[,2])) # stationary according to Phillips-Perron
## Warning in pp.test(diff(var_canada[, 2])): p-value smaller than printed p-value
## 
##  Phillips-Perron Unit Root Test
## 
## data:  diff(var_canada[, 2])
## Dickey-Fuller Z(alpha) = -60.867, Truncation lag parameter = 3, p-value
## = 0.01
## alternative hypothesis: stationary

Wage with one difference

adf.test(diff(var_canada[,3]))
## 
##  Augmented Dickey-Fuller Test
## 
## data:  diff(var_canada[, 3])
## Dickey-Fuller = -3.2352, Lag order = 4, p-value = 0.08789
## alternative hypothesis: stationary
pp.test(diff(var_canada[,3])) # stationary according to Phillips-Perron
## Warning in pp.test(diff(var_canada[, 3])): p-value smaller than printed p-value
## 
##  Phillips-Perron Unit Root Test
## 
## data:  diff(var_canada[, 3])
## Dickey-Fuller Z(alpha) = -61.576, Truncation lag parameter = 3, p-value
## = 0.01
## alternative hypothesis: stationary

Unemployment with one difference

adf.test(diff(var_canada[,4]))
## 
##  Augmented Dickey-Fuller Test
## 
## data:  diff(var_canada[, 4])
## Dickey-Fuller = -3.7325, Lag order = 4, p-value = 0.02698
## alternative hypothesis: stationary
pp.test(diff(var_canada[,4]))  # stationary; passes both tests
## Warning in pp.test(diff(var_canada[, 4])): p-value smaller than printed p-value
## 
##  Phillips-Perron Unit Root Test
## 
## data:  diff(var_canada[, 4])
## Dickey-Fuller Z(alpha) = -38.946, Truncation lag parameter = 3, p-value
## = 0.01
## alternative hypothesis: stationary
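
The same battery of tests can also be run more compactly with a loop over the columns. A minimal sketch that prints only the p-values for each series, in levels and in first differences (adf.test and pp.test may warn when a p-value falls outside the printed range):

for (j in 1:ncol(var_canada)) {
  x <- var_canada[, j]
  cat(colnames(var_canada)[j],
      " ADF(level):", round(adf.test(x)$p.value, 3),
      " PP(level):",  round(pp.test(x)$p.value, 3),
      " ADF(diff):",  round(adf.test(diff(x))$p.value, 3),
      " PP(diff):",   round(pp.test(diff(x))$p.value, 3), "\n")
}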

Difference of all the variables

var_canada_dif<-diff(var_canada[,c(1,2,3,4)])  
plot.ts(var_canada_dif)

hchart(var_canada_dif)

Once the series are stationary, we proceed with the estimation.

VARselect(var_canada_dif,lag.max=8, type = "const")
## $selection
## AIC(n)  HQ(n)  SC(n) FPE(n) 
##      2      1      1      2 
## 
## $criteria
##                   1            2           3            4            5
## AIC(n) -6.270032208 -6.285458074 -6.03818873 -6.035665365 -5.808725091
## HQ(n)  -6.023272860 -5.841291247 -5.39661443 -5.196683581 -4.772335828
## SC(n)  -5.652035378 -5.173063780 -4.43139698 -3.934476142 -3.213138403
## FPE(n)  0.001893667  0.001871884  0.00241986  0.002469804  0.003191486
##                   6            7           8
## AIC(n) -5.558322503 -5.370524857 -5.33131938
## HQ(n)  -4.324525761 -3.939320637 -3.70270768
## SC(n)  -2.468338351 -1.786143241 -1.25254030
## FPE(n)  0.004286004  0.005511821  0.00626064

Model

Number of lags to include in the model: p = 2, the order suggested above by the AIC and FPE criteria. Note that the VAR below is fitted to the Canada series in levels.

modelo_var<-VAR(Canada, p=2)
summary(modelo_var)
## 
## VAR Estimation Results:
## ========================= 
## Endogenous variables: e, prod, rw, U 
## Deterministic variables: const 
## Sample size: 82 
## Log Likelihood: -175.819 
## Roots of the characteristic polynomial:
## 0.995 0.9081 0.9081 0.7381 0.7381 0.1856 0.1429 0.1429
## Call:
## VAR(y = Canada, p = 2)
## 
## 
## Estimation results for equation e: 
## ================================== 
## e = e.l1 + prod.l1 + rw.l1 + U.l1 + e.l2 + prod.l2 + rw.l2 + U.l2 + const 
## 
##           Estimate Std. Error t value Pr(>|t|)    
## e.l1     1.638e+00  1.500e-01  10.918  < 2e-16 ***
## prod.l1  1.673e-01  6.114e-02   2.736  0.00780 ** 
## rw.l1   -6.312e-02  5.524e-02  -1.143  0.25692    
## U.l1     2.656e-01  2.028e-01   1.310  0.19444    
## e.l2    -4.971e-01  1.595e-01  -3.116  0.00262 ** 
## prod.l2 -1.017e-01  6.607e-02  -1.539  0.12824    
## rw.l2    3.844e-03  5.552e-02   0.069  0.94499    
## U.l2     1.327e-01  2.073e-01   0.640  0.52418    
## const   -1.370e+02  5.585e+01  -2.453  0.01655 *  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## 
## Residual standard error: 0.3628 on 73 degrees of freedom
## Multiple R-Squared: 0.9985,  Adjusted R-squared: 0.9984 
## F-statistic:  6189 on 8 and 73 DF,  p-value: < 2.2e-16 
## 
## 
## Estimation results for equation prod: 
## ===================================== 
## prod = e.l1 + prod.l1 + rw.l1 + U.l1 + e.l2 + prod.l2 + rw.l2 + U.l2 + const 
## 
##           Estimate Std. Error t value Pr(>|t|)    
## e.l1      -0.17277    0.26977  -0.640  0.52390    
## prod.l1    1.15043    0.10995  10.464 3.57e-16 ***
## rw.l1      0.05130    0.09934   0.516  0.60710    
## U.l1      -0.47850    0.36470  -1.312  0.19362    
## e.l2       0.38526    0.28688   1.343  0.18346    
## prod.l2   -0.17241    0.11881  -1.451  0.15104    
## rw.l2     -0.11885    0.09985  -1.190  0.23778    
## U.l2       1.01592    0.37285   2.725  0.00805 ** 
## const   -166.77552  100.43388  -1.661  0.10109    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## 
## Residual standard error: 0.6525 on 73 degrees of freedom
## Multiple R-Squared: 0.9787,  Adjusted R-squared: 0.9764 
## F-statistic: 419.3 on 8 and 73 DF,  p-value: < 2.2e-16 
## 
## 
## Estimation results for equation rw: 
## =================================== 
## rw = e.l1 + prod.l1 + rw.l1 + U.l1 + e.l2 + prod.l2 + rw.l2 + U.l2 + const 
## 
##           Estimate Std. Error t value Pr(>|t|)    
## e.l1     -0.268833   0.322619  -0.833    0.407    
## prod.l1  -0.081065   0.131487  -0.617    0.539    
## rw.l1     0.895478   0.118800   7.538 1.04e-10 ***
## U.l1      0.012130   0.436149   0.028    0.978    
## e.l2      0.367849   0.343087   1.072    0.287    
## prod.l2  -0.005181   0.142093  -0.036    0.971    
## rw.l2     0.052677   0.119410   0.441    0.660    
## U.l2     -0.127708   0.445892  -0.286    0.775    
## const   -33.188339 120.110525  -0.276    0.783    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## 
## Residual standard error: 0.7803 on 73 degrees of freedom
## Multiple R-Squared: 0.9989,  Adjusted R-squared: 0.9987 
## F-statistic:  8009 on 8 and 73 DF,  p-value: < 2.2e-16 
## 
## 
## Estimation results for equation U: 
## ================================== 
## U = e.l1 + prod.l1 + rw.l1 + U.l1 + e.l2 + prod.l2 + rw.l2 + U.l2 + const 
## 
##          Estimate Std. Error t value Pr(>|t|)    
## e.l1     -0.58076    0.11563  -5.023 3.49e-06 ***
## prod.l1  -0.07812    0.04713  -1.658 0.101682    
## rw.l1     0.01866    0.04258   0.438 0.662463    
## U.l1      0.61893    0.15632   3.959 0.000173 ***
## e.l2      0.40982    0.12296   3.333 0.001352 ** 
## prod.l2   0.05212    0.05093   1.023 0.309513    
## rw.l2     0.04180    0.04280   0.977 0.331928    
## U.l2     -0.07117    0.15981  -0.445 0.657395    
## const   149.78056   43.04810   3.479 0.000851 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## 
## Residual standard error: 0.2797 on 73 degrees of freedom
## Multiple R-Squared: 0.9726,  Adjusted R-squared: 0.9696 
## F-statistic:   324 on 8 and 73 DF,  p-value: < 2.2e-16 
## 
## 
## 
## Covariance matrix of residuals:
##              e      prod       rw        U
## e     0.131635 -0.007469 -0.04210 -0.06909
## prod -0.007469  0.425711  0.06461  0.01392
## rw   -0.042099  0.064613  0.60886  0.03422
## U    -0.069087  0.013923  0.03422  0.07821
## 
## Correlation matrix of residuals:
##             e     prod      rw       U
## e     1.00000 -0.03155 -0.1487 -0.6809
## prod -0.03155  1.00000  0.1269  0.0763
## rw   -0.14870  0.12691  1.0000  0.1568
## U    -0.68090  0.07630  0.1568  1.0000

Plot of the model

plot(modelo_var)

Tests on the model

Stationarity (stability) of the multivariate system

roots(modelo_var)
## [1] 0.9950338 0.9081062 0.9081062 0.7380565 0.7380565 0.1856381 0.1428889
## [8] 0.1428889

Since all the moduli are less than 1, the model satisfies the stationarity (stability) assumption.
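
The values reported by roots() are the moduli of the eigenvalues of the companion matrix of the VAR. A minimal sketch of the same check done by hand for a VAR(2) such as modelo_var:

A <- Acoef(modelo_var)                        # coefficient matrices Phi_1, Phi_2
k <- ncol(A[[1]])
companion <- rbind(cbind(A[[1]], A[[2]]),
                   cbind(diag(k), matrix(0, k, k)))
sort(Mod(eigen(companion)$values), decreasing = TRUE)   # should match roots(modelo_var)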

Checking for autocorrelation in the errors

We use the adjusted Portmanteau test through the serial.test command (type = "PT.adjusted"); serial.test also offers a Breusch-Godfrey version with type = "BG".

serial.test(modelo_var,lags.pt=16,type = "PT.adjusted")
## 
##  Portmanteau Test (adjusted)
## 
## data:  Residuals of VAR object modelo_var
## Chi-squared = 231.59, df = 224, p-value = 0.3497

Since the p-value (0.3497) is greater than 0.05, we do not reject the null hypothesis: there is no evidence of autocorrelation in the errors.

Normality test

normality.test(modelo_var,multivariate.only = FALSE)
## $e
## 
##  JB-Test (univariate)
## 
## data:  Residual of e equation
## Chi-squared = 0.15347, df = 2, p-value = 0.9261
## 
## 
## $prod
## 
##  JB-Test (univariate)
## 
## data:  Residual of prod equation
## Chi-squared = 4.2651, df = 2, p-value = 0.1185
## 
## 
## $rw
## 
##  JB-Test (univariate)
## 
## data:  Residual of rw equation
## Chi-squared = 0.33482, df = 2, p-value = 0.8459
## 
## 
## $U
## 
##  JB-Test (univariate)
## 
## data:  Residual of U equation
## Chi-squared = 0.56643, df = 2, p-value = 0.7534
## 
## 
## $JB
## 
##  JB-Test (multivariate)
## 
## data:  Residuals of VAR object modelo_var
## Chi-squared = 5.094, df = 8, p-value = 0.7475
## 
## 
## $Skewness
## 
##  Skewness only (multivariate)
## 
## data:  Residuals of VAR object modelo_var
## Chi-squared = 1.7761, df = 4, p-value = 0.7769
## 
## 
## $Kurtosis
## 
##  Kurtosis only (multivariate)
## 
## data:  Residuals of VAR object modelo_var
## Chi-squared = 3.3179, df = 4, p-value = 0.5061

All the Jarque-Bera p-values (univariate and multivariate) are above 0.05, so we do not reject normality of the model's errors.