Packages

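The library() calls are not echoed in this output; a minimal set consistent with the loading messages below would be:

library(dLagM)      #koyckDlm(), dlm(), ardlDlm(), finiteDLMauto(), GoF(); also attaches dynlm and zoo
library(MLmetrics)  #MAPE()
library(lmtest)     #dwtest(), bptest(), encomptest()
library(car)        #loads carData
library(readxl)     #read_xlsx()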
## Warning: package 'dLagM' was built under R version 4.3.3
## Loading required package: nardl
## Warning: package 'nardl' was built under R version 4.3.3
## Registered S3 method overwritten by 'quantmod':
##   method            from
##   as.zoo.data.frame zoo
## Loading required package: dynlm
## Warning: package 'dynlm' was built under R version 4.3.3
## Loading required package: zoo
## Warning: package 'zoo' was built under R version 4.3.2
## 
## Attaching package: 'zoo'
## The following objects are masked from 'package:base':
## 
##     as.Date, as.Date.numeric
## Warning: package 'MLmetrics' was built under R version 4.3.3
## 
## Attaching package: 'MLmetrics'
## The following object is masked from 'package:dLagM':
## 
##     MAPE
## The following object is masked from 'package:base':
## 
##     Recall
## Warning: package 'lmtest' was built under R version 4.3.2
## Loading required package: carData
## Warning: package 'readxl' was built under R version 4.3.2

Import Data

data <- read_xlsx("C:/Users/Muhammad Rizqa Salas/Downloads/Semester 5/MPDW/Project/Samsung.xlsx")
str(data)
## tibble [50 × 4] (S3: tbl_df/tbl/data.frame)
##  $ t     : num [1:50] 1 2 3 4 5 6 7 8 9 10 ...
##  $ Yt    : num [1:50] 7040 7020 7110 7010 7070 6980 6990 7150 7040 7060 ...
##  $ Y(t-1): num [1:50] NA 7040 7020 7110 7010 7070 6980 6990 7150 7040 ...
##  $ Xt    : num [1:50] 8395448 9155219 9422097 9584939 10144957 ...
data
## # A tibble: 50 × 4
##        t    Yt `Y(t-1)`       Xt
##    <dbl> <dbl>    <dbl>    <dbl>
##  1     1  7040       NA  8395448
##  2     2  7020     7040  9155219
##  3     3  7110     7020  9422097
##  4     4  7010     7110  9584939
##  5     5  7070     7010 10144957
##  6     6  6980     7070 10295316
##  7     7  6990     6980 10296217
##  8     8  7150     6990 10528252
##  9     9  7040     7150 10783368
## 10    10  7060     7040 10874505
## # ℹ 40 more rows

Data Splitting

#SPLIT DATA
train<-data[1:40,]
test<-data[41:50,]
#convert to time series objects
train.ts<-ts(train)
test.ts<-ts(test)
data.ts<-ts(data)

Koyck Model

The Koyck model is based on the assumption that the further back the lag of the independent variable lies from the current period, the smaller its influence on the dependent variable.

Koyck proposed a method for estimating dynamic distributed lag models under the assumption that all the \(\beta\) coefficients have the same sign.

The Koyck model is the most common form of the infinite distributed lag model and is also known as the geometric lag model:

\[ Y_t=\alpha(1-\lambda)+\beta_0X_t+\lambda Y_{t-1}+V_t \]

with \[V_t=u_t-\lambda u_{t-1}\]
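
For context, this estimating equation comes from the infinite geometric distributed lag model

\[ Y_t=\alpha+\beta_0\left(X_t+\lambda X_{t-1}+\lambda^2X_{t-2}+\dots\right)+u_t, \qquad 0<\lambda<1 \]

by lagging it one period, multiplying by \(\lambda\), and subtracting, which collapses the infinite lag structure into the single term \(\lambda Y_{t-1}\).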

Modeling

The koyckDlm() function fits a distributed lag model with the Koyck transformation for a single predictor. The x and y values need not be time series (ts) objects. intercept can be set to TRUE to include an intercept in the model.

#KOYCK MODEL
model.koyck <- koyckDlm(x = train$Xt, y = train$Yt)
summary(model.koyck)
## 
## Call:
## "Y ~ (Intercept) + Y.1 + X.t"
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -148.95  -63.37  -15.49   38.29  320.74 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 2.076e+03  8.466e+02   2.452   0.0192 *  
## Y.1         6.524e-01  1.410e-01   4.628 4.66e-05 ***
## X.t         3.710e-05  1.567e-05   2.368   0.0234 *  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 98.15 on 36 degrees of freedom
## Multiple R-Squared: 0.893,   Adjusted R-squared: 0.8871 
## Wald test:   149 on 2 and 36 DF,  p-value: < 2.2e-16 
## 
## Diagnostic tests:
## NULL
## 
##                             alpha         beta       phi
## Geometric coefficients:  5971.489 3.710389e-05 0.6524284
AIC(model.koyck)
## [1] 473.3004
BIC(model.koyck)
## [1] 479.9547

From these results, both \(x_t\) and \(y_{t-1}\) have \(p\)-values \(<0.05\), indicating that \(x_t\) and \(y_{t-1}\) have a significant effect on \(y\). The overall fitted model is as follows:

\[ \hat{Y}_t=2076+0.0000371X_t+0.6524Y_{t-1} \]
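
The "Geometric coefficients" row of the summary is consistent with these estimates: \(\hat{\lambda}=0.6524\) is the coefficient on \(Y_{t-1}\), \(\hat{\beta}_0=3.71\times10^{-5}\) is the coefficient on \(X_t\), and the reported alpha is recovered from the intercept via \(\hat{a}=\hat{\alpha}(1-\hat{\lambda})\):

\[ \hat{\alpha}=\frac{2076}{1-0.6524}\approx 5971.5 \]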

Forecasting and Accuracy

Below are the forecasts of \(y\) for the next 10 periods using the Koyck model:

fore.koyck <- forecast(model = model.koyck, x=test$Xt, h=10) #h matches the test set length
fore.koyck
## $forecasts
##  [1] 7865.012 7941.341 8022.682 8109.644 8168.094 8219.227 8255.346 8339.158
##  [9] 8536.826 8771.842
## 
## $call
## forecast.koyckDlm(model = model.koyck, x = test$Xt, h = 10)
## 
## attr(,"class")
## [1] "forecast.koyckDlm" "dLagM"
mape.koyck <- MAPE(fore.koyck$forecasts, test$Yt)
#training-set accuracy
GoF(model.koyck)
##              n     MAE           MPE        MAPE       sMAPE      MASE      MSE
## model.koyck 39 71.2219 -0.0001729442 0.009744656 0.009766829 0.9205551 8891.968
##                    MRAE    GMRAE
## model.koyck 13753913681 3.590582
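
MAPE() from MLmetrics is the mean absolute percentage error; for reference, the test-set value can be reproduced by hand:

#should match MAPE(fore.koyck$forecasts, test$Yt)
mean(abs((test$Yt - fore.koyck$forecasts) / test$Yt))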

Regression with Distributed Lag

The dlm() function fits a distributed lag model with one or more predictors. The x and y values need not be time series (ts) objects. \(q\) is an integer giving the finite lag length.

Modeling (Lag = 2)

model.dlm <- dlm(x = train$Xt,y = train$Yt , q = 2)
summary(model.dlm)
## 
## Call:
## lm(formula = model.formula, data = design)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -191.25  -66.46  -15.54   56.02  304.87 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  6.024e+03  1.037e+02  58.093  < 2e-16 ***
## x.t          2.432e-04  7.907e-05   3.076  0.00413 ** 
## x.1         -3.247e-05  1.070e-04  -0.304  0.76334    
## x.2         -1.162e-04  7.676e-05  -1.513  0.13947    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 114.8 on 34 degrees of freedom
## Multiple R-squared:  0.8582, Adjusted R-squared:  0.8457 
## F-statistic: 68.61 on 3 and 34 DF,  p-value: 1.67e-14
## 
## AIC and BIC values for the model:
##        AIC      BIC
## 1 474.1087 482.2966
AIC(model.dlm)
## [1] 474.1087
BIC(model.dlm)
## [1] 482.2966

From the output above, the \(p\)-values of \(x_{t-1}\) and \(x_{t-2}\) are greater than 0.05, so these lags have no significant effect on \(y\); the intercept and \(x_t\), by contrast, are significant. The overall fitted model is as follows:

\[ \hat{Y}_t=6024+0.0002432X_t-0.00003247X_{t-1}-0.0001162X_{t-2} \]
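
A standard summary of a finite DLM is the total (long-run) effect of \(X\), the sum of the lag coefficients:

\[ \sum_{i=0}^{2}\hat{\beta}_i=0.0002432-0.00003247-0.0001162\approx 0.0000945 \]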

Forecasting and Accuracy

Below are the forecasts of \(y\) for the next 10 periods:

fore.dlm <- forecast(model = model.dlm, x=test$Xt, h=10)
fore.dlm
## $forecasts
##  [1] 7989.065 8084.471 8168.180 8297.425 8180.258 8157.868 8159.208 8511.029
##  [9] 9386.954 9768.391
## 
## $call
## forecast.dlm(model = model.dlm, x = test$Xt, h = 10)
## 
## attr(,"class")
## [1] "forecast.dlm" "dLagM"
mape.dlm <- MAPE(fore.dlm$forecasts, test$Yt)
#training-set accuracy
GoF(model.dlm)
##            n      MAE           MPE       MAPE     sMAPE     MASE      MSE
## model.dlm 38 82.16513 -0.0002243657 0.01120379 0.0112245 1.066705 11795.92
##                  MRAE    GMRAE
## model.dlm 24341762183 4.183363

Optimum Lag

#determine the optimum lag
finiteDLMauto(formula = Yt ~ Xt,
              data = data.frame(train), q.min = 1, q.max = 18,
              model.type = "dlm", error.type = "AIC", trace = FALSE)
##    q - k    MASE      AIC      BIC   GMRAE   MBRAE R.Adj.Sq  Ljung-Box
## 18    18 0.08691 206.1572 229.0691 1.18676 0.12106  0.98271 0.02685975

Based on this output, the optimum lag is obtained at lag = 18, which sits at the boundary of the search grid (q.max = 18). A model is then fitted with lag = 18.

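To inspect the criteria for every lag rather than only the best row, the same call can be rerun with trace = TRUE:

#print the fit measures for each q from q.min to q.max
finiteDLMauto(formula = Yt ~ Xt,
              data = data.frame(train), q.min = 1, q.max = 18,
              model.type = "dlm", error.type = "AIC", trace = TRUE)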

#DLM with the optimum lag
model.dlm2 <- dlm(x = train$Xt,y = train$Yt , q = 18)
summary(model.dlm2)
## 
## Call:
## lm(formula = model.formula, data = design)
## 
## Residuals:
##       1       2       3       4       5       6       7       8       9      10 
##   8.045 -11.334   6.210  -4.923  -4.366   4.564 -18.193  27.287  -5.058 -17.321 
##      11      12      13      14      15      16      17      18      19      20 
##   6.948  10.123   1.678  -2.000  -4.001   4.049   4.540  -7.787  -3.914  14.414 
##      21      22 
##  -6.707  -2.254 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)  
## (Intercept)  2.912e+03  2.553e+03   1.141   0.3722  
## x.t          3.577e-04  1.185e-04   3.019   0.0944 .
## x.1         -2.779e-05  1.266e-04  -0.220   0.8466  
## x.2         -5.007e-04  1.382e-04  -3.623   0.0685 .
## x.3         -1.293e-04  7.460e-05  -1.733   0.2253  
## x.4          1.220e-04  9.120e-05   1.338   0.3129  
## x.5          3.804e-04  7.644e-05   4.976   0.0381 *
## x.6          1.194e-04  8.062e-05   1.481   0.2768  
## x.7         -3.676e-04  8.075e-05  -4.552   0.0450 *
## x.8         -7.610e-05  1.330e-04  -0.572   0.6248  
## x.9         -7.153e-04  1.365e-04  -5.239   0.0346 *
## x.10        -1.160e-04  2.342e-04  -0.496   0.6693  
## x.11        -2.764e-04  2.548e-04  -1.085   0.3914  
## x.12         1.112e-03  1.963e-04   5.666   0.0298 *
## x.13         9.153e-04  1.838e-04   4.981   0.0380 *
## x.14        -6.841e-05  3.114e-04  -0.220   0.8465  
## x.15        -7.329e-05  1.404e-04  -0.522   0.6538  
## x.16        -3.805e-04  1.326e-04  -2.870   0.1030  
## x.17         3.804e-04  1.751e-04   2.172   0.1620  
## x.18        -2.412e-04  1.144e-04  -2.108   0.1696  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 33.48 on 2 degrees of freedom
## Multiple R-squared:  0.9984, Adjusted R-squared:  0.9827 
## F-statistic: 63.83 on 19 and 2 DF,  p-value: 0.01553
## 
## AIC and BIC values for the model:
##        AIC      BIC
## 1 206.1572 229.0691
AIC(model.dlm2)
## [1] 206.1572
BIC(model.dlm2)
## [1] 229.0691

From these results, the variables significant at the 5% level are \(x_{t-5}\), \(x_{t-7}\), \(x_{t-9}\), \(x_{t-12}\), and \(x_{t-13}\) (note that with q = 18 only 2 residual degrees of freedom remain). The overall fitted model is

\[ \hat{Y}_t=2912+0.0003577X_t+...-0.0002412X_{t-18} \]
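
Rather than typing out all 20 terms, the full coefficient vector can be extracted from the underlying lm fit (assuming, as the summary output suggests, that dLagM stores it in the $model component):

coef(model.dlm2$model)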

The forecasts of the next 10 periods from this model are as follows:

#forecasting and accuracy
fore.dlm2 <- forecast(model = model.dlm2, x=test$Xt, h=10) #h matches the test set length
mape.dlm2<- MAPE(fore.dlm2$forecasts, test$Yt)
mape.dlm2
## [1] 0.1510791
#training-set accuracy
GoF(model.dlm2)
##             n     MAE           MPE        MAPE       sMAPE      MASE      MSE
## model.dlm2 22 7.98714 -1.954565e-06 0.001080265 0.001080384 0.0869067 101.8819
##                  MRAE    GMRAE
## model.dlm2 4336467747 1.186756

On the training data this model fits extremely well, with a MAPE well below 10% (about 0.1%); the test-set MAPE of roughly 15%, however, is much higher, which is unsurprising given the 2 residual degrees of freedom.

Autoregressive Model

When the dependent variable is influenced by the independent variable at the current time and also by the dependent variable itself one period earlier, the model is called autoregressive.

Modeling

Here \(p\) is an integer giving the finite lag length of the predictor and \(q\) is an integer giving the order of the autoregressive process.

model.ardl <- ardlDlm(formula = Yt ~ Xt, 
                         data = train,p = 1 , q = 1)
summary(model.ardl)
## 
## Time series regression with "ts" data:
## Start = 2, End = 40
## 
## Call:
## dynlm(formula = as.formula(model.text), data = data)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -134.31  -54.40   -6.58   42.18  331.43 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  2.541e+03  7.987e+02   3.182 0.003062 ** 
## Xt.t         1.615e-04  6.198e-05   2.606 0.013362 *  
## Xt.1        -1.195e-04  6.128e-05  -1.951 0.059132 .  
## Yt.1         5.757e-01  1.326e-01   4.343 0.000115 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 94.26 on 35 degrees of freedom
## Multiple R-squared:  0.9041, Adjusted R-squared:  0.8959 
## F-statistic:   110 on 3 and 35 DF,  p-value: < 2.2e-16
AIC(model.ardl)
## [1] 471.0503
BIC(model.ardl)
## [1] 479.3682

The output above shows that the p-value of \(x_{t-1}\) is \(\ge0.05\), so \(x_{t-1}\) is not significant at the 5% level, while the intercept, \(x_t\), and \(y_{t-1}\) do have a significant effect on \(y_t\). The overall model is as follows:

\[ \hat{Y}_t=2541+0.0001615X_t-0.0001195X_{t-1}+0.5757Y_{t-1} \]
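
As a sanity check, the first test-period forecast can be reproduced by hand from these coefficients (a sketch, again assuming the fitted dynlm object sits in the $model component):

#one-step-ahead forecast for t = 41 from the ARDL(1,1) fit
b <- coef(model.ardl$model) #(Intercept), Xt.t, Xt.1, Yt.1
b[1] + b[2]*test$Xt[1] + b[3]*train$Xt[40] + b[4]*train$Yt[40]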

Forecasting and Accuracy

fore.ardl <- forecast(model = model.ardl, x=test$Xt, h=10)
fore.ardl
## $forecasts
##  [1] 7935.608 8008.174 8120.039 8230.359 8192.142 8221.205 8208.067 8453.882
##  [9] 9023.745 9352.820
## 
## $call
## forecast.ardlDlm(model = model.ardl, x = test$Xt, h = 10)
## 
## attr(,"class")
## [1] "forecast.ardlDlm" "dLagM"

The output above gives the forecasts for the next 10 periods using the autoregressive model with \(p=1\) and \(q=1\).

mape.ardl <- MAPE(fore.ardl$forecasts, test$Yt)
mape.ardl
## [1] 0.05947607
#training-set accuracy
GoF(model.ardl)
##             n      MAE           MPE        MAPE       sMAPE      MASE      MSE
## model.ardl 39 65.49411 -0.0001514317 0.008945152 0.008965785 0.8465225 7973.882
##                   MRAE    GMRAE
## model.ardl 22186406727 2.608117

Based on the accuracy measures above, the training MAPE (about 0.9%) and the test MAPE (about 5.9%) are both small, so this distributed-lag regression model does not appear to be overfitted or underfitted.

Optimum Lag

#determine the optimum lag orders
model.ardl.opt <- ardlBoundOrders(data = data.frame(data), ic = "AIC", 
                                  formula = Yt ~ Xt )
model.ardl.opt
## $p
##   Xt
## 1 15
## 
## $q
## [1] 15
## 
## $Stat.table
##           q = 1    q = 2    q = 3    q = 4    q = 5    q = 6    q = 7    q = 8
## p = 1  585.8340 574.2951 563.7604 552.8971 542.8644 532.9985 522.7288 511.1220
## p = 2  573.9919 575.7773 564.2704 553.6469 543.5246 533.1524 523.0102 511.3433
## p = 3  563.6533 563.6533 564.1103 553.8595 543.5504 532.6843 523.1214 510.9639
## p = 4  552.9870 554.0704 554.0704 555.1839 545.4057 534.5345 524.9157 512.8574
## p = 5  542.3305 543.8020 545.7722 545.7722 547.3939 536.3250 526.7624 514.2405
## p = 6  534.2348 536.2186 537.2349 538.2477 538.2477 538.1846 528.7084 516.0771
## p = 7  519.6483 521.6475 521.3059 523.2978 524.2194 524.2194 526.1409 515.9721
## p = 8  508.1465 510.0329 508.9651 509.8519 511.4627 512.9583 512.9583 514.4724
## p = 9  500.0714 502.0634 502.0835 502.6798 504.2130 504.4255 504.5926 504.5926
## p = 10 491.1310 493.1183 493.0272 493.9066 494.9226 490.7479 491.4781 493.1068
## p = 11 482.7714 484.6872 485.2607 486.1866 487.7783 483.8974 481.1302 482.9043
## p = 12 468.4597 470.1448 469.5604 471.5212 473.4962 470.6280 471.1052 472.7515
## p = 13 457.6861 457.6541 459.2226 461.0853 463.0697 460.7818 459.1809 460.8611
## p = 14 447.2362 445.9588 447.9588 449.8454 451.8053 449.2421 448.9020 450.7438
## p = 15 438.2062 435.1487 436.8533 438.2088 438.4676 428.1367 429.4672 428.4565
##           q = 9   q = 10   q = 11   q = 12   q = 13   q = 14   q = 15
## p = 1  501.2565 491.6221 482.3578 471.9034 462.1995 449.9119 440.7221
## p = 2  501.5862 491.7534 482.4084 471.9027 462.4846 451.4090 442.3632
## p = 3  501.7306 491.9210 481.0431 470.9635 461.8748 451.5632 442.6431
## p = 4  503.6704 493.8670 482.8404 472.4426 463.2885 453.2747 444.3994
## p = 5  504.6074 494.2812 483.1044 472.0695 461.6024 448.9206 439.7970
## p = 6  506.2916 494.7824 481.9166 470.6062 457.1323 431.0944 421.0727
## p = 7  506.2314 495.5640 481.1086 471.2377 458.2152 432.7956 418.0114
## p = 8  505.3573 495.0883 481.5590 472.0702 460.1609 434.6512 419.2363
## p = 9  506.2821 496.8695 483.5179 473.8933 461.9161 427.3011 405.8415
## p = 10 493.1068 494.5669 480.7956 472.2552 461.9544 428.1099 407.3999
## p = 11 482.8796 482.8796 480.8560 472.4053 463.0455 430.1086 405.5551
## p = 12 471.4859 469.3551 469.3551 471.2254 463.0006 424.5590 374.0771
## p = 13 459.0238 459.2633 460.9351 460.9351 462.9086 424.0584 335.8157
## p = 14 450.0794 451.3713 452.8957 454.4407 454.4407 372.3900 266.9054
## p = 15 412.7979 375.1531 363.2341 331.9199 261.1859 261.1859       NA
## 
## $min.Stat
## [1] 261.1859
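# NOTE: the loop below scans only the first 6 columns of Stat.table (q = 1..6),
# so the q_optimum printed here refers to that subset, not the full grid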
min_p=c()
for(i in 1:6){
  min_p[i]=min(model.ardl.opt$Stat.table[[i]])
}
q_opt=which(min_p==min(min_p, na.rm = TRUE))
p_opt=which(model.ardl.opt$Stat.table[[q_opt]] == 
              min(model.ardl.opt$Stat.table[[q_opt]], na.rm = TRUE))
data.frame("q_optimum" = q_opt, "p_optimum" = p_opt, 
           "AIC"=model.ardl.opt$min.Stat)
##   q_optimum p_optimum      AIC
## 1         6        15 261.1859
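
Since the loop above covers only part of the grid, a sketch that scans the whole Stat.table for the global minimum (handling the NA cell and possible ties) is:

#global minimum of the AIC grid
stat <- as.matrix(model.ardl.opt$Stat.table)
idx  <- which(stat == min(stat, na.rm = TRUE), arr.ind = TRUE)
idx #the minimum 261.1859 occurs at p = 15 for both q = 13 and q = 14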

From the table above, the lowest AIC is obtained at \(p=15\) and \(q=14\), namely 261.1859 (the same minimum also appears at \(q=13\)). The optimum autoregressive model is therefore taken to be \(p=15\), \(q=14\).

Modeling can then be carried out with the optimum \(p\) and \(q\) values identified in the previous step.
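
A sketch of that fit (with only 50 observations, p = 15 and q = 14 leave very few degrees of freedom, so treat it with care):

#ARDL with the optimum orders from the grid search
model.ardl.opt2 <- ardlDlm(formula = Yt ~ Xt, data = data.frame(data),
                           p = 15, q = 14)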

DLM & ARDL Modeling with the dynlm Library

To specify the model formula, dynlm provides additional functions that make it easy to express dynamics (via d() and L()) or linear/cyclical patterns (via trend(), season(), and harmon()). All of these formula functions require their arguments to be time series objects (i.e., "ts" or "zoo").

#same as the DLM with q = 1
cons_lm1 <- dynlm(Yt ~ Xt+L(Xt),data = train.ts)
#same as the ARDL with p = 0, q = 1 (one lag of Y only)
cons_lm2 <- dynlm(Yt ~ Xt+L(Yt),data = train.ts)
#same as the ARDL with p = 1, q = 1
cons_lm3 <- dynlm(Yt ~ Xt+L(Xt)+L(Yt),data = train.ts)
#same as the DLM with q = 2
cons_lm4 <- dynlm(Yt ~ Xt+L(Xt)+L(Xt,2),data = train.ts)

Model Summary

summary(cons_lm1)
## 
## Time series regression with "ts" data:
## Start = 2, End = 40
## 
## Call:
## dynlm(formula = Yt ~ Xt + L(Xt), data = train.ts)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -202.84  -63.62  -27.52   58.81  281.84 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  5.993e+03  9.582e+01  62.548  < 2e-16 ***
## Xt           2.391e-04  7.259e-05   3.294  0.00222 ** 
## L(Xt)       -1.397e-04  7.474e-05  -1.868  0.06985 .  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 115.3 on 36 degrees of freedom
## Multiple R-squared:  0.8524, Adjusted R-squared:  0.8442 
## F-statistic: 103.9 on 2 and 36 DF,  p-value: 1.107e-15
summary(cons_lm2)
## 
## Time series regression with "ts" data:
## Start = 2, End = 40
## 
## Call:
## dynlm(formula = Yt ~ Xt + L(Yt), data = train.ts)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -154.16  -58.47  -21.52   44.37  316.32 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 2.403e+03  8.259e+02   2.910 0.006165 ** 
## Xt          4.403e-05  1.519e-05   2.900 0.006329 ** 
## L(Yt)       5.952e-01  1.372e-01   4.337 0.000111 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 97.87 on 36 degrees of freedom
## Multiple R-squared:  0.8937, Adjusted R-squared:  0.8877 
## F-statistic: 151.3 on 2 and 36 DF,  p-value: < 2.2e-16
summary(cons_lm3)
## 
## Time series regression with "ts" data:
## Start = 2, End = 40
## 
## Call:
## dynlm(formula = Yt ~ Xt + L(Xt) + L(Yt), data = train.ts)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -134.31  -54.40   -6.58   42.18  331.43 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  2.541e+03  7.987e+02   3.182 0.003062 ** 
## Xt           1.615e-04  6.198e-05   2.606 0.013362 *  
## L(Xt)       -1.195e-04  6.128e-05  -1.951 0.059132 .  
## L(Yt)        5.757e-01  1.326e-01   4.343 0.000115 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 94.26 on 35 degrees of freedom
## Multiple R-squared:  0.9041, Adjusted R-squared:  0.8959 
## F-statistic:   110 on 3 and 35 DF,  p-value: < 2.2e-16
summary(cons_lm4)
## 
## Time series regression with "ts" data:
## Start = 3, End = 40
## 
## Call:
## dynlm(formula = Yt ~ Xt + L(Xt) + L(Xt, 2), data = train.ts)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -191.25  -66.46  -15.54   56.02  304.87 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  6.024e+03  1.037e+02  58.093  < 2e-16 ***
## Xt           2.432e-04  7.907e-05   3.076  0.00413 ** 
## L(Xt)       -3.247e-05  1.070e-04  -0.304  0.76334    
## L(Xt, 2)    -1.162e-04  7.676e-05  -1.513  0.13947    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 114.8 on 34 degrees of freedom
## Multiple R-squared:  0.8582, Adjusted R-squared:  0.8457 
## F-statistic: 68.61 on 3 and 34 DF,  p-value: 1.67e-14

SSE

deviance(cons_lm1)
## [1] 478563
deviance(cons_lm2)
## [1] 344792.7
deviance(cons_lm3)
## [1] 310981.4
deviance(cons_lm4)
## [1] 448245
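
For an lm or dynlm fit, deviance() is simply the residual sum of squares, so these values can be reproduced directly:

#equivalent to deviance(cons_lm3)
sum(residuals(cons_lm3)^2)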

Diagnostic Tests

#encompassing test to compare the non-nested models
if(require("lmtest")) encomptest(cons_lm1, cons_lm2)
## Encompassing test
## 
## Model 1: Yt ~ Xt + L(Xt)
## Model 2: Yt ~ Xt + L(Yt)
## Model E: Yt ~ Xt + L(Xt) + L(Yt)
##           Res.Df Df       F    Pr(>F)    
## M1 vs. ME     35 -1 18.8608 0.0001145 ***
## M2 vs. ME     35 -1  3.8054 0.0591318 .  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Autocorrelation

#Durbin-Watson test
dwtest(cons_lm1)
## 
##  Durbin-Watson test
## 
## data:  cons_lm1
## DW = 0.93804, p-value = 5.739e-05
## alternative hypothesis: true autocorrelation is greater than 0
dwtest(cons_lm2)
## 
##  Durbin-Watson test
## 
## data:  cons_lm2
## DW = 1.5714, p-value = 0.05
## alternative hypothesis: true autocorrelation is greater than 0
dwtest(cons_lm3)
## 
##  Durbin-Watson test
## 
## data:  cons_lm3
## DW = 1.6915, p-value = 0.1016
## alternative hypothesis: true autocorrelation is greater than 0
dwtest(cons_lm4)
## 
##  Durbin-Watson test
## 
## data:  cons_lm4
## DW = 0.88953, p-value = 3.364e-05
## alternative hypothesis: true autocorrelation is greater than 0

Heteroscedasticity

bptest(cons_lm1)
## 
##  studentized Breusch-Pagan test
## 
## data:  cons_lm1
## BP = 0.10477, df = 2, p-value = 0.949
bptest(cons_lm2)
## 
##  studentized Breusch-Pagan test
## 
## data:  cons_lm2
## BP = 1.4117, df = 2, p-value = 0.4937
bptest(cons_lm3)
## 
##  studentized Breusch-Pagan test
## 
## data:  cons_lm3
## BP = 1.2846, df = 3, p-value = 0.7328
bptest(cons_lm4)
## 
##  studentized Breusch-Pagan test
## 
## data:  cons_lm4
## BP = 0.39519, df = 3, p-value = 0.9412

Normality

shapiro.test(residuals(cons_lm1))
## 
##  Shapiro-Wilk normality test
## 
## data:  residuals(cons_lm1)
## W = 0.95782, p-value = 0.1508
shapiro.test(residuals(cons_lm2))
## 
##  Shapiro-Wilk normality test
## 
## data:  residuals(cons_lm2)
## W = 0.93191, p-value = 0.0207
shapiro.test(residuals(cons_lm3))
## 
##  Shapiro-Wilk normality test
## 
## data:  residuals(cons_lm3)
## W = 0.91092, p-value = 0.004607
shapiro.test(residuals(cons_lm4))
## 
##  Shapiro-Wilk normality test
## 
## data:  residuals(cons_lm4)
## W = 0.94926, p-value = 0.08418

Model Comparison

akurasi <- matrix(c(mape.koyck, mape.dlm, mape.dlm2, mape.ardl))
row.names(akurasi)<- c("Koyck","DLM 1","DLM 2","Autoregressive")
colnames(akurasi) <- c("MAPE")
akurasi
##                      MAPE
## Koyck          0.04017347
## DLM 1          0.07169568
## DLM 2          0.15107915
## Autoregressive 0.05947607
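
The winner can also be picked programmatically:

#model with the smallest test MAPE
row.names(akurasi)[which.min(akurasi)]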

Based on MAPE, the optimum model is the Koyck model, as it has the smallest MAPE value.

Plot

par(mfrow=c(1,1))
plot(test$Xt, test$Yt, type="b", col="black", ylim=c(7000,12000))
points(test$Xt, fore.koyck$forecasts,col="red")
lines(test$Xt, fore.koyck$forecasts,col="red")
points(test$Xt, fore.dlm$forecasts,col="blue")
lines(test$Xt, fore.dlm$forecasts,col="blue")
points(test$Xt, fore.dlm2$forecasts,col="orange")
lines(test$Xt, fore.dlm2$forecasts,col="orange")
points(test$Xt, fore.ardl$forecasts,col="green")
lines(test$Xt, fore.ardl$forecasts,col="green")
legend("topleft",c("aktual", "koyck","DLM 1","DLM 2", "autoregressive"), lty=1, col=c("black","red","blue","orange","green"), cex=0.8)

Based on the plot, the forecasts that track the actual data most closely are those of the Koyck model, so the best model in this case is the Koyck regression model.

Extension (Multiple Regression)

Data

M1Germany is an example dataset shipped with dLagM (the series used below are logprice, interest, and logm1); the first 144 rows are taken as data1.

data(M1Germany)
data1 = M1Germany[1:144,]

DLM

#Run the search over finite DLMs according to AIC values
finiteDLMauto(formula = logprice ~ interest+logm1,
              data = data.frame(data1), q.min = 1, q.max = 5,
              model.type = "dlm", error.type = "AIC", trace = FALSE)
##   q - k    MASE       AIC       BIC   GMRAE    MBRAE R.Adj.Sq Ljung-Box
## 5     5 1.77163 -463.1393 -422.0566 1.43662 -1.60494  0.98836         0
#multiple-predictor DLM (q = 5, per the search above)
model.dlmberganda = dlm(formula = logprice ~ interest + logm1,
                data = data.frame(data1) , q = 5)
summary(model.dlmberganda)
## 
## Call:
## lm(formula = as.formula(model.formula), data = design)
## 
## Residuals:
##       Min        1Q    Median        3Q       Max 
## -0.095761 -0.028610 -0.000012  0.029496  0.102597 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept) -7.81759    0.11384 -68.669  < 2e-16 ***
## interest.t  -1.75616    0.80358  -2.185 0.030707 *  
## interest.1   1.38935    1.22707   1.132 0.259679    
## interest.2   0.40776    1.23726   0.330 0.742273    
## interest.3   1.23130    1.20752   1.020 0.309830    
## interest.4  -0.08718    1.20869  -0.072 0.942616    
## interest.5   3.06850    0.89380   3.433 0.000808 ***
## logm1.t      0.43219    0.20876   2.070 0.040474 *  
## logm1.1      0.42190    0.19807   2.130 0.035109 *  
## logm1.2      0.20943    0.12883   1.626 0.106532    
## logm1.3      0.22053    0.13011   1.695 0.092567 .  
## logm1.4      0.05513    0.21457   0.257 0.797633    
## logm1.5      0.03042    0.19192   0.159 0.874296    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.04343 on 126 degrees of freedom
## Multiple R-squared:  0.9894, Adjusted R-squared:  0.9884 
## F-statistic: 977.9 on 12 and 126 DF,  p-value: < 2.2e-16
## 
## AIC and BIC values for the model:
##         AIC       BIC
## 1 -463.1393 -422.0566
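#a more parsimonious multiple-predictor DLM with q = 1, for comparison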
model.dlmberganda2 = dlm(formula = logprice ~ interest + logm1,
                        data = data.frame(data1) , q = 1)
summary(model.dlmberganda2)
## 
## Call:
## lm(formula = as.formula(model.formula), data = design)
## 
## Residuals:
##       Min        1Q    Median        3Q       Max 
## -0.134002 -0.044697  0.006407  0.036962  0.113063 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept) -7.77917    0.13299 -58.492  < 2e-16 ***
## interest.t  -3.22103    0.94184  -3.420 0.000824 ***
## interest.1   6.52775    0.94501   6.908 1.66e-10 ***
## logm1.t      0.73918    0.08419   8.780 5.61e-15 ***
## logm1.1      0.63330    0.08429   7.513 6.55e-12 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.05443 on 138 degrees of freedom
## Multiple R-squared:  0.9832, Adjusted R-squared:  0.9828 
## F-statistic:  2025 on 4 and 138 DF,  p-value: < 2.2e-16
## 
## AIC and BIC values for the model:
##         AIC       BIC
## 1 -419.7575 -401.9805

ARDL

#search for the optimum ARDL lag orders
ardlBoundOrders(data = data1 , formula = logprice ~ interest + logm1,
                ic="AIC")
## $p
##    interest logm1
## 65        0     4
## 
## $q
## [1] 4
## 
## $Stat.table
##            q = 1     q = 2     q = 3     q = 4     q = 5     q = 6     q = 7
## p = 1  -760.1786 -757.9195 -846.8342 -975.2079 -965.7536 -958.9072 -956.7315
## p = 2  -760.0433 -759.3090 -843.6247 -971.2514 -961.7929 -955.2809 -953.4890
## p = 3  -753.7746 -753.7746 -841.2485 -970.4543 -961.4343 -953.7173 -950.0412
## p = 4  -829.8076 -832.6436 -832.6436 -971.0837 -962.1804 -955.0429 -953.4667
## p = 5  -749.4144 -753.2292 -962.9290 -962.9290 -961.7063 -954.3406 -951.7660
## p = 6  -742.2103 -742.9945 -891.6195 -952.3771 -952.3771 -952.2461 -950.1105
## p = 7  -728.9374 -733.0286 -851.2943 -945.7445 -944.6879 -944.6879 -949.3720
## p = 8  -747.9277 -746.2948 -812.4289 -937.9446 -938.9491 -937.3393 -937.3393
## p = 9  -722.6891 -724.5786 -863.2734 -928.9215 -927.2914 -926.8716 -936.6432
## p = 10 -714.8175 -714.5658 -816.3319 -918.5218 -918.6350 -916.9076 -921.1246
## p = 11 -703.1807 -705.3383 -794.0772 -909.6457 -908.8225 -906.9542 -912.9605
## p = 12 -716.7111 -714.7403 -774.0127 -910.0315 -910.6834 -908.7146 -909.6612
## p = 13 -697.7175 -698.1931 -793.4602 -895.5927 -894.9273 -893.5995 -897.7589
## p = 14 -686.5600 -685.7967 -766.5292 -886.0709 -885.4341 -885.2283 -890.1638
## p = 15 -676.7280 -678.3689 -753.2854 -875.6392 -874.1257 -874.3117 -879.2727
##            q = 8     q = 9    q = 10    q = 11    q = 12    q = 13    q = 14
## p = 1  -954.3375 -946.6293 -936.5328 -927.7728 -920.6435 -917.5463 -918.3110
## p = 2  -951.1470 -943.9360 -933.7047 -924.7949 -917.5334 -913.6213 -914.4063
## p = 3  -948.4683 -941.1039 -930.8509 -922.0563 -914.5728 -910.5351 -913.4996
## p = 4  -948.2330 -941.8238 -931.5689 -923.2663 -916.2063 -911.6023 -913.9345
## p = 5  -947.5994 -939.3767 -929.0155 -920.4475 -913.5968 -909.0781 -911.6312
## p = 6  -945.5758 -937.4076 -927.2439 -919.3949 -911.9537 -907.7394 -910.2890
## p = 7  -945.5181 -937.1826 -926.9640 -917.9619 -910.2774 -905.9449 -907.8712
## p = 8  -941.9617 -933.5959 -923.3691 -914.6251 -907.0608 -902.2187 -903.9255
## p = 9  -936.6432 -935.7172 -925.2881 -917.0877 -911.6973 -903.9027 -904.6405
## p = 10 -926.6891 -926.6891 -924.6986 -917.0904 -911.4197 -903.4313 -903.0612
## p = 11 -917.9145 -918.2328 -918.2328 -919.2867 -913.3674 -904.8733 -903.6541
## p = 12 -916.1321 -914.4362 -914.4610 -914.4610 -912.5159 -904.2394 -901.6216
## p = 13 -905.4744 -903.7559 -902.4406 -902.2530 -902.2530 -902.9434 -901.2363
## p = 14 -896.2370 -896.2620 -894.2896 -897.5711 -899.1407 -899.1407 -902.2350
## p = 15 -884.5637 -886.8221 -884.9832 -890.5665 -893.2335 -891.6220 -891.6220
##           q = 15
## p = 1  -908.0863
## p = 2  -904.1665
## p = 3  -903.3006
## p = 4  -903.9256
## p = 5  -901.6220
## p = 6  -900.1824
## p = 7  -897.9867
## p = 8  -894.1031
## p = 9  -894.7387
## p = 10 -893.6199
## p = 11 -893.6060
## p = 12 -892.4805
## p = 13 -892.5115
## p = 14 -893.6214
## p = 15 -891.3741
## 
## $min.Stat
## [1] -977.2745
## 
## $Stat.p
##     interest logm1      Stat
## 65         0     4 -977.2745
## 1          0     0 -976.5191
## 2          1     0 -976.2558
## 17         0     1 -975.9606
## 66         1     4 -975.6027
## 18         1     1 -975.2079
## 49         0     3 -974.4859
## 3          2     0 -974.4275
## 33         0     2 -974.0166
## 50         1     3 -973.7500
## 67         2     4 -973.6028
## 34         1     2 -973.2324
## 19         2     1 -973.2188
## 68         3     4 -972.5992
## 4          3     0 -972.4875
## 51         2     3 -971.7743
## 20         3     1 -971.3872
## 35         2     2 -971.2514
## 69         4     4 -971.0837
## 5          4     0 -970.5114
## 52         3     3 -970.4543
## 81         0     5 -969.9284
## 53         4     3 -969.5311
## 21         4     1 -969.4756
## 36         3     2 -969.3907
## 82         1     5 -968.6783
## 37         4     2 -967.4756
## 83         2     5 -966.8835
## 84         3     5 -965.6393
## 85         4     5 -963.9662
## 86         5     5 -962.9290
## 70         5     4 -961.2547
## 54         5     3 -960.9580
## 97         0     6 -960.7402
## 6          5     0 -960.6858
## 22         5     1 -959.8419
## 98         1     6 -959.6604
## 38         5     2 -957.8547
## 99         2     6 -957.7528
## 100        3     6 -956.7875
## 101        4     6 -955.2416
## 71         6     4 -954.8953
## 87         6     5 -954.6855
## 102        5     6 -954.3662
## 103        6     6 -954.0973
## 7          6     0 -954.0615
## 113        0     7 -953.9160
## 55         6     3 -953.2860
## 23         6     1 -953.1080
## 114        1     7 -952.6540
## 39         6     2 -951.1356
## 115        2     7 -950.6562
## 116        3     7 -949.6038
## 88         7     5 -949.2090
## 72         7     4 -948.5194
## 117        4     7 -947.7999
## 104        7     6 -947.7424
## 56         7     3 -947.6915
## 8          7     0 -947.5092
## 120        7     7 -947.3660
## 24         7     1 -947.0094
## 118        5     7 -946.9631
## 119        6     7 -946.8080
## 40         7     2 -945.0123
## 129        0     8 -943.9035
## 130        1     8 -942.6627
## 131        2     8 -940.6818
## 145        0     9 -940.0114
## 132        3     8 -939.6913
## 89         8     5 -939.1878
## 73         8     4 -938.5330
## 146        1     9 -938.2680
## 133        4     8 -937.8368
## 105        8     6 -937.6834
## 57         8     3 -937.6370
## 9          8     0 -937.5705
## 121        8     7 -937.5351
## 136        7     8 -937.3948
## 25         8     1 -937.0088
## 134        5     8 -936.9393
## 135        6     8 -936.8904
## 147        2     9 -936.3875
## 148        3     9 -936.3159
## 137        8     8 -935.5389
## 41         8     2 -935.0088
## 149        4     9 -934.3458
## 150        5     9 -934.1858
## 152        7     9 -934.0733
## 151        6     9 -932.9538
## 153        8     9 -932.3338
## 154        9     9 -930.9065
## 161        0    10 -929.8056
## 90         9     5 -929.2731
## 74         9     4 -928.5254
## 162        1    10 -928.1257
## 10         9     0 -927.9853
## 58         9     3 -927.9744
## 122        9     7 -927.9061
## 106        9     6 -927.6344
## 26         9     1 -927.4482
## 164        3    10 -926.5271
## 163        2    10 -926.2965
## 138        9     8 -926.1307
## 42         9     2 -925.4484
## 165        4    10 -924.5287
## 168        7    10 -924.2716
## 166        5    10 -924.0521
## 167        6    10 -922.7596
## 169        8    10 -922.5928
## 155       10     9 -921.2169
## 170        9    10 -921.1777
## 177        0    11 -920.2608
## 171       10    10 -920.0124
## 91        10     5 -919.0182
## 178        1    11 -918.7342
## 75        10     4 -918.4135
## 11        10     0 -917.8597
## 59        10     3 -917.7711
## 123       10     7 -917.6569
## 107       10     6 -917.3861
## 27        10     1 -917.2925
## 179        2    11 -916.9417
## 180        3    11 -916.8682
## 193        0    12 -916.1477
## 139       10     8 -915.9643
## 92        11     5 -915.3201
## 43        10     2 -915.2941
## 156       11     9 -915.0851
## 181        4    11 -914.8854
## 194        1    12 -914.4423
## 124       11     7 -914.3141
## 184        7    11 -914.1880
## 76        11     4 -914.1395
## 182        5    11 -914.0440
## 108       11     6 -913.4052
## 140       11     8 -913.3026
## 195        2    12 -913.1680
## 172       11    10 -913.0914
## 60        11     3 -912.7714
## 183        6    11 -912.7548
## 196        3    12 -912.5820
## 185        8    11 -912.5636
## 12        11     0 -912.2009
## 28        11     1 -912.0389
## 186        9    11 -911.1737
## 157       12     9 -911.1513
## 188       11    11 -911.1189
## 93        12     5 -910.7693
## 198        5    12 -910.7434
## 197        4    12 -910.6154
## 125       12     7 -910.5873
## 141       12     8 -910.0719
## 44        11     2 -910.0439
## 187       10    11 -909.9928
## 200        7    12 -909.4197
## 173       12    10 -909.2473
## 77        12     4 -909.1913
## 109       12     6 -908.7753
## 199        6    12 -908.7635
## 201        8    12 -908.1609
## 61        12     3 -908.0357
## 29        12     1 -907.8613
## 209        0    13 -907.6473
## 13        12     0 -907.6158
## 205       12    12 -907.5931
## 204       11    12 -907.5525
## 202        9    12 -907.3633
## 189       12    11 -907.3200
## 210        1    13 -906.1005
## 45        12     2 -905.9070
## 203       10    12 -905.7653
## 211        2    13 -904.7293
## 212        3    13 -903.9077
## 214        5    13 -902.0824
## 158       13     9 -901.9574
## 213        4    13 -901.9144
## 94        13     5 -901.6338
## 126       13     7 -901.3766
## 142       13     8 -900.9367
## 216        7    13 -900.5676
## 225        0    14 -900.5066
## 174       13    10 -900.1413
## 215        6    13 -900.1102
## 78        13     4 -900.0282
## 110       13     6 -899.6703
## 226        1    14 -899.0967
## 217        8    13 -899.0866
## 62        13     3 -898.8589
## 30        13     1 -898.7940
## 190       13    11 -898.4409
## 221       12    13 -898.4110
## 220       11    13 -898.3058
## 218        9    13 -898.2568
## 14        13     0 -898.2039
## 206       13    12 -897.9014
## 227        2    14 -897.3889
## 46        13     2 -896.8637
## 219       10    13 -896.6244
## 222       13    13 -896.4458
## 228        3    14 -896.2512
## 230        5    14 -895.1320
## 95        14     5 -894.6021
## 229        4    14 -894.3023
## 159       14     9 -894.2497
## 127       14     7 -893.9663
## 143       14     8 -893.6932
## 231        6    14 -893.4037
## 79        14     4 -893.1343
## 232        7    14 -893.1064
## 111       14     6 -892.6253
## 175       14    10 -892.5085
## 63        14     3 -891.9131
## 191       14    11 -891.1895
## 233        8    14 -891.1877
## 234        9    14 -891.1729
## 31        14     1 -890.7573
## 236       11    14 -890.5576
## 241        0    15 -890.5500
## 15        14     0 -890.3449
## 237       12    14 -890.1854
## 235       10    14 -889.8957
## 207       14    12 -889.7107
## 242        1    15 -889.0419
## 47        14     2 -888.9410
## 238       13    14 -888.1867
## 223       14    13 -887.7488
## 239       14    14 -887.6659
## 243        2    15 -887.3088
## 244        3    15 -886.0691
## 246        5    15 -884.7479
## 96        15     5 -884.2869
## 245        4    15 -884.1417
## 160       15     9 -883.9364
## 128       15     7 -883.6409
## 144       15     8 -883.4503
## 247        6    15 -883.0158
## 80        15     4 -882.8148
## 248        7    15 -882.7881
## 112       15     6 -882.3106
## 176       15    10 -882.2093
## 64        15     3 -881.6497
## 253       12    15 -881.4274
## 252       11    15 -881.3077
## 250        9    15 -881.1831
## 192       15    11 -880.9028
## 249        8    15 -880.8964
## 32        15     1 -880.5983
## 251       10    15 -880.2736
## 16        15     0 -880.2468
## 254       13    15 -879.4467
## 208       15    12 -879.4364
## 255       14    15 -879.2846
## 48        15     2 -878.8432
## 224       15    13 -877.4985
## 240       15    14 -877.4570
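#full ARDL(4,4) fit; per $Stat.p above, the best orders are p = 0 for interest, p = 4 for logm1, q = 4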
model.ardlDlmberganda = ardlDlm(formula = logprice ~ interest + logm1,
                        data = data.frame(data1) , p = 4 , q = 4)
summary(model.ardlDlmberganda)
## 
## Time series regression with "ts" data:
## Start = 5, End = 144
## 
## Call:
## dynlm(formula = as.formula(model.text), data = data)
## 
## Residuals:
##        Min         1Q     Median         3Q        Max 
## -0.0290527 -0.0075965  0.0005726  0.0072745  0.0304486 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  0.0145022  0.1822785   0.080  0.93671    
## interest.t   0.0067985  0.2135315   0.032  0.97465    
## interest.1   0.6093502  0.3240545   1.880  0.06238 .  
## interest.2   0.0798544  0.3221168   0.248  0.80461    
## interest.3  -0.3638172  0.3238873  -1.123  0.26347    
## interest.4   0.2084240  0.2447331   0.852  0.39604    
## logm1.t      0.0828689  0.0457486   1.811  0.07248 .  
## logm1.1     -0.0092841  0.0399079  -0.233  0.81642    
## logm1.2     -0.1166129  0.0390732  -2.984  0.00342 ** 
## logm1.3      0.0007016  0.0389297   0.018  0.98565    
## logm1.4      0.0447857  0.0425474   1.053  0.29455    
## logprice.1   0.3274245  0.0651574   5.025  1.7e-06 ***
## logprice.2   0.1323801  0.0684485   1.934  0.05537 .  
## logprice.3  -0.1448245  0.0674268  -2.148  0.03365 *  
## logprice.4   0.6730871  0.0636443  10.576  < 2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.01132 on 125 degrees of freedom
## Multiple R-squared:  0.9993, Adjusted R-squared:  0.9992 
## F-statistic: 1.273e+04 on 14 and 125 DF,  p-value: < 2.2e-16
#restricted model: p = 0 for interest, p = 4 for logm1 (interest lags 1-4 removed)
rem.p = list(interest = c(1,2,3,4))
remove = list(p = rem.p)
model.ardlDlmberganda2 = ardlDlm(formula = logprice ~ interest + logm1,
                        data = data.frame(data1) , p = 4 , q = 4 ,
                        remove = remove)
summary(model.ardlDlmberganda2)
## 
## Time series regression with "ts" data:
## Start = 5, End = 144
## 
## Call:
## dynlm(formula = as.formula(model.text), data = data)
## 
## Residuals:
##        Min         1Q     Median         3Q        Max 
## -0.0290369 -0.0083445  0.0009024  0.0079199  0.0303652 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  0.174838   0.133708   1.308  0.19333    
## interest.t   0.448826   0.098736   4.546 1.24e-05 ***
## logm1.t      0.056659   0.043836   1.293  0.19849    
## logm1.1     -0.017025   0.039159  -0.435  0.66446    
## logm1.2     -0.118413   0.037399  -3.166  0.00193 ** 
## logm1.3     -0.006454   0.038112  -0.169  0.86580    
## logm1.4      0.060220   0.040337   1.493  0.13789    
## logprice.1   0.319059   0.062107   5.137 1.00e-06 ***
## logprice.2   0.111794   0.066101   1.691  0.09320 .  
## logprice.3  -0.122129   0.065114  -1.876  0.06297 .  
## logprice.4   0.699061   0.062611  11.165  < 2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.01149 on 129 degrees of freedom
## Multiple R-squared:  0.9993, Adjusted R-squared:  0.9992 
## F-statistic: 1.73e+04 on 10 and 129 DF,  p-value: < 2.2e-16
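
To compare the unrestricted and restricted fits directly, AIC can be applied just as for the earlier ARDL model:

#compare the two fits
AIC(model.ardlDlmberganda)
AIC(model.ardlDlmberganda2)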