Practicum 3: Regression with Lag Variables

Packages
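
The setup chunk that produced the messages below is not shown; a plausible set of library calls, inferred from the loading output (rio is used later via rio::import(), so it only needs to be installed), is:

library(dLagM)      #koyckDlm(), dlm(), ardlDlm(), finiteDLMauto(), GoF(); also loads nardl, dynlm, zoo
library(MLmetrics)  #MAPE() for test-set accuracy
library(lmtest)     #diagnostic tests
library(car)        #inferred from the "Loading required package: carData" message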

## Warning: package 'dLagM' was built under R version 4.2.3
## Loading required package: nardl
## Warning: package 'nardl' was built under R version 4.2.3
## Registered S3 method overwritten by 'quantmod':
##   method            from
##   as.zoo.data.frame zoo
## Loading required package: dynlm
## Loading required package: zoo
## 
## Attaching package: 'zoo'
## The following objects are masked from 'package:base':
## 
##     as.Date, as.Date.numeric
## Warning: package 'MLmetrics' was built under R version 4.2.3
## 
## Attaching package: 'MLmetrics'
## The following object is masked from 'package:dLagM':
## 
##     MAPE
## The following object is masked from 'package:base':
## 
##     Recall
## Warning: package 'lmtest' was built under R version 4.2.3
## Loading required package: carData

Importing the Data

datasales <- rio::import("https://raw.githubusercontent.com/aidara11/mpdw/main/Pertemuan%203/salesmonthly.csv")
str(datasales)
## 'data.frame':    70 obs. of  9 variables:
##  $ datum: IDate, format: "2014-01-31" "2014-02-28" ...
##  $ M01AB: num  128 133 137 113 102 ...
##  $ M01AE: num  99.1 126.1 93 89.5 119.9 ...
##  $ N02BA: num  152 177 148 131 132 ...
##  $ N02BE: num  878 1002 779 698 629 ...
##  $ N05B : num  354 347 232 209 270 323 348 420 399 472 ...
##  $ N05C : num  50 31 20 18 23 23 21 29 14 30 ...
##  $ R03  : num  112 122 112 97 107 57 61 37 115 182 ...
##  $ R06  : num  48.2 36.2 85.4 73.7 123.7 ...
datasales
##         datum  M01AB   M01AE   N02BA    N02BE  N05B N05C    R03    R06
## 1  2014-01-31 127.69  99.090 152.100  878.030 354.0   50 112.00  48.20
## 2  2014-02-28 133.32 126.050 177.000 1001.900 347.0   31 122.00  36.20
## 3  2014-03-31 137.44  92.950 147.655  779.275 232.0   20 112.00  85.40
## 4  2014-04-30 113.10  89.475 130.900  698.500 209.0   18  97.00  73.70
## 5  2014-05-31 101.79 119.933 132.100  628.780 270.0   23 107.00 123.70
## 6  2014-06-30 112.07  94.710 122.900  548.225 323.0   23  57.00 109.30
## 7  2014-07-31 117.06  95.010 129.300  491.900 348.0   21  61.00  69.10
## 8  2014-08-31 134.79  99.780 123.800  583.850 420.0   29  37.00  70.80
## 9  2014-09-30 108.78 109.094 122.100  887.820 399.0   14 115.00  58.80
## 10 2014-10-31 154.75 185.241 191.600 1856.815 472.0   30 182.00  74.50
## 11 2014-11-30 138.08 100.860 142.700  723.800 489.0   19 112.00  45.20
## 12 2014-12-31 131.90 121.401 111.124 1015.660 492.0   25 163.00  33.40
## 13 2015-01-31 135.91 130.349 141.000 1044.240 463.0   24 177.25  42.00
## 14 2015-02-28 115.71 123.740 131.830  953.252 243.0    9 208.00  47.00
## 15 2015-03-31 156.04 129.386 133.800 1084.850 208.0   13 195.00  54.00
## 16 2015-04-30 154.50 101.115 122.100  940.170 192.0    5  97.00 112.00
## 17 2015-05-31 160.02 119.117 136.040  765.900 194.0   10 100.00 159.50
## 18 2015-06-30 151.87 113.690 145.460  746.788 217.0   12 193.00 125.80
## 19 2015-07-31 175.61 113.810 125.500  708.828 203.0    6  60.00 130.30
## 20 2015-08-31 181.69 144.519 133.400  790.788 265.5   15  45.00  83.70
## 21 2015-09-30 166.22 134.122 110.400  852.125 243.5   11  91.00  71.00
## 22 2015-10-31 195.81 127.231 146.200 1574.335 222.0    8 184.00  72.00
## 23 2015-11-30 152.78 128.233 145.900 1277.725 228.0   18 195.00  44.00
## 24 2015-12-31 159.46 131.291 137.000 1258.349 286.0   28 231.00  41.73
## 25 2016-01-31 171.65 128.402 172.500 1476.324 248.0   24 174.00  56.50
## 26 2016-02-29 173.81 137.528 134.200 1224.862 239.0   20 245.00  58.00
## 27 2016-03-31 156.64 180.589 148.400 1150.700 250.0   13 253.00  97.84
## 28 2016-04-30 166.61 146.526 147.700  998.337 318.0   18 216.00 162.40
## 29 2016-05-31 167.36 120.861 130.550  997.150 275.0   18 131.00 137.10
## 30 2016-06-30 169.67 114.961 117.750  760.050 311.0   20 127.00 134.80
## 31 2016-07-31 203.97 141.019 137.900  652.362 240.0    8 109.00 116.83
## 32 2016-08-31 211.13 114.375 132.700  753.050 275.5   12 116.00  85.30
## 33 2016-09-30 172.96 126.218 116.700 1118.699 307.0   18 121.00  69.30
## 34 2016-10-31 186.76 142.056 160.150 1617.275 312.0   11 220.00  60.90
## 35 2016-11-30 175.18 116.850 133.850 1062.686 246.0   27 150.00  51.20
## 36 2016-12-31 169.32 135.056 132.400 1624.335 257.0   18 275.00  34.90
## 37 2017-01-31   0.00   0.000   0.000    0.000   1.0    0   0.00   0.00
## 38 2017-02-28 139.69 103.517  97.000  526.350 144.0    7 117.00  30.60
## 39 2017-03-31 162.85 111.055 107.350  612.500 165.0    9 139.00 100.10
## 40 2017-04-30 155.61 101.215 100.500  540.200 132.0    9 209.00 122.40
## 41 2017-05-31 143.66 118.125  98.950  547.940 148.0   23 128.00 161.81
## 42 2017-06-30 122.33 103.006 119.600  496.100 163.0    8 163.00 151.90
## 43 2017-07-31 159.67 116.206  75.200  479.350 219.0   15 115.00  81.10
## 44 2017-08-31 170.15 112.470  84.400  549.300 239.0   12  75.00  60.10
## 45 2017-09-30 138.33 118.711  88.150  863.750 223.0   23 139.00  66.90
## 46 2017-10-31 137.64  88.737 100.400 1184.350 226.0   15 247.00  51.00
## 47 2017-11-30 163.85 119.780 104.450  867.899 192.0   15 196.00  46.60
## 48 2017-12-31 160.01 121.663 115.150 1007.180 226.0    6 204.00  47.10
## 49 2018-01-31 132.28 109.446 101.150 1134.325 229.0   11 219.00  49.50
## 50 2018-02-28 128.36 132.804 114.650 1255.374 268.0   12 253.00  39.06
## 51 2018-03-31 146.16 111.764 122.300  999.123 381.0   42 269.00  85.50
## 52 2018-04-30 170.02 107.723  84.600  836.037 289.0   21 229.00 197.10
## 53 2018-05-31 160.52 103.522  89.400  644.648 259.0   13 192.00 213.04
## 54 2018-06-30 141.18 114.226  86.800  584.343 248.0   18 101.00 120.80
## 55 2018-07-31 150.18 132.549  87.200  679.350 283.0   19  90.00 122.20
## 56 2018-08-31 140.00 114.719  88.250  733.838 253.0   20 159.00 103.10
## 57 2018-09-30 153.52 114.992  86.500 1058.262 263.0   12 205.00  88.10
## 58 2018-10-31 144.71 129.400  76.050 1129.275 287.0   25 353.00  76.90
## 59 2018-11-30 172.29 105.487 102.150  995.150 252.2   22 311.00  48.40
## 60 2018-12-31 147.71 113.024  84.750 1213.950 254.0   27 384.00  53.10
## 61 2019-01-31 179.70 222.351  99.700 1660.612 295.2   23 386.00  41.30
## 62 2019-02-28 133.73 142.155 110.200 1001.212 249.4   12 226.00  69.50
## 63 2019-03-31 154.52 113.118  83.350  941.050 301.4   19 257.00 169.50
## 64 2019-04-30 161.39 100.165  88.100  647.650 299.4   22 259.00 179.10
## 65 2019-05-31 168.04  97.258 104.100  703.562 265.8   26 322.00 135.40
## 66 2019-06-30 151.54 101.627 103.200  610.000 193.0   25 142.00 156.04
## 67 2019-07-31 181.00 103.541  92.800  649.800 250.6   20 115.00 105.20
## 68 2019-08-31 181.91  88.269  84.200  518.100 237.0   26 145.00  97.30
## 69 2019-09-30 161.07 111.437  93.500  984.480 227.8   16 161.00 109.10
## 70 2019-10-31  44.37  37.300  20.650  295.150  86.0    7  37.00  11.13

Data Splitting

#SPLIT DATA
train <- datasales [1:56,]
train
##         datum  M01AB   M01AE   N02BA    N02BE  N05B N05C    R03    R06
## 1  2014-01-31 127.69  99.090 152.100  878.030 354.0   50 112.00  48.20
## 2  2014-02-28 133.32 126.050 177.000 1001.900 347.0   31 122.00  36.20
## 3  2014-03-31 137.44  92.950 147.655  779.275 232.0   20 112.00  85.40
## 4  2014-04-30 113.10  89.475 130.900  698.500 209.0   18  97.00  73.70
## 5  2014-05-31 101.79 119.933 132.100  628.780 270.0   23 107.00 123.70
## 6  2014-06-30 112.07  94.710 122.900  548.225 323.0   23  57.00 109.30
## 7  2014-07-31 117.06  95.010 129.300  491.900 348.0   21  61.00  69.10
## 8  2014-08-31 134.79  99.780 123.800  583.850 420.0   29  37.00  70.80
## 9  2014-09-30 108.78 109.094 122.100  887.820 399.0   14 115.00  58.80
## 10 2014-10-31 154.75 185.241 191.600 1856.815 472.0   30 182.00  74.50
## 11 2014-11-30 138.08 100.860 142.700  723.800 489.0   19 112.00  45.20
## 12 2014-12-31 131.90 121.401 111.124 1015.660 492.0   25 163.00  33.40
## 13 2015-01-31 135.91 130.349 141.000 1044.240 463.0   24 177.25  42.00
## 14 2015-02-28 115.71 123.740 131.830  953.252 243.0    9 208.00  47.00
## 15 2015-03-31 156.04 129.386 133.800 1084.850 208.0   13 195.00  54.00
## 16 2015-04-30 154.50 101.115 122.100  940.170 192.0    5  97.00 112.00
## 17 2015-05-31 160.02 119.117 136.040  765.900 194.0   10 100.00 159.50
## 18 2015-06-30 151.87 113.690 145.460  746.788 217.0   12 193.00 125.80
## 19 2015-07-31 175.61 113.810 125.500  708.828 203.0    6  60.00 130.30
## 20 2015-08-31 181.69 144.519 133.400  790.788 265.5   15  45.00  83.70
## 21 2015-09-30 166.22 134.122 110.400  852.125 243.5   11  91.00  71.00
## 22 2015-10-31 195.81 127.231 146.200 1574.335 222.0    8 184.00  72.00
## 23 2015-11-30 152.78 128.233 145.900 1277.725 228.0   18 195.00  44.00
## 24 2015-12-31 159.46 131.291 137.000 1258.349 286.0   28 231.00  41.73
## 25 2016-01-31 171.65 128.402 172.500 1476.324 248.0   24 174.00  56.50
## 26 2016-02-29 173.81 137.528 134.200 1224.862 239.0   20 245.00  58.00
## 27 2016-03-31 156.64 180.589 148.400 1150.700 250.0   13 253.00  97.84
## 28 2016-04-30 166.61 146.526 147.700  998.337 318.0   18 216.00 162.40
## 29 2016-05-31 167.36 120.861 130.550  997.150 275.0   18 131.00 137.10
## 30 2016-06-30 169.67 114.961 117.750  760.050 311.0   20 127.00 134.80
## 31 2016-07-31 203.97 141.019 137.900  652.362 240.0    8 109.00 116.83
## 32 2016-08-31 211.13 114.375 132.700  753.050 275.5   12 116.00  85.30
## 33 2016-09-30 172.96 126.218 116.700 1118.699 307.0   18 121.00  69.30
## 34 2016-10-31 186.76 142.056 160.150 1617.275 312.0   11 220.00  60.90
## 35 2016-11-30 175.18 116.850 133.850 1062.686 246.0   27 150.00  51.20
## 36 2016-12-31 169.32 135.056 132.400 1624.335 257.0   18 275.00  34.90
## 37 2017-01-31   0.00   0.000   0.000    0.000   1.0    0   0.00   0.00
## 38 2017-02-28 139.69 103.517  97.000  526.350 144.0    7 117.00  30.60
## 39 2017-03-31 162.85 111.055 107.350  612.500 165.0    9 139.00 100.10
## 40 2017-04-30 155.61 101.215 100.500  540.200 132.0    9 209.00 122.40
## 41 2017-05-31 143.66 118.125  98.950  547.940 148.0   23 128.00 161.81
## 42 2017-06-30 122.33 103.006 119.600  496.100 163.0    8 163.00 151.90
## 43 2017-07-31 159.67 116.206  75.200  479.350 219.0   15 115.00  81.10
## 44 2017-08-31 170.15 112.470  84.400  549.300 239.0   12  75.00  60.10
## 45 2017-09-30 138.33 118.711  88.150  863.750 223.0   23 139.00  66.90
## 46 2017-10-31 137.64  88.737 100.400 1184.350 226.0   15 247.00  51.00
## 47 2017-11-30 163.85 119.780 104.450  867.899 192.0   15 196.00  46.60
## 48 2017-12-31 160.01 121.663 115.150 1007.180 226.0    6 204.00  47.10
## 49 2018-01-31 132.28 109.446 101.150 1134.325 229.0   11 219.00  49.50
## 50 2018-02-28 128.36 132.804 114.650 1255.374 268.0   12 253.00  39.06
## 51 2018-03-31 146.16 111.764 122.300  999.123 381.0   42 269.00  85.50
## 52 2018-04-30 170.02 107.723  84.600  836.037 289.0   21 229.00 197.10
## 53 2018-05-31 160.52 103.522  89.400  644.648 259.0   13 192.00 213.04
## 54 2018-06-30 141.18 114.226  86.800  584.343 248.0   18 101.00 120.80
## 55 2018-07-31 150.18 132.549  87.200  679.350 283.0   19  90.00 122.20
## 56 2018-08-31 140.00 114.719  88.250  733.838 253.0   20 159.00 103.10
test  <- datasales [57:70,]
test
##         datum  M01AB   M01AE  N02BA    N02BE  N05B N05C R03    R06
## 57 2018-09-30 153.52 114.992  86.50 1058.262 263.0   12 205  88.10
## 58 2018-10-31 144.71 129.400  76.05 1129.275 287.0   25 353  76.90
## 59 2018-11-30 172.29 105.487 102.15  995.150 252.2   22 311  48.40
## 60 2018-12-31 147.71 113.024  84.75 1213.950 254.0   27 384  53.10
## 61 2019-01-31 179.70 222.351  99.70 1660.612 295.2   23 386  41.30
## 62 2019-02-28 133.73 142.155 110.20 1001.212 249.4   12 226  69.50
## 63 2019-03-31 154.52 113.118  83.35  941.050 301.4   19 257 169.50
## 64 2019-04-30 161.39 100.165  88.10  647.650 299.4   22 259 179.10
## 65 2019-05-31 168.04  97.258 104.10  703.562 265.8   26 322 135.40
## 66 2019-06-30 151.54 101.627 103.20  610.000 193.0   25 142 156.04
## 67 2019-07-31 181.00 103.541  92.80  649.800 250.6   20 115 105.20
## 68 2019-08-31 181.91  88.269  84.20  518.100 237.0   26 145  97.30
## 69 2019-09-30 161.07 111.437  93.50  984.480 227.8   16 161 109.10
## 70 2019-10-31  44.37  37.300  20.65  295.150  86.0    7  37  11.13

Converting the Data to Time Series Format

train.ts <- ts(train)
test.ts  <- ts(test)
data.ts  <- ts(datasales)
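
The series are monthly; ts() is called here without a start date or frequency, which is sufficient for the lag models below since they only use the ordering of the observations. Purely as an illustration (not required for the analysis), an explicit monthly series for a single column could be created as follows:

#optional: explicit monthly time series for one column
N05B.train.ts <- ts(train$N05B, start = c(2014, 1), frequency = 12)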

Koyck Model

The Koyck model is based on the assumption that the further a lag of the independent variable (x) lies from the current period, the smaller its influence on the dependent variable (y).

Koyck proposed a method for estimating a dynamic distributed-lag model by assuming that all of the \(\beta\) coefficients have the same sign and decline geometrically.

The Koyck model is the most common form of the infinite distributed-lag model and is also known as the geometric lag model:

\[ Y_t=\alpha(1-\lambda)+\beta_0X_t+\lambda Y_{t-1}+V_t \]

with \[ V_t=u_t-\lambda u_{t-1} \]
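
The Koyck form can be motivated in one step (a standard textbook derivation, added here for completeness): start from the infinite distributed-lag model with geometrically declining weights,

\[ Y_t=\alpha+\beta_0\sum_{i=0}^{\infty}\lambda^iX_{t-i}+u_t, \qquad 0<\lambda<1, \]

lag it one period, multiply by \(\lambda\), and subtract from the original equation. The infinite sum cancels and

\[ Y_t=\alpha(1-\lambda)+\beta_0X_t+\lambda Y_{t-1}+(u_t-\lambda u_{t-1}), \]

which is the model above with \(V_t=u_t-\lambda u_{t-1}\).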

Modeling

The Koyck model can be fitted in R with dLagM::koyckDlm().

The koyckDlm() function fits a distributed-lag model with the Koyck transformation for a single predictor. The x and y values do not need to be time series (ts) objects, and intercept can be set to TRUE to include an intercept in the model.

#MODEL KOYCK
model.koyck <- koyckDlm(x = train$R03, y = train$N05B)
summary(model.koyck)
## 
## Call:
## "Y ~ (Intercept) + Y.1 + X.t"
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -366.77  -31.68   -4.25   46.64  201.12 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 176.2521    68.7616   2.563   0.0133 *  
## Y.1           0.7452     0.1222   6.098 1.35e-07 ***
## X.t          -0.7289     0.4028  -1.809   0.0762 .  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 82.42 on 52 degrees of freedom
## Multiple R-Squared: 0.2113,  Adjusted R-squared: 0.181 
## Wald test: 19.79 on 2 and 52 DF,  p-value: 4.056e-07 
## 
## Diagnostic tests:
## NULL
## 
##                           alpha       beta       phi
## Geometric coefficients:  691.77 -0.7289469 0.7452158
AIC(model.koyck)
## [1] 646.2944
BIC(model.koyck)
## [1] 654.3238

From these results, the variable \(x_t\) has a p-value > 0.05 while \(y_{t-1}\) has a p-value < 0.05. This means that \(x_t\) is not significant at the 5% level (although it is significant at the 10% level), whereas \(y_{t-1}\) has a significant effect on \(y\) at the 5% level. The full fitted model is as follows:

\[ \hat{Y}_t=176.2521-0.7289X_t+0.7452Y_{t-1} \]
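
The "Geometric coefficients" row of the summary follows from these estimates (phi is the coefficient of \(Y_{t-1}\), beta the coefficient of \(X_t\), and alpha the implied long-run intercept):

\[ \hat{\phi}=0.7452, \qquad \hat{\beta}=-0.7289, \qquad \hat{\alpha}=\frac{176.2521}{1-0.7452}\approx 691.77, \]

matching the values 691.77, -0.7289, and 0.7452 reported above.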

Forecasting and Accuracy

Below is the forecast of y for the next 14 periods using the Koyck model.

fore.koyck <- forecast(model = model.koyck, x = test$R03, h=14)  #h = forecast horizon (number of periods ahead)
fore.koyck
## $forecasts
##  [1]  215.357534   79.421627    8.735812  -97.153501 -177.521784 -120.781988
##  [7] -101.095951  -87.883499 -123.961028  -19.636124   77.790010  128.525097
## [13]  154.670535  264.543948
## 
## $call
## forecast.koyckDlm(model = model.koyck, x = test$R03, h = 14)
## 
## attr(,"class")
## [1] "forecast.koyckDlm" "dLagM"
mape.koyck <- MAPE(fore.koyck$forecasts, test$N05B)

#training-set accuracy
GoF(model.koyck)
##              n      MAE       MPE    MAPE     sMAPE     MASE      MSE     MRAE
## model.koyck 55 53.96828 -6.666903 6.85131 0.2241851 1.215806 6421.947 3.419882
##                GMRAE
## model.koyck 1.186783

For the Koyck model, GoF() reports a training MAPE of 6.85. Note that this value is a proportion (roughly 685%), not 6.85%; it is dominated by the near-zero N05B value recorded for January 2017, whose percentage error is enormous, so the MAE of about 54 and the sMAPE of 0.22 are more informative summaries of the in-sample fit.
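
For reference, MAPE() here comes from MLmetrics (it masks dLagM's version, as the loading messages show) and returns the mean absolute percentage error as a proportion. A minimal sketch of the equivalent manual calculation (the helper name is illustrative):

#manual MAPE: mean of |actual - forecast| / |actual|, returned as a proportion
mape_manual <- function(y_pred, y_true) {
  mean(abs((y_true - y_pred) / y_true))
}
mape_manual(fore.koyck$forecasts, test$N05B)  #should agree with mape.koyck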

Regression with Distributed Lag

A regression with distributed lag can be fitted in R with dLagM::dlm().

The dlm() function fits a distributed-lag model with one or more predictors. The x and y values do not need to be time series (ts) objects, and \(q\) is an integer giving the finite lag length.
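
The finite distributed-lag model of order \(q\) fitted by dlm() has the standard form

\[ Y_t=\alpha+\beta_0X_t+\beta_1X_{t-1}+\dots+\beta_qX_{t-q}+e_t \]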

Modeling (Lag = 2)

model.dlm <- dlm(x = train$R03 ,y = train$N05B , q = 2)
summary(model.dlm)
## 
## Call:
## lm(formula = model.formula, data = design)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -183.62  -46.80  -15.87   35.00  241.15 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 284.5485    39.8756   7.136 3.68e-09 ***
## x.t           0.2829     0.2115   1.337    0.187    
## x.1          -0.2855     0.2258  -1.264    0.212    
## x.2          -0.1429     0.2131  -0.670    0.506    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 90.77 on 50 degrees of freedom
## Multiple R-squared:  0.06547,    Adjusted R-squared:  0.0094 
## F-statistic: 1.168 on 3 and 50 DF,  p-value: 0.3314
## 
## AIC and BIC values for the model:
##        AIC      BIC
## 1 645.9927 655.9377
AIC(model.dlm)
## [1] 645.9927
BIC(model.dlm)
## [1] 655.9377

From the output above, only the intercept has a p-value < 0.05; none of the lag terms of \(x\) is significant at the 5% level. The full fitted model is as follows:

\[ \hat{Y}_t=284.5485+0.2829X_t-0.2855X_{t-1}-0.1429X_{t-2} \]

Forecasting and Accuracy

Below is the forecast of \(y\) for the next 14 periods.

fore.dlm <- forecast(model = model.dlm, x=test$R03, h=14)
fore.dlm
## $forecasts
##  [1] 284.2924 303.1688 242.4675 253.9589 239.6881 183.4244 237.5805 252.1602
##  [9] 264.9814 195.7928 230.5342 272.4488 272.2695 228.3377
## 
## $call
## forecast.dlm(model = model.dlm, x = test$R03, h = 14)
## 
## attr(,"class")
## [1] "forecast.dlm" "dLagM"
mape.dlm <- MAPE(fore.dlm$forecasts, test$N05B)

#training-set accuracy
GoF(model.dlm)
##            n      MAE      MPE     MAPE     sMAPE     MASE      MSE     MRAE
## model.dlm 54 65.42553 -3.47443 3.642495 0.2626092 1.519524 7629.412 4.846577
##              GMRAE
## model.dlm 1.517162

For the regression with distributed lag, GoF() reports a training MAPE of 3.64. As with the Koyck model, this is a proportion (about 364%) inflated by the near-zero January 2017 observation rather than 3.64%; the MAE of about 65 and the sMAPE of 0.26 summarize the in-sample fit more realistically.

Optimum Lag

#determining the optimum lag
finiteDLMauto(formula = N05B ~ R06,
              data = data.frame(train), q.min = 1, q.max = 6,
              model.type = "dlm", error.type = "AIC", trace = FALSE)
##   q - k    MASE      AIC      BIC   GMRAE   MBRAE R.Adj.Sq    Ljung-Box
## 6     6 1.51882 611.0544 628.2627 1.47196 1.37676 -0.11642 4.593417e-08

Based on the output above, the optimum lag is obtained at lag = 6 (note that the finiteDLMauto() call above searches over the predictor R06, whereas the model fitted next uses R03, as in the rest of this section). A DLM with lag = 6 is therefore fitted next.
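
As an illustration of what finiteDLMauto() does, an equivalent manual search (a sketch mirroring the call above, with the same predictor R06; object names are illustrative) fits dlm() for each candidate lag and compares the AIC values:

#manual lag search: fit a finite DLM for q = 1, ..., 6 and compare AIC
aic.per.q <- sapply(1:6, function(q) AIC(dlm(x = train$R06, y = train$N05B, q = q)))
aic.per.q            #AIC for each candidate lag
which.min(aic.per.q) #lag with the smallest AIC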

#DLM model with the optimum lag
model.dlm2 <- dlm(x = train$R03,y = train$N05B, q = 6) # q = lag
summary(model.dlm2)
## 
## Call:
## lm(formula = model.formula, data = design)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -188.75  -60.24  -10.03   58.38  204.51 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  3.472e+02  6.088e+01   5.703 1.06e-06 ***
## x.t          2.781e-01  2.344e-01   1.187    0.242    
## x.1         -3.311e-01  2.503e-01  -1.323    0.193    
## x.2         -9.296e-04  2.513e-01  -0.004    0.997    
## x.3         -7.038e-02  2.518e-01  -0.279    0.781    
## x.4         -2.758e-01  2.539e-01  -1.086    0.283    
## x.5         -5.309e-02  2.543e-01  -0.209    0.836    
## x.6         -1.032e-01  2.417e-01  -0.427    0.672    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 94.47 on 42 degrees of freedom
## Multiple R-squared:  0.1349, Adjusted R-squared:  -0.009237 
## F-statistic: 0.9359 on 7 and 42 DF,  p-value: 0.4894
## 
## AIC and BIC values for the model:
##        AIC      BIC
## 1 606.0077 623.2159
AIC(model.dlm2)
## [1] 606.0077
BIC(model.dlm2)
## [1] 623.2159

From these results, none of the predictor lag terms is significant at the 5% level (only the intercept is). The full fitted model is

\[ \hat{Y}_t=347.2+0.2781X_t+\dots-0.1032X_{t-6} \]

The 14-period-ahead forecast from this model, together with its accuracy, is obtained as follows.

#forecasting and accuracy
fore.dlm2 <- forecast(model = model.dlm2, x=test$R03, h=14)
mape.dlm2<- MAPE(fore.dlm2$forecasts, test$N05B)

#training-set accuracy
GoF(model.dlm2)
##             n      MAE       MPE     MAPE     sMAPE     MASE      MSE     MRAE
## model.dlm2 50 70.35149 -3.847603 4.039386 0.2859447 1.626049 7497.145 4.920313
##               GMRAE
## model.dlm2 1.900647

GoF() reports a training MAPE of 4.04 for this model. As before, this is a proportion inflated by the near-zero January 2017 observation, not 4.04%; the MAE of about 70 and sMAPE of 0.29 describe the in-sample fit more realistically, and the negative adjusted R-squared above suggests that the additional lags add little explanatory value.

Autoregressive Model

When the dependent variable (y) is influenced by the independent variable (x) at the current time and also by the dependent variable (y) itself at one previous time point, the model is called autoregressive (Gujarati 2004).

Modeling

Autoregressive modeling is carried out with the dLagM::ardlDlm() function, which fits an autoregressive distributed-lag model of order \((p,q)\) with one predictor. The general form of ardlDlm() is as follows.

ardlDlm(formula = NULL , data = NULL , x = NULL , y = NULL , p = 1 , q = 1 , 
         remove = NULL )

Here \(p\) is an integer giving the finite lag length of the predictor and \(q\) is an integer giving the order of the autoregressive part.
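
Written out, the fitted ARDL\((p,q)\) model therefore has the form

\[ Y_t=\mu+\beta_0X_t+\beta_1X_{t-1}+\dots+\beta_pX_{t-p}+\gamma_1Y_{t-1}+\dots+\gamma_qY_{t-q}+e_t \]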

model.ardl <- ardlDlm(x = train$R03, y = train$N05B, p = 1 , q = 1)
summary(model.ardl)
## 
## Time series regression with "ts" data:
## Start = 2, End = 56
## 
## Call:
## dynlm(formula = as.formula(model.text), data = data, start = 1)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -173.74  -26.44    5.52   28.66  126.83 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 86.65801   31.05817   2.790  0.00739 ** 
## X.t          0.27744    0.13164   2.108  0.04000 *  
## X.1         -0.43069    0.13151  -3.275  0.00190 ** 
## Y.1          0.75316    0.08437   8.927 5.33e-12 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 56.81 on 51 degrees of freedom
## Multiple R-squared:  0.6325, Adjusted R-squared:  0.6109 
## F-statistic: 29.26 on 3 and 51 DF,  p-value: 3.819e-11
AIC(model.ardl)
## [1] 606.2956
BIC(model.ardl)
## [1] 616.3323

From these results, the variables \(x_t\), \(x_{t-1}\), and \(y_{t-1}\) all have p-values < 0.05, so all three have a significant effect on \(y_t\) at the 5% level. The full fitted model is as follows:

\[ \hat{Y}_t=86.65801+0.27744X_t-0.43069X_{t-1}+0.75316Y_{t-1} \]

Forecasting and Accuracy

fore.ardl <- forecast(model = model.ardl, x=test$R03, h=14)
fore.ardl
## $forecasts
##  [1] 265.6047 296.3481 244.1082 243.1054 211.4646 142.3815 167.8618 174.2561
##  [9] 195.6897 134.7592 158.9018 197.0371 217.2776 191.2279
## 
## $call
## forecast.ardlDlm(model = model.ardl, x = test$R03, h = 14)
## 
## attr(,"class")
## [1] "forecast.ardlDlm" "dLagM"

The output above is the 14-period-ahead forecast from the autoregressive model with \(p=1\) and \(q=1\).

mape.ardl <- MAPE(fore.ardl$forecasts, test$N05B)
mape.ardl
## [1] 0.2900784
#training-set accuracy
GoF(model.ardl)
##             n      MAE       MPE     MAPE     sMAPE      MASE      MSE     MRAE
## model.ardl 55 40.79747 -2.938051 3.068268 0.1763384 0.9190919 2992.491 1.818713
##                GMRAE
## model.ardl 0.9112162

Based on the accuracies above, the test-set MAPE is 0.29 (29%). The training MAPE reported by GoF() (3.07) is once again inflated by the near-zero January 2017 observation and is not directly comparable; judging from the training sMAPE of 0.18 against the test MAPE of 0.29, the two are of broadly similar magnitude, so this lag-regression model does not appear to be strongly overfitted or underfitted.

Optimum Lag

#determining the optimum ARDL orders
model.ardl.opt <- ardlBoundOrders(data = data.frame(datasales), ic = "AIC", 
                                  formula = N05B ~ R03 )
#smallest AIC within each of the first six columns (q = 1, ..., 6) of the order-search table
min_p=c()
for(i in 1:6){
  min_p[i]=min(model.ardl.opt$Stat.table[[i]])
}
#q with the overall smallest AIC, then the p (row) that attains that minimum within its column
q_opt=which(min_p==min(min_p, na.rm = TRUE))
p_opt=which(model.ardl.opt$Stat.table[[q_opt]] == 
              min(model.ardl.opt$Stat.table[[q_opt]], na.rm = TRUE))
data.frame("q_optimum" = q_opt, "p_optimum" = p_opt, 
           "AIC"=model.ardl.opt$min.Stat)
##   q_optimum p_optimum      AIC
## 1         5        15 556.9577

From the table above, the lowest AIC, 556.9577, is obtained at \(p=15\) and \(q=5\). In other words, the optimum autoregressive distributed-lag model uses \(p=15\) and \(q=5\).

Modeling can then be carried out with these optimum values of \(p\) and \(q\), as initialized in the previous step; a sketch is given below.
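
A minimal sketch of that refit, following the ardlDlm() call pattern used earlier and fitted on the training data (this step is not run in the original document, so no output is shown):

#ARDL model with the optimum orders found above
model.ardl.opt2 <- ardlDlm(x = train$R03, y = train$N05B, p = 15, q = 5)
summary(model.ardl.opt2)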

DLM & ARDL Modeling with the dynlm Library

Regression with lag variables is not limited to the functions in the dLagM package; the dynlm package can also be used. The general form of dynlm() is as follows.

dynlm(formula, data, subset, weights, na.action, method = "qr",
  model = TRUE, x = FALSE, y = FALSE, qr = TRUE, singular.ok = TRUE,
  contrasts = NULL, offset, start = NULL, end = NULL, ...)

To specify the model formula, additional helper functions are available that make it easy to express dynamics (via d() and L()) or linear/cyclical patterns (via trend(), season(), and harmon()). All of these formula helpers require their arguments to be time series objects (i.e., "ts" or "zoo").
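
As an illustration of these helpers (not part of the original analysis), d() and L() can be combined freely inside the formula, for example regressing the first difference of N05B on its own lagged difference and the first difference of R03:

#illustrative only: a dynamic model in first differences
cons_diff <- dynlm(d(N05B) ~ L(d(N05B), 1) + d(R03), data = train.ts)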

#equivalent to a DLM with q = 1
cons_lm1 <- dynlm(N05B ~ R03+L(R03),data = train.ts)
#equivalent to an ARDL with p = 0, q = 1
cons_lm2 <- dynlm(N05B ~ R03+L(N05B),data = train.ts)
#equivalent to an ARDL with p = 1, q = 1
cons_lm3 <- dynlm(N05B ~ R03+L(R03)+L(N05B),data = train.ts)
#equivalent to a DLM with q = 2
cons_lm4 <- dynlm(N05B ~ R03+L(R03)+L(R03,2),data = train.ts)

Model Summaries

summary(cons_lm1)
## 
## Time series regression with "ts" data:
## Start = 2, End = 56
## 
## Call:
## dynlm(formula = N05B ~ R03 + L(R03), data = train.ts)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -179.61  -44.87  -17.64   38.90  246.21 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 277.3904    35.7358   7.762 3.03e-10 ***
## R03           0.2630     0.2087   1.260   0.2132    
## L(R03)       -0.3519     0.2080  -1.692   0.0967 .  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 90.06 on 52 degrees of freedom
## Multiple R-squared:  0.0582, Adjusted R-squared:  0.02197 
## F-statistic: 1.607 on 2 and 52 DF,  p-value: 0.2104
summary(cons_lm2)
## 
## Time series regression with "ts" data:
## Start = 2, End = 56
## 
## Call:
## dynlm(formula = N05B ~ R03 + L(N05B), data = train.ts)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -242.532  -24.849   -6.371   38.192  107.267 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 54.73205   32.12801   1.704   0.0944 .  
## R03          0.09277    0.12960   0.716   0.4773    
## L(N05B)      0.73463    0.09171   8.010 1.23e-10 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 61.89 on 52 degrees of freedom
## Multiple R-squared:  0.5552, Adjusted R-squared:  0.5381 
## F-statistic: 32.45 on 2 and 52 DF,  p-value: 7.121e-10
summary(cons_lm3)
## 
## Time series regression with "ts" data:
## Start = 2, End = 56
## 
## Call:
## dynlm(formula = N05B ~ R03 + L(R03) + L(N05B), data = train.ts)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -173.74  -26.44    5.52   28.66  126.83 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 86.65801   31.05817   2.790  0.00739 ** 
## R03          0.27744    0.13164   2.108  0.04000 *  
## L(R03)      -0.43069    0.13151  -3.275  0.00190 ** 
## L(N05B)      0.75316    0.08437   8.927 5.33e-12 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 56.81 on 51 degrees of freedom
## Multiple R-squared:  0.6325, Adjusted R-squared:  0.6109 
## F-statistic: 29.26 on 3 and 51 DF,  p-value: 3.819e-11
summary(cons_lm4)
## 
## Time series regression with "ts" data:
## Start = 3, End = 56
## 
## Call:
## dynlm(formula = N05B ~ R03 + L(R03) + L(R03, 2), data = train.ts)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -183.62  -46.80  -15.87   35.00  241.15 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 284.5485    39.8756   7.136 3.68e-09 ***
## R03           0.2829     0.2115   1.337    0.187    
## L(R03)       -0.2855     0.2258  -1.264    0.212    
## L(R03, 2)    -0.1429     0.2131  -0.670    0.506    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 90.77 on 50 degrees of freedom
## Multiple R-squared:  0.06547,    Adjusted R-squared:  0.0094 
## F-statistic: 1.168 on 3 and 50 DF,  p-value: 0.3314

Model Comparison

akurasi <- matrix(c(mape.koyck, mape.dlm, mape.dlm2, mape.ardl))
row.names(akurasi)<- c("Koyck","DLM 1","DLM 2","Autoregressive")
colnames(akurasi) <- c("MAPE")
akurasi
##                     MAPE
## Koyck          1.0770995
## DLM 1          0.2211184
## DLM 2          0.3067633
## Autoregressive 0.2900784

Based on the test-set MAPE values, the most accurate model is DLM 1 (the lag-2 distributed-lag model), since it has the smallest MAPE.

Plot

par(mfrow=c(1,1))
plot(test$R03, test$N05B, type="b", col="black", ylim=c(120,250))
points(test$R03, fore.koyck$forecasts,col="red")
lines(test$R03, fore.koyck$forecasts,col="red")
points(test$R03, fore.dlm$forecasts,col="blue")
lines(test$R03, fore.dlm$forecasts,col="blue")
points(test$R03, fore.dlm2$forecasts,col="orange")
lines(test$R03, fore.dlm2$forecasts,col="orange")
points(test$R03, fore.ardl$forecasts,col="green")
lines(test$R03, fore.ardl$forecasts,col="green")
legend("topleft",c("aktual", "koyck","DLM 1","DLM 2", "autoregressive"), lty=1, col=c("black","red","blue","orange","green"), cex=0.8)

Based on the plot, the forecasts that track the actual data most closely are those of DLM 1, so in this case the DLM 1 regression model can be concluded to be the best.

Enrichment (Multiple Regression)

Data

data(M1Germany)
data1 = M1Germany[1:144,]

DLM

#Run the search over finite DLMs according to AIC values
finiteDLMauto(formula = logprice ~ interest+logm1,
              data = data.frame(data1), q.min = 1, q.max = 5,
              model.type = "dlm", error.type = "AIC", trace = FALSE)
##   q - k    MASE       AIC       BIC   GMRAE    MBRAE R.Adj.Sq Ljung-Box
## 5     5 1.77163 -463.1393 -422.0566 1.43662 -1.60494  0.98836         0
#multiple-predictor DLM
model.dlmberganda = dlm(formula = logprice ~ interest + logm1,
                data = data.frame(data1) , q = 5)
summary(model.dlmberganda)
## 
## Call:
## lm(formula = as.formula(model.formula), data = design)
## 
## Residuals:
##       Min        1Q    Median        3Q       Max 
## -0.095761 -0.028610 -0.000012  0.029496  0.102597 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept) -7.81759    0.11384 -68.669  < 2e-16 ***
## interest.t  -1.75616    0.80358  -2.185 0.030707 *  
## interest.1   1.38935    1.22707   1.132 0.259679    
## interest.2   0.40776    1.23726   0.330 0.742273    
## interest.3   1.23130    1.20752   1.020 0.309830    
## interest.4  -0.08718    1.20869  -0.072 0.942616    
## interest.5   3.06850    0.89380   3.433 0.000808 ***
## logm1.t      0.43219    0.20876   2.070 0.040474 *  
## logm1.1      0.42190    0.19807   2.130 0.035109 *  
## logm1.2      0.20943    0.12883   1.626 0.106532    
## logm1.3      0.22053    0.13011   1.695 0.092567 .  
## logm1.4      0.05513    0.21457   0.257 0.797633    
## logm1.5      0.03042    0.19192   0.159 0.874296    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.04343 on 126 degrees of freedom
## Multiple R-squared:  0.9894, Adjusted R-squared:  0.9884 
## F-statistic: 977.9 on 12 and 126 DF,  p-value: < 2.2e-16
## 
## AIC and BIC values for the model:
##         AIC       BIC
## 1 -463.1393 -422.0566
model.dlmberganda2 = dlm(formula = logprice ~ interest + logm1,
                        data = data.frame(data1) , q = 1)
summary(model.dlmberganda2)
## 
## Call:
## lm(formula = as.formula(model.formula), data = design)
## 
## Residuals:
##       Min        1Q    Median        3Q       Max 
## -0.134002 -0.044697  0.006407  0.036962  0.113063 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept) -7.77917    0.13299 -58.492  < 2e-16 ***
## interest.t  -3.22103    0.94184  -3.420 0.000824 ***
## interest.1   6.52775    0.94501   6.908 1.66e-10 ***
## logm1.t      0.73918    0.08419   8.780 5.61e-15 ***
## logm1.1      0.63330    0.08429   7.513 6.55e-12 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.05443 on 138 degrees of freedom
## Multiple R-squared:  0.9832, Adjusted R-squared:  0.9828 
## F-statistic:  2025 on 4 and 138 DF,  p-value: < 2.2e-16
## 
## AIC and BIC values for the model:
##         AIC       BIC
## 1 -419.7575 -401.9805

ARDL

#Search for the optimum ARDL lag orders
ardlBoundOrders(data = data1 , formula = logprice ~ interest + logm1,
                ic="AIC")
## $p
##    interest logm1
## 65        0     4
## 
## $q
## [1] 4
## 
## $Stat.table
##            q = 1     q = 2     q = 3     q = 4     q = 5     q = 6     q = 7
## p = 1  -760.1786 -757.9195 -846.8342 -975.2079 -965.7536 -958.9072 -956.7315
## p = 2  -760.0433 -759.3090 -843.6247 -971.2514 -961.7929 -955.2809 -953.4890
## p = 3  -753.7746 -753.7746 -841.2485 -970.4543 -961.4343 -953.7173 -950.0412
## p = 4  -829.8076 -832.6436 -832.6436 -971.0837 -962.1804 -955.0429 -953.4667
## p = 5  -749.4144 -753.2292 -962.9290 -962.9290 -961.7063 -954.3406 -951.7660
## p = 6  -742.2103 -742.9945 -891.6195 -952.3771 -952.3771 -952.2461 -950.1105
## p = 7  -728.9374 -733.0286 -851.2943 -945.7445 -944.6879 -944.6879 -949.3720
## p = 8  -747.9277 -746.2948 -812.4289 -937.9446 -938.9491 -937.3393 -937.3393
## p = 9  -722.6891 -724.5786 -863.2734 -928.9215 -927.2914 -926.8716 -936.6432
## p = 10 -714.8175 -714.5658 -816.3319 -918.5218 -918.6350 -916.9076 -921.1246
## p = 11 -703.1807 -705.3383 -794.0772 -909.6457 -908.8225 -906.9542 -912.9605
## p = 12 -716.7111 -714.7403 -774.0127 -910.0315 -910.6834 -908.7146 -909.6612
## p = 13 -697.7175 -698.1931 -793.4602 -895.5927 -894.9273 -893.5995 -897.7589
## p = 14 -686.5600 -685.7967 -766.5292 -886.0709 -885.4341 -885.2283 -890.1638
## p = 15 -676.7280 -678.3689 -753.2854 -875.6392 -874.1257 -874.3117 -879.2727
##            q = 8     q = 9    q = 10    q = 11    q = 12    q = 13    q = 14
## p = 1  -954.3375 -946.6293 -936.5328 -927.7728 -920.6435 -917.5463 -918.3110
## p = 2  -951.1470 -943.9360 -933.7047 -924.7949 -917.5334 -913.6213 -914.4063
## p = 3  -948.4683 -941.1039 -930.8509 -922.0563 -914.5728 -910.5351 -913.4996
## p = 4  -948.2330 -941.8238 -931.5689 -923.2663 -916.2063 -911.6023 -913.9345
## p = 5  -947.5994 -939.3767 -929.0155 -920.4475 -913.5968 -909.0781 -911.6312
## p = 6  -945.5758 -937.4076 -927.2439 -919.3949 -911.9537 -907.7394 -910.2890
## p = 7  -945.5181 -937.1826 -926.9640 -917.9619 -910.2774 -905.9449 -907.8712
## p = 8  -941.9617 -933.5959 -923.3691 -914.6251 -907.0608 -902.2187 -903.9255
## p = 9  -936.6432 -935.7172 -925.2881 -917.0877 -911.6973 -903.9027 -904.6405
## p = 10 -926.6891 -926.6891 -924.6986 -917.0904 -911.4197 -903.4313 -903.0612
## p = 11 -917.9145 -918.2328 -918.2328 -919.2867 -913.3674 -904.8733 -903.6541
## p = 12 -916.1321 -914.4362 -914.4610 -914.4610 -912.5159 -904.2394 -901.6216
## p = 13 -905.4744 -903.7559 -902.4406 -902.2530 -902.2530 -902.9434 -901.2363
## p = 14 -896.2370 -896.2620 -894.2896 -897.5711 -899.1407 -899.1407 -902.2350
## p = 15 -884.5637 -886.8221 -884.9832 -890.5665 -893.2335 -891.6220 -891.6220
##           q = 15
## p = 1  -908.0863
## p = 2  -904.1665
## p = 3  -903.3006
## p = 4  -903.9256
## p = 5  -901.6220
## p = 6  -900.1824
## p = 7  -897.9867
## p = 8  -894.1031
## p = 9  -894.7387
## p = 10 -893.6199
## p = 11 -893.6060
## p = 12 -892.4805
## p = 13 -892.5115
## p = 14 -893.6214
## p = 15 -891.3741
## 
## $min.Stat
## [1] -977.2745
## 
## $Stat.p
##     interest logm1      Stat
## 65         0     4 -977.2745
## 1          0     0 -976.5191
## 2          1     0 -976.2558
## 17         0     1 -975.9606
## 66         1     4 -975.6027
## 18         1     1 -975.2079
## 49         0     3 -974.4859
## 3          2     0 -974.4275
## 33         0     2 -974.0166
## 50         1     3 -973.7500
## 67         2     4 -973.6028
## 34         1     2 -973.2324
## 19         2     1 -973.2188
## 68         3     4 -972.5992
## 4          3     0 -972.4875
## 51         2     3 -971.7743
## 20         3     1 -971.3872
## 35         2     2 -971.2514
## 69         4     4 -971.0837
## 5          4     0 -970.5114
## 52         3     3 -970.4543
## 81         0     5 -969.9284
## 53         4     3 -969.5311
## 21         4     1 -969.4756
## 36         3     2 -969.3907
## 82         1     5 -968.6783
## 37         4     2 -967.4756
## 83         2     5 -966.8835
## 84         3     5 -965.6393
## 85         4     5 -963.9662
## 86         5     5 -962.9290
## 70         5     4 -961.2547
## 54         5     3 -960.9580
## 97         0     6 -960.7402
## 6          5     0 -960.6858
## 22         5     1 -959.8419
## 98         1     6 -959.6604
## 38         5     2 -957.8547
## 99         2     6 -957.7528
## 100        3     6 -956.7875
## 101        4     6 -955.2416
## 71         6     4 -954.8953
## 87         6     5 -954.6855
## 102        5     6 -954.3662
## 103        6     6 -954.0973
## 7          6     0 -954.0615
## 113        0     7 -953.9160
## 55         6     3 -953.2860
## 23         6     1 -953.1080
## 114        1     7 -952.6540
## 39         6     2 -951.1356
## 115        2     7 -950.6562
## 116        3     7 -949.6038
## 88         7     5 -949.2090
## 72         7     4 -948.5194
## 117        4     7 -947.7999
## 104        7     6 -947.7424
## 56         7     3 -947.6915
## 8          7     0 -947.5092
## 120        7     7 -947.3660
## 24         7     1 -947.0094
## 118        5     7 -946.9631
## 119        6     7 -946.8080
## 40         7     2 -945.0123
## 129        0     8 -943.9035
## 130        1     8 -942.6627
## 131        2     8 -940.6818
## 145        0     9 -940.0114
## 132        3     8 -939.6913
## 89         8     5 -939.1878
## 73         8     4 -938.5330
## 146        1     9 -938.2680
## 133        4     8 -937.8368
## 105        8     6 -937.6834
## 57         8     3 -937.6370
## 9          8     0 -937.5705
## 121        8     7 -937.5351
## 136        7     8 -937.3948
## 25         8     1 -937.0088
## 134        5     8 -936.9393
## 135        6     8 -936.8904
## 147        2     9 -936.3875
## 148        3     9 -936.3159
## 137        8     8 -935.5389
## 41         8     2 -935.0088
## 149        4     9 -934.3458
## 150        5     9 -934.1858
## 152        7     9 -934.0733
## 151        6     9 -932.9538
## 153        8     9 -932.3338
## 154        9     9 -930.9065
## 161        0    10 -929.8056
## 90         9     5 -929.2731
## 74         9     4 -928.5254
## 162        1    10 -928.1257
## 10         9     0 -927.9853
## 58         9     3 -927.9744
## 122        9     7 -927.9061
## 106        9     6 -927.6344
## 26         9     1 -927.4482
## 164        3    10 -926.5271
## 163        2    10 -926.2965
## 138        9     8 -926.1307
## 42         9     2 -925.4484
## 165        4    10 -924.5287
## 168        7    10 -924.2716
## 166        5    10 -924.0521
## 167        6    10 -922.7596
## 169        8    10 -922.5928
## 155       10     9 -921.2169
## 170        9    10 -921.1777
## 177        0    11 -920.2608
## 171       10    10 -920.0124
## 91        10     5 -919.0182
## 178        1    11 -918.7342
## 75        10     4 -918.4135
## 11        10     0 -917.8597
## 59        10     3 -917.7711
## 123       10     7 -917.6569
## 107       10     6 -917.3861
## 27        10     1 -917.2925
## 179        2    11 -916.9417
## 180        3    11 -916.8682
## 193        0    12 -916.1477
## 139       10     8 -915.9643
## 92        11     5 -915.3201
## 43        10     2 -915.2941
## 156       11     9 -915.0851
## 181        4    11 -914.8854
## 194        1    12 -914.4423
## 124       11     7 -914.3141
## 184        7    11 -914.1880
## 76        11     4 -914.1395
## 182        5    11 -914.0440
## 108       11     6 -913.4052
## 140       11     8 -913.3026
## 195        2    12 -913.1680
## 172       11    10 -913.0914
## 60        11     3 -912.7714
## 183        6    11 -912.7548
## 196        3    12 -912.5820
## 185        8    11 -912.5636
## 12        11     0 -912.2009
## 28        11     1 -912.0389
## 186        9    11 -911.1737
## 157       12     9 -911.1513
## 188       11    11 -911.1189
## 93        12     5 -910.7693
## 198        5    12 -910.7434
## 197        4    12 -910.6154
## 125       12     7 -910.5873
## 141       12     8 -910.0719
## 44        11     2 -910.0439
## 187       10    11 -909.9928
## 200        7    12 -909.4197
## 173       12    10 -909.2473
## 77        12     4 -909.1913
## 109       12     6 -908.7753
## 199        6    12 -908.7635
## 201        8    12 -908.1609
## 61        12     3 -908.0357
## 29        12     1 -907.8613
## 209        0    13 -907.6473
## 13        12     0 -907.6158
## 205       12    12 -907.5931
## 204       11    12 -907.5525
## 202        9    12 -907.3633
## 189       12    11 -907.3200
## 210        1    13 -906.1005
## 45        12     2 -905.9070
## 203       10    12 -905.7653
## 211        2    13 -904.7293
## 212        3    13 -903.9077
## 214        5    13 -902.0824
## 158       13     9 -901.9574
## 213        4    13 -901.9144
## 94        13     5 -901.6338
## 126       13     7 -901.3766
## 142       13     8 -900.9367
## 216        7    13 -900.5676
## 225        0    14 -900.5066
## 174       13    10 -900.1413
## 215        6    13 -900.1102
## 78        13     4 -900.0282
## 110       13     6 -899.6703
## 226        1    14 -899.0967
## 217        8    13 -899.0866
## 62        13     3 -898.8589
## 30        13     1 -898.7940
## 190       13    11 -898.4409
## 221       12    13 -898.4110
## 220       11    13 -898.3058
## 218        9    13 -898.2568
## 14        13     0 -898.2039
## 206       13    12 -897.9014
## 227        2    14 -897.3889
## 46        13     2 -896.8637
## 219       10    13 -896.6244
## 222       13    13 -896.4458
## 228        3    14 -896.2512
## 230        5    14 -895.1320
## 95        14     5 -894.6021
## 229        4    14 -894.3023
## 159       14     9 -894.2497
## 127       14     7 -893.9663
## 143       14     8 -893.6932
## 231        6    14 -893.4037
## 79        14     4 -893.1343
## 232        7    14 -893.1064
## 111       14     6 -892.6253
## 175       14    10 -892.5085
## 63        14     3 -891.9131
## 191       14    11 -891.1895
## 233        8    14 -891.1877
## 234        9    14 -891.1729
## 31        14     1 -890.7573
## 236       11    14 -890.5576
## 241        0    15 -890.5500
## 15        14     0 -890.3449
## 237       12    14 -890.1854
## 235       10    14 -889.8957
## 207       14    12 -889.7107
## 242        1    15 -889.0419
## 47        14     2 -888.9410
## 238       13    14 -888.1867
## 223       14    13 -887.7488
## 239       14    14 -887.6659
## 243        2    15 -887.3088
## 244        3    15 -886.0691
## 246        5    15 -884.7479
## 96        15     5 -884.2869
## 245        4    15 -884.1417
## 160       15     9 -883.9364
## 128       15     7 -883.6409
## 144       15     8 -883.4503
## 247        6    15 -883.0158
## 80        15     4 -882.8148
## 248        7    15 -882.7881
## 112       15     6 -882.3106
## 176       15    10 -882.2093
## 64        15     3 -881.6497
## 253       12    15 -881.4274
## 252       11    15 -881.3077
## 250        9    15 -881.1831
## 192       15    11 -880.9028
## 249        8    15 -880.8964
## 32        15     1 -880.5983
## 251       10    15 -880.2736
## 16        15     0 -880.2468
## 254       13    15 -879.4467
## 208       15    12 -879.4364
## 255       14    15 -879.2846
## 48        15     2 -878.8432
## 224       15    13 -877.4985
## 240       15    14 -877.4570
model.ardlDlmberganda = ardlDlm(formula = logprice ~ interest + logm1,
                        data = data.frame(data1) , p = 4 , q = 4)
summary(model.ardlDlmberganda)
## 
## Time series regression with "ts" data:
## Start = 5, End = 144
## 
## Call:
## dynlm(formula = as.formula(model.text), data = data)
## 
## Residuals:
##        Min         1Q     Median         3Q        Max 
## -0.0290527 -0.0075965  0.0005726  0.0072745  0.0304486 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  0.0145022  0.1822785   0.080  0.93671    
## interest.t   0.0067985  0.2135315   0.032  0.97465    
## interest.1   0.6093502  0.3240545   1.880  0.06238 .  
## interest.2   0.0798544  0.3221168   0.248  0.80461    
## interest.3  -0.3638172  0.3238873  -1.123  0.26347    
## interest.4   0.2084240  0.2447331   0.852  0.39604    
## logm1.t      0.0828689  0.0457486   1.811  0.07248 .  
## logm1.1     -0.0092841  0.0399079  -0.233  0.81642    
## logm1.2     -0.1166129  0.0390732  -2.984  0.00342 ** 
## logm1.3      0.0007016  0.0389297   0.018  0.98565    
## logm1.4      0.0447857  0.0425474   1.053  0.29455    
## logprice.1   0.3274245  0.0651574   5.025  1.7e-06 ***
## logprice.2   0.1323801  0.0684485   1.934  0.05537 .  
## logprice.3  -0.1448245  0.0674268  -2.148  0.03365 *  
## logprice.4   0.6730871  0.0636443  10.576  < 2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.01132 on 125 degrees of freedom
## Multiple R-squared:  0.9993, Adjusted R-squared:  0.9992 
## F-statistic: 1.273e+04 on 14 and 125 DF,  p-value: < 2.2e-16
#model with p = 0 for interest and p = 4 for logm1: drop interest lags 1-4 via 'remove'
rem.p = list(interest = c(1,2,3,4))
remove = list(p = rem.p)
model.ardlDlmberganda2 = ardlDlm(formula = logprice ~ interest + logm1,
                        data = data.frame(data1) , p = 4 , q = 4 ,
                        remove = remove)
summary(model.ardlDlmberganda2)
## 
## Time series regression with "ts" data:
## Start = 5, End = 144
## 
## Call:
## dynlm(formula = as.formula(model.text), data = data)
## 
## Residuals:
##        Min         1Q     Median         3Q        Max 
## -0.0290369 -0.0083445  0.0009024  0.0079199  0.0303652 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  0.174838   0.133708   1.308  0.19333    
## interest.t   0.448826   0.098736   4.546 1.24e-05 ***
## logm1.t      0.056659   0.043836   1.293  0.19849    
## logm1.1     -0.017025   0.039159  -0.435  0.66446    
## logm1.2     -0.118413   0.037399  -3.166  0.00193 ** 
## logm1.3     -0.006454   0.038112  -0.169  0.86580    
## logm1.4      0.060220   0.040337   1.493  0.13789    
## logprice.1   0.319059   0.062107   5.137 1.00e-06 ***
## logprice.2   0.111794   0.066101   1.691  0.09320 .  
## logprice.3  -0.122129   0.065114  -1.876  0.06297 .  
## logprice.4   0.699061   0.062611  11.165  < 2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.01149 on 129 degrees of freedom
## Multiple R-squared:  0.9993, Adjusted R-squared:  0.9992 
## F-statistic: 1.73e+04 on 10 and 129 DF,  p-value: < 2.2e-16

The remaining steps are the same as in the single-predictor modeling above; a sketch is given below.
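
For example (a sketch; these calls are not executed in the original document), the in-sample fit of the multiple-predictor models can be compared with the same tools used earlier:

#compare in-sample accuracy and information criteria of the fitted specifications
GoF(model.dlmberganda)
GoF(model.dlmberganda2)
AIC(model.ardlDlmberganda); AIC(model.ardlDlmberganda2)
#forecasting would additionally require new values of both interest and logm1; see ?dLagM::forecast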