Regression with Lagged Variables

Introduction

This study models the effect of Europe GDP on Adidas Revenue using several lagged-variable regression methods. The data come from kaggle.com and cover 2000 to 2017, with 69 quarterly observations. Data sources:

https://www.kaggle.com/datasets/kofi2614/adidas-quarterly-sales

https://www.kaggle.com/code/kofi2614/adidas-quarterly-sales-forecasting

Packages

The libraries used for regression with lagged variables are as follows:

dLagM, dynlm, MLmetrics, lmtest, car
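A minimal setup sketch, assuming these packages are already installed:

library(dLagM)      # koyckDlm(), dlm(), ardlDlm(), finiteDLMauto(), ardlBoundOrders(), forecast(), GoF()
library(dynlm)      # dynlm() for dynamic linear models
library(MLmetrics)  # MAPE()
library(lmtest)     # dwtest(), bptest()
library(car)        # additional regression diagnostics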

Data

Adidas AG is a multinational company, founded and headquartered in Herzogenaurach, Germany, that designs and manufactures shoes, clothing, and accessories.

adidas_sales <- readxl::read_xlsx("adidas_revenue1.xlsx")
str(adidas_sales)
## tibble [69 x 7] (S3: tbl_df/tbl/data.frame)
##  $ date       : chr [1:69] "2000Q1" "2000Q2" "2000Q3" "2000Q4" ...
##  $ Revenue    : num [1:69] 1517 1248 1677 1393 1558 ...
##  $ US GDP     : num [1:69] 12359095 1259253 12607676 12679338 12643283 ...
##  $ Europe GDP : num [1:69] 1728244 17496 17692586 17892533 18191398 ...
##  $ CHN GDP US : num [1:69] 2.58e+09 2.90e+09 3.11e+09 3.53e+09 2.91e+09 ...
##  $ Price Index: num [1:69] 2465 3264 3225 321 3294 ...
##  $ NIKE       : num [1:69] 21616 22727 26367 21987 21701 ...
t <- tibble::num(1:69)
Yt <- adidas_sales$Revenue
Xt <- adidas_sales$`Europe GDP`
data <- data.frame(t, Yt, Xt)
data
##     t   Yt       Xt
## 1   1 1517  1728244
## 2   2 1248    17496
## 3   3 1677 17692586
## 4   4 1393 17892533
## 5   5 1558 18191398
## 6   6 1368 18337128
## 7   7 1790 18458787
## 8   8 1396 18619357
## 9   9 1638 18787827
## 10 10 1507 18938006
## 11 11 1868 19148373
## 12 12 1510 19273414
## 13 13 1669 19338716
## 14 14 1392 19439955
## 15 15 1853 19698488
## 16 16 1353 19884921
## 17 17 1623 20101204
## 18 18 1401 20317648
## 19 19 1758 20461822
## 20 20 1078 20650153
## 21 21 1674 20769228
## 22 22 1516 21015592
## 23 23 1924 21245527
## 24 24 1522 21538395
## 25 25 2459 21791093
## 26 26 2428 22148485
## 27 27 2949 22402389
## 28 28 2248 22759027
## 29 29 2538 23140186
## 30 30 2400  2338311
## 31 31 2941 23623233
## 32 32 2420 23924261
## 33 33 2621 24161125
## 34 34 2521 24197352
## 35 35 3083 24116787
## 36 36 2574  2382319
## 37 37 2577  2317899
## 38 38 2457 23103118
## 39 39 2888 23208229
## 40 40 2458 23393294
## 41 41 2674 23490783
## 42 42 2917 23772072
## 43 43 3468 23967357
## 44 44 2931 24158493
## 45 45 3273  2439892
## 46 46 3064 24459589
## 47 47 3744 24545014
## 48 48 3241 24550806
## 49 49 3824 24592684
## 50 50 3517 24575753
## 51 51 4173 24617675
## 52 52 3369 24593375
## 53 53 3751 24621908
## 54 54 3383 24804379
## 55 55 3879 24929432
## 56 56 3391 25035416
## 57 57 3480 25233995
## 58 58 3400 25280244
## 59 59 4044 25458879
## 60 60 3610 25670729
## 61 61 4083 26001163
## 62 62 3907 26168871
## 63 63 4758 26370496
## 64 64 4167 26562397
## 65 65 4769 26729105
## 66 66 4199 26839678
## 67 67 5413 26995255
## 68 68 4687 27254219
## 69 69 5671 27451033
cor(Yt, Xt)
## [1] 0.4938515
train <- data[1:52, ]
test <- data[53:69, ]
# time series data
train.ts <- ts(train)
test.ts <- ts(test)
data.ts <- ts(data)

Koyck Model

The Koyck method rests on the assumption that the further a lag of the independent variable lies from the current period, the smaller its effect on the dependent variable. Koyck proposed a way to estimate dynamic distributed-lag models by assuming that all the β coefficients have the same sign. The Koyck model is the most common type of infinite distributed-lag model and is also known as the geometric lag.
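In symbols, a standard statement of the geometric (Koyck) lag, with notation matching the alpha, beta, and phi reported by koyckDlm below, is

$$Y_t = \alpha + \beta \sum_{i=0}^{\infty} \phi^{i} X_{t-i} + \varepsilon_t, \qquad 0 < \phi < 1,$$

and applying the Koyck transformation (subtracting $\phi Y_{t-1}$) gives the estimable form

$$Y_t = \alpha(1-\phi) + \beta X_t + \phi Y_{t-1} + v_t, \qquad v_t = \varepsilon_t - \phi \varepsilon_{t-1}.$$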

model.koyck <- dLagM::koyckDlm(x = train$Xt, y = train$Yt)
summary(model.koyck)
## 
## Call:
## "Y ~ (Intercept) + Y.1 + X.t"
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -817.068 -372.166    6.708  266.334  751.535 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 9.366e+01  6.303e+02   0.149    0.882    
## Y.1         8.305e-01  9.643e-02   8.612 2.66e-11 ***
## X.t         1.653e-05  3.732e-05   0.443    0.660    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 421.2 on 48 degrees of freedom
## Multiple R-Squared: 0.7282,  Adjusted R-squared: 0.7169 
## Wald test:  63.7 on 2 and 48 DF,  p-value: 3.108e-14 
## 
## Diagnostic tests:
## NULL
## 
##                             alpha         beta       phi
## Geometric coefficients:  552.4549 1.653479e-05 0.8304686

AIC and BIC

AIC (Akaike's Information Criterion) and BIC (Bayesian Information Criterion) are metrics used to measure model quality. The main difference is that BIC penalizes model complexity more heavily, because its penalty term grows with the sample size. The smaller the AIC/BIC value, the better the model.
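For a model with $k$ estimated parameters, maximized log-likelihood $\hat{\ell}$, and $n$ observations,

$$\mathrm{AIC} = 2k - 2\hat{\ell}, \qquad \mathrm{BIC} = k \ln(n) - 2\hat{\ell},$$

so BIC applies the heavier penalty whenever $\ln(n) > 2$, i.e. $n \ge 8$.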

AIC(model.koyck)
## [1] 766.0315
BIC(model.koyck)
## [1] 773.7588

Forecast

fore.koyck <- forecast(model = model.koyck, x=test$Xt, h=nrow(test));fore.koyck
## $forecasts
##  [1] 3298.625 3243.198 3199.236 3164.479 3138.897 3118.418 3104.364 3096.195
##  [9] 3094.875 3096.552 3101.278 3108.376 3117.027 3126.040 3136.098 3148.732
## [17] 3162.478
## 
## $call
## forecast.koyckDlm(model = model.koyck, x = test$Xt, h = nrow(test))
## 
## attr(,"class")
## [1] "forecast.koyckDlm" "dLagM"

Test-set MAPE

mape.koyck <- MLmetrics::MAPE(fore.koyck$forecasts, test$Yt)
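For reference, MLmetrics::MAPE returns the mean absolute percentage error as a proportion (multiply by 100 for a percentage):

$$\mathrm{MAPE} = \frac{1}{n} \sum_{t=1}^{n} \left| \frac{Y_t - \hat{Y}_t}{Y_t} \right|$$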

Training-set accuracy

mape_train <- GoF(model.koyck)["MAPE"]

c("MAPE_testing" = mape.koyck, "MAPE_training" = mape_train)
## $MAPE_testing
## [1] 0.2237812
## 
## $MAPE_training.MAPE
## [1] 0.1662669

Regression with Distributed Lag

In a lagged-variable regression, the lag is the time required for the explanatory variable X to affect the response variable Y. For these data, a lag of 4 is used so that the data become stationary.

Regression with Distributed Lag (lag = 4)
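The finite distributed-lag model with q = 4 fitted below takes the form

$$Y_t = \alpha + \beta_0 X_t + \beta_1 X_{t-1} + \beta_2 X_{t-2} + \beta_3 X_{t-3} + \beta_4 X_{t-4} + \varepsilon_t.$$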

model.dlm <- dLagM::dlm(x = train$Xt, y = train$Yt, q = 4)
summary(model.dlm)
## 
## Call:
## lm(formula = model.formula, data = design)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -1342.77  -615.36    62.09   532.73  1505.89 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)
## (Intercept) -4.251e+01  8.059e+02  -0.053    0.958
## x.t          3.039e-05  1.923e-05   1.581    0.121
## x.1          2.254e-05  1.933e-05   1.166    0.250
## x.2          1.957e-05  1.967e-05   0.995    0.326
## x.3          2.090e-05  1.802e-05   1.160    0.253
## x.4          2.800e-05  1.672e-05   1.675    0.101
## 
## Residual standard error: 747.9 on 42 degrees of freedom
## Multiple R-squared:  0.1857, Adjusted R-squared:  0.08872 
## F-statistic: 1.915 on 5 and 42 DF,  p-value: 0.1121
## 
## AIC and BIC values for the model:
##        AIC      BIC
## 1 779.0683 792.1667
AIC(model.dlm)
## [1] 779.0683
BIC(model.dlm)
## [1] 792.1667

Forecast

Forecasting 17 periods ahead (the length of the test set).

fore.dlm <- forecast(model = model.dlm, x=test$Xt, h=nrow(test))
fore.dlm
## $forecasts
##  [1] 2944.081 2950.196 2959.335 2968.861 2984.344 3000.023 3016.096 3034.585
##  [9] 3059.425 3081.144 3106.947 3133.443 3159.538 3179.321 3199.460 3221.858
## [17] 3243.700
## 
## $call
## forecast.dlm(model = model.dlm, x = test$Xt, h = nrow(test))
## 
## attr(,"class")
## [1] "forecast.dlm" "dLagM"

Test-set MAPE

mape.dlm <- MLmetrics::MAPE(fore.dlm$forecasts, test$Yt)

Training-set accuracy

mape_train <- GoF(model.dlm)["MAPE"]

c("MAPE_testing" = mape.dlm, "MAPE_training" = mape_train)
## $MAPE_testing
## [1] 0.2450151
## 
## $MAPE_training.MAPE
## [1] 0.2978644

Regression with the Optimum Distributed Lag

Determining the optimum lag

finiteDLMauto(formula = Yt ~ Xt,
              data = data.frame(train), q.min = 1, q.max = 4,
              model.type = "dlm", error.type = "AIC", trace = TRUE) # q.max sets the maximum lag considered
##   q - k    MASE      AIC      BIC   GMRAE    MBRAE R.Adj.Sq    Ljung-Box
## 4     4 1.55547 779.0683 792.1667 1.58680 -2.86536  0.08872 4.463593e-08
## 3     3 1.60085 795.1669 806.5178 1.48410 -2.62021  0.08128 3.094106e-08
## 2     2 1.64851 811.5620 821.1221 1.50498 -1.32231  0.05298 1.620961e-08
## 1     1 1.65392 826.7004 834.4277 1.43596  1.73952  0.06967 5.791333e-09

DLM with the optimum lag (q = 4, which gives the smallest AIC in the search above)

model.dlm2 <- dLagM::dlm(x = train$Xt, y = train$Yt, q = 4) # some lags are not significant, so the number of lags could be reduced
summary(model.dlm2)
## 
## Call:
## lm(formula = model.formula, data = design)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -1342.77  -615.36    62.09   532.73  1505.89 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)
## (Intercept) -4.251e+01  8.059e+02  -0.053    0.958
## x.t          3.039e-05  1.923e-05   1.581    0.121
## x.1          2.254e-05  1.933e-05   1.166    0.250
## x.2          1.957e-05  1.967e-05   0.995    0.326
## x.3          2.090e-05  1.802e-05   1.160    0.253
## x.4          2.800e-05  1.672e-05   1.675    0.101
## 
## Residual standard error: 747.9 on 42 degrees of freedom
## Multiple R-squared:  0.1857, Adjusted R-squared:  0.08872 
## F-statistic: 1.915 on 5 and 42 DF,  p-value: 0.1121
## 
## AIC and BIC values for the model:
##        AIC      BIC
## 1 779.0683 792.1667
AIC(model.dlm2)
## [1] 779.0683
BIC(model.dlm2)
## [1] 792.1667

Forecast

(fore.dlm2 <- forecast(model = model.dlm2, x=test$Xt,  h=nrow(test)))
## $forecasts
##  [1] 2944.081 2950.196 2959.335 2968.861 2984.344 3000.023 3016.096 3034.585
##  [9] 3059.425 3081.144 3106.947 3133.443 3159.538 3179.321 3199.460 3221.858
## [17] 3243.700
## 
## $call
## forecast.dlm(model = model.dlm2, x = test$Xt, h = nrow(test))
## 
## attr(,"class")
## [1] "forecast.dlm" "dLagM"

Test-set accuracy

mape.dlm2 <- MAPE(fore.dlm2$forecasts, test$Yt)

Training-set accuracy

mape_train <- GoF(model.dlm2)["MAPE"]

c("MAPE_testing" = mape.dlm2, "MAPE_training" = mape_train)
## $MAPE_testing
## [1] 0.2450151
## 
## $MAPE_training.MAPE
## [1] 0.2978644

Autoregressive / Dynamic Regression Model

When the dependent variable is influenced by the independent variable at the current time, and also by its own value one period earlier, the model is called autoregressive (Gujarati, 2004).
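For the ARDL(p = 1, q = 1) specification fitted below, with one lag of X and one lag of Y, the model is

$$Y_t = \beta_0 + \beta_1 X_t + \beta_2 X_{t-1} + \gamma_1 Y_{t-1} + \varepsilon_t.$$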

Autoregressive Model

model.ardl <- ardlDlm(x = train$Xt, y = train$Yt, p = 1, q = 1) # p: lags of x, q: lags of y
# model for p = 1, q = 1: yt = b0 + b1*xt + b2*xt-1 + b3*yt-1
# model for p = 2, q = 3: yt = b0 + b1*xt + b2*xt-1 + b3*xt-2 + b4*yt-1 + b5*yt-2 + b6*yt-3
summary(model.ardl)
## 
## Time series regression with "ts" data:
## Start = 2, End = 52
## 
## Call:
## dynlm(formula = as.formula(model.text), data = data, start = 1)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -811.77 -364.97    5.02  263.04  762.53 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 1.547e+02  2.573e+02   0.601    0.551    
## X.t         1.176e-05  9.969e-06   1.179    0.244    
## X.1         1.166e-06  9.506e-06   0.123    0.903    
## Y.1         8.353e-01  8.031e-02  10.401 8.86e-14 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 424.6 on 47 degrees of freedom
## Multiple R-squared:  0.7295, Adjusted R-squared:  0.7122 
## F-statistic: 42.25 on 3 and 47 DF,  p-value: 2.164e-13
AIC(model.ardl)
## [1] 767.7831
BIC(model.ardl)
## [1] 777.4422
# forecast
(fore.ardl <- forecast(model = model.ardl, x=test$Xt, h=nrow(test)))
## $forecasts
##  [1] 3286.923 3220.545 3166.783 3123.270 3089.383 3061.853 3041.013 3026.304
##  [9] 3018.151 3013.699 3012.546 3014.075 3017.536 3021.922 3027.543 3035.465
## [17] 3044.699
## 
## $call
## forecast.ardlDlm(model = model.ardl, x = test$Xt, h = nrow(test))
## 
## attr(,"class")
## [1] "forecast.ardlDlm" "dLagM"
# test-set accuracy
mape.ardl <- MAPE(fore.ardl$forecasts, test$Yt)

# training-set accuracy
mape_train <- GoF(model.ardl)["MAPE"]

c("MAPE_testing" = mape.ardl, "MAPE_training" = mape_train)
## $MAPE_testing
## [1] 0.2407311
## 
## $MAPE_training.MAPE
## [1] 0.1666482
# determining the optimum lag
# Ideally this would use the training data, but with so few training observations the call errors,
# so the full data set is used here for illustration.
ardlBoundOrders(data = data.frame(data), formula = Yt ~ Xt)
## $p
##   Xt
## 1 15
## 
## $q
## [1] 15
## 
## $Stat.table
##           q = 1    q = 2    q = 3    q = 4    q = 5    q = 6    q = 7    q = 8
## p = 1  954.5416 940.6745 904.5394 865.3491 854.7342 839.4851 828.6776 818.4445
## p = 2  941.2032 940.4905 902.0090 864.4919 853.7601 840.1122 829.3713 819.2136
## p = 3  928.1338 928.1338 903.8648 866.1109 855.5447 841.9730 831.2509 821.0971
## p = 4  903.4171 891.9124 891.9124 867.5267 857.1795 843.3339 832.5383 822.3312
## p = 5  902.5731 903.9673 857.0853 857.0853 858.9224 845.3121 834.2635 824.0742
## p = 6  890.7567 889.7946 857.7237 845.3476 845.3476 843.4860 833.1307 822.2323
## p = 7  874.9032 876.1999 854.9039 832.2761 833.1998 833.1998 835.1218 824.1184
## p = 8  862.4467 856.6275 845.5854 822.2908 823.9129 824.1138 824.1138 826.1130
## p = 9  854.2452 852.7079 829.8962 809.2262 810.7606 809.9536 811.8702 811.8702
## p = 10 842.5354 840.2159 821.2097 799.9439 801.0656 798.9582 800.9495 802.9174
## p = 11 829.1223 827.6746 811.2130 784.8965 786.1371 787.0185 788.9000 790.7546
## p = 12 811.7294 806.6702 796.8762 771.4805 772.9244 773.6469 774.0631 775.9573
## p = 13 807.0018 804.2417 788.1384 761.0920 763.0251 763.2473 765.1949 766.8428
## p = 14 791.2461 788.6048 773.5069 749.2909 750.6720 750.0761 751.9054 753.8542
## p = 15 779.2604 776.6099 760.3901 737.2604 738.8309 739.4832 741.4593 743.4517
##           q = 9   q = 10   q = 11   q = 12   q = 13   q = 14   q = 15
## p = 1  807.8297 796.2798 781.6098 770.7787 754.4564 744.0946 725.8765
## p = 2  808.6307 796.7684 781.3867 770.6970 754.1918 744.1921 727.6667
## p = 3  810.4965 798.4886 783.3028 772.6040 756.1880 746.1880 729.1028
## p = 4  811.7534 799.3732 783.0763 772.3404 756.8869 746.8933 729.9159
## p = 5  813.4520 801.1937 784.7335 773.9950 758.5404 748.5293 731.9024
## p = 6  811.3980 800.4113 782.6403 771.8496 756.7794 746.7917 727.3469
## p = 7  812.8256 801.8697 784.6372 773.8491 758.7129 748.7263 729.2656
## p = 8  814.6409 803.8543 786.5031 775.6349 759.5051 749.4609 729.1648
## p = 9  813.6937 802.2731 785.8271 775.3346 760.5529 750.6559 730.9728
## p = 10 802.9174 803.7482 787.8218 777.1813 762.4941 752.5980 732.4638
## p = 11 792.6236 792.6236 788.5312 778.3593 764.3098 754.3979 734.4347
## p = 12 776.5555 777.8825 777.8825 779.8321 766.1281 756.2672 736.4304
## p = 13 768.3456 770.3280 768.2588 768.2588 765.7722 755.6029 737.8025
## p = 14 752.2823 754.0059 753.7805 753.8531 753.8531 755.3039 735.3068
## p = 15 743.8820 744.6411 741.1251 742.1615 743.7716 743.7716 736.3578
## 
## $min.Stat
## [1] 725.8765

DLM and ARDL Modelling with the dynlm Library

Equivalent to a distributed-lag model with one lag of X (lag order 1).

cons_lm1 <- dynlm(Yt ~ Xt+L(Xt),data = train.ts)
summary(cons_lm1)
## 
## Time series regression with "ts" data:
## Start = 2, End = 52
## 
## Call:
## dynlm(formula = Yt ~ Xt + L(Xt), data = train.ts)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -1274.41  -678.71   -28.62   533.54  1623.05 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)   
## (Intercept) 1.347e+03  4.142e+02   3.253  0.00209 **
## Xt          2.472e-05  1.778e-05   1.390  0.17094   
## L(Xt)       2.418e-05  1.662e-05   1.454  0.15234   
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 763.5 on 48 degrees of freedom
## Multiple R-squared:  0.1069, Adjusted R-squared:  0.06967 
## F-statistic: 2.872 on 2 and 48 DF,  p-value: 0.06634

Equivalent to an ARDL model with p = 0, q = 1 (current X plus one lag of Y).

cons_lm2 <- dynlm(Yt ~ Xt+L(Yt),data = train.ts)
summary(cons_lm2)
## 
## Time series regression with "ts" data:
## Start = 2, End = 52
## 
## Call:
## dynlm(formula = Yt ~ Xt + L(Yt), data = train.ts)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -809.55 -363.26    5.84  253.93  764.26 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 1.658e+02  2.383e+02   0.696    0.490    
## Xt          1.207e-05  9.537e-06   1.266    0.212    
## L(Yt)       8.376e-01  7.730e-02  10.836 1.71e-14 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 420.2 on 48 degrees of freedom
## Multiple R-squared:  0.7294, Adjusted R-squared:  0.7181 
## F-statistic:  64.7 on 2 and 48 DF,  p-value: 2.372e-14

Equivalent to the ARDL model with p = 1, q = 1 fitted above.

cons_lm3 <- dynlm(Yt ~ Xt+L(Xt)+L(Yt),data = train.ts)
summary(cons_lm3)
## 
## Time series regression with "ts" data:
## Start = 2, End = 52
## 
## Call:
## dynlm(formula = Yt ~ Xt + L(Xt) + L(Yt), data = train.ts)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -811.77 -364.97    5.02  263.04  762.53 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 1.547e+02  2.573e+02   0.601    0.551    
## Xt          1.176e-05  9.969e-06   1.179    0.244    
## L(Xt)       1.166e-06  9.506e-06   0.123    0.903    
## L(Yt)       8.353e-01  8.031e-02  10.401 8.86e-14 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 424.6 on 47 degrees of freedom
## Multiple R-squared:  0.7295, Adjusted R-squared:  0.7122 
## F-statistic: 42.25 on 3 and 47 DF,  p-value: 2.164e-13

Equivalent to a distributed-lag model with two lags of X (lag order 2).

cons_lm4 <- dynlm(Yt ~ Xt+L(Xt)+L(Xt,2),data = train.ts)
summary(cons_lm4)
## 
## Time series regression with "ts" data:
## Start = 3, End = 52
## 
## Call:
## dynlm(formula = Yt ~ Xt + L(Xt) + L(Xt, 2), data = train.ts)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -1296.8  -644.4   -11.1   586.7  1618.8 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)  
## (Intercept) 1.038e+03  5.532e+02   1.877   0.0669 .
## Xt          2.650e-05  1.937e-05   1.369   0.1778  
## L(Xt)       1.553e-05  1.817e-05   0.854   0.3973  
## L(Xt, 2)    2.320e-05  1.682e-05   1.379   0.1745  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 763.9 on 46 degrees of freedom
## Multiple R-squared:  0.111,  Adjusted R-squared:  0.05298 
## F-statistic: 1.914 on 3 and 46 DF,  p-value: 0.1406

SSE (Sum of Squared Errors)

deviance(cons_lm1)
## [1] 27977129
deviance(cons_lm2)
## [1] 8476094
deviance(cons_lm3)
## [1] 8473380
deviance(cons_lm4)
## [1] 26839530

Model Diagnostic Tests

Non-Autocorrelation Test

dwtest(cons_lm1)
## 
##  Durbin-Watson test
## 
## data:  cons_lm1
## DW = 0.39148, p-value = 1.235e-13
## alternative hypothesis: true autocorrelation is greater than 0
dwtest(cons_lm2)
## 
##  Durbin-Watson test
## 
## data:  cons_lm2
## DW = 3.1661, p-value = 1
## alternative hypothesis: true autocorrelation is greater than 0
dwtest(cons_lm3)
## 
##  Durbin-Watson test
## 
## data:  cons_lm3
## DW = 3.1601, p-value = 1
## alternative hypothesis: true autocorrelation is greater than 0
dwtest(cons_lm4)
## 
##  Durbin-Watson test
## 
## data:  cons_lm4
## DW = 0.42892, p-value = 2.654e-12
## alternative hypothesis: true autocorrelation is greater than 0

Heteroscedasticity Test

bptest(cons_lm1)
## 
##  studentized Breusch-Pagan test
## 
## data:  cons_lm1
## BP = 0.078215, df = 2, p-value = 0.9616
bptest(cons_lm2)
## 
##  studentized Breusch-Pagan test
## 
## data:  cons_lm2
## BP = 4.9342, df = 2, p-value = 0.08483
bptest(cons_lm3)
## 
##  studentized Breusch-Pagan test
## 
## data:  cons_lm3
## BP = 6.3667, df = 3, p-value = 0.09507
bptest(cons_lm4)
## 
##  studentized Breusch-Pagan test
## 
## data:  cons_lm4
## BP = 0.91056, df = 3, p-value = 0.8229

Normality Test

# Shapiro-Wilk test
shapiro.test(residuals(cons_lm1))
## 
##  Shapiro-Wilk normality test
## 
## data:  residuals(cons_lm1)
## W = 0.9486, p-value = 0.02756
shapiro.test(residuals(cons_lm2))
## 
##  Shapiro-Wilk normality test
## 
## data:  residuals(cons_lm2)
## W = 0.97093, p-value = 0.2416
shapiro.test(residuals(cons_lm3))
## 
##  Shapiro-Wilk normality test
## 
## data:  residuals(cons_lm3)
## W = 0.97168, p-value = 0.2595
shapiro.test(residuals(cons_lm4))
## 
##  Shapiro-Wilk normality test
## 
## data:  residuals(cons_lm4)
## W = 0.95596, p-value = 0.06022

Forecast Accuracy Comparison

akurasi <- matrix(c(mape.koyck, mape.dlm, mape.dlm2, mape.ardl))
row.names(akurasi) <- c("Koyck", "DLM 1", "DLM 2", "Autoregressive")
colnames(akurasi) <- c("MAPE")
akurasi
##                     MAPE
## Koyck          0.2237812
## DLM 1          0.2450151
## DLM 2          0.2450151
## Autoregressive 0.2407311
# plot of actual values and forecasts on the test set
par(mfrow = c(1, 1))
plot(test$Xt, test$Yt, type = "b", col = "black",
     ylim = range(c(test$Yt, fore.koyck$forecasts, fore.dlm$forecasts,
                    fore.dlm2$forecasts, fore.ardl$forecasts)))
points(test$Xt, fore.koyck$forecasts,col="red")
lines(test$Xt, fore.koyck$forecasts,col="red")
points(test$Xt, fore.dlm$forecasts,col="blue")
lines(test$Xt, fore.dlm$forecasts,col="blue")
points(test$Xt, fore.dlm2$forecasts,col="orange")
lines(test$Xt, fore.dlm2$forecasts,col="orange")
points(test$Xt, fore.ardl$forecasts,col="green")
lines(test$Xt, fore.ardl$forecasts,col="green")
legend("topleft",c("aktual", "koyck","DLM 1","DLM 2", "autoregressive"), lty=1, col=c("black","red","blue","orange","green"), cex=0.8)

Among the methods tested, the Koyck model has the smallest test-set MAPE and its forecast curve tracks the actual data most closely, so the Koyck model is the most suitable choice.