Regression with Lagged Variables

Author

Windi Pangesti

Published

September 21, 2025

Lagged variable regression is a form of regression analysis in which past values of a variable (its lags) are used as predictors of its current value.

1 Packages
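
The chunk that loads the packages is not echoed in this rendered output. Below is a minimal sketch of the library calls implied by the startup messages that follow (rio is called later via rio::import(), so it does not need to be attached):

library(dLagM)      # koyckDlm(), dlm(), ardlDlm(), finiteDLMauto(), ardlBoundOrders(), GoF()
library(dynlm)      # dynlm(); also attached automatically by dLagM
library(MLmetrics)  # MAPE()
library(lmtest)     # dwtest(), bptest(), encomptest()
library(car)        # attached in the original session for regression diagnostics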

Warning: package 'dLagM' was built under R version 4.5.1
Loading required package: nardl
Warning: package 'nardl' was built under R version 4.5.1
Registered S3 method overwritten by 'quantmod':
  method            from
  as.zoo.data.frame zoo 
Loading required package: dynlm
Warning: package 'dynlm' was built under R version 4.5.1
Loading required package: zoo

Attaching package: 'zoo'
The following objects are masked from 'package:base':

    as.Date, as.Date.numeric
Warning: package 'MLmetrics' was built under R version 4.5.1

Attaching package: 'MLmetrics'
The following object is masked from 'package:dLagM':

    MAPE
The following object is masked from 'package:base':

    Recall
Warning: package 'lmtest' was built under R version 4.5.1
Warning: package 'car' was built under R version 4.5.1
Loading required package: carData

2 Data Import

data <- rio::import("https://raw.githubusercontent.com/rizkynurhambali/praktikum-sta1341/main/Pertemuan%203/Data%20Asli.csv")
str(data)
'data.frame':   20 obs. of  4 variables:
 $ t     : int  1 2 3 4 5 6 7 8 9 10 ...
 $ Yt    : num  52.9 53.8 54.9 58.2 60 63.4 68.2 78 84.7 90.6 ...
 $ Y(t-1): num  NA 52.9 53.8 54.9 58.2 60 63.4 68.2 78 84.7 ...
 $ Xt    : num  30.3 30.9 30.9 33.4 35.1 37.3 41 44.9 46.5 50.3 ...
data

3 Data Splitting

#SPLIT DATA: first 15 observations for training, last 5 for testing
train <- data[1:15,]
test <- data[16:20,]
#convert to time series objects
train.ts <- ts(train)
test.ts <- ts(test)
data.ts <- ts(data)

4 Koyck Model

The Koyck model is based on the assumption that the further a lag of the independent variable lies from the current period, the smaller its effect on the dependent variable.

Koyck proposed a method for estimating a dynamic distributed lag model by assuming that all of the \(\beta\) coefficients have the same sign.

The Koyck model is the most common form of the infinite distributed lag model and is also known as the geometric lag model.

\[ Y_t=\alpha(1-\lambda)+\beta_0X_t+\lambda Y_{t-1}+V_t \]

with \[V_t=u_t-\lambda u_{t-1}\]
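
This form follows from applying the Koyck transformation to the infinite geometric distributed lag model; a brief sketch of the derivation:

\[ Y_t=\alpha+\beta_0\sum_{i=0}^{\infty}\lambda^iX_{t-i}+u_t, \qquad 0<\lambda<1 \]

Lagging by one period, multiplying by \(\lambda\), and subtracting eliminates the infinite sum:

\[ Y_t-\lambda Y_{t-1}=\alpha(1-\lambda)+\beta_0X_t+u_t-\lambda u_{t-1} \]

which is exactly the Koyck form above with \(V_t=u_t-\lambda u_{t-1}\).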

4.1 Modeling

The Koyck model can be fitted in R with dLagM::koyckDlm(). The general usage of koyckDlm() is as follows.

koyckDlm(x , y , intercept)

The koyckDlm() function fits a distributed lag model with the Koyck transformation for a single predictor. The x and y values do not need to be time series (ts) objects. Setting intercept to TRUE includes an intercept in the model.

#MODEL KOYCK
model.koyck <- koyckDlm(x = train$Xt, y = train$Yt)
summary(model.koyck)

Call:
"Y ~ (Intercept) + Y.1 + X.t"

Residuals:
     Min       1Q   Median       3Q      Max 
-3.75605 -1.16407  0.01599  1.17295  3.28003 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept)  -3.7335     2.1785  -1.714   0.1146    
Y.1           0.4214     0.1158   3.639   0.0039 ** 
X.t           1.1510     0.1901   6.055 8.25e-05 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 2.023 on 11 degrees of freedom
Multiple R-Squared: 0.9934, Adjusted R-squared: 0.9922 
Wald test: 828.9 on 2 and 11 DF,  p-value: 1.01e-12 

Diagnostic tests:
NULL

                             alpha     beta       phi
Geometric coefficients:  -6.452844 1.150951 0.4214181
AIC(model.koyck)
[1] 64.08525
BIC(model.koyck)
[1] 66.64148

From these results, the variables \(x_t\) and \(y_{t-1}\) have \(p\)-values \(<0.05\), indicating that \(x_t\) and \(y_{t-1}\) have a significant effect on \(y_t\). The overall model is as follows

\[ \hat{Y_t}=-3.7335+1.1510X_t+0.4214Y_{t-1} \]

4.2 Forecasting and Accuracy

Below are the forecasts of \(y\) for the next 5 periods using the Koyck model.

fore.koyck <- forecast(model = model.koyck, x=test$Xt, h=5)
fore.koyck
$forecasts
[1] 146.4180 157.6420 176.5288 198.1843 223.3085

$call
forecast.koyckDlm(model = model.koyck, x = test$Xt, h = 5)

attr(,"class")
[1] "forecast.koyckDlm" "dLagM"            
mape.koyck <- MAPE(fore.koyck$forecasts, test$Yt)
mape.koyck # test-set accuracy
[1] 0.06845456
# training-set accuracy
GoF(model.koyck)

5 Regression with Distributed Lag

The Regression with Distributed Lag model can be fitted in R with dLagM::dlm(). The general usage of dlm() is as follows.

dlm(formula , data , x , y , q , remove )

The dlm() function fits a distributed lag model with one or more predictors. The x and y values do not need to be time series (ts) objects. \(q\) is an integer giving the finite lag length.
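
For a single predictor and finite lag length \(q\), this model can be written as

\[ Y_t=\alpha+\beta_0X_t+\beta_1X_{t-1}+\dots+\beta_qX_{t-q}+u_t \]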

5.1 Modeling (Lag = 1)

model.dlm <- dlm(x = train$Xt,y = train$Yt , q = 1)
summary(model.dlm)

Call:
lm(formula = model.formula, data = design)

Residuals:
    Min      1Q  Median      3Q     Max 
-1.4118 -0.8790 -0.3542  0.7202  2.3047 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept)  -9.4870     1.6123  -5.884 0.000106 ***
x.t           0.2557     0.1632   1.567 0.145434    
x.1           1.8395     0.1927   9.547 1.17e-06 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.277 on 11 degrees of freedom
Multiple R-squared:  0.9974,    Adjusted R-squared:  0.9969 
F-statistic:  2081 on 2 and 11 DF,  p-value: 6.538e-15

AIC and BIC values for the model:
       AIC      BIC
1 51.20728 53.76351
AIC(model.dlm)
[1] 51.20728
BIC(model.dlm)
[1] 53.76351

From the results above, the \(p\)-values of the intercept and \(x_{t-1}\) are \(<0.05\), indicating that the intercept and \(x_{t-1}\) have a significant effect on \(y_t\). The fitted model is as follows

\[ \hat{Y_t}=-9.4870+0.2557X_t+1.8395X_{t-1} \]

5.2 Forecasting and Accuracy

Below are the forecasts of \(y\) for the next 5 periods.

fore.dlm <- forecast(model = model.dlm, x=test$Xt, h=5)
fore.dlm
$forecasts
[1] 146.4833 168.6500 175.1065 200.7757 226.2206

$call
forecast.dlm(model = model.dlm, x = test$Xt, h = 5)

attr(,"class")
[1] "forecast.dlm" "dLagM"       
mape.dlm <- MAPE(fore.dlm$forecasts, test$Yt)
mape.dlm
[1] 0.08502714
# training-set accuracy
GoF(model.dlm)

5.3 Optimum Lag

# determine the optimum lag
finiteDLMauto(formula = Yt ~ Xt,
              data = data.frame(train), q.min = 1, q.max = 10,
              model.type = "dlm", error.type = "AIC", trace = TRUE)

Based on this output, the optimum lag is obtained at lag = 6. The model is then refitted with lag = 6.

# dlm model with the optimum lag
model.dlm2 <- dlm(x = train$Xt,y = train$Yt , q = 6)
summary(model.dlm2)

Call:
lm(formula = model.formula, data = design)

Residuals:
        1         2         3         4         5         6         7         8 
-0.023415  0.014375  0.032641 -0.010695 -0.013096 -0.014927  0.010054  0.010809 
        9 
-0.005747 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)  
(Intercept) 21.42223    1.44472  14.828   0.0429 *
x.t          1.68749    0.04758  35.466   0.0179 *
x.1         -1.23901    0.11688 -10.600   0.0599 .
x.2          0.97604    0.02787  35.021   0.0182 *
x.3         -0.23945    0.02746  -8.719   0.0727 .
x.4          3.24431    0.14678  22.103   0.0288 *
x.5          0.08560    0.04903   1.746   0.3312  
x.6         -3.47612    0.15157 -22.933   0.0277 *
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 0.05079 on 1 degrees of freedom
Multiple R-squared:      1, Adjusted R-squared:      1 
F-statistic: 1.277e+05 on 7 and 1 DF,  p-value: 0.002155

AIC and BIC values for the model:
        AIC       BIC
1 -29.87368 -28.09865
AIC(model.dlm2)
[1] -29.87368
BIC(model.dlm2)
[1] -28.09865

From these results, several variables are significant at the 5% level: the intercept, \(x_t\), \(x_{t-2}\), \(x_{t-4}\), and \(x_{t-6}\). The full fitted model is

\[ \hat{Y_t}=21.42223+1.68749X_t+...-3.47612X_{t-6} \]

The forecasts for the next 5 periods using this model are as follows

# forecasting and accuracy
fore.dlm2 <- forecast(model = model.dlm2, x=test$Xt, h=5)

# test-set accuracy
mape.dlm2<- MAPE(fore.dlm2$forecasts, test$Yt)
mape.dlm2
[1] 0.1511753
# training-set accuracy
GoF(model.dlm2)

With a test-set MAPE of about 15%, this model still forecasts reasonably well, but it does not reach the below-10% MAPE achieved by the Koyck and DLM 1 models.

6 Autoregressive Model

When the dependent variable is influenced by the independent variable at the current time and also by the dependent variable itself at the previous time, the model is called autoregressive (Gujarati 2004).

6.1 Modeling

Autoregressive distributed lag modeling is done with dLagM::ardlDlm(). This function fits an ARDL model of order \((p,q)\) with a single predictor. The general usage of ardlDlm() is as follows.

ardlDlm(formula = NULL , data = NULL , x = NULL , y = NULL , p = 1 , q = 1 , 
         remove = NULL )

Here \(p\) is an integer giving the finite lag length of the predictor and \(q\) is an integer giving the order of the autoregressive process.
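
Under this notation, the model fitted by ardlDlm() with a single predictor can be written as

\[ Y_t=\alpha+\beta_0X_t+\beta_1X_{t-1}+\dots+\beta_pX_{t-p}+\phi_1Y_{t-1}+\dots+\phi_qY_{t-q}+\varepsilon_t \]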

model.ardl <- ardlDlm(x = train$Xt, y = train$Yt, p = 1 , q = 1)
summary(model.ardl)

Time series regression with "ts" data:
Start = 2, End = 15

Call:
dynlm(formula = as.formula(model.text), data = data, start = 1)

Residuals:
    Min      1Q  Median      3Q     Max 
-1.6274 -0.8401 -0.1767  0.8392  1.9447 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)   
(Intercept)  -8.3594     1.9186  -4.357  0.00143 **
X.t           0.3563     0.1875   1.900  0.08661 . 
X.1           1.4557     0.4071   3.575  0.00505 **
Y.1           0.1408     0.1318   1.068  0.31055   
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.269 on 10 degrees of freedom
Multiple R-squared:  0.9976,    Adjusted R-squared:  0.9969 
F-statistic:  1405 on 3 and 10 DF,  p-value: 2.006e-13
AIC(model.ardl)
[1] 51.69462
BIC(model.ardl)
[1] 54.8899
model.ardl <- ardlDlm(formula = Yt ~ Xt, 
                         data = train,p = 1 , q = 1)
summary(model.ardl)

Time series regression with "ts" data:
Start = 2, End = 15

Call:
dynlm(formula = as.formula(model.text), data = data)

Residuals:
    Min      1Q  Median      3Q     Max 
-1.6274 -0.8401 -0.1767  0.8392  1.9447 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)   
(Intercept)  -8.3594     1.9186  -4.357  0.00143 **
Xt.t          0.3563     0.1875   1.900  0.08661 . 
Xt.1          1.4557     0.4071   3.575  0.00505 **
Yt.1          0.1408     0.1318   1.068  0.31055   
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.269 on 10 degrees of freedom
Multiple R-squared:  0.9976,    Adjusted R-squared:  0.9969 
F-statistic:  1405 on 3 and 10 DF,  p-value: 2.006e-13
AIC(model.ardl)
[1] 51.69462
BIC(model.ardl)
[1] 54.8899

The results above show that only the intercept and \(x_{t-1}\) have \(p\)-values \(<0.05\); the \(p\)-values for \(x_t\) and \(y_{t-1}\) are \(\ge0.05\). In other words, \(x_{t-1}\) has a significant effect on \(y_t\), while \(x_t\) and \(y_{t-1}\) do not (at the 5% level). The full model is as follows:

\[ \hat{Y}=-8.3594+0.3563X_t+1.4557X_{t-1}+0.1408Y_{t-1} \]

6.2 Forecasting and Accuracy

fore.ardl <- forecast(model = model.ardl, x=test$Xt, h=5)
fore.ardl
$forecasts
[1] 145.6865 166.4608 176.3897 199.9337 225.5255

$call
forecast.ardlDlm(model = model.ardl, x = test$Xt, h = 5)

attr(,"class")
[1] "forecast.ardlDlm" "dLagM"           

The output above gives the forecasts for the next 5 periods using the autoregressive model with \(p=1\) and \(q=1\).

mape.ardl <- MAPE(fore.ardl$forecasts, test$Yt)
mape.ardl
[1] 0.08313896
# training-set accuracy
GoF(model.ardl)

Based on the accuracy measures above, the training and test MAPE values are not far apart. This indicates that this lagged regression model is neither overfitted nor underfitted.

6.3 Optimum Lag

# determine the optimum lag orders
model.ardl.opt <- ardlBoundOrders(data = data.frame(data), ic = "AIC", 
                                  formula = Yt ~ Xt )
model.ardl.opt
$p
  Xt
1  6

$q
[1] 1

$Stat.table
          q = 1     q = 2     q = 3     q = 4    q = 5    q = 6 q = 7 q = 8
p = 1 112.18269 109.52534 105.52217 101.63734 95.28841 88.62221    NA    NA
p = 2 107.53473 107.27801 104.24754  98.64030 82.80884 78.24396    NA    NA
p = 3 104.39360 104.39360 106.01978  99.61104 80.50702       NA    NA    NA
p = 4  96.84811  96.30318  96.30318  92.11385 81.60323       NA    NA    NA
p = 5  86.53889  84.38162  71.12124  71.12124       NA       NA    NA    NA
p = 6 -20.56587        NA        NA        NA       NA       NA    NA    NA
      q = 9 q = 10 q = 11 q = 12 q = 13 q = 14 q = 15
p = 1    NA     NA     NA     NA     NA     NA     NA
p = 2    NA     NA     NA     NA     NA     NA     NA
p = 3    NA     NA     NA     NA     NA     NA     NA
p = 4    NA     NA     NA     NA     NA     NA     NA
p = 5    NA     NA     NA     NA     NA     NA     NA
p = 6    NA     NA     NA     NA     NA     NA     NA

$min.Stat
[1] -20.56587
# smallest AIC in each of the first six columns (q = 1, ..., 6) of Stat.table;
# columns containing NA return NA
min_p <- c()
for(i in 1:6){
  min_p[i] <- min(model.ardl.opt$Stat.table[[i]])
}
# column (q) and row (p) holding the overall minimum AIC
q_opt <- which(min_p == min(min_p, na.rm = TRUE))
p_opt <- which(model.ardl.opt$Stat.table[[q_opt]] == 
                 min(model.ardl.opt$Stat.table[[q_opt]], na.rm = TRUE))
data.frame("q_optimum" = q_opt, "p_optimum" = p_opt, 
           "AIC" = model.ardl.opt$min.Stat)

From the table above, the lowest AIC, -20.56587, is obtained at \(p=6\) and \(q=1\). In other words, the optimum autoregressive model is obtained with \(p=6\) and \(q=1\).

The model can then be refitted using the optimum \(p\) and \(q\) values identified in the previous step.

7 DLM & ARDL Modeling with the dynlm Library

Lagged variable regression is not limited to the functions in the dLagM package; the dynlm package can also be used. The general usage of dynlm() is as follows.

dynlm(formula, data, subset, weights, na.action, method = "qr",
  model = TRUE, x = FALSE, y = FALSE, qr = TRUE, singular.ok = TRUE,
  contrasts = NULL, offset, start = NULL, end = NULL, ...)

To specify the model formula, additional functions are available that make it easy to specify dynamics (via d() and L()) or linear/cyclical patterns (via trend(), season(), and harmon()). All of these formula functions require their arguments to be time series objects (i.e., "ts" or "zoo").

L() : a function that takes lagged values of a variable.

L(Xt) = Xt in the previous period (lag 1).

L(Xt, 2) = Xt two periods earlier (lag 2).

#same as the dlm model with q=1
cons_lm1 <- dynlm(Yt ~ Xt+L(Xt),data = train.ts)

#ARDL with one lag of Y and no lag of X
cons_lm2 <- dynlm(Yt ~ Xt+L(Yt),data = train.ts)

#same as the ardl model with one lag of X and one lag of Y (p=1, q=1)
cons_lm3 <- dynlm(Yt ~ Xt+L(Xt)+L(Yt),data = train.ts)

#same as a dlm model with X lags up to 2 (q=2)
cons_lm4 <- dynlm(Yt ~ Xt+L(Xt)+L(Xt,2),data = train.ts)

7.1 Model Summaries

Below are the estimation summaries for each model:

summary(cons_lm1)

Time series regression with "ts" data:
Start = 2, End = 15

Call:
dynlm(formula = Yt ~ Xt + L(Xt), data = train.ts)

Residuals:
    Min      1Q  Median      3Q     Max 
-1.4118 -0.8790 -0.3542  0.7202  2.3047 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept)  -9.4870     1.6123  -5.884 0.000106 ***
Xt            0.2557     0.1632   1.567 0.145434    
L(Xt)         1.8395     0.1927   9.547 1.17e-06 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.277 on 11 degrees of freedom
Multiple R-squared:  0.9974,    Adjusted R-squared:  0.9969 
F-statistic:  2081 on 2 and 11 DF,  p-value: 6.538e-15
summary(cons_lm2)

Time series regression with "ts" data:
Start = 2, End = 15

Call:
dynlm(formula = Yt ~ Xt + L(Yt), data = train.ts)

Residuals:
    Min      1Q  Median      3Q     Max 
-3.4441 -1.1436  0.1785  1.5549  2.1584 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept) -3.54096    1.96526  -1.802    0.099 .  
Xt           0.92218    0.14482   6.368 5.31e-05 ***
L(Yt)        0.55684    0.08922   6.241 6.34e-05 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.827 on 11 degrees of freedom
Multiple R-squared:  0.9946,    Adjusted R-squared:  0.9936 
F-statistic:  1015 on 2 and 11 DF,  p-value: 3.344e-13
summary(cons_lm3)

Time series regression with "ts" data:
Start = 2, End = 15

Call:
dynlm(formula = Yt ~ Xt + L(Xt) + L(Yt), data = train.ts)

Residuals:
    Min      1Q  Median      3Q     Max 
-1.6274 -0.8401 -0.1767  0.8392  1.9447 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)   
(Intercept)  -8.3594     1.9186  -4.357  0.00143 **
Xt            0.3563     0.1875   1.900  0.08661 . 
L(Xt)         1.4557     0.4071   3.575  0.00505 **
L(Yt)         0.1408     0.1318   1.068  0.31055   
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.269 on 10 degrees of freedom
Multiple R-squared:  0.9976,    Adjusted R-squared:  0.9969 
F-statistic:  1405 on 3 and 10 DF,  p-value: 2.006e-13
summary(cons_lm4)

Time series regression with "ts" data:
Start = 3, End = 15

Call:
dynlm(formula = Yt ~ Xt + L(Xt) + L(Xt, 2), data = train.ts)

Residuals:
    Min      1Q  Median      3Q     Max 
-1.4446 -0.6965 -0.2373  0.8810  1.8630 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept)  -9.6779     1.8156  -5.330 0.000474 ***
Xt            0.3179     0.1792   1.774 0.109856    
L(Xt)         1.5276     0.3487   4.380 0.001770 ** 
L(Xt, 2)      0.2651     0.2440   1.087 0.305388    
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.322 on 9 degrees of freedom
Multiple R-squared:  0.9974,    Adjusted R-squared:  0.9965 
F-statistic:  1133 on 3 and 9 DF,  p-value: 6.471e-12

7.2 SSE

In addition to the model summaries, we can also compute the SSE of each model, defined below.
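
For these linear models, deviance() returns the residual sum of squares:

\[ SSE=\sum_{t}\left(Y_t-\hat{Y}_t\right)^2 \]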

deviance(cons_lm1)
[1] 17.94685
deviance(cons_lm2)
[1] 36.70155
deviance(cons_lm3)
[1] 16.10882
deviance(cons_lm4)
[1] 15.73004

The smaller the SSE, the better the model fits the data (because the errors are smaller).

7.3 Encompassing Test (encomptest)

The next step is to compare the performance of the models. One approach is the encompassing test, using the encomptest() function from the lmtest package.

#model comparison test
if(require("lmtest")) encomptest(cons_lm1, cons_lm2)
  • Model 1 : p-value 0.31055 (> 0.05). Adding the lag of Y (L(Yt)) does not give a significant improvement.
    Model 1 is therefore already adequate.

  • Model 2 : p-value = 0.00505 (< 0.01).
    Adding the lag of X (L(Xt)) gives a significant improvement, so Model 2 is not adequate without the additional lagged Xt variable.

7.3.1 Residual Autocorrelation Test

H0: No (positive) autocorrelation in the residuals.

H1: Positive autocorrelation in the residuals.

A p-value > 0.05 means we fail to reject H0, i.e. there is no evidence of autocorrelation.

#Durbin-Watson test
dwtest(cons_lm1)

    Durbin-Watson test

data:  cons_lm1
DW = 2.1065, p-value = 0.3842
alternative hypothesis: true autocorrelation is greater than 0
dwtest(cons_lm2)

    Durbin-Watson test

data:  cons_lm2
DW = 1.441, p-value = 0.0497
alternative hypothesis: true autocorrelation is greater than 0
dwtest(cons_lm3)

    Durbin-Watson test

data:  cons_lm3
DW = 1.9337, p-value = 0.2449
alternative hypothesis: true autocorrelation is greater than 0
dwtest(cons_lm4)

    Durbin-Watson test

data:  cons_lm4
DW = 1.8189, p-value = 0.1911
alternative hypothesis: true autocorrelation is greater than 0

7.3.2 Heteroscedasticity Test

bptest(cons_lm1)

    studentized Breusch-Pagan test

data:  cons_lm1
BP = 1.5713, df = 2, p-value = 0.4558
bptest(cons_lm2)

    studentized Breusch-Pagan test

data:  cons_lm2
BP = 3.7022, df = 2, p-value = 0.1571
bptest(cons_lm3)

    studentized Breusch-Pagan test

data:  cons_lm3
BP = 4.0554, df = 3, p-value = 0.2555
bptest(cons_lm4)

    studentized Breusch-Pagan test

data:  cons_lm4
BP = 2.7921, df = 3, p-value = 0.4248
  • If p-value > 0.05 → no heteroscedasticity.
  • If p-value ≤ 0.05 → heteroscedasticity is present.

7.3.3 Normality Test

H0: The residuals are normally distributed.

H1: The residuals are not normally distributed.

shapiro.test(residuals(cons_lm1))

    Shapiro-Wilk normality test

data:  residuals(cons_lm1)
W = 0.90752, p-value = 0.145
shapiro.test(residuals(cons_lm2))

    Shapiro-Wilk normality test

data:  residuals(cons_lm2)
W = 0.94358, p-value = 0.4661
shapiro.test(residuals(cons_lm3))

    Shapiro-Wilk normality test

data:  residuals(cons_lm3)
W = 0.94403, p-value = 0.4723
shapiro.test(residuals(cons_lm4))

    Shapiro-Wilk normality test

data:  residuals(cons_lm4)
W = 0.92477, p-value = 0.2907
  • If p-value > 0.05 → fail to reject H0 → residuals are normally distributed.
  • If p-value ≤ 0.05 → reject H0 → residuals are not normally distributed.

8 Model Comparison

We now build a summary table of model accuracy based on MAPE (Mean Absolute Percentage Error).
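
MLmetrics::MAPE() reports MAPE as a proportion (so 0.068 corresponds to about 6.8%):

\[ MAPE=\frac{1}{n}\sum_{t=1}^{n}\left|\frac{Y_t-\hat{Y}_t}{Y_t}\right| \]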

akurasi <- matrix(c(mape.koyck, mape.dlm, mape.dlm2, mape.ardl))
row.names(akurasi)<- c("Koyck","DLM 1","DLM 2","Autoregressive")
colnames(akurasi) <- c("MAPE")
akurasi
                     MAPE
Koyck          0.06845456
DLM 1          0.08502714
DLM 2          0.15117526
Autoregressive 0.08313896

Based on MAPE, the best model is the Koyck model, since it has the smallest MAPE.

8.1 Plot

To compare the forecasts from the different models (Koyck, DLM 1, DLM 2, and ARDL) with the actual data, a line chart is created. This visualization helps show how close each model's forecasts are to the actual values.

par(mfrow=c(1,1))  # single-panel plot layout

# plot the actual data
plot(test$Xt, test$Yt, type="b", col="black", ylim=c(120,250))

# add the forecast points and lines from each model
points(test$Xt, fore.koyck$forecasts, col="red");   lines(test$Xt, fore.koyck$forecasts, col="red")
points(test$Xt, fore.dlm$forecasts, col="blue");    lines(test$Xt, fore.dlm$forecasts, col="blue")
points(test$Xt, fore.dlm2$forecasts, col="orange"); lines(test$Xt, fore.dlm2$forecasts, col="orange")
points(test$Xt, fore.ardl$forecasts, col="green");  lines(test$Xt, fore.ardl$forecasts, col="green")

# model legend
legend("topleft",
       c("Aktual", "Koyck","DLM 1","DLM 2", "Autoregressive"),
       lty=1, col=c("black","red","blue","orange","green"), cex=0.8)

The comparison plot shows that all models follow the trend of the actual data, though with different degrees of closeness. The DLM 1 (blue) and autoregressive (green) models track the actual data fairly consistently, while DLM 2 (orange) deviates at the start of the period before returning toward the trend. Overall, the Koyck model (red) lies closest to the actual data.

9 Enrichment (Multiple Regression)

9.1 Data

data(M1Germany)
data1 = M1Germany[1:144,]
data1
           logm1 logprice   loggnp interest
1960 Q1 7.947090 3.403827 8.309448    0.062
1960 Q2 7.996191 3.380212 8.387236    0.064
1960 Q3 7.949908 3.436211 8.465361    0.064
1960 Q4 8.030746 3.419888 8.477333    0.062
1961 Q1 7.951792 3.449861 8.374404    0.060
1961 Q2 8.020040 3.427157 8.420854    0.057
1961 Q3 8.002081 3.478868 8.482004    0.060
1961 Q4 8.090023 3.473332 8.488274    0.060
1962 Q1 8.006645 3.494354 8.387732    0.058
1962 Q2 8.071870 3.470630 8.461327    0.060
1962 Q3 8.050562 3.514109 8.525332    0.061
1962 Q4 8.122130 3.505587 8.522898    0.062
1963 Q1 8.022497 3.535145 8.363411    0.061
1963 Q2 8.100153 3.505047 8.478836    0.061
1963 Q3 8.089230 3.535699 8.557715    0.061
1963 Q4 8.153842 3.532547 8.562039    0.061
1964 Q1 8.075132 3.558344 8.449229    0.060
1964 Q2 8.151805 3.531026 8.538758    0.062
1964 Q3 8.127938 3.561926 8.593462    0.063
1964 Q4 8.174400 3.576271 8.599235    0.063
1965 Q1 8.111390 3.598271 8.496794    0.064
1965 Q2 8.193264 3.567756 8.579759    0.068
1965 Q3 8.161774 3.599420 8.628106    0.071
1965 Q4 8.202444 3.610648 8.638486    0.074
1966 Q1 8.120452 3.633684 8.534204    0.074
1966 Q2 8.199461 3.601359 8.611403    0.079
1966 Q3 8.150163 3.635848 8.643068    0.081
1966 Q4 8.179318 3.641919 8.637052    0.076
1967 Q1 8.116368 3.655374 8.516676    0.072
1967 Q2 8.188107 3.623514 8.590857    0.069
1967 Q3 8.189083 3.639952 8.633877    0.069
1967 Q4 8.260558 3.656873 8.659213    0.070
1968 Q1 8.172073 3.664074 8.546246    0.070
1968 Q2 8.239624 3.647979 8.630219    0.067
1968 Q3 8.228731 3.673537 8.699079    0.065
1968 Q4 8.294264 3.677819 8.724545    0.065
1969 Q1 8.207375 3.696922 8.609492    0.066
1969 Q2 8.275868 3.683867 8.696443    0.069
1969 Q3 8.260526 3.713035 8.762062    0.072
1969 Q4 8.289086 3.735238 8.781503    0.074
1970 Q1 8.203062 3.756819 8.656656    0.079
1970 Q2 8.247039 3.763569 8.755605    0.086
1970 Q3 8.228462 3.785530 8.792995    0.084
1970 Q4 8.297291 3.801739 8.817977    0.083
1971 Q1 8.202307 3.833780 8.710257    0.080
1971 Q2 8.274615 3.836955 8.768910    0.083
1971 Q3 8.272566 3.857440 8.807739    0.084
1971 Q4 8.328451 3.877949 8.823891    0.081
1972 Q1 8.278112 3.890145 8.748202    0.078
1972 Q2 8.352214 3.887013 8.798990    0.083
1972 Q3 8.349937 3.905884 8.835008    0.083
1972 Q4 8.406540 3.930688 8.871531    0.087
1973 Q1 8.345429 3.946965 8.806641    0.087
1973 Q2 8.344052 3.947255 8.843224    0.102
1973 Q3 8.294420 3.964084 8.875188    0.098
1973 Q4 8.356775 4.002668 8.898903    0.097
1974 Q1 8.285387 4.001224 8.817920    0.107
1974 Q2 8.330475 4.013947 8.846744    0.109
1974 Q3 8.312175 4.037845 8.872360    0.108
1974 Q4 8.384079 4.080617 8.885187    0.099
1975 Q1 8.335851 4.070240 8.788548    0.089
1975 Q2 8.404430 4.075824 8.830693    0.084
1975 Q3 8.420188 4.084530 8.864359    0.087
1975 Q4 8.472594 4.123450 8.905368    0.086
1976 Q1 8.417439 4.103833 8.849738    0.078
1976 Q2 8.488578 4.111153 8.897213    0.083
1976 Q3 8.452609 4.129229 8.909537    0.081
1976 Q4 8.484192 4.153007 8.962553    0.074
1977 Q1 8.460301 4.138569 8.891506    0.070
1977 Q2 8.508119 4.149574 8.919986    0.064
1977 Q3 8.511290 4.159976 8.928903    0.060
1977 Q4 8.551160 4.195441 8.991502    0.060
1978 Q1 8.548410 4.179681 8.920155    0.056
1978 Q2 8.591674 4.189700 8.958250    0.060
1978 Q3 8.583403 4.207763 8.970353    0.064
1978 Q4 8.647541 4.233454 9.021976    0.066
1979 Q1 8.610424 4.216385 8.955347    0.071
1979 Q2 8.640258 4.219890 9.006041    0.080
1979 Q3 8.601359 4.245534 9.010996    0.078
1979 Q4 8.643389 4.277041 9.057142    0.080
1980 Q1 8.574950 4.263764 8.994907    0.095
1980 Q2 8.597368 4.276861 9.007502    0.083
1980 Q3 8.584653 4.292006 9.011558    0.083
1980 Q4 8.632301 4.323218 9.045887    0.091
1981 Q1 8.550121 4.302429 8.982616    0.104
1981 Q2 8.581936 4.312878 9.002807    0.111
1981 Q3 8.529503 4.330983 9.016630    0.113
1981 Q4 8.575449 4.371484 9.053618    0.099
1982 Q1 8.523798 4.351336 8.975347    0.096
1982 Q2 8.574183 4.355144 9.000765    0.092
1982 Q3 8.544303 4.376700 9.000425    0.088
1982 Q4 8.607217 4.409411 9.040018    0.080
1983 Q1 8.591824 4.389238 8.984577    0.074
1983 Q2 8.649830 4.385907 9.019927    0.081
1983 Q3 8.617160 4.405194 9.025687    0.084
1983 Q4 8.660706 4.439741 9.080193    0.083
1984 Q1 8.605596 4.413525 9.029947    0.079
1984 Q2 8.648854 4.408328 9.038796    0.081
1984 Q3 8.631439 4.421560 9.064916    0.077
1984 Q4 8.705472 4.458988 9.113929    0.070
1985 Q1 8.637735 4.429792 9.033036    0.077
1985 Q2 8.672848 4.427179 9.071107    0.070
1985 Q3 8.666981 4.444415 9.096274    0.064
1985 Q4 8.731544 4.482053 9.131516    0.066
1986 Q1 8.688053 4.461577 9.044135    0.060
1986 Q2 8.736441 4.462477 9.100618    0.060
1986 Q3 8.716323 4.475050 9.118103    0.058
1986 Q4 8.774430 4.509518 9.157114    0.060
1987 Q1 8.733100 4.487478 9.061203    0.056
1987 Q2 8.797912 4.485452 9.106269    0.055
1987 Q3 8.790317 4.487186 9.131280    0.062
1987 Q4 8.827642 4.523776 9.174376    0.058
1988 Q1 8.809855 4.498520 9.103868    0.056
1988 Q2 8.870276 4.499443 9.133583    0.060
1988 Q3 8.849585 4.503957 9.155993    0.063
1988 Q4 8.899237 4.542284 9.190899    0.062
1989 Q1 8.858069 4.521147 9.136808    0.070
1989 Q2 8.875347 4.520614 9.165657    0.071
1989 Q3 8.853254 4.530425 9.166810    0.071
1989 Q4 8.909425 4.567780 9.206169    0.078
1990 Q1 8.833934 4.551548 9.169267    0.090
1990 Q2 8.859619 4.553519 9.188996    0.090
1990 Q3 8.795776 4.559440 9.079103    0.091
1990 Q4 8.917403 4.586802 9.110099    0.090
1991 Q1 8.832875 4.573783 9.077348    0.086
1991 Q2 8.834515 4.590463 9.101797    0.086
1991 Q3 8.828346 4.604870 9.111730    0.088
1991 Q4 8.883167 4.647942 9.128769    0.087
1992 Q1 8.813976 4.634146 9.096869    0.082
1992 Q2 8.836945 4.643621 9.107293    0.084
1992 Q3 8.836982 4.660889 9.118105    0.082
1992 Q4 8.933352 4.692081 9.137836    0.074
1993 Q1 8.851538 4.679628 9.056395    0.065
1993 Q2 8.909006 4.686381 9.088570    0.067
1993 Q3 8.888773 4.693364 9.106137    0.061
1993 Q4 8.979351 4.722953 9.120107    0.056
1994 Q1 8.919931 4.707095 9.078167    0.062
1994 Q2 8.962028 4.705920 9.112044    0.069
1994 Q3 8.951495 4.713935 9.125273    0.074
1994 Q4 9.006916 4.743627 9.140267    0.074
1995 Q1 8.930109 4.726502 9.099703    0.071
1995 Q2 8.958109 4.728803 9.130422    0.064
1995 Q3 8.961501 4.736988 9.139965    0.061
1995 Q4 9.049352 4.763626 9.145096    0.055

9.2 DLM

#Run the search over finite DLMs according to AIC values
finiteDLMauto(formula = logprice ~ interest+logm1,
              data = data.frame(data1), 
              q.min = 1, q.max = 5,
              model.type = "dlm", 
              error.type = "AIC", trace = FALSE)

The function tries models with lags from q = 1 up to q = 5. Based on the results, the best model uses lag q = 5, with AIC = -463.1393.

# Model with q = 5 (lags up to 5 periods)
model.dlmberganda = dlm(formula = logprice ~ interest + logm1,
                        data = data.frame(data1), q = 5)
summary(model.dlmberganda)

Call:
lm(formula = as.formula(model.formula), data = design)

Residuals:
      Min        1Q    Median        3Q       Max 
-0.095761 -0.028610 -0.000012  0.029496  0.102597 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept) -7.81759    0.11384 -68.669  < 2e-16 ***
interest.t  -1.75616    0.80358  -2.185 0.030707 *  
interest.1   1.38935    1.22707   1.132 0.259679    
interest.2   0.40776    1.23726   0.330 0.742273    
interest.3   1.23130    1.20752   1.020 0.309830    
interest.4  -0.08718    1.20869  -0.072 0.942616    
interest.5   3.06850    0.89380   3.433 0.000808 ***
logm1.t      0.43219    0.20876   2.070 0.040474 *  
logm1.1      0.42190    0.19807   2.130 0.035109 *  
logm1.2      0.20943    0.12883   1.626 0.106532    
logm1.3      0.22053    0.13011   1.695 0.092567 .  
logm1.4      0.05513    0.21457   0.257 0.797633    
logm1.5      0.03042    0.19192   0.159 0.874296    
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 0.04343 on 126 degrees of freedom
Multiple R-squared:  0.9894,    Adjusted R-squared:  0.9884 
F-statistic: 977.9 on 12 and 126 DF,  p-value: < 2.2e-16

AIC and BIC values for the model:
        AIC       BIC
1 -463.1393 -422.0566
# Model with q = 1 (lag up to 1 period)
model.dlmberganda2 = dlm(formula = logprice ~ interest + logm1,
                         data = data.frame(data1), q = 1)
summary(model.dlmberganda2)

Call:
lm(formula = as.formula(model.formula), data = design)

Residuals:
      Min        1Q    Median        3Q       Max 
-0.134002 -0.044697  0.006407  0.036962  0.113063 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept) -7.77917    0.13299 -58.492  < 2e-16 ***
interest.t  -3.22103    0.94184  -3.420 0.000824 ***
interest.1   6.52775    0.94501   6.908 1.66e-10 ***
logm1.t      0.73918    0.08419   8.780 5.61e-15 ***
logm1.1      0.63330    0.08429   7.513 6.55e-12 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 0.05443 on 138 degrees of freedom
Multiple R-squared:  0.9832,    Adjusted R-squared:  0.9828 
F-statistic:  2025 on 4 and 138 DF,  p-value: < 2.2e-16

AIC and BIC values for the model:
        AIC       BIC
1 -419.7575 -401.9805

9.3 ARDL(p,q)

9.3.1 Example: ARDL(2,3)

For p = 2 and q = 3, the equation is:

\[ Y_t = \alpha + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \beta_0 X_t + \beta_1 X_{t-1} + \beta_2 X_{t-2} + \beta_3 X_{t-3} + \varepsilon_t \]

  • p = lag order of the dependent variable (Y).
  • q = lag order of the independent variable (X).
#Search for the optimum ARDL lag orders
ardlBoundOrders(data = data1 , formula = logprice ~ interest + logm1,
                ic="AIC")
$p
   interest logm1
65        0     4

$q
[1] 4

$Stat.table
           q = 1     q = 2     q = 3     q = 4     q = 5     q = 6     q = 7
p = 1  -760.1786 -757.9195 -846.8342 -975.2079 -965.7536 -958.9072 -956.7315
p = 2  -760.0433 -759.3090 -843.6247 -971.2514 -961.7929 -955.2809 -953.4890
p = 3  -753.7746 -753.7746 -841.2485 -970.4543 -961.4343 -953.7173 -950.0412
p = 4  -829.8076 -832.6436 -832.6436 -971.0837 -962.1804 -955.0429 -953.4667
p = 5  -749.4144 -753.2292 -962.9290 -962.9290 -961.7063 -954.3406 -951.7660
p = 6  -742.2103 -742.9945 -891.6195 -952.3771 -952.3771 -952.2461 -950.1105
p = 7  -728.9374 -733.0286 -851.2943 -945.7445 -944.6879 -944.6879 -949.3720
p = 8  -747.9277 -746.2948 -812.4289 -937.9446 -938.9491 -937.3393 -937.3393
p = 9  -722.6891 -724.5786 -863.2734 -928.9215 -927.2914 -926.8716 -936.6432
p = 10 -714.8175 -714.5658 -816.3319 -918.5218 -918.6350 -916.9076 -921.1246
p = 11 -703.1807 -705.3383 -794.0772 -909.6457 -908.8225 -906.9542 -912.9605
p = 12 -716.7111 -714.7403 -774.0127 -910.0315 -910.6834 -908.7146 -909.6612
p = 13 -697.7175 -698.1931 -793.4602 -895.5927 -894.9273 -893.5995 -897.7589
p = 14 -686.5600 -685.7967 -766.5292 -886.0709 -885.4341 -885.2283 -890.1638
p = 15 -676.7280 -678.3689 -753.2854 -875.6392 -874.1257 -874.3117 -879.2727
           q = 8     q = 9    q = 10    q = 11    q = 12    q = 13    q = 14
p = 1  -954.3375 -946.6293 -936.5328 -927.7728 -920.6435 -917.5463 -918.3110
p = 2  -951.1470 -943.9360 -933.7047 -924.7949 -917.5334 -913.6213 -914.4063
p = 3  -948.4683 -941.1039 -930.8509 -922.0563 -914.5728 -910.5351 -913.4996
p = 4  -948.2330 -941.8238 -931.5689 -923.2663 -916.2063 -911.6023 -913.9345
p = 5  -947.5994 -939.3767 -929.0155 -920.4475 -913.5968 -909.0781 -911.6312
p = 6  -945.5758 -937.4076 -927.2439 -919.3949 -911.9537 -907.7394 -910.2890
p = 7  -945.5181 -937.1826 -926.9640 -917.9619 -910.2774 -905.9449 -907.8712
p = 8  -941.9617 -933.5959 -923.3691 -914.6251 -907.0608 -902.2187 -903.9255
p = 9  -936.6432 -935.7172 -925.2881 -917.0877 -911.6973 -903.9027 -904.6405
p = 10 -926.6891 -926.6891 -924.6986 -917.0904 -911.4197 -903.4313 -903.0612
p = 11 -917.9145 -918.2328 -918.2328 -919.2867 -913.3674 -904.8733 -903.6541
p = 12 -916.1321 -914.4362 -914.4610 -914.4610 -912.5159 -904.2394 -901.6216
p = 13 -905.4744 -903.7559 -902.4406 -902.2530 -902.2530 -902.9434 -901.2363
p = 14 -896.2370 -896.2620 -894.2896 -897.5711 -899.1407 -899.1407 -902.2350
p = 15 -884.5637 -886.8221 -884.9832 -890.5665 -893.2335 -891.6220 -891.6220
          q = 15
p = 1  -908.0863
p = 2  -904.1665
p = 3  -903.3006
p = 4  -903.9256
p = 5  -901.6220
p = 6  -900.1824
p = 7  -897.9867
p = 8  -894.1031
p = 9  -894.7387
p = 10 -893.6199
p = 11 -893.6060
p = 12 -892.4805
p = 13 -892.5115
p = 14 -893.6214
p = 15 -891.3741

$min.Stat
[1] -977.2745

$Stat.p
    interest logm1      Stat
65         0     4 -977.2745
1          0     0 -976.5191
2          1     0 -976.2558
17         0     1 -975.9606
66         1     4 -975.6027
18         1     1 -975.2079
49         0     3 -974.4859
3          2     0 -974.4275
33         0     2 -974.0166
50         1     3 -973.7500
67         2     4 -973.6028
34         1     2 -973.2324
19         2     1 -973.2188
68         3     4 -972.5992
4          3     0 -972.4875
51         2     3 -971.7743
20         3     1 -971.3872
35         2     2 -971.2514
69         4     4 -971.0837
5          4     0 -970.5114
52         3     3 -970.4543
81         0     5 -969.9284
53         4     3 -969.5311
21         4     1 -969.4756
36         3     2 -969.3907
82         1     5 -968.6783
37         4     2 -967.4756
83         2     5 -966.8835
84         3     5 -965.6393
85         4     5 -963.9662
86         5     5 -962.9290
70         5     4 -961.2547
54         5     3 -960.9580
97         0     6 -960.7402
6          5     0 -960.6858
22         5     1 -959.8419
98         1     6 -959.6604
38         5     2 -957.8547
99         2     6 -957.7528
100        3     6 -956.7875
101        4     6 -955.2416
71         6     4 -954.8953
87         6     5 -954.6855
102        5     6 -954.3662
103        6     6 -954.0973
7          6     0 -954.0615
113        0     7 -953.9160
55         6     3 -953.2860
23         6     1 -953.1080
114        1     7 -952.6540
39         6     2 -951.1356
115        2     7 -950.6562
116        3     7 -949.6038
88         7     5 -949.2090
72         7     4 -948.5194
117        4     7 -947.7999
104        7     6 -947.7424
56         7     3 -947.6915
8          7     0 -947.5092
120        7     7 -947.3660
24         7     1 -947.0094
118        5     7 -946.9631
119        6     7 -946.8080
40         7     2 -945.0123
129        0     8 -943.9035
130        1     8 -942.6627
131        2     8 -940.6818
145        0     9 -940.0114
132        3     8 -939.6913
89         8     5 -939.1878
73         8     4 -938.5330
146        1     9 -938.2680
133        4     8 -937.8368
105        8     6 -937.6834
57         8     3 -937.6370
9          8     0 -937.5705
121        8     7 -937.5351
136        7     8 -937.3948
25         8     1 -937.0088
134        5     8 -936.9393
135        6     8 -936.8904
147        2     9 -936.3875
148        3     9 -936.3159
137        8     8 -935.5389
41         8     2 -935.0088
149        4     9 -934.3458
150        5     9 -934.1858
152        7     9 -934.0733
151        6     9 -932.9538
153        8     9 -932.3338
154        9     9 -930.9065
161        0    10 -929.8056
90         9     5 -929.2731
74         9     4 -928.5254
162        1    10 -928.1257
10         9     0 -927.9853
58         9     3 -927.9744
122        9     7 -927.9061
106        9     6 -927.6344
26         9     1 -927.4482
164        3    10 -926.5271
163        2    10 -926.2965
138        9     8 -926.1307
42         9     2 -925.4484
165        4    10 -924.5287
168        7    10 -924.2716
166        5    10 -924.0521
167        6    10 -922.7596
169        8    10 -922.5928
155       10     9 -921.2169
170        9    10 -921.1777
177        0    11 -920.2608
171       10    10 -920.0124
91        10     5 -919.0182
178        1    11 -918.7342
75        10     4 -918.4135
11        10     0 -917.8597
59        10     3 -917.7711
123       10     7 -917.6569
107       10     6 -917.3861
27        10     1 -917.2925
179        2    11 -916.9417
180        3    11 -916.8682
193        0    12 -916.1477
139       10     8 -915.9643
92        11     5 -915.3201
43        10     2 -915.2941
156       11     9 -915.0851
181        4    11 -914.8854
194        1    12 -914.4423
124       11     7 -914.3141
184        7    11 -914.1880
76        11     4 -914.1395
182        5    11 -914.0440
108       11     6 -913.4052
140       11     8 -913.3026
195        2    12 -913.1680
172       11    10 -913.0914
60        11     3 -912.7714
183        6    11 -912.7548
196        3    12 -912.5820
185        8    11 -912.5636
12        11     0 -912.2009
28        11     1 -912.0389
186        9    11 -911.1737
157       12     9 -911.1513
188       11    11 -911.1189
93        12     5 -910.7693
198        5    12 -910.7434
197        4    12 -910.6154
125       12     7 -910.5873
141       12     8 -910.0719
44        11     2 -910.0439
187       10    11 -909.9928
200        7    12 -909.4197
173       12    10 -909.2473
77        12     4 -909.1913
109       12     6 -908.7753
199        6    12 -908.7635
201        8    12 -908.1609
61        12     3 -908.0357
29        12     1 -907.8613
209        0    13 -907.6473
13        12     0 -907.6158
205       12    12 -907.5931
204       11    12 -907.5525
202        9    12 -907.3633
189       12    11 -907.3200
210        1    13 -906.1005
45        12     2 -905.9070
203       10    12 -905.7653
211        2    13 -904.7293
212        3    13 -903.9077
214        5    13 -902.0824
158       13     9 -901.9574
213        4    13 -901.9144
94        13     5 -901.6338
126       13     7 -901.3766
142       13     8 -900.9367
216        7    13 -900.5676
225        0    14 -900.5066
174       13    10 -900.1413
215        6    13 -900.1102
78        13     4 -900.0282
110       13     6 -899.6703
226        1    14 -899.0967
217        8    13 -899.0866
62        13     3 -898.8589
30        13     1 -898.7940
190       13    11 -898.4409
221       12    13 -898.4110
220       11    13 -898.3058
218        9    13 -898.2568
14        13     0 -898.2039
206       13    12 -897.9014
227        2    14 -897.3889
46        13     2 -896.8637
219       10    13 -896.6244
222       13    13 -896.4458
228        3    14 -896.2512
230        5    14 -895.1320
95        14     5 -894.6021
229        4    14 -894.3023
159       14     9 -894.2497
127       14     7 -893.9663
143       14     8 -893.6932
231        6    14 -893.4037
79        14     4 -893.1343
232        7    14 -893.1064
111       14     6 -892.6253
175       14    10 -892.5085
63        14     3 -891.9131
191       14    11 -891.1895
233        8    14 -891.1877
234        9    14 -891.1729
31        14     1 -890.7573
236       11    14 -890.5576
241        0    15 -890.5500
15        14     0 -890.3449
237       12    14 -890.1854
235       10    14 -889.8957
207       14    12 -889.7107
242        1    15 -889.0419
47        14     2 -888.9410
238       13    14 -888.1867
223       14    13 -887.7488
239       14    14 -887.6659
243        2    15 -887.3088
244        3    15 -886.0691
246        5    15 -884.7479
96        15     5 -884.2869
245        4    15 -884.1417
160       15     9 -883.9364
128       15     7 -883.6409
144       15     8 -883.4503
247        6    15 -883.0158
80        15     4 -882.8148
248        7    15 -882.7881
112       15     6 -882.3106
176       15    10 -882.2093
64        15     3 -881.6497
253       12    15 -881.4274
252       11    15 -881.3077
250        9    15 -881.1831
192       15    11 -880.9028
249        8    15 -880.8964
32        15     1 -880.5983
251       10    15 -880.2736
16        15     0 -880.2468
254       13    15 -879.4467
208       15    12 -879.4364
255       14    15 -879.2846
48        15     2 -878.8432
224       15    13 -877.4985
240       15    14 -877.4570

Based on the Stat.table above, the best combination is p = 1 and q = 4.

model.ardlDlmberganda = ardlDlm(formula = logprice ~ interest + logm1,
                        data = data.frame(data1) , p = 4 , q = 1)
#Note: ardlDlm() uses the reversed notation:
#p = lag of X and q = lag of Y
summary(model.ardlDlmberganda)

Time series regression with "ts" data:
Start = 5, End = 144

Call:
dynlm(formula = as.formula(model.text), data = data)

Residuals:
      Min        1Q    Median        3Q       Max 
-0.042843 -0.010058  0.000608  0.010383  0.041257 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept) -0.63112    0.27547  -2.291  0.02359 *  
interest.t  -0.52613    0.33072  -1.591  0.11411    
interest.1   1.32204    0.50318   2.627  0.00966 ** 
interest.2  -0.22306    0.49160  -0.454  0.65079    
interest.3  -0.38347    0.49499  -0.775  0.43994    
interest.4   0.29294    0.37021   0.791  0.43024    
logm1.t      0.21005    0.06791   3.093  0.00243 ** 
logm1.1      0.02775    0.05453   0.509  0.61167    
logm1.2     -0.22723    0.05499  -4.132 6.45e-05 ***
logm1.3      0.03204    0.05174   0.619  0.53687    
logm1.4      0.06950    0.06554   1.060  0.29094    
logprice.1   0.91533    0.03459  26.461  < 2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 0.01787 on 128 degrees of freedom
Multiple R-squared:  0.9982,    Adjusted R-squared:  0.9981 
F-statistic:  6496 on 11 and 128 DF,  p-value: < 2.2e-16
#model with lag 0 for interest and lags up to 4 for logm1
rem.p = list(interest = c(1,2,3,4))
remove = list(p = rem.p)
model.ardlDlmberganda2 = ardlDlm(formula = logprice ~ interest + logm1,
                        data = data.frame(data1) , p = 4 , q = 1 ,
                        remove = remove)
summary(model.ardlDlmberganda2)

Time series regression with "ts" data:
Start = 5, End = 144

Call:
dynlm(formula = as.formula(model.text), data = data)

Residuals:
     Min       1Q   Median       3Q      Max 
-0.04490 -0.01236  0.00241  0.01181  0.03639 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept) -0.40668    0.19949  -2.039  0.04349 *  
interest.t   0.31257    0.15653   1.997  0.04790 *  
logm1.t      0.17890    0.06459   2.770  0.00642 ** 
logm1.1      0.01712    0.05453   0.314  0.75400    
logm1.2     -0.25041    0.05343  -4.687 6.83e-06 ***
logm1.3      0.02177    0.05263   0.414  0.67979    
logm1.4      0.10613    0.06132   1.731  0.08587 .  
logprice.1   0.94369    0.02460  38.364  < 2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 0.01832 on 132 degrees of freedom
Multiple R-squared:  0.9981,    Adjusted R-squared:  0.998 
F-statistic:  9706 on 7 and 132 DF,  p-value: < 2.2e-16

The subsequent steps are the same as in the single-predictor modeling.

10 Individual Assignment

Assignment instructions

  1. Use the air quality dataset on Google Drive: https://drive.google.com/drive/folders/18ZcJpGKzmwlw9zxGZjTtVnJV0Ufau23g?usp=drive_link

  2. Set AQI as the dependent variable (Y), then choose one pollutant variable (CO, NO₂, O₃, PM10, PM2.5, or SO₂) as the independent variable (X).

  3. Apply the lagged variable regression methods covered above.

  4. Compare the models' performance, then determine the best model using an accuracy metric.

  5. Upload the data and your individual assignment to GitHub in the Pertemuan-3 folder, as an HTML file named Tugas-Pertemuan-3.html together with the Rmd file.

  6. Submission deadline: Thursday, 18 September 2025, 23:59 WIB.