Latar Belakang

Analisis regresi merupakan salah satu metode pendugaan dengan melihat hubungan antara peubah penjelas dan peubah respon kontinu. Parameter model regresi linier dapat diduga menggunakan metode kuadrat terkecil (MKT) dengan meminimumkan jumlah kuadrat sisaan. Pada beberapa kondisi, MKT sering tidak memuaskan yang disebabkan oleh dua hal (Tibshibrani, 1996), yaitu :keakuratan prediksi dan kemudahan interpretasi. Parameter dugaan hasil MKT memiliki bias yang rendah namun ragam yang tinggi serta semakin banyak peubah penjelas pada model, maka semakin sulit untuk diinterpretasi. Pada kondisi seperti ini dapat diatasi dengan penggunaan metode seleksi peubah (best subset) dan penyusutan koefisien regresi (shrinkage).

Metode seleksi peubah menggunakan metode seleksi peubah terbaik (Best Subset Regression) dan regresi bertatar (Stepwise Regression) dapat mengurangi ragam prediksi dengan mengorbankan sedikit bias (Soleh A.M, 2013). Interpretasi model lebih mudah dilakukan karena model hanya memuat peubah penjelas terbaik. Namun kekurangan metode ini adalah penduga model menjadi tidak stabil, karena perubahan kecil pada data dapat menghasilkan model yang berbeda termasuk peubah penjelas yang terpilih.

Alternatif metode lainnya dapat menggunakan metode penyusutan koefisien regresi seperti regresi gulud (Ridge Regression), regresi LASSO (Least Absolute Shrinkage and Selection Operator), dan regresi jaring elastis (ElasticNet Regression). Regresi gulud memberikan kendala atau pinalti pada MKT sehingga koefisien menyusut mendekati nol (Hastie et al, 2008). Dalam menyusutkan koefisien regresi, regresi gulud mengoptimalkan tuning parameter lambda untuk mencari nilai jumlah kuadrat sisaan (JKT/RSS) model terkecil. Pinalti pada regresi gulud akan mengecilkan semua koefisien regresi menuju nol namun tidak akan menetapkan salah satu dari koefisien tersebut tepat nol kecuali jika lambda = tak terhingga. Hal ini menimbulkan persoalan dalam interpretasi model terutama jika banyaknya peubah penjelas cukup besar. Kelemahan pada regresi gulud ini diatasi oleh penerapan regresi LASSO. Pada regresi gulud menggunakan pinalti beta kuadratik, namun pada regresi LASSO menggunakan pinalti beta mutlak. Hal ini nilai koefisien regresi dapat disusutkan menjadi tepat nol yakni ketika tuning parameter lambda cukup besar. Kelebihan lainnya pada regresi LASSO yakni dengan dihasilkannya koefisien regresi yang tepat nol, secara tidak langsung juga melakukan seleksi peubah. Peubah penjelas dengan koefisien regresi yang tepat nol tidak akan dimasukkan ke dalam model dalam memprediksi peubah respon. Pada metode regresi jaring elastis (Elastic Net Regression) menggunakan kedua pinalti tersebut yaitu beta kuadratik dan mutlak secara bersamaan. Metode ini merupakan pengembangan metode regresi gulud dan regresi LASSO dengan menggabungkan kelebihan kedua metode tersebut secara sekaligus.

Pada kajian analisis ini, akan dibandingkan performa model regresi linear, regresi seleksi peubah, regresi gulud, regresi LASSO, dan regresi jaring elastis dalam memprediksi data hasil bangkitan.

Tujuan

Tujuan kajian analisis ini adalah mengevaluasi performa regresi linear, regresi seleksi peubah, regresi gulud, regresi LASSO, dan regresi jaring elastis dalam memprediksi data hasil bangkitan.

Metodologi

Data yang digunakan dalam kajian analisis ini adalah data simulasi yang dibangkitkan menggunakan perangkat lunak R. Langkah-langkah kajian analisis yang dilakukan sebagi berikut :

  1. Bangkitkan data berukuran n = 1000 dengan ketentuan sebagai berikut :
  1. X1, X2, …, X100 menyebar Normal(2, 1,12) dengan X1-X2 berkorelasi sebesar 0.65 dan X4-X5 berkorelasi sebesar -0.85
  2. Mengganti nilai X11 dan X12 menjadi X11 = 1.15(X1) + miu dan X12 = -0.75(X2) + miu dengan miu menyebar Normal(0, 1.212)
  3. Menetapkan nilai B0 = 0.5, B1 = B2 = …, = B25 = 2.5 dan B26 = B27 = …, = B100 = 0
  4. Menetapkan nilai galat e menyebar Normal(0, 1.312)
  5. Menetapkan nilai peubah respon Y = B0 + B1X1 + B2X2 + … + B100*X11 + e
  1. Melakukan pra pemrosesan data sebagai berikut :
  1. Membagi data menjadi dua bagian yaitu data latih 80 % data dan data latih 20 % data
  2. Memeriksa apakah ada data hilang atau tidak
  3. Memeriksa apakah peubah respon menyebar normal atau tidak
  1. Membuat model regresi linear, regresi seleksi peubah terbaik, regresi gulud, regresi LASSO, dan regresi jaring elastis dengan data latih yang sudah dibangkitkan dan memprediksi dengan data uji.
  2. Mengevaluasi performa model berdasarkan nilai MAE terkecil
  3. Kesimpulan

Bangkitkan Data

k <- 0 + 2 + 9
sigmaX <- 1.12
meanX <- 2
n <- 1000

set.seed(123)
dataX <- round(replicate(100, rnorm(n, mean = meanX, sd=sigmaX), round(2)),4)
dataX <- as.data.frame(dataX)

correlatedValue = function(x, r){
  r2 = r**2
  ve = 1-r2
  SD = sqrt(ve)
  e = rnorm(length(x), mean = 0, sd=SD)
  y = r*x + e
  return(y)
}

dataX$V2 <- round(correlatedValue(x=dataX$V1, r=0.65),4)
dataX$V5 <- round(correlatedValue(x=dataX$V4, r=-(0.85)),4)
print(cor(dataX$V1,dataX$V2))
## [1] 0.6674958
print(cor(dataX$V4,dataX$V5))
## [1] -0.8748241
set.seed(123)
miu <- rnorm(n, 0, 1.212)
X1 <- dataX$V1
X2 <- dataX$V2
X11 <- round(X1*1.15 + miu,4)
X12 <- round(X2*-0.65 + miu,4)
dataX$V11 <- X11
dataX$V12 <- X12

B0 <- 0.5
B1 = B2 = B3 = B4 = B5 = B6 =  B7 = B8 = B9 = B10 = B11 = B12 = B13 = B14 = B15 = B16 =  B17 = B18 = B19 = B20 = B21 = B22 = B23 = B24 = B25 = 2.5

set.seed(123)
epsilon <- rnorm(n, 0, 1.312)
Y <- round(B0 + B1*dataX$V1+ B2*dataX$V2 + B3*dataX$V3 + B4*dataX$V4 + B5*dataX$V5 + B6*dataX$V6 + B7*dataX$V7 + B8*dataX$V8 + B9*dataX$V9 + B10*dataX$V10 + B11*dataX$V11 + B12*dataX$V12 + B13*dataX$V13 + B14*dataX$V14 + B15*dataX$V15 + B16*dataX$V16 + B17*dataX$V17 + B18*dataX$V18 + B19*dataX$V19 + B20*dataX$V20 + B21*dataX$V11 + B22*dataX$V12 + B23*dataX$V13 + B24*dataX$V14 + B25*dataX$V15 +  epsilon,4)

dataX <- cbind(Y, dataX)
write.csv(dataX, file = "DAT.SIM.01.csv")
dataX$Y<- as.integer(dataX$Y)
dataX$V1<- as.integer(dataX$V1)
dataX$V2<- as.integer(dataX$V2)
dataX$V3<- as.integer(dataX$V3)
dataX$V4<- as.integer(dataX$V4)
dataX$V5<- as.integer(dataX$V5)
dataX$V6<- as.integer(dataX$V6)
dataX$V7<- as.integer(dataX$V7)
dataX$V8<- as.integer(dataX$V8)
dataX$V9<- as.integer(dataX$V9)
dataX$V10<- as.integer(dataX$V10)
dataX$V11<- as.integer(dataX$V11)
dataX$V12<- as.integer(dataX$V12)
dataX$V13<- as.integer(dataX$V13)
dataX$V14<- as.integer(dataX$V14)
dataX$V15<- as.integer(dataX$V15)
dataX$V16<- as.integer(dataX$V16)
dataX$V17<- as.integer(dataX$V17)
dataX$V18<- as.integer(dataX$V18)
dataX$V19<- as.integer(dataX$V19)
dataX$V20<- as.integer(dataX$V20)
dataX$V21<- as.integer(dataX$V21)
dataX$V22<- as.integer(dataX$V22)
dataX$V23<- as.integer(dataX$V23)
dataX$V24<- as.integer(dataX$V24)
dataX$V25<- as.integer(dataX$V25)
dataX$V26<- as.integer(dataX$V26)
dataX$V27<- as.integer(dataX$V27)
dataX$V28<- as.integer(dataX$V28)
dataX$V29<- as.integer(dataX$V29)
dataX$V30<- as.integer(dataX$V30)
dataX$V31<- as.integer(dataX$V31)
dataX$V32<- as.integer(dataX$V32)
dataX$V33<- as.integer(dataX$V33)
dataX$V34<- as.integer(dataX$V34)
dataX$V35<- as.integer(dataX$V35)
dataX$V36<- as.integer(dataX$V36)
dataX$V37<- as.integer(dataX$V37)
dataX$V38<- as.integer(dataX$V38)
dataX$V39<- as.integer(dataX$V39)
dataX$V40<- as.integer(dataX$V40)
dataX$V41<- as.integer(dataX$V41)
dataX$V42<- as.integer(dataX$V42)
dataX$V43<- as.integer(dataX$V43)
dataX$V44<- as.integer(dataX$V44)
dataX$V45<- as.integer(dataX$V45)
dataX$V46<- as.integer(dataX$V46)
dataX$V47<- as.integer(dataX$V47)
dataX$V48<- as.integer(dataX$V48)
dataX$V49<- as.integer(dataX$V49)
dataX$V50<- as.integer(dataX$V50)
dataX$V51<- as.integer(dataX$V51)
dataX$V52<- as.integer(dataX$V52)
dataX$V53<- as.integer(dataX$V53)
dataX$V54<- as.integer(dataX$V54)
dataX$V55<- as.integer(dataX$V55)
dataX$V56<- as.integer(dataX$V56)
dataX$V57<- as.integer(dataX$V57)
dataX$V58<- as.integer(dataX$V58)
dataX$V59<- as.integer(dataX$V59)
dataX$V60<- as.integer(dataX$V60)
dataX$V61<- as.integer(dataX$V61)
dataX$V62<- as.integer(dataX$V62)
dataX$V63<- as.integer(dataX$V63)
dataX$V64<- as.integer(dataX$V64)
dataX$V65<- as.integer(dataX$V65)
dataX$V66<- as.integer(dataX$V66)
dataX$V67<- as.integer(dataX$V67)
dataX$V68<- as.integer(dataX$V68)
dataX$V69<- as.integer(dataX$V69)
dataX$V70<- as.integer(dataX$V70)
dataX$V71<- as.integer(dataX$V71)
dataX$V72<- as.integer(dataX$V72)
dataX$V73<- as.integer(dataX$V73)
dataX$V74<- as.integer(dataX$V74)
dataX$V75<- as.integer(dataX$V75)
dataX$V76<- as.integer(dataX$V76)
dataX$V77<- as.integer(dataX$V77)
dataX$V78<- as.integer(dataX$V78)
dataX$V79<- as.integer(dataX$V79)
dataX$V80<- as.integer(dataX$V80)
dataX$V81<- as.integer(dataX$V81)
dataX$V82<- as.integer(dataX$V82)
dataX$V83<- as.integer(dataX$V83)
dataX$V84<- as.integer(dataX$V84)
dataX$V85<- as.integer(dataX$V85)
dataX$V86<- as.integer(dataX$V86)
dataX$V87<- as.integer(dataX$V87)
dataX$V88<- as.integer(dataX$V88)
dataX$V89<- as.integer(dataX$V89)
dataX$V90<- as.integer(dataX$V90)
dataX$V91<- as.integer(dataX$V91)
dataX$V92<- as.integer(dataX$V92)
dataX$V93<- as.integer(dataX$V93)
dataX$V94<- as.integer(dataX$V94)
dataX$V95<- as.integer(dataX$V95)
dataX$V96<- as.integer(dataX$V96)
dataX$V97<- as.integer(dataX$V97)
dataX$V98<- as.integer(dataX$V98)
dataX$V99<- as.integer(dataX$V99)
dataX$V100<- as.integer(dataX$V100)
glimpse(dataX)
## Rows: 1,000
## Columns: 101
## $ Y    <int> 74, 88, 111, 110, 123, 147, 97, 74, 112, 90, 128, 122, 118, 131, …
## $ V1   <int> 1, 1, 3, 2, 2, 3, 2, 0, 1, 1, 3, 2, 2, 2, 1, 4, 2, 0, 2, 1, 0, 1,…
## $ V2   <int> 1, 2, 2, 1, 1, 3, 1, 0, 1, 0, 1, 1, 1, 2, 0, 2, 1, 0, 1, 0, 0, 1,…
## $ V3   <int> 1, 2, 1, 3, 2, 1, 0, 1, 4, 1, 1, 2, 2, 2, 1, 3, 0, 1, 1, 2, 2, 3,…
## $ V4   <int> 1, 1, 0, 1, 4, 1, 3, 1, 2, 1, 3, 1, 0, 3, 1, 3, 2, 3, 2, 1, 2, 1,…
## $ V5   <int> -1, -1, 0, -1, -3, -1, -3, -2, -2, -2, -4, -1, 0, -2, -1, -3, -1,…
## $ V6   <int> 1, 3, 0, 3, 3, 2, 2, 2, 1, 1, 1, 3, 2, 1, 0, 2, 0, 1, 4, 0, 1, 1,…
## $ V7   <int> 1, 3, 1, 1, 2, 1, 2, 3, 1, 0, 2, 1, 1, 2, 2, 3, 1, 1, 1, 1, 4, 1,…
## $ V8   <int> 0, 2, 4, 2, 3, 1, 1, 3, 2, 2, 1, 1, 2, 2, 2, 1, 0, 3, 1, 2, 0, 1,…
## $ V9   <int> 2, 4, 0, 2, 1, 1, 1, 1, 3, 1, 1, 2, 2, 3, 1, 1, 0, 3, 2, 3, 0, 3,…
## $ V10  <int> 4, 1, 1, 0, 5, 1, 3, 0, 3, 2, 1, 2, 3, 3, 2, 1, 0, 0, 0, 2, 1, 2,…
## $ V11  <int> 0, 1, 6, 2, 2, 6, 3, 0, 0, 1, 5, 3, 3, 2, 0, 6, 3, -2, 4, 1, 0, 1…
## $ V12  <int> -1, -1, 0, 0, 0, 0, 0, -2, -1, 0, 0, 0, 0, -1, 0, 0, 0, -2, 0, -1…
## $ V13  <int> 1, 2, 0, 2, 2, 3, 0, 2, 2, 1, 2, 1, 3, 2, 1, 1, 1, 2, 0, 2, 0, 1,…
## $ V14  <int> 0, 0, 2, 1, 1, 3, 0, 0, 1, 2, 3, 3, 2, 2, 2, 3, 2, 2, 2, 2, 3, 1,…
## $ V15  <int> 0, 2, 0, 2, 2, 2, 4, 1, 2, 1, 2, 2, 1, 4, 0, 1, 0, 4, 2, 2, 3, 1,…
## $ V16  <int> 0, 1, 1, 3, 2, 3, 0, 2, 2, 2, 2, 0, 2, 1, 2, 1, 2, 3, 1, 2, 3, 0,…
## $ V17  <int> 0, 0, 3, 2, 1, 2, 1, 2, 1, 0, 3, 2, 2, 3, 1, 2, 1, 1, 3, 1, 2, 0,…
## $ V18  <int> 2, 1, 0, 1, 3, 0, 2, 3, 3, 2, 0, 1, 0, 1, 2, 2, 3, 2, 1, 1, 1, 0,…
## $ V19  <int> 2, 0, 1, 0, 1, 2, 1, 2, 4, 3, 0, 3, 1, 1, 2, 1, 1, 1, 2, 0, 2, 2,…
## $ V20  <int> 4, 0, 0, 1, 1, 0, 4, 0, 3, 3, 1, 2, 0, 1, 1, 2, 3, 0, 3, 4, 1, 0,…
## $ V21  <int> 1, 1, 0, 0, 0, 0, 1, 3, 1, 2, 2, 3, 0, 0, 3, 3, 1, 2, 1, 0, 2, 0,…
## $ V22  <int> 0, 0, 3, 0, 1, 1, 1, 1, 1, 2, 1, 2, 2, 2, 1, 2, 0, 2, 2, 2, 2, 2,…
## $ V23  <int> -1, 2, 0, 1, 1, 2, 2, 2, 1, 1, 1, 1, 2, 1, 3, 0, 0, 4, 0, 2, 1, 3…
## $ V24  <int> 2, 0, 1, 1, 1, 1, 1, 1, 1, 2, 1, 0, 2, 1, 2, 0, 1, 3, 1, 0, 3, 1,…
## $ V25  <int> 1, 3, 1, 0, 0, 3, 4, 0, 2, 3, 2, 2, 3, 3, 1, 3, 2, 1, 3, 1, 1, 2,…
## $ V26  <int> 1, 0, 2, 2, 2, 1, 1, 1, 3, 1, 2, 2, 0, 3, 1, 2, 1, 3, 0, 2, 1, 2,…
## $ V27  <int> 2, 2, 0, 0, 1, 2, 2, 1, 3, 1, 1, 2, 1, 0, 2, 3, 2, 2, 2, 2, 1, 1,…
## $ V28  <int> 1, 1, 2, 4, 0, 2, 0, 1, 3, 0, 0, 1, 2, 4, 2, 0, 0, 2, 1, 1, 0, 0,…
## $ V29  <int> 3, 1, 2, 0, 3, 1, 2, 0, 2, 1, 1, 2, 2, 1, 3, 3, 2, 3, 2, 1, 2, 3,…
## $ V30  <int> 0, 0, 1, 3, 2, 2, 1, 2, 3, 1, 3, 2, 0, 1, 3, 2, 0, 3, 4, 5, 1, 0,…
## $ V31  <int> 1, 2, 1, 0, 3, 1, 2, 2, 2, 2, 1, 2, 0, 2, 3, 1, 0, 1, 0, 2, 2, 3,…
## $ V32  <int> 3, 0, 2, 2, 2, 0, 2, 2, 1, 2, 3, 0, 0, 3, 1, 0, 2, 1, 1, 3, 1, 1,…
## $ V33  <int> 1, 1, 3, 0, 2, 2, 2, 2, 0, 1, 1, 3, 1, 0, 0, 2, 2, 0, 1, 3, 1, 3,…
## $ V34  <int> 2, 2, 2, 2, 0, 2, 2, 3, 2, 2, 3, 2, 0, 0, 2, 4, 0, 0, 1, 2, 0, 1,…
## $ V35  <int> 1, 3, 2, 1, 1, 0, 2, 1, 1, 1, 3, 2, 2, 1, 2, 2, 2, 1, 4, 2, 3, 1,…
## $ V36  <int> 1, 2, 2, 1, 2, 3, 2, 1, 2, 1, 2, 1, 0, 3, 2, 0, 1, 3, 2, 1, 1, 0,…
## $ V37  <int> 2, 2, 1, 2, 2, 2, 1, 0, 1, 4, 1, 3, 3, 2, 1, 1, 2, 0, 1, 2, 0, 2,…
## $ V38  <int> 1, 2, 1, 1, 3, 2, 3, 0, 3, 0, 1, 3, 2, 1, 0, 2, 1, 1, -1, 1, 2, 3…
## $ V39  <int> 0, 1, 4, 1, 2, 2, 1, 2, 0, 0, 1, 4, 1, 1, 2, 2, 1, 1, 2, 1, 5, 1,…
## $ V40  <int> 1, 2, 2, 1, 2, 2, 2, 2, 1, 0, 0, 0, 1, 2, 2, -1, 3, 0, 3, 0, 2, 2…
## $ V41  <int> 2, 2, 1, 1, 3, 1, 3, 3, 1, 0, 2, 1, 0, 3, 2, 2, 1, 4, 1, 2, 2, 2,…
## $ V42  <int> 2, 2, 1, 1, 0, 3, 2, 1, 0, 1, 0, 1, 0, 3, 2, 2, 0, 0, 0, 0, 1, 2,…
## $ V43  <int> 1, 1, 3, 1, 2, 2, 2, 2, 2, 1, 0, 3, 1, 2, 3, 1, 0, 2, 1, 2, 0, 1,…
## $ V44  <int> 0, 3, 2, 1, 4, 0, 1, 0, 0, 5, 0, 2, 2, 1, 0, 3, 1, 3, 0, 2, 2, 1,…
## $ V45  <int> 0, 2, 1, 0, 2, 3, 1, 2, 2, 2, 2, 1, 1, 0, 4, 1, 0, 2, 2, 2, 3, 1,…
## $ V46  <int> 3, 1, 2, 1, 2, 3, 2, 1, 1, 2, 2, 1, 0, 2, 2, 2, 1, 3, 2, 1, 2, 1,…
## $ V47  <int> 2, 1, 0, 0, 2, 1, 1, 4, 3, 0, 4, 2, 1, 3, 1, 1, 2, 3, 2, 4, 1, 2,…
## $ V48  <int> 2, 2, 1, 2, 1, 0, 1, 3, 2, 3, 1, 3, 3, 3, 3, 1, 3, 3, 2, 0, 2, 2,…
## $ V49  <int> 3, 1, 1, 0, 1, 1, 1, 0, 1, 1, 4, 0, 1, 1, 3, 1, 0, 3, 3, 1, 3, 1,…
## $ V50  <int> 0, 2, 3, 0, 0, 5, 1, 0, 2, 1, 0, 0, 1, 1, 1, 3, 2, 2, 2, 2, 0, 1,…
## $ V51  <int> 2, 3, 1, 1, 1, 4, 0, 1, 1, 2, 0, 1, 1, 1, 0, 2, 0, 1, 2, 0, 0, 4,…
## $ V52  <int> 3, 4, 2, 4, 0, 1, 1, 1, 0, 1, 1, 0, 2, 1, 0, 1, 1, 0, 0, 0, 1, 1,…
## $ V53  <int> 1, 3, 3, 0, 1, 3, 1, 3, 2, 0, 1, 3, 1, 0, 3, 2, 3, 3, 0, 2, 1, 0,…
## $ V54  <int> 2, 2, 0, 1, 2, 2, 2, 0, 0, 0, 0, 1, 3, 0, 1, 1, 1, 1, 0, 2, 1, 0,…
## $ V55  <int> 0, 2, 4, 0, 3, 2, 1, 1, 2, 0, 1, 0, 4, 2, 1, 2, 0, 1, 1, 3, 2, 3,…
## $ V56  <int> 2, 3, 1, 3, 2, 2, 2, 1, 2, 1, 1, 0, 2, 3, 0, 0, 2, 1, 0, 1, 2, 1,…
## $ V57  <int> 1, 2, 0, 1, 1, 3, 3, 1, 2, 1, 0, 0, 1, 3, 0, 1, 0, 4, 3, 3, 2, 2,…
## $ V58  <int> 3, 3, 1, 2, 2, 2, 2, 0, 3, 3, 3, 1, 0, 1, 2, 2, 2, 2, 2, 1, 0, 1,…
## $ V59  <int> 2, 2, 2, 3, 1, 2, 3, 5, 1, 2, 2, 1, 0, 4, 0, 1, 2, 2, 0, 4, 2, 4,…
## $ V60  <int> 1, 1, 1, 3, 3, 3, 1, 0, 1, 2, 1, 2, 0, 2, 1, 2, 0, 1, 2, 2, 1, 0,…
## $ V61  <int> 1, 2, 1, 1, 4, 1, 0, 4, 2, 2, 1, 1, 0, 2, 3, 4, 3, 1, 0, 1, 1, 2,…
## $ V62  <int> 2, 2, 1, 3, 1, 2, 2, 3, 3, 1, 1, 1, 3, 1, 1, 0, 2, 1, 2, 2, 2, 0,…
## $ V63  <int> 1, 3, 1, 1, 2, 1, 0, 1, 2, 3, 1, 0, 3, 1, 2, 2, 2, 2, 1, 2, 2, 0,…
## $ V64  <int> 1, 1, 0, 1, 4, 4, 3, 1, 2, 1, 1, 2, 2, 1, 2, 1, 3, 2, 0, 1, 3, 1,…
## $ V65  <int> 2, 0, 0, 1, 2, 1, 0, 2, 1, 1, 2, 2, 3, 2, 2, 3, 3, 0, 2, 1, 2, 4,…
## $ V66  <int> 2, 1, 0, 3, 3, 0, 1, 0, 1, 2, 1, 0, 1, 1, 1, 3, 1, 2, 0, 3, 2, 1,…
## $ V67  <int> 2, 2, 1, 2, 3, 1, 2, 3, 4, 2, 2, 2, 2, 0, 2, 1, 0, 2, 1, 4, 1, 1,…
## $ V68  <int> 1, 2, 3, 1, 1, 1, 1, 1, 1, 1, 2, 2, 1, 1, 2, 1, 1, 1, 1, 2, 2, 1,…
## $ V69  <int> 1, 0, 2, 2, 2, 2, 0, 1, 2, 1, 3, 0, 1, 1, 2, 2, 0, 1, 2, 3, 2, 2,…
## $ V70  <int> 0, 2, 1, 1, 1, 2, 2, 0, 1, 4, 4, 3, 0, 2, 1, 3, 0, 2, 1, 2, 0, 1,…
## $ V71  <int> 2, 1, 2, 1, 1, 1, 2, 1, 3, 3, 1, 0, 2, 0, 3, 1, 0, 1, 0, 2, 0, 3,…
## $ V72  <int> 1, 1, 0, 2, 1, 4, 1, 1, 2, 2, 1, 2, 0, 3, 1, 2, 2, 0, 2, 2, 4, 4,…
## $ V73  <int> 0, 0, 2, 3, 1, 0, 1, 2, 3, 1, 2, 2, 2, 3, 0, 0, 1, 2, 2, 0, 1, 2,…
## $ V74  <int> 2, 5, 2, 1, 0, 0, 3, 2, 3, 3, 1, 3, 2, 1, 2, 3, 1, 1, 1, 2, 1, 2,…
## $ V75  <int> 2, 2, 0, 0, 4, 1, 0, 2, 3, 3, 2, 0, 3, 3, 0, 0, 2, 1, 2, 1, 1, 2,…
## $ V76  <int> 2, 2, 2, 1, 0, 2, 3, 2, 2, 0, 1, 2, 2, 2, 1, 2, 1, 0, 2, 1, 0, 1,…
## $ V77  <int> 0, 1, 3, 1, 2, 1, 0, 0, 1, 4, 1, 2, 1, 1, 2, 2, 3, 1, 0, 0, 2, 0,…
## $ V78  <int> 1, 1, 1, 1, 1, 3, 2, 2, 1, 1, 2, 0, 0, 0, 2, 2, 2, 1, 1, 2, 3, 3,…
## $ V79  <int> 3, 2, 1, 0, 1, 2, 2, 2, 3, 4, 2, 2, 1, 2, 1, 2, 0, 0, 2, 2, 2, 1,…
## $ V80  <int> 0, 1, 2, 2, 2, 1, 4, 4, 2, 1, 0, 3, 2, 0, 2, 3, 2, 0, 0, 1, 4, 2,…
## $ V81  <int> 2, 1, 1, 2, 3, 1, 1, 2, 2, 1, 2, 1, 4, 2, 0, 2, 3, 1, 1, 2, 1, 3,…
## $ V82  <int> 2, 4, 0, 3, 0, 3, 3, 1, 3, 1, 2, 4, 2, 3, 2, 2, 1, 2, 2, 2, 0, 2,…
## $ V83  <int> 1, 2, 1, 0, 2, 2, 2, 0, 3, 2, 1, 1, 1, 3, 0, 1, 2, 1, 2, 1, 3, 2,…
## $ V84  <int> 0, 4, 1, 1, 2, 0, 2, 4, 0, 2, 2, 3, 3, 2, 3, 2, 1, 2, 4, 2, 0, 2,…
## $ V85  <int> 3, 2, 3, 2, 1, 0, 0, 0, 3, 3, 1, 3, 0, 1, 2, 2, 0, 2, 0, 2, 0, 4,…
## $ V86  <int> 3, 2, 4, 0, 0, 1, 2, 3, 2, 0, 1, 3, 2, 2, 0, 3, 2, 3, 0, 1, 1, 3,…
## $ V87  <int> 0, 2, 3, 2, 2, 1, 2, 4, 2, 0, 3, 2, 1, 3, 1, 2, 2, 1, 2, 1, 3, 3,…
## $ V88  <int> 2, 2, 0, 0, 0, 3, 1, 3, 3, 1, 3, 1, 1, 4, 3, 1, 3, 1, 3, 1, 1, 3,…
## $ V89  <int> 1, 2, 3, 0, 1, 2, 0, 2, 2, 2, 1, 3, 1, 1, 2, 4, 2, 1, 2, 2, 2, 1,…
## $ V90  <int> 2, 0, 3, 2, 1, 1, 1, 2, 1, 0, 0, 0, 2, 3, 3, 2, -1, 2, 5, 1, 1, 0…
## $ V91  <int> 2, 2, 1, 1, 1, 3, 0, 3, 3, 0, 3, 3, 0, 0, 1, 4, 3, 3, 0, 2, 2, 1,…
## $ V92  <int> 1, 3, 1, 1, 2, 2, 2, 1, 2, 3, 1, 1, 3, 2, 1, 3, 2, 0, 1, 2, 2, 0,…
## $ V93  <int> 0, 1, 2, 2, 0, 3, 2, 0, 2, 2, 0, 2, 2, 3, 3, 3, 3, 1, 2, 3, 1, 0,…
## $ V94  <int> 1, 4, 2, 1, 1, 1, 0, 0, 1, 2, 1, 2, 1, 3, 1, 2, 0, 3, 1, 2, 3, 3,…
## $ V95  <int> 3, 2, 3, 1, 2, 4, 1, 0, 2, 2, 3, 0, 0, 2, 0, 1, 4, 3, 1, 1, 2, 2,…
## $ V96  <int> 1, 2, 2, 1, 0, 2, 0, 1, 1, 3, 1, 2, 3, 3, 1, 3, 4, 2, 0, 1, 2, 0,…
## $ V97  <int> 2, 1, 1, 1, 1, 2, 2, 1, 2, 0, 2, 3, 3, 2, 0, 2, 0, 1, 1, 0, 3, 1,…
## $ V98  <int> 1, 0, 1, 2, 1, 1, 2, 0, 1, 0, 1, 2, 1, 1, 3, 0, 2, 0, 3, 2, 0, 1,…
## $ V99  <int> 3, 2, 4, 0, 3, 2, 0, -1, 1, 1, 3, 2, 3, 2, 3, 1, 2, 0, 2, 4, 3, 1…
## $ V100 <int> 2, 3, 1, 1, 1, 0, 2, 2, 2, 1, 2, 1, 2, 2, 2, 1, 2, 0, 3, 0, 1, 1,…
DAT.SIM.01 <- dataX
head(DAT.SIM.01)
##     Y V1 V2 V3 V4 V5 V6 V7 V8 V9 V10 V11 V12 V13 V14 V15 V16 V17 V18 V19 V20
## 1  74  1  1  1  1 -1  1  1  0  2   4   0  -1   1   0   0   0   0   2   2   4
## 2  88  1  2  2  1 -1  3  3  2  4   1   1  -1   2   0   2   1   0   1   0   0
## 3 111  3  2  1  0  0  0  1  4  0   1   6   0   0   2   0   1   3   0   1   0
## 4 110  2  1  3  1 -1  3  1  2  2   0   2   0   2   1   2   3   2   1   0   1
## 5 123  2  1  2  4 -3  3  2  3  1   5   2   0   2   1   2   2   1   3   1   1
## 6 147  3  3  1  1 -1  2  1  1  1   1   6   0   3   3   2   3   2   0   2   0
##   V21 V22 V23 V24 V25 V26 V27 V28 V29 V30 V31 V32 V33 V34 V35 V36 V37 V38 V39
## 1   1   0  -1   2   1   1   2   1   3   0   1   3   1   2   1   1   2   1   0
## 2   1   0   2   0   3   0   2   1   1   0   2   0   1   2   3   2   2   2   1
## 3   0   3   0   1   1   2   0   2   2   1   1   2   3   2   2   2   1   1   4
## 4   0   0   1   1   0   2   0   4   0   3   0   2   0   2   1   1   2   1   1
## 5   0   1   1   1   0   2   1   0   3   2   3   2   2   0   1   2   2   3   2
## 6   0   1   2   1   3   1   2   2   1   2   1   0   2   2   0   3   2   2   2
##   V40 V41 V42 V43 V44 V45 V46 V47 V48 V49 V50 V51 V52 V53 V54 V55 V56 V57 V58
## 1   1   2   2   1   0   0   3   2   2   3   0   2   3   1   2   0   2   1   3
## 2   2   2   2   1   3   2   1   1   2   1   2   3   4   3   2   2   3   2   3
## 3   2   1   1   3   2   1   2   0   1   1   3   1   2   3   0   4   1   0   1
## 4   1   1   1   1   1   0   1   0   2   0   0   1   4   0   1   0   3   1   2
## 5   2   3   0   2   4   2   2   2   1   1   0   1   0   1   2   3   2   1   2
## 6   2   1   3   2   0   3   3   1   0   1   5   4   1   3   2   2   2   3   2
##   V59 V60 V61 V62 V63 V64 V65 V66 V67 V68 V69 V70 V71 V72 V73 V74 V75 V76 V77
## 1   2   1   1   2   1   1   2   2   2   1   1   0   2   1   0   2   2   2   0
## 2   2   1   2   2   3   1   0   1   2   2   0   2   1   1   0   5   2   2   1
## 3   2   1   1   1   1   0   0   0   1   3   2   1   2   0   2   2   0   2   3
## 4   3   3   1   3   1   1   1   3   2   1   2   1   1   2   3   1   0   1   1
## 5   1   3   4   1   2   4   2   3   3   1   2   1   1   1   1   0   4   0   2
## 6   2   3   1   2   1   4   1   0   1   1   2   2   1   4   0   0   1   2   1
##   V78 V79 V80 V81 V82 V83 V84 V85 V86 V87 V88 V89 V90 V91 V92 V93 V94 V95 V96
## 1   1   3   0   2   2   1   0   3   3   0   2   1   2   2   1   0   1   3   1
## 2   1   2   1   1   4   2   4   2   2   2   2   2   0   2   3   1   4   2   2
## 3   1   1   2   1   0   1   1   3   4   3   0   3   3   1   1   2   2   3   2
## 4   1   0   2   2   3   0   1   2   0   2   0   0   2   1   1   2   1   1   1
## 5   1   1   2   3   0   2   2   1   0   2   0   1   1   1   2   0   1   2   0
## 6   3   2   1   1   3   2   0   0   1   1   3   2   1   3   2   3   1   4   2
##   V97 V98 V99 V100
## 1   2   1   3    2
## 2   1   0   2    3
## 3   1   1   4    1
## 4   1   2   0    1
## 5   1   1   3    1
## 6   2   1   2    0

Pra Pemrosesan Data

Membagi data menjadi data latih 80% dan data uji 20%

set.seed(123)
indeks <- createDataPartition(DAT.SIM.01$Y, p = 0.8, list = FALSE)

data.train <- DAT.SIM.01[indeks,]
data.test <- DAT.SIM.01[-indeks,]
head(data.train)
##      Y V1 V2 V3 V4 V5 V6 V7 V8 V9 V10 V11 V12 V13 V14 V15 V16 V17 V18 V19 V20
## 1   74  1  1  1  1 -1  1  1  0  2   4   0  -1   1   0   0   0   0   2   2   4
## 3  111  3  2  1  0  0  0  1  4  0   1   6   0   0   2   0   1   3   0   1   0
## 4  110  2  1  3  1 -1  3  1  2  2   0   2   0   2   1   2   3   2   1   0   1
## 6  147  3  3  1  1 -1  2  1  1  1   1   6   0   3   3   2   3   2   0   2   0
## 7   97  2  1  0  3 -3  2  2  1  1   3   3   0   0   0   4   0   1   2   1   4
## 10  90  1  0  1  1 -2  1  0  2  1   2   1   0   1   2   1   2   0   2   3   3
##    V21 V22 V23 V24 V25 V26 V27 V28 V29 V30 V31 V32 V33 V34 V35 V36 V37 V38 V39
## 1    1   0  -1   2   1   1   2   1   3   0   1   3   1   2   1   1   2   1   0
## 3    0   3   0   1   1   2   0   2   2   1   1   2   3   2   2   2   1   1   4
## 4    0   0   1   1   0   2   0   4   0   3   0   2   0   2   1   1   2   1   1
## 6    0   1   2   1   3   1   2   2   1   2   1   0   2   2   0   3   2   2   2
## 7    1   1   2   1   4   1   2   0   2   1   2   2   2   2   2   2   1   3   1
## 10   2   2   1   2   3   1   1   0   1   1   2   2   1   2   1   1   4   0   0
##    V40 V41 V42 V43 V44 V45 V46 V47 V48 V49 V50 V51 V52 V53 V54 V55 V56 V57 V58
## 1    1   2   2   1   0   0   3   2   2   3   0   2   3   1   2   0   2   1   3
## 3    2   1   1   3   2   1   2   0   1   1   3   1   2   3   0   4   1   0   1
## 4    1   1   1   1   1   0   1   0   2   0   0   1   4   0   1   0   3   1   2
## 6    2   1   3   2   0   3   3   1   0   1   5   4   1   3   2   2   2   3   2
## 7    2   3   2   2   1   1   2   1   1   1   1   0   1   1   2   1   2   3   2
## 10   0   0   1   1   5   2   2   0   3   1   1   2   1   0   0   0   1   1   3
##    V59 V60 V61 V62 V63 V64 V65 V66 V67 V68 V69 V70 V71 V72 V73 V74 V75 V76 V77
## 1    2   1   1   2   1   1   2   2   2   1   1   0   2   1   0   2   2   2   0
## 3    2   1   1   1   1   0   0   0   1   3   2   1   2   0   2   2   0   2   3
## 4    3   3   1   3   1   1   1   3   2   1   2   1   1   2   3   1   0   1   1
## 6    2   3   1   2   1   4   1   0   1   1   2   2   1   4   0   0   1   2   1
## 7    3   1   0   2   0   3   0   1   2   1   0   2   2   1   1   3   0   3   0
## 10   2   2   2   1   3   1   1   2   2   1   1   4   3   2   1   3   3   0   4
##    V78 V79 V80 V81 V82 V83 V84 V85 V86 V87 V88 V89 V90 V91 V92 V93 V94 V95 V96
## 1    1   3   0   2   2   1   0   3   3   0   2   1   2   2   1   0   1   3   1
## 3    1   1   2   1   0   1   1   3   4   3   0   3   3   1   1   2   2   3   2
## 4    1   0   2   2   3   0   1   2   0   2   0   0   2   1   1   2   1   1   1
## 6    3   2   1   1   3   2   0   0   1   1   3   2   1   3   2   3   1   4   2
## 7    2   2   4   1   3   2   2   0   2   2   1   0   1   0   2   2   0   1   0
## 10   1   4   1   1   1   2   2   3   0   0   1   2   0   0   3   2   2   2   3
##    V97 V98 V99 V100
## 1    2   1   3    2
## 3    1   1   4    1
## 4    1   2   0    1
## 6    2   1   2    0
## 7    2   2   0    2
## 10   0   0   1    1
head(data.test)
##      Y V1 V2 V3 V4 V5 V6 V7 V8 V9 V10 V11 V12 V13 V14 V15 V16 V17 V18 V19 V20
## 2   88  1  2  2  1 -1  3  3  2  4   1   1  -1   2   0   2   1   0   1   0   0
## 5  123  2  1  2  4 -3  3  2  3  1   5   2   0   2   1   2   2   1   3   1   1
## 8   74  0  0  1  1 -2  2  3  3  1   0   0  -2   2   0   1   2   2   3   2   0
## 9  112  1  1  4  2 -2  1  1  2  3   3   0  -1   2   1   2   2   1   3   4   3
## 18  73  0  0  1  3 -2  1  1  3  3   0  -2  -2   2   2   4   3   1   2   1   0
## 19 119  2  1  1  2 -2  4  1  1  2   0   4   0   0   2   2   1   3   1   2   3
##    V21 V22 V23 V24 V25 V26 V27 V28 V29 V30 V31 V32 V33 V34 V35 V36 V37 V38 V39
## 2    1   0   2   0   3   0   2   1   1   0   2   0   1   2   3   2   2   2   1
## 5    0   1   1   1   0   2   1   0   3   2   3   2   2   0   1   2   2   3   2
## 8    3   1   2   1   0   1   1   1   0   2   2   2   2   3   1   1   0   0   2
## 9    1   1   1   1   2   3   3   3   2   3   2   1   0   2   1   2   1   3   0
## 18   2   2   4   3   1   3   2   2   3   3   1   1   0   0   1   3   0   1   1
## 19   1   2   0   1   3   0   2   1   2   4   0   1   1   1   4   2   1  -1   2
##    V40 V41 V42 V43 V44 V45 V46 V47 V48 V49 V50 V51 V52 V53 V54 V55 V56 V57 V58
## 2    2   2   2   1   3   2   1   1   2   1   2   3   4   3   2   2   3   2   3
## 5    2   3   0   2   4   2   2   2   1   1   0   1   0   1   2   3   2   1   2
## 8    2   3   1   2   0   2   1   4   3   0   0   1   1   3   0   1   1   1   0
## 9    1   1   0   2   0   2   1   3   2   1   2   1   0   2   0   2   2   2   3
## 18   0   4   0   2   3   2   3   3   3   3   2   1   0   3   1   1   1   4   2
## 19   3   1   0   1   0   2   2   2   2   3   2   2   0   0   0   1   0   3   2
##    V59 V60 V61 V62 V63 V64 V65 V66 V67 V68 V69 V70 V71 V72 V73 V74 V75 V76 V77
## 2    2   1   2   2   3   1   0   1   2   2   0   2   1   1   0   5   2   2   1
## 5    1   3   4   1   2   4   2   3   3   1   2   1   1   1   1   0   4   0   2
## 8    5   0   4   3   1   1   2   0   3   1   1   0   1   1   2   2   2   2   0
## 9    1   1   2   3   2   2   1   1   4   1   2   1   3   2   3   3   3   2   1
## 18   2   1   1   1   2   2   0   2   2   1   1   2   1   0   2   1   1   0   1
## 19   0   2   0   2   1   0   2   0   1   1   2   1   0   2   2   1   2   2   0
##    V78 V79 V80 V81 V82 V83 V84 V85 V86 V87 V88 V89 V90 V91 V92 V93 V94 V95 V96
## 2    1   2   1   1   4   2   4   2   2   2   2   2   0   2   3   1   4   2   2
## 5    1   1   2   3   0   2   2   1   0   2   0   1   1   1   2   0   1   2   0
## 8    2   2   4   2   1   0   4   0   3   4   3   2   2   3   1   0   0   0   1
## 9    1   3   2   2   3   3   0   3   2   2   3   2   1   3   2   2   1   2   1
## 18   1   0   0   1   2   1   2   2   3   1   1   1   2   3   0   1   3   3   2
## 19   1   2   0   1   2   2   4   0   0   2   3   2   5   0   1   2   1   1   0
##    V97 V98 V99 V100
## 2    1   0   2    3
## 5    1   1   3    1
## 8    1   0  -1    2
## 9    2   1   1    2
## 18   1   0   0    0
## 19   1   3   2    3

Memeriksa apakah ada data hilang / missing

sum(anyNA(data.train))
## [1] 0

Memeriksa apakah peubah respon Y sudah menyebar normal

ggpubr::gghistogram(data = data.train, x = "Y", fill = "yellow") + scale_y_continuous(expand = c(0,0))
## Warning: Using `bins = 30` by default. Pick better value with the argument
## `bins`.

Hasil pemrosesan data didapatkan data yang sudah dibagi menjadi data train dan data test. Pada data train tidak terdapat data hilang dan variabel respon menyebar normal. Hal ini menandakan bahwa data train sudah siap untuk dibuatkan ke dalam model regresi linear, regresi seleksi peubah terbaik, regresi gulud, regresi LASSO, dan regresi jaring elastis.

Membuat Model

Uji Asumsi Klasik Analisis Regresi

Dalam penggunaan metode kuadrat tengah terkecil (MKT) terdapat beberapa asumsi regresi klasik yang harus diuji sebagai berikut : 1. Sisaan menyebar normal 2. Sisaan memiliki ragam yang homogen (konstan) 3. Sisaan saling bebas 4. Multikolinearitas

Model Regresi Linear

model.regresi.linear <- lm(Y~., data = data.train)
summary(model.regresi.linear)
## 
## Call:
## lm(formula = Y ~ ., data = data.train)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -17.5709  -3.1188   0.0566   3.0629  13.9154 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 16.551643   2.605867   6.352 3.83e-10 ***
## V1           4.351409   0.512555   8.490  < 2e-16 ***
## V2           1.525206   0.323505   4.715 2.92e-06 ***
## V3           2.431327   0.178210  13.643  < 2e-16 ***
## V4           2.162966   0.293925   7.359 5.21e-13 ***
## V5           2.191349   0.309092   7.090 3.29e-12 ***
## V6           2.740667   0.173956  15.755  < 2e-16 ***
## V7           2.232646   0.175321  12.735  < 2e-16 ***
## V8           2.250465   0.167150  13.464  < 2e-16 ***
## V9           2.610128   0.167606  15.573  < 2e-16 ***
## V10          2.642479   0.176074  15.008  < 2e-16 ***
## V11          6.573465   0.270785  24.276  < 2e-16 ***
## V12          3.466450   0.476114   7.281 8.95e-13 ***
## V13          5.277090   0.175129  30.133  < 2e-16 ***
## V14          5.007867   0.173426  28.876  < 2e-16 ***
## V15          4.709994   0.172038  27.378  < 2e-16 ***
## V16          2.832360   0.171605  16.505  < 2e-16 ***
## V17          2.228721   0.167349  13.318  < 2e-16 ***
## V18          2.321515   0.170648  13.604  < 2e-16 ***
## V19          2.461780   0.174024  14.146  < 2e-16 ***
## V20          2.483444   0.170456  14.569  < 2e-16 ***
## V21          0.121989   0.175993   0.693   0.4884    
## V22          0.311899   0.177584   1.756   0.0795 .  
## V23          0.147121   0.173905   0.846   0.3979    
## V24          0.212430   0.174089   1.220   0.2228    
## V25         -0.107407   0.166007  -0.647   0.5178    
## V26         -0.104556   0.180504  -0.579   0.5626    
## V27         -0.131992   0.177852  -0.742   0.4582    
## V28          0.226806   0.172279   1.317   0.1884    
## V29          0.126034   0.172221   0.732   0.4645    
## V30         -0.220707   0.174396  -1.266   0.2061    
## V31          0.237266   0.174551   1.359   0.1745    
## V32         -0.224840   0.177269  -1.268   0.2051    
## V33          0.076177   0.178326   0.427   0.6694    
## V34         -0.085518   0.171518  -0.499   0.6182    
## V35          0.201455   0.170265   1.183   0.2371    
## V36          0.174484   0.181992   0.959   0.3380    
## V37         -0.154026   0.171932  -0.896   0.3706    
## V38          0.232239   0.172298   1.348   0.1781    
## V39          0.173590   0.176086   0.986   0.3246    
## V40          0.009873   0.169578   0.058   0.9536    
## V41         -0.265033   0.176008  -1.506   0.1326    
## V42          0.041339   0.169972   0.243   0.8079    
## V43         -0.036930   0.173845  -0.212   0.8318    
## V44          0.019317   0.177016   0.109   0.9131    
## V45          0.052071   0.171815   0.303   0.7619    
## V46         -0.119634   0.168332  -0.711   0.4775    
## V47         -0.192958   0.166607  -1.158   0.2472    
## V48         -0.181408   0.173409  -1.046   0.2959    
## V49         -0.007231   0.181767  -0.040   0.9683    
## V50         -0.005442   0.169531  -0.032   0.9744    
## V51          0.000143   0.167145   0.001   0.9993    
## V52         -0.305258   0.172507  -1.770   0.0772 .  
## V53         -0.063606   0.174754  -0.364   0.7160    
## V54         -0.063777   0.168303  -0.379   0.7048    
## V55          0.102452   0.170641   0.600   0.5484    
## V56         -0.170433   0.169484  -1.006   0.3150    
## V57          0.187456   0.169054   1.109   0.2679    
## V58          0.015732   0.181481   0.087   0.9309    
## V59         -0.080146   0.169423  -0.473   0.6363    
## V60          0.119279   0.174109   0.685   0.4935    
## V61         -0.058997   0.172236  -0.343   0.7320    
## V62         -0.152777   0.177990  -0.858   0.3910    
## V63          0.140543   0.176439   0.797   0.4260    
## V64         -0.305468   0.168435  -1.814   0.0702 .  
## V65          0.271861   0.172947   1.572   0.1164    
## V66         -0.169608   0.174483  -0.972   0.3314    
## V67         -0.322922   0.175213  -1.843   0.0657 .  
## V68          0.015587   0.180912   0.086   0.9314    
## V69          0.056360   0.180593   0.312   0.7551    
## V70          0.154523   0.172317   0.897   0.3702    
## V71          0.228565   0.179811   1.271   0.2041    
## V72         -0.199801   0.169433  -1.179   0.2387    
## V73          0.027051   0.177208   0.153   0.8787    
## V74          0.110846   0.176139   0.629   0.5294    
## V75         -0.243778   0.170577  -1.429   0.1534    
## V76         -0.218921   0.170051  -1.287   0.1984    
## V77         -0.002171   0.173142  -0.013   0.9900    
## V78         -0.038314   0.172311  -0.222   0.8241    
## V79         -0.108883   0.173472  -0.628   0.5304    
## V80         -0.041350   0.166737  -0.248   0.8042    
## V81          0.123786   0.175361   0.706   0.4805    
## V82         -0.031149   0.162395  -0.192   0.8479    
## V83          0.067811   0.174491   0.389   0.6977    
## V84          0.050946   0.174461   0.292   0.7704    
## V85          0.020459   0.178328   0.115   0.9087    
## V86         -0.278107   0.172455  -1.613   0.1073    
## V87          0.006003   0.178410   0.034   0.9732    
## V88          0.242294   0.173901   1.393   0.1640    
## V89          0.231220   0.178273   1.297   0.1951    
## V90          0.137853   0.170840   0.807   0.4200    
## V91          0.057881   0.175587   0.330   0.7418    
## V92          0.109140   0.178153   0.613   0.5403    
## V93         -0.101326   0.174401  -0.581   0.5614    
## V94         -0.357585   0.177719  -2.012   0.0446 *  
## V95         -0.265361   0.173522  -1.529   0.1267    
## V96          0.344979   0.179455   1.922   0.0550 .  
## V97         -0.230002   0.177017  -1.299   0.1943    
## V98          0.249162   0.176072   1.415   0.1575    
## V99          0.161082   0.169467   0.951   0.3422    
## V100        -0.040963   0.172710  -0.237   0.8126    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 4.985 on 700 degrees of freedom
## Multiple R-squared:  0.967,  Adjusted R-squared:  0.9623 
## F-statistic: 205.2 on 100 and 700 DF,  p-value: < 2.2e-16
Sisaan menyebar normal (Uji Jarque Bera)

Hipotesis : H0 : Sisaan menyebar normal H1 : Sisaan tidak menyebar normal

jarque.bera.test(model.regresi.linear$residuals)
## 
##  Jarque Bera Test
## 
## data:  model.regresi.linear$residuals
## X-squared = 0.35701, df = 2, p-value = 0.8365

Nilai p-value hasil uji Jarque Bera sebesar 0.8365 > 0.05, hal ini menandakan tidak cukup bukti untuk menolak H0 / gagal tolak H0. Dengan tingkat kepercayaan 95% dapat dikatakan bahwa sisaan menyebar normal.

Sisaan memiliki ragam yang homogen (Uji Breusch Pagan)

Hipotesis : H0 : Sisaan homogen H1 : Sisaan tidak homogen

bptest(model.regresi.linear, data = DAT.SIM.01)
## 
##  studentized Breusch-Pagan test
## 
## data:  model.regresi.linear
## BP = 105.21, df = 100, p-value = 0.3412

Nilai p-Value hasil uji Breush Pagan sebesar 0.3412 > 0.05, hal ini menandakan tidak cukup bukti untuk menolak H0 / gagal tolak H0. Dengan tingkat kepercayaan 95% dapat dikatakan bahwa sisaan memiliki ragam yang homogen (konstan)

Sisaan saling bebas (Uji Durbin Watson)

Hipotesis : H0 : Sisaan saling bebas H1 : Sisaan tidak saling bebas

dwtest(model.regresi.linear)
## 
##  Durbin-Watson test
## 
## data:  model.regresi.linear
## DW = 1.9898, p-value = 0.4464
## alternative hypothesis: true autocorrelation is greater than 0

Nilai p-Value hasil uji Durbin Watson sebesar 0.4464 > 0.05, hal ini menandakan tidak cukup bukti untuk menolak H0 / gagal tolak H0. Dengan tingkat kepercayaan 95% dapat dikatakan bahwa sisaan saling bebas atau tidak ada autokorelasi

Multikolinearitas (Nilai Variance Inflation Factor (VIF))
vif(model.regresi.linear)
##        V1        V2        V3        V4        V5        V6        V7        V8 
##  9.859183  2.986742  1.142864  3.191945  3.176066  1.135861  1.167592  1.136962 
##        V9       V10       V11       V12       V13       V14       V15       V16 
##  1.153166  1.168007 12.069284  3.717045  1.167975  1.143789  1.137468  1.147510 
##       V17       V18       V19       V20       V21       V22       V23       V24 
##  1.136317  1.116181  1.169285  1.166407  1.164244  1.171535  1.178129  1.136768 
##       V25       V26       V27       V28       V29       V30       V31       V32 
##  1.124246  1.175202  1.158268  1.161678  1.118670  1.138116  1.169020  1.141702 
##       V33       V34       V35       V36       V37       V38       V39       V40 
##  1.159280  1.135183  1.145771  1.191939  1.115526  1.173597  1.155169  1.125545 
##       V41       V42       V43       V44       V45       V46       V47       V48 
##  1.132544  1.138796  1.142632  1.173180  1.133684  1.152016  1.111677  1.125962 
##       V49       V50       V51       V52       V53       V54       V55       V56 
##  1.142097  1.183673  1.140522  1.143138  1.123609  1.142416  1.108325  1.110466 
##       V57       V58       V59       V60       V61       V62       V63       V64 
##  1.123029  1.158043  1.127158  1.176676  1.140685  1.140117  1.116569  1.166171 
##       V65       V66       V67       V68       V69       V70       V71       V72 
##  1.133264  1.110892  1.158347  1.118306  1.123369  1.118666  1.218804  1.138590 
##       V73       V74       V75       V76       V77       V78       V79       V80 
##  1.145184  1.141012  1.167923  1.160514  1.137807  1.115903  1.129608  1.105986 
##       V81       V82       V83       V84       V85       V86       V87       V88 
##  1.142010  1.121857  1.165568  1.129511  1.159757  1.174263  1.111889  1.121383 
##       V89       V90       V91       V92       V93       V94       V95       V96 
##  1.139645  1.132556  1.129106  1.180500  1.177181  1.114833  1.123984  1.127896 
##       V97       V98       V99      V100 
##  1.144945  1.144474  1.143441  1.123875

Dapat dilihat bahwa terdapat nilai yang tinggi di atas 5 pada V1 dan V11, hal ini menandakan adanya multikolinearitas.

Prediksi Data

preds_linear <- predict(model.regresi.linear, data.test)
head(preds_linear)
##         2         5         8         9        18        19 
##  87.30999 125.39733  66.04281 109.18904  76.27754 121.57398

Melihat akurasi prediksi dengan plot :

plot(data.test$Y, preds_linear, col="red")

cor(data.test$Y, preds_linear)
## [1] 0.9775027

Model Regresi Seleksi Peubah terbaik

Stepforward regression

model.forward.subsets <- regsubsets(Y ~ ., data = data.train, nvmax = 100,method = "forward", really.big = T)
summary(model.forward.subsets)
## Subset selection object
## Call: regsubsets.formula(Y ~ ., data = data.train, nvmax = 100, method = "forward", 
##     really.big = T)
## 100 Variables  (and intercept)
##      Forced in Forced out
## V1       FALSE      FALSE
## V2       FALSE      FALSE
## V3       FALSE      FALSE
## V4       FALSE      FALSE
## V5       FALSE      FALSE
## V6       FALSE      FALSE
## V7       FALSE      FALSE
## V8       FALSE      FALSE
## V9       FALSE      FALSE
## V10      FALSE      FALSE
## V11      FALSE      FALSE
## V12      FALSE      FALSE
## V13      FALSE      FALSE
## V14      FALSE      FALSE
## V15      FALSE      FALSE
## V16      FALSE      FALSE
## V17      FALSE      FALSE
## V18      FALSE      FALSE
## V19      FALSE      FALSE
## V20      FALSE      FALSE
## V21      FALSE      FALSE
## V22      FALSE      FALSE
## V23      FALSE      FALSE
## V24      FALSE      FALSE
## V25      FALSE      FALSE
## V26      FALSE      FALSE
## V27      FALSE      FALSE
## V28      FALSE      FALSE
## V29      FALSE      FALSE
## V30      FALSE      FALSE
## V31      FALSE      FALSE
## V32      FALSE      FALSE
## V33      FALSE      FALSE
## V34      FALSE      FALSE
## V35      FALSE      FALSE
## V36      FALSE      FALSE
## V37      FALSE      FALSE
## V38      FALSE      FALSE
## V39      FALSE      FALSE
## V40      FALSE      FALSE
## V41      FALSE      FALSE
## V42      FALSE      FALSE
## V43      FALSE      FALSE
## V44      FALSE      FALSE
## V45      FALSE      FALSE
## V46      FALSE      FALSE
## V47      FALSE      FALSE
## V48      FALSE      FALSE
## V49      FALSE      FALSE
## V50      FALSE      FALSE
## V51      FALSE      FALSE
## V52      FALSE      FALSE
## V53      FALSE      FALSE
## V54      FALSE      FALSE
## V55      FALSE      FALSE
## V56      FALSE      FALSE
## V57      FALSE      FALSE
## V58      FALSE      FALSE
## V59      FALSE      FALSE
## V60      FALSE      FALSE
## V61      FALSE      FALSE
## V62      FALSE      FALSE
## V63      FALSE      FALSE
## V64      FALSE      FALSE
## V65      FALSE      FALSE
## V66      FALSE      FALSE
## V67      FALSE      FALSE
## V68      FALSE      FALSE
## V69      FALSE      FALSE
## V70      FALSE      FALSE
## V71      FALSE      FALSE
## V72      FALSE      FALSE
## V73      FALSE      FALSE
## V74      FALSE      FALSE
## V75      FALSE      FALSE
## V76      FALSE      FALSE
## V77      FALSE      FALSE
## V78      FALSE      FALSE
## V79      FALSE      FALSE
## V80      FALSE      FALSE
## V81      FALSE      FALSE
## V82      FALSE      FALSE
## V83      FALSE      FALSE
## V84      FALSE      FALSE
## V85      FALSE      FALSE
## V86      FALSE      FALSE
## V87      FALSE      FALSE
## V88      FALSE      FALSE
## V89      FALSE      FALSE
## V90      FALSE      FALSE
## V91      FALSE      FALSE
## V92      FALSE      FALSE
## V93      FALSE      FALSE
## V94      FALSE      FALSE
## V95      FALSE      FALSE
## V96      FALSE      FALSE
## V97      FALSE      FALSE
## V98      FALSE      FALSE
## V99      FALSE      FALSE
## V100     FALSE      FALSE
## 1 subsets of each size up to 100
## Selection Algorithm: forward
##            V1  V2  V3  V4  V5  V6  V7  V8  V9  V10 V11 V12 V13 V14 V15 V16 V17
## 1  ( 1 )   " " " " " " " " " " " " " " " " " " " " "*" " " " " " " " " " " " "
## 2  ( 1 )   " " " " " " " " " " " " " " " " " " " " "*" " " "*" " " " " " " " "
## 3  ( 1 )   " " " " " " " " " " " " " " " " " " " " "*" " " "*" " " "*" " " " "
## 4  ( 1 )   " " " " " " " " " " " " " " " " " " " " "*" " " "*" "*" "*" " " " "
## 5  ( 1 )   " " " " " " " " " " " " " " " " " " " " "*" " " "*" "*" "*" " " " "
## 6  ( 1 )   " " " " " " " " " " " " " " " " " " " " "*" " " "*" "*" "*" "*" " "
## 7  ( 1 )   " " " " " " " " " " " " " " " " " " "*" "*" " " "*" "*" "*" "*" " "
## 8  ( 1 )   " " " " " " " " " " " " " " " " "*" "*" "*" " " "*" "*" "*" "*" " "
## 9  ( 1 )   " " " " " " " " " " " " " " " " "*" "*" "*" " " "*" "*" "*" "*" " "
## 10  ( 1 )  " " " " " " " " " " "*" " " " " "*" "*" "*" " " "*" "*" "*" "*" " "
## 11  ( 1 )  " " " " " " " " " " "*" " " "*" "*" "*" "*" " " "*" "*" "*" "*" " "
## 12  ( 1 )  " " " " " " " " " " "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" " "
## 13  ( 1 )  " " " " " " " " " " "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*"
## 14  ( 1 )  " " " " " " " " " " "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*"
## 15  ( 1 )  " " " " "*" " " " " "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*"
## 16  ( 1 )  "*" " " "*" " " " " "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*"
## 17  ( 1 )  "*" " " "*" " " " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 18  ( 1 )  "*" "*" "*" " " " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 19  ( 1 )  "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 20  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 21  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 22  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 23  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 24  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 25  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 26  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 27  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 28  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 29  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 30  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 31  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 32  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 33  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 34  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 35  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 36  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 37  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 38  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 39  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 40  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 41  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 42  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 43  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 44  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 45  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 46  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 47  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 48  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 49  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 50  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 51  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 52  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 53  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 54  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 55  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 56  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 57  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 58  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 59  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 60  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 61  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 62  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 63  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 64  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 65  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 66  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 67  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 68  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 69  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 70  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 71  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 72  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 73  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 74  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 75  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 76  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 77  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 78  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 79  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 80  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 81  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 82  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 83  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 84  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 85  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 86  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 87  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 88  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 89  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 90  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 91  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 92  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 93  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 94  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 95  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 96  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 97  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 98  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 99  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 100  ( 1 ) "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
##            V18 V19 V20 V21 V22 V23 V24 V25 V26 V27 V28 V29 V30 V31 V32 V33 V34
## 1  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 2  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 3  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 4  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 5  ( 1 )   " " "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 6  ( 1 )   " " "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 7  ( 1 )   " " "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 8  ( 1 )   " " "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 9  ( 1 )   " " "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 10  ( 1 )  " " "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 11  ( 1 )  " " "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 12  ( 1 )  " " "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 13  ( 1 )  " " "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 14  ( 1 )  "*" "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 15  ( 1 )  "*" "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 16  ( 1 )  "*" "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 17  ( 1 )  "*" "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 18  ( 1 )  "*" "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 19  ( 1 )  "*" "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 20  ( 1 )  "*" "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 21  ( 1 )  "*" "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 22  ( 1 )  "*" "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 23  ( 1 )  "*" "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 24  ( 1 )  "*" "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 25  ( 1 )  "*" "*" "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 26  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " " " " " " " " " " "
## 27  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " " " " " " " " " " "
## 28  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " " " " " " " " " " "
## 29  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " " " " " " " " " " "
## 30  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " " " " " " " " " " "
## 31  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " " " " " " " " " " "
## 32  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " " " " " " " " " " "
## 33  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " " " " " " " " " " "
## 34  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " " " " " " " " " " "
## 35  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " "*" " " " " " " " "
## 36  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " "*" " " "*" " " " "
## 37  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " "*" " " "*" " " " "
## 38  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " "*" "*" "*" " " " "
## 39  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " "*" "*" "*" " " " "
## 40  ( 1 )  "*" "*" "*" " " "*" " " " " " " " " " " " " " " "*" "*" "*" " " " "
## 41  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " " " " " "*" "*" "*" " " " "
## 42  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 43  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 44  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 45  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 46  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 47  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 48  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 49  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 50  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 51  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 52  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 53  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 54  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 55  ( 1 )  "*" "*" "*" " " "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 56  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 57  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 58  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 59  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 60  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " "*" " " "*" "*" "*" " " " "
## 61  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " "*" "*" " " "*" "*" "*" " " " "
## 62  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " "*" "*" " " "*" "*" "*" " " " "
## 63  ( 1 )  "*" "*" "*" "*" "*" "*" "*" " " " " "*" "*" " " "*" "*" "*" " " " "
## 64  ( 1 )  "*" "*" "*" "*" "*" "*" "*" " " " " "*" "*" "*" "*" "*" "*" " " " "
## 65  ( 1 )  "*" "*" "*" "*" "*" "*" "*" " " " " "*" "*" "*" "*" "*" "*" " " " "
## 66  ( 1 )  "*" "*" "*" "*" "*" "*" "*" " " " " "*" "*" "*" "*" "*" "*" " " " "
## 67  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" " " " "
## 68  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" " " " "
## 69  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" " " " "
## 70  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" " " " "
## 71  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " " "
## 72  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " " "
## 73  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " " "
## 74  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " "*"
## 75  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " "*"
## 76  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 77  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 78  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 79  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 80  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 81  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 82  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 83  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 84  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 85  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 86  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 87  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 88  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 89  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 90  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 91  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 92  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 93  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 94  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 95  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 96  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 97  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 98  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 99  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 100  ( 1 ) "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
##            V35 V36 V37 V38 V39 V40 V41 V42 V43 V44 V45 V46 V47 V48 V49 V50 V51
## 1  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 2  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 3  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 4  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 5  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 6  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 7  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 8  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 9  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 10  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 11  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 12  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 13  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 14  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 15  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 16  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 17  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 18  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 19  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 20  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 21  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 22  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 23  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 24  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 25  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 26  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 27  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 28  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 29  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 30  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 31  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 32  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 33  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 34  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 35  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 36  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 37  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " "*" " " " " " " " "
## 38  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " "*" " " " " " " " "
## 39  ( 1 )  "*" " " " " " " " " " " "*" " " " " " " " " " " "*" " " " " " " " "
## 40  ( 1 )  "*" " " " " " " " " " " "*" " " " " " " " " " " "*" " " " " " " " "
## 41  ( 1 )  "*" " " " " " " " " " " "*" " " " " " " " " " " "*" " " " " " " " "
## 42  ( 1 )  "*" " " " " " " " " " " "*" " " " " " " " " " " "*" " " " " " " " "
## 43  ( 1 )  "*" " " " " " " " " " " "*" " " " " " " " " " " "*" " " " " " " " "
## 44  ( 1 )  "*" " " " " "*" " " " " "*" " " " " " " " " " " "*" " " " " " " " "
## 45  ( 1 )  "*" " " " " "*" " " " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 46  ( 1 )  "*" " " " " "*" " " " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 47  ( 1 )  "*" " " " " "*" " " " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 48  ( 1 )  "*" " " " " "*" " " " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 49  ( 1 )  "*" "*" " " "*" " " " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 50  ( 1 )  "*" "*" " " "*" " " " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 51  ( 1 )  "*" "*" " " "*" " " " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 52  ( 1 )  "*" "*" " " "*" " " " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 53  ( 1 )  "*" "*" " " "*" " " " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 54  ( 1 )  "*" "*" " " "*" " " " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 55  ( 1 )  "*" "*" " " "*" " " " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 56  ( 1 )  "*" "*" " " "*" " " " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 57  ( 1 )  "*" "*" " " "*" "*" " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 58  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 59  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 60  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 61  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 62  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 63  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 64  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " " " "*" "*" " " " " " "
## 65  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 66  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 67  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 68  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 69  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 70  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 71  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 72  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 73  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 74  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 75  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 76  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 77  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 78  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 79  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 80  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 81  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " " " "*" "*" "*" " " " " " "
## 82  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " "*" "*" "*" "*" " " " " " "
## 83  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " "*" "*" "*" "*" " " " " " "
## 84  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " "*" "*" "*" "*" " " " " " "
## 85  ( 1 )  "*" "*" "*" "*" "*" " " "*" " " " " " " "*" "*" "*" "*" " " " " " "
## 86  ( 1 )  "*" "*" "*" "*" "*" " " "*" "*" " " " " "*" "*" "*" "*" " " " " " "
## 87  ( 1 )  "*" "*" "*" "*" "*" " " "*" "*" " " " " "*" "*" "*" "*" " " " " " "
## 88  ( 1 )  "*" "*" "*" "*" "*" " " "*" "*" "*" " " "*" "*" "*" "*" " " " " " "
## 89  ( 1 )  "*" "*" "*" "*" "*" " " "*" "*" "*" " " "*" "*" "*" "*" " " " " " "
## 90  ( 1 )  "*" "*" "*" "*" "*" " " "*" "*" "*" " " "*" "*" "*" "*" " " " " " "
## 91  ( 1 )  "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" " " " " " "
## 92  ( 1 )  "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" " " " " " "
## 93  ( 1 )  "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" " " " " " "
## 94  ( 1 )  "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" " " " " " "
## 95  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " " " " "
## 96  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " " "
## 97  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " " "
## 98  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 99  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 100  ( 1 ) "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
##            V52 V53 V54 V55 V56 V57 V58 V59 V60 V61 V62 V63 V64 V65 V66 V67 V68
## 1  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 2  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 3  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 4  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 5  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 6  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 7  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 8  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 9  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 10  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 11  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 12  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 13  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 14  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 15  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 16  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 17  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 18  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 19  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 20  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 21  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 22  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 23  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 24  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" " " " " " " " "
## 25  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " " " " "
## 26  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " " " " "
## 27  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " " " " "
## 28  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " " " " "
## 29  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 30  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 31  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 32  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 33  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 34  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 35  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 36  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 37  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 38  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 39  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 40  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 41  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 42  ( 1 )  "*" " " " " " " " " " " " " " " " " " " " " " " "*" "*" " " "*" " "
## 43  ( 1 )  "*" " " " " " " " " " " " " " " " " " " "*" " " "*" "*" " " "*" " "
## 44  ( 1 )  "*" " " " " " " " " " " " " " " " " " " "*" " " "*" "*" " " "*" " "
## 45  ( 1 )  "*" " " " " " " " " " " " " " " " " " " "*" " " "*" "*" " " "*" " "
## 46  ( 1 )  "*" " " " " " " " " " " " " " " " " " " "*" " " "*" "*" " " "*" " "
## 47  ( 1 )  "*" " " " " " " " " " " " " " " " " " " "*" " " "*" "*" " " "*" " "
## 48  ( 1 )  "*" " " " " " " " " " " " " " " " " " " "*" " " "*" "*" " " "*" " "
## 49  ( 1 )  "*" " " " " " " " " " " " " " " " " " " "*" " " "*" "*" " " "*" " "
## 50  ( 1 )  "*" " " " " " " " " " " " " " " " " " " "*" " " "*" "*" " " "*" " "
## 51  ( 1 )  "*" " " " " " " " " "*" " " " " " " " " "*" " " "*" "*" " " "*" " "
## 52  ( 1 )  "*" " " " " " " " " "*" " " " " " " " " "*" " " "*" "*" " " "*" " "
## 53  ( 1 )  "*" " " " " " " " " "*" " " " " " " " " "*" " " "*" "*" " " "*" " "
## 54  ( 1 )  "*" " " " " " " "*" "*" " " " " " " " " "*" " " "*" "*" " " "*" " "
## 55  ( 1 )  "*" " " " " " " "*" "*" " " " " " " " " "*" " " "*" "*" "*" "*" " "
## 56  ( 1 )  "*" " " " " " " "*" "*" " " " " " " " " "*" " " "*" "*" "*" "*" " "
## 57  ( 1 )  "*" " " " " " " "*" "*" " " " " " " " " "*" " " "*" "*" "*" "*" " "
## 58  ( 1 )  "*" " " " " " " "*" "*" " " " " " " " " "*" " " "*" "*" "*" "*" " "
## 59  ( 1 )  "*" " " " " " " "*" "*" " " " " "*" " " "*" " " "*" "*" "*" "*" " "
## 60  ( 1 )  "*" " " " " " " "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 61  ( 1 )  "*" " " " " " " "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 62  ( 1 )  "*" " " " " " " "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 63  ( 1 )  "*" " " " " " " "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 64  ( 1 )  "*" " " " " " " "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 65  ( 1 )  "*" " " " " " " "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 66  ( 1 )  "*" " " " " " " "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 67  ( 1 )  "*" " " " " " " "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 68  ( 1 )  "*" " " " " "*" "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 69  ( 1 )  "*" " " " " "*" "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 70  ( 1 )  "*" " " " " "*" "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 71  ( 1 )  "*" " " " " "*" "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 72  ( 1 )  "*" " " " " "*" "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 73  ( 1 )  "*" " " " " "*" "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 74  ( 1 )  "*" " " " " "*" "*" "*" " " " " "*" " " "*" "*" "*" "*" "*" "*" " "
## 75  ( 1 )  "*" " " " " "*" "*" "*" " " "*" "*" " " "*" "*" "*" "*" "*" "*" " "
## 76  ( 1 )  "*" " " " " "*" "*" "*" " " "*" "*" " " "*" "*" "*" "*" "*" "*" " "
## 77  ( 1 )  "*" " " "*" "*" "*" "*" " " "*" "*" " " "*" "*" "*" "*" "*" "*" " "
## 78  ( 1 )  "*" " " "*" "*" "*" "*" " " "*" "*" " " "*" "*" "*" "*" "*" "*" " "
## 79  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" " " "*" "*" "*" "*" "*" "*" " "
## 80  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 81  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 82  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 83  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 84  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 85  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 86  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 87  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 88  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 89  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 90  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 91  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 92  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 93  ( 1 )  "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 94  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 95  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 96  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 97  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 98  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 99  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 100  ( 1 ) "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
##            V69 V70 V71 V72 V73 V74 V75 V76 V77 V78 V79 V80 V81 V82 V83 V84 V85
## 1  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 2  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 3  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 4  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 5  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 6  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 7  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 8  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 9  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 10  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 11  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 12  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 13  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 14  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 15  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 16  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 17  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 18  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 19  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 20  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 21  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 22  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 23  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 24  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 25  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 26  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 27  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " "
## 28  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 29  ( 1 )  " " " " " " " " " " " " "*" " " " " " " " " " " " " " " " " " " " "
## 30  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 31  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 32  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 33  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 34  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 35  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 36  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 37  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 38  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 39  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 40  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 41  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 42  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 43  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 44  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 45  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 46  ( 1 )  " " " " " " " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 47  ( 1 )  " " " " "*" " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 48  ( 1 )  " " "*" "*" " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 49  ( 1 )  " " "*" "*" " " " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 50  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 51  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 52  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 53  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 54  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 55  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 56  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 57  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 58  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 59  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 60  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 61  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " " " " " " " " " " "
## 62  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " "*" " " " " " " " "
## 63  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " "*" " " " " " " " "
## 64  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " "*" " " " " " " " "
## 65  ( 1 )  " " "*" "*" "*" " " " " "*" "*" " " " " " " " " "*" " " " " " " " "
## 66  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " " " " " "*" " " " " " " " "
## 67  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " " " " " "*" " " " " " " " "
## 68  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " " " " " "*" " " " " " " " "
## 69  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " " " " " " "
## 70  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " " " " " " "
## 71  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " " " " " " "
## 72  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " " " " " " "
## 73  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " "*" " " " "
## 74  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " "*" " " " "
## 75  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " "*" " " " "
## 76  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " "*" " " " "
## 77  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " "*" " " " "
## 78  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " "*" " " " "
## 79  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " "*" " " " "
## 80  ( 1 )  " " "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " "*" " " " "
## 81  ( 1 )  "*" "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " "*" " " " "
## 82  ( 1 )  "*" "*" "*" "*" " " "*" "*" "*" " " " " "*" " " "*" " " "*" " " " "
## 83  ( 1 )  "*" "*" "*" "*" " " "*" "*" "*" " " "*" "*" " " "*" " " "*" " " " "
## 84  ( 1 )  "*" "*" "*" "*" " " "*" "*" "*" " " "*" "*" "*" "*" " " "*" " " " "
## 85  ( 1 )  "*" "*" "*" "*" " " "*" "*" "*" " " "*" "*" "*" "*" " " "*" "*" " "
## 86  ( 1 )  "*" "*" "*" "*" " " "*" "*" "*" " " "*" "*" "*" "*" " " "*" "*" " "
## 87  ( 1 )  "*" "*" "*" "*" " " "*" "*" "*" " " "*" "*" "*" "*" " " "*" "*" " "
## 88  ( 1 )  "*" "*" "*" "*" " " "*" "*" "*" " " "*" "*" "*" "*" " " "*" "*" " "
## 89  ( 1 )  "*" "*" "*" "*" " " "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" " "
## 90  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" " "
## 91  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" " "
## 92  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*"
## 93  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*"
## 94  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*"
## 95  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*"
## 96  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*"
## 97  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*"
## 98  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*"
## 99  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 100  ( 1 ) "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
##            V86 V87 V88 V89 V90 V91 V92 V93 V94 V95 V96 V97 V98 V99 V100
## 1  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 2  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 3  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 4  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 5  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 6  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 7  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 8  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 9  ( 1 )   " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 10  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 11  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 12  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 13  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 14  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 15  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 16  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 17  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 18  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 19  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 20  ( 1 )  " " " " " " " " " " " " " " " " " " " " " " " " " " " " " " 
## 21  ( 1 )  " " " " " " " " " " " " " " " " "*" " " " " " " " " " " " " 
## 22  ( 1 )  " " " " " " " " " " " " " " " " "*" " " " " " " " " " " " " 
## 23  ( 1 )  " " " " " " " " " " " " " " " " "*" " " " " " " " " " " " " 
## 24  ( 1 )  " " " " " " " " " " " " " " " " "*" " " " " " " " " " " " " 
## 25  ( 1 )  " " " " " " " " " " " " " " " " "*" " " " " " " " " " " " " 
## 26  ( 1 )  " " " " " " " " " " " " " " " " "*" " " " " " " " " " " " " 
## 27  ( 1 )  " " " " " " " " " " " " " " " " "*" " " " " " " "*" " " " " 
## 28  ( 1 )  " " " " " " " " " " " " " " " " "*" " " " " " " "*" " " " " 
## 29  ( 1 )  " " " " " " " " " " " " " " " " "*" " " " " " " "*" " " " " 
## 30  ( 1 )  " " " " " " " " " " " " " " " " "*" " " " " " " "*" " " " " 
## 31  ( 1 )  " " " " " " " " " " " " " " " " "*" "*" " " " " "*" " " " " 
## 32  ( 1 )  " " " " " " " " " " " " " " " " "*" "*" "*" " " "*" " " " " 
## 33  ( 1 )  " " " " "*" " " " " " " " " " " "*" "*" "*" " " "*" " " " " 
## 34  ( 1 )  "*" " " "*" " " " " " " " " " " "*" "*" "*" " " "*" " " " " 
## 35  ( 1 )  "*" " " "*" " " " " " " " " " " "*" "*" "*" " " "*" " " " " 
## 36  ( 1 )  "*" " " "*" " " " " " " " " " " "*" "*" "*" " " "*" " " " " 
## 37  ( 1 )  "*" " " "*" " " " " " " " " " " "*" "*" "*" " " "*" " " " " 
## 38  ( 1 )  "*" " " "*" " " " " " " " " " " "*" "*" "*" " " "*" " " " " 
## 39  ( 1 )  "*" " " "*" " " " " " " " " " " "*" "*" "*" " " "*" " " " " 
## 40  ( 1 )  "*" " " "*" " " " " " " " " " " "*" "*" "*" "*" "*" " " " " 
## 41  ( 1 )  "*" " " "*" " " " " " " " " " " "*" "*" "*" "*" "*" " " " " 
## 42  ( 1 )  "*" " " "*" " " " " " " " " " " "*" "*" "*" "*" "*" " " " " 
## 43  ( 1 )  "*" " " "*" " " " " " " " " " " "*" "*" "*" "*" "*" " " " " 
## 44  ( 1 )  "*" " " "*" " " " " " " " " " " "*" "*" "*" "*" "*" " " " " 
## 45  ( 1 )  "*" " " "*" " " " " " " " " " " "*" "*" "*" "*" "*" " " " " 
## 46  ( 1 )  "*" " " "*" "*" " " " " " " " " "*" "*" "*" "*" "*" " " " " 
## 47  ( 1 )  "*" " " "*" "*" " " " " " " " " "*" "*" "*" "*" "*" " " " " 
## 48  ( 1 )  "*" " " "*" "*" " " " " " " " " "*" "*" "*" "*" "*" " " " " 
## 49  ( 1 )  "*" " " "*" "*" " " " " " " " " "*" "*" "*" "*" "*" " " " " 
## 50  ( 1 )  "*" " " "*" "*" " " " " " " " " "*" "*" "*" "*" "*" " " " " 
## 51  ( 1 )  "*" " " "*" "*" " " " " " " " " "*" "*" "*" "*" "*" " " " " 
## 52  ( 1 )  "*" " " "*" "*" " " " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 53  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 54  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 55  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 56  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 57  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 58  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 59  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 60  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 61  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 62  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 63  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 64  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 65  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 66  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 67  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 68  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 69  ( 1 )  "*" " " "*" "*" "*" " " " " " " "*" "*" "*" "*" "*" "*" " " 
## 70  ( 1 )  "*" " " "*" "*" "*" " " "*" " " "*" "*" "*" "*" "*" "*" " " 
## 71  ( 1 )  "*" " " "*" "*" "*" " " "*" " " "*" "*" "*" "*" "*" "*" " " 
## 72  ( 1 )  "*" " " "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 73  ( 1 )  "*" " " "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 74  ( 1 )  "*" " " "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 75  ( 1 )  "*" " " "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 76  ( 1 )  "*" " " "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 77  ( 1 )  "*" " " "*" "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 78  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 79  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 80  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 81  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 82  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 83  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 84  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 85  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 86  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " 
## 87  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" 
## 88  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" 
## 89  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" 
## 90  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" 
## 91  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" 
## 92  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" 
## 93  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" 
## 94  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" 
## 95  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" 
## 96  ( 1 )  "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" 
## 97  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" 
## 98  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" 
## 99  ( 1 )  "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" 
## 100  ( 1 ) "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
res.sum <- summary(model.forward.subsets)
data.frame(
  Adj.R2 = which.max(res.sum$adjr2),
  CP = which.min(res.sum$cp),
  BIC = which.min(res.sum$bic)
)
##   Adj.R2 CP BIC
## 1     51 33  20

Pada output di atas dapat dilihat bahwa penyusunan model dengan 20 peubah penjelas menghasilkan nilai BIC terkecil. Maka dari itu penyusunan model regresi akan dimasukkan 20 peubah terbaik.

# id: model id
# object: regsubsets object
# data: data used to fit regsubsets
# outcome: outcome variable
get_model_formula <- function(id, object, outcome){
  # get models data
  models <- summary(object)$which[id,-1]
  # Get outcome variable
  #form <- as.formula(object$call[[2]])
  #outcome <- all.vars(form)[1]
  # Get model predictors
  predictors <- names(which(models == TRUE))
  predictors <- paste(predictors, collapse = "+")
  # Build model formula
  as.formula(paste0(outcome, "~", predictors))
}

20 peubah penjelas yang akan dibuatkan model regresi

get_model_formula(20, model.forward.subsets, "Y")
## Y ~ V1 + V2 + V3 + V4 + V5 + V6 + V7 + V8 + V9 + V10 + V11 + 
##     V12 + V13 + V14 + V15 + V16 + V17 + V18 + V19 + V20
## <environment: 0x000001fa2fa81eb8>

Model regresi terbaik hasil seleksi peubah

model.regresi.terbaik <- lm(Y ~ V1 + V2 + V3 + V4 + V5 + V6 + V7 + V8 + V9 + V10 + V11 + 
    V12 + V13 + V14 + V15 + V16 + V17 + V18 + V19 + V20, data = data.train)
summary(model.regresi.terbaik)
## 
## Call:
## lm(formula = Y ~ V1 + V2 + V3 + V4 + V5 + V6 + V7 + V8 + V9 + 
##     V10 + V11 + V12 + V13 + V14 + V15 + V16 + V17 + V18 + V19 + 
##     V20, data = data.train)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -17.6280  -3.4048   0.1137   3.3310  14.2054 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  16.6598     1.1255  14.802  < 2e-16 ***
## V1            4.0780     0.4824   8.453  < 2e-16 ***
## V2            1.6828     0.3020   5.572 3.46e-08 ***
## V3            2.4013     0.1696  14.157  < 2e-16 ***
## V4            2.1611     0.2765   7.816 1.76e-14 ***
## V5            2.2121     0.2915   7.589 9.22e-14 ***
## V6            2.6452     0.1643  16.098  < 2e-16 ***
## V7            2.2791     0.1641  13.890  < 2e-16 ***
## V8            2.3256     0.1579  14.729  < 2e-16 ***
## V9            2.5307     0.1582  15.993  < 2e-16 ***
## V10           2.5953     0.1661  15.627  < 2e-16 ***
## V11           6.6458     0.2547  26.096  < 2e-16 ***
## V12           3.6323     0.4461   8.142 1.54e-15 ***
## V13           5.2076     0.1633  31.885  < 2e-16 ***
## V14           4.9730     0.1641  30.296  < 2e-16 ***
## V15           4.7847     0.1625  29.442  < 2e-16 ***
## V16           2.8968     0.1617  17.911  < 2e-16 ***
## V17           2.2621     0.1591  14.219  < 2e-16 ***
## V18           2.3092     0.1632  14.149  < 2e-16 ***
## V19           2.4908     0.1636  15.227  < 2e-16 ***
## V20           2.4678     0.1596  15.467  < 2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 4.977 on 780 degrees of freedom
## Multiple R-squared:  0.9634, Adjusted R-squared:  0.9624 
## F-statistic:  1025 on 20 and 780 DF,  p-value: < 2.2e-16
Sisaan menyebar normal (Uji Jarque Bera)

Hipotesis : H0 : Sisaan menyebar normal H1 : Sisaan tidak menyebar normal

jarque.bera.test(model.regresi.terbaik$residuals)
## 
##  Jarque Bera Test
## 
## data:  model.regresi.terbaik$residuals
## X-squared = 1.1813, df = 2, p-value = 0.554

Nilai p-value hasil uji Jarque Bera sebesar 0.554 > 0.05, hal ini menandakan tidak cukup bukti untuk menolak H0 / gagal tolak H0. Dengan tingkat kepercayaan 95% dapat dikatakan bahwa sisaan menyebar normal.

Sisaan memiliki ragam yang homogen (Uji Breusch Pagan)

Hipotesis : H0 : Sisaan homogen H1 : Sisaan tidak homogen

bptest(model.regresi.terbaik, data = DAT.SIM.01)
## 
##  studentized Breusch-Pagan test
## 
## data:  model.regresi.terbaik
## BP = 36.406, df = 20, p-value = 0.01378

Nilai p-Value hasil uji Breush Pagan sebesar 0.001378 < 0.05, hal ini menandakan cukup bukti untuk menolak H0 / tolak H0. Dengan tingkat kepercayaan 95% dapat dikatakan bahwa sisaan memiliki ragam yang tidak homogen.

Sisaan saling bebas (Uji Durbin Watson)

Hipotesis : H0 : Sisaan saling bebas H1 : Sisaan tidak saling bebas

dwtest(model.regresi.terbaik)
## 
##  Durbin-Watson test
## 
## data:  model.regresi.terbaik
## DW = 1.9937, p-value = 0.4675
## alternative hypothesis: true autocorrelation is greater than 0

Nilai p-Value hasil uji Durbin Watson sebesar 0.4675 > 0.05, hal ini menandakan tidak cukup bukti untuk menolak H0 / gagal tolak H0. Dengan tingkat kepercayaan 95% dapat dikatakan bahwa sisaan saling bebas atau tidak ada autokorelasi

Multikolinearitas (Nilai Variance Inflation Factor (VIF))
vif(model.regresi.terbaik)
##        V1        V2        V3        V4        V5        V6        V7        V8 
##  8.761241  2.611134  1.038668  2.833613  2.833616  1.016729  1.025985  1.017779 
##        V9       V10       V11       V12       V13       V14       V15       V16 
##  1.031055  1.042431 10.709299  3.273919  1.019071  1.027938  1.018189  1.022531 
##       V17       V18       V19       V20 
##  1.030245  1.024175  1.036396  1.025221

Dapat dilihat bahwa terdapat nilai yang tinggi di atas 5 pada V1 dan V11, hal ini menandakan adanya multikolinearitas.

Prediksi Data

preds_best <- predict(model.regresi.terbaik, data.test)
head(preds_best)
##         2         5         8         9        18        19 
##  89.20158 126.47875  68.59583 110.55281  77.61893 119.52129

Melihat akurasi prediksi dengan plot :

plot(data.test$Y, preds_best, col="red")

cor(data.test$Y, preds_best)
## [1] 0.9793237

Pada data bangkitkan sudah ditentukan di awal terdapat 4 peubah penjelas yang berkorelasi kuat. Akan dilihat peubah mana yang berkorelasi menggunakan matriks dan plot korelasi sebagai berikut:

data_korelasi <- data.train[c(2,3,5,6,12,13)]
mk <- cor(data_korelasi)

round(mk,2)
##        V1    V2    V4    V5   V11   V12
## V1   1.00  0.60 -0.02  0.00  0.94  0.68
## V2   0.60  1.00  0.01 -0.02  0.61  0.10
## V4  -0.02  0.01  1.00 -0.80 -0.01 -0.03
## V5   0.00 -0.02 -0.80  1.00 -0.01  0.01
## V11  0.94  0.61 -0.01 -0.01  1.00  0.71
## V12  0.68  0.10 -0.03  0.01  0.71  1.00
corrplot(mk, type="lower",
         order = "hclust", # mengurutkan berdasarkan hierarchical clustering
         tl.col= "black", # warna tulisan
         addCoef.col = "black", # tambahkan koefisien korelasi
         diag=FALSE, #menyembunyikan koefisien pada diagonal
         tl.srt= 45, # kemiringan tulisan 45 derajat
         method = "circle") # Bentuk Visualisasi

Dapat dilihat dari matriks korelasi di atas, bahwa terdapat korelasi yang tinggi antara V1-V2, V1-V11, V1-V12, V2-V11, V4-V5, dan V11-V12. Solusi dari masalah ini dapat diselesaikan dengan beberapa cara yaitu dengan menghilangkan peubah yang berkorelasi tinggi atau dengan menggunakan metode penyusutan (shrinkage) peubah seperti regresi gulud, LASSO, dan jaring elastis.

Regresi Gulud

set.seed(123)

model_ridge_mae <- cv.glmnet(Y~.,data=data.train,alpha=0, type.measure="mae",family="gaussian",nfolds=10)
plot(model_ridge_mae)

Pada plot di atas, banyaknya peubah terpilih dilihat dari angka di atas plot, garis putus-putus adalah nilai lambda yang optimum berdasarkan nilai mae terkecil.

info_ridge_mae <- broom::glance(model_ridge_mae)
info_ridge_mae
## # A tibble: 1 × 3
##   lambda.min lambda.1se  nobs
##        <dbl>      <dbl> <int>
## 1       2.14       2.83   801
broom::tidy(model_ridge_mae)
## # A tibble: 100 × 6
##    lambda estimate std.error conf.low conf.high nzero
##     <dbl>    <dbl>     <dbl>    <dbl>     <dbl> <int>
##  1 21429.     20.7     0.529     20.2      21.3   100
##  2 19525.     20.7     0.528     20.1      21.2   100
##  3 17791.     20.7     0.527     20.1      21.2   100
##  4 16210.     20.7     0.527     20.1      21.2   100
##  5 14770.     20.6     0.527     20.1      21.2   100
##  6 13458.     20.6     0.527     20.1      21.2   100
##  7 12263.     20.6     0.527     20.1      21.2   100
##  8 11173.     20.6     0.527     20.1      21.2   100
##  9 10181.     20.6     0.527     20.1      21.1   100
## 10  9276.     20.6     0.527     20.1      21.1   100
## # … with 90 more rows

Nilai koefisien terpilih dapat dikeluarkan dengan bantuan package broom di bawah ini:

broom::tidy(model_ridge_mae$glmnet.fit)
## # A tibble: 10,100 × 5
##    term         step estimate lambda dev.ratio
##    <chr>       <dbl>    <dbl>  <dbl>     <dbl>
##  1 (Intercept)     1     101. 21429.  4.71e-36
##  2 (Intercept)     2     101. 19525.  6.10e- 3
##  3 (Intercept)     3     101. 17791.  6.69e- 3
##  4 (Intercept)     4     101. 16210.  7.33e- 3
##  5 (Intercept)     5     101. 14770.  8.04e- 3
##  6 (Intercept)     6     101. 13458.  8.82e- 3
##  7 (Intercept)     7     101. 12263.  9.68e- 3
##  8 (Intercept)     8     101. 11173.  1.06e- 2
##  9 (Intercept)     9     101. 10181.  1.16e- 2
## 10 (Intercept)    10     101.  9276.  1.28e- 2
## # … with 10,090 more rows
ridge_mae_min <- broom::tidy(model_ridge_mae$glmnet.fit) %>% filter(lambda==info_ridge_mae$lambda.min)
ridge_mae_min
## # A tibble: 101 × 5
##    term         step estimate lambda dev.ratio
##    <chr>       <dbl>    <dbl>  <dbl>     <dbl>
##  1 (Intercept)   100    24.8    2.14     0.962
##  2 V1            100     6.09   2.14     0.962
##  3 V2            100     2.85   2.14     0.962
##  4 V3            100     2.18   2.14     0.962
##  5 V4            100     1.56   2.14     0.962
##  6 V5            100     1.47   2.14     0.962
##  7 V6            100     2.49   2.14     0.962
##  8 V7            100     2.10   2.14     0.962
##  9 V8            100     2.00   2.14     0.962
## 10 V9            100     2.43   2.14     0.962
## # … with 91 more rows

Koefisien Regresi Gulud 100 peubah bebas

ridge_model <- glmnet(Y~.,data=data.train, lambda = model_ridge_mae$lambda.min , alpha = 0)
coef(ridge_model)
## 101 x 1 sparse Matrix of class "dgCMatrix"
##                        s0
## (Intercept) 24.7750652933
## V1           6.0902941043
## V2           2.8504647309
## V3           2.1834601417
## V4           1.5604453134
## V5           1.4725077477
## V6           2.4909738040
## V7           2.1035022367
## V8           2.0036750679
## V9           2.4340437526
## V10          2.4965383544
## V11          4.5781186160
## V12          5.4579345751
## V13          4.8194969934
## V14          4.5889824379
## V15          4.2684329898
## V16          2.6254067880
## V17          2.0788605870
## V18          2.0221560846
## V19          2.1879800249
## V20          2.2350270273
## V21          0.0955820852
## V22          0.3072644731
## V23          0.1268410575
## V24          0.1938285169
## V25         -0.1174629058
## V26         -0.2175829566
## V27         -0.0659545429
## V28          0.2724447788
## V29          0.1776463457
## V30         -0.2405335676
## V31          0.2046058812
## V32         -0.2333273341
## V33          0.0145762422
## V34         -0.1033452180
## V35          0.0056817506
## V36          0.1356507634
## V37         -0.1450060532
## V38          0.1988981618
## V39          0.1965531495
## V40         -0.0410273988
## V41         -0.2541430368
## V42         -0.0379625182
## V43          0.0161586591
## V44          0.0281705100
## V45         -0.0047549688
## V46         -0.1584863448
## V47         -0.2361323603
## V48         -0.1993707278
## V49          0.0967776534
## V50         -0.0260815812
## V51         -0.0006605362
## V52         -0.3115927631
## V53         -0.0614642395
## V54         -0.1760912228
## V55          0.1357329284
## V56         -0.2386363968
## V57          0.1647999632
## V58         -0.0496368006
## V59         -0.0502566624
## V60          0.0146863113
## V61         -0.1008597272
## V62         -0.1500305272
## V63          0.1797946299
## V64         -0.2328955923
## V65          0.2399883968
## V66         -0.1845368015
## V67         -0.3245129597
## V68         -0.0868339106
## V69          0.1398095569
## V70          0.1039560217
## V71          0.2637619527
## V72         -0.2311090872
## V73          0.0937568624
## V74          0.0648291665
## V75         -0.2253608734
## V76         -0.0889487324
## V77          0.0083857088
## V78         -0.0057006735
## V79         -0.1036971709
## V80         -0.1137633399
## V81          0.0911181008
## V82         -0.0226351738
## V83          0.0581261846
## V84          0.0480224579
## V85         -0.1019267193
## V86         -0.2390677117
## V87         -0.0088706719
## V88          0.2620332957
## V89          0.2256242723
## V90          0.0406777645
## V91          0.0878523779
## V92         -0.1024489259
## V93          0.0084632793
## V94         -0.2971656241
## V95         -0.1814240404
## V96          0.2922719728
## V97         -0.1923526069
## V98          0.2466244966
## V99          0.1714133004
## V100        -0.0854004578

Dapat dilihat bahwa Regresi Gulud menyusutkan koefisien regresi mendekati angka nol namun tidak tepat angka nol.

Prediksi Data

preds_ridge <- predict(ridge_model, data.test)
head(preds_ridge)
##             s0
## [1,]  88.45845
## [2,] 125.01836
## [3,]  64.18933
## [4,] 109.33488
## [5,]  76.54012
## [6,] 119.56958
plot(data.test$Y, preds_ridge, col = "red")

cor(data.test$Y, preds_ridge)
##             s0
## [1,] 0.9767666

Regresi LASSO

set.seed(123)

model_lasso_mae <- cv.glmnet(Y~.,data=data.train,alpha = 1,
                         type.measure="mae",
                         family="gaussian",
                         nfolds=10)
plot(model_lasso_mae)

Pada plot di atas, banyaknya peubah terpilih dilihat dari angka di atas plot, garis putus-putus adalah nilai lambda yang optimum berdasarkan nilai mae terkecil.

info_lasso_mae <- broom::glance(model_lasso_mae)
info_lasso_mae
## # A tibble: 1 × 3
##   lambda.min lambda.1se  nobs
##        <dbl>      <dbl> <int>
## 1      0.117      0.326   801
broom::tidy(model_lasso_mae)
## # A tibble: 78 × 6
##    lambda estimate std.error conf.low conf.high nzero
##     <dbl>    <dbl>     <dbl>    <dbl>     <dbl> <int>
##  1  21.4      20.7     0.546     20.1      21.2     0
##  2  19.5      19.5     0.542     18.9      20.0     1
##  3  17.8      18.4     0.519     17.8      18.9     1
##  4  16.2      17.4     0.495     16.9      17.9     1
##  5  14.8      16.5     0.468     16.0      17.0     1
##  6  13.5      15.7     0.444     15.3      16.2     1
##  7  12.3      15.1     0.418     14.6      15.5     1
##  8  11.2      14.5     0.395     14.1      14.9     1
##  9  10.2      14.0     0.377     13.6      14.3     2
## 10   9.28     13.5     0.361     13.1      13.9     2
## # … with 68 more rows

Nilai koefisien terpilih dapat dikeluarkan dengan bantuan package broom di bawah ini:

broom::tidy(model_lasso_mae$glmnet.fit)
## # A tibble: 2,807 × 5
##    term         step estimate lambda dev.ratio
##    <chr>       <dbl>    <dbl>  <dbl>     <dbl>
##  1 (Intercept)     1    101.   21.4      0    
##  2 (Intercept)     2     99.6  19.5      0.118
##  3 (Intercept)     3     98.0  17.8      0.217
##  4 (Intercept)     4     96.6  16.2      0.298
##  5 (Intercept)     5     95.3  14.8      0.366
##  6 (Intercept)     6     94.2  13.5      0.422
##  7 (Intercept)     7     93.1  12.3      0.469
##  8 (Intercept)     8     92.1  11.2      0.508
##  9 (Intercept)     9     91.2  10.2      0.540
## 10 (Intercept)    10     90.1   9.28     0.568
## # … with 2,797 more rows
coef_mae_min <- broom::tidy(model_lasso_mae$glmnet.fit) %>% filter(lambda==info_lasso_mae$lambda.min)
coef_mae_min
## # A tibble: 61 × 5
##    term         step estimate lambda dev.ratio
##    <chr>       <dbl>    <dbl>  <dbl>     <dbl>
##  1 (Intercept)    57    20.0   0.117     0.965
##  2 V1             57     4.12  0.117     0.965
##  3 V2             57     1.44  0.117     0.965
##  4 V3             57     2.29  0.117     0.965
##  5 V4             57     1.58  0.117     0.965
##  6 V5             57     1.60  0.117     0.965
##  7 V6             57     2.57  0.117     0.965
##  8 V7             57     2.17  0.117     0.965
##  9 V8             57     2.17  0.117     0.965
## 10 V9             57     2.48  0.117     0.965
## # … with 51 more rows

Membangun model regresi LASSO berdasarkan lambda minimum.

lasso_model <- glmnet(Y~.,data=data.train, lambda = model_lasso_mae$lambda.min , alpha = 1)
coef(lasso_model)
## 101 x 1 sparse Matrix of class "dgCMatrix"
##                      s0
## (Intercept) 19.97380798
## V1           4.19308071
## V2           1.44729181
## V3           2.29527815
## V4           1.58935373
## V5           1.60766628
## V6           2.57364065
## V7           2.16566532
## V8           2.16897237
## V9           2.48035475
## V10          2.53030923
## V11          6.64440111
## V12          3.29464141
## V13          5.15525710
## V14          4.85843103
## V15          4.65778779
## V16          2.77238931
## V17          2.13815791
## V18          2.19189022
## V19          2.40158290
## V20          2.33382974
## V21          0.07293040
## V22          0.17979096
## V23          .         
## V24          0.08059796
## V25          .         
## V26         -0.02756608
## V27         -0.05351490
## V28          0.09628456
## V29          .         
## V30         -0.12492904
## V31          0.10266443
## V32         -0.10248795
## V33          .         
## V34          .         
## V35          0.04812559
## V36          .         
## V37         -0.03242592
## V38          0.06488556
## V39          0.02120486
## V40          .         
## V41         -0.18273385
## V42          .         
## V43          .         
## V44          .         
## V45          .         
## V46          .         
## V47         -0.06845487
## V48         -0.07747962
## V49          .         
## V50          .         
## V51          .         
## V52         -0.18926730
## V53          .         
## V54          .         
## V55          0.01878968
## V56         -0.05274571
## V57          0.02900460
## V58          .         
## V59          .         
## V60          .         
## V61          .         
## V62         -0.07513937
## V63          0.05633126
## V64         -0.18135314
## V65          0.18556755
## V66         -0.02420491
## V67         -0.19144768
## V68          .         
## V69          .         
## V70          0.03788197
## V71          0.04872358
## V72         -0.03432072
## V73          .         
## V74          .         
## V75         -0.18040039
## V76         -0.09617952
## V77          .         
## V78          .         
## V79          .         
## V80          .         
## V81          .         
## V82          .         
## V83          .         
## V84          .         
## V85          .         
## V86         -0.17540473
## V87          .         
## V88          0.12283703
## V89          0.05446312
## V90          .         
## V91          .         
## V92          .         
## V93          .         
## V94         -0.22476610
## V95         -0.11947994
## V96          0.17795424
## V97         -0.11709762
## V98          0.17578021
## V99          0.04886036
## V100         .

Pada output di atas dapat dilihat bahwa Regresi LASSO mampu menyusutkan koefisien regresi hingga tepat angka nol, hal ini lah yang membuat regresi lasso mampu melakukan penyusutan koefisien regresi dan juga melakukan seleksi peubah.

Prediksi Data

preds_lasso <- predict(lasso_model, data.test)
head(preds_lasso)
##             s0
## [1,]  87.84042
## [2,] 123.99615
## [3,]  68.81116
## [4,] 108.82186
## [5,]  75.99800
## [6,] 120.26121

Melihat akurasi prediksi menggunakan plot

plot(data.test$Y, preds_lasso, col = "red")

cor(data.test$Y, preds_lasso)
##             s0
## [1,] 0.9790778

Regresi Jaring Elastis (Elastic Net)

set.seed(123)

model_elastic_mae <- cv.glmnet(Y~.,data=data.train,alpha = 0.5,
                         type.measure="mae",
                         family="gaussian",
                         nfolds=10)
plot(model_elastic_mae)

Pada plot di atas, banyaknya peubah terpilih dilihat dari angka di atas plot, garis putus-putus adalah nilai lambda yang optimum berdasarkan nilai mae terkecil.

info_elastic_mae <- broom::glance(model_elastic_mae)
info_elastic_mae
## # A tibble: 1 × 3
##   lambda.min lambda.1se  nobs
##        <dbl>      <dbl> <int>
## 1      0.213      0.594   801
broom::tidy(model_elastic_mae)
## # A tibble: 79 × 6
##    lambda estimate std.error conf.low conf.high nzero
##     <dbl>    <dbl>     <dbl>    <dbl>     <dbl> <int>
##  1   42.9     20.7     0.539     20.2      21.2     0
##  2   39.1     20.0     0.539     19.4      20.5     2
##  3   35.6     19.1     0.521     18.6      19.6     2
##  4   32.4     18.3     0.502     17.8      18.8     2
##  5   29.5     17.5     0.484     17.0      18.0     2
##  6   26.9     16.8     0.465     16.3      17.2     2
##  7   24.5     16.1     0.441     15.7      16.5     2
##  8   22.3     15.5     0.418     15.1      15.9     2
##  9   20.4     15.0     0.399     14.6      15.3     2
## 10   18.6     14.5     0.383     14.1      14.8     2
## # … with 69 more rows

Nilai koefisien terpilih dapat dikeluarkan dengan bantuan package broom di bawah ini:

broom::tidy(model_elastic_mae$glmnet.fit)
## # A tibble: 2,923 × 5
##    term         step estimate lambda dev.ratio
##    <chr>       <dbl>    <dbl>  <dbl>     <dbl>
##  1 (Intercept)     1    101.    42.9    0     
##  2 (Intercept)     2    100.    39.1    0.0729
##  3 (Intercept)     3     98.6   35.6    0.152 
##  4 (Intercept)     4     97.1   32.4    0.223 
##  5 (Intercept)     5     95.7   29.5    0.286 
##  6 (Intercept)     6     94.4   26.9    0.342 
##  7 (Intercept)     7     93.1   24.5    0.391 
##  8 (Intercept)     8     91.9   22.3    0.434 
##  9 (Intercept)     9     90.8   20.4    0.471 
## 10 (Intercept)    10     89.8   18.6    0.504 
## # … with 2,913 more rows
elastic_mae_min <- broom::tidy(model_elastic_mae$glmnet.fit) %>% filter(lambda==info_elastic_mae$lambda.min)
elastic_mae_min
## # A tibble: 66 × 5
##    term         step estimate lambda dev.ratio
##    <chr>       <dbl>    <dbl>  <dbl>     <dbl>
##  1 (Intercept)    58    20.1   0.213     0.965
##  2 V1             58     4.44  0.213     0.965
##  3 V2             58     1.58  0.213     0.965
##  4 V3             58     2.30  0.213     0.965
##  5 V4             58     1.61  0.213     0.965
##  6 V5             58     1.62  0.213     0.965
##  7 V6             58     2.57  0.213     0.965
##  8 V7             58     2.17  0.213     0.965
##  9 V8             58     2.16  0.213     0.965
## 10 V9             58     2.48  0.213     0.965
## # … with 56 more rows

Menghitung nilai MAE pada model jaring elastis dengan memasukkan semua nilai aplha yang mungkin.

for (i in 0:10) {
    assign(paste("fit", i, sep=""), cv.glmnet(Y~.,data=data.train, type.measure="mae", alpha=i/10))
}

yhat0 <- predict(fit0, s=fit0$lambda.1se, data.test)
yhat1 <- predict(fit1, s=fit1$lambda.1se, data.test)
yhat2 <- predict(fit2, s=fit2$lambda.1se, data.test)
yhat3 <- predict(fit3, s=fit3$lambda.1se, data.test)
yhat4 <- predict(fit4, s=fit4$lambda.1se, data.test)
yhat5 <- predict(fit5, s=fit5$lambda.1se, data.test)
yhat6 <- predict(fit6, s=fit6$lambda.1se, data.test)
yhat7 <- predict(fit7, s=fit7$lambda.1se, data.test)
yhat8 <- predict(fit8, s=fit8$lambda.1se, data.test)
yhat9 <- predict(fit9, s=fit9$lambda.1se, data.test)
yhat10 <- predict(fit10, s=fit10$lambda.1se, data.test)

mae0 <- mean(abs(data.test$Y-yhat0))
mae1 <- mean(abs(data.test$Y-yhat1))
mae2 <- mean(abs(data.test$Y-yhat2))
mae3 <- mean(abs(data.test$Y-yhat3))
mae4 <- mean(abs(data.test$Y-yhat4))
mae5 <- mean(abs(data.test$Y-yhat5))
mae6 <- mean(abs(data.test$Y-yhat6))
mae7 <- mean(abs(data.test$Y-yhat7))
mae8 <- mean(abs(data.test$Y-yhat8))
mae9 <- mean(abs(data.test$Y-yhat9))
mae10 <- mean(abs(data.test$Y-yhat10))

alpha<-seq(0,1,by=0.1)
mae<-c(mae0,mae1,mae2,mae3,mae4,mae5,mae6,mae7,mae8,mae9,mae10)
cbind(alpha,mae)
##       alpha      mae
##  [1,]   0.0 4.467550
##  [2,]   0.1 4.282022
##  [3,]   0.2 4.239542
##  [4,]   0.3 4.209632
##  [5,]   0.4 4.197505
##  [6,]   0.5 4.183044
##  [7,]   0.6 4.175569
##  [8,]   0.7 4.169099
##  [9,]   0.8 4.164666
## [10,]   0.9 4.161968
## [11,]   1.0 4.167746

Pada output di atas dapat dilihat bahwa nilai MAE terkecil saat alpha = 0.9

Membangun model regresi jaring elastis berdasarkan lambda minimum dan alpha = 0.9.

elastic_model <- glmnet(Y~.,data=data.train, lambda = model_elastic_mae$lambda.min , alpha = 0.9)
coef(elastic_model)
## 101 x 1 sparse Matrix of class "dgCMatrix"
##                       s0
## (Intercept) 21.284625617
## V1           4.147311429
## V2           1.375840488
## V3           2.206238490
## V4           1.250076947
## V5           1.260777178
## V6           2.497562155
## V7           2.104851706
## V8           2.117571067
## V9           2.406053177
## V10          2.467747130
## V11          6.663669372
## V12          3.210963970
## V13          5.058775410
## V14          4.767515293
## V15          4.593094070
## V16          2.724387641
## V17          2.075008825
## V18          2.095395170
## V19          2.354805141
## V20          2.249698572
## V21          0.028535003
## V22          0.093376963
## V23          .          
## V24          .          
## V25          .          
## V26          .          
## V27         -0.011289843
## V28          0.012420428
## V29          .          
## V30         -0.060751167
## V31          0.029754649
## V32         -0.032706152
## V33          .          
## V34          .          
## V35          .          
## V36          .          
## V37          .          
## V38          .          
## V39          .          
## V40          .          
## V41         -0.131624043
## V42          .          
## V43          .          
## V44          .          
## V45          .          
## V46          .          
## V47          .          
## V48         -0.022493227
## V49          .          
## V50          .          
## V51          .          
## V52         -0.129234224
## V53          .          
## V54          .          
## V55          .          
## V56          .          
## V57          .          
## V58          .          
## V59          .          
## V60          .          
## V61          .          
## V62         -0.002478461
## V63          0.013349235
## V64         -0.098558786
## V65          0.130155502
## V66          .          
## V67         -0.117740561
## V68          .          
## V69          .          
## V70          .          
## V71          .          
## V72          .          
## V73          .          
## V74          .          
## V75         -0.117341232
## V76         -0.009892407
## V77          .          
## V78          .          
## V79          .          
## V80          .          
## V81          .          
## V82          .          
## V83          .          
## V84          .          
## V85          .          
## V86         -0.105454748
## V87          .          
## V88          0.055079397
## V89          .          
## V90          .          
## V91          .          
## V92          .          
## V93          .          
## V94         -0.136413385
## V95         -0.034605972
## V96          0.107398797
## V97         -0.054747066
## V98          0.116169843
## V99          .          
## V100         .

Pada output di atas dapat dilihat bahwa Regresi LASSO mampu menyusutkan koefisien regresi hingga tepat angka nol, hal ini lah yang membuat regresi lasso mampu melakukan penyusutan koefisien regresi dan juga melakukan seleksi peubah.

Prediksi Data

preds_elastic <- predict(elastic_model, data.test)
head(preds_elastic)
##             s0
## [1,]  88.43970
## [2,] 123.68908
## [3,]  70.09376
## [4,] 108.67442
## [5,]  76.28918
## [6,] 119.98140

Melihat akurasi prediksi menggunakan plot

plot(data.test$Y, preds_elastic, col = "red")

cor(data.test$Y, preds_elastic)
##             s0
## [1,] 0.9794699

Mengevaluasi Model

Evaluasi model dilakukan dengan membandingkan nilai MAE antara kelima model yang sudah dibangun.

Membuat fungsi untuk menghitung nilai MAE :

mae <- function(response,pred){
  mean(abs(response-pred),na.rm = TRUE)
}

Tabel perbandingan model

mae_regresi_linear <- mae(data.test$Y, preds_linear) 
mae_best_subset <- mae(data.test$Y,preds_best)
mae_ridge_fnl <- mae(data.test$Y,preds_ridge)
mae_lasso_fnl <- mae(data.test$Y,preds_lasso)
mae_elastic_fnl <- mae(data.test$Y,preds_elastic)

komparasi_mae <- as.data.frame(c(mae_regresi_linear,mae_best_subset,mae_ridge_fnl,mae_lasso_fnl,mae_elastic_fnl))
colnames(komparasi_mae)
## [1] "c(mae_regresi_linear, mae_best_subset, mae_ridge_fnl, mae_lasso_fnl, mae_elastic_fnl)"
names(komparasi_mae)[names(komparasi_mae) == "c(mae_regresi_linear, mae_best_subset,mae_ridge_fnl, mae_lasso_fnl, mae_elastic_fnl)"] <- "Nilai MAE"

Model<-c("Linear","Best-Subset","Ridge", "Lasso", "Elastic-Net (0.9)")
komparasi_mae<- cbind(Model, komparasi_mae)
komparasi_mae
##               Model
## 1            Linear
## 2       Best-Subset
## 3             Ridge
## 4             Lasso
## 5 Elastic-Net (0.9)
##   c(mae_regresi_linear, mae_best_subset, mae_ridge_fnl, mae_lasso_fnl, mae_elastic_fnl)
## 1                                                                              4.486339
## 2                                                                              4.307232
## 3                                                                              4.422310
## 4                                                                              4.264795
## 5                                                                              4.190336

Kesimpulan

  1. Berdasarkan tabel perbandingan performa model berdasarkan nilai MAE di atas, dapat disimpulkan bahwa model regresi terbaik adalah model regresi jaring elastis (Elastic Net) dengan nilai alpha = 0.9 dan nilai MAE = 4.1903

Referensi

  1. Hastie T. et al (2008). The Element of Statistical Learning. https://link.springer.com/book/10.1007/978-0-387-84858-7.
  2. Hastie T. (2013, May 9). glmnet: Lasso and Elastic-net Regularization in R. Revolutions. https://blog.revolutionanalytics.com/2013/05/hastie-glmnet.html
  3. Haqqoni M.G. Asumsi Regresi. https://rpubs.com/mghozyah/linear-regression-simulation
  4. Kelly J.M (2022, 23 March). Ridge, Lasso, and Elastic Net Regression Using glmnet. https://rpubs.com/jmkelly91/881590
  5. Soleh M.A (2013, April). LASSO : Solusi Alternatif Seleksi Peubah dan Penyusutan Koefisien Model Regresi Linier. https://journal.ipb.ac.id/index.php/statistika/article/view/12314.
  6. Tibshirani R. (1996). Regression Shrinkage and Selection via The LASSO. https://rss.onlinelibrary.wiley.com/doi/10.1111/j.2517-6161.1996.tb02080.x