Exploration and Smoothing of Time Series Data

Group 5

P2-MPDW 2022

Members:

Hanung Safrizal (G1401201050)
Dhea Puspita Adinda (G1401201090)
Maulana Ahsan Fadillah (G1401201062)
Muhammad Irsyad Robbani (G1401201064)
Muhammad Raziv Zulfikar (G1401201083)

Introduction

Background

Rice is the staple food for most of the Indonesian population, and rice consumption increases every year as the population grows. This dependence on rice becomes a problem once supply can no longer meet demand, so meeting demand and stabilizing the rice price remain relevant issues over time. With such a large population, the rice problem is not only one of availability and price but also of the diversity of quality and product types (BPS 2009). In early 2018 the price of rice rose; if left unchecked, such increases can trigger inflation, slowing national economic growth and causing other negative effects. Formulating inflation-control policy therefore requires data and information on projected market conditions, so modeling rice prices in Indonesia is very much needed.

Objective

  1. To explore and forecast the monthly time series of premium rice prices in Indonesia for 2013-2022.

Literature Review

  1. Single Moving Average Method. The single moving average method is a forecasting method that uses a number of recent actual observations to generate a forecast of future demand (Naufal and Adrean 2017).
  2. Double Moving Average Method. The double moving average method is a variation of the moving-average procedure designed to handle a trend component better (Layakana and Iskandar 2020).
  3. Single Exponential Smoothing Method. Single exponential smoothing applies weights that decrease exponentially with the age of the observations, so more recent values receive relatively larger weights than older ones; the result is an exponentially weighted moving average of all previous observations (Hartono et al. 2012).
  4. Double Exponential Smoothing Method. Double exponential smoothing is used to forecast data that exhibit an upward trend (Hanief and Purwanto 2017).
  5. Winter Smoothing. Winters' linear exponential smoothing is used when the data contain a seasonal component. The method is based on three smoothing equations: level, trend, and seasonality. Holt-Winters comes in two variants, depending on the nature of the seasonality: additive or multiplicative. The core recursions of the two simplest methods above are sketched below.
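
As a compact reference, the standard textbook forms of the two simplest methods are given below (a sketch of the usual definitions, not quoted from the sources above), with $Y_t$ the observed series, $n$ the window length, $0 < \alpha < 1$ the smoothing constant, and $F$ the forecast:

$$ S_t = \frac{1}{n}\sum_{i=0}^{n-1} Y_{t-i}, \qquad F_{t+1} = S_t \qquad \text{(single moving average)} $$

$$ S_t = \alpha Y_t + (1-\alpha)S_{t-1}, \qquad F_{t+m} = S_t \qquad \text{(single exponential smoothing)} $$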

Packages and Data

1. Loading the Libraries

The packages used in the exploration and smoothing analysis of the time series data are as follows:

library(forecast)
## Warning: package 'forecast' was built under R version 4.1.3
## Registered S3 method overwritten by 'quantmod':
##   method            from
##   as.zoo.data.frame zoo
library(TTR)
## Warning: package 'TTR' was built under R version 4.1.3
library(TSA)
## Warning: package 'TSA' was built under R version 4.1.3
## Registered S3 methods overwritten by 'TSA':
##   method       from    
##   fitted.Arima forecast
##   plot.Arima   forecast
## 
## Attaching package: 'TSA'
## The following objects are masked from 'package:stats':
## 
##     acf, arima
## The following object is masked from 'package:utils':
## 
##     tar
library(readxl)
## Warning: package 'readxl' was built under R version 4.1.3
library(tidyverse)
## Warning: package 'tidyverse' was built under R version 4.1.3
## -- Attaching packages --------------------------------------- tidyverse 1.3.2 --
## v ggplot2 3.3.6     v purrr   0.3.4
## v tibble  3.1.8     v dplyr   1.0.9
## v tidyr   1.2.0     v stringr 1.4.1
## v readr   2.1.2     v forcats 0.5.2
## Warning: package 'ggplot2' was built under R version 4.1.3
## Warning: package 'tibble' was built under R version 4.1.3
## Warning: package 'tidyr' was built under R version 4.1.3
## Warning: package 'readr' was built under R version 4.1.2
## Warning: package 'dplyr' was built under R version 4.1.3
## Warning: package 'stringr' was built under R version 4.1.3
## Warning: package 'forcats' was built under R version 4.1.3
## -- Conflicts ------------------------------------------ tidyverse_conflicts() --
## x dplyr::filter() masks stats::filter()
## x dplyr::lag()    masks stats::lag()
## x readr::spec()   masks TSA::spec()

2. Loading the Data

The data used for this exploration and smoothing assignment are monthly premium rice prices in Indonesia from 2013 to 2022, obtained from the BPS Indonesia website (https://www.bps.go.id/indicator/36/500/1/harga-beras-di-penggilingan-menurut-kualitas.html). The price dataset contains three quality variables (premium, medium, and luar kualitas), but only the premium series, with 115 observations, is used in this exploration.

setwd("D:/SEMESTER 5/METODE PERAMALAN DERET WAKTU")
data <- read_xlsx("Harga Gabah.xlsx")

3. Splitting the Data

The data are split into two parts, a training set and a testing set. The training set is used to choose and fit a model appropriate for the data, while the testing set is used to evaluate the performance of the fitted model. The split is 80% : 20%, giving 92 training observations (observations 1 to 92) and 23 testing observations (observations 93 to 115).

train <- data[1:92, 3]
test <- data[93:115, 3]

data.gabah <- ts(data[,2])
data.premium <- ts(data[,3])
train.ts <- ts(train)
test.ts <- ts(test)
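
A quick check that the split matches the stated 80:20 proportions (a small sketch using the objects created above):

# Should return train = 92, test = 23, total = 115
c(train = nrow(train), test = nrow(test), total = nrow(data))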

Exploratory Data Analysis

Time Series Plot

Time Series Plot of Premium Rice Prices

data.premium <- ts(data$Premium)
ts.plot(data.premium, main = "Harga Beras Premium Tahun 2013 - 2022", ylab = "Harga", lwd = 1.5)
points(data.premium)
legend("topleft", c("Premium"), cex = 0.7,
       col = c("black", "red", "blue"), lty = 1)

plot(data$`Harga Gabah`, data$Premium, pch = 20, col = "blue", main = "Harga Beras Premium vs Harga Gabah Kering")
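
The correlation quoted in the interpretation below can be reproduced directly (a sketch using the same columns as the plot call above; as.numeric() is included only as a precaution in case the columns were read as text):

# Pearson correlation between the dried grain (gabah) price and the premium rice price
cor(as.numeric(data$`Harga Gabah`), as.numeric(data$Premium))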

Based on the scatter plot above, the premium rice price and the dried grain (gabah) price show a fairly strong positive relationship, with a correlation of about 0.817: when the gabah price rises, the premium rice price tends to rise as well.

Single Moving Average (SMA)

SMA Smoothing with n = 4

data.sma<-SMA(data.premium, n=4)
data.sma
## Time Series:
## Start = 1 
## End = 115 
## Frequency = 1 
##   [1]        NA        NA        NA  7641.970  7578.912  7522.652  7584.505
##   [8]  7669.647  7719.840  7794.297  7818.372  7872.230  7987.885  8102.180
##  [15]  8170.402  8156.205  8106.365  8072.450  8081.307  8183.692  8258.962
##  [22]  8316.285  8397.995  8570.225  8923.462  9270.422  9524.597  9551.942
##  [29]  9298.530  9081.195  8924.512  8924.265  9107.827  9242.740  9397.527
##  [36]  9531.725  9601.575  9683.997  9685.892  9551.942  9416.560  9308.970
##  [43]  9259.602  9319.485  9301.647  9246.207  9216.955  9210.525  9290.722
##  [50]  9359.675  9392.467  9388.222  9389.427  9398.400  9397.187  9425.222
##  [57]  9433.822  9448.420  9487.187  9593.100  9812.930 10032.697 10121.150
##  [64] 10037.292  9830.785  9604.862  9511.705  9494.982  9507.020  9548.832
##  [71]  9611.592  9701.592  9836.355  9927.007  9937.897  9849.550  9687.292
##  [78]  9564.192  9490.397  9506.727  9539.812  9575.727  9631.430  9708.495
##  [85]  9818.062  9923.470 10008.537 10053.520 10002.012  9961.550  9923.870
##  [92]  9909.985  9921.107  9894.595  9840.292  9796.595  9773.862  9763.662
##  [99]  9736.780  9677.255  9638.977  9580.242  9528.902  9516.302  9473.422
## [106]  9451.500  9485.920  9529.215  9621.382  9715.735  9777.570  9753.622
## [113]  9675.722  9593.352  9553.837
data.ramal<-c(NA,data.sma)
data.ramal # forecast one period ahead
##   [1]        NA        NA        NA        NA  7641.970  7578.912  7522.652
##   [8]  7584.505  7669.647  7719.840  7794.297  7818.372  7872.230  7987.885
##  [15]  8102.180  8170.402  8156.205  8106.365  8072.450  8081.307  8183.692
##  [22]  8258.962  8316.285  8397.995  8570.225  8923.462  9270.422  9524.597
##  [29]  9551.942  9298.530  9081.195  8924.512  8924.265  9107.827  9242.740
##  [36]  9397.527  9531.725  9601.575  9683.997  9685.892  9551.942  9416.560
##  [43]  9308.970  9259.602  9319.485  9301.647  9246.207  9216.955  9210.525
##  [50]  9290.722  9359.675  9392.467  9388.222  9389.427  9398.400  9397.187
##  [57]  9425.222  9433.822  9448.420  9487.187  9593.100  9812.930 10032.697
##  [64] 10121.150 10037.292  9830.785  9604.862  9511.705  9494.982  9507.020
##  [71]  9548.832  9611.592  9701.592  9836.355  9927.007  9937.897  9849.550
##  [78]  9687.292  9564.192  9490.397  9506.727  9539.812  9575.727  9631.430
##  [85]  9708.495  9818.062  9923.470 10008.537 10053.520 10002.012  9961.550
##  [92]  9923.870  9909.985  9921.107  9894.595  9840.292  9796.595  9773.862
##  [99]  9763.662  9736.780  9677.255  9638.977  9580.242  9528.902  9516.302
## [106]  9473.422  9451.500  9485.920  9529.215  9621.382  9715.735  9777.570
## [113]  9753.622  9675.722  9593.352  9553.837
data.gab<-cbind(aktual=c(data.premium,rep(NA,5)),pemulusan=c(data.sma,rep(NA,5)),ramalan=c(data.ramal,rep(data.ramal[length(data.ramal)],4)))
data.gab # forecast five periods ahead
##        aktual     pemulusan          ramalan           
##   [1,] "7797.63"  NA                 NA                
##   [2,] "7773.26"  NA                 NA                
##   [3,] "7576.27"  NA                 NA                
##   [4,] "7420.72"  "7641.97"          NA                
##   [5,] "7545.40"  "7578.9125"        "7641.97"         
##   [6,] "7548.22"  "7522.6525"        "7578.9125"       
##   [7,] "7823.68"  "7584.505"         "7522.6525"       
##   [8,] "7761.29"  "7669.6475"        "7584.505"        
##   [9,] "7746.17"  "7719.84"          "7669.6475"       
##  [10,] "7846.05"  "7794.2975"        "7719.84"         
##  [11,] "7919.98"  "7818.3725"        "7794.2975"       
##  [12,] "7976.72"  "7872.23"          "7818.3725"       
##  [13,] "8208.79"  "7987.885"         "7872.23"         
##  [14,] "8303.23"  "8102.18"          "7987.885"        
##  [15,] "8192.87"  "8170.4025"        "8102.18"         
##  [16,] "7919.93"  "8156.205"         "8170.4025"       
##  [17,] "8009.43"  "8106.365"         "8156.205"        
##  [18,] "8167.57"  "8072.45"          "8106.365"        
##  [19,] "8228.30"  "8081.3075"        "8072.45"         
##  [20,] "8329.47"  "8183.6925"        "8081.3075"       
##  [21,] "8310.51"  "8258.9625"        "8183.6925"       
##  [22,] "8396.86"  "8316.285"         "8258.9625"       
##  [23,] "8555.14"  "8397.995"         "8316.285"        
##  [24,] "9018.39"  "8570.225"         "8397.995"        
##  [25,] "9723.46"  "8923.4625"        "8570.225"        
##  [26,] "9784.70"  "9270.4225"        "8923.4625"       
##  [27,] "9571.84"  "9524.5975"        "9270.4225"       
##  [28,] "9127.77"  "9551.9425"        "9524.5975"       
##  [29,] "8709.81"  "9298.53"          "9551.9425"       
##  [30,] "8915.36"  "9081.195"         "9298.53"         
##  [31,] "8945.11"  "8924.5125"        "9081.195"        
##  [32,] "9126.78"  "8924.265"         "8924.5125"       
##  [33,] "9444.06"  "9107.8275"        "8924.265"        
##  [34,] "9455.01"  "9242.74"          "9107.8275"       
##  [35,] "9564.26"  "9397.5275"        "9242.74"         
##  [36,] "9663.57"  "9531.725"         "9397.5275"       
##  [37,] "9723.46"  "9601.575"         "9531.725"        
##  [38,] "9784.70"  "9683.9975"        "9601.575"        
##  [39,] "9571.84"  "9685.8925"        "9683.9975"       
##  [40,] "9127.77"  "9551.9425"        "9685.8925"       
##  [41,] "9181.93"  "9416.56"          "9551.9425"       
##  [42,] "9354.34"  "9308.97"          "9416.56"         
##  [43,] "9374.37"  "9259.6025"        "9308.97"         
##  [44,] "9367.30"  "9319.485"         "9259.6025"       
##  [45,] "9110.58"  "9301.6475"        "9319.485"        
##  [46,] "9132.58"  "9246.2075"        "9301.6475"       
##  [47,] "9257.36"  "9216.955"         "9246.2075"       
##  [48,] "9341.58"  "9210.525"         "9216.955"        
##  [49,] "9431.37"  "9290.7225"        "9210.525"        
##  [50,] "9408.39"  "9359.675"         "9290.7225"       
##  [51,] "9388.53"  "9392.4675"        "9359.675"        
##  [52,] "9324.60"  "9388.2225"        "9392.4675"       
##  [53,] "9436.19"  "9389.4275"        "9388.2225"       
##  [54,] "9444.28"  "9398.4"           "9389.4275"       
##  [55,] "9383.68"  "9397.1875"        "9398.4"          
##  [56,] "9436.74"  "9425.2225"        "9397.1875"       
##  [57,] "9470.59"  "9433.82249999999" "9425.2225"       
##  [58,] "9502.67"  "9448.41999999999" "9433.82249999999"
##  [59,] "9538.75"  "9487.18749999999" "9448.41999999999"
##  [60,] "9860.39"  "9593.09999999999" "9487.18749999999"
##  [61,] "10349.91" "9812.93"          "9593.09999999999"
##  [62,] "10381.74" "10032.6975"       "9812.93"         
##  [63,] "9892.56"  "10121.15"         "10032.6975"      
##  [64,] "9524.96"  "10037.2925"       "10121.15"        
##  [65,] "9523.88"  "9830.785"         "10037.2925"      
##  [66,] "9478.05"  "9604.8625"        "9830.785"        
##  [67,] "9519.93"  "9511.705"         "9604.8625"       
##  [68,] "9458.07"  "9494.9825"        "9511.705"        
##  [69,] "9572.03"  "9507.02"          "9494.9825"       
##  [70,] "9645.30"  "9548.8325"        "9507.02"         
##  [71,] "9770.97"  "9611.5925"        "9548.8325"       
##  [72,] "9818.07"  "9701.5925"        "9611.5925"       
##  [73,] "10111.08" "9836.355"         "9701.5925"       
##  [74,] "10007.91" "9927.0075"        "9836.355"        
##  [75,] "9814.53"  "9937.8975"        "9927.0075"       
##  [76,] "9464.68"  "9849.55"          "9937.8975"       
##  [77,] "9462.05"  "9687.2925"        "9849.55"         
##  [78,] "9515.51"  "9564.1925"        "9687.2925"       
##  [79,] "9519.35"  "9490.3975"        "9564.1925"       
##  [80,] "9530.00"  "9506.7275"        "9490.3975"       
##  [81,] "9594.39"  "9539.8125"        "9506.7275"       
##  [82,] "9659.17"  "9575.7275"        "9539.8125"       
##  [83,] "9742.16"  "9631.42999999999" "9575.7275"       
##  [84,] "9838.26"  "9708.495"         "9631.42999999999"
##  [85,] "10032.66" "9818.0625"        "9708.495"        
##  [86,] "10080.80" "9923.47"          "9818.0625"       
##  [87,] "10082.43" "10008.5375"       "9923.47"         
##  [88,] "10018.19" "10053.52"         "10008.5375"      
##  [89,] "9826.63"  "10002.0125"       "10053.52"        
##  [90,] "9918.95"  "9961.55"          "10002.0125"      
##  [91,] "9931.71"  "9923.87"          "9961.55"         
##  [92,] "9962.65"  "9909.985"         "9923.87"         
##  [93,] "9871.12"  "9921.1075"        "9909.985"        
##  [94,] "9812.90"  "9894.595"         "9921.1075"       
##  [95,] "9714.50"  "9840.2925"        "9894.595"        
##  [96,] "9787.86"  "9796.595"         "9840.2925"       
##  [97,] "9780.19"  "9773.8625"        "9796.595"        
##  [98,] "9772.10"  "9763.6625"        "9773.8625"       
##  [99,] "9606.97"  "9736.78"          "9763.6625"       
## [100,] "9549.76"  "9677.255"         "9736.78"         
## [101,] "9627.08"  "9638.9775"        "9677.255"        
## [102,] "9537.16"  "9580.2425"        "9638.9775"       
## [103,] "9401.61"  "9528.9025"        "9580.2425"       
## [104,] "9499.36"  "9516.3025"        "9528.9025"       
## [105,] "9455.56"  "9473.4225"        "9516.3025"       
## [106,] "9449.47"  "9451.5"           "9473.4225"       
## [107,] "9539.29"  "9485.92"          "9451.5"          
## [108,] "9672.54"  "9529.215"         "9485.92"         
## [109,] "9824.23"  "9621.3825"        "9529.215"        
## [110,] "9826.88"  "9715.735"         "9621.3825"       
## [111,] "9786.63"  "9777.56999999999" "9715.735"        
## [112,] "9576.75"  "9753.62249999999" "9777.56999999999"
## [113,] "9512.63"  "9675.72249999999" "9753.62249999999"
## [114,] "9497.40"  "9593.35249999999" "9675.72249999999"
## [115,] "9628.57"  "9553.83749999999" "9593.35249999999"
## [116,] NA         NA                 "9553.83749999999"
## [117,] NA         NA                 "9553.83749999999"
## [118,] NA         NA                 "9553.83749999999"
## [119,] NA         NA                 "9553.83749999999"
## [120,] NA         NA                 "9553.83749999999"

With the SMA method (n = 4), the forecast for each of the next five periods is constant at about 9553.84.

Time series plot

ts.plot(data.gab[,1], xlab="Time Period", ylab="Harga", main= "SMA N=4 Data Premium")
points(data.gab[,1])
lines(data.gab[,2],col="green",lwd=2)
lines(data.gab[,3],col="red",lwd=2)
legend("topleft",c("data aktual","data pemulusan","data peramalan"), lty=8, col=c("black","green","red"), cex=0.8)

Computing the accuracy measures

data.premium=as.numeric(data.premium)
error.sma = data.premium-data.ramal[1:length(data.premium)]
SSE.sma = sum(error.sma[5:length(data.premium)]^2)
MSE.sma = mean(error.sma[5:length(data.premium)]^2)
MAPE.sma = mean(abs((error.sma[5:length(data.premium)]/data.premium[5:length(data.premium)])*100))

akurasi.sma <- matrix(c(SSE.sma, MSE.sma, MAPE.sma))
row.names(akurasi.sma)<- c("SSE", "MSE", "MAPE")
colnames(akurasi.sma) <- c("Akurasi m = 4")
akurasi.sma
##      Akurasi m = 4
## SSE   8.957821e+06
## MSE   8.070109e+04
## MAPE  2.204061e+00
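
For reference, the three accuracy measures computed above are the standard ones; with forecast error $e_t = Y_t - F_t$ and $T$ the number of errors used:

$$ \mathrm{SSE} = \sum_{t} e_t^2, \qquad \mathrm{MSE} = \frac{1}{T}\sum_{t} e_t^2, \qquad \mathrm{MAPE} = \frac{100}{T}\sum_{t}\left|\frac{e_t}{Y_t}\right| $$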

Double Moving Average (DMA)

DMA Smoothing with n = 4

dma <- SMA(data.sma, n = 4)      # second moving average (moving average of the SMA series)
At <- 2*data.sma - dma           # level component
Bt <- 2/(4-1)*(data.sma - dma)   # trend component
data.dma<- At+Bt                 # one-step-ahead smoothed value
data.ramal2<- c(NA, data.dma)

t = 1:5
f = c()

for (i in t) {
  f[i] = At[length(At)] + Bt[length(Bt)]*(i)
}

data.gab2 <- cbind(aktual = c(data.premium,rep(NA,5)), pemulusan1 = c(data.sma,rep(NA,5)),pemulusan2 = c(data.dma, rep(NA,5)),At = c(At, rep(NA,5)), Bt = c(Bt,rep(NA,5)),ramalan = c(data.ramal2, f[-1]))
data.gab2
##          aktual pemulusan1 pemulusan2        At          Bt   ramalan
##   [1,]  7797.63         NA         NA        NA          NA        NA
##   [2,]  7773.26         NA         NA        NA          NA        NA
##   [3,]  7576.27         NA         NA        NA          NA        NA
##   [4,]  7420.72   7641.970         NA        NA          NA        NA
##   [5,]  7545.40   7578.912         NA        NA          NA        NA
##   [6,]  7548.22   7522.652         NA        NA          NA        NA
##   [7,]  7823.68   7584.505   7588.663  7587.000    1.663333        NA
##   [8,]  7761.29   7669.647   7804.178  7750.366   53.812083  7588.663
##   [9,]  7746.17   7719.840   7879.305  7815.519   63.785833  7804.178
##  [10,]  7846.05   7794.297   7964.672  7896.522   68.150000  7879.305
##  [11,]  7919.98   7818.372   7931.428  7886.206   45.222083  7964.672
##  [12,]  7976.72   7872.230   7990.638  7943.275   47.363333  7931.428
##  [13,]  8208.79   7987.885   8187.366  8107.574   79.792500  7990.638
##  [14,]  8303.23   8102.180   8363.869  8259.193  104.675417  8187.366
##  [15,]  8192.87   8170.402   8399.116  8307.631   91.485417  8363.869
##  [16,]  7919.93   8156.205   8242.933  8208.242   34.691250  8399.116
##  [17,]  8009.43   8106.365   8060.660  8078.942  -18.282083  8242.933
##  [18,]  8167.57   8072.450   7982.607  8018.544  -35.937083  8060.660
##  [19,]  8228.30   8081.307   8043.350  8058.533  -15.182917  7982.607
##  [20,]  8329.47   8183.692   8304.924  8256.431   48.492500  8043.350
##  [21,]  8310.51   8258.962   8442.061  8368.822   73.239583  8304.924
##  [22,]  8396.86   8316.285   8493.324  8422.508   70.815417  8442.061
##  [23,]  8555.14   8397.995   8579.264  8506.756   72.507500  8493.324
##  [24,]  9018.39   8570.225   8877.489  8754.583  122.905417  8579.264
##  [25,]  9723.46   8923.462   9542.580  9294.933  247.647083  8877.489
##  [26,]  9784.70   9270.422  10070.250  9750.319  319.930833  9542.580
##  [27,]  9571.84   9524.597  10278.632  9977.018  301.613750 10070.250
##  [28,]  9127.77   9551.942   9942.503  9786.279  156.224167 10278.632
##  [29,]  8709.81   9298.530   9110.458  9185.687  -75.228750  9942.503
##  [30,]  8915.36   9081.195   8609.743  8798.324 -188.580833  9110.458
##  [31,]  8945.11   8924.512   8441.958  8634.980 -193.021667  8609.743
##  [32,]  9126.78   8924.265   8702.831  8791.404  -88.573750  8441.958
##  [33,]  9444.06   9107.827   9271.790  9206.205   65.585000  8702.831
##  [34,]  9455.01   9242.740   9564.246  9435.644  128.602500  9271.790
##  [35,]  9564.26   9397.527   9779.923  9626.965  152.958333  9564.246
##  [36,]  9663.57   9531.725   9884.675  9743.495  141.180000  9779.923
##  [37,]  9723.46   9601.575   9865.214  9759.758  105.455417  9884.675
##  [38,]  9784.70   9683.997   9901.150  9814.289   86.860833  9865.214
##  [39,]  9571.84   9685.892   9786.051  9745.987   40.063333  9901.150
##  [40,]  9127.77   9551.942   9420.427  9473.033  -52.606250  9786.051
##  [41,]  9181.93   9416.560   9136.496  9248.522 -112.025417  9420.427
##  [42,]  9354.34   9308.970   9005.851  9127.099 -121.247500  9136.496
##  [43,]  9374.37   9259.602   9051.825  9134.936  -83.110833  9005.851
##  [44,]  9367.30   9319.485   9308.369  9312.816   -4.446250  9051.825
##  [45,]  9110.58   9301.647   9308.683  9305.869    2.814167  9308.369
##  [46,]  9132.58   9246.207   9186.994  9210.679  -23.685417  9308.683
##  [47,]  9257.36   9216.955   9126.757  9162.836  -36.079167  9186.994
##  [48,]  9341.58   9210.525   9155.010  9177.216  -22.205833  9126.757
##  [49,]  9431.37   9290.722   9373.422  9340.342   33.080000  9155.010
##  [50,]  9408.39   9359.675   9510.018  9449.881   60.137083  9373.422
##  [51,]  9388.53   9392.467   9524.334  9471.587   52.746667  9510.018
##  [52,]  9324.60   9388.222   9438.974  9418.673   20.300417  9524.334
##  [53,]  9436.19   9389.427   9401.060  9396.407    4.652917  9438.974
##  [54,]  9444.28   9398.400   9408.851  9404.671    4.180417  9401.060
##  [55,]  9383.68   9397.187   9403.651  9401.066    2.585417  9408.851
##  [56,]  9436.74   9425.222   9462.994  9447.886   15.108750  9403.651
##  [57,]  9470.59   9433.822   9467.430  9453.987   13.442917  9462.994
##  [58,]  9502.67   9448.420   9485.515  9470.677   14.837917  9467.430
##  [59,]  9538.75   9487.187   9551.395  9525.712   25.682917  9485.515
##  [60,]  9860.39   9593.100   9763.879  9695.567   68.311667  9551.395
##  [61,] 10349.91   9812.930  10192.131 10040.451  151.680417  9763.879
##  [62,] 10381.74  10032.697  10534.729 10333.916  200.812500 10192.131
##  [63,]  9892.56  10121.150  10506.451 10352.331  154.120417 10534.729
##  [64,]  9524.96  10037.292  10097.751 10073.567   24.183333 10506.451
##  [65,]  9523.88   9830.785   9539.625  9656.089 -116.464167 10097.751
##  [66,]  9478.05   9604.862   9115.429  9311.203 -195.773333  9539.625
##  [67,]  9519.93   9511.705   9120.945  9277.249 -156.304167  9115.429
##  [68,]  9458.07   9494.982   9302.314  9379.381  -77.067500  9120.945
##  [69,]  9572.03   9507.020   9469.316  9484.398  -15.081667  9302.314
##  [70,]  9645.30   9548.832   9604.162  9582.030   22.131667  9469.316
##  [71,]  9770.97   9611.592   9729.902  9682.578   47.323750  9604.162
##  [72,]  9818.07   9701.592   9883.814  9810.926   72.888750  9729.902
##  [73,] 10111.08   9836.355  10105.958  9998.117  107.841250  9883.814
##  [74,] 10007.91   9927.007  10190.125 10084.878  105.247083 10105.958
##  [75,]  9814.53   9937.897  10083.205 10025.082   58.122917 10190.125
##  [76,]  9464.68   9849.550   9785.962  9811.397  -25.435000 10083.205
##  [77,]  9462.05   9687.292   9415.385  9524.148 -108.762917  9785.962
##  [78,]  9515.51   9564.192   9238.291  9368.652 -130.360417  9415.385
##  [79,]  9519.35   9490.397   9227.963  9332.937 -104.973750  9238.291
##  [80,]  9530.00   9506.727   9414.353  9451.302  -36.950000  9227.963
##  [81,]  9594.39   9539.812   9564.029  9554.342    9.686667  9414.353
##  [82,]  9659.17   9575.727   9654.996  9623.289   31.707500  9564.029
##  [83,]  9742.16   9631.430   9744.773  9699.436   45.337083  9654.996
##  [84,]  9838.26   9708.495   9866.210  9803.124   63.085833  9744.773
##  [85,] 10032.66   9818.062  10042.452  9952.696   89.755833  9866.210
##  [86,] 10080.80   9923.470  10178.646 10076.576  102.070417 10042.452
##  [87,] 10082.43  10008.537  10248.365 10152.434   95.930833 10178.646
##  [88,] 10018.19  10053.520  10224.558 10156.143   68.415000 10248.365
##  [89,]  9826.63  10002.012  10010.558 10007.140    3.418333 10224.558
##  [90,]  9918.95   9961.550   9886.792  9916.695  -29.903333 10010.558
##  [91,]  9931.71   9923.870   9821.590  9862.502  -40.912083  9886.792
##  [92,]  9962.65   9909.985   9844.369  9870.616  -26.246250  9821.590
##  [93,]  9871.12   9921.107   9907.740  9913.087   -5.347083  9844.369
##  [94,]  9812.90   9894.595   9864.938  9876.801  -11.862917  9907.740
##  [95,]  9714.50   9840.292   9754.955  9789.090  -34.135000  9864.938
##  [96,]  9787.86   9796.595   9685.674  9730.043  -44.368333  9754.955
##  [97,]  9780.19   9773.862   9686.406  9721.389  -34.982500  9685.674
##  [98,]  9772.10   9763.662   9713.761  9733.722  -19.960417  9686.406
##  [99,]  9606.97   9736.780   9685.205  9705.835  -20.630000  9713.761
## [100,]  9549.76   9677.255   9576.197  9616.620  -40.423333  9685.205
## [101,]  9627.08   9638.977   9530.325  9573.786  -43.460833  9576.197
## [102,]  9537.16   9580.242   9450.124  9502.171  -52.047500  9530.325
## [103,]  9401.61   9528.902   9399.833  9451.461  -51.627917  9450.124
## [104,]  9499.36   9516.302   9433.296  9466.499  -33.202500  9399.833
## [105,]  9455.56   9473.422   9387.931  9422.127  -34.196667  9433.296
## [106,]  9449.47   9451.500   9383.114  9410.468  -27.354583  9387.931
## [107,]  9539.29   9485.920   9492.810  9490.054    2.755833  9383.114
## [108,]  9672.54   9529.215   9602.883  9573.416   29.467083  9492.810
## [109,]  9824.23   9621.382   9787.013  9720.761   66.252083  9602.883
## [110,]  9826.88   9715.735   9928.521  9843.407   85.114583  9787.013
## [111,]  9786.63   9777.570   9971.894  9894.164   77.729583  9928.521
## [112,]  9576.75   9753.622   9814.531  9790.167   24.363333  9971.894
## [113,]  9512.63   9675.722   9584.156  9620.782  -36.626667  9814.531
## [114,]  9497.40   9593.352   9415.495  9486.638  -71.142917  9584.156
## [115,]  9628.57   9553.837   9403.344  9463.541  -60.197500  9415.495
## [116,]       NA         NA         NA        NA          NA  9403.344
## [117,]       NA         NA         NA        NA          NA  9343.146
## [118,]       NA         NA         NA        NA          NA  9282.949
## [119,]       NA         NA         NA        NA          NA  9222.751
## [120,]       NA         NA         NA        NA          NA  9162.554

With the DMA method (n = 4), the forecasts for the next five periods decline steadily, from 9403.34 to 9162.55.
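
These forecasts come from the double moving average recursion implemented in the code above: with $S'_t$ the first moving average, $S''_t$ the moving average of $S'_t$, and $n = 4$,

$$ A_t = 2S'_t - S''_t, \qquad B_t = \frac{2}{n-1}\left(S'_t - S''_t\right), \qquad F_{t+m} = A_t + B_t\,m . $$

The five forecasts decrease because the final trend estimate $B_t$ is negative (about $-60.2$ per period in the last row above).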

Time series plot

ts.plot(data.gab2[,1], xlab="Time Period", ylab="Harga", main= "DMA N=4 Data Premium")
points(data.gab2[,1])
lines(data.gab2[,3],col="green",lwd=2)
lines(data.gab2[,6],col="red",lwd=2)
legend("topleft",c("data aktual","data pemulusan","data peramalan"), lty=8, col=c("black","green","red"), cex=0.8)

Computing the accuracy measures

error.dma = data.premium-data.ramal2[1:length(data.premium)]
SSE.dma = sum(error.dma[8:length(data.premium)]^2)
MSE.dma = mean(error.dma[8:length(data.premium)]^2)
MAPE.dma = mean(abs((error.dma[8:length(data.premium)]/data.premium[8:length(data.premium)])*100))

akurasi.dma <- matrix(c(SSE.dma, MSE.dma, MAPE.dma))
row.names(akurasi.dma)<- c("SSE", "MSE", "MAPE")
colnames(akurasi.dma) <- c("Akurasi m = 4")
akurasi.dma
##      Akurasi m = 4
## SSE   1.135747e+07
## MSE   1.051618e+05
## MAPE  2.419106e+00
akurasi.sma
##      Akurasi m = 4
## SSE   8.957821e+06
## MSE   8.070109e+04
## MAPE  2.204061e+00

With n = 4, the SMA method gives smaller SSE, MSE, and MAPE values, so SMA performs better than DMA here.

Looping for the Best M Value

SMA

m = 2:30
akurasi.full <- c()
data.premium <- as.numeric(data.premium)

for(i in m){
  data.sma <- SMA(data.premium, n = i)
  data.ramal <- c(NA, data.sma)
  
  error.sma = data.premium - data.ramal[1:length(data.premium)]
  SSE.sma = sum(error.sma[(i+1):length(data.premium)]^2)
  MSE.sma = mean(error.sma[(i+1):length(data.premium)]^2)
  MAPE.sma = mean(abs((error.sma[(i+1):length(data.premium)]/data.premium[(i+1):length(data.premium)])*100))
  
  tabel <- matrix(c(SSE.sma, MSE.sma, MAPE.sma))
  colnames(tabel) <- paste("M =", i)
  rownames(tabel) <- c("SSE", "MSE", "MAPE")
  
  akurasi.full <- cbind(akurasi.full, tabel)
}

View(akurasi.full)
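
Rather than inspecting the table by eye, the window length with the smallest error can also be picked programmatically (a sketch using the akurasi.full matrix built above):

# Window length (m) with the smallest MAPE; the same can be done for SSE or MSE
colnames(akurasi.full)[which.min(akurasi.full["MAPE", ])]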

The window length with the smallest error is the optimal n; here the smallest error occurs at n = 2.

DMA

m = 2:30
akurasi.full2 <- c()

for(i in m){
  data.sma <- SMA(data.premium, n = i)
  dma <- SMA(data.sma, n = i)
  At <- 2*data.sma - dma
  Bt <- 2/(i - 1) * (data.sma - dma)
  data.dma <- At + Bt
  data.ramal2 <- c(NA, data.dma)
  
  error.dma <- data.premium - data.ramal2[1:length(data.premium)]
  SSE.dma <- sum(error.dma[(i*2):length(data.premium)]^2)
  MSE.dma <- mean(error.dma[(i*2):length(data.premium)]^2)
  MAPE.dma <- mean(abs(error.dma[(i*2):115]/data.premium[(i*2):115]) * 100)
  
  tabel2 <- matrix(c(SSE.dma, MSE.dma, MAPE.dma))
  colnames(tabel2) <- paste("M =", i)
  rownames(tabel2) <- c("SSE", "MSE", "MAPE")
  
  akurasi.full2 <- cbind(akurasi.full2, tabel2)
}

akurasi.full2 <- as.data.frame(akurasi.full2)
View(akurasi.full2)

Again the smallest error occurs at n = 2, so n = 2 is taken as the optimal window length.

Comparison of SMA and DMA

comp <- cbind(akurasi.full[,1], akurasi.full2[,1])
colnames(comp) <- c("SMA","DMA")
rownames(comp) <- c("SSE","MSE","MAPE")
comp <- as.data.frame(comp)
comp
##               SMA          DMA
## SSE  5.774968e+06 5.916803e+06
## MSE  5.110591e+04 5.282860e+04
## MAPE 1.721257e+00 1.815579e+00

Comparing SMA and DMA at the optimal window length n = 2, the SMA method produces smaller errors than DMA on all three accuracy measures above.

Exponential Smoothing

Import data

setwd("D:/SEMESTER 5/METODE PERAMALAN DERET WAKTU/Kelompok 15")
dataekspo <- read_xlsx("Data Kelompok 15.xlsx")
str(dataekspo)
## tibble [115 x 5] (S3: tbl_df/tbl/data.frame)
##  $ Tahun        : num [1:115] 2013 2013 2013 2013 2013 ...
##  $ Bulan        : chr [1:115] "Januari" "Februari" "Maret" "April" ...
##  $ Premium      : chr [1:115] "7797.63" "7773.26" "7576.27" "7420.72" ...
##  $ Medium       : chr [1:115] "7697.37" "7645.05" "7503.27" "7290.96" ...
##  $ Luar Kualitas: chr [1:115] "7545.32" "7328.44" "7033.14" "6870.91" ...

Creating the time series objects

# Note: str() above shows that Premium was read as character; it should be converted
# with as.numeric() before the ts objects are built so the series stays on the price scale.
training<-dataekspo[1:92,3] 
testing<-dataekspo[93:115,3] 
dataekspo.ts<-ts(dataekspo$Premium) 
training.ts<-ts(training)
testing.ts<-ts(testing,start=93)

Exploration

plot(dataekspo.ts, col="red",main="Plot seluruh data")
points(dataekspo.ts)

plot(training.ts, col="blue",main="Plot data training")
points(training.ts)

plot(testing.ts, col="blue",main="Plot data testing")
points(testing.ts)

Single Exponential

HoltWinters Function

ses1<- HoltWinters(training.ts, gamma = FALSE, beta = FALSE, alpha = 0.2)
plot(ses1)

Optimal HoltWinters Function

sesopt<- HoltWinters(training.ts, gamma = FALSE, beta = FALSE)
sesopt
## Holt-Winters exponential smoothing without trend and without seasonal component.
## 
## Call:
## HoltWinters(x = training.ts, beta = FALSE, gamma = FALSE)
## 
## Smoothing parameters:
##  alpha: 0.762373
##  beta : FALSE
##  gamma: FALSE
## 
## Coefficients:
##   [,1]
## a 87.4
plot(sesopt)

HoltWinters Function

ramalan1<- forecast(ses1, h=10)
ramalan1$mean
## Time Series:
## Start = 93 
## End = 102 
## Frequency = 1 
##  [1] 63.4927 63.4927 63.4927 63.4927 63.4927 63.4927 63.4927 63.4927 63.4927
## [10] 63.4927

Optimal HoltWinters Function

ramalanopt<- forecast(sesopt, h=10)
ramalanopt
##     Point Forecast    Lo 80    Hi 80      Lo 95    Hi 95
##  93           87.4 59.62289 115.1771  44.918579 129.8814
##  94           87.4 52.47132 122.3287  33.981195 140.8188
##  95           87.4 46.55324 128.2468  24.930277 149.8697
##  96           87.4 41.39019 133.4098  17.034072 157.7659
##  97           87.4 36.75074 138.0493   9.938645 164.8614
##  98           87.4 32.50198 142.2980   3.440729 171.3593
##  99           87.4 28.55922 146.2408  -2.589203 177.3892
## 100           87.4 24.86456 149.9355  -8.239711 183.0397
## 101           87.4 21.37632 153.4237 -13.574512 188.3745
## 102           87.4 18.06335 156.7367 -18.641264 193.4413

Training Data Accuracy

SSE

sse1.train <- ses1$SSE
sse1.train
## [1] 45448.35
sseopt.train <- sesopt$SSE
sseopt.train
## [1] 42377.31

MSE

mse1.train <- sse1.train/length(training.ts)
mse1.train
## [1] 494.0038
mseopt.train <- sseopt.train/length(training.ts)
mseopt.train
## [1] 460.6229

RMSE

rmse1.train <- sqrt(mse1.train)
rmse1.train
## [1] 22.2262
rmseopt.train <- sqrt(mseopt.train)
rmseopt.train
## [1] 21.46213
akurasi.ses1 <- matrix(c(sse1.train, mse1.train, rmse1.train, sseopt.train, mseopt.train, rmseopt.train), nrow=3, ncol=2)
row.names(akurasi.ses1)<- c("SSE", "MSE", "RMSE")
colnames(akurasi.ses1) <- c("alpha=0.2", "alpha=0.762373")
akurasi.ses1
##       alpha=0.2 alpha=0.762373
## SSE  45448.3534    42377.30956
## MSE    494.0038      460.62293
## RMSE    22.2262       21.46213

The SES method has a single smoothing parameter, alpha, which lies between 0 and 1. Smoothing was first carried out with alpha = 0.2 and then with the alpha chosen automatically by the program, which gave alpha = 0.762373. The parameter alpha = 0.762373 is selected because it yields the smaller error.
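
As an illustration of how alpha controls the weighting (a small sketch; under SES the weight on the observation j periods back is alpha(1 - alpha)^j):

# Weights on the current and four previous observations for the two alpha values used above
alphas <- c(0.2, 0.762373)
sapply(alphas, function(a) round(a * (1 - a)^(0:4), 3))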

Double Exponential

des1<- HoltWinters(training.ts, gamma = FALSE, beta = 0.2, alpha = 0.2)
plot(des1)

desopt<- HoltWinters(training.ts, gamma = FALSE)
desopt
## Holt-Winters exponential smoothing with trend and without seasonal component.
## 
## Call:
## HoltWinters(x = training.ts, gamma = FALSE)
## 
## Smoothing parameters:
##  alpha: 0.7916783
##  beta : 0
##  gamma: FALSE
## 
## Coefficients:
##       [,1]
## a 87.29966
## b -1.00000
plot(desopt)

Forecasting

ramalandes1<- forecast(des1, h=10)
ramalandes1
##     Point Forecast    Lo 80     Hi 80     Lo 95    Hi 95
##  93       60.48033 29.29621  91.66444 12.788341 108.1723
##  94       62.87615 30.80651  94.94579 13.829861 111.9224
##  95       65.27197 32.03491  98.50903 14.440275 116.1037
##  96       67.66779 32.96504 102.37054 14.594514 120.7411
##  97       70.06361 33.59019 106.53703 14.282331 125.8449
##  98       72.45943 33.91203 111.00683 13.506270 131.4126
##  99       74.85525 33.93865 115.77185 12.278711 137.4318
## 100       77.25107 33.68250 120.81964 10.618692 143.8835
## 101       79.64689 33.15850 126.13528  8.549031 150.7448
## 102       82.04271 32.38256 131.70286  6.094065 157.9914
ramalandesopt<- forecast(desopt, h=10)
ramalandesopt
##     Point Forecast     Lo 80    Hi 80      Lo 95    Hi 95
##  93       86.29966 58.361403 114.2379  43.571780 129.0275
##  94       85.29966 49.666007 120.9333  30.802689 139.7966
##  95       84.29966 42.359605 126.2397  20.157879 148.4415
##  96       83.29966 35.884692 130.7146  10.784722 155.8146
##  97       82.29966 29.979586 134.6197   2.283009 162.3163
##  98       81.29966 24.496480 138.1028  -5.573309 168.1726
##  99       80.29966 19.342198 141.2571 -12.926736 173.5261
## 100       79.29966 14.453510 144.1458 -19.873972 178.4733
## 101       78.29966  9.785178 146.8142 -26.484200 183.0835
## 102       77.29966  5.303512 149.2958 -32.808948 187.4083

Training data accuracy

SSE

sse.des.train <- des1$SSE
sse.des.train
## [1] 52776.93
sseopt.des.train <- desopt$SSE
sseopt.des.train
## [1] 42764.66

MSE

mse.des.train <- sse1.train/length(training.ts) # caution: this reuses the SES sum of squares; the DES value is sse.des.train
mse.des.train
## [1] 494.0038
mseopt.des.train <- sseopt.train/length(training.ts) # caution: this reuses the SES sum of squares; the DES value is sseopt.des.train
mseopt.des.train
## [1] 460.6229

RMSE

rmse.des.train <- sqrt(mse.des.train)
rmse.des.train
## [1] 22.2262
rmseopt.des.train <- sqrt(mseopt.des.train)
rmseopt.des.train
## [1] 21.46213
akurasi.des1 <- matrix(c(sse.des.train,mse.des.train,rmse.des.train,sseopt.des.train,mseopt.des.train,rmseopt.des.train), nrow=3, ncol=2)
row.names(akurasi.des1)<- c("SSE", "MSE", "RMSE")
colnames(akurasi.des1) <- c("alpha=0.2, beta=0.2", "alpha=0.7916783, beta=0")
akurasi.des1
##      alpha=0.2, beta=0.2 alpha=0.7916783, beta=0
## SSE           52776.9306             42764.66255
## MSE             494.0038               460.62293
## RMSE             22.2262                21.46213
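
For reference, HoltWinters() with gamma = FALSE fits Holt's linear method (double exponential smoothing); a sketch of the standard recursions, with level $\ell_t$, trend $b_t$, and smoothing parameters $\alpha$ and $\beta$:

$$ \ell_t = \alpha Y_t + (1-\alpha)(\ell_{t-1}+b_{t-1}), \qquad b_t = \beta(\ell_t-\ell_{t-1}) + (1-\beta)b_{t-1}, \qquad F_{t+m} = \ell_t + m\,b_t $$

Because the fitted beta is 0 and the final trend estimate is -1, the optimal-model point forecasts above decrease by a constant 1 per period.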

The DES method has two smoothing parameters, alpha and beta, each between 0 and 1. Smoothing was first carried out with alpha = 0.2 and beta = 0.2, and then with parameters chosen automatically by the program, which gave alpha = 0.7916783 and beta = 0. The combination alpha = 0.7916783 and beta = 0 is selected because it yields the smaller error.

Comparison of Single and Double Exponential Smoothing

selisihses<-ramalanopt$mean-testing.ts
selisihses
## Time Series:
## Start = 93 
## End = 102 
## Frequency = 1 
##       testing.ts
##  [1,]       64.4
##  [2,]       67.4
##  [3,]       72.4
##  [4,]       68.4
##  [5,]       70.4
##  [6,]       71.4
##  [7,]       76.4
##  [8,]       78.4
##  [9,]       75.4
## [10,]       80.4
SSEtestingses<-sum(selisihses^2)
str(testing.ts)
##  Time-Series [1:23, 1] from 93 to 115: 23 20 15 19 17 16 11 9 12 7 ...
##  - attr(*, "dimnames")=List of 2
##   ..$ : NULL
##   ..$ : chr "Premium"
selisihdes<-ramalandesopt$mean-testing.ts
selisihdes
## Time Series:
## Start = 93 
## End = 102 
## Frequency = 1 
##       testing.ts
##  [1,]   63.29966
##  [2,]   65.29966
##  [3,]   69.29966
##  [4,]   64.29966
##  [5,]   65.29966
##  [6,]   65.29966
##  [7,]   69.29966
##  [8,]   70.29966
##  [9,]   66.29966
## [10,]   70.29966
SSEtestingdes<-sum(selisihdes^2)

akurasi <- matrix(c(SSEtestingses, SSEtestingdes), nrow=1, ncol=2)
row.names(akurasi)<- "SSE"
colnames(akurasi) <- c("SES", "DES")
akurasi
##         SES      DES
## SSE 52797.4 44818.05
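
As a hedged sketch (not part of the original analysis), a scale-free test-set comparison over the ten periods covered by both forecasts could be computed as follows, using the objects created above:

# Test-set MAPE for SES and DES over the 10 periods covered by both forecasts (h = 10)
test10   <- as.numeric(testing.ts)[1:10]
mape.ses <- mean(abs(as.numeric(ramalanopt$mean)    - test10) / test10) * 100
mape.des <- mean(abs(as.numeric(ramalandesopt$mean) - test10) / test10) * 100
c(SES = mape.ses, DES = mape.des)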

Compared with the moving average methods, the exponential methods have a much smaller error. This indicates that observations from different periods do not affect the rice price equally (recent observations carry more weight), so the exponential methods are more suitable here.

Winter Smoothing

Import data

setwd("D:/SEMESTER 5/METODE PERAMALAN DERET WAKTU/Kelompok 15")
winter <- read_xlsx("Data Kelompok 15.xlsx")

Splitting the data into training and testing sets

training<-winter[1:92,3]
testing<-winter[93:115,3]

Creating the time series objects

winter.ts<-ts(winter$Premium, frequency = 12)
training.ts<-ts(training, frequency = 12)
testing.ts<-ts(testing, start=93, frequency = 12)

Creating the time series plot

plot(winter.ts, col="red")
points(winter.ts)

plot(training.ts, col="blue")
points(training.ts)

Winter Additive

Smoothing
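
A sketch of the standard additive Holt-Winters recursions fitted below (level $\ell_t$, trend $b_t$, seasonal $s_t$, period $p = 12$, parameters $\alpha,\beta,\gamma$); the multiplicative variant divides by the seasonal index instead of subtracting it:

$$ \ell_t = \alpha(Y_t - s_{t-p}) + (1-\alpha)(\ell_{t-1}+b_{t-1}), \qquad b_t = \beta(\ell_t-\ell_{t-1}) + (1-\beta)b_{t-1}, $$

$$ s_t = \gamma(Y_t - \ell_t) + (1-\gamma)s_{t-p}, \qquad F_{t+m} = \ell_t + m\,b_t + s_{t-p+m}. $$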

aditif <- HoltWinters(training.ts, seasonal = "additive")
aditif 
## Holt-Winters exponential smoothing with trend and additive seasonal component.
## 
## Call:
## HoltWinters(x = training.ts, seasonal = "additive")
## 
## Smoothing parameters:
##  alpha: 0.8548849
##  beta : 0
##  gamma: 1
## 
## Coefficients:
##            [,1]
## a    89.2468595
## b     1.0007284
## s1    1.0606530
## s2    0.3786525
## s3    3.0591237
## s4    5.4550945
## s5  -21.6750436
## s6   -1.1475027
## s7   19.0352662
## s8   -5.8143374
## s9    4.4927227
## s10  -2.5830969
## s11  -1.8064322
## s12  -1.2468595

Forecasting

ramalan1 <- forecast(aditif, h=23)
ramalan1
##        Point Forecast       Lo 80    Hi 80     Lo 95    Hi 95
## Sep  8       91.30824  62.2785608 120.3379  46.91118 135.7053
## Oct  8       91.62697  53.4352486 129.8187  33.21777 150.0362
## Nov  8       95.30817  49.7615697 140.8548  25.65066 164.9657
## Dec  8       98.70487  46.8360156 150.5737  19.37830 178.0314
## Jan  9       72.57546  15.0753450 130.0756 -15.36338 160.5143
## Feb  9       94.10373  31.4766712 156.7308  -1.67609 189.8835
## Mar  9      115.28722  47.9222982 182.6522  12.26146 218.3130
## Apr  9       91.43835  19.6475528 163.2291 -18.35620 201.2329
## May  9      102.74614  26.7869144 178.7054 -13.42347 218.9157
## Jun  9       96.67105  16.7605418 176.5816 -25.54152 218.8836
## Jul  9       98.44844  14.7730315 182.1238 -29.52205 226.4189
## Aug  9      100.00874  12.7306846 187.2868 -33.47153 233.4890
## Sep  9      103.31698  11.3377373 195.2962 -37.35313 243.9871
## Oct  9      103.63571   8.3673173 198.9041 -42.06472 249.3361
## Nov  9      107.31691   8.8691986 205.7646 -43.24587 257.8797
## Dec  9      110.71361   9.1860906 212.2411 -44.55933 265.9865
## Jan 10       84.58420 -19.9324126 189.1008 -75.26016 244.4286
## Feb 10      106.11247  -1.3100957 213.5350 -58.17617 270.4011
## Mar 10      127.29597  17.0440159 237.5479 -41.31984 295.9118
## Apr 10      103.44709  -9.5634293 216.4576 -69.38759 276.2818
## May 10      114.75488  -0.9484604 230.4582 -62.19811 291.7079
## Jun 10      108.67979  -9.6551095 227.0147 -72.29782 289.6574
## Jul 10      110.45718 -10.4520125 231.3664 -74.45747 295.3718

Training data accuracy

SSE

sse1.train <- aditif$SSE
sse1.train
## [1] 40536.24

Winter Multiplicative

Smoothing

multi <- HoltWinters(training.ts,seasonal = "multiplicative")
multi
## Holt-Winters exponential smoothing with trend and multiplicative seasonal component.
## 
## Call:
## HoltWinters(x = training.ts, seasonal = "multiplicative")
## 
## Smoothing parameters:
##  alpha: 0.8058113
##  beta : 0
##  gamma: 0.9152878
## 
## Coefficients:
##           [,1]
## a   94.1463388
## b    1.0007284
## s1   0.9197850
## s2   0.9274307
## s3   1.0152787
## s4   1.0986079
## s5   0.1342742
## s6   0.2141852
## s7   0.8424770
## s8   0.6074020
## s9   0.9713126
## s10  0.8706238
## s11  0.9139195
## s12  0.9351503

Forecasting

ramalan2 <- forecast(multi, h=23)
ramalan2
##        Point Forecast      Lo 80     Hi 80      Lo 95     Hi 95
## Sep  8       87.51485   61.52127 113.50843   47.76110 127.26859
## Oct  8       89.17042   55.45444 122.88640   37.60627 140.73457
## Nov  8       98.63282   56.54037 140.72528   34.25796 163.00769
## Dec  8      107.82754   57.92646 157.72862   31.51042 184.14466
## Jan  9       13.31329  -13.32401  39.95059  -27.42495  54.05153
## Feb  9       21.45080  -22.31968  65.22128  -45.49037  88.39197
## Mar  9       85.21776  -78.73445 249.16997 -165.52551 335.96103
## Apr  9       62.04743  -58.50379 182.59865 -122.31975 246.41461
## May  9      100.19371  -92.86186 293.24928 -195.05931 395.44673
## Jun  9       90.67862  -83.79501 265.15226 -176.15578 357.51302
## Jul  9       96.10261  -88.22008 280.42531 -185.79463 377.99985
## Aug  9       99.27095  -90.27886 288.82077 -190.62047 389.16238
## Sep  9       98.56031  -89.76996 286.89058 -189.46599 386.58660
## Oct  9      100.30769  -90.75909 291.37448 -191.90373 392.51912
## Nov  9      110.82504  -99.21155 320.86163 -210.39820 432.04829
## Dec  9      121.02044 -107.08416 349.12505 -227.83545 469.87633
## Jan 10       14.92575  -25.92065  55.77216  -47.54343  77.39493
## Feb 10       24.02289  -39.32223  87.36802  -72.85511 120.90090
## Mar 10       95.33485 -141.53193 332.20162 -266.92162 457.59132
## Apr 10       69.34157 -103.37411 242.05724 -194.80427 333.48740
## May 10      111.85795 -163.90453 387.62044 -309.88438 533.60028
## Jun 10      101.13372 -147.13417 349.40161 -278.55925 480.82669
## Jul 10      107.07764 -154.31856 368.47383 -292.69335 506.84862

Training data accuracy

SSE

sse2.train <- multi$SSE
sse2.train
## [1] 33622.63

Testing data accuracy

selisih1<-as.numeric(ramalan1$mean)-as.numeric(testing.ts)
selisih1
##  [1]  68.30824  71.62697  80.30817  79.70487  55.57546  78.10373 104.28722
##  [8]  82.43835  90.74614  89.67105  97.44844  95.00874 100.31698 101.63571
## [15]  99.31691  96.71361  63.58420  84.11247 109.29597  93.44709 108.75488
## [22] 104.67979  97.45718
SSEtesting1<-sum(selisih1^2)

selisih2<-as.numeric(ramalan2$mean)-as.numeric(testing.ts)
selisih2
##  [1]  64.514846  69.170419  83.632824  88.827542  -3.686713   5.450799
##  [7]  74.217761  53.047432  88.193711  83.678622  95.102613  94.270954
## [13]  95.560306  98.307694 102.825043 107.020440  -6.074248   2.022893
## [19]  77.334849  59.341565 105.857953  97.133718  94.077635
SSEtesting2<-sum(selisih2^2)

akurasi <- matrix(c(SSEtesting1, SSEtesting2), nrow=1, ncol=2)
row.names(akurasi)<- "SSE"
colnames(akurasi) <- c("Aditif", "Multiplikatif")
akurasi
##       Aditif Multiplikatif
## SSE 187947.9        144766

Conclusions

  1. The exponential smoothing methods are more suitable than the moving average methods; the differing influence of observations from different periods on the rice price makes exponential smoothing a better fit.
  2. Among the exponential methods, double exponential smoothing is more suitable than single exponential smoothing.
  3. For the Winter method, the multiplicative variant is more suitable than the additive variant.

References

Fani E, Widjajati FA, Soehardjoepri. 2017. Perbandingan Metode Winter Eksponensial Smoothing dan Metode Event Based untuk Menentukan Penjualan Produk Terbaik di Perusahaan X. Jurnal Sains dan Seni ITS. 6(1):2337-3529.

Naufal HR, Adrean R. 2017. Sistem Informasi Inventory Berdasarkan Prediksi Data Penjualan Barang Menggunakan Metode Single Moving Average pada CV. Agung Youanda. ProTekInfo (Pengembangan Riset dan Observasi Teknik Informatika). 4:29-33.

Hartono A, Dwijana D, Handiwidjojo. 2012. Perbandingan metode single exponential smoothing dan exponential smoothing adjusted for trend untuk meramalkan penjualan. Studi kasus: Toko onderdil mobil “Prodi, Purwodadi”. Jurnal EKSIS. 5(1):8-18.

Hapsari, V. 2013. Perbandingan Metode Dekomposisi Klasik dengan Metode Pemulusan Eksponensial Holt-Winter dalam meramalkan Tingkat Pencemaran Udara di Kota Bandung Periode 2003-2012. Universitas Pendidikan Indonesia.

Layakana M, Iskandar S. 2020. Penerapan Metode Double Moving Average dan Double Eksponensial Smoothing dalam Meramalkan Jumlah Produksi Crude Palm Oil (CPO) pada PT. Perkebunan Nusantara IV Unit Dolok Sinumbah. 6(1):47.

Purwanto A, Hanief S. 2017. Teknik peramalan dengan double exponential smoothing pada distributor gula. Jurnal Teknologi Informasi dan Komputer. 3(1):362-367

https://www.bps.go.id/indicator/36/500/10/rata-rata-harga-beras-bulanan-di-tingkat-penggilingan-menurut-kualitas.html