In this assignment I chose the stock price of Apple Inc., specifically its adjusted close (Adj.Close) from 1980-12-12 to 2020-12-04 (the exact time frame varies across the models applied below).
The models applied are: the seasonal naive fit (snaive), linear regression (lm and tslm), classical additive and multiplicative decomposition, STL additive and multiplicative decomposition, ETS (default, ZZZ, and AAA), ARIMA, auto.arima, GARCH, and VAR. All models are evaluated with residual diagnostics and accuracy checks; a sketch of that workflow follows the first plots below.
# snaive fit, additive and multiplicative decomposition, ETS, ARIMA, auto.arima, GARCH, VAR
library(psych)
library(fpp2)
## Registered S3 method overwritten by 'quantmod':
## method from
## as.zoo.data.frame zoo
## ── Attaching packages ───────────────────────────────────────────────────────────────────────────── fpp2 2.4 ──
## ✓ ggplot2 3.3.2 ✓ fma 2.4
## ✓ forecast 8.13 ✓ expsmooth 2.3
## ── Conflicts ──────────────────────────────────────────────────────────────────────────────── fpp2_conflicts ──
## x ggplot2::%+%() masks psych::%+%()
## x ggplot2::alpha() masks psych::alpha()
library(forecast)
library(knitr)
library(stats)
library(dplyr)
##
## Attaching package: 'dplyr'
## The following objects are masked from 'package:stats':
##
## filter, lag
## The following objects are masked from 'package:base':
##
## intersect, setdiff, setequal, union
library(tidyr)
library(rugarch)
## Loading required package: parallel
##
## Attaching package: 'rugarch'
## The following object is masked from 'package:stats':
##
## sigma
library(tseries)
library(fGarch)
## Loading required package: timeDate
## Loading required package: timeSeries
##
## Attaching package: 'timeSeries'
## The following object is masked from 'package:psych':
##
## outlier
## Loading required package: fBasics
##
## Attaching package: 'fBasics'
## The following object is masked from 'package:psych':
##
## tr
Apple <- read.csv("~/Desktop/AAPL.csv", stringsAsFactors = TRUE)  # daily Apple price data
summary(Apple)
## Date Open High Low
## 1980-12-12: 1 0.354911: 38 0.372768: 35 0.357143: 51
## 1980-12-15: 1 0.401786: 37 0.375000: 34 0.352679: 42
## 1980-12-16: 1 0.366071: 36 0.363839: 32 0.366071: 38
## 1980-12-17: 1 0.357143: 34 0.370536: 32 0.343750: 35
## 1980-12-18: 1 0.397321: 34 0.334821: 31 0.348214: 35
## 1980-12-19: 1 0.372768: 33 0.361607: 31 0.397321: 35
## (Other) :10076 (Other) :9870 (Other) :9887 (Other) :9846
## Close Adj.Close Volume
## 0.399554: 40 0.086316: 20 246400000: 7
## 0.352679: 31 0.053184: 18 239680000: 6
## 0.354911: 31 0.089803: 17 244160000: 6
## 0.372768: 30 0.069750: 16 255360000: 5
## 0.366071: 28 0.095035: 16 302400000: 5
## 0.357143: 27 0.062775: 14 118720000: 4
## (Other) :9895 (Other) :9981 (Other) :10049
# Column 6 is Adj.Close; monthly frame from December 1980 to December 2020
Apple.ts = ts(Apple[,6], frequency = 12, start = c(1980, 12), end = c(2020, 12))
autoplot(Apple.ts) + ggtitle("Adjusted close stock price for Apple, 1980-2020")
ggseasonplot(Apple.ts)
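As mentioned above, every model in this report is judged with checkresiduals() and accuracy(). Below is a minimal sketch of that workflow on a holdout split; the December 2018 split point and the ets() stand-in are illustrative choices, not part of the assignment itself.

# Hold out the last two years as a test set (illustrative split)
train.ts = window(Apple.ts, end = c(2018, 12))
test.ts = window(Apple.ts, start = c(2019, 1))
demo.fit = ets(train.ts)                       # any model below could stand in here
demo.fc = forecast(demo.fit, h = length(test.ts))
checkresiduals(demo.fit)                       # Ljung-Box test, residual ACF and histogram
accuracy(demo.fc, test.ts)                     # training and test error measures side by side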
apple.fit = snaive(Apple)  # seasonal naive benchmark, fit to the imported data frame
summary(apple.fit)
##
## Forecast method: Seasonal naive method
##
## Model Information:
## Call: snaive(y = Apple)
##
## Residual sd: 3338.71
##
## Error measures:
## ME RMSE MAE MPE MAPE MASE ACF1
## Training set -2241.531 3338.71 2612.198 -359.0673 366.0833 8.223053 0.8834297
##
## Forecasts:
## Point Forecast Lo 80 Hi 80 Lo 95 Hi 95
## 10083 10082 5803.271 14360.729 3538.249 16625.75
## 10084 2103 -3948.037 8154.037 -7151.262 11357.26
fc1 = applefit.fc = forecast(apple.fit)
autoplot(fc1)
apple.decomp.add = decompose(Apple.ts,type="additive")
apple.decomp.multi = decompose(Apple.ts, type = "multiplicative")
autoplot(apple.decomp.add) + ggtitle("Additive decomposition of Apple adjusted close stock price")
autoplot(apple.decomp.multi) + ggtitle("Multiplicative decomposition of Apple adjusted close stock price")
#Apple.ts
stl.apple.add = stl(Apple.ts, s.window = "periodic")
stl.apple.mult = stl(log(Apple.ts), s.window = "periodic")  # STL is additive, so logging the series gives a multiplicative decomposition
autoplot(stl.apple.add) + ggtitle("STL additive decomposition of Apple adjusted close stock price")
autoplot(stl.apple.mult) + ggtitle("STL multiplicative decomposition of Apple adjusted close stock price")
autoplot(forecast(stl.apple.add, method="naive"))
autoplot(forecast(stl.apple.mult, method="naive"))
fc2 = forecast(stl.apple.add, method="naive")
fc3 = forecast(stl.apple.mult, method="naive")
autoplot(fc2)
autoplot(fc3)
Appleprice = ts(Apple)
fit = lm(Date ~ Adj.Close, data = Appleprice)  # note: this regresses the date index on the price
coefficients(fit)
## (Intercept) Adj.Close
## 1832.524594 1.160782
checkresiduals(fit)
##
## Breusch-Godfrey test for serial correlation of order up to 10
##
## data: Residuals
## LM test = 9928.6, df = 10, p-value < 2.2e-16
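For comparison, the more conventional direction regresses the price on a time trend rather than the date on the price. A hedged sketch of that alternative using tslm() (illustrative, not the model reported above):

# Illustrative alternative: adjusted close as a function of a linear trend
trendfit = tslm(Apple.ts ~ trend)
summary(trendfit)
checkresiduals(trendfit)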
#Appleprice = ts(Apple)
#Appleprice
appleregmodel = tslm(Open ~ Adj.Close, data = Appleprice)
summary(appleregmodel)
##
## Call:
## tslm(formula = Open ~ Adj.Close, data = Appleprice)
##
## Residuals:
## Min 1Q Median 3Q Max
## -3002.89 -240.33 81.05 254.79 816.82
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) -2.335e+02 6.670e+00 -35.0 <2e-16 ***
## Adj.Close 7.244e-01 1.899e-03 381.5 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 413.2 on 10080 degrees of freedom
## Multiple R-squared: 0.9352, Adjusted R-squared: 0.9352
## F-statistic: 1.455e+05 on 1 and 10080 DF, p-value: < 2.2e-16
residual = resid(appleregmodel)
plot(fitted(appleregmodel), residual)  # fitted values vs. residuals
qqnorm(residual)
qqline(residual)
apple.ts.stl = stl(Apple.ts,s.window = "periodic")
apple.seasonal = seasadj(apple.ts.stl)
ggseasonplot(apple.seasonal)
apple.etsmodel1 =ets(apple.seasonal)
fc4 = apple.etsmodel1.fc = forecast(apple.etsmodel1, h = 60)
apple.etsmodel2 =ets(apple.seasonal, model ="ZZZ")
fc5 = apple.etsmodel2.fc = forecast(apple.etsmodel2,h = 60)
apple.etsmodel3 =ets(apple.seasonal, model ="AAA")
fc6 = apple.etsmodel3.fc = forecast(apple.etsmodel3, h = 60)
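A quick numeric comparison of the three ETS fits is their information criteria; note that ets() already defaults to model = "ZZZ", so the first two fits should coincide. A minimal sketch using the objects created above:

# Side-by-side AICs for the default, ZZZ, and AAA ETS fits
c(default = apple.etsmodel1$aic, ZZZ = apple.etsmodel2$aic, AAA = apple.etsmodel3$aic)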
apple.autoarima = auto.arima(apple.seasonal, seasonal=FALSE)
autoplot(apple.autoarima)
checkresiduals(apple.autoarima)
##
## Ljung-Box test
##
## data: Residuals from ARIMA(0,1,1)
## Q* = 1.1095, df = 23, p-value = 1
##
## Model df: 1. Total lags used: 24
fc7 = forecast(apple.autoarima, h=6)
apple.arima = arima(apple.seasonal)  # with no order specified, arima() defaults to ARIMA(0,0,0) with a non-zero mean
autoplot(apple.arima)
checkresiduals(apple.arima)
##
## Ljung-Box test
##
## data: Residuals from ARIMA(0,0,0) with non-zero mean
## Q* = 6.2857, df = 23, p-value = 0.9998
##
## Model df: 1. Total lags used: 24
fc8 = forecast(apple.arima)
autoplot(fc7)
autoplot(fc8)
apple.garch = garchFit(data = Apple.ts)
##
## Series Initialization:
## ARMA Model: arma
## Formula Mean: ~ arma(0, 0)
## GARCH Model: garch
## Formula Variance: ~ garch(1, 1)
## ARMA Order: 0 0
## Max ARMA Order: 0
## GARCH Order: 1 1
## Max GARCH Order: 1
## Maximum Order: 1
## Conditional Dist: norm
## h.start: 2
## llh.start: 1
## Length of Series: 481
## Recursion Init: mci
## Series Scale: 322.9692
##
## Parameter Initialization:
## Initial Parameters: $params
## Limits of Transformations: $U, $V
## Which Parameters are Fixed? $includes
## Parameter Matrix:
## U V params includes
## mu -2.88719189 2.887192 0.2887192 TRUE
## omega 0.00000100 100.000000 0.1000000 TRUE
## alpha1 0.00000001 1.000000 0.1000000 TRUE
## gamma1 -0.99999999 1.000000 0.1000000 FALSE
## beta1 0.00000001 1.000000 0.8000000 TRUE
## delta 0.00000000 2.000000 2.0000000 FALSE
## skew 0.10000000 10.000000 1.0000000 FALSE
## shape 1.00000000 10.000000 4.0000000 FALSE
## Index List of Parameters to be Optimized:
## mu omega alpha1 beta1
## 1 2 3 5
## Persistence: 0.9
##
##
## --- START OF TRACE ---
## Selected Algorithm: nlminb
##
## R coded nlminb Solver:
##
## 0: 789.37761: 0.288719 0.100000 0.100000 0.800000
## 1: 736.39986: 0.288464 0.240632 0.0917310 0.860385
## 2: 693.66121: 0.288035 0.277808 0.0483498 0.718164
## 3: 685.62599: 0.281767 0.385644 1.00000e-08 0.678134
## 4: 683.25556: 0.280732 0.313985 1.00000e-08 0.717686
## 5: 682.51729: 0.326281 0.274303 1.00000e-08 0.737309
## 6: 682.15251: 0.307891 0.250252 1.00000e-08 0.745227
## 7: 682.00619: 0.288125 0.252856 1.00000e-08 0.747238
## 8: 682.00465: 0.288963 0.254290 1.00000e-08 0.746788
## 9: 682.00461: 0.288705 0.254040 1.00000e-08 0.746949
## 10: 682.00461: 0.288689 0.253934 1.00000e-08 0.747049
## 11: 682.00455: 0.288580 0.252356 1.00000e-08 0.748588
## 12: 682.00419: 0.288048 0.239773 1.00000e-08 0.761005
## 13: 682.00344: 0.287314 0.214091 1.00000e-08 0.786545
## 14: 681.99732: 0.284581 0.111451 1.00000e-08 0.888865
## 15: 681.98075: 0.284271 0.100673 1.00000e-08 0.900251
## 16: 681.95167: 0.283651 0.0790923 1.00000e-08 0.923002
## 17: 681.93825: 0.283440 0.0715862 1.00000e-08 0.930638
## 18: 681.89232: 0.283018 0.0565568 1.00000e-08 0.945894
## 19: 681.86458: 0.282868 0.0512184 1.00000e-08 0.951290
## 20: 681.78161: 0.282569 0.0405423 1.00000e-08 0.962084
## 21: 681.74508: 0.282464 0.0367958 1.00000e-08 0.965871
## 22: 681.70430: 0.282254 0.0293028 1.00000e-08 0.973445
## 23: 681.66707: 0.282048 0.0207530 1.00000e-08 0.979803
## 24: 681.59258: 0.282006 0.0198453 1.00000e-08 0.981951
## 25: 681.49226: 0.282051 0.0212553 1.00000e-08 0.980095
## 26: 681.49074: 0.282069 0.0219521 1.00000e-08 0.979523
## 27: 681.49069: 0.282076 0.0221339 1.00000e-08 0.979256
## 28: 681.49013: 0.282079 0.0222756 1.00000e-08 0.979178
## 29: 681.49007: 0.282076 0.0221559 1.00000e-08 0.979287
## 30: 681.49007: 0.282077 0.0221541 1.00000e-08 0.979285
## 31: 681.49006: 0.282078 0.0221553 1.00000e-08 0.979287
## 32: 681.49006: 0.282078 0.0221499 1.00000e-08 0.979289
## 33: 681.49006: 0.282079 0.0221476 1.00000e-08 0.979295
## 34: 681.49005: 0.282080 0.0221422 1.00000e-08 0.979296
## 35: 681.49005: 0.282080 0.0221399 1.00000e-08 0.979302
## 36: 681.49005: 0.282081 0.0221344 1.00000e-08 0.979304
## 37: 681.49004: 0.282082 0.0221320 1.00000e-08 0.979309
## 38: 681.49004: 0.282082 0.0221265 1.00000e-08 0.979311
## 39: 681.49004: 0.282083 0.0221241 1.00000e-08 0.979317
## 40: 681.49004: 0.282084 0.0221186 1.00000e-08 0.979319
## 41: 681.49003: 0.282084 0.0221240 1.00000e-08 0.979317
## 42: 681.49003: 0.282085 0.0221310 1.00000e-08 0.979307
## 43: 681.49003: 0.282086 0.0221236 1.00000e-08 0.979317
## 44: 681.49003: 0.282087 0.0221035 1.00000e-08 0.979332
## 45: 681.49002: 0.282088 0.0221232 1.00000e-08 0.979317
## 46: 681.49002: 0.282090 0.0221375 1.00000e-08 0.979298
## 47: 681.49001: 0.282091 0.0221220 1.00000e-08 0.979318
## 48: 681.49000: 0.282096 0.0221051 1.00000e-08 0.979326
## 49: 681.48999: 0.282097 0.0221262 1.00000e-08 0.979313
## 50: 681.48997: 0.282104 0.0221221 1.00000e-08 0.979311
## 51: 681.48996: 0.282105 0.0221072 1.00000e-08 0.979331
## 52: 681.48995: 0.282110 0.0221164 1.00000e-08 0.979317
## 53: 681.48994: 0.282111 0.0221563 1.00000e-08 0.979286
## 54: 681.48993: 0.282112 0.0221185 1.00000e-08 0.979320
## 55: 681.48992: 0.282122 0.0221502 1.00000e-08 0.979300
## 56: 681.48989: 0.282127 0.0221783 1.00000e-08 0.979263
## 57: 681.48988: 0.282134 0.0222135 1.00000e-08 0.979234
## 58: 681.48980: 0.282151 0.0220843 1.00000e-08 0.979359
## 59: 681.48979: 0.282152 0.0220787 1.00000e-08 0.979353
## 60: 681.48978: 0.282154 0.0220805 1.00000e-08 0.979356
## 61: 681.48976: 0.282159 0.0220781 1.00000e-08 0.979353
## 62: 681.48572: 0.284205 0.0211222 1.00000e-08 0.980255
## 63: 681.47968: 0.286281 0.0214565 1.00000e-08 0.979943
## 64: 681.47568: 0.289794 0.0220259 1.00000e-08 0.979413
## 65: 681.47567: 0.289793 0.0220731 1.00000e-08 0.979368
## 66: 681.47567: 0.289790 0.0220772 1.00000e-08 0.979364
##
## Final Estimate of the Negative LLH:
## LLH: 3460.481 norm LLH: 7.194346
## mu omega alpha1 beta1
## 9.359327e+01 2.302853e+03 1.000000e-08 9.793635e-01
##
## R-optimhess Difference Approximated Hessian Matrix:
## mu omega alpha1 beta1
## mu -4.640786e-03 9.978582e-06 1.372178e+00 1.176696e+00
## omega 9.978582e-06 -4.124040e-05 4.702679e+00 -4.517450e+00
## alpha1 1.372178e+00 4.702679e+00 3.077255e+06 1.070806e+04
## beta1 1.176696e+00 -4.517450e+00 1.070806e+04 -5.058054e+05
## attr(,"time")
## Time difference of 0.02870297 secs
##
## --- END OF TRACE ---
## Warning in sqrt(diag(fit$cvar)): NaNs produced
##
## Time to Estimate Parameters:
## Time difference of 0.1430879 secs
## Warning: Using formula(x) is deprecated when x is a character vector of length > 1.
## Consider formula(paste(x, collapse = " ")) instead.
summary(apple.garch)
##
## Title:
## GARCH Modelling
##
## Call:
## garchFit(data = Apple.ts)
##
## Mean and Variance Equation:
## data ~ garch(1, 1)
## <environment: 0x7fa3ca3390e0>
## [data = Apple.ts]
##
## Conditional Distribution:
## norm
##
## Coefficient(s):
## mu omega alpha1 beta1
## 9.3593e+01 2.3029e+03 1.0000e-08 9.7936e-01
##
## Std. Errors:
## based on Hessian
##
## Error Analysis:
## Estimate Std. Error t value Pr(>|t|)
## mu 9.359e+01 1.468e+01 6.374 1.84e-10 ***
## omega 2.303e+03 3.583e+02 6.427 1.30e-10 ***
## alpha1 1.000e-08 NA NA NA
## beta1 9.794e-01 3.506e-03 279.315 < 2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Log Likelihood:
## -3460.481 normalized: -7.194346
##
## Description:
## Sun Dec 6 13:43:24 2020 by user:
##
##
## Standardised Residuals Tests:
## Statistic p-Value
## Jarque-Bera Test R Chi^2 4151932 0
## Shapiro-Wilk Test R W 0.0870996 0
## Ljung-Box Test R Q(10) 2.995734 0.9815243
## Ljung-Box Test R Q(15) 3.821864 0.9982646
## Ljung-Box Test R Q(20) 4.741752 0.9998169
## Ljung-Box Test R^2 Q(10) 0.02384788 1
## Ljung-Box Test R^2 Q(15) 0.03619076 1
## Ljung-Box Test R^2 Q(20) 0.04898836 1
## LM Arch Test R TR^2 0.02881284 1
##
## Information Criterion Statistics:
## AIC BIC SIC HQIC
## 14.40532 14.44005 14.40519 14.41897
fc9 = apple.garch.fc = predict(apple.garch, plot = TRUE)
#autoplot(fc9)  # predict() on an fGARCH object returns a data frame, not a forecast object, so autoplot() does not apply
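The standardised-residual tests above (note the enormous Jarque-Bera statistic) reflect fitting the GARCH model to the price level; in practice GARCH is usually estimated on returns. A hedged sketch of the same fGarch call on log returns, as an illustrative variation rather than part of the assignment:

# GARCH(1,1) on monthly log returns instead of the price level (illustrative)
apple.ret = diff(log(Apple.ts))
apple.garch.ret = garchFit(~ garch(1, 1), data = apple.ret, trace = FALSE)
summary(apple.garch.ret)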
SS <- read.csv("~/Desktop/SAMSUNG.csv", stringsAsFactors = TRUE)
# Re-index both series to a common monthly frame, 2000-01 through 2020-12, for comparison
ss.ts = ts(SS[,6], frequency = 12, start = c(2000, 1), end = c(2020, 12))
app.ts = ts(Apple[,6], frequency = 12, start = c(2000, 1), end = c(2020, 12))
par(mfrow=c(1,2))
plot.ts(ss.ts)
plot.ts(app.ts)
cor(ss.ts, app.ts)
## [1] 0.1149583
ss.diff = diff(ss.ts)
app.diff = diff(app.ts)
par(mfrow=c(1,2))
autoplot(ss.ts)
autoplot(app.ts)
d1 = data.frame(ss.diff,app.diff)
d.ts = ts(d1, start = c(2000, 1), end = c(2020, 12), frequency = 12)
autoplot(d.ts)
differ =diff(d.ts)
autoplot(differ)
library(vars)
## Loading required package: MASS
##
## Attaching package: 'MASS'
## The following object is masked from 'package:dplyr':
##
## select
## The following objects are masked from 'package:fma':
##
## cement, housing, petrol
## Loading required package: strucchange
## Loading required package: zoo
##
## Attaching package: 'zoo'
## The following object is masked from 'package:timeSeries':
##
## time<-
## The following objects are masked from 'package:base':
##
## as.Date, as.Date.numeric
## Loading required package: sandwich
## Loading required package: urca
## Loading required package: lmtest
VARselect(differ,lag.max=8,type="const")[["selection"]]
## AIC(n) HQ(n) SC(n) FPE(n)
## 8 7 6 8
var1 = VAR(differ, p=1, type="const")
summary(var1)
##
## VAR Estimation Results:
## =========================
## Endogenous variables: ss.diff, app.diff
## Deterministic variables: const
## Sample size: 250
## Log Likelihood: -3173.406
## Roots of the characteristic polynomial:
## 0.6659 0.4452
## Call:
## VAR(y = differ, p = 1, type = "const")
##
##
## Estimation results for equation ss.diff:
## ========================================
## ss.diff = ss.diff.l1 + app.diff.l1 + const
##
## Estimate Std. Error t value Pr(>|t|)
## ss.diff.l1 -0.4389294 0.0571596 -7.679 3.74e-13 ***
## app.diff.l1 -0.0007437 0.0014196 -0.524 0.601
## const -0.1338211 1.5222906 -0.088 0.930
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
##
## Residual standard error: 24.07 on 247 degrees of freedom
## Multiple R-Squared: 0.1982, Adjusted R-squared: 0.1917
## F-statistic: 30.52 on 2 and 247 DF, p-value: 1.427e-12
##
##
## Estimation results for equation app.diff:
## =========================================
## app.diff = ss.diff.l1 + app.diff.l1 + const
##
## Estimate Std. Error t value Pr(>|t|)
## ss.diff.l1 1.89967 1.91960 0.990 0.323
## app.diff.l1 -0.67208 0.04768 -14.097 <2e-16 ***
## const -0.13673 51.12338 -0.003 0.998
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
##
## Residual standard error: 808.3 on 247 degrees of freedom
## Multiple R-Squared: 0.4464, Adjusted R-squared: 0.4419
## F-statistic: 99.59 on 2 and 247 DF, p-value: < 2.2e-16
##
##
##
## Covariance matrix of residuals:
## ss.diff app.diff
## ss.diff 579.3 2465
## app.diff 2464.6 653392
##
## Correlation matrix of residuals:
## ss.diff app.diff
## ss.diff 1.0000 0.1267
## app.diff 0.1267 1.0000
fc10 = Var1.fc = forecast(var1)
autoplot(fc10)
par(mfrow=c(1,2))
#autoplot(fc4)
#autoplot(fc5)
#autoplot(fc6)
kable(accuracy(fc1))
|              | ME        | RMSE    | MAE      | MPE       | MAPE     | MASE     | ACF1      |
|--------------|-----------|---------|----------|-----------|----------|----------|-----------|
| Training set | -2241.531 | 3338.71 | 2612.198 | -359.0673 | 366.0833 | 8.223053 | 0.8834297 |
kable(accuracy(fc2))
|              | ME        | RMSE     | MAE      | MPE       | MAPE     | MASE     | ACF1       |
|--------------|-----------|----------|----------|-----------|----------|----------|------------|
| Training set | 0.0354167 | 444.3561 | 60.53384 | -15.55304 | 104.8223 | 1.270888 | -0.4996439 |
kable(accuracy(fc3))
|              | ME        | RMSE      | MAE       | MPE           | MAPE         | MASE      | ACF1       |
|--------------|-----------|-----------|-----------|---------------|--------------|-----------|------------|
| Training set | 0.0002404 | 0.2992514 | 0.1059212 | -5.266324e+15 | 5.266324e+15 | 0.3114905 | -0.3613811 |
#kable(accuracy(fc4))
kable(accuracy(fc5))
|              | ME       | RMSE     | MAE      | MPE       | MAPE     | MASE     | ACF1      |
|--------------|----------|----------|----------|-----------|----------|----------|-----------|
| Training set | -12.5545 | 317.5526 | 61.77642 | -34.62886 | 89.74864 | 1.296976 | 0.0073341 |
kable(accuracy(fc6))
|              | ME      | RMSE     | MAE      | MPE       | MAPE     | MASE    | ACF1      |
|--------------|---------|----------|----------|-----------|----------|---------|-----------|
| Training set | 3.44537 | 317.9595 | 61.28886 | -7.575718 | 89.99875 | 1.28674 | 0.0122818 |
kable(accuracy(fc7))
|              | ME        | RMSE     | MAE      | MPE       | MAPE     | MASE     | ACF1      |
|--------------|-----------|----------|----------|-----------|----------|----------|-----------|
| Training set | -11.75867 | 317.5791 | 60.08599 | -33.49894 | 89.03781 | 1.261486 | 0.0052517 |
I would like to discuss three comparisons: STL vs. classical decomposition, ETS vs. ARIMA, and the VAR model.
As observed above, there is not a huge difference between the classical and the STL decomposition models, except that the STL decompositions show stronger fluctuations in their residual plots. This is interesting because the series contains many occasional, extreme peaks in the adjusted close price, so I expected STL to perform better at reducing the residuals across the entire time frame. Another point worth a look is that around 2010 the original series goes through a roller-coaster period, yet this is not reflected in the STL additive decomposition; overall, the STL multiplicative decomposition shows more of the detailed variation, and in the final accuracy check the STL multiplicative model also delivers the better forecasting accuracy.
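That visual impression can be checked numerically by comparing the remainder components of the two additive decompositions fitted earlier; a quick sketch:

# Variance of the remainder component: classical vs. STL additive decomposition
var(na.omit(apple.decomp.add$random))    # classical additive remainder
var(remainder(stl.apple.add))            # STL additive remainder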
Besides the default ETS model, I also used ETS(ZZZ) and ETS(AAA) for forecasting. I chose these two because the seasonally adjusted series contains zero and negative values, which the multiplicative ETS variants cannot handle. The plots of the three models look quite similar, except that ETS(AAA) seems to catch the peak around 1995 at a slightly right-shifted position; the final accuracy checks are nearly identical as well. All three models, however, show very high RMSE and MASE values, and in particular MASE values well above 1, meaning their in-sample errors exceed those of a naive benchmark; the same happens with the ARIMA and auto.arima models. I am therefore not convinced that either ETS or ARIMA achieves strong accuracy on this dataset. In the residual checks for the ARIMA and auto.arima models, I notice that the residual autocorrelation lags sit in different positions relative to the trend, and compared with ETS I think the two ARIMA models definitely handle the trend better, since the ETS residual lags still sit quite far from the main trend.
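The constraint mentioned above is easy to verify: multiplicative ETS components require strictly positive data, and the seasonally adjusted series dips below zero. A minimal sketch:

min(apple.seasonal)                    # negative values appear after seasonal adjustment
# ets(apple.seasonal, model = "MMM")   # a fully multiplicative fit would therefore error out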
For the last model, VAR, I chose another time series: the adjusted close stock price of Samsung. Both the Samsung and Apple series were re-indexed to a common frame from 2000-01-01 to 2020-12-04 for better comparison. Checking the correlation first, I was quite surprised that the two series do not appear strongly correlated at all; as two of the biggest electronics manufacturers in the market, I expected them to be strong competitors that track each other closely. I took differences of both series and built a new bivariate time series from them. The resulting model presents reasonably good p-values and adjusted R^2 values, which suggests the VAR provides an adequate level of accuracy for this comparison. For future forecasting, note that instead of forecasting the difference between the two, the VAR forecasts each series individually, and Apple appears to maintain relatively more stable fluctuations than Samsung.
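As a follow-up on whether either series actually helps predict the other, the vars package provides a Granger causality test on the fitted model; a hedged sketch using var1 from above:

# Does the Samsung series Granger-cause the Apple series, and vice versa?
causality(var1, cause = "ss.diff")$Granger
causality(var1, cause = "app.diff")$Granger
# Residual serial correlation check for the VAR(1), since VARselect favoured 6-8 lags
serial.test(var1, lags.pt = 12, type = "PT.asymptotic")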