In Example 3.39, we presented the diagnostics for the MA(2) fit to the GNP growth rate series. Using that example as a guide, complete the diagnostics for the AR(1) fit.
gnpgr = diff(log(gnp))
plot(gnpgr)
Taking logs and then differencing the series makes the data approximately stationary.
sarima(gnpgr, 1, 0, 0)
initial value -4.589567
iter 2 value -4.654150
iter 3 value -4.654150
iter 4 value -4.654151
iter 4 value -4.654151
iter 4 value -4.654151
final value -4.654151
converged
initial value -4.655919
iter 2 value -4.655921
iter 3 value -4.655922
iter 4 value -4.655922
iter 5 value -4.655922
iter 5 value -4.655922
iter 5 value -4.655922
final value -4.655922
converged
$fit
Call:
stats::arima(x = xdata, order = c(p, d, q), seasonal = list(order = c(P, D,
Q), period = S), xreg = xmean, include.mean = FALSE, optim.control = list(trace = trc,
REPORT = 1, reltol = tol))
Coefficients:
ar1 xmean
0.3467 0.0083
s.e. 0.0627 0.0010
sigma^2 estimated as 9.03e-05: log likelihood = 718.61, aic = -1431.22
$degrees_of_freedom
[1] 220
$ttable
Estimate SE t.value p.value
ar1 0.3467 0.0627 5.5255 0
xmean 0.0083 0.0010 8.5398 0
$AIC
[1] -8.294403
$AICc
[1] -8.284898
$BIC
[1] -9.263748
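Restating the estimates above in equation form, the fitted AR(1) model (with xmean playing the role of the series mean) is approximately
x_t - 0.0083 = 0.3467 (x_{t-1} - 0.0083) + w_t,   with sigma_w^2 ≈ 9.03e-05.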
Plot Analysis for the AR(1) Model:
Standardized Residuals - No trend is detected; the plot looks like white noise, apart from several outliers (notably in the 1950s) that are roughly three standard deviations from the mean.
ACF of Residuals - All lags are within the significance bounds, so there is no departure from the model assumptions.
Normal Q-Q Plot of Standardized Residuals - Because of the outliers, the tails show some departure from normality.
P-Values for Ljung-Box Statistic - The p-values are generally above the 0.05 dotted line, so the Q-statistic is not significant and the residuals are consistent with white noise.
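The same conclusions can be cross-checked outside of sarima(); a minimal sketch, assuming astsa is loaded and gnpgr is defined as above:
fit.ar1 <- arima(gnpgr, order = c(1, 0, 0))              # AR(1) with mean, as in the sarima fit
res <- residuals(fit.ar1)
acf(res)                                                 # residual autocorrelation
qqnorm(res); qqline(res)                                 # normality of the residuals
Box.test(res, lag = 20, type = 'Ljung-Box', fitdf = 1)   # Ljung-Box Q-test, one fitted AR parameter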
Crude oil prices in dollars per barrel are in oil; see Appendix R for more details. Fit an ARIMA(p,d,q) model to the growth rate performing all necessary diagnostics. Comment.
We begin by creating a time series plot of the data
plot.ts(oil, xlab = "Time (Weeks)", ylab = "Price in Dollars Per Barrel")
Since the above plot has an obvious upward trend, the data may require differencing.
Plotting the ACF and PACF of the data yields:
acf2(oil, 48)
ACF PACF
[1,] 0.99 0.99
[2,] 0.99 -0.17
[3,] 0.98 -0.04
[4,] 0.97 -0.10
[5,] 0.95 0.00
[6,] 0.94 -0.04
[7,] 0.93 -0.03
[8,] 0.92 -0.01
[9,] 0.90 -0.14
[10,] 0.89 -0.03
[11,] 0.87 -0.04
[12,] 0.86 -0.02
[13,] 0.84 -0.01
[14,] 0.82 0.05
[15,] 0.81 -0.05
[16,] 0.79 0.00
[17,] 0.77 0.06
[18,] 0.76 -0.04
[19,] 0.74 -0.04
[20,] 0.72 0.02
[21,] 0.71 0.09
[22,] 0.69 0.05
[23,] 0.68 -0.04
[24,] 0.66 0.04
[25,] 0.65 0.00
[26,] 0.64 0.10
[27,] 0.63 -0.02
[28,] 0.61 0.11
[29,] 0.61 -0.02
[30,] 0.59 -0.06
[31,] 0.59 0.00
[32,] 0.58 0.03
[33,] 0.57 0.04
[34,] 0.56 0.00
[35,] 0.55 -0.02
[36,] 0.55 0.06
[37,] 0.54 -0.03
[38,] 0.54 0.01
[39,] 0.53 -0.04
[40,] 0.53 -0.01
[41,] 0.52 0.02
[42,] 0.52 0.04
[43,] 0.51 0.05
[44,] 0.51 -0.01
[45,] 0.51 -0.03
[46,] 0.50 -0.06
[47,] 0.50 -0.02
[48,] 0.50 -0.01
The ACF decays slowly while the PACF is significant only at lag 1, suggesting an AR(1) component (p = 1).
We then difference the data to remove the trend and stabilize the mean:
plot(diff(oil), ylab = 'Differenced Price')
The above plot looks much more stationary, so we take d = 1.
However, the variance increases over time, so the series may require a log transformation.
plot(diff(log(oil)), ylab = 'Diff(Log(Price))')
The log transformation stabilizes the variance somewhat.
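The reason the log helps: if price changes are roughly proportional to the price level, then
log x_t - log x_{t-1} = log(x_t / x_{t-1}) ≈ (x_t - x_{t-1}) / x_{t-1},
so diff(log(oil)) is approximately the weekly growth rate of the price, whose variance is more nearly constant than that of the raw differences.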
Now we plot the ACF and PACF of the transformed series:
acf2(diff(log(oil)))
ACF PACF
[1,] 0.13 0.13
[2,] -0.07 -0.09
[3,] 0.13 0.16
[4,] -0.01 -0.06
[5,] 0.02 0.05
[6,] -0.03 -0.08
[7,] -0.03 0.00
[8,] 0.13 0.12
[9,] 0.08 0.05
[10,] 0.02 0.03
[11,] 0.01 -0.02
[12,] 0.00 0.00
[13,] -0.02 -0.03
[14,] 0.06 0.09
[15,] -0.05 -0.07
[16,] -0.09 -0.06
[17,] 0.03 0.01
[18,] 0.05 0.04
[19,] -0.05 -0.05
[20,] -0.07 -0.05
[21,] 0.04 0.05
[22,] 0.09 0.06
[23,] -0.05 -0.06
[24,] -0.08 -0.05
[25,] -0.07 -0.08
[26,] 0.00 0.02
[27,] -0.11 -0.11
[28,] -0.07 0.01
[29,] 0.02 0.00
[30,] -0.02 -0.01
[31,] -0.03 -0.05
[32,] -0.05 -0.04
[33,] -0.03 0.02
[34,] 0.00 0.02
[35,] -0.09 -0.08
[36,] -0.01 0.02
[37,] -0.04 -0.04
[38,] -0.01 0.04
[39,] 0.02 -0.01
[40,] -0.01 -0.01
[41,] -0.06 -0.05
[42,] 0.01 0.03
[43,] 0.00 -0.03
[44,] -0.01 0.00
[45,] 0.04 0.08
[46,] 0.01 0.00
[47,] 0.05 0.05
[48,] 0.07 0.01
[49,] -0.01 0.04
[50,] -0.03 -0.08
[51,] 0.01 0.01
[52,] -0.04 -0.07
[53,] -0.04 0.00
[54,] -0.03 -0.06
[55,] 0.00 0.00
[56,] -0.01 -0.06
[57,] -0.10 -0.11
[58,] -0.01 0.01
[59,] -0.05 -0.09
[60,] -0.04 -0.01
[61,] -0.03 -0.04
[62,] 0.01 0.04
[63,] 0.01 -0.01
[64,] -0.01 0.00
[65,] -0.04 -0.04
[66,] 0.02 0.03
[67,] 0.00 0.00
[68,] -0.01 0.00
[69,] -0.03 -0.04
[70,] -0.02 -0.02
[71,] -0.05 -0.04
[72,] -0.01 0.00
[73,] -0.01 -0.01
[74,] -0.02 0.00
[75,] 0.01 0.02
[76,] 0.02 -0.01
[77,] 0.04 0.04
[78,] -0.01 -0.02
[79,] 0.03 0.08
[80,] 0.02 -0.03
[81,] -0.04 -0.03
[82,] -0.01 -0.03
[83,] 0.02 0.03
[84,] 0.03 -0.03
[85,] 0.01 -0.02
[86,] 0.03 0.03
[87,] 0.08 0.04
[88,] -0.04 -0.09
[89,] -0.02 -0.01
[90,] 0.01 -0.02
[91,] -0.04 -0.03
[92,] 0.05 0.03
[93,] 0.07 0.05
[94,] -0.04 -0.11
[95,] 0.02 0.02
[96,] 0.05 -0.01
[97,] 0.01 0.02
[98,] 0.00 -0.03
[99,] 0.01 0.06
[100,] 0.04 0.01
[101,] 0.01 -0.05
[102,] -0.03 0.02
[103,] -0.04 -0.03
[104,] -0.01 0.01
[105,] 0.02 0.00
[106,] 0.01 0.04
[107,] 0.01 -0.01
[108,] 0.06 0.07
[109,] 0.08 0.04
[110,] 0.04 0.04
[111,] 0.02 0.00
[112,] 0.01 0.05
[113,] 0.03 -0.01
[114,] 0.02 0.00
[115,] -0.02 -0.04
[116,] -0.04 -0.03
[117,] -0.01 -0.03
[118,] 0.04 0.02
[119,] 0.05 0.04
[120,] -0.02 -0.01
[121,] -0.02 -0.04
[122,] 0.03 -0.01
[123,] 0.01 0.03
[124,] -0.04 -0.03
[125,] -0.08 -0.07
[126,] 0.02 0.00
[127,] 0.00 -0.02
[128,] -0.04 -0.04
[129,] 0.01 0.01
[130,] 0.02 0.01
[131,] 0.01 -0.01
[132,] 0.02 0.02
[133,] 0.00 0.05
[134,] -0.01 0.02
[135,] 0.00 0.01
[136,] -0.03 0.02
[137,] -0.06 -0.02
[138,] 0.01 0.04
[139,] -0.02 0.01
[140,] -0.02 -0.03
[141,] 0.02 -0.02
[142,] -0.01 0.02
[143,] -0.03 -0.01
[144,] 0.00 0.02
[145,] 0.00 -0.01
[146,] -0.04 0.02
[147,] -0.01 -0.02
[148,] -0.02 -0.03
[149,] -0.04 -0.02
[150,] -0.04 -0.01
[151,] 0.01 0.02
[152,] 0.01 -0.01
[153,] 0.04 0.04
[154,] 0.03 0.03
[155,] 0.01 -0.04
[156,] 0.05 0.03
[157,] 0.01 0.00
[158,] -0.06 -0.05
[159,] 0.02 0.02
[160,] 0.05 0.03
[161,] -0.02 0.00
[162,] 0.05 0.03
[163,] 0.00 0.01
[164,] -0.01 0.00
[165,] 0.00 0.00
[166,] -0.01 0.02
[167,] -0.02 0.01
[168,] -0.01 -0.01
[169,] 0.00 0.03
[170,] -0.03 -0.03
[171,] -0.01 0.00
[172,] -0.02 -0.04
[173,] -0.02 0.00
[174,] 0.04 0.02
[175,] -0.01 -0.03
[176,] -0.03 -0.01
[177,] 0.02 0.01
[178,] 0.01 0.02
[179,] -0.01 -0.01
[180,] -0.01 0.00
[181,] -0.04 -0.04
[182,] 0.07 0.08
[183,] -0.01 -0.05
[184,] -0.04 0.02
[185,] 0.05 -0.01
[186,] -0.02 -0.02
[187,] -0.01 0.03
[188,] 0.01 0.00
[189,] -0.05 -0.03
[190,] -0.04 -0.04
[191,] -0.01 0.01
[192,] 0.01 -0.01
[193,] 0.04 0.07
[194,] -0.01 -0.01
[195,] 0.00 0.02
[196,] 0.06 -0.01
[197,] -0.06 -0.03
[198,] -0.02 0.01
[199,] 0.02 0.00
[200,] 0.00 0.00
[201,] 0.00 -0.02
[202,] 0.00 -0.01
[203,] -0.04 -0.05
[204,] 0.00 -0.01
[205,] 0.04 0.02
[206,] 0.04 0.04
[207,] 0.04 0.05
[208,] 0.01 -0.01
From the ACF and PACF of the differenced, logged series, lag 1 is the only clearly significant lag and the remaining lags are largely insignificant, which suggests adding an MA(1) term; together with p = 1 and d = 1, we take q = 1.
Thus the proposed model is ARIMA(1,1,1) on the logged series.
fit <- arima(log(oil), order = c(1,1,1))
fit
Call:
arima(x = log(oil), order = c(1, 1, 1))
Coefficients:
ar1 ma1
-0.5253 0.7142
s.e. 0.0872 0.0683
sigma^2 estimated as 0.002104: log likelihood = 904.58, aic = -1803.15
# (Optional) simulate a short series from the fitted coefficients, e.g., for visual comparison with the data
arima.fit = arima.sim(list(order = c(1,1,1), ar = -0.5253, ma = 0.7142), n = 100)
sarima(log(oil), 1,1,1)
initial value -3.057594
iter 2 value -3.061420
iter 3 value -3.067360
iter 4 value -3.067479
iter 5 value -3.071834
iter 6 value -3.074359
iter 7 value -3.074843
iter 8 value -3.076656
iter 9 value -3.080467
iter 10 value -3.081546
iter 11 value -3.081603
iter 12 value -3.081615
iter 13 value -3.081642
iter 14 value -3.081643
iter 14 value -3.081643
iter 14 value -3.081643
final value -3.081643
converged
initial value -3.082345
iter 2 value -3.082345
iter 3 value -3.082346
iter 4 value -3.082346
iter 5 value -3.082346
iter 5 value -3.082346
iter 5 value -3.082346
final value -3.082346
converged
$fit
Call:
stats::arima(x = xdata, order = c(p, d, q), seasonal = list(order = c(P, D,
Q), period = S), xreg = constant, optim.control = list(trace = trc, REPORT = 1,
reltol = tol))
Coefficients:
ar1 ma1 constant
-0.5264 0.7146 0.0018
s.e. 0.0871 0.0683 0.0022
sigma^2 estimated as 0.002102: log likelihood = 904.89, aic = -1801.79
$degrees_of_freedom
[1] 541
$ttable
Estimate SE t.value p.value
ar1 -0.5264 0.0871 -6.0422 0.0000
ma1 0.7146 0.0683 10.4699 0.0000
constant 0.0018 0.0022 0.7981 0.4252
$AIC
[1] -5.153858
$AICc
[1] -5.150053
$BIC
[1] -6.130184
Plot Analysis for the ARIMA(1,1,1) Model:
Standardized Residuals - No trend is detected; the plot looks like white noise with a few outliers around 2009.
ACF of Residuals - All lags are within the significance bounds, so there is no departure from the model assumptions.
Normal Q-Q Plot of Standardized Residuals - The outliers in the tails show some departure from normality.
P-Values for Ljung-Box Statistic - Most of the p-values fall below the 0.05 dotted line, so the Q-statistic is significant and some residual autocorrelation remains. Perhaps this model is not the best fit.
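Since the Ljung-Box p-values suggest the ARIMA(1,1,1) fit could be improved, one natural follow-up (an exploratory sketch, not part of the original analysis) is to compare nearby orders on the logged series by AIC:
# exploratory sketch: compare ARIMA(p,1,q) fits on log(oil) by AIC
for (p in 0:3) for (q in 0:3) {
  fit.pq <- try(arima(log(oil), order = c(p, 1, q)), silent = TRUE)   # skip orders that fail to converge
  if (!inherits(fit.pq, "try-error"))
    cat(sprintf("ARIMA(%d,1,%d): AIC = %.2f\n", p, q, fit.pq$aic))
}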
Fit an ARIMA(p,d,q) model to the global temperature data gtemp performing all of the necessary diagnostics. After deciding on an appropriate model, forecast (with limits) the next 10 years. Comment.
To gain an understanding of the dataset, we first look it up in R Documentation
?gtemp
This dataset shows the variations (in degrees centigrade) in global temperature over the past 129 years prior to 2009.
As always, we first plot the time series and examine the ACF and PACF.
plot.ts(gtemp)
From the plot, we can see that the mean and variance of this time series might not be constant. Thus, differencing, and possibly a log transformation, may be required to make the data stationary.
par(mfrow = c(1,2))
acf2(gtemp)
ACF PACF
[1,] 0.88 0.88
[2,] 0.82 0.23
[3,] 0.79 0.14
[4,] 0.78 0.17
[5,] 0.72 -0.13
[6,] 0.70 0.08
[7,] 0.67 0.00
[8,] 0.63 -0.07
[9,] 0.58 -0.06
[10,] 0.58 0.14
[11,] 0.55 -0.07
[12,] 0.49 -0.13
[13,] 0.46 0.04
[14,] 0.46 0.11
[15,] 0.44 -0.02
[16,] 0.41 0.01
[17,] 0.40 0.09
[18,] 0.40 0.03
[19,] 0.36 -0.13
[20,] 0.34 0.00
[21,] 0.32 -0.03
[22,] 0.30 -0.03
The ACF tapers gradually toward zero, indicating a potential AR model. In the PACF, lag 1 is highly significant, so we include an AR(1) component.
We then difference the data:
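A time plot of the differenced series (the command below is assumed; it was not shown in the original output) supports the next observation:
plot(diff(gtemp), ylab = 'Differenced gtemp')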
The differenced temperature series has a stable mean. No log transformation is necessary since the variance appears constant.
acf2(diff(gtemp))
ACF PACF
[1,] -0.29 -0.29
[2,] -0.16 -0.28
[3,] -0.12 -0.32
[4,] 0.22 0.00
[5,] -0.15 -0.20
[6,] 0.02 -0.10
[7,] 0.03 -0.04
[8,] 0.11 0.05
[9,] -0.20 -0.13
[10,] 0.15 0.09
[11,] 0.04 0.10
[12,] -0.07 -0.02
[13,] -0.17 -0.12
[14,] 0.15 -0.03
[15,] 0.06 0.01
[16,] -0.08 -0.07
[17,] 0.00 0.03
[18,] 0.14 0.10
[19,] -0.14 -0.06
[20,] 0.04 0.09
[21,] 0.00 0.03
[22,] 0.11 0.08
From the above ACF, lag 1 is the most significant and the ACF largely cuts off afterwards, suggesting an MA(1) component.
Our proposed model is then ARIMA(1,1,1).
We then estimate the coefficients of the proposed model:
fit.gtemp <- arima(gtemp, order = c(1,1,1))
fit.gtemp
Call:
arima(x = gtemp, order = c(1, 1, 1))
Coefficients:
ar1 ma1
0.2256 -0.7158
s.e. 0.1235 0.0792
sigma^2 estimated as 0.009539: log likelihood = 116.83, aic = -227.65
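Restating the estimates above, the fitted model in operator form is approximately
(1 - 0.2256 B)(1 - B) x_t = (1 - 0.7158 B) w_t,   with sigma_w^2 ≈ 0.0095,
where B is the backshift operator.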
We then run the full model diagnostics. (Note that the bare call arima(gtemp, 1, 1, 1) fails with "'order' must be a non-negative numeric vector of length 3" because arima expects order = c(1,1,1); sarima takes the orders directly.)
sarima(gtemp, 1, 1, 1)
Plot Analysis for the ARIMA(1,1,1) Model:
Standardized Residuals - No trend is detected; the plot looks like white noise with a few outliers around 1960.
ACF of Residuals - All lags are within the significance bounds.
Normal Q-Q Plot of Standardized Residuals - Aside from one outlier, there is no departure from normality.
P-Values for Ljung-Box Statistic - Most of the p-values are above the 0.05 dotted line. The model fits.
We then use the model to forecast the next 10 years (2010 through 2019):
predict = sarima.for(gtemp, 10, 1, 1, 1)   # 10-step-ahead forecasts and their standard errors
plot(gtemp, type = 'l', xlim = c(1880, 2020), ylim = c(-0.4, 0.8))   # widen the x-axis so the forecast is visible
lines(predict$pred, col = 'blue')
lines(predict$pred + 2*predict$se, col = 'red')
lines(predict$pred - 2*predict$se, col = 'red')
The above plot shows a continued upward trend in the land-ocean temperature deviations. As time increases, the deviations in global temperature also increase, so we can expect further fluctuations in global temperature over the next decade. Incidentally, since it is currently 2018, we could check this forecast against a more up-to-date dataset, but that is a project for another time.
Plot (or sketch) the ACF of the seasonal ARIMA(0,1)x(1,0)_12 model with Phi = 0.8 and theta = 0.5.
phi = c(rep(0,11), 0.8)
ACF = ARMAacf(ar = phi, ma = 0.5, 50)[-1]
PACF = ARMAacf(ar = phi, ma = 0.5, 50, pacf = TRUE)
par(mfrow = c(1,2))
plot(ACF, type = 'h', xlab = 'lag', ylim = c(-.4, .8)); abline(h = 0)
plot(PACF, type = 'h', xlab = 'lag', ylim = c(-.4, .8)); abline(h = 0)
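As a check against theory, for the model x_t = Phi x_{t-12} + w_t + theta w_{t-1} the ACF satisfies rho(12h) = Phi^h and rho(12h-1) = rho(12h+1) = [theta/(1 + theta^2)] Phi^h, so with Phi = 0.8 and theta = 0.5 the plot should show rho(1) = 0.40, rho(11) = rho(13) = 0.32, and rho(12) = 0.80. This can be confirmed numerically from the ACF vector computed above:
round(ACF[c(1, 11, 12, 13)], 3)   # expect 0.400, 0.320, 0.800, 0.320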
Fit a seasonal ARIMA model of your choice to the unemployment data (unemp) displayed in Figure 3.21.
Use the estimated model to forecast the next 12 months.
We begin by plotting the time series along with its ACF and PACF.
plot.ts(unemp)
acf2(unemp, 48)
ACF PACF
[1,] 0.96 0.96
[2,] 0.92 -0.04
[3,] 0.88 0.12
[4,] 0.87 0.21
[5,] 0.86 0.07
[6,] 0.83 -0.22
[7,] 0.81 0.18
[8,] 0.78 -0.28
[9,] 0.75 0.05
[10,] 0.73 0.22
[11,] 0.74 0.19
[12,] 0.74 -0.05
[13,] 0.69 -0.55
[14,] 0.64 0.05
[15,] 0.61 0.14
[16,] 0.60 0.06
[17,] 0.60 0.15
[18,] 0.58 0.06
[19,] 0.57 0.09
[20,] 0.54 -0.09
[21,] 0.53 0.01
[22,] 0.53 0.02
[23,] 0.54 -0.02
[24,] 0.55 0.06
[25,] 0.51 -0.24
[26,] 0.48 0.01
[27,] 0.45 0.05
[28,] 0.45 -0.02
[29,] 0.45 0.00
[30,] 0.43 0.03
[31,] 0.42 0.05
[32,] 0.40 -0.01
[33,] 0.39 0.04
[34,] 0.39 0.02
[35,] 0.40 0.00
[36,] 0.42 0.02
[37,] 0.38 -0.19
[38,] 0.34 0.01
[39,] 0.32 -0.04
[40,] 0.31 -0.03
[41,] 0.31 0.03
[42,] 0.30 0.09
[43,] 0.29 -0.01
[44,] 0.27 0.02
[45,] 0.26 0.03
[46,] 0.26 0.04
[47,] 0.28 0.01
[48,] 0.29 -0.02
The above ACF shows a gradual tapering to zero, which indicates that the data may require differencing. The PACF has several significant spikes within the first few lags, suggesting an AR(3) component, and the pattern among the significant lags also points to a seasonal component.
To remove the seasonal component, we take a seasonal (lag-12) difference of the data:
sdiff=diff(unemp, 12)
plot(sdiff)
The above plot appears more stationary since the upward trend has been removed.
acf2(sdiff)
ACF PACF
[1,] 0.96 0.96
[2,] 0.90 -0.24
[3,] 0.82 -0.33
[4,] 0.73 -0.11
[5,] 0.62 -0.14
[6,] 0.50 -0.10
[7,] 0.38 -0.09
[8,] 0.26 0.04
[9,] 0.14 -0.05
[10,] 0.03 -0.06
[11,] -0.06 0.07
[12,] -0.15 -0.10
[13,] -0.21 0.37
[14,] -0.24 -0.01
[15,] -0.27 -0.17
[16,] -0.28 -0.07
[17,] -0.29 -0.03
[18,] -0.28 -0.03
[19,] -0.27 -0.06
[20,] -0.26 -0.07
[21,] -0.25 -0.08
[22,] -0.23 -0.06
[23,] -0.22 0.08
[24,] -0.21 -0.06
[25,] -0.20 0.24
[26,] -0.19 -0.08
[27,] -0.18 -0.04
[28,] -0.17 0.01
[29,] -0.16 -0.09
[30,] -0.16 -0.13
[31,] -0.16 -0.03
[32,] -0.16 -0.08
[33,] -0.16 0.00
[34,] -0.16 -0.05
[35,] -0.16 0.03
[36,] -0.16 0.04
[37,] -0.15 0.20
[38,] -0.14 0.08
[39,] -0.12 -0.08
[40,] -0.10 0.02
[41,] -0.08 -0.03
[42,] -0.05 -0.04
[43,] -0.02 -0.01
[44,] 0.02 0.07
[45,] 0.06 0.02
[46,] 0.10 -0.02
[47,] 0.15 -0.02
[48,] 0.19 0.02
To also remove the trend from the series, we difference the data, then plot the ACF and PACF of the result.
plot(diff(sdiff), ylab = 'Differenced Unemployment')
acf2(diff(sdiff), 48)
ACF PACF
[1,] 0.21 0.21
[2,] 0.33 0.29
[3,] 0.15 0.05
[4,] 0.17 0.05
[5,] 0.10 0.01
[6,] 0.06 -0.02
[7,] -0.06 -0.12
[8,] -0.02 -0.03
[9,] -0.09 -0.05
[10,] -0.17 -0.15
[11,] -0.08 0.02
[12,] -0.48 -0.43
[13,] -0.18 -0.02
[14,] -0.16 0.15
[15,] -0.11 0.03
[16,] -0.15 -0.04
[17,] -0.09 -0.01
[18,] -0.09 0.00
[19,] 0.03 0.01
[20,] -0.01 0.01
[21,] 0.02 -0.01
[22,] -0.02 -0.16
[23,] 0.01 0.01
[24,] -0.02 -0.27
[25,] 0.09 0.05
[26,] -0.05 -0.01
[27,] -0.01 -0.05
[28,] 0.03 0.05
[29,] 0.08 0.09
[30,] 0.01 -0.04
[31,] 0.03 0.02
[32,] -0.05 -0.07
[33,] 0.01 -0.01
[34,] 0.02 -0.08
[35,] -0.06 -0.08
[36,] -0.02 -0.23
[37,] -0.12 -0.08
[38,] 0.01 0.06
[39,] -0.03 -0.07
[40,] -0.03 -0.01
[41,] -0.10 0.03
[42,] -0.02 -0.03
[43,] -0.13 -0.11
[44,] 0.00 -0.04
[45,] -0.06 0.01
[46,] 0.01 0.00
[47,] 0.02 -0.03
[48,] 0.11 -0.04
Non-Seasonal: From the ACF above, lags 1 and 2 are significant, so an MA(2) component should be included in addition to the AR(3) component.
Seasonal: At the seasonal lags, the ACF and PACF suggest a seasonal MA(1) component: in the ACF, the only seasonal lag of notable significance is the first (lag 12), while the PACF tapers off across the seasonal lags.
So far, our SARIMA model is (3,0,2)(0,1,1)[12]
unemp.fit <- sarima(unemp, 3,0,2,0,1,1,12)
initial value 4.609734
iter 2 value 4.227097
iter 3 value 4.131686
iter 4 value 4.116309
iter 5 value 3.813352
iter 6 value 3.812355
iter 7 value 3.583729
iter 8 value 3.409732
iter 9 value 3.226038
iter 10 value 3.132945
iter 11 value 3.125648
iter 12 value 3.121203
iter 13 value 3.118366
iter 14 value 3.118051
iter 15 value 3.117914
iter 16 value 3.117691
iter 17 value 3.116917
iter 18 value 3.114901
iter 19 value 3.113303
iter 20 value 3.112225
iter 21 value 3.103769
iter 22 value 3.091396
iter 23 value 3.089535
iter 24 value 3.084192
iter 25 value 3.081766
iter 26 value 3.078024
iter 27 value 3.069704
iter 28 value 3.063884
iter 29 value 3.060846
iter 30 value 3.057542
iter 31 value 3.056277
iter 32 value 3.055367
iter 33 value 3.052188
iter 34 value 3.050800
iter 35 value 3.049668
iter 36 value 3.048388
iter 37 value 3.046105
iter 38 value 3.045394
iter 39 value 3.045273
iter 40 value 3.045240
iter 41 value 3.045202
iter 42 value 3.045036
iter 43 value 3.044562
iter 44 value 3.044117
iter 45 value 3.043809
iter 46 value 3.043403
iter 47 value 3.043397
iter 48 value 3.043363
iter 49 value 3.043271
iter 50 value 3.043177
iter 51 value 3.043172
iter 52 value 3.043171
iter 53 value 3.043171
iter 54 value 3.043168
iter 55 value 3.043166
iter 56 value 3.043164
iter 57 value 3.043163
iter 58 value 3.043163
iter 59 value 3.043163
iter 60 value 3.043163
iter 60 value 3.043162
iter 60 value 3.043162
final value 3.043162
converged
initial value 3.044901
iter 2 value 3.044851
iter 3 value 3.044813
iter 4 value 3.044773
iter 5 value 3.044768
iter 6 value 3.044759
iter 7 value 3.044754
iter 8 value 3.044752
iter 9 value 3.044751
iter 10 value 3.044749
iter 11 value 3.044748
iter 12 value 3.044748
iter 12 value 3.044748
iter 12 value 3.044748
final value 3.044748
converged
unemp.fit
$fit
Call:
stats::arima(x = xdata, order = c(p, d, q), seasonal = list(order = c(P, D,
Q), period = S), xreg = constant, optim.control = list(trace = trc, REPORT = 1,
reltol = tol))
Coefficients:
ar1 ar2 ar3 ma1 ma2 sma1 constant
1.0603 0.5256 -0.6158 0.0203 -0.3931 -0.6759 1.1704
s.e. 0.1569 0.2734 0.1413 0.1755 0.1516 0.0409 0.6518
sigma^2 estimated as 428.6: log likelihood = -1606.93, aic = 3229.85
$degrees_of_freedom
[1] 353
$ttable
Estimate SE t.value p.value
ar1 1.0603 0.1569 6.7589 0.0000
ar2 0.5256 0.2734 1.9227 0.0553
ar3 -0.6158 0.1413 -4.3582 0.0000
ma1 0.0203 0.1755 0.1154 0.9082
ma2 -0.3931 0.1516 -2.5936 0.0099
sma1 -0.6759 0.0409 -16.5363 0.0000
constant 1.1704 0.6518 1.7957 0.0734
$AIC
[1] 7.098133
$AICc
[1] 7.104576
$BIC
[1] 6.171876
All of the above diagnostic plots look good.
The diagnostics are satisfactory, so we adopt this as our model.
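The exercise also asks us to forecast the next 12 months; a minimal sketch using astsa's sarima.for with the same orders (this call was not part of the original output):
sarima.for(unemp, 12, 3, 0, 2, 0, 1, 1, 12)   # 12-month-ahead forecast with prediction error bands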