This week, we will be discussing neural networks, or neural nets. These are based on simple mathematical models of the brain and allow for complex nonlinear relationships between the response and predictor variables. One can think of a neural net as a network of “neurons” organized in layers. Each input has an attached coefficient, or “weight”. In the simplest case, forecasts are obtained from a linear combination of the inputs, and the weights are selected within the neural net framework by a “learning algorithm” that minimizes a cost function. For our purposes, the inputs are lagged values of the time series.

This time, we will be observing Consumer Price Index (CPI) data for the period 12/2009-12/2019.

require(readxl)    # needed for read_excel()
require(forecast)
require(fpp2)
require(ggplot2)

# Monthly CPI (FRED series CPIAUCSL), starting December 2009
CPIAUCSL_1_ <- read_excel("C:/Users/abact/Downloads/CPIAUCSL (1).xls")
View(CPIAUCSL_1_)
cpi <- ts(CPIAUCSL_1_$CPIAUCSL, start=c(2009, 12), frequency = 12)
autoplot(cpi) + xlab("Time, Year")

summary(cpi)
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
  217.2   228.8   237.0   236.9   244.2   258.4 

We see that there is a strong upward trend, and the series does not appear to be much affected by seasonality. Returning to neural nets: they consist of an initial input layer of weighted regression values. One can use any variables believed to be correlated with the response variable for this layer; again, for our purposes, we will use lagged observations. Weights for these inputs are determined automatically by a learning algorithm. There may then be a “hidden layer” that these inputs feed into, should the model require it. When a hidden layer is present, the model is classified as a “multi-layer feed-forward network”, where each layer of nodes receives inputs from the previous layer. This is essentially a chain in which the outputs of one layer serve as inputs to the next, with each connection carrying a weight.

Each node in the hidden layer is calculated as the sum of all weighted inputs: node j takes the value z_j = b_j + sum_i w_{i,j} x_i, where w_{i,j} is the weight applied to input x_i at node j and b_j is a bias term; a nonlinear activation function (such as the sigmoid) is then applied to z_j. A model with no hidden layer is therefore structurally equivalent to a linear autoregression. We can also include seasonal lags, with seasonal period m = 12 for monthly data, so that seasonality is captured.
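As a concrete illustration of that formula, here is a minimal sketch of a single hidden node in R. The lagged values, weights, and bias below are made up for illustration and are not taken from any fitted model:

# Hypothetical lagged CPI inputs and weights (illustration only)
x <- c(236.5, 236.1, 235.8)   # three lagged values x_i
w <- c(0.5, 0.3, 0.2)         # weights w_{i,j} for hidden node j
b <- 0.1                      # bias term b_j

z <- b + sum(w * x)           # weighted sum of inputs at node j
s <- 1 / (1 + exp(-z))        # sigmoid activation squashes z into (0, 1)
s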

Using neural network autoregression (NNAR), we can automate the choice of the number of lagged inputs p, the number of seasonal lags P, and the number of nodes k in the hidden layer. This results in a model of the form NNAR(p,P,k)[m], where m is the seasonal period.
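Although we let nnetar() choose p, P, and k automatically below, they can also be set by hand through its p, P, and size arguments. A quick sketch, with values that are arbitrary rather than recommended:

# Manually specify an NNAR(2,1,3)[12]: 2 lags, 1 seasonal lag, 3 hidden nodes
fit_manual <- nnetar(cpi, p = 2, P = 1, size = 3, lambda = 0)
fit_manual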

# lambda = 0 applies a Box-Cox log transformation, keeping forecasts positive
cpi.nnar <- nnetar(cpi, lambda=0)
nnar.pred <- forecast(cpi.nnar, h=12)
summary(nnar.pred)

Forecast method: NNAR(1,1,2)[12]

Model Information:

Average of 20 networks, each of which is
a 2-2-1 network with 9 weights
options were - linear output units 

Error measures:
                      ME      RMSE       MAE           MPE      MAPE       MASE      ACF1
Training set 0.000414096 0.4239639 0.3292742 -0.0001493837 0.1383185 0.07868945 0.3083911

autoplot(nnar.pred, PI=TRUE)

I can’t seem to get prediction intervals to work, but I believe this model gives a reasonably accurate one-year forecast.
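One likely culprit: prediction intervals for nnetar models are computed by simulation and are off by default, so PI=TRUE needs to be passed to forecast() itself, not only to autoplot(). A sketch of the fix (nnar.pred.pi is my own name for the new object):

# Request simulated prediction intervals at forecast time, then plot them
nnar.pred.pi <- forecast(cpi.nnar, h = 12, PI = TRUE)
autoplot(nnar.pred.pi)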

We see that we get the automated model NNAR(1,1,2)[12]. This corresponds to a single lagged input, a single seasonal lag, and two nodes in the hidden layer. It has an extremely small RMSE and ME on the training set. Of course, this model does not take into account the negative shock of the coronavirus, especially given the mass layoffs that occurred.

We may want to attempt a model that incorporates the unemployment rate, following the theory laid down by the Phillips Curve. If that model can capture the recent turbulence in inflation better than the univariate neural net can, that would be interesting.

# Reload CPI and the unemployment rate (FRED series UNRATE) as monthly series
cpi <- read.csv("C:/Users/abact/Downloads/CPIAUCSL (3).csv")
cpi <- ts(cpi$CPIAUCSL, start=c(2009,12), frequency=12)
unem <- read.csv("C:/Users/abact/Downloads/UNRATE (4).csv")
unem <- ts(unem$UNRATE, start=c(2009,12), frequency=12)

# Fit NNAR with unemployment as an external regressor
phillip.nnar <- nnetar(cpi, lambda=0, xreg=unem)
# Note: when xreg is supplied to forecast(), the horizon is taken from the
# rows of xreg (h is ignored), so the historical unem series is being reused
# here as the assumed future path of the regressor
phillip.pred <- forecast(phillip.nnar, xreg=unem, h=12)
summary(phillip.pred)
summary(phillip.pred)

Forecast method: NNAR(1,1,2)[12]

Model Information:

Average of 20 networks, each of which is
a 3-2-1 network with 11 weights
options were - linear output units 

Error measures:
                       ME      RMSE     MAE           MPE      MAPE       MASE      ACF1
Training set 0.0002767027 0.4400133 0.34096 -0.0002382096 0.1421984 0.08272387 0.3175262

autoplot(phillip.pred, PI=TRUE)
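As noted in the chunk above, passing the historical unem series as xreg means the model “forecasts” over the regressor’s own sample. A cleaner approach is to supply an explicit future path for unemployment. A minimal sketch, where holding the last observed rate flat for a year is a made-up scenario of my own, not data:

# Hypothetical scenario: last observed unemployment rate held flat for 12 months
future_unem <- rep(tail(unem, 1), 12)
phillip.pred12 <- forecast(phillip.nnar, xreg = future_unem)
autoplot(phillip.pred12)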

We do come across a difficult question. We want to evaluate the accuracy of our model using accurate, up-to-date input variables, but we cannot calculate the accuracy between observed and forecast values when the forecast horizon extends beyond the available data. That being said, the model does an admirable job of tracking the most recent values, including the negative shock. Redoing the univariate model with up-to-date observations…

# Refit the univariate NNAR on the updated CPI series
cpi.nnar <- nnetar(cpi, lambda=0)
nnar.pred <- forecast(cpi.nnar, h=12)
summary(nnar.pred)

Forecast method: NNAR(1,1,2)[12]

Model Information:

Average of 20 networks, each of which is
a 2-2-1 network with 9 weights
options were - linear output units 

Error measures:
                      ME      RMSE       MAE           MPE      MAPE       MASE      ACF1
Training set 0.000414641 0.4936522 0.3679964 -0.0002474709 0.1527091 0.08928346 0.3648081

autoplot(nnar.pred, PI=TRUE)

I would say that the Phillips Curve model does a better job of forecasting through random shocks.
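To make that comparison less impressionistic, a holdout evaluation would help. A sketch, assuming cpi and unem now extend past 2019 so the post-2019 months can serve as a test set (the train_/test_/fit_/fc_ names are my own):

# Train on data through 2019, hold out everything after as a test set
train_cpi  <- window(cpi,  end = c(2019, 12))
train_unem <- window(unem, end = c(2019, 12))
test_cpi   <- window(cpi,  start = c(2020, 1))
test_unem  <- window(unem, start = c(2020, 1))

fit_uni <- nnetar(train_cpi, lambda = 0)
fit_pc  <- nnetar(train_cpi, lambda = 0, xreg = train_unem)

fc_uni <- forecast(fit_uni, h = length(test_cpi))
fc_pc  <- forecast(fit_pc, xreg = test_unem)   # horizon set by rows of xreg

# Compare out-of-sample error measures for the two models
accuracy(fc_uni, test_cpi)
accuracy(fc_pc,  test_cpi)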

Anyway, that was fun. The Phillips Curve model does well, but not so much better that it beats the convenience of simply using lagged values.
