Neural Net

Libraries

library(MASS)
library(neuralnet)
library(caTools)
library(ggplot2)

Data

The Boston dataset (from the MASS package) contains housing features for the Boston area; the goal is to predict medv, the median home value. The data has 506 observations of 14 variables and contains no NA values.

head(Boston)
##      crim zn indus chas   nox    rm  age    dis rad tax ptratio  black
## 1 0.00632 18  2.31    0 0.538 6.575 65.2 4.0900   1 296    15.3 396.90
## 2 0.02731  0  7.07    0 0.469 6.421 78.9 4.9671   2 242    17.8 396.90
## 3 0.02729  0  7.07    0 0.469 7.185 61.1 4.9671   2 242    17.8 392.83
## 4 0.03237  0  2.18    0 0.458 6.998 45.8 6.0622   3 222    18.7 394.63
## 5 0.06905  0  2.18    0 0.458 7.147 54.2 6.0622   3 222    18.7 396.90
## 6 0.02985  0  2.18    0 0.458 6.430 58.7 6.0622   3 222    18.7 394.12
##   lstat medv
## 1  4.98 24.0
## 2  9.14 21.6
## 3  4.03 34.7
## 4  2.94 33.4
## 5  5.33 36.2
## 6  5.21 28.7
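
A quick check of the dimensions and missing values:

dim(Boston)        # should return 506 14
any(is.na(Boston)) # should return FALSE - no missing values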

Preprocess Data

Data Normalization

Data should be normalized for a neural net. Here each feature is rescaled to the [0, 1] range with min-max scaling, (x - min) / (max - min); if features are left on very different scales, the network may train poorly or give unreliable results.

maxs <- apply(Boston, MARGIN = 2, max)
mins <- apply(Boston, MARGIN = 2, min)

scaled_data <- scale(Boston, center = mins, scale = maxs-mins)

# convert the data back to a data frame instead of keeping it as a matrix
scaled_data <- as.data.frame(scaled_data)

Checking scaled data:

head(scaled_data)
##           crim   zn      indus chas       nox        rm       age
## 1 0.0000000000 0.18 0.06781525    0 0.3148148 0.5775053 0.6416066
## 2 0.0002359225 0.00 0.24230205    0 0.1728395 0.5479977 0.7826982
## 3 0.0002356977 0.00 0.24230205    0 0.1728395 0.6943859 0.5993821
## 4 0.0002927957 0.00 0.06304985    0 0.1502058 0.6585553 0.4418126
## 5 0.0007050701 0.00 0.06304985    0 0.1502058 0.6871048 0.5283213
## 6 0.0002644715 0.00 0.06304985    0 0.1502058 0.5497222 0.5746653
##         dis        rad        tax   ptratio     black      lstat      medv
## 1 0.2692031 0.00000000 0.20801527 0.2872340 1.0000000 0.08967991 0.4222222
## 2 0.3489620 0.04347826 0.10496183 0.5531915 1.0000000 0.20447020 0.3688889
## 3 0.3489620 0.04347826 0.10496183 0.5531915 0.9897373 0.06346578 0.6600000
## 4 0.4485446 0.08695652 0.06679389 0.6489362 0.9942761 0.03338852 0.6311111
## 5 0.4485446 0.08695652 0.06679389 0.6489362 1.0000000 0.09933775 0.6933333
## 6 0.4485446 0.08695652 0.06679389 0.6489362 0.9929901 0.09602649 0.5266667
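
After min-max scaling, every column should fall in the [0, 1] range; a quick check:

apply(scaled_data, MARGIN = 2, range) # each column's min should be 0 and max should be 1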

Model Training

# a seed (set.seed) could be set here to make the split reproducible
split <- sample.split(scaled_data$medv, SplitRatio = 0.7)
train <- subset(scaled_data, split == TRUE)
test <- subset(scaled_data, split == FALSE)
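
A quick look at the resulting split sizes:

nrow(train) # roughly 70% of the 506 rows
nrow(test)  # the remaining rows, roughly 30%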

Build a formula that lists all of the predictor names explicitly, since neuralnet does not accept the y ~ . shorthand (unlike other models).

n <- names(train)
var <- as.formula(paste("medv ~", paste(n[!n %in% "medv"], collapse = " + ")))
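
For the columns shown above, this expands to medv ~ crim + zn + indus + chas + nox + rm + age + dis + rad + tax + ptratio + black + lstat.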

Neural Net Model

nn <- neuralnet(var, data = train, hidden = c(5, 3), linear.output = TRUE) # two hidden layers (5 and 3 neurons); linear output for predicting a continuous variable

Plot of Neural Net

Black lines - connections between the layers, labeled with the weight of each connection

Blue lines - the bias term added at each step

plot(nn)

Predictions with NN Model

predicted <- compute(nn, test[1:13]) # drop the label column (medv) before predicting
unscaled.pred <- predicted$net.result * (max(Boston$medv) - min(Boston$medv)) + min(Boston$medv) # undo the min-max scaling on the predictions
test.r <- test$medv * (max(Boston$medv) - min(Boston$medv)) + min(Boston$medv) # undo the scaling on the true test values as well

MSE of Model

mse.nn <- sum((test.r - unscaled.pred)^2) / nrow(test)
mse.nn
## [1] 24.90327696
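
Since medv is in thousands of dollars, the square root of the MSE is easier to interpret:

sqrt(mse.nn) # roughly 5, i.e. a typical error of about $5,000 in median home value
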
df.error <- data.frame(test.r, unscaled.pred)
ggplot(df.error, aes(test.r, unscaled.pred)) +
  geom_point() +
  stat_smooth()
## `geom_smooth()` using method = 'loess' and formula 'y ~ x'

Nicholas Schettini

11/25/2018