---
title: "Building a simple neural network using Keras and Tensorflow"
output: html_notebook
---

Thank you
----------

A big thank you to Leon Jessen for posting his code on GitHub.

[Building a simple neural network using Keras and Tensorflow](https://github.com/leonjessen/keras_tensorflow_on_iris/blob/master/README.md)

I have forked his project on GitHub and put his code into an R Notebook so we can run it in class.

Motivation
----------

The following is a minimal example for building your first simple artificial neural network using Keras and TensorFlow for R.

[TensorFlow for R by RStudio lives here](https://tensorflow.rstudio.com/keras/).

Getting started - Install Keras and TensorFlow for R
----------------------------------------------------

You can install the Keras for R package from CRAN as follows:

```{r eval=FALSE}
# install.packages("keras")
```

TensorFlow is the default backend engine. TensorFlow and Keras can be installed as follows:

```{r eval=FALSE}
# library(keras)
# install_keras()
```
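
Once that finishes, it can be worth confirming that R can actually see the backend; a minimal check, assuming the `keras` package installed cleanly:

```{r eval=FALSE}
# Should return TRUE once TensorFlow is installed and visible to R
library(keras)
is_keras_available()
```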

Naturally, we will also need the `tidyverse`:

```{r eval=FALSE}
# Install from CRAN
# install.packages("tidyverse")

# Or the development version from GitHub
# install.packages("devtools")
# devtools::install_github("hadley/tidyverse")
```

Once installed, we simply load the libraries:

library("keras")
library("tidyverse")

Artificial Neural Network Using the Iris Data Set
-------------------------------------------------

Right, let’s get to it!

### Data

The famous (Fisher's or Anderson's) `iris` data set contains a total of 150 observations of 4 input features, `Sepal.Length`, `Sepal.Width`, `Petal.Length` and `Petal.Width`, and 3 output classes, `setosa`, `versicolor` and `virginica`, with 50 observations in each class. The distributions of the feature values look like so:

```{r}
iris %>% as_tibble %>% gather(feature, value, -Species) %>%
  ggplot(aes(x = feature, y = value, fill = Species)) +
  geom_violin(alpha = 0.5, scale = "width") +
  theme_bw()
```

Our aim is to connect the 4 input features to the correct output class using an artificial neural network. For this task, we have chosen the following simple architecture with one input layer with 4 neurons (one for each feature), one hidden layer with 4 neurons and one output layer with 3 neurons (one for each class), all fully connected:

![architecture_visualisation.png](./img/architecture_visualisation.png)

Our artificial neural network will have a total of 35 parameters: 4 weights for each of the 4 input neurons connected to the hidden layer, plus an additional 4 for the hidden layer's bias neuron, and 3 weights for each of the 4 hidden neurons connected to the output layer, plus an additional 3 for the output layer's bias neuron, i.e. $4 \cdot 4 + 4 + 4 \cdot 3 + 3 = 35$.
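
As a quick sanity check, the same count can be computed directly in R (a trivial sketch; `n_in`, `n_hidden` and `n_out` are just illustrative names):

```{r}
# Dense layer parameters = weights + biases: (inputs * units) + units
n_in <- 4; n_hidden <- 4; n_out <- 3
(n_in * n_hidden + n_hidden) + (n_hidden * n_out + n_out) # 35
```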

### Prepare data

We start by lightly wrangling the `iris` data set, renaming and scaling the features and converting the character labels to numeric:

```{r}
set.seed(265509)
nn_dat <- iris %>% as_tibble %>%
  mutate(sepal_length = scale(Sepal.Length),
         sepal_width  = scale(Sepal.Width),
         petal_length = scale(Petal.Length),
         petal_width  = scale(Petal.Width),
         class_label  = as.numeric(Species) - 1) %>% # Yields 0, 1, 2
  select(sepal_length, sepal_width, petal_length, petal_width, class_label)
nn_dat %>% head(3)
```

Then, we create indices for splitting the iris data into a training and a test data set. We set aside 20% of the data for testing:

```{r}
test_fraction   <- 0.20
n_total_samples <- nrow(nn_dat)
n_train_samples <- ceiling((1 - test_fraction) * n_total_samples)
train_indices   <- sample(n_total_samples, n_train_samples)
n_test_samples  <- n_total_samples - n_train_samples
test_indices    <- setdiff(seq(1, n_total_samples), train_indices)
```
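
It is easy to get this kind of index bookkeeping wrong, so a quick check that the split behaves as intended does not hurt (a minimal sketch; with 150 rows and a 20% test fraction we expect 120 and 30):

```{r}
length(train_indices)                          # expect 120
length(test_indices)                           # expect 30
length(intersect(train_indices, test_indices)) # expect 0: the sets are disjoint
```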

Based on the indices, we can now create the training and test data:

```{r}
x_train <- nn_dat %>% select(-class_label) %>% as.matrix %>% .[train_indices,]
y_train <- nn_dat %>% pull(class_label) %>% .[train_indices] %>% to_categorical(3)
x_test  <- nn_dat %>% select(-class_label) %>% as.matrix %>% .[test_indices,]
y_test  <- nn_dat %>% pull(class_label) %>% .[test_indices] %>% to_categorical(3)
```
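
`to_categorical()` one-hot encodes the 0/1/2 labels into three columns, which is what the `categorical_crossentropy` loss below expects. A quick look at the resulting shapes (a sketch; the dimensions assume the 120/30 split above):

```{r}
dim(x_train)      # 120 x 4 feature matrix
dim(y_train)      # 120 x 3 one-hot label matrix
head(y_train, 3)  # each row has a single 1 marking the class
```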

### Set Architecture

With the data in place, we now set the architecture of our artificial neural network:

```{r}
model <- keras_model_sequential()
model %>% 
  layer_dense(units = 4, activation = 'relu', input_shape = 4) %>% 
  layer_dense(units = 3, activation = 'softmax')
model %>% summary
```

```
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_20 (Dense)             (None, 4)                 20        
_________________________________________________________________
dense_21 (Dense)             (None, 3)                 15        
=================================================================
Total params: 35
Trainable params: 35
Non-trainable params: 0
_________________________________________________________________
```

Next, the architecture set in the model needs to be compiled. The `categorical_crossentropy` loss matches the one-hot encoded targets we created with `to_categorical()`:

```{r}
model %>% compile(
  loss      = 'categorical_crossentropy',
  optimizer = optimizer_rmsprop(),
  metrics   = c('accuracy')
)
```

### Train the Artificial Neural Network

Lastly, we fit the model and save the training progress in the `history` object:

```{r}
history <- model %>% fit(
  x = x_train, y = y_train,
  epochs = 200,
  batch_size = 20,
  validation_split = 0
)
```

```
Epoch 1/200
120/120 [==============================] - 0s 3ms/step - loss: 1.2111 - acc: 0.3250
...
Epoch 200/200
120/120 [==============================] - 0s 123us/step - loss: 0.1788 - acc: 0.9333
```

```{r}
plot(history) +
  ggtitle("Training a neural network based classifier on the iris data set") +
  theme_bw()
```

### Evaluate Network Performance

The final performance can be obtained like so:

```{r}
perf <- model %>% evaluate(x_test, y_test)
print(perf)
```

```
$loss
[1] 0.1695985

$acc
[1] 0.95
```
```{r}
classes <- iris %>% as_tibble %>% pull(Species) %>% unique
y_pred  <- model %>% predict_classes(x_test)
y_true  <- nn_dat %>% pull(class_label) %>% .[test_indices]

tibble(y_true = classes[y_true + 1], y_pred = classes[y_pred + 1],
       Correct = ifelse(y_true == y_pred, "Yes", "No") %>% factor) %>% 
  ggplot(aes(x = y_true, y = y_pred, colour = Correct)) +
  geom_jitter() +
  theme_bw() +
  ggtitle(label = "Classification Performance of Artificial Neural Network",
          subtitle = str_c("Accuracy = ", round(perf$acc, 3) * 100, "%")) +
  xlab(label = "True iris class") +
  ylab(label = "Predicted iris class")
```
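
Note that `predict_classes()` was removed from newer releases of the keras R package; if the call above errors on a current install, a sketch of the equivalent using plain `predict()`:

```{r eval=FALSE}
# predict() returns a matrix of class probabilities (one column per class);
# the predicted class is the column with the highest probability, shifted to 0-based
y_pred <- model %>% predict(x_test) %>% apply(1, which.max) - 1
```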

A cross-tabulation of predicted versus actual classes gives a more detailed picture:

```{r}
library(gmodels)
CrossTable(y_pred, y_true,
           prop.chisq = FALSE, prop.t = FALSE, prop.r = FALSE,
           dnn = c('predicted', 'actual'))
```

```
   Cell Contents
|-------------------------|
|                       N |
|           N / Col Total |
|-------------------------|

Total Observations in Table:  20 

             | actual 
   predicted |         0 |         1 |         2 | Row Total | 
-------------|-----------|-----------|-----------|-----------|
           0 |        12 |         0 |         0 |        12 | 
             |     1.000 |     0.000 |     0.000 |           | 
-------------|-----------|-----------|-----------|-----------|
           1 |         0 |         5 |         1 |         6 | 
             |     0.000 |     1.000 |     0.333 |           | 
-------------|-----------|-----------|-----------|-----------|
           2 |         0 |         0 |         2 |         2 | 
             |     0.000 |     0.000 |     0.667 |           | 
-------------|-----------|-----------|-----------|-----------|
Column Total |        12 |         5 |         3 |        20 | 
             |     0.600 |     0.250 |     0.150 |           | 
-------------|-----------|-----------|-----------|-----------|
```

### Conclusion

I hope this illustrated just how easy it is to get started building artificial neural networks using Keras and TensorFlow in R. With relative ease, we created a 3-class predictor with an accuracy of 95% on the held-out test data. This was a basic minimal example; the network can be expanded to create deep learning networks, and the entire TensorFlow API is also available.

Enjoy and Happy Learning!

Leon

**Thanks again Leon, this was awesome!**
