2-layer hidden network I

This ANN tries to predict smartphone type from several demographic characteristics and personality traits: gender, age, honesty-humility, extraversion, conscientiousness, avoidance of similarity, emotionality, agreeableness, socio-economic status, how long the current device has been owned, and whether the phone is perceived as a status object. Because there are just two classes (iPhone vs. Android), a 2-layer hidden network would probably work. The fit is not deterministic (the weights are initialized randomly, so every run converges to a slightly different solution), which means accuracy varies from run to run. With the neuralnet package, the overall accuracy is roughly 58%.

[1] 0.5764192
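
A minimal sketch of how such a model might be fit with neuralnet is shown below. The data frame name (phones), the column names, the 70/30 split, and the hidden = 2 architecture are assumptions rather than the original code, and all predictors are assumed to be numeric already.

library(neuralnet)

set.seed(1)
phones$iphone <- as.numeric(phones$phone_type == "iPhone")  # 0/1 outcome (assumed coding)

idx   <- sample(nrow(phones), round(0.7 * nrow(phones)))    # assumed 70/30 split
train <- phones[idx, ]
test  <- phones[-idx, ]

frm <- iphone ~ gender + age + honesty_humility + extraversion +
  conscientiousness + avoidance_similarity + emotionality +
  agreeableness + ses + time_owned + status_object

nn <- neuralnet(frm, data = train,
                hidden        = 2,      # hidden architecture is an assumption
                linear.output = FALSE)  # logistic output for classification

pred <- compute(nn, test[, all.vars(frm)[-1]])$net.result   # covariates in formula order
mean((pred > 0.5) == test$iphone)                           # overall accuracy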

2-layer hidden network II

Fitting the model with the nnet package raises the overall accuracy to 68%, although it classifies Android phones noticeably better than iPhones (86% vs. 55%). As with other heuristic optimization procedures, the fit can get stuck in a local optimum, so it is useful to run the model several times and keep the best one. As the histogram below shows, across 100 runs the best fits reach about 70% accuracy.

Overall accuracy = 0.681 

Confusion matrix 
         Predicted (cv)
Actual    Android iPhone
  Android   0.863  0.137
  iPhone    0.448  0.552
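
Below is a sketch of the repeated-fit procedure with nnet, producing the kind of histogram described above. It reuses the phones data, the frm formula, and the train/test split from the previous sketch; size = 2, maxit = 500, and the 0/1 outcome coding are assumptions, not the original settings.

library(nnet)

fit_once <- function() {
  ## factor() on the left-hand side makes nnet treat this as classification
  nnet(update(frm, factor(iphone) ~ .), data = train,
       size = 2, maxit = 500, trace = FALSE)
}

## One fit: overall accuracy and a row-normalized confusion matrix
fit  <- fit_once()
pred <- predict(fit, newdata = test, type = "class")
mean(pred == test$iphone)
prop.table(table(Actual = test$iphone, Predicted = pred), margin = 1)

## 100 refits from different random starting weights
accs <- replicate(100, {
  p <- predict(fit_once(), newdata = test, type = "class")
  mean(p == test$iphone)
})
hist(accs, main = "Accuracy over 100 nnet fits", xlab = "Overall accuracy")
max(accs)   # best of the 100 runs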

6-node hidden network

Is it possible to do any better, on average, with more hidden nodes? Increasing the hidden layer to six nodes raises the overall accuracy to above 70%, but a cross-validation scheme is also needed to guard against overfitting.
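
Assuming the same objects as in the sketches above, the only change needed in the nnet call is the size of the hidden layer:

fit6 <- nnet(update(frm, factor(iphone) ~ .), data = train,
             size = 6,            # six hidden units instead of two
             maxit = 500, trace = FALSE)
mean(predict(fit6, newdata = test, type = "class") == test$iphone)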

Cross-validation

You can fit the 6-node model 100 times and see how well it does on held-out data. As is usually the case, accuracy drops on the test set, to about 60%.
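
One simple way to do this is repeated hold-out validation: refit the model on a fresh random split each time and score it only on the held-out part. The sketch below reuses the phones data and the frm formula defined earlier; the 70/30 split and the 100 repetitions mirror the text, while the other settings are assumptions.

## Repeated hold-out validation: fresh random split, refit, score on held-out data
cv_acc <- replicate(100, {
  idx <- sample(nrow(phones), round(0.7 * nrow(phones)))
  fit <- nnet(update(frm, factor(iphone) ~ .), data = phones[idx, ],
              size = 6, maxit = 500, trace = FALSE)
  p   <- predict(fit, newdata = phones[-idx, ], type = "class")
  mean(p == phones$iphone[-idx])
})
summary(cv_acc)                       # centers around the ~60% reported above
hist(cv_acc, main = "Held-out accuracy over 100 splits", xlab = "Accuracy")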