Let's look at the table of reviews from four businesses: a high end spa, a low cost grocery store, and two different chiropractors' offices in or around Corona, CA. This is based on the cleaned up reviews of these businesses from a social media site, where "cleaned up" just means filtering out the business ads, the date, the user photos, the user rating, and so on from the header of each review. I kept only the columns I might be interested in and saved them as 'userReviewBusinessTypeRating.csv'. We will need to tokenize the words and put them into a format that lets us predict the ratings from the tokenized reviews. These reviews are rated on a scale of 1 through 5, with 5 being the best experience and 1 the worst.
library(reticulate)
## Warning: package 'reticulate' was built under R version 3.6.3
conda_list(conda = "auto")
## name python
## 1 Anaconda2 C:\\Users\\m\\Anaconda2\\python.exe
## 2 djangoenv C:\\Users\\m\\Anaconda2\\envs\\djangoenv\\python.exe
## 3 python36 C:\\Users\\m\\Anaconda2\\envs\\python36\\python.exe
## 4 python37 C:\\Users\\m\\Anaconda2\\envs\\python37\\python.exe
## 5 r-reticulate C:\\Users\\m\\Anaconda2\\envs\\r-reticulate\\python.exe
use_condaenv(condaenv = "python36")
import pandas as pd
import matplotlib.pyplot as plt
from textblob import TextBlob
import sklearn
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import classification_report, f1_score, accuracy_score, confusion_matrix
import re
import string
import nltk
np.random.seed(47)
modes = pd.read_csv('userReviewBusinessTypeRating.csv', encoding = 'unicode_escape')
print(modes.head())
## userReviewOnlyContent ... businessType
## 0 What a wonderful way to start the year! This ... ... high end massage retreat
## 1 My sister and I brought my mom here for her b... ... high end massage retreat
## 2 I came to CHIROPRACTIC with severe back and ne... ... chiropractic
## 3 I have to say.... This is by far the best Chir... ... chiropractic
## 4 Dr. is my chiropractor and he is a fabulous i... ... chiropractic
##
## [5 rows x 5 columns]
print(modes.columns)
## Index(['userReviewOnlyContent', 'userRatingValue', 'userReviewContent',
## 'LowAvgHighCost', 'businessType'],
## dtype='object')
print(modes['userRatingValue'].unique())
## [5 4 1 3 2]
import numpy as np
modes = modes.reindex(np.random.permutation(modes.index))
print(modes.head())
## userReviewOnlyContent ... businessType
## 551 Still no update by this facility, don't think ... ... high end massage retreat
## 340 It's a pretty cool nice place from what I can... ... high end massage retreat
## 474 Imagine planning a family event for the last t... ... high end massage retreat
## 7 has been treating myself, family and friends... ... chiropractic
## 239 Love the deli department , cheap fast food for... ... grocery store
##
## [5 rows x 5 columns]
print(modes.tail())
## userReviewOnlyContent ... businessType
## 23 I booked a Winter Warm Up package at HIGH END... ... high end massage retreat
## 584 Greatly run business. DOCTOR and his staff a... ... chiropractic
## 264 Good prices for many items. Just be very care... ... grocery store
## 327 Great birthday experience here. I know they o... ... high end massage retreat
## 135 I went to see Dr. after a friend referred me.... ... chiropractic
##
## [5 rows x 5 columns]
modes.groupby('userRatingValue').describe()
## userReviewOnlyContent ... businessType
## count unique ... top freq
## userRatingValue ...
## 1 88 75 ... high end massage retreat 53
## 2 34 32 ... high end massage retreat 28
## 3 54 48 ... high end massage retreat 34
## 4 103 93 ... high end massage retreat 62
## 5 335 318 ... chiropractic 204
##
## [5 rows x 16 columns]
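The ratings are heavily skewed: 335 of the 614 reviews are 5-star, while the 2- and 3-star classes have only 34 and 54 reviews. A quick bar chart makes the imbalance easy to see; this uses the pandas and matplotlib imports from above and is only a side check, not part of the modeling pipeline.
# side check: visualize the class imbalance in the ratings
rating_counts = modes['userRatingValue'].value_counts().sort_index()
rating_counts.plot(kind='bar')
plt.xlabel('rating')
plt.ylabel('number of reviews')
plt.title('review counts per rating')
plt.show()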
stopwords = nltk.corpus.stopwords.words('english')
ps=nltk.PorterStemmer()
wn=nltk.WordNetLemmatizer()
def lemmatize(text):
    # lowercase the characters and drop punctuation, then split on non-word characters
    text = "".join([word.lower() for word in text if word not in string.punctuation])
    tokens = re.split(r'\W+', text)
    # drop stopwords, lemmatize what remains, and rejoin into a single string
    text = " ".join([wn.lemmatize(word) for word in tokens if word not in stopwords])
    return text
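If the NLTK corpora are not already installed locally, the stopword list and WordNet lemmatizer above will raise a LookupError; a one-time download fixes that (standard nltk.download calls, shown here as a reminder rather than as part of the original run).
# one-time downloads required by nltk.corpus.stopwords and WordNetLemmatizer
nltk.download('stopwords')
nltk.download('wordnet')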
modes['lemmatizedReviews']=modes['userReviewOnlyContent'].apply(lambda x: lemmatize(x))
modes.columns
## Index(['userReviewOnlyContent', 'userRatingValue', 'userReviewContent',
## 'LowAvgHighCost', 'businessType', 'lemmatizedReviews'],
## dtype='object')
modes.head()
## userReviewOnlyContent ... lemmatizedReviews
## 551 Still no update by this facility, don't think ... ... still update facility dont think ill ever go b...
## 340 It's a pretty cool nice place from what I can... ... pretty cool nice place tell next morning woke...
## 474 Imagine planning a family event for the last t... ... imagine planning family event last three month...
## 7 has been treating myself, family and friends... ... treating family friend many year drive long b...
## 239 Love the deli department , cheap fast food for... ... love deli department cheap fast food student g...
##
## [5 rows x 6 columns]
from sklearn.model_selection import train_test_split
X_train,X_test,y_train,y_test=train_test_split(modes[['userReviewOnlyContent','lemmatizedReviews']],modes['userRatingValue'],test_size=0.15)
X_train.head()
## userReviewOnlyContent lemmatizedReviews
## 526 Imagine planning a family event for the last t... imagine planning family event last three month...
## 174 Produce is absolutely disgusting and rotten al... produce absolutely disgusting rotten always st...
## 129 I took my husband for a chiro adjustment and h... took husband chiro adjustment wont stop talkin...
## 560 I've been coming here for over The weekends ar... ive coming weekend least favorite time visit c...
## 304 Had a wonderful day with a friend a few month... wonderful day friend month ago loved pamperin...
from sklearn.feature_extraction.text import CountVectorizer
y_train.head()
## 526 1
## 174 1
## 129 5
## 560 1
## 304 4
## Name: userRatingValue, dtype: int64
n_gram3_vect=CountVectorizer(ngram_range=(1,3))
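As a quick illustration of what ngram_range=(1,3) does, here is a toy example on an invented sentence; the vectorizer emits every unigram, bigram, and trigram as its own feature.
# toy example (not part of the pipeline): features built by a (1,3) n-gram vectorizer
toy_vect = CountVectorizer(ngram_range=(1, 3))
toy_vect.fit(['great deep tissue massage'])
print(toy_vect.get_feature_names())
# ['deep', 'deep tissue', 'deep tissue massage', 'great', 'great deep',
#  'great deep tissue', 'massage', 'tissue', 'tissue massage']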
type(X_train['lemmatizedReviews'])
## <class 'pandas.core.series.Series'>
X_train['lemmatizedReviews'].head()
## 526 imagine planning family event last three month...
## 174 produce absolutely disgusting rotten always st...
## 129 took husband chiro adjustment wont stop talkin...
## 560 ive coming weekend least favorite time visit c...
## 304 wonderful day friend month ago loved pamperin...
## Name: lemmatizedReviews, dtype: object
n_gram3_vect_fit=n_gram3_vect.fit(X_train['lemmatizedReviews'])
n_gram3_train=n_gram3_vect_fit.transform(X_train['lemmatizedReviews'])
n_gram3_test=n_gram3_vect_fit.transform(X_test['lemmatizedReviews'])
print(len(n_gram3_vect_fit.get_feature_names()))
## 52613
Ngram3 = n_gram3_vect_fit.get_feature_names()
print(Ngram3[0:350])
## ['058', '058 078', '058 078 098', '078', '078 098', '078 098 shaker', '098', '098 shaker', '098 shaker depending', '10', '10 copay', '10 copay kaiser', '10 discount', '10 discount since', '10 got', '10 got let', '10 massage', '10 massage mall', '10 morning', '10 morning hard', '10 pack', '10 pack breast', '10 per', '10 per meal', '10 pound', '10 pound lighter', '100', '100 arrived', '100 arrived busy', '100 better', '100 better amazed', '100 better could', '100 every', '100 every time', '100 friend', '100 friend 295', '100 general', '100 general admittance', '100 grotto', '100 grotto skin', '100 per', '100 per person', '1000', '1000 word', '1000 word made', '1050', '1050 put', '1050 put card', '10am', '10am lol', '10am lol remember', '11', '11 set', '11 set inefficient', '11 well', '11 well told', '110', '110 aroma', '110 aroma therapy', '110219', '110219 nothing', '110219 nothing negative', '112', '112 photo', '1130', '1130 make', '1130 make reservation', '1130pm', '1130pm july', '1130pm july 19th', '11a7p', '11a7p didnt', '11a7p didnt even', '11am', '11am well', '11am well told', '12', '12 15', '12 15 year', '12 2019', '12 2019 husband', '12 appts', '12 appts wth', '12 gallon', '12 gallon milk', '12 grape', '12 grape 774', '12 place', '12 place get', '12 way', '12 way got', '12 year', '12 year ago', '12 year old', '12 yo', '12 yo sister', '121519', '121519 hadnt', '121519 hadnt year', '123', '123 world', '123 world class', '127', '127 back', '127 back problem', '12day', '12day always', '12day always fun', '13', '13 pound', '13 pound package', '135', '135 massage', '135 massage realize', '14', '14 star', '14 star could', '148', '148 carton', '148 carton 218', '149', '149 friend', '149 friend 66', '14yo', '14yo son', '14yo son said', '15', '15 felt', '15 felt little', '15 freeway', '15 freeway two', '15 min', '15 min go', '15 min shop', '15 minuet', '15 minuet early', '15 minute', '15 minute prior', '15 people', '15 people front', '15 year', '15 year ago', '159', '159 facial', '159 facial thorough', '15drink', '15drink pre', '15drink pre mixed', '16', '16 1130pm', '16 1130pm july', '165', '165 great', '165 great deal', '17', '17 pool', '17 pool club', '18', '18 price', '18 price soggy', '18 scrumptious', '18 scrumptious salad', '18wheeler', '18wheeler doctor', '18wheeler doctor entire', '19', '19 buck', '19 buck burger', '1996', '1996 math', '1996 math 21', '199lb', '199lb 10', '199lb 10 pack', '19th', '19th even', '19th even worse', '1hr', '1hr massage', '1hr massage best', '1lb', '1lb cherry', '1lb cherry 988', '1st', '1st foremost', '1st foremost amount', '1st place', '1st place use', '1st place well', '1st time', '1st time enjoyed', '1st time needed', '20', '20 additional', '20 additional charge', '20 hamburger', '20 hamburger patty', '20 min', '20 min end', '20 minute', '20 minute able', '20 minute even', '20 minute finally', '20 minute opening', '20 per', '20 per person', '20 prepared', '20 prepared spend', '20 recommend', '20 recommend able', '20 visit', '20 visit per', '20 year', '20 year made', '20 year opportunity', '200', '200 even', '200 even snobby', '2011', '2011 doctor', '2011 doctor patient', '2016', '2019', '2019 husband', '2019 husband michael', '2020', '2020 really', '2020 really look', '2030', '2030 min', '2030 min line', '20min', '20min get', '20min get checked', '20minute', '20minute massage', '20minute massage suppose', '21', '21 year', '21 year old', '218', '218 chobani', '218 chobani flip', '22', '22 needle', '22 needle first', '23', '23 lettuce', '23 lettuce 
smh', '24', '24 hour', '24 hour come', '24 hour convenient', '24 hour day', '24 hour dont', '24 hour get', '24 hour great', '24 hour hubby', '24 hour item', '24 hour low', '24 hour notice', '24 pack', '24 pack chobani', '247', '247 decent', '247 decent price', '247 literally', '247 literally go', '247 meet', '247 meet grocery', '24hr', '24hr grocery', '24hr grocery store', '24hrs', '24hrs would', '24hrs would give', '25', '25 add', '25 add perfect', '25 buck', '25 buck per', '25 cancellation', '25 cancellation fee', '25 minute', '25 minute closest', '25 portion', '25 portion huge', '25 red', '25 red clay', '25 year', '25 year barrestaurant', '250', '250 month', '250 month comparatively', '259', '259 cent', '259 cent every', '27', '27 bring', '27 bring save', '295', '295 review', '295 review 301', '2nd', '2nd charge', '2nd charge additional', '2nd incident', '2nd incident went', '2nd time', '2nd time getting', '2nd visit', '2nd visit 100', '30', '30 buck', '30 buck day', '30 drink', '30 drink 15drink', '30 extra', '30 extra crap', '30 grotto', '30 grotto included', '30 includes', '30 includes valet', '30 min', '30 min opening', '30 min reception', '30 minute', '30 minute away', '30 minute best', '30 minute late', '30 minute next', '30 per', '30 per person', '30 second', '30 second drive', '30 yummiest', '30 yummiest nacho', '300', '300 plus', '300 plus shopping', '301', '301 photo', '301 photo elite', '315', '315 said', '315 said one', '34', '34 time', '34 time per', '345']
n_gram3_train_df=pd.concat([X_train['lemmatizedReviews'].reset_index(drop=True),pd.DataFrame(n_gram3_train.toarray())],axis=1)
n_gram3_test_df=pd.concat([X_test['lemmatizedReviews'].reset_index(drop=True),pd.DataFrame(n_gram3_test.toarray())],axis=1)
n_gram3_train_df.head()
## lemmatizedReviews 0 1 ... 52610 52611 52612
## 0 imagine planning family event last three month... 0 0 ... 0 0 0
## 1 produce absolutely disgusting rotten always st... 0 0 ... 0 0 0
## 2 took husband chiro adjustment wont stop talkin... 0 0 ... 0 0 0
## 3 ive coming weekend least favorite time visit c... 0 0 ... 0 0 0
## 4 wonderful day friend month ago loved pamperin... 0 0 ... 0 0 0
##
## [5 rows x 52614 columns]
n_gram3_test_df.head()
## lemmatizedReviews 0 1 ... 52610 52611 52612
## 0 ive consistently last year mistake made start... 0 0 ... 0 0 0
## 1 place best ive coming year ive always accommod... 0 0 ... 0 0 0
## 2 place get vote grocery store year huge fresh ... 0 0 ... 0 0 0
## 3 imagine planning family event last three mont... 0 0 ... 0 0 0
## 4 new experience low cost grocery store actuall... 0 0 ... 0 0 0
##
## [5 rows x 52614 columns]
n_gram3_train3=pd.DataFrame(n_gram3_train.toarray())
n_gram3_test3=pd.DataFrame(n_gram3_test.toarray())
n_gram3_train3.columns=Ngram3
n_gram3_test3.columns=Ngram3
n_gram3_train3.head()
## 058 058 078 058 078 098 ... zumba zumba qi zumba qi gong
## 0 0 0 0 ... 0 0 0
## 1 0 0 0 ... 0 0 0
## 2 0 0 0 ... 0 0 0
## 3 0 0 0 ... 0 0 0
## 4 0 0 0 ... 0 0 0
##
## [5 rows x 52613 columns]
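Converting the document-term matrix to a dense DataFrame with toarray() works, but with 52,613 columns it is memory hungry. The dense tables are kept here because they also get written to CSV for R below, but as a side note, the random forest used later accepts the scipy sparse matrix directly; a minimal sketch of that leaner alternative (not run here):
# alternative sketch: fit directly on the sparse matrix instead of a dense DataFrame
from sklearn.ensemble import RandomForestClassifier
rf_sparse = RandomForestClassifier(n_estimators=50, n_jobs=-1)
rf_sparse.fit(n_gram3_train, y_train)   # n_gram3_train is still a scipy sparse matrix
sparse_preds = rf_sparse.predict(n_gram3_test)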
Write these tables of n-gram tokens, along with the training and test ratings, out to CSV files.
n_gram3_train3.to_csv('trigram_train_healthwell.csv', index=False)
n_gram3_test3.to_csv('trigram_test_healthWell.csv', index=False)
y_train.to_csv('healthWell_trainY.csv',index=False)
y_test.to_csv('healthWell_testY.csv',index=False)
Let's read this large file back in with RStudio and combine the data into one table. We will step outside Python and use R for this next chunk. These reads start to run slowly, so the chunk is shown commented out.
# ngrams123train <- read.csv('trigram_train_healthwell.csv', sep=',', header=TRUE,
# na.strings=c('',' ','NA'))
#
# ngrams123test <- read.csv('trigram_test_healthWell.csv', sep=',', header=TRUE,
# na.strings=c('',' ','NA'))
#
# ytrain <- read.csv('healthWell_trainY.csv', sep=',', header=FALSE,
# na.strings=c('',' ','NA'))
# colnames(ytrain) <- 'Rating'
#
# ytest <- read.csv('healthWell_testY.csv', sep=',', header=FALSE,
# na.strings=c('',' ','NA'))
# colnames(ytest) <- 'Rating'
#
# train <- cbind(ytrain,ngrams123train)
# test <- cbind(ytest,ngrams123test)
#
# ngrams123All <- rbind(train,test)
#
# write.csv(ngrams123All,'lemm123gramsHealthWellness.csv', row.names=FALSE)
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.metrics import precision_recall_fscore_support as score
import time
rf=RandomForestClassifier(n_estimators=50, max_depth=None, n_jobs=-1)
start=time.time()
rf_model=rf.fit(n_gram3_train3,y_train)
end=time.time()
fit_time=(end-start)
fit_time
## 7.292306423187256
start=time.time()
y_pred=rf_model.predict(n_gram3_test3)
end=time.time()
pred_time=(end-start)
pred_time
## 0.3497960567474365
prd = pd.DataFrame(y_pred)
prd.columns=['Predicted']
prd.index=y_test.index
pred=pd.concat([pd.DataFrame(prd),y_test],axis=1)
print(pred)
## Predicted userRatingValue
## 501 4 3
## 139 5 5
## 229 5 4
## 156 1 1
## 211 3 5
## .. ... ...
## 178 1 1
## 34 5 3
## 342 5 2
## 270 5 3
## 118 5 5
##
## [93 rows x 2 columns]
from sklearn.metrics import classification_report, f1_score, accuracy_score, confusion_matrix
print('accuracy', accuracy_score(y_test, y_pred))
## accuracy 0.6129032258064516
print('confusion matrix\n', confusion_matrix(y_test, y_pred))
## confusion matrix
## [[ 3 0 0 0 9]
## [ 0 0 0 0 4]
## [ 0 0 0 1 6]
## [ 1 1 2 3 9]
## [ 0 0 1 2 51]]
print('(row=expected, col=predicted)')
## (row=expected, col=predicted)
print(classification_report(y_test, y_pred))
## precision recall f1-score support
##
## 1 0.75 0.25 0.38 12
## 2 0.00 0.00 0.00 4
## 3 0.00 0.00 0.00 7
## 4 0.50 0.19 0.27 16
## 5 0.65 0.94 0.77 54
##
## accuracy 0.61 93
## macro avg 0.38 0.28 0.28 93
## weighted avg 0.56 0.61 0.54 93
gb=GradientBoostingClassifier(n_estimators=50,max_depth=5)
start=time.time()
gb_model=gb.fit(n_gram3_train3,y_train)
end=time.time()
fit_time=(end-start)
fit_time
## 1654.5893015861511
start=time.time()
y_pred=gb_model.predict(n_gram3_test3)
end=time.time()
pred_time=(end-start)
pred_time
## 0.19605112075805664
prd = pd.DataFrame(y_pred)
prd.columns=['Predicted']
prd.index=y_test.index
pred=pd.concat([pd.DataFrame(prd),y_test],axis=1)
print(pred)
## Predicted userRatingValue
## 501 4 3
## 139 5 5
## 229 4 4
## 156 1 1
## 211 3 5
## .. ... ...
## 178 1 1
## 34 5 3
## 342 5 2
## 270 4 3
## 118 5 5
##
## [93 rows x 2 columns]
from sklearn.metrics import classification_report, f1_score, accuracy_score, confusion_matrix
print('accuracy', accuracy_score(y_test, y_pred))
## accuracy 0.6344086021505376
print('confusion matrix\n', confusion_matrix(y_test, y_pred))
## confusion matrix
## [[ 5 0 0 0 7]
## [ 0 0 0 0 4]
## [ 0 0 0 3 4]
## [ 1 1 3 5 6]
## [ 1 1 1 2 49]]
print('(row=expected, col=predicted)')
## (row=expected, col=predicted)
print(classification_report(y_test, y_pred))
## precision recall f1-score support
##
## 1 0.71 0.42 0.53 12
## 2 0.00 0.00 0.00 4
## 3 0.00 0.00 0.00 7
## 4 0.50 0.31 0.38 16
## 5 0.70 0.91 0.79 54
##
## accuracy 0.63 93
## macro avg 0.38 0.33 0.34 93
## weighted avg 0.58 0.63 0.59 93
The gradient boosting model edges out the random forest, 63.4% accuracy versus 61.3%, but it took roughly 1,655 seconds to fit compared with about 7 seconds for the random forest. Both models lean heavily on the dominant 5-star class and misclassify most of the 2- and 3-star reviews, which have very few samples, so the class imbalance seen in the groupby summary earlier is clearly driving these errors.
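One hedged idea for that imbalance, not run in this write-up, is to reweight the classes so the rarer 1- to 3-star reviews count for more during training; scikit-learn's class_weight='balanced' option does this for the random forest.
# sketch: a class-weighted random forest to counter the 5-star dominance (not run here)
rf_balanced = RandomForestClassifier(n_estimators=50, max_depth=None,
                                     n_jobs=-1, class_weight='balanced')
rf_balanced.fit(n_gram3_train3, y_train)
print(classification_report(y_test, rf_balanced.predict(n_gram3_test3)))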
Reload these packages to use Python in R, if you are just jumping to this section.
library(reticulate)
conda_list(conda = "auto")
## name python
## 1 Anaconda2 C:\\Users\\m\\Anaconda2\\python.exe
## 2 djangoenv C:\\Users\\m\\Anaconda2\\envs\\djangoenv\\python.exe
## 3 python36 C:\\Users\\m\\Anaconda2\\envs\\python36\\python.exe
## 4 python37 C:\\Users\\m\\Anaconda2\\envs\\python37\\python.exe
## 5 r-reticulate C:\\Users\\m\\Anaconda2\\envs\\r-reticulate\\python.exe
use_condaenv(condaenv = "python36")
import numpy as np
import pandas as pd
data = pd.read_csv('lemm123gramsHealthWellness.csv', encoding='unicode_escape')
data.shape
## (614, 52614)
data.head()
## Rating X058 X058.078 ... zumba zumba.qi zumba.qi.gong
## 0 1 0 0 ... 0 0 0
## 1 1 0 0 ... 0 0 0
## 2 5 0 0 ... 0 0 0
## 3 1 0 0 ... 0 0 0
## 4 4 0 0 ... 0 0 0
##
## [5 rows x 52614 columns]
np.random.seed(123)
data0 = data.reindex(np.random.permutation(data.index))
data1 = data0.iloc[:,3:]
data1.shape
## (614, 52611)
target=data0.iloc[:,0:1]
target.shape
## (614, 1)
print(target['Rating'].unique())
## [5 4 3 2 1]
print(len(target['Rating'].unique()))
## 5
mean_vals = np.mean(data1, axis=0)              # per-column means of the n-gram counts
std_val = np.std(data1)                         # per-column standard deviations
data1_centered = (data1 - mean_vals)/std_val    # standardized copy (the raw counts, not this, are what get split below)
print(data1_centered.shape, target.shape)
## (614, 52611) (614, 1)
print(data1.head())
## X058.078.098 X078 X078.098 ... zumba zumba.qi zumba.qi.gong
## 583 0 0 0 ... 0 0 0
## 11 0 0 0 ... 0 0 0
## 443 0 0 0 ... 0 0 0
## 442 0 0 0 ... 0 0 0
## 267 0 0 0 ... 0 0 0
##
## [5 rows x 52611 columns]
print(target.head())
## Rating
## 583 5
## 11 4
## 443 5
## 442 3
## 267 4
These rating values are already integers, so this next step and the one that follows aren't strictly needed, but we can run them anyway.
# build a label-to-index mapping from the unique ratings (np.unique returns them sorted)
class_mapping = {label: idx for idx, label in enumerate(np.unique(target['Rating']))}
class_mapping
## {1: 0, 2: 1, 3: 2, 4: 3, 5: 4}
The mapping converts the ratings 1-5 into the zero-based values 0-4, which is the integer label format that sparse_categorical_crossentropy expects later.
target['OH_rating']=target['Rating']
## C:/Users/m/Anaconda2/envs/python36/python.exe:1: SettingWithCopyWarning:
## A value is trying to be set on a copy of a slice from a DataFrame.
## Try using .loc[row_indexer,col_indexer] = value instead
##
## See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
target['OH_rating'] = target['Rating'].map(class_mapping)
target.head()
## Rating OH_rating
## 583 5 4
## 11 4 3
## 443 5 4
## 442 3 2
## 267 4 3
target1 = target['OH_rating']
target1.head()
## 583 4
## 11 3
## 443 4
## 442 2
## 267 3
## Name: OH_rating, dtype: int64
Now split the data, whose indices were already permuted above, into training and test sets. There are 614 instances in this data; 20% is about 123 instances and 80% is about 491 instances.
X_train = data1[:491]
X_test = data1[491:]
y_train = target1[:491]
y_test = target1[491:]
################################
# keep the original 1-5 rating labels so the class names can be attached to predictions later in the script
y_trainNames = target['Rating']
y_trainNames = y_trainNames[:491]
y_trainNames.columns=['Rating']
y_trainNames1=pd.DataFrame(y_trainNames)
y_testNames = target['Rating']
y_testNames = y_testNames[491:]
y_testNames.columns=['Rating']
y_testNames1=pd.DataFrame(y_testNames)
################################
print(X_train.shape)
## (491, 52611)
print(y_train.shape)
## (491,)
print(X_test.shape)
## (123, 52611)
print(y_test.shape)
## (123,)
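Because the split above is a plain slice of the shuffled rows, the class proportions in the training and test sets can drift apart; a stratified split would keep them aligned. A sketch with scikit-learn, shown as an alternative rather than what was run here:
# alternative sketch: stratified 80/20 split that preserves the rating proportions
from sklearn.model_selection import train_test_split
Xs_train, Xs_test, ys_train, ys_test = train_test_split(
    data1, target1, test_size=0.2, stratify=target1, random_state=123)
print(Xs_train.shape, Xs_test.shape)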
y_train
## 583 4
## 11 3
## 443 4
## 442 2
## 267 3
## ..
## 319 4
## 364 4
## 412 4
## 141 4
## 454 3
## Name: OH_rating, Length: 491, dtype: int64
import tensorflow as tf
import tensorflow.contrib.keras as keras
# optionally use import tensorflow.keras as keras once keras is no longer under the experimental contrib package (newer TensorFlow versions)
np.random.seed(123)
tf.set_random_seed(123)
model = keras.models.Sequential()
model.add(
    keras.layers.Dense(
        units=150,                           # output units must match the next layer's inputs
        input_dim=52611,                     # number of input features, per the shapes printed above
        kernel_initializer='glorot_uniform', # Xavier/Glorot initialization, named after Xavier Glorot
        bias_initializer='zeros',            # biases start at zero
        activation='tanh'))
## WARNING: Logging before flag parsing goes to stderr.
## W0503 23:07:31.571034 41724 deprecation.py:506] From C:\Users\m\Anaconda2\envs\python36\lib\site-packages\tensorflow\python\ops\init_ops.py:1251: calling VarianceScaling.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version.
## Instructions for updating:
## Call initializer instance with the dtype argument instead of passing it to the constructor
model.add(
    keras.layers.Dense(
        units=150,                           # output matches the next layer's input
        input_dim=150,                       # input matches the last layer's output
        kernel_initializer='glorot_uniform',
        bias_initializer='zeros',
        activation='tanh'))
model.add(
    keras.layers.Dense(
        units=19,                            # output units; note the target here has only 5 classes (0-4), so 19 is more than strictly needed
        input_dim=150,
        kernel_initializer='glorot_uniform',
        bias_initializer='zeros',
        activation='softmax'))               # returns class membership probabilities that sum to 1
# these are hyperparameters that can be tuned if the model overfits during training, or to get better accuracy
sgd_optimizer = keras.optimizers.SGD(
    lr=0.001, decay=1e-7, momentum=.9)
# categorical_crossentropy is used in multiclass classification instead of binary_crossentropy,
# to match the softmax output layer
model.compile(optimizer=sgd_optimizer,
              loss='sparse_categorical_crossentropy')
# plain 'categorical_crossentropy' expects one-hot (binary) label matrices; since the labels here
# are integers, 'sparse_categorical_crossentropy' is used instead
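Since compile() is only given a loss, the fit logs below report loss but not accuracy; passing a metrics list would also track accuracy each epoch. A small variant, shown as an option rather than what was run:
# variant: also track accuracy during training and validation
model.compile(optimizer=sgd_optimizer,
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])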
model2 = keras.models.Sequential()
model2.add(
    keras.layers.Dense(
        units=200,                           # output units must match the next layer's inputs
        input_dim=52611,                     # number of input features
        kernel_initializer='glorot_uniform', # Xavier/Glorot initialization
        bias_initializer='zeros',
        activation='tanh'))
model2.add(
    keras.layers.Dense(
        units=150,                           # output matches the next layer's input
        input_dim=200,                       # input matches the last layer's output
        kernel_initializer='glorot_uniform',
        bias_initializer='zeros',
        activation='tanh'))
model2.add(
    keras.layers.Dense(
        units=300,                           # output matches the next layer's input
        input_dim=150,                       # input matches the last layer's output
        kernel_initializer='glorot_uniform',
        bias_initializer='zeros',
        activation='relu'))
model2.add(
    keras.layers.Dense(
        units=19,                            # output units; again only 5 classes are actually present in the target
        input_dim=300,
        kernel_initializer='glorot_uniform',
        bias_initializer='zeros',
        activation='softmax'))               # returns class membership probabilities that sum to 1
# these are hyperparameters that can be tuned if the model overfits during training, or to get better accuracy
sgd_optimizer = keras.optimizers.SGD(
    lr=0.01, decay=1e-7, momentum=.9)
# categorical_crossentropy is used in multiclass classification instead of binary_crossentropy,
# to match the softmax output layer
model2.compile(optimizer=sgd_optimizer,
               loss='sparse_categorical_crossentropy')
# again, 'sparse_categorical_crossentropy' is used because the labels are integers rather than one-hot matrices
history = model.fit(X_train, y_train,
                    batch_size=64, epochs=50,
                    verbose=1,            # verbose=1 shows the training progress so we can stop and tune parameters if needed
                    validation_split=0.1) # holds out 10% of the training set for validation at each epoch
## Train on 441 samples, validate on 50 samples
## Epoch 1/50
##
## 64/441 [===>..........................] - ETA: 33s - loss: 2.9429
## 128/441 [=======>......................] - ETA: 14s - loss: 2.9310
## 192/441 [============>.................] - ETA: 8s - loss: 2.9290
## 256/441 [================>.............] - ETA: 4s - loss: 2.9288
## 320/441 [====================>.........] - ETA: 2s - loss: 2.9231
## 384/441 [=========================>....] - ETA: 1s - loss: 2.9146
## 441/441 [==============================] - 8s 18ms/sample - loss: 2.9041 - val_loss: 2.8358
## Epoch 2/50
##
## 64/441 [===>..........................] - ETA: 2s - loss: 2.8402
## 128/441 [=======>......................] - ETA: 1s - loss: 2.8253
## 192/441 [============>.................] - ETA: 1s - loss: 2.8064
## 256/441 [================>.............] - ETA: 0s - loss: 2.7936
## 320/441 [====================>.........] - ETA: 0s - loss: 2.7762
## 384/441 [=========================>....] - ETA: 0s - loss: 2.7626
## 441/441 [==============================] - 3s 6ms/sample - loss: 2.7510 - val_loss: 2.6408
## Epoch 3/50
##
## 64/441 [===>..........................] - ETA: 4s - loss: 2.6307
## 128/441 [=======>......................] - ETA: 3s - loss: 2.6227
## 192/441 [============>.................] - ETA: 2s - loss: 2.5905
## 256/441 [================>.............] - ETA: 1s - loss: 2.5836
## 320/441 [====================>.........] - ETA: 0s - loss: 2.5722
## 384/441 [=========================>....] - ETA: 0s - loss: 2.5488
## 441/441 [==============================] - 3s 8ms/sample - loss: 2.5346 - val_loss: 2.4170
## Epoch 4/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 2.3184
## 128/441 [=======>......................] - ETA: 1s - loss: 2.3503
## 192/441 [============>.................] - ETA: 1s - loss: 2.3589
## 256/441 [================>.............] - ETA: 1s - loss: 2.3514
## 320/441 [====================>.........] - ETA: 0s - loss: 2.3338
## 384/441 [=========================>....] - ETA: 0s - loss: 2.3229
## 441/441 [==============================] - 3s 6ms/sample - loss: 2.3140 - val_loss: 2.2001
## Epoch 5/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 2.1849
## 128/441 [=======>......................] - ETA: 2s - loss: 2.1916
## 192/441 [============>.................] - ETA: 2s - loss: 2.1595
## 256/441 [================>.............] - ETA: 1s - loss: 2.1370
## 320/441 [====================>.........] - ETA: 1s - loss: 2.1193
## 384/441 [=========================>....] - ETA: 0s - loss: 2.1195
## 441/441 [==============================] - 4s 9ms/sample - loss: 2.1073 - val_loss: 2.0033
## Epoch 6/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 1.9862
## 128/441 [=======>......................] - ETA: 1s - loss: 1.9685
## 192/441 [============>.................] - ETA: 1s - loss: 1.9412
## 256/441 [================>.............] - ETA: 0s - loss: 1.9501
## 320/441 [====================>.........] - ETA: 0s - loss: 1.9367
## 384/441 [=========================>....] - ETA: 0s - loss: 1.9272
## 441/441 [==============================] - 2s 5ms/sample - loss: 1.9250 - val_loss: 1.8310
## Epoch 7/50
##
## 64/441 [===>..........................] - ETA: 2s - loss: 1.8906
## 128/441 [=======>......................] - ETA: 1s - loss: 1.8520
## 192/441 [============>.................] - ETA: 1s - loss: 1.8644
## 256/441 [================>.............] - ETA: 0s - loss: 1.8213
## 320/441 [====================>.........] - ETA: 0s - loss: 1.8143
## 384/441 [=========================>....] - ETA: 0s - loss: 1.7814
## 441/441 [==============================] - 4s 9ms/sample - loss: 1.7661 - val_loss: 1.6899
## Epoch 8/50
##
## 64/441 [===>..........................] - ETA: 2s - loss: 1.7063
## 128/441 [=======>......................] - ETA: 2s - loss: 1.7230
## 192/441 [============>.................] - ETA: 1s - loss: 1.7610
## 256/441 [================>.............] - ETA: 1s - loss: 1.7212
## 320/441 [====================>.........] - ETA: 0s - loss: 1.6638
## 384/441 [=========================>....] - ETA: 0s - loss: 1.6636
## 441/441 [==============================] - 3s 6ms/sample - loss: 1.6328 - val_loss: 1.5740
## Epoch 9/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 1.6446
## 128/441 [=======>......................] - ETA: 1s - loss: 1.7041
## 192/441 [============>.................] - ETA: 1s - loss: 1.5906
## 256/441 [================>.............] - ETA: 0s - loss: 1.5618
## 320/441 [====================>.........] - ETA: 0s - loss: 1.5608
## 384/441 [=========================>....] - ETA: 0s - loss: 1.5269
## 441/441 [==============================] - 3s 6ms/sample - loss: 1.5182 - val_loss: 1.4793
## Epoch 10/50
##
## 64/441 [===>..........................] - ETA: 4s - loss: 1.4538
## 128/441 [=======>......................] - ETA: 3s - loss: 1.4743
## 192/441 [============>.................] - ETA: 2s - loss: 1.4108
## 256/441 [================>.............] - ETA: 1s - loss: 1.4131
## 320/441 [====================>.........] - ETA: 1s - loss: 1.4236
## 384/441 [=========================>....] - ETA: 0s - loss: 1.4273
## 441/441 [==============================] - 4s 8ms/sample - loss: 1.4204 - val_loss: 1.3993
## Epoch 11/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 1.3569
## 128/441 [=======>......................] - ETA: 1s - loss: 1.2889
## 192/441 [============>.................] - ETA: 1s - loss: 1.2727
## 256/441 [================>.............] - ETA: 0s - loss: 1.2846
## 320/441 [====================>.........] - ETA: 0s - loss: 1.3205
## 384/441 [=========================>....] - ETA: 0s - loss: 1.3242
## 441/441 [==============================] - 2s 6ms/sample - loss: 1.3353 - val_loss: 1.3351
## Epoch 12/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 1.1232
## 128/441 [=======>......................] - ETA: 1s - loss: 1.2706
## 192/441 [============>.................] - ETA: 1s - loss: 1.2784
## 256/441 [================>.............] - ETA: 1s - loss: 1.3215
## 320/441 [====================>.........] - ETA: 1s - loss: 1.3130
## 384/441 [=========================>....] - ETA: 0s - loss: 1.2816
## 441/441 [==============================] - 4s 9ms/sample - loss: 1.2616 - val_loss: 1.2821
## Epoch 13/50
##
## 64/441 [===>..........................] - ETA: 2s - loss: 1.3276
## 128/441 [=======>......................] - ETA: 1s - loss: 1.2746
## 192/441 [============>.................] - ETA: 1s - loss: 1.3192
## 256/441 [================>.............] - ETA: 1s - loss: 1.2547
## 320/441 [====================>.........] - ETA: 0s - loss: 1.2245
## 384/441 [=========================>....] - ETA: 0s - loss: 1.2140
## 441/441 [==============================] - 3s 6ms/sample - loss: 1.1972 - val_loss: 1.2375
## Epoch 14/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 1.1928
## 128/441 [=======>......................] - ETA: 1s - loss: 1.1854
## 192/441 [============>.................] - ETA: 1s - loss: 1.1942
## 256/441 [================>.............] - ETA: 0s - loss: 1.1779
## 320/441 [====================>.........] - ETA: 0s - loss: 1.1122
## 384/441 [=========================>....] - ETA: 0s - loss: 1.1163
## 441/441 [==============================] - 3s 7ms/sample - loss: 1.1403 - val_loss: 1.1979
## Epoch 15/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 1.1233
## 128/441 [=======>......................] - ETA: 1s - loss: 1.1194
## 192/441 [============>.................] - ETA: 1s - loss: 1.0759
## 256/441 [================>.............] - ETA: 0s - loss: 1.0806
## 320/441 [====================>.........] - ETA: 0s - loss: 1.1029
## 384/441 [=========================>....] - ETA: 0s - loss: 1.0955
## 441/441 [==============================] - 2s 5ms/sample - loss: 1.0889 - val_loss: 1.1654
## Epoch 16/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 1.0316
## 128/441 [=======>......................] - ETA: 1s - loss: 1.0366
## 192/441 [============>.................] - ETA: 1s - loss: 1.0179
## 256/441 [================>.............] - ETA: 0s - loss: 1.0335
## 320/441 [====================>.........] - ETA: 0s - loss: 1.0644
## 384/441 [=========================>....] - ETA: 0s - loss: 1.0574
## 441/441 [==============================] - 3s 6ms/sample - loss: 1.0430 - val_loss: 1.1359
## Epoch 17/50
##
## 64/441 [===>..........................] - ETA: 4s - loss: 0.9337
## 128/441 [=======>......................] - ETA: 2s - loss: 0.9978
## 192/441 [============>.................] - ETA: 2s - loss: 1.0344
## 256/441 [================>.............] - ETA: 1s - loss: 1.0462
## 320/441 [====================>.........] - ETA: 0s - loss: 1.0228
## 384/441 [=========================>....] - ETA: 0s - loss: 1.0171
## 441/441 [==============================] - 4s 10ms/sample - loss: 1.0012 - val_loss: 1.1108
## Epoch 18/50
##
## 64/441 [===>..........................] - ETA: 3s - loss: 0.9184
## 128/441 [=======>......................] - ETA: 2s - loss: 1.0252
## 192/441 [============>.................] - ETA: 2s - loss: 1.0387
## 256/441 [================>.............] - ETA: 2s - loss: 1.0256
## 320/441 [====================>.........] - ETA: 1s - loss: 1.0156
## 384/441 [=========================>....] - ETA: 0s - loss: 0.9713
## 441/441 [==============================] - 5s 12ms/sample - loss: 0.9629 - val_loss: 1.0877
## Epoch 19/50
##
## 64/441 [===>..........................] - ETA: 6s - loss: 1.0225
## 128/441 [=======>......................] - ETA: 5s - loss: 0.9366
## 192/441 [============>.................] - ETA: 3s - loss: 0.9016
## 256/441 [================>.............] - ETA: 2s - loss: 0.8830
## 320/441 [====================>.........] - ETA: 1s - loss: 0.9054
## 384/441 [=========================>....] - ETA: 0s - loss: 0.9248
## 441/441 [==============================] - 6s 15ms/sample - loss: 0.9269 - val_loss: 1.0667
## Epoch 20/50
##
## 64/441 [===>..........................] - ETA: 4s - loss: 0.7832
## 128/441 [=======>......................] - ETA: 4s - loss: 0.8395
## 192/441 [============>.................] - ETA: 3s - loss: 0.8736
## 256/441 [================>.............] - ETA: 2s - loss: 0.8910
## 320/441 [====================>.........] - ETA: 1s - loss: 0.9206
## 384/441 [=========================>....] - ETA: 0s - loss: 0.8884
## 441/441 [==============================] - 6s 13ms/sample - loss: 0.8941 - val_loss: 1.0477
## Epoch 21/50
##
## 64/441 [===>..........................] - ETA: 4s - loss: 0.8571
## 128/441 [=======>......................] - ETA: 3s - loss: 0.7899
## 192/441 [============>.................] - ETA: 2s - loss: 0.7881
## 256/441 [================>.............] - ETA: 1s - loss: 0.8419
## 320/441 [====================>.........] - ETA: 1s - loss: 0.8738
## 384/441 [=========================>....] - ETA: 0s - loss: 0.8714
## 441/441 [==============================] - 4s 8ms/sample - loss: 0.8626 - val_loss: 1.0313
## Epoch 22/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 0.9643
## 128/441 [=======>......................] - ETA: 1s - loss: 0.9109
## 192/441 [============>.................] - ETA: 1s - loss: 0.8662
## 256/441 [================>.............] - ETA: 0s - loss: 0.8679
## 320/441 [====================>.........] - ETA: 0s - loss: 0.8452
## 384/441 [=========================>....] - ETA: 0s - loss: 0.8387
## 441/441 [==============================] - 4s 10ms/sample - loss: 0.8346 - val_loss: 1.0167
## Epoch 23/50
##
## 64/441 [===>..........................] - ETA: 3s - loss: 0.7952
## 128/441 [=======>......................] - ETA: 3s - loss: 0.8552
## 192/441 [============>.................] - ETA: 2s - loss: 0.8071
## 256/441 [================>.............] - ETA: 2s - loss: 0.8331
## 320/441 [====================>.........] - ETA: 1s - loss: 0.8362
## 384/441 [=========================>....] - ETA: 0s - loss: 0.8303
## 441/441 [==============================] - 5s 11ms/sample - loss: 0.8070 - val_loss: 1.0029
## Epoch 24/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 0.8861
## 128/441 [=======>......................] - ETA: 1s - loss: 0.8810
## 192/441 [============>.................] - ETA: 1s - loss: 0.8409
## 256/441 [================>.............] - ETA: 0s - loss: 0.8096
## 320/441 [====================>.........] - ETA: 0s - loss: 0.7931
## 384/441 [=========================>....] - ETA: 0s - loss: 0.7976
## 441/441 [==============================] - 3s 8ms/sample - loss: 0.7811 - val_loss: 0.9890
## Epoch 25/50
##
## 64/441 [===>..........................] - ETA: 2s - loss: 0.7692
## 128/441 [=======>......................] - ETA: 2s - loss: 0.7195
## 192/441 [============>.................] - ETA: 1s - loss: 0.8212
## 256/441 [================>.............] - ETA: 1s - loss: 0.7744
## 320/441 [====================>.........] - ETA: 0s - loss: 0.7599
## 384/441 [=========================>....] - ETA: 0s - loss: 0.7525
## 441/441 [==============================] - 4s 8ms/sample - loss: 0.7562 - val_loss: 0.9766
## Epoch 26/50
##
## 64/441 [===>..........................] - ETA: 3s - loss: 0.8751
## 128/441 [=======>......................] - ETA: 3s - loss: 0.8116
## 192/441 [============>.................] - ETA: 2s - loss: 0.7434
## 256/441 [================>.............] - ETA: 1s - loss: 0.7082
## 320/441 [====================>.........] - ETA: 1s - loss: 0.7056
## 384/441 [=========================>....] - ETA: 0s - loss: 0.7256
## 441/441 [==============================] - 5s 10ms/sample - loss: 0.7333 - val_loss: 0.9640
## Epoch 27/50
##
## 64/441 [===>..........................] - ETA: 2s - loss: 0.7871
## 128/441 [=======>......................] - ETA: 2s - loss: 0.7139
## 192/441 [============>.................] - ETA: 1s - loss: 0.7227
## 256/441 [================>.............] - ETA: 1s - loss: 0.7105
## 320/441 [====================>.........] - ETA: 0s - loss: 0.7280
## 384/441 [=========================>....] - ETA: 0s - loss: 0.7135
## 441/441 [==============================] - 3s 7ms/sample - loss: 0.7108 - val_loss: 0.9540
## Epoch 28/50
##
## 64/441 [===>..........................] - ETA: 2s - loss: 0.8505
## 128/441 [=======>......................] - ETA: 3s - loss: 0.7407
## 192/441 [============>.................] - ETA: 2s - loss: 0.7067
## 256/441 [================>.............] - ETA: 2s - loss: 0.6765
## 320/441 [====================>.........] - ETA: 1s - loss: 0.6812
## 384/441 [=========================>....] - ETA: 0s - loss: 0.6712
## 441/441 [==============================] - 5s 11ms/sample - loss: 0.6901 - val_loss: 0.9450
## Epoch 29/50
##
## 64/441 [===>..........................] - ETA: 2s - loss: 0.6697
## 128/441 [=======>......................] - ETA: 1s - loss: 0.6895
## 192/441 [============>.................] - ETA: 1s - loss: 0.7019
## 256/441 [================>.............] - ETA: 1s - loss: 0.7058
## 320/441 [====================>.........] - ETA: 0s - loss: 0.6720
## 384/441 [=========================>....] - ETA: 0s - loss: 0.6693
## 441/441 [==============================] - 3s 6ms/sample - loss: 0.6695 - val_loss: 0.9343
## Epoch 30/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 0.6085
## 128/441 [=======>......................] - ETA: 2s - loss: 0.6394
## 192/441 [============>.................] - ETA: 1s - loss: 0.6137
## 256/441 [================>.............] - ETA: 1s - loss: 0.6468
## 320/441 [====================>.........] - ETA: 1s - loss: 0.6643
## 384/441 [=========================>....] - ETA: 0s - loss: 0.6528
## 441/441 [==============================] - 4s 8ms/sample - loss: 0.6503 - val_loss: 0.9245
## Epoch 31/50
##
## 64/441 [===>..........................] - ETA: 3s - loss: 0.6835
## 128/441 [=======>......................] - ETA: 2s - loss: 0.7010
## 192/441 [============>.................] - ETA: 1s - loss: 0.7010
## 256/441 [================>.............] - ETA: 1s - loss: 0.6625
## 320/441 [====================>.........] - ETA: 0s - loss: 0.6469
## 384/441 [=========================>....] - ETA: 0s - loss: 0.6449
## 441/441 [==============================] - 3s 7ms/sample - loss: 0.6317 - val_loss: 0.9180
## Epoch 32/50
##
## 64/441 [===>..........................] - ETA: 2s - loss: 0.6660
## 128/441 [=======>......................] - ETA: 1s - loss: 0.6862
## 192/441 [============>.................] - ETA: 1s - loss: 0.6492
## 256/441 [================>.............] - ETA: 1s - loss: 0.6176
## 320/441 [====================>.........] - ETA: 0s - loss: 0.6159
## 384/441 [=========================>....] - ETA: 0s - loss: 0.6019
## 441/441 [==============================] - 4s 9ms/sample - loss: 0.6139 - val_loss: 0.9093
## Epoch 33/50
##
## 64/441 [===>..........................] - ETA: 2s - loss: 0.5912
## 128/441 [=======>......................] - ETA: 2s - loss: 0.5690
## 192/441 [============>.................] - ETA: 2s - loss: 0.5461
## 256/441 [================>.............] - ETA: 1s - loss: 0.5618
## 320/441 [====================>.........] - ETA: 0s - loss: 0.5749
## 384/441 [=========================>....] - ETA: 0s - loss: 0.5938
## 441/441 [==============================] - 3s 7ms/sample - loss: 0.5962 - val_loss: 0.9016
## Epoch 34/50
##
## 64/441 [===>..........................] - ETA: 3s - loss: 0.6492
## 128/441 [=======>......................] - ETA: 2s - loss: 0.6287
## 192/441 [============>.................] - ETA: 1s - loss: 0.6122
## 256/441 [================>.............] - ETA: 1s - loss: 0.5991
## 320/441 [====================>.........] - ETA: 0s - loss: 0.5779
## 384/441 [=========================>....] - ETA: 0s - loss: 0.5778
## 441/441 [==============================] - 4s 8ms/sample - loss: 0.5801 - val_loss: 0.8942
## Epoch 35/50
##
## 64/441 [===>..........................] - ETA: 3s - loss: 0.5425
## 128/441 [=======>......................] - ETA: 3s - loss: 0.5165
## 192/441 [============>.................] - ETA: 2s - loss: 0.5291
## 256/441 [================>.............] - ETA: 1s - loss: 0.5698
## 320/441 [====================>.........] - ETA: 1s - loss: 0.5791
## 384/441 [=========================>....] - ETA: 0s - loss: 0.5703
## 441/441 [==============================] - 3s 8ms/sample - loss: 0.5641 - val_loss: 0.8868
## Epoch 36/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 0.6160
## 128/441 [=======>......................] - ETA: 1s - loss: 0.5476
## 192/441 [============>.................] - ETA: 1s - loss: 0.5717
## 256/441 [================>.............] - ETA: 0s - loss: 0.5579
## 320/441 [====================>.........] - ETA: 0s - loss: 0.5643
## 384/441 [=========================>....] - ETA: 0s - loss: 0.5435
## 441/441 [==============================] - 2s 5ms/sample - loss: 0.5490 - val_loss: 0.8814
## Epoch 37/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 0.5233
## 128/441 [=======>......................] - ETA: 1s - loss: 0.5274
## 192/441 [============>.................] - ETA: 1s - loss: 0.5240
## 256/441 [================>.............] - ETA: 1s - loss: 0.5454
## 320/441 [====================>.........] - ETA: 0s - loss: 0.5367
## 384/441 [=========================>....] - ETA: 0s - loss: 0.5208
## 441/441 [==============================] - 4s 8ms/sample - loss: 0.5345 - val_loss: 0.8759
## Epoch 38/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 0.5040
## 128/441 [=======>......................] - ETA: 1s - loss: 0.5041
## 192/441 [============>.................] - ETA: 1s - loss: 0.5006
## 256/441 [================>.............] - ETA: 1s - loss: 0.4945
## 320/441 [====================>.........] - ETA: 0s - loss: 0.5069
## 384/441 [=========================>....] - ETA: 0s - loss: 0.5198
## 441/441 [==============================] - 3s 6ms/sample - loss: 0.5200 - val_loss: 0.8704
## Epoch 39/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 0.4176
## 128/441 [=======>......................] - ETA: 1s - loss: 0.4949
## 192/441 [============>.................] - ETA: 1s - loss: 0.5026
## 256/441 [================>.............] - ETA: 0s - loss: 0.5098
## 320/441 [====================>.........] - ETA: 0s - loss: 0.5112
## 384/441 [=========================>....] - ETA: 0s - loss: 0.5226
## 441/441 [==============================] - 4s 8ms/sample - loss: 0.5066 - val_loss: 0.8649
## Epoch 40/50
##
## 64/441 [===>..........................] - ETA: 4s - loss: 0.5350
## 128/441 [=======>......................] - ETA: 3s - loss: 0.5673
## 192/441 [============>.................] - ETA: 2s - loss: 0.5390
## 256/441 [================>.............] - ETA: 1s - loss: 0.5170
## 320/441 [====================>.........] - ETA: 0s - loss: 0.5207
## 384/441 [=========================>....] - ETA: 0s - loss: 0.4917
## 441/441 [==============================] - 3s 7ms/sample - loss: 0.4934 - val_loss: 0.8597
## Epoch 41/50
##
## 64/441 [===>..........................] - ETA: 2s - loss: 0.4557
## 128/441 [=======>......................] - ETA: 1s - loss: 0.4688
## 192/441 [============>.................] - ETA: 1s - loss: 0.4553
## 256/441 [================>.............] - ETA: 0s - loss: 0.4579
## 320/441 [====================>.........] - ETA: 0s - loss: 0.4533
## 384/441 [=========================>....] - ETA: 0s - loss: 0.4650
## 441/441 [==============================] - 3s 6ms/sample - loss: 0.4809 - val_loss: 0.8552
## Epoch 42/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 0.6002
## 128/441 [=======>......................] - ETA: 1s - loss: 0.5472
## 192/441 [============>.................] - ETA: 1s - loss: 0.5248
## 256/441 [================>.............] - ETA: 1s - loss: 0.5032
## 320/441 [====================>.........] - ETA: 1s - loss: 0.4787
## 384/441 [=========================>....] - ETA: 0s - loss: 0.4698
## 441/441 [==============================] - 5s 11ms/sample - loss: 0.4687 - val_loss: 0.8502
## Epoch 43/50
##
## 64/441 [===>..........................] - ETA: 4s - loss: 0.5098
## 128/441 [=======>......................] - ETA: 2s - loss: 0.4846
## 192/441 [============>.................] - ETA: 2s - loss: 0.4521
## 256/441 [================>.............] - ETA: 1s - loss: 0.4375
## 320/441 [====================>.........] - ETA: 0s - loss: 0.4457
## 384/441 [=========================>....] - ETA: 0s - loss: 0.4590
## 441/441 [==============================] - 4s 8ms/sample - loss: 0.4571 - val_loss: 0.8448
## Epoch 44/50
##
## 64/441 [===>..........................] - ETA: 2s - loss: 0.3919
## 128/441 [=======>......................] - ETA: 2s - loss: 0.4624
## 192/441 [============>.................] - ETA: 1s - loss: 0.4612
## 256/441 [================>.............] - ETA: 1s - loss: 0.4726
## 320/441 [====================>.........] - ETA: 1s - loss: 0.4623
## 384/441 [=========================>....] - ETA: 0s - loss: 0.4561
## 441/441 [==============================] - 4s 10ms/sample - loss: 0.4455 - val_loss: 0.8398
## Epoch 45/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 0.3646
## 128/441 [=======>......................] - ETA: 1s - loss: 0.3922
## 192/441 [============>.................] - ETA: 1s - loss: 0.4557
## 256/441 [================>.............] - ETA: 1s - loss: 0.4354
## 320/441 [====================>.........] - ETA: 0s - loss: 0.4254
## 384/441 [=========================>....] - ETA: 0s - loss: 0.4308
## 441/441 [==============================] - 3s 7ms/sample - loss: 0.4346 - val_loss: 0.8359
## Epoch 46/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 0.4819
## 128/441 [=======>......................] - ETA: 1s - loss: 0.3886
## 192/441 [============>.................] - ETA: 1s - loss: 0.4154
## 256/441 [================>.............] - ETA: 0s - loss: 0.4304
## 320/441 [====================>.........] - ETA: 0s - loss: 0.4182
## 384/441 [=========================>....] - ETA: 0s - loss: 0.4250
## 441/441 [==============================] - 3s 8ms/sample - loss: 0.4244 - val_loss: 0.8317
## Epoch 47/50
##
## 64/441 [===>..........................] - ETA: 3s - loss: 0.5664
## 128/441 [=======>......................] - ETA: 2s - loss: 0.4675
## 192/441 [============>.................] - ETA: 1s - loss: 0.4841
## 256/441 [================>.............] - ETA: 1s - loss: 0.4708
## 320/441 [====================>.........] - ETA: 0s - loss: 0.4519
## 384/441 [=========================>....] - ETA: 0s - loss: 0.4338
## 441/441 [==============================] - 3s 7ms/sample - loss: 0.4144 - val_loss: 0.8286
## Epoch 48/50
##
## 64/441 [===>..........................] - ETA: 2s - loss: 0.4565
## 128/441 [=======>......................] - ETA: 1s - loss: 0.4016
## 192/441 [============>.................] - ETA: 1s - loss: 0.4144
## 256/441 [================>.............] - ETA: 1s - loss: 0.4028
## 320/441 [====================>.........] - ETA: 0s - loss: 0.4171
## 384/441 [=========================>....] - ETA: 0s - loss: 0.4083
## 441/441 [==============================] - 3s 7ms/sample - loss: 0.4044 - val_loss: 0.8248
## Epoch 49/50
##
## 64/441 [===>..........................] - ETA: 4s - loss: 0.3833
## 128/441 [=======>......................] - ETA: 3s - loss: 0.4170
## 192/441 [============>.................] - ETA: 2s - loss: 0.4073
## 256/441 [================>.............] - ETA: 1s - loss: 0.3936
## 320/441 [====================>.........] - ETA: 0s - loss: 0.4049
## 384/441 [=========================>....] - ETA: 0s - loss: 0.4034
## 441/441 [==============================] - 3s 7ms/sample - loss: 0.3952 - val_loss: 0.8215
## Epoch 50/50
##
## 64/441 [===>..........................] - ETA: 1s - loss: 0.3352
## 128/441 [=======>......................] - ETA: 1s - loss: 0.3690
## 192/441 [============>.................] - ETA: 1s - loss: 0.3804
## 256/441 [================>.............] - ETA: 1s - loss: 0.3723
## 320/441 [====================>.........] - ETA: 0s - loss: 0.3924
## 384/441 [=========================>....] - ETA: 0s - loss: 0.3854
## 441/441 [==============================] - 4s 8ms/sample - loss: 0.3861 - val_loss: 0.8173
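Before fitting the second model, note that the History object returned by fit keeps the per-epoch losses, so the training and validation curves can be plotted. A minimal sketch (matplotlib is re-imported here because this part of the document runs in a fresh Python session):
# sketch: plot training vs validation loss curves from the History object
import matplotlib.pyplot as plt
plt.plot(history.history['loss'], label='training loss')
plt.plot(history.history['val_loss'], label='validation loss')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()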
history2 = model2.fit(X_train, y_train,
                      batch_size=44, epochs=25,
                      verbose=1,            # verbose=1 shows the training progress so we can stop and tune parameters if needed
                      validation_split=0.1) # holds out 10% of the training set for validation at each epoch
## Train on 441 samples, validate on 50 samples
## Epoch 1/25
##
## 44/441 [=>............................] - ETA: 16s - loss: 2.9453
## 88/441 [====>.........................] - ETA: 10s - loss: 2.9373
## 132/441 [=======>......................] - ETA: 7s - loss: 2.9253
## 176/441 [==========>...................] - ETA: 5s - loss: 2.9151
## 220/441 [=============>................] - ETA: 4s - loss: 2.9036
## 264/441 [================>.............] - ETA: 3s - loss: 2.8901
## 308/441 [===================>..........] - ETA: 2s - loss: 2.8710
## 352/441 [======================>.......] - ETA: 1s - loss: 2.8438
## 396/441 [=========================>....] - ETA: 0s - loss: 2.8163
## 440/441 [============================>.] - ETA: 0s - loss: 2.7875
## 441/441 [==============================] - 8s 18ms/sample - loss: 2.7864 - val_loss: 2.2878
## Epoch 2/25
##
## 44/441 [=>............................] - ETA: 2s - loss: 2.3766
## 88/441 [====>.........................] - ETA: 2s - loss: 2.2517
## 132/441 [=======>......................] - ETA: 2s - loss: 2.2194
## 176/441 [==========>...................] - ETA: 1s - loss: 2.1338
## 220/441 [=============>................] - ETA: 1s - loss: 2.0867
## 264/441 [================>.............] - ETA: 1s - loss: 2.0163
## 308/441 [===================>..........] - ETA: 0s - loss: 1.9542
## 352/441 [======================>.......] - ETA: 0s - loss: 1.8746
## 396/441 [=========================>....] - ETA: 0s - loss: 1.8347
## 440/441 [============================>.] - ETA: 0s - loss: 1.8217
## 441/441 [==============================] - 3s 7ms/sample - loss: 1.8186 - val_loss: 1.2499
## Epoch 3/25
##
## 441/441 [==============================] - 4s 9ms/sample - loss: 1.0743 - val_loss: 1.0299
## Epoch 4/25
## 441/441 [==============================] - 3s 7ms/sample - loss: 0.7644 - val_loss: 0.9253
## Epoch 5/25
## 441/441 [==============================] - 5s 12ms/sample - loss: 0.5167 - val_loss: 0.8140
## Epoch 6/25
## 441/441 [==============================] - 5s 12ms/sample - loss: 0.3380 - val_loss: 0.7687
## Epoch 7/25
## 441/441 [==============================] - 4s 9ms/sample - loss: 0.2601 - val_loss: 0.9078
## Epoch 8/25
## 441/441 [==============================] - 4s 10ms/sample - loss: 0.1961 - val_loss: 0.7356
## Epoch 9/25
## 441/441 [==============================] - 4s 8ms/sample - loss: 0.1533 - val_loss: 0.6946
## Epoch 10/25
## 441/441 [==============================] - 5s 10ms/sample - loss: 0.1444 - val_loss: 0.7160
## Epoch 11/25
## 441/441 [==============================] - 3s 8ms/sample - loss: 0.1276 - val_loss: 0.7672
## Epoch 12/25
## 441/441 [==============================] - 5s 10ms/sample - loss: 0.1291 - val_loss: 0.7402
## Epoch 13/25
## 441/441 [==============================] - 4s 9ms/sample - loss: 0.1309 - val_loss: 0.8176
## Epoch 14/25
## 441/441 [==============================] - 4s 10ms/sample - loss: 0.1212 - val_loss: 0.7710
## Epoch 15/25
## 441/441 [==============================] - 5s 12ms/sample - loss: 0.1144 - val_loss: 0.7776
## Epoch 16/25
## 441/441 [==============================] - 6s 14ms/sample - loss: 0.1060 - val_loss: 0.7439
## Epoch 17/25
## 441/441 [==============================] - 14s 33ms/sample - loss: 0.1176 - val_loss: 0.7657
## Epoch 18/25
## 441/441 [==============================] - 8s 17ms/sample - loss: 0.0955 - val_loss: 0.7640
## Epoch 19/25
## 441/441 [==============================] - 5s 11ms/sample - loss: 0.1018 - val_loss: 0.7722
## Epoch 20/25
## 441/441 [==============================] - 4s 10ms/sample - loss: 0.1498 - val_loss: 0.8185
## Epoch 21/25
## 441/441 [==============================] - 5s 10ms/sample - loss: 0.1409 - val_loss: 0.8565
## Epoch 22/25
## 441/441 [==============================] - 3s 8ms/sample - loss: 0.3150 - val_loss: 0.9141
## Epoch 23/25
## 441/441 [==============================] - 6s 13ms/sample - loss: 0.2800 - val_loss: 0.9451
## Epoch 24/25
## 441/441 [==============================] - 4s 9ms/sample - loss: 0.1869 - val_loss: 0.9279
## Epoch 25/25
## 441/441 [==============================] - 10s 24ms/sample - loss: 0.1688 - val_loss: 0.8728
A note on the above: training was fast even with roughly 52,000 input dimensions. The credit goes to the backpropagation algorithm, rediscovered in work published between 1982 and 1986 by different authors in different versions, which reduces the gradient computation to efficient matrix-vector multiplications and removes a huge amount of redundant work. At a high level, a neural network computes a linear transformation at each layer and then applies a non-linear activation function to produce the hidden layer, with more layers stacked using the same or different non-linearities. Most of the common activations are, to some extent, variations on the logistic (sigmoid) function, mainly the hyperbolic tangent and the rectified linear unit (ReLU); the softmax function extends the logistic idea to give meaningful class-membership probabilities for classification, whereas independent logistic outputs provide a probability per class with the maximum chosen, but those probabilities do not sum to 1. That is only a high-level generalization of how neural networks work, and I found it confusing myself until I wrote out my own notes and tested it. Keep in mind the Kindle version of the book is not the best one to get, especially for the code, because of its placement and white spacing, but this book and many others usually provide the code through a separate, freely available link so you can run the demos yourself.
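To make that concrete, here is a minimal NumPy sketch, separate from the model above, comparing the activations just mentioned (the class scores are made up for illustration): sigmoid and tanh squash each score independently, ReLU zeroes out negatives, and only softmax turns the scores into class probabilities that sum to 1.
import numpy as np
def sigmoid(z):
    # logistic function: squashes each score into (0, 1) independently
    return 1.0 / (1.0 + np.exp(-z))
def relu(z):
    # rectified linear unit: keeps positive scores, zeroes out negatives
    return np.maximum(0.0, z)
def softmax(z):
    # subtract the max for numerical stability, then normalize so the outputs sum to 1
    e = np.exp(z - np.max(z))
    return e / e.sum()
scores = np.array([2.0, 1.0, 0.1, -1.0, 0.5])  # made-up scores for the five rating classes
print('sigmoid:', np.round(sigmoid(scores), 3), 'sum =', round(sigmoid(scores).sum(), 3))
print('tanh:   ', np.round(np.tanh(scores), 3))
print('ReLU:   ', relu(scores))
print('softmax:', np.round(softmax(scores), 3), 'sum =', round(softmax(scores).sum(), 3))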
y_train_pred = model.predict_classes(X_train, verbose=0)
print('First 3 predictions: ', y_train_pred[:3])
## First 3 predictions: [4 3 4]
# predicted class indices (0-4 encoded ratings) for the training set
y_train_pred1 = pd.DataFrame(y_train_pred)
y_train_pred1.columns = ['predicted']
# true encoded ratings, aligned to the same index as the predictions
y_train1 = pd.DataFrame(y_train)
y_train1.columns = ['OH_Rating']
y_train_pred1.index = y_train1.index
# side-by-side view: encoded rating, original 1-5 rating, predicted class
Train = pd.concat([y_train1['OH_Rating'], y_trainNames1['Rating'], y_train_pred1['predicted']], axis=1)
print(Train)
## OH_Rating Rating predicted
## 583 4 5 4
## 11 3 4 3
## 443 4 5 4
## 442 2 3 2
## 267 3 4 2
## .. ... ... ...
## 319 4 5 4
## 364 4 5 4
## 412 4 5 4
## 141 4 5 4
## 454 3 4 3
##
## [491 rows x 3 columns]
The OH_Rating column is the same rating on a different scale (0 through 4) because we used the label mapping designed for categorical values with string factor names rather than integer digits; the Rating column shows the original 1-through-5 value.
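As a rough illustration of that mapping (the real encoding was fit earlier in the pipeline on the factor-style labels, so this sklearn LabelEncoder re-creation is only an approximation of the idea): the five ratings 1-5 become classes 0-4, and inverse_transform maps predicted classes back to the original scale.
from sklearn.preprocessing import LabelEncoder
# illustrative re-creation of the encoding idea; the real encoder was fit earlier
le = LabelEncoder().fit([1, 2, 3, 4, 5])
print(le.transform([1, 5]))              # ratings 1 and 5 become classes 0 and 4
print(le.inverse_transform([4, 3, 4]))   # the first three predicted classes map back to 5, 4, 4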
y_test_pred = model.predict_classes(X_test, verbose=0)
# same side-by-side construction as for the training set, now on the test set
y_test_pred1 = pd.DataFrame(y_test_pred)
y_test_pred1.columns = ['predicted']
y_test1 = pd.DataFrame(y_test)
y_test1.columns = ['OH_Rating']
y_test_pred1.index = y_test1.index
Test = pd.concat([y_test1['OH_Rating'], y_testNames1['Rating'], y_test_pred1['predicted']], axis=1)
print(Test)
## OH_Rating Rating predicted
## 88 2 3 4
## 502 1 2 1
## 581 4 5 4
## 307 3 4 3
## 300 3 4 3
## .. ... ... ...
## 98 2 3 3
## 322 4 5 4
## 382 3 4 4
## 365 4 5 4
## 510 4 5 0
##
## [123 rows x 3 columns]
# training accuracy: fraction of rows where the predicted class matches the encoded rating
s = sum(Train['OH_Rating'] == Train['predicted'])
l = len(Train['OH_Rating'])
accTrain = s/l
print('Training Correctly Predicted:',s,'Training Accuracy:',accTrain,'\n')
## Training Correctly Predicted: 440 Training Accuracy: 0.8961303462321792
That's pretty good accuracy on these very mixed ratings from different types of businesses.
s = sum(Test['OH_Rating']==Test['predicted'])
l = len(Test['OH_Rating'])
accTest = s/l
print('Testing Correctly Predicted:',s,'Testing Accuracy:',accTest)
## Testing Correctly Predicted: 81 Testing Accuracy: 0.6585365853658537
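Accuracy alone hides which ratings get confused with which. Since classification_report and confusion_matrix were already imported, a quick check one could run on the Test table built above (not run here) is:
from sklearn.metrics import classification_report, confusion_matrix
# rows = true encoded rating (0-4), columns = predicted class (0-4)
print(confusion_matrix(Test['OH_Rating'], Test['predicted']))
# per-class precision, recall, and F1 for the five rating classes
print(classification_report(Test['OH_Rating'], Test['predicted']))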
However, on the testing set we can see the model overfit: it learned the training data too closely, since testing accuracy came out roughly 24 percentage points below training accuracy even with validation run throughout training. To address this, we could tune the hyperparameters, change the train/test split ratio, or try other methods to see if the score improves. Adjusting the learning rate is one common lever; it was set to 0.001 above, and we could lower it to see if that helps. We could also add more layers, change the number of units per layer, or switch the activation functions in the layers to tanh, sigmoid, ReLU, or another function that is differentiable over its domain.
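The model2 evaluated next was built along exactly those lines: an extra ReLU layer, more units per layer, and a lower learning rate. Its actual definition is not repeated here; the sketch below only illustrates that kind of change, and the layer sizes, learning-rate value, and loss choice are assumptions rather than a record of what was run.
from tensorflow import keras
# Sketch only: the real model2 was defined earlier in the analysis; sizes and
# values below are assumptions for illustration.
model2 = keras.models.Sequential([
    keras.layers.Dense(100, activation='relu', input_dim=X_train.shape[1]),
    keras.layers.Dense(100, activation='relu'),   # the added ReLU layer
    keras.layers.Dense(5, activation='softmax'),  # one output unit per rating class
])
model2.compile(optimizer=keras.optimizers.SGD(lr=0.0001, momentum=0.9),
               loss='sparse_categorical_crossentropy')  # y_train holds integer classes 0-4
model2.fit(X_train, y_train, epochs=25, batch_size=44, validation_split=0.1, verbose=1)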
y_train_pred2 = model2.predict_classes(X_train, verbose=0)
print('First 3 predictions: ', y_train_pred2[:3])
## First 3 predictions: [4 3 4]
# build the same side-by-side table for model2's training predictions
y_train_pred2 = pd.DataFrame(y_train_pred2)
y_train_pred2.columns = ['predicted']
y_train2 = pd.DataFrame(y_train)
y_train2.columns = ['OH_Rating']
y_train_pred2.index = y_train2.index
Train2 = pd.concat([y_train2['OH_Rating'], y_trainNames1['Rating'], y_train_pred2['predicted']], axis=1)
print(Train2)
## OH_Rating Rating predicted
## 583 4 5 4
## 11 3 4 3
## 443 4 5 4
## 442 2 3 2
## 267 3 4 2
## .. ... ... ...
## 319 4 5 4
## 364 4 5 4
## 412 4 5 4
## 141 4 5 4
## 454 3 4 3
##
## [491 rows x 3 columns]
As before, the OH_Rating column is the encoded 0-through-4 version of the original 1-through-5 rating shown in the Rating column.
y_test_pred2 = model2.predict_classes(X_test, verbose=0)
# and the same table for model2's test predictions
y_test_pred2 = pd.DataFrame(y_test_pred2)
y_test_pred2.columns = ['predicted']
y_test2 = pd.DataFrame(y_test)
y_test2.columns = ['OH_Rating']
y_test_pred2.index = y_test2.index
Test2 = pd.concat([y_test2['OH_Rating'], y_testNames1['Rating'], y_test_pred2['predicted']], axis=1)
print(Test2)
## OH_Rating Rating predicted
## 88 2 3 0
## 502 1 2 1
## 581 4 5 4
## 307 3 4 1
## 300 3 4 3
## .. ... ... ...
## 98 2 3 3
## 322 4 5 4
## 382 3 4 4
## 365 4 5 4
## 510 4 5 0
##
## [123 rows x 3 columns]
s = sum(Train2['OH_Rating']==Train2['predicted'])
l = len(Train2['Rating'])
accTrain2 = s/l
print('Training Correctly Predicted:',s,'Training Accuracy:',accTrain2,'\n')
## Training Correctly Predicted: 456 Training Accuracy: 0.9287169042769857
That's again good training accuracy on these mixed ratings, slightly better than the first model's.
s = sum(Test2['OH_Rating']==Test2['predicted'])
l = len(Test2['OH_Rating'])
accTest2 = s/l
print('Testing Correctly Predicted:',s,'Testing Accuracy:',accTest2)
## Testing Correctly Predicted: 75 Testing Accuracy: 0.6097560975609756
Model 2 above, which added an extra ReLU layer, increased the units per layer, and lowered the learning rate, overfit even more: testing accuracy dropped by about 5 percentage points while training accuracy rose by about 3.
The model above should be tuned further to get better results. Trying thousands of hyperparameter combinations can take a lot of time, though, and unless you have TensorFlow and Keras set up to run on your gaming GPU (if you have one), I wouldn't recommend it.
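If you do want to search, a small loop over a handful of settings with early stopping keeps the cost manageable. In the sketch below, build_model is a hypothetical helper, and the learning rates and unit counts are example values only, not the settings used above.
from tensorflow import keras
def build_model(lr, units):
    # hypothetical helper: rebuilds a small network for one hyperparameter setting
    m = keras.models.Sequential([
        keras.layers.Dense(units, activation='relu', input_dim=X_train.shape[1]),
        keras.layers.Dense(5, activation='softmax'),
    ])
    m.compile(optimizer=keras.optimizers.SGD(lr=lr, momentum=0.9),
              loss='sparse_categorical_crossentropy')
    return m
# stop a run early once validation loss stops improving, keeping the best weights
stop_early = keras.callbacks.EarlyStopping(monitor='val_loss', patience=3, restore_best_weights=True)
results = {}
for lr in (0.001, 0.0001):
    for units in (50, 100):
        m = build_model(lr, units)
        h = m.fit(X_train, y_train, epochs=25, batch_size=44,
                  validation_split=0.1, callbacks=[stop_early], verbose=0)
        results[(lr, units)] = min(h.history['val_loss'])
print(sorted(results.items(), key=lambda kv: kv[1]))  # settings ranked by best validation loss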