Introduction

This is a machine learning case study analyzing the churn rate for customers of a telecom company. The dataset can be downloaded here. The needed packages are loaded below. If you have trouble loading any package after installation, try restarting R to a clean session.
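If any of these are not yet installed, a one-time install from CRAN should cover them (a minimal sketch, assuming a CRAN mirror is configured):

install.packages(c("caret", "glmnet", "ranger", "caretEnsemble"))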

library(caret)
## Loading required package: ggplot2
## Loading required package: lattice
library(glmnet)
## Loading required package: Matrix
## Loaded glmnet 4.1-4
library(ranger)
library(caretEnsemble)
## 
## Attaching package: 'caretEnsemble'
## The following object is masked from 'package:ggplot2':
## 
##     autoplot
# load() restores churn_x and churn_y into the environment and returns
# their names, so churn itself holds a character vector of object names
churn <- load("Churn.RData")
head(churn)
## [1] "churn_x" "churn_y"

Folding

Using createFolds() from the caret package, the data is split into k groups that define the resampling scheme for training and testing.

# Create custom indices: myFolds
myFolds <- createFolds(churn_y, k = 5)
head(myFolds)
## $Fold1
##  [1]   1  11  21  22  23  26  29  32  34  36  48  57  59  62  70  77  79  82  87
## [20]  89  92  97 104 106 109 120 123 146 147 158 162 164 167 173 178 191 193 196
## [39] 199 204 207 219 221 225 226 230 235 247 248 250
## 
## $Fold2
##  [1]   4   6   7  14  30  31  40  44  50  53  61  64  66  67  69  71  84  85  86
## [20]  90  96 105 110 115 118 130 132 136 139 140 142 143 149 152 154 161 165 170
## [39] 171 183 184 187 188 189 192 201 202 203 217 243
## 
## $Fold3
##  [1]   2   8  16  19  28  33  37  39  41  45  46  47  51  56  58  65  68  74  78
## [20]  88  91  93  94 103 108 111 124 128 131 138 141 145 151 155 156 168 169 172
## [39] 185 186 190 195 200 205 209 211 212 213 216 237 239
## 
## $Fold4
##  [1]   3   5  13  18  24  25  35  38  42  43  49  52  54  55  60  63  73  81  98
## [20]  99 101 102 112 117 119 122 127 129 133 134 135 137 144 150 163 166 177 179
## [39] 182 194 198 206 210 218 220 227 228 232 233 249
## 
## $Fold5
##  [1]   9  10  12  15  17  20  27  72  75  76  80  83  95 100 107 113 114 116 121
## [20] 125 126 148 153 157 159 160 174 175 176 180 181 197 208 214 215 222 223 224
## [39] 229 231 234 236 238 240 241 242 244 245 246
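createFolds() stratifies on the outcome by default, so each fold should carry roughly the same churn/no-churn mix. A quick sketch to verify (output omitted):

# Class counts within each fold; the columns should look similar
sapply(myFolds, function(idx) table(churn_y[idx]))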

Controlling

Also from caret, the trainControl() function is used to set up the details of training. Here summaryFunction is set to twoClassSummary instead of defaultSummary, which lets caret compute measures specific to two-class problems like churn/no churn. Setting classProbs to TRUE makes class probabilities available, which ROC analysis depends on, so this is key. It should be noted that trainControl()'s default resampling method is "boot", but supplying index = myFolds overrides it with the custom folds. One subtlety: index expects the training rows of each resample, so passing createFolds() output directly means each model trains on a single fold of roughly 50 rows and is evaluated on the rest; createFolds(churn_y, k = 5, returnTrain = TRUE) would give the more conventional train-on-four-folds setup, and the small-class warnings from glmnet below trace back to these small training sets.

# Create reusable trainControl object: myControl
myControl <- trainControl(
  summaryFunction = twoClassSummary,
  classProbs = TRUE, # IMPORTANT!
  verboseIter = TRUE,
  savePredictions = TRUE,
  index = myFolds
)

Training Generalized Linear Models

Generalized linear models can serve as a great baseline. They are relatively simple, quick to fit, and easily interpreted. Note the use of ROC as the selection metric and myControl for trControl.

# Fit glmnet model: model_glmnet
model_glmnet <- train(
  x = churn_x, 
  y = churn_y,
  metric = "ROC",
  method = "glmnet",
  trControl = myControl
)
## + Fold1: alpha=0.10, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold1: alpha=0.10, lambda=0.01821 
## + Fold1: alpha=0.55, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold1: alpha=0.55, lambda=0.01821 
## + Fold1: alpha=1.00, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold1: alpha=1.00, lambda=0.01821 
## + Fold2: alpha=0.10, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold2: alpha=0.10, lambda=0.01821 
## + Fold2: alpha=0.55, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold2: alpha=0.55, lambda=0.01821 
## + Fold2: alpha=1.00, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold2: alpha=1.00, lambda=0.01821 
## + Fold3: alpha=0.10, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold3: alpha=0.10, lambda=0.01821 
## + Fold3: alpha=0.55, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold3: alpha=0.55, lambda=0.01821 
## + Fold3: alpha=1.00, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold3: alpha=1.00, lambda=0.01821 
## + Fold4: alpha=0.10, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold4: alpha=0.10, lambda=0.01821 
## + Fold4: alpha=0.55, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold4: alpha=0.55, lambda=0.01821 
## + Fold4: alpha=1.00, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold4: alpha=1.00, lambda=0.01821 
## + Fold5: alpha=0.10, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold5: alpha=0.10, lambda=0.01821 
## + Fold5: alpha=0.55, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold5: alpha=0.55, lambda=0.01821 
## + Fold5: alpha=1.00, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold5: alpha=1.00, lambda=0.01821 
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.0182 on full training set
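One advantage of glmnet is that the fitted model has interpretable coefficients, many shrunk exactly to zero. A minimal sketch to extract them at the selected penalty (output omitted):

# Coefficients of the final elastic net fit at the chosen lambda
coef(model_glmnet$finalModel, s = model_glmnet$bestTune$lambda)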

Training Random Forest Models

Random forest models are fit here via the "ranger" method. ranger tends to be a bit quicker than randomForest and uses less memory. There is a slight drawback: random forests do not give us the easily interpreted coefficients that model_glmnet did. The model uses the same five custom cross-validation folds, which allows easy comparison to model_glmnet.

# Fit random forest: model_rf
model_rf <- train(
  x = churn_x,
  y = churn_y,
  metric = "ROC",
  method = "ranger",
  trControl = myControl
)
## + Fold1: mtry= 2, min.node.size=1, splitrule=gini 
## - Fold1: mtry= 2, min.node.size=1, splitrule=gini 
## + Fold1: mtry=36, min.node.size=1, splitrule=gini 
## - Fold1: mtry=36, min.node.size=1, splitrule=gini 
## + Fold1: mtry=70, min.node.size=1, splitrule=gini 
## - Fold1: mtry=70, min.node.size=1, splitrule=gini 
## + Fold1: mtry= 2, min.node.size=1, splitrule=extratrees 
## - Fold1: mtry= 2, min.node.size=1, splitrule=extratrees 
## + Fold1: mtry=36, min.node.size=1, splitrule=extratrees 
## - Fold1: mtry=36, min.node.size=1, splitrule=extratrees 
## + Fold1: mtry=70, min.node.size=1, splitrule=extratrees 
## - Fold1: mtry=70, min.node.size=1, splitrule=extratrees 
## + Fold2: mtry= 2, min.node.size=1, splitrule=gini 
## - Fold2: mtry= 2, min.node.size=1, splitrule=gini 
## + Fold2: mtry=36, min.node.size=1, splitrule=gini 
## - Fold2: mtry=36, min.node.size=1, splitrule=gini 
## + Fold2: mtry=70, min.node.size=1, splitrule=gini 
## - Fold2: mtry=70, min.node.size=1, splitrule=gini 
## + Fold2: mtry= 2, min.node.size=1, splitrule=extratrees 
## - Fold2: mtry= 2, min.node.size=1, splitrule=extratrees 
## + Fold2: mtry=36, min.node.size=1, splitrule=extratrees 
## - Fold2: mtry=36, min.node.size=1, splitrule=extratrees 
## + Fold2: mtry=70, min.node.size=1, splitrule=extratrees 
## - Fold2: mtry=70, min.node.size=1, splitrule=extratrees 
## + Fold3: mtry= 2, min.node.size=1, splitrule=gini 
## - Fold3: mtry= 2, min.node.size=1, splitrule=gini 
## + Fold3: mtry=36, min.node.size=1, splitrule=gini 
## - Fold3: mtry=36, min.node.size=1, splitrule=gini 
## + Fold3: mtry=70, min.node.size=1, splitrule=gini 
## - Fold3: mtry=70, min.node.size=1, splitrule=gini 
## + Fold3: mtry= 2, min.node.size=1, splitrule=extratrees 
## - Fold3: mtry= 2, min.node.size=1, splitrule=extratrees 
## + Fold3: mtry=36, min.node.size=1, splitrule=extratrees 
## - Fold3: mtry=36, min.node.size=1, splitrule=extratrees 
## + Fold3: mtry=70, min.node.size=1, splitrule=extratrees 
## - Fold3: mtry=70, min.node.size=1, splitrule=extratrees 
## + Fold4: mtry= 2, min.node.size=1, splitrule=gini 
## - Fold4: mtry= 2, min.node.size=1, splitrule=gini 
## + Fold4: mtry=36, min.node.size=1, splitrule=gini 
## - Fold4: mtry=36, min.node.size=1, splitrule=gini 
## + Fold4: mtry=70, min.node.size=1, splitrule=gini 
## - Fold4: mtry=70, min.node.size=1, splitrule=gini 
## + Fold4: mtry= 2, min.node.size=1, splitrule=extratrees 
## - Fold4: mtry= 2, min.node.size=1, splitrule=extratrees 
## + Fold4: mtry=36, min.node.size=1, splitrule=extratrees 
## - Fold4: mtry=36, min.node.size=1, splitrule=extratrees 
## + Fold4: mtry=70, min.node.size=1, splitrule=extratrees 
## - Fold4: mtry=70, min.node.size=1, splitrule=extratrees 
## + Fold5: mtry= 2, min.node.size=1, splitrule=gini 
## - Fold5: mtry= 2, min.node.size=1, splitrule=gini 
## + Fold5: mtry=36, min.node.size=1, splitrule=gini 
## - Fold5: mtry=36, min.node.size=1, splitrule=gini 
## + Fold5: mtry=70, min.node.size=1, splitrule=gini 
## - Fold5: mtry=70, min.node.size=1, splitrule=gini 
## + Fold5: mtry= 2, min.node.size=1, splitrule=extratrees 
## - Fold5: mtry= 2, min.node.size=1, splitrule=extratrees 
## + Fold5: mtry=36, min.node.size=1, splitrule=extratrees 
## - Fold5: mtry=36, min.node.size=1, splitrule=extratrees 
## + Fold5: mtry=70, min.node.size=1, splitrule=extratrees 
## - Fold5: mtry=70, min.node.size=1, splitrule=extratrees 
## Aggregating results
## Selecting tuning parameters
## Fitting mtry = 36, splitrule = extratrees, min.node.size = 1 on full training set
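Although ranger produces no coefficients, it can report variable importance if asked. A sketch, assuming we are willing to refit with ranger's importance argument (train() forwards it to ranger through ...):

# Refit with impurity-based importance so varImp() has scores to report
model_rf_imp <- train(
  x = churn_x,
  y = churn_y,
  metric = "ROC",
  method = "ranger",
  importance = "impurity",
  trControl = myControl
)
varImp(model_rf_imp)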

Evaluating Models

After combining the models into a list, the list is passed to the resamples() function, which gathers model performance across the shared folds. Evaluation is done on three criteria: ROC, sensitivity (Sens), and specificity (Spec).

ROC will be the metric of interest here when evaluating model performance. The higher the better, which points to item2, the random forest model.

model_list <- list(item1 = model_glmnet, item2 = model_rf)

# Pass model_list to resamples(): resamples
resamples <- resamples(model_list)

# Summarize the results
summary(resamples)
## 
## Call:
## summary.resamples(object = resamples)
## 
## Models: item1, item2 
## Number of resamples: 5 
## 
## ROC 
##            Min.   1st Qu.    Median      Mean   3rd Qu.      Max. NA's
## item1 0.5439080 0.5519452 0.5888594 0.5804555 0.6052571 0.6123077    0
## item2 0.5841379 0.6859341 0.7406057 0.7046133 0.7520000 0.7603890    0
## 
## Sens 
##            Min.   1st Qu.    Median      Mean   3rd Qu.      Max. NA's
## item1 0.9367816 0.9428571 0.9600000 0.9575829 0.9655172 0.9827586    0
## item2 0.9770115 0.9885714 0.9942529 0.9908243 0.9942857 1.0000000    0
## 
## Spec 
##             Min.    1st Qu.     Median       Mean    3rd Qu.       Max. NA's
## item1 0.03846154 0.04000000 0.07692308 0.07815385 0.11538462 0.12000000    0
## item2 0.00000000 0.03846154 0.04000000 0.04646154 0.07692308 0.07692308    0
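caret can also formalize this comparison: calling diff() on a resamples object computes fold-by-fold differences between the models along with paired t-tests. A quick sketch (output omitted):

# Paired differences in ROC, Sens, and Spec between item1 and item2
summary(diff(resamples))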

Visual Evaluation

Visualizing the ROC metric is quite useful, letting us see which model has the better area under the ROC curve (AUC). The first graphic, a box-and-whisker plot, concurs with our resamples() summary: item2, or model_rf, has the higher mean across folds. The second graphic, a simple plot of the five folds, shows that item2, the random forest model, performed better every time. For reference, its mean ROC of roughly 0.70 would be considered moderately strong. Personally, I use 0.70 as the cutoff for a model being reasonable, with confidence improving as the metric approaches 1.0.

# Create bwplot
bwplot(resamples, metric = "ROC")

# Create xyplot
xyplot(resamples, metric = "ROC")
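A dot plot is a third view caret supports for resamples objects, showing each model's mean metric with confidence bounds:

# Mean ROC per model with confidence intervals
dotplot(resamples, metric = "ROC")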

Stacking

Stacking models is a means of creating an ensemble by finding a good combination of several classification models, using for example a GLM, elastic net regression, or greedy optimization. Here the meta-model is fit via "glm"; with a two-class outcome this is logistic regression, as the binomial family in the summary output confirms. Notice the p-values in the summary of the stack built from both the random forest and glmnet base models.

models <- caretList(
  x = churn_x,
  y = churn_y,
  metric = "ROC",
  trControl = myControl,
  methodList = c("glmnet", "ranger")
)
## + Fold1: alpha=0.10, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold1: alpha=0.10, lambda=0.01821 
## + Fold1: alpha=0.55, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold1: alpha=0.55, lambda=0.01821 
## + Fold1: alpha=1.00, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold1: alpha=1.00, lambda=0.01821 
## + Fold2: alpha=0.10, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold2: alpha=0.10, lambda=0.01821 
## + Fold2: alpha=0.55, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold2: alpha=0.55, lambda=0.01821 
## + Fold2: alpha=1.00, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold2: alpha=1.00, lambda=0.01821 
## + Fold3: alpha=0.10, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold3: alpha=0.10, lambda=0.01821 
## + Fold3: alpha=0.55, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold3: alpha=0.55, lambda=0.01821 
## + Fold3: alpha=1.00, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold3: alpha=1.00, lambda=0.01821 
## + Fold4: alpha=0.10, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold4: alpha=0.10, lambda=0.01821 
## + Fold4: alpha=0.55, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold4: alpha=0.55, lambda=0.01821 
## + Fold4: alpha=1.00, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold4: alpha=1.00, lambda=0.01821 
## + Fold5: alpha=0.10, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold5: alpha=0.10, lambda=0.01821 
## + Fold5: alpha=0.55, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold5: alpha=0.55, lambda=0.01821 
## + Fold5: alpha=1.00, lambda=0.01821
## Warning in lognet(xd, is.sparse, ix, jx, y, weights, offset, alpha, nobs, : one
## multinomial or binomial class has fewer than 8 observations; dangerous ground
## - Fold5: alpha=1.00, lambda=0.01821 
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.0182 on full training set
## + Fold1: mtry= 2, min.node.size=1, splitrule=gini 
## - Fold1: mtry= 2, min.node.size=1, splitrule=gini 
## + Fold1: mtry=36, min.node.size=1, splitrule=gini 
## - Fold1: mtry=36, min.node.size=1, splitrule=gini 
## + Fold1: mtry=70, min.node.size=1, splitrule=gini 
## - Fold1: mtry=70, min.node.size=1, splitrule=gini 
## + Fold1: mtry= 2, min.node.size=1, splitrule=extratrees 
## - Fold1: mtry= 2, min.node.size=1, splitrule=extratrees 
## + Fold1: mtry=36, min.node.size=1, splitrule=extratrees 
## - Fold1: mtry=36, min.node.size=1, splitrule=extratrees 
## + Fold1: mtry=70, min.node.size=1, splitrule=extratrees 
## - Fold1: mtry=70, min.node.size=1, splitrule=extratrees 
## + Fold2: mtry= 2, min.node.size=1, splitrule=gini 
## - Fold2: mtry= 2, min.node.size=1, splitrule=gini 
## + Fold2: mtry=36, min.node.size=1, splitrule=gini 
## - Fold2: mtry=36, min.node.size=1, splitrule=gini 
## + Fold2: mtry=70, min.node.size=1, splitrule=gini 
## - Fold2: mtry=70, min.node.size=1, splitrule=gini 
## + Fold2: mtry= 2, min.node.size=1, splitrule=extratrees 
## - Fold2: mtry= 2, min.node.size=1, splitrule=extratrees 
## + Fold2: mtry=36, min.node.size=1, splitrule=extratrees 
## - Fold2: mtry=36, min.node.size=1, splitrule=extratrees 
## + Fold2: mtry=70, min.node.size=1, splitrule=extratrees 
## - Fold2: mtry=70, min.node.size=1, splitrule=extratrees 
## + Fold3: mtry= 2, min.node.size=1, splitrule=gini 
## - Fold3: mtry= 2, min.node.size=1, splitrule=gini 
## + Fold3: mtry=36, min.node.size=1, splitrule=gini 
## - Fold3: mtry=36, min.node.size=1, splitrule=gini 
## + Fold3: mtry=70, min.node.size=1, splitrule=gini 
## - Fold3: mtry=70, min.node.size=1, splitrule=gini 
## + Fold3: mtry= 2, min.node.size=1, splitrule=extratrees 
## - Fold3: mtry= 2, min.node.size=1, splitrule=extratrees 
## + Fold3: mtry=36, min.node.size=1, splitrule=extratrees 
## - Fold3: mtry=36, min.node.size=1, splitrule=extratrees 
## + Fold3: mtry=70, min.node.size=1, splitrule=extratrees 
## - Fold3: mtry=70, min.node.size=1, splitrule=extratrees 
## + Fold4: mtry= 2, min.node.size=1, splitrule=gini 
## - Fold4: mtry= 2, min.node.size=1, splitrule=gini 
## + Fold4: mtry=36, min.node.size=1, splitrule=gini 
## - Fold4: mtry=36, min.node.size=1, splitrule=gini 
## + Fold4: mtry=70, min.node.size=1, splitrule=gini 
## - Fold4: mtry=70, min.node.size=1, splitrule=gini 
## + Fold4: mtry= 2, min.node.size=1, splitrule=extratrees 
## - Fold4: mtry= 2, min.node.size=1, splitrule=extratrees 
## + Fold4: mtry=36, min.node.size=1, splitrule=extratrees 
## - Fold4: mtry=36, min.node.size=1, splitrule=extratrees 
## + Fold4: mtry=70, min.node.size=1, splitrule=extratrees 
## - Fold4: mtry=70, min.node.size=1, splitrule=extratrees 
## + Fold5: mtry= 2, min.node.size=1, splitrule=gini 
## - Fold5: mtry= 2, min.node.size=1, splitrule=gini 
## + Fold5: mtry=36, min.node.size=1, splitrule=gini 
## - Fold5: mtry=36, min.node.size=1, splitrule=gini 
## + Fold5: mtry=70, min.node.size=1, splitrule=gini 
## - Fold5: mtry=70, min.node.size=1, splitrule=gini 
## + Fold5: mtry= 2, min.node.size=1, splitrule=extratrees 
## - Fold5: mtry= 2, min.node.size=1, splitrule=extratrees 
## + Fold5: mtry=36, min.node.size=1, splitrule=extratrees 
## - Fold5: mtry=36, min.node.size=1, splitrule=extratrees 
## + Fold5: mtry=70, min.node.size=1, splitrule=extratrees 
## - Fold5: mtry=70, min.node.size=1, splitrule=extratrees 
## Aggregating results
## Selecting tuning parameters
## Fitting mtry = 36, splitrule = extratrees, min.node.size = 1 on full training set
# Create ensemble model: stack
stack <- caretStack(all.models = models, method = "glm")

# Look at summary
summary(stack)
## 
## Call:
## NULL
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.8798  -0.4951  -0.4047  -0.3700   2.4333  
## 
## Coefficients:
##             Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   2.6799     0.6110   4.386 1.15e-05 ***
## glmnet        3.5721     0.8076   4.423 9.74e-06 ***
## ranger       -8.9324     1.1709  -7.628 2.38e-14 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 765.13  on 999  degrees of freedom
## Residual deviance: 698.57  on 997  degrees of freedom
## AIC: 704.57
## 
## Number of Fisher Scoring iterations: 5
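To put the stacked model to use, predict() on the caretStack object returns churn probabilities when type = "prob". A minimal sketch; in practice you would score a held-out test set rather than the training predictors:

# Predicted churn probabilities from the ensemble
stack_probs <- predict(stack, newdata = churn_x, type = "prob")
head(stack_probs)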