LOADING DATA INTO THE R ENVIRONMENT
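The code that loads and partitions the data is not reproduced in this report. A minimal sketch of the step is shown below; the file name data.csv, the target column name Target, and the seed are assumptions, and the roughly 80/20 split is inferred from the 8,001 training and 1,999 test observations reported later.

library(caret)   # model training, resampling and evaluation
library(rpart)   # CART decision trees

# Read the raw data (file and column names are hypothetical)
raw_data <- read.csv("data.csv", stringsAsFactors = TRUE)

# Reproducible stratified 80/20 split on the binary target
set.seed(123)
train_index <- createDataPartition(raw_data$Target, p = 0.8, list = FALSE)
train_data  <- raw_data[train_index, ]
test_data   <- raw_data[-train_index, ]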

TRAINING THE DECISION TREE MODEL

Training the Model Using Information Gain

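The call that produced the output below is not shown; a plausible sketch uses caret's train() with method = "rpart", 10-fold cross-validation, and the information-gain splitting criterion passed through to rpart. The formula and object names follow the hypothetical loading step above.

# 10-fold cross-validation, matching the resampling summary below
train_control <- trainControl(method = "cv", number = 10)

# CART model tuned over the complexity parameter cp;
# split = "information" selects the information-gain criterion
set.seed(123)
dt_model <- train(Target ~ ., data = train_data,
                  method     = "rpart",
                  parms      = list(split = "information"),
                  trControl  = train_control,
                  tuneLength = 3)
dt_model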
## CART 
## 
## 8001 samples
##    3 predictor
##    2 classes: 'No', 'Yes' 
## 
## No pre-processing
## Resampling: Cross-Validated (10 fold) 
## Summary of sample sizes: 7200, 7200, 7201, 7200, 7200, 7201, ... 
## Resampling results across tuning parameters:
## 
##   cp           Accuracy   Kappa    
##   0.005617978  0.9703756  0.3888087
##   0.037453184  0.9711270  0.4398514
##   0.084269663  0.9680032  0.2068109
## 
## Accuracy was used to select the optimal model using the largest value.
## The final value used for the model was cp = 0.03745318.

Variable Importance in Decision Tree Model
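The importance scores and plot are not reproduced here; with caret they would typically be obtained as sketched below (dt_model is the hypothetical fitted object from the training step).

# Scaled variable importance from the fitted CART model
dt_importance <- varImp(dt_model)
dt_importance

# Plot of the importance scores
plot(dt_importance, main = "Variable Importance - Decision Tree")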

TESTING THE DECISION TREE MODEL

Confusion Matrix at 50% Cut-Off Probability

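The prediction code is omitted; a sketch consistent with the output below scores the test set, classifies observations whose predicted probability of 'Yes' reaches 0.5, and passes the result to caret's confusionMatrix() (dt_model, test_data and Target are the hypothetical objects defined earlier).

# Predicted probability of the positive class on the test set
pred_prob <- predict(dt_model, newdata = test_data, type = "prob")[, "Yes"]

# Classify at the 50% cut-off and compare with the actual labels
pred_class <- factor(ifelse(pred_prob >= 0.5, "Yes", "No"), levels = c("Yes", "No"))
actual     <- factor(test_data$Target, levels = c("Yes", "No"))
confusionMatrix(data = pred_class, reference = actual, positive = "Yes")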
## Confusion Matrix and Statistics
## 
##          Actual
## Predicted  Yes   No
##       Yes   30   15
##       No    36 1918
##                                           
##                Accuracy : 0.9745          
##                  95% CI : (0.9666, 0.9809)
##     No Information Rate : 0.967           
##     P-Value [Acc > NIR] : 0.031048        
##                                           
##                   Kappa : 0.5279          
##                                           
##  Mcnemar's Test P-Value : 0.005101        
##                                           
##             Sensitivity : 0.45455         
##             Specificity : 0.99224         
##          Pos Pred Value : 0.66667         
##          Neg Pred Value : 0.98158         
##              Prevalence : 0.03302         
##          Detection Rate : 0.01501         
##    Detection Prevalence : 0.02251         
##       Balanced Accuracy : 0.72339         
##                                           
##        'Positive' Class : Yes             
## 

Performance Metrics at Different Cut-Off Probabilities

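The table below was presumably generated by repeating the confusion-matrix calculation over a grid of cut-off probabilities; one way to reproduce it is sketched here, reusing pred_prob and actual from the previous step.

# Accuracy, sensitivity, specificity and kappa at each cut-off
cutoffs <- seq(0, 1, by = 0.05)
metric_list <- lapply(cutoffs, function(k) {
  pred_k <- factor(ifelse(pred_prob >= k, "Yes", "No"), levels = c("Yes", "No"))
  cm <- confusionMatrix(pred_k, actual, positive = "Yes")
  data.frame(cutoff      = k,
             Accuracy    = unname(cm$overall["Accuracy"]),
             Sensitivity = unname(cm$byClass["Sensitivity"]),
             Specificity = unname(cm$byClass["Specificity"]),
             Kappa       = unname(cm$overall["Kappa"]))
})
do.call(rbind, metric_list)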
##    cutoff   Accuracy Sensitivity Specificity    Kappa
## 1    0.00 0.03301651  1.0000000   0.0000000 0.0000000
## 2    0.05 0.92046023  0.8333333   0.9234351 0.3779228
## 3    0.10 0.92046023  0.8333333   0.9234351 0.3779228
## 4    0.15 0.92046023  0.8333333   0.9234351 0.3779228
## 5    0.20 0.97448724  0.4545455   0.9922400 0.5279024
## 6    0.25 0.97448724  0.4545455   0.9922400 0.5279024
## 7    0.30 0.97448724  0.4545455   0.9922400 0.5279024
## 8    0.35 0.97448724  0.4545455   0.9922400 0.5279024
## 9    0.40 0.97448724  0.4545455   0.9922400 0.5279024
## 10   0.45 0.97448724  0.4545455   0.9922400 0.5279024
## 11   0.50 0.97448724  0.4545455   0.9922400 0.5279024
## 12   0.55 0.97448724  0.4545455   0.9922400 0.5279024
## 13   0.60 0.97448724  0.4545455   0.9922400 0.5279024
## 14   0.65 0.96698349  0.0000000   1.0000000 0.0000000
## 15   0.70 0.96698349  0.0000000   1.0000000 0.0000000
## 16   0.75 0.96698349  0.0000000   1.0000000 0.0000000
## 17   0.80 0.96698349  0.0000000   1.0000000 0.0000000
## 18   0.85 0.96698349  0.0000000   1.0000000 0.0000000
## 19   0.90 0.96698349  0.0000000   1.0000000 0.0000000
## 20   0.95 0.96698349  0.0000000   1.0000000 0.0000000
## 21   1.00 0.96698349  0.0000000   1.0000000 0.0000000