Loading the needed packages
library(tidyverse)   # data wrangling and plotting
library(tidymodels)  # modelling framework: recipes, parsnip, workflows, yardstick
library(themis)      # recipe steps for handling class imbalance
library(vip)         # variable importance plots
library(xgboost)     # engine for the boosted tree model
library(rpart.plot)  # plotting fitted rpart decision trees
Loading the dataset
churn <- read_csv("churn.csv")                                # full churn data set
df <- churn %>% slice_head(n = 1000)                          # first 1,000 rows, used for modelling
new_data <- churn %>% slice_tail(n = 10) %>% select(!churn)   # last 10 rows with the outcome dropped, kept for prediction
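The modelling chunks themselves are not echoed in this output. As a minimal sketch of the shared setup the three models below build on, assuming the outcome column is called churn and that themis is used to balance the classes (the seed, the split proportion and the step_smote() step are illustrative assumptions, not the values actually used):

set.seed(123)
df <- df %>% mutate(churn = factor(churn))                    # outcome as a factor for classification
churn_split <- initial_split(df, prop = 0.8, strata = churn)
churn_train <- training(churn_split)
churn_test  <- testing(churn_split)

# shared preprocessing: dummy-encode nominal predictors, normalise, then balance classes with SMOTE
churn_rec <- recipe(churn ~ ., data = churn_train) %>%
  step_dummy(all_nominal_predictors()) %>%
  step_normalize(all_numeric_predictors()) %>%
  step_smote(churn)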
1. XGBoost model
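A sketch of how the boosted tree workflow might be specified and fitted with tidymodels, reusing churn_rec from the setup above (the number of trees is an illustrative assumption):

xgb_spec <- boost_tree(trees = 500) %>%
  set_engine("xgboost") %>%
  set_mode("classification")

xgb_fit <- workflow() %>%
  add_recipe(churn_rec) %>%
  add_model(xgb_spec) %>%
  fit(data = churn_train)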
Predicting on new data
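Class predictions and class probabilities for the ten held-out rows can then be produced as sketched below; the output that follows comes from the original fitted model.

bind_cols(
  predict(xgb_fit, new_data),                 # .pred_class
  predict(xgb_fit, new_data, type = "prob")   # .pred_churn, .pred_no_churn
)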
## # A tibble: 10 × 3
##    .pred_class .pred_churn .pred_no_churn
##    <fct>             <dbl>          <dbl>
##  1 no_churn       0.000384           1.00
##  2 no_churn       0.000384           1.00
##  3 no_churn       0.000384           1.00
##  4 no_churn       0.000384           1.00
##  5 no_churn       0.000384           1.00
##  6 no_churn       0.000384           1.00
##  7 no_churn       0.000384           1.00
##  8 no_churn       0.000173           1.00
##  9 no_churn       0.000384           1.00
## 10 no_churn       0.000360           1.00
2. Random Forest model
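The random forest follows the same recipe and workflow pattern; only the model specification changes. The ranger engine and importance = "impurity" are assumptions, the latter so that variable importance can be extracted later. Predictions for the ten new rows are obtained exactly as for XGBoost, with rf_fit in place of xgb_fit.

rf_spec <- rand_forest(trees = 500) %>%
  set_engine("ranger", importance = "impurity") %>%
  set_mode("classification")

rf_fit <- workflow() %>%
  add_recipe(churn_rec) %>%
  add_model(rf_spec) %>%
  fit(data = churn_train)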
Predicting on new data
## # A tibble: 10 × 3
##    .pred_class .pred_churn .pred_no_churn
##    <fct>             <dbl>          <dbl>
##  1 no_churn          0.376          0.624
##  2 no_churn          0.376          0.624
##  3 no_churn          0.376          0.624
##  4 no_churn          0.376          0.624
##  5 no_churn          0.376          0.624
##  6 no_churn          0.376          0.624
##  7 no_churn          0.376          0.624
##  8 no_churn          0.335          0.665
##  9 no_churn          0.376          0.624
## 10 no_churn         0.0960          0.904
3. Decision Tree model
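Likewise for the decision tree, assuming the rpart engine (consistent with rpart.plot being loaded); predictions for the new rows are generated the same way, using tree_fit.

tree_spec <- decision_tree() %>%
  set_engine("rpart") %>%
  set_mode("classification")

tree_fit <- workflow() %>%
  add_recipe(churn_rec) %>%
  add_model(tree_spec) %>%
  fit(data = churn_train)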
Predicting on new data
## # A tibble: 10 × 3
##    .pred_class .pred_churn .pred_no_churn
##    <fct>             <dbl>          <dbl>
##  1 no_churn         0.0148          0.985
##  2 no_churn         0.0148          0.985
##  3 no_churn         0.0148          0.985
##  4 no_churn         0.0148          0.985
##  5 no_churn         0.0148          0.985
##  6 no_churn         0.0148          0.985
##  7 no_churn         0.0148          0.985
##  8 no_churn         0.0148          0.985
##  9 no_churn         0.0148          0.985
## 10 no_churn         0.0148          0.985
4. Comparing results
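One way to compare the three fitted workflows is to score each of them on the same held-out test set with a common set of yardstick metrics; a sketch (the choice of accuracy and ROC AUC is an assumption):

churn_metrics <- metric_set(accuracy, roc_auc)

list(xgboost = xgb_fit, random_forest = rf_fit, decision_tree = tree_fit) %>%
  map_dfr(function(fit) {
    augment(fit, churn_test) %>%                                     # adds .pred_class and probability columns
      churn_metrics(truth = churn, estimate = .pred_class, .pred_churn)
  }, .id = "model")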


5. Variable Importance
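Variable importance can be pulled out of a fitted workflow with vip; a sketch for the boosted tree fit (extract_fit_parsnip() comes from the workflows package):

xgb_fit %>%
  extract_fit_parsnip() %>%   # underlying parsnip model fit
  vip(num_features = 10)      # plot the 10 most important predictors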


6. Plotting the Decision Tree
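Finally, the underlying rpart object can be extracted from the decision tree workflow and drawn with rpart.plot (roundint = FALSE just silences a warning about the training data not being attached to the rpart call):

tree_fit %>%
  extract_fit_engine() %>%            # the raw rpart model
  rpart.plot(roundint = FALSE)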
