The dataset is individual loan data provided by the P2P lending company Prosper in 2014. It contains 113,937 loans and 81 variables, including the borrower's income characteristics, the borrower's delinquency history, and loan-level information (e.g., amount, interest rate, term).
## [1] 113937 81
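As a minimal sketch of this step (the file name `prosperLoanData.csv` is an assumption; adjust the path to wherever the raw CSV lives):

```r
# Load the Prosper loan data (file name assumed) and check its dimensions
loans <- read.csv("prosperLoanData.csv", stringsAsFactors = TRUE)

# Expect 113,937 loans and 81 variables
dim(loans)
```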
I manually selected ten variables based on my judgment.
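A selection along these lines can be made with `dplyr`; the ten column names below come from the Prosper data dictionary and are shown only to illustrate the subsetting step, not as the exact set used here:

```r
library(dplyr)

# Example subset of ten candidate variables (illustrative; the exact ten may differ)
loans_sub <- loans %>%
  select(LoanStatus, Term, BorrowerRate, EmploymentStatus,
         StatedMonthlyIncome, DebtToIncomeRatio, DelinquenciesLast7Years,
         CreditScoreRangeLower, LoanOriginalAmount, IsBorrowerHomeowner)
```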
Ideally, the proportions of events and non-events would be roughly half and half. However, default data, like any fraud dataset, is naturally imbalanced, since only a small fraction of people default or commit fraud. This class bias clearly exists in this dataset. One solution is to make sure the sample used to build the model contains the two classes in equal proportions.
##
## 0 1
## 96906 17026
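The counts above come from a binary default flag. A sketch of how such a flag can be derived from `LoanStatus` (the exact set of statuses treated as a default event is an assumption):

```r
# Flag defaults: 1 for defaulted/charged-off loans, 0 otherwise
# (which statuses count as "default" is an assumption)
loans_sub$default <- ifelse(loans_sub$LoanStatus %in% c("Defaulted", "Chargedoff"), 1, 0)

table(loans_sub$default)
```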
I create my training and testing samples by enforcing equal class proportions in the sampling. In other words, the numbers of "0" and "1" in my training dataset are the same.
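A minimal sketch of this balanced sampling (the 70% split ratio and the `default` column name are assumptions):

```r
set.seed(100)

ones  <- loans_sub[loans_sub$default == 1, ]
zeros <- loans_sub[loans_sub$default == 0, ]

# Take 70% of the minority class, then the same number of majority-class rows,
# so the training data contains equal numbers of 0s and 1s
n_train <- floor(0.7 * nrow(ones))
train_ones  <- ones[sample(nrow(ones),  n_train), ]
train_zeros <- zeros[sample(nrow(zeros), n_train), ]
train_data  <- rbind(train_ones, train_zeros)

# Everything not used for training goes to the test set
test_data <- loans_sub[!(rownames(loans_sub) %in% rownames(train_data)), ]
```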
Although some models can impute missing values with the mean, median, or mode by default, doing so during model fitting increases the build time. Therefore, to avoid the wait, I impute missing values up front: numeric variables with the median (since most variables are highly skewed, the mean would be distorted by outliers and fat tails) and factor variables with the mode (the most frequent value).
##
## FALSE TRUE
## 254045 8151
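A minimal sketch of the imputation step described above (the `Mode` helper is written out here for illustration; it is not a built-in function):

```r
# Helper: most frequent value (mode) of a vector, ignoring NAs
Mode <- function(x) {
  x <- x[!is.na(x)]
  names(sort(table(x), decreasing = TRUE))[1]
}

# Impute numeric columns with the median, factor columns with the mode
for (col in names(train_data)) {
  if (is.numeric(train_data[[col]])) {
    train_data[[col]][is.na(train_data[[col]])] <-
      median(train_data[[col]], na.rm = TRUE)
  } else {
    train_data[[col]][is.na(train_data[[col]])] <-
      Mode(train_data[[col]])
  }
}
```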
## One_minus_specificity sensitivity Threshold
## 1 0.00000000 0.0000000 1.00
## 2 0.00000000 0.0000000 0.98
## 3 0.00000000 0.0000000 0.96
## 4 0.00000000 0.0000000 0.94
## 5 0.00000000 0.0000000 0.92
## 6 0.00000000 0.0000000 0.90
## 7 0.00000000 0.0000000 0.88
## 8 0.00000000 0.0000000 0.86
## 9 0.00000000 0.0000000 0.84
## 10 0.00000000 0.0000000 0.82
## 11 0.00000000 0.0000000 0.80
## 12 0.08503553 0.3772514 0.78
## 13 0.08503553 0.3772514 0.76
## 14 0.08503553 0.3772514 0.74
## 15 0.08503553 0.3772514 0.72
## 16 0.08503553 0.3772514 0.70
## 17 0.08503553 0.3772514 0.68
## 18 0.08503553 0.3772514 0.66
## 19 0.08503553 0.3772514 0.64
## 20 0.08503553 0.3772514 0.62
## 21 0.19803972 0.5669538 0.60
## 22 0.19803972 0.5669538 0.58
## 23 0.19803972 0.5669538 0.56
## 24 0.19803972 0.5669538 0.54
## 25 0.19803972 0.5669538 0.52
## 26 0.19803972 0.5669538 0.50
## 27 0.19803972 0.5669538 0.48
## 28 0.42650727 0.7687940 0.46
## 29 0.42650727 0.7687940 0.44
## 30 0.42650727 0.7687940 0.42
## 31 0.42650727 0.7687940 0.40
## 32 0.43091966 0.7787784 0.38
## 33 0.43091966 0.7787784 0.36
## 34 0.43091966 0.7787784 0.34
## 35 0.43091966 0.7787784 0.32
## 36 0.43091966 0.7787784 0.30
## 37 1.00000000 1.0000000 0.28
## 38 1.00000000 1.0000000 0.26
## 39 1.00000000 1.0000000 0.24
## 40 1.00000000 1.0000000 0.22
## 41 1.00000000 1.0000000 0.20
## 42 1.00000000 1.0000000 0.18
## 43 1.00000000 1.0000000 0.16
## 44 1.00000000 1.0000000 0.14
## 45 1.00000000 1.0000000 0.12
## 46 1.00000000 1.0000000 0.10
## 47 1.00000000 1.0000000 0.08
## 48 1.00000000 1.0000000 0.06
## 49 1.00000000 1.0000000 0.04
## 50 1.00000000 1.0000000 0.02
## 51 1.00000000 1.0000000 0.00
## 52 1.00000000 1.0000000 -0.02
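The sensitivity / one-minus-specificity table above can be produced by sweeping a decision threshold over the predicted probabilities. A minimal sketch, assuming a fitted logistic regression `logit_model` and the test set from the earlier split (both names are assumptions):

```r
# Predicted default probabilities on the test set (model object name assumed)
pred_prob <- predict(logit_model, newdata = test_data, type = "response")
actual    <- test_data$default

# Thresholds from 1.00 down to -0.02 in steps of 0.02, as in the table above
thresholds <- seq(1, -0.02, by = -0.02)

roc_table <- data.frame(t(sapply(thresholds, function(th) {
  pred <- as.numeric(pred_prob > th)
  c(One_minus_specificity = mean(pred[actual == 0] == 1),  # false positive rate
    sensitivity           = mean(pred[actual == 1] == 1),  # true positive rate
    Threshold             = th)
})))
roc_table
```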
Resources: http://r-statistics.co/Information-Value-With-R.html