Multiple Logistic Regression and Binary Logistic Regression

DATA EXPLORATION and DATA PREPARATION

The data set contains information on 8,161 customers of an auto insurance company. It includes variables used to build a model that predicts the likelihood a customer will get into a car accident, TARGET_FLAG. TARGET_FLAG is a binary response variable in which 1 represents a person who gets into a car crash and 0 represents a person who does not. In addition, a model will be built to predict the cost of a claim, TARGET_AMT. TARGET_AMT is zero if a person does not get into a crash and varies with the severity of the accident.

##   INDEX TARGET_FLAG TARGET_AMT KIDSDRIV AGE HOMEKIDS YOJ    INCOME PARENT1
## 1     1           0          0        0  60        0  11  $67,349       No
## 2     2           0          0        0  43        0  11  $91,449       No
## 3     4           0          0        0  35        1  10  $16,039       No
## 4     5           0          0        0  51        0  14                No
## 5     6           0          0        0  50        0  NA $114,986       No
## 6     7           1       2946        0  34        1  12 $125,301      Yes
##    HOME_VAL MSTATUS SEX     EDUCATION           JOB TRAVTIME    CAR_USE
## 1       $0     z_No   M           PhD  Professional       14    Private
## 2 $257,252     z_No   M z_High School z_Blue Collar       22 Commercial
## 3 $124,191      Yes z_F z_High School      Clerical        5    Private
## 4 $306,251      Yes   M  <High School z_Blue Collar       32    Private
## 5 $243,925      Yes z_F           PhD        Doctor       36    Private
## 6       $0     z_No z_F     Bachelors z_Blue Collar       46 Commercial
##   BLUEBOOK TIF   CAR_TYPE RED_CAR OLDCLAIM CLM_FREQ REVOKED MVR_PTS
## 1 $14,230   11    Minivan     yes  $4,461         2      No       3
## 2 $14,940    1    Minivan     yes      $0         0      No       0
## 3  $4,010    4      z_SUV      no $38,690         2      No       3
## 4 $15,440    7    Minivan     yes      $0         0      No       0
## 5 $18,000    1      z_SUV      no $19,217         2     Yes       3
## 6 $17,430    1 Sports Car      no      $0         0      No       0
##   CAR_AGE          URBANICITY
## 1      18 Highly Urban/ Urban
## 2       1 Highly Urban/ Urban
## 3      10 Highly Urban/ Urban
## 4       6 Highly Urban/ Urban
## 5      17 Highly Urban/ Urban
## 6       7 Highly Urban/ Urban

The variables that will be used to build the model are:

  • AGE: Age of Driver
  • BLUEBOOK: Value of Vehicle
  • CAR_AGE: Vehicle Age
  • CAR_TYPE: Type of Car (Minivan, Panel Truck, SUV, Van, Pickup, Sports Car)
  • CAR_USE: Vehicle Use (Commercial or Private)
  • CLM_FREQ: Number of Claims in the Past 5 Years
  • EDUCATION: Maximum Education Level (<High School, High School, Bachelors, Masters, PhD)
  • HOMEKIDS: Number of Children at Home
  • HOME_VAL: Home Value
  • INCOME: Income
  • JOB: Job Category (Professional, Blue Collar, Doctor, Clerical, Home Maker, Lawyer, Manager, Student)
  • KIDSDRIV: Number of Driving Children
  • MSTATUS: Marital Status
  • MVR_PTS: Motor Vehicle Record Points
  • OLDCLAIM: Total Payout for Claims in the Past 5 Years
  • PARENT1: Single Parent
  • RED_CAR: A Red Car
  • REVOKED: License Revoked (Past 7 Years)
  • SEX: Gender
  • TIF: Number of Years the Person has Been a Customer (Time in Force)
  • TRAVTIME: Distance to Work
  • URBANICITY: Home/Work Area (Highly Urban/Urban, Highly Rural/Rural)
  • YOJ: Years on the Job

The variables of INCOME, HOME_VAL, BLUEBOOK, and OLDCLAIM are converted from characters into numeric values to explore the data.
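
The conversion code is not echoed in the report; a minimal sketch, assuming the data frame is named insurance, is:

    # Strip "$" and "," from the currency-formatted columns and convert to numeric
    currency_cols <- c("INCOME", "HOME_VAL", "BLUEBOOK", "OLDCLAIM")
    insurance[currency_cols] <- lapply(insurance[currency_cols],
                                       function(x) as.numeric(gsub("[$,]", "", as.character(x))))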

PARENT1 is converted into a dummy variable in which being a single parent is designated as 1 and not being a single parent is designated as 0.

MSTATUS is converted into a dummy variable in which being single is designated as 0 and being married is designated as 1.

SEX is converted into a dummy variable in which male is designated as 0 and female is designated as 1.

CAR_USE is converted into a dummy variable in which private is designated as 0 and commercial is designated as 1.

RED_CAR is converted into a dummy variable in which no is designated as 0 and yes is designated as 1.

URBANICITY is converted into a dummy variable in which rural is designated as 0 and urban is designated as 1.

EDUCATION is converted into an ordinal numeric variable in which higher levels of education are assigned higher values.
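
A sketch of the recoding described above, again assuming the data frame is named insurance (the specific 1-5 scale for EDUCATION is an assumption; only the ordering is stated):

    # Recode character variables as 0/1 dummies and EDUCATION as an ordered numeric scale
    insurance$PARENT1    <- ifelse(insurance$PARENT1 == "Yes", 1, 0)                    # single parent = 1
    insurance$MSTATUS    <- ifelse(insurance$MSTATUS == "Yes", 1, 0)                    # married = 1
    insurance$SEX        <- ifelse(insurance$SEX == "z_F", 1, 0)                        # female = 1
    insurance$CAR_USE    <- ifelse(insurance$CAR_USE == "Commercial", 1, 0)             # commercial = 1
    insurance$RED_CAR    <- ifelse(insurance$RED_CAR == "yes", 1, 0)                    # red car = 1
    insurance$URBANICITY <- ifelse(insurance$URBANICITY == "Highly Urban/ Urban", 1, 0) # urban = 1
    educ_scale <- c("<High School" = 1, "z_High School" = 2, "Bachelors" = 3,
                    "Masters" = 4, "PhD" = 5)                                           # assumed scale
    insurance$EDUCATION  <- unname(educ_scale[insurance$EDUCATION])

JOB and CAR_TYPE are likewise expanded into 0/1 indicator columns (Clerical, Doctor, Panel Truck, Pickup, z_SUV, and so on), which appear in the model summaries below.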

Income

The data for income is skewed to the right, with the outliers being customers who earn more than about $86,000. The mean is therefore higher than the median. There are 464 missing values for income. I will impute the median income for the missing values.

Year On The Job

A high number of people have 0 years on the job. These people are unemployed, have been working for 12 months or less, or are students. People who have been on the job for more than about 19 years or fewer than about 3 years are outliers. There are 454 missing values. I will impute the median value for the missing values.

Age

The distribution for age looks normal. There are 6 missing values. I will impute the median for the missing values.

Home Value

There is a high number of customers with home values of 0. This may refer to renters or customers without a permanent home, or it may be an error in the data collection. Even if it refers to renters, a value of zero obscures any differentiation in home value. I will convert the 0 home values to NAs. There are now 2,758 missing home values, about one third of the customers. I will impute the median value for the missing values.

##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max.    NA's 
##   50223  153074  206692  220621  270023  885282    2758
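
The imputation steps described above are not echoed; a minimal sketch, assuming the data frame is named insurance:

    # Treat home values of 0 as missing, then fill missing values with the column medians
    insurance$HOME_VAL[which(insurance$HOME_VAL == 0)] <- NA
    for (col in c("INCOME", "YOJ", "AGE", "HOME_VAL")) {
      insurance[[col]][is.na(insurance[[col]])] <- median(insurance[[col]], na.rm = TRUE)
    }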

Correlation of Variables

The correlations between each pair of variables are shown in the correlation plot. The closer a correlation is to 1 or -1, the more highly correlated the two variables are.
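
A minimal sketch of the correlation plot call (the prepared data frame name, insurance, is an assumption):

    # Correlation matrix of the numeric columns, plotted with corrplot
    library(corrplot)
    corr_matrix <- cor(insurance[, sapply(insurance, is.numeric)], use = "pairwise.complete.obs")
    corrplot(corr_matrix, method = "color", type = "upper", tl.cex = 0.7)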


The variables KIDSDRIV and HOMEKIDS are positively correlated, indicating that customers with more children have more children driving.

The variables AGE and HOMEKIDS are negatively correlated, indicating that customers who are older have fewer children at home.

The variables AGE and PARENT1 are negatively correlated, indicating that customers who are single parents tend to be younger.

The variables HOMEKIDS and PARENT1 are positively correlated, indicating that customers who are single parents have more kids at home, which is true almost by definition.

The variables PARENT1 and MSTATUS are negatively correlated, indicating that single parents are more likely not married.

The variables YOJ and Home Maker have a positive correlation, indicating that home makers are on the job more years.

The variables INCOME and HOME_VAL are positively correlated, which makes sense. Customers with higher incomes have homes with higher values.

The variables INCOME and EDUCATION are positively correlated. Customers with higher education have higher incomes.

The variables HOME_VAL and EDUCATION are positively correlated. Customers with higher education have homes of higher values.

The variables EDUCATION and CAR_AGE are positively correlated, indicating that customers with higher education have older cars.

The variables SEX and RED_CAR are negatively correlated, indicating that men are much more likely to have red cars.

The variables SEX and z_SUV are positively correlated, indicating that women are much more likely to have SUVs.

The variables OLDCLAIM and CLM_FREQ are positively correlated, indicating that the total payout for claims in the past 5 years and the number of claims in the past 5 years are related.

Combining Correlated Variables

I will remove PARENT1 as a variable due to its correlation with HOMEKIDS and MSTATUS.

I will combine HOMEKIDS and AGE by multiplying their values together. This will give much higher values to people with more children at home.

I will combine INCOME, HOME_VAL and EDUCATION by adding INCOME and HOME_VAL and multiplying that value by EDUCATION.

I will combine SEX, RED_CAR and z_SUV by adding their values since they are binary values.

I will combine OLDCLAIM and CLM_FREQ by adding them.
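
A sketch of these combinations, assuming the data frame is named insurance and z_SUV is the SUV indicator column:

    # Build the combined features and drop PARENT1
    insurance$homekids_age        <- insurance$HOMEKIDS * insurance$AGE
    insurance$income_homeval_educ <- (insurance$INCOME + insurance$HOME_VAL) * insurance$EDUCATION
    insurance$sex_redcar_suv      <- insurance$SEX + insurance$RED_CAR + insurance$z_SUV
    insurance$oldclaim_freq       <- insurance$OLDCLAIM + insurance$CLM_FREQ
    insurance$PARENT1             <- NULL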

##   TARGET_FLAG       TARGET_AMT        KIDSDRIV           YOJ       
##  Min.   :0.0000   Min.   :     0   Min.   :0.0000   Min.   : 0.00  
##  1st Qu.:0.0000   1st Qu.:     0   1st Qu.:0.0000   1st Qu.: 9.00  
##  Median :0.0000   Median :     0   Median :0.0000   Median :11.00  
##  Mean   :0.2638   Mean   :  1504   Mean   :0.1711   Mean   :10.53  
##  3rd Qu.:1.0000   3rd Qu.:  1036   3rd Qu.:0.0000   3rd Qu.:13.00  
##  Max.   :1.0000   Max.   :107586   Max.   :4.0000   Max.   :23.00  
##     MSTATUS          TRAVTIME         CAR_USE          BLUEBOOK    
##  Min.   :0.0000   Min.   :  5.00   Min.   :0.0000   Min.   : 1500  
##  1st Qu.:0.0000   1st Qu.: 22.00   1st Qu.:0.0000   1st Qu.: 9280  
##  Median :1.0000   Median : 33.00   Median :0.0000   Median :14440  
##  Mean   :0.5997   Mean   : 33.49   Mean   :0.3712   Mean   :15710  
##  3rd Qu.:1.0000   3rd Qu.: 44.00   3rd Qu.:1.0000   3rd Qu.:20850  
##  Max.   :1.0000   Max.   :142.00   Max.   :1.0000   Max.   :69740  
##       TIF            REVOKED          MVR_PTS          CAR_AGE      
##  Min.   : 1.000   Min.   :0.0000   Min.   : 0.000   Min.   : 0.000  
##  1st Qu.: 1.000   1st Qu.:0.0000   1st Qu.: 0.000   1st Qu.: 4.000  
##  Median : 4.000   Median :0.0000   Median : 1.000   Median : 8.000  
##  Mean   : 5.351   Mean   :0.1225   Mean   : 1.696   Mean   : 8.309  
##  3rd Qu.: 7.000   3rd Qu.:0.0000   3rd Qu.: 3.000   3rd Qu.:12.000  
##  Max.   :25.000   Max.   :1.0000   Max.   :13.000   Max.   :28.000  
##    URBANICITY        Clerical          Doctor          Home Maker     
##  Min.   :0.0000   Min.   :0.0000   Min.   :0.00000   Min.   :0.00000  
##  1st Qu.:1.0000   1st Qu.:0.0000   1st Qu.:0.00000   1st Qu.:0.00000  
##  Median :1.0000   Median :0.0000   Median :0.00000   Median :0.00000  
##  Mean   :0.7955   Mean   :0.1557   Mean   :0.03014   Mean   :0.07854  
##  3rd Qu.:1.0000   3rd Qu.:0.0000   3rd Qu.:0.00000   3rd Qu.:0.00000  
##  Max.   :1.0000   Max.   :1.0000   Max.   :1.00000   Max.   :1.00000  
##      Lawyer          Manager        Professional    z_Blue Collar   
##  Min.   :0.0000   Min.   :0.0000   Min.   :0.0000   Min.   :0.0000  
##  1st Qu.:0.0000   1st Qu.:0.0000   1st Qu.:0.0000   1st Qu.:0.0000  
##  Median :0.0000   Median :0.0000   Median :0.0000   Median :0.0000  
##  Mean   :0.1023   Mean   :0.1211   Mean   :0.1369   Mean   :0.2236  
##  3rd Qu.:0.0000   3rd Qu.:0.0000   3rd Qu.:0.0000   3rd Qu.:0.0000  
##  Max.   :1.0000   Max.   :1.0000   Max.   :1.0000   Max.   :1.0000  
##   Panel Truck          Pickup         Sports Car          Van        
##  Min.   :0.00000   Min.   :0.0000   Min.   :0.0000   Min.   :0.0000  
##  1st Qu.:0.00000   1st Qu.:0.0000   1st Qu.:0.0000   1st Qu.:0.0000  
##  Median :0.00000   Median :0.0000   Median :0.0000   Median :0.0000  
##  Mean   :0.08283   Mean   :0.1702   Mean   :0.1111   Mean   :0.0919  
##  3rd Qu.:0.00000   3rd Qu.:0.0000   3rd Qu.:0.0000   3rd Qu.:0.0000  
##  Max.   :1.00000   Max.   :1.0000   Max.   :1.0000   Max.   :1.0000  
##   homekids_age    income_homeval_educ sex_redcar_suv  oldclaim_freq  
##  Min.   :  0.00   Min.   :      0     Min.   :0.000   Min.   :    0  
##  1st Qu.:  0.00   1st Qu.: 206692     1st Qu.:1.000   1st Qu.:    0  
##  Median :  0.00   Median : 451556     Median :1.000   Median :    0  
##  Mean   : 28.02   Mean   : 574724     Mean   :1.109   Mean   : 4038  
##  3rd Qu.: 50.00   3rd Qu.: 818638     3rd Qu.:2.000   3rd Qu.: 4639  
##  Max.   :280.00   Max.   :5009248     Max.   :3.000   Max.   :57038

Build Models

Creating a Test Set and Training Set
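
The split itself is not echoed; the model output below shows 4,897 training observations, which is roughly 60% of the 8,161 customers. A minimal sketch, with the object names, seed, and exact proportion as assumptions:

    set.seed(123)
    train_rows <- sample(seq_len(nrow(insurance)), size = round(0.6 * nrow(insurance)))
    train1 <- insurance[train_rows, ]   # training set used to fit the models
    test1  <- insurance[-train_rows, ]  # hold-out set used for prediction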

Backward Elimination - Logistic Regression Model 1

## Warning: glm.fit: algorithm did not converge
## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred
## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + MSTATUS + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + REVOKED + MVR_PTS + 
##     CAR_AGE + URBANICITY + Clerical + Doctor + `Home Maker` + 
##     Lawyer + Manager + Professional + `z_Blue Collar` + `Panel Truck` + 
##     Pickup + `Sports Car` + Van + homekids_age + income_homeval_educ + 
##     sex_redcar_suv + oldclaim_freq, family = binomial(link = "logit"), 
##     data = train1)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2554  -0.7371  -0.4125   0.6777   3.0696  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -3.148e+00  2.541e-01 -12.392  < 2e-16 ***
## KIDSDRIV             3.204e-01  8.082e-02   3.965 7.35e-05 ***
## YOJ                 -2.797e-02  1.065e-02  -2.626  0.00865 ** 
## MSTATUS             -7.363e-01  7.672e-02  -9.597  < 2e-16 ***
## TRAVTIME             1.605e-02  2.399e-03   6.692 2.21e-11 ***
## CAR_USE              8.121e-01  1.130e-01   7.185 6.70e-13 ***
## BLUEBOOK            -3.528e-05  5.990e-06  -5.891 3.85e-09 ***
## TIF                 -5.435e-02  9.342e-03  -5.818 5.95e-09 ***
## REVOKED              7.196e-01  1.139e-01   6.317 2.66e-10 ***
## MVR_PTS              1.329e-01  1.724e-02   7.708 1.28e-14 ***
## CAR_AGE             -5.745e-03  8.584e-03  -0.669  0.50332    
## URBANICITY           2.425e+00  1.420e-01  17.082  < 2e-16 ***
## Clerical             1.514e-01  1.462e-01   1.036  0.30028    
## Doctor              -5.988e-01  3.158e-01  -1.896  0.05792 .  
## `Home Maker`        -5.219e-02  1.727e-01  -0.302  0.76256    
## Lawyer              -8.559e-02  1.754e-01  -0.488  0.62552    
## Manager             -1.024e+00  1.664e-01  -6.154 7.58e-10 ***
## Professional        -3.396e-01  1.452e-01  -2.340  0.01929 *  
## `z_Blue Collar`     -1.028e-01  1.331e-01  -0.772  0.43997    
## `Panel Truck`        6.121e-01  1.914e-01   3.198  0.00139 ** 
## Pickup               2.473e-01  1.186e-01   2.086  0.03701 *  
## `Sports Car`         6.824e-01  1.274e-01   5.358 8.39e-08 ***
## Van                  4.815e-01  1.521e-01   3.165  0.00155 ** 
## homekids_age         3.352e-03  9.966e-04   3.363  0.00077 ***
## income_homeval_educ -4.656e-07  1.162e-07  -4.007 6.14e-05 ***
## sex_redcar_suv       3.658e-01  6.747e-02   5.421 5.93e-08 ***
## oldclaim_freq        3.069e-06  4.344e-06   0.706  0.47991    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4458.7  on 4870  degrees of freedom
## AIC: 4512.7
## 
## Number of Fisher Scoring iterations: 5

Home Maker has the highest p-value (the weakest effect on the target) and will be removed next.
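
Each elimination step refits the model without the least significant term; a sketch of one step, assuming the previous fit is stored in logit1:

    # Drop the term with the highest p-value, refit, and compare AIC
    logit1_v2 <- update(logit1, . ~ . - `Home Maker`)
    summary(logit1_v2)
    AIC(logit1, logit1_v2)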

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + MSTATUS + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + REVOKED + MVR_PTS + 
##     CAR_AGE + URBANICITY + Clerical + Doctor + Lawyer + Manager + 
##     Professional + `z_Blue Collar` + `Panel Truck` + Pickup + 
##     `Sports Car` + Van + homekids_age + income_homeval_educ + 
##     sex_redcar_suv + oldclaim_freq, family = binomial(link = "logit"), 
##     data = train1)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2454  -0.7362  -0.4122   0.6737   3.0709  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -3.172e+00  2.416e-01 -13.132  < 2e-16 ***
## KIDSDRIV             3.199e-01  8.078e-02   3.961 7.48e-05 ***
## YOJ                 -2.743e-02  1.050e-02  -2.613 0.008976 ** 
## MSTATUS             -7.370e-01  7.668e-02  -9.612  < 2e-16 ***
## TRAVTIME             1.604e-02  2.399e-03   6.688 2.26e-11 ***
## CAR_USE              8.195e-01  1.104e-01   7.424 1.14e-13 ***
## BLUEBOOK            -3.536e-05  5.984e-06  -5.909 3.45e-09 ***
## TIF                 -5.430e-02  9.339e-03  -5.814 6.10e-09 ***
## REVOKED              7.206e-01  1.139e-01   6.329 2.47e-10 ***
## MVR_PTS              1.330e-01  1.724e-02   7.718 1.18e-14 ***
## CAR_AGE             -5.883e-03  8.570e-03  -0.686 0.492435    
## URBANICITY           2.426e+00  1.419e-01  17.103  < 2e-16 ***
## Clerical             1.693e-01  1.339e-01   1.265 0.206018    
## Doctor              -5.837e-01  3.118e-01  -1.872 0.061218 .  
## Lawyer              -6.836e-02  1.659e-01  -0.412 0.680306    
## Manager             -1.009e+00  1.589e-01  -6.350 2.15e-10 ***
## Professional        -3.241e-01  1.358e-01  -2.386 0.017032 *  
## `z_Blue Collar`     -8.968e-02  1.259e-01  -0.712 0.476181    
## `Panel Truck`        6.132e-01  1.914e-01   3.204 0.001356 ** 
## Pickup               2.473e-01  1.186e-01   2.085 0.037045 *  
## `Sports Car`         6.794e-01  1.270e-01   5.351 8.75e-08 ***
## Van                  4.814e-01  1.521e-01   3.164 0.001555 ** 
## homekids_age         3.366e-03  9.952e-04   3.383 0.000718 ***
## income_homeval_educ -4.611e-07  1.152e-07  -4.004 6.23e-05 ***
## sex_redcar_suv       3.641e-01  6.727e-02   5.413 6.20e-08 ***
## oldclaim_freq        3.061e-06  4.345e-06   0.704 0.481217    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4458.8  on 4871  degrees of freedom
## AIC: 4510.8
## 
## Number of Fisher Scoring iterations: 5

Lawyer has the highest p-value (the weakest effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + MSTATUS + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + REVOKED + MVR_PTS + 
##     CAR_AGE + URBANICITY + Clerical + Doctor + Manager + Professional + 
##     `z_Blue Collar` + `Panel Truck` + Pickup + `Sports Car` + 
##     Van + homekids_age + income_homeval_educ + sex_redcar_suv + 
##     oldclaim_freq, family = binomial(link = "logit"), data = train1)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2452  -0.7364  -0.4122   0.6742   3.0734  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -3.179e+00  2.410e-01 -13.192  < 2e-16 ***
## KIDSDRIV             3.199e-01  8.079e-02   3.959 7.52e-05 ***
## YOJ                 -2.827e-02  1.030e-02  -2.744 0.006064 ** 
## MSTATUS             -7.359e-01  7.662e-02  -9.604  < 2e-16 ***
## TRAVTIME             1.605e-02  2.398e-03   6.693 2.19e-11 ***
## CAR_USE              8.320e-01  1.061e-01   7.839 4.55e-15 ***
## BLUEBOOK            -3.547e-05  5.980e-06  -5.931 3.01e-09 ***
## TIF                 -5.424e-02  9.338e-03  -5.809 6.29e-09 ***
## REVOKED              7.188e-01  1.138e-01   6.318 2.65e-10 ***
## MVR_PTS              1.329e-01  1.723e-02   7.712 1.24e-14 ***
## CAR_AGE             -6.373e-03  8.489e-03  -0.751 0.452774    
## URBANICITY           2.425e+00  1.419e-01  17.096  < 2e-16 ***
## Clerical             1.849e-01  1.285e-01   1.438 0.150317    
## Doctor              -5.455e-01  2.979e-01  -1.831 0.067047 .  
## Manager             -9.863e-01  1.492e-01  -6.612 3.78e-11 ***
## Professional        -3.037e-01  1.265e-01  -2.399 0.016419 *  
## `z_Blue Collar`     -7.926e-02  1.234e-01  -0.642 0.520709    
## `Panel Truck`        6.227e-01  1.900e-01   3.278 0.001047 ** 
## Pickup               2.484e-01  1.186e-01   2.095 0.036158 *  
## `Sports Car`         6.813e-01  1.269e-01   5.369 7.90e-08 ***
## Van                  4.837e-01  1.520e-01   3.182 0.001461 ** 
## homekids_age         3.402e-03  9.915e-04   3.431 0.000601 ***
## income_homeval_educ -4.715e-07  1.126e-07  -4.186 2.83e-05 ***
## sex_redcar_suv       3.656e-01  6.717e-02   5.443 5.24e-08 ***
## oldclaim_freq        3.123e-06  4.343e-06   0.719 0.472152    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4459.0  on 4872  degrees of freedom
## AIC: 4509
## 
## Number of Fisher Scoring iterations: 5

z_Blue Collar has the highest p-value (the weakest effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + MSTATUS + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + REVOKED + MVR_PTS + 
##     CAR_AGE + URBANICITY + Clerical + Doctor + Manager + Professional + 
##     `Panel Truck` + Pickup + `Sports Car` + Van + homekids_age + 
##     income_homeval_educ + sex_redcar_suv + oldclaim_freq, family = binomial(link = "logit"), 
##     data = train1)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2389  -0.7363  -0.4122   0.6803   3.0733  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -3.196e+00  2.397e-01 -13.333  < 2e-16 ***
## KIDSDRIV             3.166e-01  8.062e-02   3.928 8.58e-05 ***
## YOJ                 -3.057e-02  9.658e-03  -3.166 0.001547 ** 
## MSTATUS             -7.337e-01  7.655e-02  -9.585  < 2e-16 ***
## TRAVTIME             1.602e-02  2.399e-03   6.681 2.38e-11 ***
## CAR_USE              8.027e-01  9.571e-02   8.387  < 2e-16 ***
## BLUEBOOK            -3.574e-05  5.966e-06  -5.991 2.09e-09 ***
## TIF                 -5.395e-02  9.323e-03  -5.786 7.19e-09 ***
## REVOKED              7.194e-01  1.138e-01   6.324 2.55e-10 ***
## MVR_PTS              1.330e-01  1.724e-02   7.717 1.19e-14 ***
## CAR_AGE             -5.548e-03  8.390e-03  -0.661 0.508424    
## URBANICITY           2.421e+00  1.416e-01  17.093  < 2e-16 ***
## Clerical             2.212e-01  1.155e-01   1.916 0.055424 .  
## Doctor              -5.453e-01  2.979e-01  -1.831 0.067151 .  
## Manager             -9.626e-01  1.445e-01  -6.661 2.73e-11 ***
## Professional        -2.755e-01  1.187e-01  -2.320 0.020316 *  
## `Panel Truck`        6.515e-01  1.846e-01   3.529 0.000418 ***
## Pickup               2.613e-01  1.168e-01   2.237 0.025260 *  
## `Sports Car`         6.841e-01  1.268e-01   5.395 6.85e-08 ***
## Van                  4.961e-01  1.507e-01   3.292 0.000996 ***
## homekids_age         3.453e-03  9.879e-04   3.496 0.000473 ***
## income_homeval_educ -4.511e-07  1.078e-07  -4.184 2.87e-05 ***
## sex_redcar_suv       3.676e-01  6.711e-02   5.478 4.30e-08 ***
## oldclaim_freq        3.136e-06  4.343e-06   0.722 0.470223    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4459.4  on 4873  degrees of freedom
## AIC: 4507.4
## 
## Number of Fisher Scoring iterations: 5

CAR_AGE has the highest p-value (the weakest effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + MSTATUS + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + REVOKED + MVR_PTS + 
##     URBANICITY + Clerical + Doctor + Manager + Professional + 
##     `Panel Truck` + Pickup + `Sports Car` + Van + homekids_age + 
##     income_homeval_educ + sex_redcar_suv + oldclaim_freq, family = binomial(link = "logit"), 
##     data = train1)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2289  -0.7365  -0.4116   0.6801   3.0704  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -3.226e+00  2.356e-01 -13.692  < 2e-16 ***
## KIDSDRIV             3.167e-01  8.061e-02   3.929 8.52e-05 ***
## YOJ                 -3.051e-02  9.655e-03  -3.160 0.001577 ** 
## MSTATUS             -7.320e-01  7.649e-02  -9.570  < 2e-16 ***
## TRAVTIME             1.602e-02  2.398e-03   6.681 2.37e-11 ***
## CAR_USE              8.094e-01  9.519e-02   8.503  < 2e-16 ***
## BLUEBOOK            -3.576e-05  5.969e-06  -5.991 2.09e-09 ***
## TIF                 -5.402e-02  9.322e-03  -5.795 6.82e-09 ***
## REVOKED              7.199e-01  1.138e-01   6.328 2.49e-10 ***
## MVR_PTS              1.332e-01  1.723e-02   7.730 1.07e-14 ***
## URBANICITY           2.419e+00  1.416e-01  17.085  < 2e-16 ***
## Clerical             2.297e-01  1.148e-01   2.001 0.045362 *  
## Doctor              -5.368e-01  2.977e-01  -1.803 0.071393 .  
## Manager             -9.636e-01  1.446e-01  -6.666 2.63e-11 ***
## Professional        -2.727e-01  1.186e-01  -2.299 0.021519 *  
## `Panel Truck`        6.491e-01  1.846e-01   3.516 0.000439 ***
## Pickup               2.604e-01  1.168e-01   2.230 0.025776 *  
## `Sports Car`         6.851e-01  1.268e-01   5.404 6.51e-08 ***
## Van                  4.964e-01  1.507e-01   3.294 0.000989 ***
## homekids_age         3.471e-03  9.873e-04   3.516 0.000438 ***
## income_homeval_educ -4.864e-07  9.442e-08  -5.151 2.59e-07 ***
## sex_redcar_suv       3.668e-01  6.709e-02   5.468 4.56e-08 ***
## oldclaim_freq        3.073e-06  4.343e-06   0.708 0.479202    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4459.8  on 4874  degrees of freedom
## AIC: 4505.8
## 
## Number of Fisher Scoring iterations: 5

The combined oldclaim_freq variable has the highest p-value (the weakest effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + MSTATUS + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + REVOKED + MVR_PTS + 
##     URBANICITY + Clerical + Doctor + Manager + Professional + 
##     `Panel Truck` + Pickup + `Sports Car` + Van + homekids_age + 
##     income_homeval_educ + sex_redcar_suv, family = binomial(link = "logit"), 
##     data = train1)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2332  -0.7375  -0.4125   0.6741   3.0716  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -3.227e+00  2.356e-01 -13.694  < 2e-16 ***
## KIDSDRIV             3.164e-01  8.065e-02   3.923 8.73e-05 ***
## YOJ                 -3.029e-02  9.652e-03  -3.138 0.001700 ** 
## MSTATUS             -7.333e-01  7.647e-02  -9.589  < 2e-16 ***
## TRAVTIME             1.600e-02  2.398e-03   6.672 2.52e-11 ***
## CAR_USE              8.104e-01  9.517e-02   8.515  < 2e-16 ***
## BLUEBOOK            -3.577e-05  5.969e-06  -5.992 2.07e-09 ***
## TIF                 -5.391e-02  9.320e-03  -5.784 7.31e-09 ***
## REVOKED              7.540e-01  1.030e-01   7.324 2.41e-13 ***
## MVR_PTS              1.363e-01  1.667e-02   8.176 2.93e-16 ***
## URBANICITY           2.425e+00  1.413e-01  17.159  < 2e-16 ***
## Clerical             2.292e-01  1.148e-01   1.997 0.045844 *  
## Doctor              -5.380e-01  2.979e-01  -1.806 0.070915 .  
## Manager             -9.632e-01  1.445e-01  -6.664 2.66e-11 ***
## Professional        -2.744e-01  1.186e-01  -2.314 0.020660 *  
## `Panel Truck`        6.500e-01  1.846e-01   3.520 0.000431 ***
## Pickup               2.596e-01  1.168e-01   2.223 0.026187 *  
## `Sports Car`         6.880e-01  1.267e-01   5.432 5.57e-08 ***
## Van                  4.966e-01  1.507e-01   3.295 0.000985 ***
## homekids_age         3.458e-03  9.871e-04   3.503 0.000460 ***
## income_homeval_educ -4.885e-07  9.437e-08  -5.176 2.26e-07 ***
## sex_redcar_suv       3.661e-01  6.707e-02   5.458 4.82e-08 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4460.3  on 4875  degrees of freedom
## AIC: 4504.3
## 
## Number of Fisher Scoring iterations: 5

Doctor has the highest p-value (the weakest effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + MSTATUS + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + REVOKED + MVR_PTS + 
##     URBANICITY + Clerical + Manager + Professional + `Panel Truck` + 
##     Pickup + `Sports Car` + Van + homekids_age + income_homeval_educ + 
##     sex_redcar_suv, family = binomial(link = "logit"), data = train1)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2358  -0.7370  -0.4115   0.6749   3.0747  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -3.232e+00  2.357e-01 -13.714  < 2e-16 ***
## KIDSDRIV             3.179e-01  8.068e-02   3.940 8.16e-05 ***
## YOJ                 -3.040e-02  9.647e-03  -3.151 0.001628 ** 
## MSTATUS             -7.289e-01  7.641e-02  -9.540  < 2e-16 ***
## TRAVTIME             1.601e-02  2.398e-03   6.676 2.45e-11 ***
## CAR_USE              8.337e-01  9.448e-02   8.824  < 2e-16 ***
## BLUEBOOK            -3.578e-05  5.963e-06  -6.000 1.97e-09 ***
## TIF                 -5.358e-02  9.321e-03  -5.749 9.00e-09 ***
## REVOKED              7.562e-01  1.029e-01   7.350 1.99e-13 ***
## MVR_PTS              1.363e-01  1.667e-02   8.174 2.97e-16 ***
## URBANICITY           2.425e+00  1.414e-01  17.150  < 2e-16 ***
## Clerical             2.347e-01  1.148e-01   2.044 0.040988 *  
## Manager             -9.283e-01  1.435e-01  -6.468 9.91e-11 ***
## Professional        -2.485e-01  1.179e-01  -2.107 0.035086 *  
## `Panel Truck`        6.677e-01  1.845e-01   3.620 0.000295 ***
## Pickup               2.567e-01  1.167e-01   2.199 0.027897 *  
## `Sports Car`         6.874e-01  1.265e-01   5.435 5.49e-08 ***
## Van                  4.967e-01  1.505e-01   3.300 0.000968 ***
## homekids_age         3.498e-03  9.869e-04   3.545 0.000393 ***
## income_homeval_educ -5.452e-07  8.981e-08  -6.070 1.28e-09 ***
## sex_redcar_suv       3.659e-01  6.704e-02   5.459 4.80e-08 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4463.9  on 4876  degrees of freedom
## AIC: 4505.9
## 
## Number of Fisher Scoring iterations: 5
##         (Intercept)            KIDSDRIV                 YOJ 
##       -4.808017e-01        4.728775e-02       -4.522150e-03 
##             MSTATUS            TRAVTIME             CAR_USE 
##       -1.084377e-01        2.381669e-03        1.240251e-01 
##            BLUEBOOK                 TIF             REVOKED 
##       -5.323281e-06       -7.971547e-03        1.125049e-01 
##             MVR_PTS          URBANICITY            Clerical 
##        2.027693e-02        3.608180e-01        3.491427e-02 
##             Manager        Professional       `Panel Truck` 
##       -1.380964e-01       -3.697466e-02        9.932959e-02 
##              Pickup        `Sports Car`                 Van 
##        3.818790e-02        1.022702e-01        7.388903e-02 
##        homekids_age income_homeval_educ      sex_redcar_suv 
##        5.203981e-04       -8.110785e-08        5.444163e-02

The variables that have an effect on whether a customer gets into a crash are: KIDSDRIV, YOJ, MSTATUS, TRAVTIME, CAR_USE, BLUEBOOK, TIF, REVOKED, MVR_PTS, URBANICITY, Clerical, Manager, Professional, Panel Truck, Pickup, Sports Car, Van, homekids_age, income_homeval_educ, and sex_redcar_suv.

The marginal effects reflect the change in the probability that the target equals 1 for a one-unit change in the independent variable, evaluated at the mean of each of the independent variables (the percentages below are percentage-point changes in that probability).

A 1 unit increase in KIDSDRIV, the number of driving children, results in a 4.7% increase in the probability that the customer has a claim. A 1 unit increase in YOJ, years on the job, results in a 0.5% decrease in the probability that the customer has a claim. Being married results in an 11% decrease in the likelihood that a customer has a claim. A 1 unit increase in travel time results in a 0.2% increase in the likelihood that the customer has a claim. A customer who drives a commercial vehicle is 12% more likely to have a claim. A 1 unit increase in the bluebook value of the car results in a 0.0005% decrease in the likelihood that the customer will have a claim. A 1 unit increase in TIF, time in force, results in a 0.8% decrease in the likelihood that the customer has a claim. Having a license revoked results in an 11% increase in the likelihood of the customer having a claim. A 1 unit increase in MVR_PTS, motor vehicle record points, results in a 2% increase in the likelihood that the customer has a claim. Living in an urban area results in a 36% increase in the likelihood that the customer will have a claim. Driving a sports car results in a 10% increase in the likelihood that the customer will have a claim. Driving a van results in a 7% increase in the likelihood that the customer will have a claim. A 1 unit increase in homekids_age results in a 0.05% increase in the likelihood that a customer has a claim. A 1 unit increase in income_homeval_educ results in a negligible decrease (about 0.000008 percentage points) in the likelihood that the customer has a claim. A 1 unit increase in sex_redcar_suv results in a 5.4% increase in the likelihood that the customer has a claim.
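
The marginal effects above are evaluated at the means of the regressors; a sketch of that calculation for a fitted logit model (the object name logit1_final is an assumption):

    # Marginal effect of each regressor at the means: beta_j * p_bar * (1 - p_bar)
    X_bar <- colMeans(model.matrix(logit1_final))     # means of the design matrix columns
    p_bar <- plogis(sum(coef(logit1_final) * X_bar))  # fitted probability at the means
    marginal_effects <- coef(logit1_final) * p_bar * (1 - p_bar)
    round(marginal_effects, 6)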

Prediction from Model 1

## [1] 0.32
## [1] 0.7909815

The cutoff associated with the point on the ROC curve farthest from the diagonal (no-information) line is 0.32, so I will use 0.32 as the cutoff for making predictions. A predicted probability above 0.32 will be assigned a target of 1, and a value below 0.32 will be assigned a target of 0. The area under the curve is 0.79.
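
A sketch of how the cutoff and AUC can be obtained with the pROC package (the object names logit1_final, test1, and pred_prob are assumptions):

    library(pROC)
    pred_prob <- predict(logit1_final, newdata = test1, type = "response")  # predicted probabilities
    roc_obj   <- roc(test1$TARGET_FLAG, pred_prob)
    coords(roc_obj, "best", best.method = "youden")  # cutoff farthest from the diagonal
    auc(roc_obj)                                     # area under the ROC curve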

Confusion Matrix
##     true
## pred    0    1
##    0 1846  257
##    1  560  601

The model predicted 1,846 0’s that were actually 0. The model predicted 257 0’s that were actually 1.
The model predicted 560 1’s that were actually 0. The model predicted 601 1’s that were actually 1.

##    accuracy     error precision sensitivity specificity        f1
## 1 0.7496936 0.2503064 0.4094034   0.7672485   0.7004662 0.5339118
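
A sketch of the accuracy and error calculation from the confusion matrix above (the predicted classes pred_class are an assumption; the class-specific metrics depend on which level is treated as the positive class):

    pred_class <- ifelse(pred_prob > 0.32, 1, 0)               # apply the 0.32 cutoff
    cm <- table(pred = pred_class, true = test1$TARGET_FLAG)   # confusion matrix
    accuracy <- sum(diag(cm)) / sum(cm)
    error    <- 1 - accuracy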

Logit Model 2

For the second model, I will not combine the correlated variables.

Creating a Test Set and Training Set

Backward Elimination - Logistic Regression Model 2

## Warning: glm.fit: algorithm did not converge
## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred
## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + AGE + HOMEKIDS + 
##     YOJ + INCOME + PARENT1 + HOME_VAL + MSTATUS + SEX + EDUCATION + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + RED_CAR + OLDCLAIM + 
##     CLM_FREQ + REVOKED + MVR_PTS + CAR_AGE + URBANICITY + Clerical + 
##     Doctor + `Home Maker` + Lawyer + Manager + Professional + 
##     `z_Blue Collar` + `Panel Truck` + Pickup + `Sports Car` + 
##     Van + z_SUV, family = binomial(link = "logit"), data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2671  -0.7212  -0.4038   0.6187   3.1041  
## 
## Coefficients:
##                   Estimate Std. Error z value Pr(>|z|)    
## (Intercept)     -3.345e+00  3.606e-01  -9.277  < 2e-16 ***
## KIDSDRIV         3.274e-01  7.979e-02   4.103 4.07e-05 ***
## AGE             -1.620e-03  5.189e-03  -0.312 0.754882    
## HOMEKIDS         6.425e-02  4.721e-02   1.361 0.173521    
## YOJ             -1.878e-02  1.095e-02  -1.716 0.086156 .  
## INCOME          -6.258e-06  1.509e-06  -4.148 3.36e-05 ***
## PARENT1          3.933e-01  1.427e-01   2.755 0.005861 ** 
## HOME_VAL         9.030e-07  7.835e-07   1.153 0.249114    
## MSTATUS         -6.179e-01  9.444e-02  -6.543 6.04e-11 ***
## SEX              3.713e-02  1.427e-01   0.260 0.794692    
## EDUCATION       -1.292e-01  5.927e-02  -2.180 0.029280 *  
## TRAVTIME         1.600e-02  2.428e-03   6.590 4.41e-11 ***
## CAR_USE          8.247e-01  1.158e-01   7.120 1.08e-12 ***
## BLUEBOOK        -2.512e-05  6.909e-06  -3.636 0.000277 ***
## TIF             -5.506e-02  9.426e-03  -5.841 5.19e-09 ***
## RED_CAR          9.476e-02  1.106e-01   0.857 0.391538    
## OLDCLAIM        -7.861e-06  4.929e-06  -1.595 0.110766    
## CLM_FREQ         1.884e-01  3.679e-02   5.123 3.01e-07 ***
## REVOKED          8.459e-01  1.174e-01   7.202 5.92e-13 ***
## MVR_PTS          1.020e-01  1.808e-02   5.644 1.66e-08 ***
## CAR_AGE          8.909e-04  9.377e-03   0.095 0.924314    
## URBANICITY       2.389e+00  1.448e-01  16.494  < 2e-16 ***
## Clerical         1.701e-01  1.497e-01   1.137 0.255659    
## Doctor          -4.867e-01  3.216e-01  -1.513 0.130254    
## `Home Maker`    -3.165e-02  1.821e-01  -0.174 0.862021    
## Lawyer           5.958e-02  1.825e-01   0.327 0.744017    
## Manager         -9.211e-01  1.707e-01  -5.398 6.75e-08 ***
## Professional    -2.233e-01  1.484e-01  -1.505 0.132399    
## `z_Blue Collar` -2.132e-02  1.395e-01  -0.153 0.878550    
## `Panel Truck`    6.250e-01  2.079e-01   3.007 0.002642 ** 
## Pickup           4.175e-01  1.276e-01   3.271 0.001072 ** 
## `Sports Car`     9.230e-01  1.676e-01   5.508 3.63e-08 ***
## Van              5.804e-01  1.614e-01   3.597 0.000322 ***
## z_SUV            7.123e-01  1.408e-01   5.058 4.23e-07 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4391.8  on 4863  degrees of freedom
## AIC: 4459.8
## 
## Number of Fisher Scoring iterations: 5

z_Blue Collar has one of the highest p-values (a very weak effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + AGE + HOMEKIDS + 
##     YOJ + INCOME + PARENT1 + HOME_VAL + MSTATUS + SEX + EDUCATION + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + RED_CAR + OLDCLAIM + 
##     CLM_FREQ + REVOKED + MVR_PTS + CAR_AGE + URBANICITY + Clerical + 
##     Doctor + `Home Maker` + Lawyer + Manager + Professional + 
##     `Panel Truck` + Pickup + `Sports Car` + Van + z_SUV, family = binomial(link = "logit"), 
##     data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2639  -0.7203  -0.4038   0.6194   3.1040  
## 
## Coefficients:
##                 Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   -3.357e+00  3.515e-01  -9.551  < 2e-16 ***
## KIDSDRIV       3.267e-01  7.965e-02   4.102 4.10e-05 ***
## AGE           -1.596e-03  5.187e-03  -0.308 0.758284    
## HOMEKIDS       6.509e-02  4.689e-02   1.388 0.165073    
## YOJ           -1.919e-02  1.062e-02  -1.806 0.070950 .  
## INCOME        -6.312e-06  1.468e-06  -4.298 1.72e-05 ***
## PARENT1        3.926e-01  1.427e-01   2.752 0.005927 ** 
## HOME_VAL       9.191e-07  7.767e-07   1.183 0.236671    
## MSTATUS       -6.178e-01  9.444e-02  -6.542 6.08e-11 ***
## SEX            3.802e-02  1.426e-01   0.267 0.789715    
## EDUCATION     -1.262e-01  5.603e-02  -2.253 0.024258 *  
## TRAVTIME       1.599e-02  2.427e-03   6.587 4.47e-11 ***
## CAR_USE        8.199e-01  1.113e-01   7.363 1.80e-13 ***
## BLUEBOOK      -2.518e-05  6.900e-06  -3.649 0.000263 ***
## TIF           -5.499e-02  9.413e-03  -5.841 5.18e-09 ***
## RED_CAR        9.504e-02  1.106e-01   0.859 0.390095    
## OLDCLAIM      -7.858e-06  4.930e-06  -1.594 0.110960    
## CLM_FREQ       1.884e-01  3.678e-02   5.121 3.04e-07 ***
## REVOKED        8.460e-01  1.174e-01   7.203 5.89e-13 ***
## MVR_PTS        1.020e-01  1.808e-02   5.644 1.66e-08 ***
## CAR_AGE        9.415e-04  9.371e-03   0.100 0.919972    
## URBANICITY     2.388e+00  1.447e-01  16.505  < 2e-16 ***
## Clerical       1.826e-01  1.254e-01   1.456 0.145355    
## Doctor        -4.814e-01  3.198e-01  -1.505 0.132279    
## `Home Maker`  -2.459e-02  1.762e-01  -0.140 0.889029    
## Lawyer         6.656e-02  1.767e-01   0.377 0.706448    
## Manager       -9.117e-01  1.591e-01  -5.731 1.00e-08 ***
## Professional  -2.130e-01  1.321e-01  -1.613 0.106828    
## `Panel Truck`  6.330e-01  2.013e-01   3.144 0.001664 ** 
## Pickup         4.205e-01  1.261e-01   3.335 0.000853 ***
## `Sports Car`   9.226e-01  1.676e-01   5.506 3.66e-08 ***
## Van            5.838e-01  1.599e-01   3.652 0.000261 ***
## z_SUV          7.118e-01  1.408e-01   5.056 4.28e-07 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4391.8  on 4864  degrees of freedom
## AIC: 4457.8
## 
## Number of Fisher Scoring iterations: 5

Home Maker has one of the highest p-values (a very weak effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + AGE + HOMEKIDS + 
##     YOJ + INCOME + PARENT1 + HOME_VAL + MSTATUS + SEX + EDUCATION + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + RED_CAR + OLDCLAIM + 
##     CLM_FREQ + REVOKED + MVR_PTS + CAR_AGE + URBANICITY + Clerical + 
##     Doctor + Lawyer + Manager + Professional + `Panel Truck` + 
##     Pickup + `Sports Car` + Van + z_SUV, family = binomial(link = "logit"), 
##     data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2603  -0.7215  -0.4039   0.6192   3.1044  
## 
## Coefficients:
##                 Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   -3.368e+00  3.438e-01  -9.795  < 2e-16 ***
## KIDSDRIV       3.268e-01  7.964e-02   4.104 4.07e-05 ***
## AGE           -1.660e-03  5.166e-03  -0.321 0.747909    
## HOMEKIDS       6.505e-02  4.688e-02   1.387 0.165292    
## YOJ           -1.882e-02  1.029e-02  -1.829 0.067416 .  
## INCOME        -6.297e-06  1.465e-06  -4.299 1.72e-05 ***
## PARENT1        3.925e-01  1.427e-01   2.751 0.005933 ** 
## HOME_VAL       9.415e-07  7.599e-07   1.239 0.215363    
## MSTATUS       -6.181e-01  9.441e-02  -6.547 5.87e-11 ***
## SEX            3.583e-02  1.417e-01   0.253 0.800385    
## EDUCATION     -1.278e-01  5.494e-02  -2.326 0.020032 *  
## TRAVTIME       1.599e-02  2.427e-03   6.587 4.50e-11 ***
## CAR_USE        8.249e-01  1.053e-01   7.838 4.58e-15 ***
## BLUEBOOK      -2.515e-05  6.897e-06  -3.647 0.000265 ***
## TIF           -5.498e-02  9.413e-03  -5.841 5.19e-09 ***
## RED_CAR        9.535e-02  1.106e-01   0.862 0.388487    
## OLDCLAIM      -7.860e-06  4.930e-06  -1.594 0.110892    
## CLM_FREQ       1.883e-01  3.678e-02   5.120 3.06e-07 ***
## REVOKED        8.465e-01  1.174e-01   7.210 5.58e-13 ***
## MVR_PTS        1.021e-01  1.807e-02   5.651 1.60e-08 ***
## CAR_AGE        9.587e-04  9.370e-03   0.102 0.918510    
## URBANICITY     2.389e+00  1.445e-01  16.529  < 2e-16 ***
## Clerical       1.876e-01  1.201e-01   1.562 0.118357    
## Doctor        -4.737e-01  3.150e-01  -1.504 0.132648    
## Lawyer         7.399e-02  1.685e-01   0.439 0.660586    
## Manager       -9.067e-01  1.551e-01  -5.845 5.06e-09 ***
## Professional  -2.085e-01  1.281e-01  -1.628 0.103589    
## `Panel Truck`  6.300e-01  2.001e-01   3.148 0.001646 ** 
## Pickup         4.195e-01  1.259e-01   3.332 0.000861 ***
## `Sports Car`   9.224e-01  1.675e-01   5.505 3.68e-08 ***
## Van            5.819e-01  1.593e-01   3.653 0.000259 ***
## z_SUV          7.121e-01  1.408e-01   5.058 4.23e-07 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4391.8  on 4865  degrees of freedom
## AIC: 4455.8
## 
## Number of Fisher Scoring iterations: 5

CAR_AGE has the highest p-value (the weakest effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + AGE + HOMEKIDS + 
##     YOJ + INCOME + PARENT1 + HOME_VAL + MSTATUS + SEX + EDUCATION + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + RED_CAR + OLDCLAIM + 
##     CLM_FREQ + REVOKED + MVR_PTS + URBANICITY + Clerical + Doctor + 
##     Lawyer + Manager + Professional + `Panel Truck` + Pickup + 
##     `Sports Car` + Van + z_SUV, family = binomial(link = "logit"), 
##     data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2610  -0.7220  -0.4036   0.6189   3.1047  
## 
## Coefficients:
##                 Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   -3.365e+00  3.426e-01  -9.821  < 2e-16 ***
## KIDSDRIV       3.268e-01  7.963e-02   4.104 4.06e-05 ***
## AGE           -1.658e-03  5.166e-03  -0.321 0.748244    
## HOMEKIDS       6.506e-02  4.688e-02   1.388 0.165259    
## YOJ           -1.881e-02  1.029e-02  -1.828 0.067504 .  
## INCOME        -6.293e-06  1.464e-06  -4.298 1.72e-05 ***
## PARENT1        3.926e-01  1.427e-01   2.752 0.005931 ** 
## HOME_VAL       9.389e-07  7.594e-07   1.236 0.216344    
## MSTATUS       -6.182e-01  9.441e-02  -6.549 5.81e-11 ***
## SEX            3.603e-02  1.417e-01   0.254 0.799267    
## EDUCATION     -1.249e-01  4.697e-02  -2.658 0.007855 ** 
## TRAVTIME       1.599e-02  2.427e-03   6.586 4.51e-11 ***
## CAR_USE        8.244e-01  1.051e-01   7.844 4.36e-15 ***
## BLUEBOOK      -2.515e-05  6.897e-06  -3.647 0.000265 ***
## TIF           -5.497e-02  9.412e-03  -5.840 5.22e-09 ***
## RED_CAR        9.560e-02  1.105e-01   0.865 0.387111    
## OLDCLAIM      -7.856e-06  4.930e-06  -1.594 0.111041    
## CLM_FREQ       1.884e-01  3.678e-02   5.123 3.01e-07 ***
## REVOKED        8.463e-01  1.174e-01   7.210 5.61e-13 ***
## MVR_PTS        1.021e-01  1.806e-02   5.650 1.60e-08 ***
## URBANICITY     2.389e+00  1.445e-01  16.529  < 2e-16 ***
## Clerical       1.872e-01  1.201e-01   1.559 0.119013    
## Doctor        -4.751e-01  3.147e-01  -1.510 0.131080    
## Lawyer         7.522e-02  1.681e-01   0.448 0.654495    
## Manager       -9.066e-01  1.551e-01  -5.845 5.07e-09 ***
## Professional  -2.087e-01  1.280e-01  -1.630 0.103066    
## `Panel Truck`  6.304e-01  2.001e-01   3.151 0.001629 ** 
## Pickup         4.195e-01  1.259e-01   3.333 0.000860 ***
## `Sports Car`   9.221e-01  1.675e-01   5.505 3.70e-08 ***
## Van            5.818e-01  1.593e-01   3.652 0.000260 ***
## z_SUV          7.120e-01  1.408e-01   5.058 4.24e-07 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4391.8  on 4866  degrees of freedom
## AIC: 4453.8
## 
## Number of Fisher Scoring iterations: 5

SEX has the highest p-value (the weakest effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + AGE + HOMEKIDS + 
##     YOJ + INCOME + PARENT1 + HOME_VAL + MSTATUS + EDUCATION + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + RED_CAR + OLDCLAIM + 
##     CLM_FREQ + REVOKED + MVR_PTS + URBANICITY + Clerical + Doctor + 
##     Lawyer + Manager + Professional + `Panel Truck` + Pickup + 
##     `Sports Car` + Van + z_SUV, family = binomial(link = "logit"), 
##     data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2609  -0.7203  -0.4042   0.6190   3.1031  
## 
## Coefficients:
##                 Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   -3.349e+00  3.369e-01  -9.942  < 2e-16 ***
## KIDSDRIV       3.272e-01  7.963e-02   4.109 3.98e-05 ***
## AGE           -1.789e-03  5.140e-03  -0.348 0.727755    
## HOMEKIDS       6.538e-02  4.687e-02   1.395 0.162992    
## YOJ           -1.888e-02  1.028e-02  -1.836 0.066381 .  
## INCOME        -6.298e-06  1.464e-06  -4.302 1.69e-05 ***
## PARENT1        3.926e-01  1.427e-01   2.752 0.005931 ** 
## HOME_VAL       9.333e-07  7.591e-07   1.230 0.218880    
## MSTATUS       -6.182e-01  9.440e-02  -6.549 5.80e-11 ***
## EDUCATION     -1.243e-01  4.693e-02  -2.650 0.008060 ** 
## TRAVTIME       1.599e-02  2.427e-03   6.586 4.51e-11 ***
## CAR_USE        8.232e-01  1.050e-01   7.840 4.49e-15 ***
## BLUEBOOK      -2.449e-05  6.387e-06  -3.835 0.000126 ***
## TIF           -5.499e-02  9.412e-03  -5.843 5.14e-09 ***
## RED_CAR        8.281e-02  9.833e-02   0.842 0.399693    
## OLDCLAIM      -7.886e-06  4.929e-06  -1.600 0.109592    
## CLM_FREQ       1.885e-01  3.677e-02   5.125 2.98e-07 ***
## REVOKED        8.462e-01  1.174e-01   7.209 5.64e-13 ***
## MVR_PTS        1.021e-01  1.806e-02   5.650 1.60e-08 ***
## URBANICITY     2.389e+00  1.445e-01  16.529  < 2e-16 ***
## Clerical       1.860e-01  1.200e-01   1.551 0.121008    
## Doctor        -4.795e-01  3.142e-01  -1.526 0.126988    
## Lawyer         7.335e-02  1.679e-01   0.437 0.662253    
## Manager       -9.079e-01  1.550e-01  -5.858 4.70e-09 ***
## Professional  -2.096e-01  1.280e-01  -1.637 0.101583    
## `Panel Truck`  6.142e-01  1.896e-01   3.240 0.001196 ** 
## Pickup         4.200e-01  1.259e-01   3.337 0.000848 ***
## `Sports Car`   9.421e-01  1.480e-01   6.365 1.95e-10 ***
## Van            5.721e-01  1.546e-01   3.700 0.000216 ***
## z_SUV          7.315e-01  1.183e-01   6.182 6.32e-10 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4391.9  on 4867  degrees of freedom
## AIC: 4451.9
## 
## Number of Fisher Scoring iterations: 5

AGE has the highest p-value (the weakest effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + HOMEKIDS + YOJ + 
##     INCOME + PARENT1 + HOME_VAL + MSTATUS + EDUCATION + TRAVTIME + 
##     CAR_USE + BLUEBOOK + TIF + RED_CAR + OLDCLAIM + CLM_FREQ + 
##     REVOKED + MVR_PTS + URBANICITY + Clerical + Doctor + Lawyer + 
##     Manager + Professional + `Panel Truck` + Pickup + `Sports Car` + 
##     Van + z_SUV, family = binomial(link = "logit"), data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2689  -0.7197  -0.4055   0.6184   3.0972  
## 
## Coefficients:
##                 Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   -3.420e+00  2.687e-01 -12.728  < 2e-16 ***
## KIDSDRIV       3.220e-01  7.826e-02   4.115 3.87e-05 ***
## HOMEKIDS       7.163e-02  4.327e-02   1.655 0.097836 .  
## YOJ           -1.941e-02  1.017e-02  -1.909 0.056238 .  
## INCOME        -6.258e-06  1.459e-06  -4.290 1.78e-05 ***
## PARENT1        3.984e-01  1.417e-01   2.812 0.004930 ** 
## HOME_VAL       9.167e-07  7.574e-07   1.210 0.226114    
## MSTATUS       -6.194e-01  9.434e-02  -6.566 5.16e-11 ***
## EDUCATION     -1.256e-01  4.679e-02  -2.683 0.007288 ** 
## TRAVTIME       1.597e-02  2.427e-03   6.581 4.67e-11 ***
## CAR_USE        8.233e-01  1.050e-01   7.841 4.48e-15 ***
## BLUEBOOK      -2.475e-05  6.344e-06  -3.901 9.58e-05 ***
## TIF           -5.496e-02  9.411e-03  -5.840 5.23e-09 ***
## RED_CAR        8.168e-02  9.826e-02   0.831 0.405829    
## OLDCLAIM      -7.878e-06  4.929e-06  -1.598 0.109996    
## CLM_FREQ       1.880e-01  3.676e-02   5.116 3.13e-07 ***
## REVOKED        8.458e-01  1.174e-01   7.205 5.82e-13 ***
## MVR_PTS        1.024e-01  1.804e-02   5.672 1.41e-08 ***
## URBANICITY     2.391e+00  1.444e-01  16.552  < 2e-16 ***
## Clerical       1.883e-01  1.198e-01   1.572 0.116040    
## Doctor        -4.845e-01  3.138e-01  -1.544 0.122621    
## Lawyer         7.043e-02  1.677e-01   0.420 0.674549    
## Manager       -9.106e-01  1.548e-01  -5.883 4.03e-09 ***
## Professional  -2.094e-01  1.280e-01  -1.636 0.101752    
## `Panel Truck`  6.164e-01  1.895e-01   3.254 0.001139 ** 
## Pickup         4.198e-01  1.259e-01   3.335 0.000853 ***
## `Sports Car`   9.382e-01  1.476e-01   6.358 2.05e-10 ***
## Van            5.731e-01  1.546e-01   3.707 0.000210 ***
## z_SUV          7.291e-01  1.181e-01   6.173 6.68e-10 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4392.0  on 4868  degrees of freedom
## AIC: 4450
## 
## Number of Fisher Scoring iterations: 5

Lawyer has the highest p-value (the weakest effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + HOMEKIDS + YOJ + 
##     INCOME + PARENT1 + HOME_VAL + MSTATUS + EDUCATION + TRAVTIME + 
##     CAR_USE + BLUEBOOK + TIF + RED_CAR + OLDCLAIM + CLM_FREQ + 
##     REVOKED + MVR_PTS + URBANICITY + Clerical + Doctor + Manager + 
##     Professional + `Panel Truck` + Pickup + `Sports Car` + Van + 
##     z_SUV, family = binomial(link = "logit"), data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2669  -0.7201  -0.4047   0.6221   3.0944  
## 
## Coefficients:
##                 Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   -3.419e+00  2.686e-01 -12.730  < 2e-16 ***
## KIDSDRIV       3.218e-01  7.823e-02   4.113 3.91e-05 ***
## HOMEKIDS       7.050e-02  4.317e-02   1.633 0.102493    
## YOJ           -1.885e-02  1.008e-02  -1.871 0.061347 .  
## INCOME        -6.225e-06  1.456e-06  -4.277 1.90e-05 ***
## PARENT1        3.982e-01  1.417e-01   2.811 0.004936 ** 
## HOME_VAL       9.200e-07  7.566e-07   1.216 0.224006    
## MSTATUS       -6.199e-01  9.434e-02  -6.571 5.00e-11 ***
## EDUCATION     -1.180e-01  4.311e-02  -2.737 0.006206 ** 
## TRAVTIME       1.596e-02  2.427e-03   6.575 4.87e-11 ***
## CAR_USE        8.062e-01  9.669e-02   8.338  < 2e-16 ***
## BLUEBOOK      -2.468e-05  6.340e-06  -3.893 9.91e-05 ***
## TIF           -5.497e-02  9.411e-03  -5.842 5.17e-09 ***
## RED_CAR        8.264e-02  9.824e-02   0.841 0.400206    
## OLDCLAIM      -7.930e-06  4.927e-06  -1.610 0.107486    
## CLM_FREQ       1.878e-01  3.676e-02   5.110 3.23e-07 ***
## REVOKED        8.474e-01  1.173e-01   7.222 5.13e-13 ***
## MVR_PTS        1.025e-01  1.804e-02   5.681 1.34e-08 ***
## URBANICITY     2.391e+00  1.444e-01  16.557  < 2e-16 ***
## Clerical       1.785e-01  1.174e-01   1.520 0.128561    
## Doctor        -5.257e-01  2.980e-01  -1.764 0.077686 .  
## Manager       -9.317e-01  1.463e-01  -6.367 1.93e-10 ***
## Professional  -2.276e-01  1.204e-01  -1.891 0.058644 .  
## `Panel Truck`  6.112e-01  1.890e-01   3.234 0.001220 ** 
## Pickup         4.205e-01  1.258e-01   3.341 0.000834 ***
## `Sports Car`   9.363e-01  1.475e-01   6.347 2.19e-10 ***
## Van            5.723e-01  1.546e-01   3.701 0.000214 ***
## z_SUV          7.277e-01  1.181e-01   6.163 7.13e-10 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4392.2  on 4869  degrees of freedom
## AIC: 4448.2
## 
## Number of Fisher Scoring iterations: 5

RED_CAR has the highest p-value (the weakest effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + HOMEKIDS + YOJ + 
##     INCOME + PARENT1 + HOME_VAL + MSTATUS + EDUCATION + TRAVTIME + 
##     CAR_USE + BLUEBOOK + TIF + OLDCLAIM + CLM_FREQ + REVOKED + 
##     MVR_PTS + URBANICITY + Clerical + Doctor + Manager + Professional + 
##     `Panel Truck` + Pickup + `Sports Car` + Van + z_SUV, family = binomial(link = "logit"), 
##     data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2663  -0.7182  -0.4054   0.6228   3.1106  
## 
## Coefficients:
##                 Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   -3.368e+00  2.616e-01 -12.876  < 2e-16 ***
## KIDSDRIV       3.207e-01  7.817e-02   4.103 4.08e-05 ***
## HOMEKIDS       7.048e-02  4.316e-02   1.633 0.102473    
## YOJ           -1.853e-02  1.007e-02  -1.841 0.065693 .  
## INCOME        -6.256e-06  1.455e-06  -4.298 1.72e-05 ***
## PARENT1        3.954e-01  1.416e-01   2.792 0.005237 ** 
## HOME_VAL       9.258e-07  7.564e-07   1.224 0.220925    
## MSTATUS       -6.209e-01  9.433e-02  -6.582 4.64e-11 ***
## EDUCATION     -1.183e-01  4.310e-02  -2.746 0.006034 ** 
## TRAVTIME       1.595e-02  2.426e-03   6.572 4.95e-11 ***
## CAR_USE        8.065e-01  9.668e-02   8.341  < 2e-16 ***
## BLUEBOOK      -2.590e-05  6.173e-06  -4.196 2.71e-05 ***
## TIF           -5.496e-02  9.406e-03  -5.843 5.12e-09 ***
## OLDCLAIM      -7.966e-06  4.927e-06  -1.617 0.105879    
## CLM_FREQ       1.887e-01  3.675e-02   5.134 2.84e-07 ***
## REVOKED        8.491e-01  1.173e-01   7.237 4.57e-13 ***
## MVR_PTS        1.023e-01  1.804e-02   5.674 1.39e-08 ***
## URBANICITY     2.392e+00  1.444e-01  16.568  < 2e-16 ***
## Clerical       1.788e-01  1.175e-01   1.523 0.127830    
## Doctor        -5.196e-01  2.979e-01  -1.744 0.081188 .  
## Manager       -9.267e-01  1.462e-01  -6.339 2.31e-10 ***
## Professional  -2.251e-01  1.203e-01  -1.871 0.061292 .  
## `Panel Truck`  6.411e-01  1.856e-01   3.453 0.000554 ***
## Pickup         4.211e-01  1.258e-01   3.348 0.000815 ***
## `Sports Car`   8.996e-01  1.408e-01   6.389 1.66e-10 ***
## Van            5.873e-01  1.536e-01   3.824 0.000132 ***
## z_SUV          6.898e-01  1.090e-01   6.329 2.46e-10 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4392.9  on 4870  degrees of freedom
## AIC: 4446.9
## 
## Number of Fisher Scoring iterations: 5

HOME_VAL (Home Value) has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + HOMEKIDS + YOJ + 
##     INCOME + PARENT1 + MSTATUS + EDUCATION + TRAVTIME + CAR_USE + 
##     BLUEBOOK + TIF + OLDCLAIM + CLM_FREQ + REVOKED + MVR_PTS + 
##     URBANICITY + Clerical + Doctor + Manager + Professional + 
##     `Panel Truck` + Pickup + `Sports Car` + Van + z_SUV, family = binomial(link = "logit"), 
##     data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2451  -0.7196  -0.4025   0.6311   3.1149  
## 
## Coefficients:
##                 Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   -3.231e+00  2.360e-01 -13.694  < 2e-16 ***
## KIDSDRIV       3.218e-01  7.813e-02   4.119 3.81e-05 ***
## HOMEKIDS       6.865e-02  4.314e-02   1.591 0.111537    
## YOJ           -1.889e-02  1.005e-02  -1.880 0.060135 .  
## INCOME        -5.273e-06  1.207e-06  -4.370 1.25e-05 ***
## PARENT1        4.033e-01  1.413e-01   2.854 0.004323 ** 
## MSTATUS       -6.144e-01  9.421e-02  -6.522 6.96e-11 ***
## EDUCATION     -1.170e-01  4.306e-02  -2.718 0.006570 ** 
## TRAVTIME       1.595e-02  2.426e-03   6.577 4.81e-11 ***
## CAR_USE        8.097e-01  9.663e-02   8.380  < 2e-16 ***
## BLUEBOOK      -2.593e-05  6.174e-06  -4.199 2.68e-05 ***
## TIF           -5.470e-02  9.398e-03  -5.821 5.85e-09 ***
## OLDCLAIM      -7.926e-06  4.922e-06  -1.610 0.107338    
## CLM_FREQ       1.868e-01  3.670e-02   5.088 3.61e-07 ***
## REVOKED        8.471e-01  1.173e-01   7.224 5.04e-13 ***
## MVR_PTS        1.028e-01  1.802e-02   5.706 1.16e-08 ***
## URBANICITY     2.389e+00  1.442e-01  16.565  < 2e-16 ***
## Clerical       1.684e-01  1.170e-01   1.439 0.150220    
## Doctor        -5.146e-01  2.982e-01  -1.726 0.084364 .  
## Manager       -9.267e-01  1.463e-01  -6.334 2.38e-10 ***
## Professional  -2.222e-01  1.203e-01  -1.847 0.064684 .  
## `Panel Truck`  6.466e-01  1.856e-01   3.484 0.000495 ***
## Pickup         4.181e-01  1.257e-01   3.325 0.000885 ***
## `Sports Car`   8.997e-01  1.407e-01   6.393 1.63e-10 ***
## Van            5.901e-01  1.536e-01   3.841 0.000123 ***
## z_SUV          6.900e-01  1.090e-01   6.333 2.40e-10 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4394.4  on 4871  degrees of freedom
## AIC: 4446.4
## 
## Number of Fisher Scoring iterations: 5

Clerical has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + HOMEKIDS + YOJ + 
##     INCOME + PARENT1 + MSTATUS + EDUCATION + TRAVTIME + CAR_USE + 
##     BLUEBOOK + TIF + OLDCLAIM + CLM_FREQ + REVOKED + MVR_PTS + 
##     URBANICITY + Doctor + Manager + Professional + `Panel Truck` + 
##     Pickup + `Sports Car` + Van + z_SUV, family = binomial(link = "logit"), 
##     data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2635  -0.7197  -0.4021   0.6310   3.1090  
## 
## Coefficients:
##                 Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   -3.172e+00  2.319e-01 -13.675  < 2e-16 ***
## KIDSDRIV       3.197e-01  7.812e-02   4.093 4.27e-05 ***
## HOMEKIDS       6.801e-02  4.313e-02   1.577 0.114787    
## YOJ           -1.611e-02  9.850e-03  -1.635 0.102014    
## INCOME        -5.324e-06  1.205e-06  -4.417 9.99e-06 ***
## PARENT1        4.110e-01  1.412e-01   2.911 0.003605 ** 
## MSTATUS       -6.136e-01  9.415e-02  -6.518 7.14e-11 ***
## EDUCATION     -1.334e-01  4.146e-02  -3.218 0.001290 ** 
## TRAVTIME       1.585e-02  2.423e-03   6.540 6.14e-11 ***
## CAR_USE        7.671e-01  9.176e-02   8.361  < 2e-16 ***
## BLUEBOOK      -2.594e-05  6.174e-06  -4.201 2.66e-05 ***
## TIF           -5.433e-02  9.390e-03  -5.786 7.22e-09 ***
## OLDCLAIM      -7.944e-06  4.923e-06  -1.614 0.106626    
## CLM_FREQ       1.869e-01  3.669e-02   5.094 3.51e-07 ***
## REVOKED        8.494e-01  1.172e-01   7.246 4.28e-13 ***
## MVR_PTS        1.033e-01  1.801e-02   5.738 9.56e-09 ***
## URBANICITY     2.375e+00  1.439e-01  16.502  < 2e-16 ***
## Doctor        -5.227e-01  2.980e-01  -1.754 0.079421 .  
## Manager       -9.567e-01  1.447e-01  -6.612 3.80e-11 ***
## Professional  -2.608e-01  1.171e-01  -2.228 0.025909 *  
## `Panel Truck`  6.813e-01  1.840e-01   3.702 0.000214 ***
## Pickup         4.369e-01  1.249e-01   3.497 0.000471 ***
## `Sports Car`   8.955e-01  1.406e-01   6.367 1.92e-10 ***
## Van            6.085e-01  1.530e-01   3.978 6.95e-05 ***
## z_SUV          6.881e-01  1.089e-01   6.318 2.65e-10 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4396.4  on 4872  degrees of freedom
## AIC: 4446.4
## 
## Number of Fisher Scoring iterations: 5

HOMEKIDS has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + YOJ + INCOME + 
##     PARENT1 + MSTATUS + EDUCATION + TRAVTIME + CAR_USE + BLUEBOOK + 
##     TIF + OLDCLAIM + CLM_FREQ + REVOKED + MVR_PTS + URBANICITY + 
##     Doctor + Manager + Professional + `Panel Truck` + Pickup + 
##     `Sports Car` + Van + z_SUV, family = binomial(link = "logit"), 
##     data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2785  -0.7210  -0.4025   0.6308   3.0979  
## 
## Coefficients:
##                 Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   -3.164e+00  2.317e-01 -13.657  < 2e-16 ***
## KIDSDRIV       3.691e-01  7.167e-02   5.150 2.60e-07 ***
## YOJ           -1.457e-02  9.802e-03  -1.487 0.137034    
## INCOME        -5.392e-06  1.205e-06  -4.474 7.68e-06 ***
## PARENT1        5.210e-01  1.229e-01   4.241 2.23e-05 ***
## MSTATUS       -5.704e-01  8.986e-02  -6.348 2.19e-10 ***
## EDUCATION     -1.391e-01  4.128e-02  -3.370 0.000753 ***
## TRAVTIME       1.572e-02  2.421e-03   6.495 8.30e-11 ***
## CAR_USE        7.669e-01  9.173e-02   8.360  < 2e-16 ***
## BLUEBOOK      -2.631e-05  6.171e-06  -4.264 2.01e-05 ***
## TIF           -5.393e-02  9.382e-03  -5.748 9.03e-09 ***
## OLDCLAIM      -8.045e-06  4.923e-06  -1.634 0.102182    
## CLM_FREQ       1.862e-01  3.668e-02   5.076 3.85e-07 ***
## REVOKED        8.556e-01  1.171e-01   7.308 2.71e-13 ***
## MVR_PTS        1.041e-01  1.800e-02   5.785 7.23e-09 ***
## URBANICITY     2.375e+00  1.439e-01  16.503  < 2e-16 ***
## Doctor        -5.278e-01  2.980e-01  -1.771 0.076506 .  
## Manager       -9.682e-01  1.445e-01  -6.698 2.11e-11 ***
## Professional  -2.701e-01  1.169e-01  -2.310 0.020898 *  
## `Panel Truck`  6.870e-01  1.839e-01   3.735 0.000188 ***
## Pickup         4.369e-01  1.249e-01   3.498 0.000469 ***
## `Sports Car`   8.999e-01  1.405e-01   6.403 1.53e-10 ***
## Van            6.137e-01  1.529e-01   4.014 5.98e-05 ***
## z_SUV          6.925e-01  1.088e-01   6.363 1.97e-10 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4398.9  on 4873  degrees of freedom
## AIC: 4446.9
## 
## Number of Fisher Scoring iterations: 5

YOJ (Years on the Job) has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + INCOME + PARENT1 + 
##     MSTATUS + EDUCATION + TRAVTIME + CAR_USE + BLUEBOOK + TIF + 
##     OLDCLAIM + CLM_FREQ + REVOKED + MVR_PTS + URBANICITY + Doctor + 
##     Manager + Professional + `Panel Truck` + Pickup + `Sports Car` + 
##     Van + z_SUV, family = binomial(link = "logit"), data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2850  -0.7212  -0.4011   0.6410   3.0893  
## 
## Coefficients:
##                 Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   -3.272e+00  2.205e-01 -14.835  < 2e-16 ***
## KIDSDRIV       3.686e-01  7.164e-02   5.146 2.66e-07 ***
## INCOME        -5.917e-06  1.158e-06  -5.108 3.26e-07 ***
## PARENT1        5.127e-01  1.226e-01   4.180 2.92e-05 ***
## MSTATUS       -5.911e-01  8.879e-02  -6.658 2.77e-11 ***
## EDUCATION     -1.315e-01  4.095e-02  -3.212 0.001319 ** 
## TRAVTIME       1.563e-02  2.417e-03   6.466 1.01e-10 ***
## CAR_USE        7.683e-01  9.169e-02   8.380  < 2e-16 ***
## BLUEBOOK      -2.661e-05  6.172e-06  -4.312 1.62e-05 ***
## TIF           -5.413e-02  9.377e-03  -5.773 7.81e-09 ***
## OLDCLAIM      -8.308e-06  4.918e-06  -1.690 0.091123 .  
## CLM_FREQ       1.869e-01  3.666e-02   5.098 3.44e-07 ***
## REVOKED        8.549e-01  1.170e-01   7.305 2.77e-13 ***
## MVR_PTS        1.047e-01  1.798e-02   5.822 5.83e-09 ***
## URBANICITY     2.365e+00  1.436e-01  16.476  < 2e-16 ***
## Doctor        -5.291e-01  2.981e-01  -1.775 0.075885 .  
## Manager       -9.769e-01  1.445e-01  -6.762 1.36e-11 ***
## Professional  -2.791e-01  1.168e-01  -2.390 0.016863 *  
## `Panel Truck`  6.991e-01  1.838e-01   3.803 0.000143 ***
## Pickup         4.374e-01  1.248e-01   3.504 0.000458 ***
## `Sports Car`   9.153e-01  1.401e-01   6.533 6.44e-11 ***
## Van            6.198e-01  1.529e-01   4.055 5.02e-05 ***
## z_SUV          6.976e-01  1.087e-01   6.417 1.39e-10 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4401.1  on 4874  degrees of freedom
## AIC: 4447.1
## 
## Number of Fisher Scoring iterations: 5

OLDCLAIM has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + INCOME + PARENT1 + 
##     MSTATUS + EDUCATION + TRAVTIME + CAR_USE + BLUEBOOK + TIF + 
##     CLM_FREQ + REVOKED + MVR_PTS + URBANICITY + Doctor + Manager + 
##     Professional + `Panel Truck` + Pickup + `Sports Car` + Van + 
##     z_SUV, family = binomial(link = "logit"), data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2625  -0.7246  -0.4026   0.6340   3.0865  
## 
## Coefficients:
##                 Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   -3.271e+00  2.204e-01 -14.843  < 2e-16 ***
## KIDSDRIV       3.725e-01  7.151e-02   5.208 1.91e-07 ***
## INCOME        -5.908e-06  1.158e-06  -5.103 3.35e-07 ***
## PARENT1        5.141e-01  1.226e-01   4.195 2.73e-05 ***
## MSTATUS       -5.892e-01  8.875e-02  -6.639 3.16e-11 ***
## EDUCATION     -1.309e-01  4.093e-02  -3.199 0.001381 ** 
## TRAVTIME       1.574e-02  2.415e-03   6.519 7.07e-11 ***
## CAR_USE        7.675e-01  9.169e-02   8.371  < 2e-16 ***
## BLUEBOOK      -2.672e-05  6.167e-06  -4.333 1.47e-05 ***
## TIF           -5.440e-02  9.370e-03  -5.805 6.42e-09 ***
## CLM_FREQ       1.592e-01  3.288e-02   4.841 1.29e-06 ***
## REVOKED        7.627e-01  1.039e-01   7.343 2.09e-13 ***
## MVR_PTS        1.017e-01  1.788e-02   5.688 1.28e-08 ***
## URBANICITY     2.364e+00  1.435e-01  16.479  < 2e-16 ***
## Doctor        -5.236e-01  2.976e-01  -1.760 0.078451 .  
## Manager       -9.757e-01  1.444e-01  -6.756 1.42e-11 ***
## Professional  -2.766e-01  1.167e-01  -2.370 0.017803 *  
## `Panel Truck`  7.017e-01  1.837e-01   3.821 0.000133 ***
## Pickup         4.384e-01  1.249e-01   3.511 0.000446 ***
## `Sports Car`   9.136e-01  1.401e-01   6.522 6.93e-11 ***
## Van            6.211e-01  1.528e-01   4.065 4.81e-05 ***
## z_SUV          7.009e-01  1.086e-01   6.452 1.10e-10 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4404.0  on 4875  degrees of freedom
## AIC: 4448
## 
## Number of Fisher Scoring iterations: 5

Doctor has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train2$TARGET_FLAG ~ KIDSDRIV + INCOME + PARENT1 + 
##     MSTATUS + EDUCATION + TRAVTIME + CAR_USE + BLUEBOOK + TIF + 
##     CLM_FREQ + REVOKED + MVR_PTS + URBANICITY + Manager + Professional + 
##     `Panel Truck` + Pickup + `Sports Car` + Van + z_SUV, family = binomial(link = "logit"), 
##     data = train2)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.2685  -0.7237  -0.4043   0.6322   3.0893  
## 
## Coefficients:
##                 Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   -3.261e+00  2.203e-01 -14.800  < 2e-16 ***
## KIDSDRIV       3.747e-01  7.157e-02   5.236 1.64e-07 ***
## INCOME        -6.114e-06  1.152e-06  -5.309 1.10e-07 ***
## PARENT1        5.155e-01  1.225e-01   4.208 2.58e-05 ***
## MSTATUS       -5.866e-01  8.870e-02  -6.613 3.76e-11 ***
## EDUCATION     -1.485e-01  3.979e-02  -3.732 0.000190 ***
## TRAVTIME       1.576e-02  2.415e-03   6.524 6.85e-11 ***
## CAR_USE        7.903e-01  9.092e-02   8.693  < 2e-16 ***
## BLUEBOOK      -2.679e-05  6.160e-06  -4.349 1.37e-05 ***
## TIF           -5.410e-02  9.372e-03  -5.772 7.81e-09 ***
## CLM_FREQ       1.589e-01  3.288e-02   4.834 1.34e-06 ***
## REVOKED        7.661e-01  1.038e-01   7.381 1.57e-13 ***
## MVR_PTS        1.018e-01  1.789e-02   5.691 1.26e-08 ***
## URBANICITY     2.366e+00  1.436e-01  16.473  < 2e-16 ***
## Manager       -9.406e-01  1.433e-01  -6.566 5.16e-11 ***
## Professional  -2.498e-01  1.159e-01  -2.155 0.031186 *  
## `Panel Truck`  7.148e-01  1.836e-01   3.893 9.89e-05 ***
## Pickup         4.341e-01  1.248e-01   3.477 0.000506 ***
## `Sports Car`   9.109e-01  1.399e-01   6.510 7.50e-11 ***
## Van            6.195e-01  1.526e-01   4.059 4.92e-05 ***
## z_SUV          6.987e-01  1.086e-01   6.435 1.24e-10 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 4407.4  on 4876  degrees of freedom
## AIC: 4449.4
## 
## Number of Fisher Scoring iterations: 5
##   (Intercept)      KIDSDRIV        INCOME       PARENT1       MSTATUS 
## -4.778999e-01  5.492667e-02 -8.961659e-07  7.555801e-02 -8.598043e-02 
##     EDUCATION      TRAVTIME       CAR_USE      BLUEBOOK           TIF 
## -2.176489e-02  2.309464e-03  1.158372e-01 -3.926466e-06 -7.929253e-03 
##      CLM_FREQ       REVOKED       MVR_PTS    URBANICITY       Manager 
##  2.329275e-02  1.122878e-01  1.492372e-02  3.467393e-01 -1.378694e-01 
##  Professional `Panel Truck`        Pickup  `Sports Car`           Van 
## -3.661444e-02  1.047719e-01  6.362690e-02  1.335134e-01  9.080778e-02 
##         z_SUV 
##  1.024094e-01

Based on the signs of the coefficients in the final model, the following factors make a claim more likely: more children who drive (KIDSDRIV), being a single parent, a longer commute (TRAVTIME), commercial vehicle use, more claims in the past five years (CLM_FREQ), a revoked license, more motor vehicle record points, living in an urban area, and driving a panel truck, pickup, sports car, van, or SUV. The following factors make a claim less likely: higher income, being married, more education, a higher-value vehicle (BLUEBOOK), more years as a customer (TIF), and working as a manager or a professional.
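These directions come from the signs of the log-odds coefficients; exponentiating them gives odds ratios that are easier to interpret. A short sketch, assuming the final Model 2 fit is stored in an object called `fit_model2` (name hypothetical):

```r
# Odds ratios for the final Model 2 fit (fit_model2 is a hypothetical name).
# Values above 1 raise the odds of a claim; values below 1 lower them.
odds_ratios <- exp(coef(fit_model2))
round(sort(odds_ratios, decreasing = TRUE), 3)

# Wald confidence intervals on the odds-ratio scale
round(exp(confint.default(fit_model2)), 3)
```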

Prediction from Model 2

## [1] 0.33
## [1] 0.7974893

The cutoff associated with the point on the ROC curve farthest from the diagonal (chance) line is 0.33, so I will use 0.33 as the cutoff for making predictions: a predicted probability above 0.33 is assigned a target of one, and a value below 0.33 is assigned a target of zero. The area under the curve is 0.80.
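One way to obtain these two numbers is with the pROC package; the sketch below assumes the test-set predicted probabilities are stored in `pred_prob` and the observed flags in `test$TARGET_FLAG` (both names hypothetical).

```r
library(pROC)

# ROC curve from observed flags and predicted probabilities
# (pred_prob and test are hypothetical object names).
roc_obj <- roc(test$TARGET_FLAG, pred_prob)

auc(roc_obj)                                          # area under the curve
coords(roc_obj, x = "best", best.method = "youden")   # cutoff farthest from the chance line

# Apply the chosen 0.33 cutoff to turn probabilities into predicted classes
pred_class <- ifelse(pred_prob > 0.33, 1, 0)
```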

Confusion Matrix
##     true
## pred    0    1
##    0 1875  268
##    1  531  590

The model correctly predicted 1,875 zeros (true negatives) and 590 ones (true positives). It predicted 268 zeros that were actually ones (false negatives) and 531 ones that were actually zeros (false positives).

##    accuracy     error precision sensitivity specificity        f1
## 1 0.7552083 0.2447917 0.4121785   0.7793017   0.6876457 0.5391804
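For reference, the sketch below applies the standard formulas to the 2x2 table above, taking class 1 (a crash) as the positive class; depending on which class the helper function treats as positive, individual labels may not match the table cell for cell.

```r
# Metrics from the confusion matrix above (predictions in rows, truth in columns),
# using the standard definitions with class 1 as the positive class.
TN <- 1875; FN <- 268   # predicted 0
FP <- 531;  TP <- 590   # predicted 1

accuracy    <- (TP + TN) / (TP + TN + FP + FN)
error       <- 1 - accuracy
precision   <- TP / (TP + FP)
sensitivity <- TP / (TP + FN)          # true positive rate (recall)
specificity <- TN / (TN + FP)          # true negative rate
f1          <- 2 * precision * sensitivity / (precision + sensitivity)

round(c(accuracy = accuracy, error = error, precision = precision,
        sensitivity = sensitivity, specificity = specificity, f1 = f1), 4)
```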

Model 3

Run with a different seed for splitting the data into training and test sets

Creating a test and training set
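A sketch of the split step, assuming the cleaned data frame is called `data2` and that roughly 60% of the 8,161 rows go to training, as implied by the 4,897 training observations and 3,264 test predictions above (object names and the seed value are hypothetical):

```r
# Train/test split for Model 3 with a different seed
# (data2 is a hypothetical name for the prepared data; the seed is illustrative).
set.seed(456)
n_train <- round(0.60 * nrow(data2))                 # roughly 60% for training
idx     <- sample(seq_len(nrow(data2)), n_train)

train3 <- data2[idx, ]     # used to fit the Model 3 candidates
test3  <- data2[-idx, ]    # held out for evaluating predictions
```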

Backward Regression
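The manual procedure below drops the predictor with the highest p-value at each step. A related, automated alternative (not what is shown in the output) is base R's step(), which eliminates terms by AIC; a sketch assuming the full Model 3 fit is stored in `fit3_full` (name hypothetical):

```r
# Automated backward elimination by AIC, as an alternative to the manual
# p-value-based procedure shown below; fit3_full is a hypothetical name
# for the full Model 3 logistic regression.
fit3_back <- step(fit3_full, direction = "backward", trace = FALSE)
summary(fit3_back)
```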

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + MSTATUS + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + REVOKED + MVR_PTS + 
##     CAR_AGE + URBANICITY + Clerical + Doctor + `Home Maker` + 
##     Lawyer + Manager + Professional + `z_Blue Collar` + `Panel Truck` + 
##     Pickup + `Sports Car` + Van + homekids_age + income_homeval_educ + 
##     sex_redcar_suv + oldclaim_freq, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.0398  -0.8044  -0.7526   1.4979   1.8606  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -8.638e-01  2.030e-01  -4.254  2.1e-05 ***
## KIDSDRIV             6.414e-02  7.477e-02   0.858 0.390998    
## YOJ                  9.212e-03  9.523e-03   0.967 0.333395    
## MSTATUS              1.179e-02  6.766e-02   0.174 0.861699    
## TRAVTIME             2.469e-03  2.073e-03   1.191 0.233508    
## CAR_USE             -9.222e-02  1.016e-01  -0.907 0.364196    
## BLUEBOOK            -1.638e-06  4.921e-06  -0.333 0.739158    
## TIF                 -3.437e-03  7.936e-03  -0.433 0.664965    
## REVOKED              6.899e-02  1.083e-01   0.637 0.524248    
## MVR_PTS             -1.283e-02  1.604e-02  -0.800 0.423719    
## CAR_AGE             -4.876e-03  7.434e-03  -0.656 0.511868    
## URBANICITY          -1.015e-01  8.743e-02  -1.161 0.245775    
## Clerical            -4.687e-01  1.342e-01  -3.493 0.000477 ***
## Doctor              -3.298e-01  2.279e-01  -1.447 0.147837    
## `Home Maker`        -4.241e-01  1.595e-01  -2.659 0.007833 ** 
## Lawyer              -2.794e-01  1.538e-01  -1.816 0.069363 .  
## Manager             -1.584e-01  1.357e-01  -1.167 0.243126    
## Professional        -2.864e-01  1.290e-01  -2.221 0.026363 *  
## `z_Blue Collar`     -1.617e-01  1.194e-01  -1.355 0.175520    
## `Panel Truck`       -2.424e-03  1.653e-01  -0.015 0.988304    
## Pickup              -9.405e-02  1.061e-01  -0.886 0.375444    
## `Sports Car`         1.231e-01  1.095e-01   1.124 0.260881    
## Van                 -8.620e-02  1.341e-01  -0.643 0.520225    
## homekids_age        -1.729e-04  9.151e-04  -0.189 0.850109    
## income_homeval_educ  1.061e-07  8.569e-08   1.238 0.215830    
## sex_redcar_suv       4.173e-02  5.740e-02   0.727 0.467188    
## oldclaim_freq       -6.457e-07  4.297e-06  -0.150 0.880562    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5627.8  on 4870  degrees of freedom
## AIC: 5681.8
## 
## Number of Fisher Scoring iterations: 4

Panel Truck has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + MSTATUS + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + REVOKED + MVR_PTS + 
##     CAR_AGE + URBANICITY + Clerical + Doctor + `Home Maker` + 
##     Lawyer + Manager + Professional + `z_Blue Collar` + Pickup + 
##     `Sports Car` + Van + homekids_age + income_homeval_educ + 
##     sex_redcar_suv + oldclaim_freq, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.0397  -0.8044  -0.7527   1.4982   1.8610  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -8.638e-01  2.030e-01  -4.254  2.1e-05 ***
## KIDSDRIV             6.412e-02  7.476e-02   0.858 0.391064    
## YOJ                  9.208e-03  9.519e-03   0.967 0.333393    
## MSTATUS              1.181e-02  6.765e-02   0.175 0.861472    
## TRAVTIME             2.469e-03  2.072e-03   1.191 0.233537    
## CAR_USE             -9.281e-02  9.316e-02  -0.996 0.319143    
## BLUEBOOK            -1.666e-06  4.534e-06  -0.368 0.713241    
## TIF                 -3.437e-03  7.936e-03  -0.433 0.664917    
## REVOKED              6.901e-02  1.083e-01   0.637 0.524136    
## MVR_PTS             -1.283e-02  1.604e-02  -0.800 0.423781    
## CAR_AGE             -4.877e-03  7.434e-03  -0.656 0.511844    
## URBANICITY          -1.015e-01  8.739e-02  -1.162 0.245425    
## Clerical            -4.687e-01  1.342e-01  -3.494 0.000476 ***
## Doctor              -3.295e-01  2.267e-01  -1.453 0.146169    
## `Home Maker`        -4.241e-01  1.595e-01  -2.659 0.007834 ** 
## Lawyer              -2.792e-01  1.529e-01  -1.825 0.067971 .  
## Manager             -1.583e-01  1.355e-01  -1.169 0.242586    
## Professional        -2.863e-01  1.287e-01  -2.225 0.026115 *  
## `z_Blue Collar`     -1.613e-01  1.162e-01  -1.389 0.164984    
## Pickup              -9.353e-02  1.001e-01  -0.935 0.350028    
## `Sports Car`         1.233e-01  1.087e-01   1.134 0.256604    
## Van                 -8.543e-02  1.232e-01  -0.693 0.488075    
## homekids_age        -1.725e-04  9.146e-04  -0.189 0.850407    
## income_homeval_educ  1.060e-07  8.558e-08   1.239 0.215504    
## sex_redcar_suv       4.193e-02  5.575e-02   0.752 0.452006    
## oldclaim_freq       -6.458e-07  4.297e-06  -0.150 0.880546    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5627.8  on 4871  degrees of freedom
## AIC: 5679.8
## 
## Number of Fisher Scoring iterations: 4

oldclaim_freq has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + MSTATUS + 
##     TRAVTIME + CAR_USE + BLUEBOOK + TIF + REVOKED + MVR_PTS + 
##     CAR_AGE + URBANICITY + Clerical + Doctor + `Home Maker` + 
##     Lawyer + Manager + Professional + `z_Blue Collar` + Pickup + 
##     `Sports Car` + Van + homekids_age + income_homeval_educ + 
##     sex_redcar_suv, family = binomial(link = "logit"), data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.0366  -0.8041  -0.7526   1.4989   1.8615  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -8.637e-01  2.030e-01  -4.254  2.1e-05 ***
## KIDSDRIV             6.417e-02  7.476e-02   0.858 0.390727    
## YOJ                  9.184e-03  9.518e-03   0.965 0.334559    
## MSTATUS              1.208e-02  6.763e-02   0.179 0.858237    
## TRAVTIME             2.464e-03  2.072e-03   1.189 0.234468    
## CAR_USE             -9.295e-02  9.316e-02  -0.998 0.318395    
## BLUEBOOK            -1.656e-06  4.534e-06  -0.365 0.714981    
## TIF                 -3.426e-03  7.936e-03  -0.432 0.665960    
## REVOKED              6.226e-02  9.866e-02   0.631 0.527994    
## MVR_PTS             -1.342e-02  1.556e-02  -0.862 0.388550    
## CAR_AGE             -4.883e-03  7.434e-03  -0.657 0.511308    
## URBANICITY          -1.029e-01  8.690e-02  -1.184 0.236272    
## Clerical            -4.688e-01  1.342e-01  -3.494 0.000475 ***
## Doctor              -3.291e-01  2.267e-01  -1.452 0.146636    
## `Home Maker`        -4.241e-01  1.595e-01  -2.659 0.007831 ** 
## Lawyer              -2.784e-01  1.529e-01  -1.821 0.068566 .  
## Manager             -1.580e-01  1.355e-01  -1.166 0.243455    
## Professional        -2.857e-01  1.286e-01  -2.221 0.026353 *  
## `z_Blue Collar`     -1.611e-01  1.162e-01  -1.386 0.165597    
## Pickup              -9.317e-02  1.001e-01  -0.931 0.351737    
## `Sports Car`         1.228e-01  1.087e-01   1.130 0.258386    
## Van                 -8.564e-02  1.232e-01  -0.695 0.486980    
## homekids_age        -1.696e-04  9.144e-04  -0.185 0.852854    
## income_homeval_educ  1.062e-07  8.556e-08   1.241 0.214462    
## sex_redcar_suv       4.192e-02  5.575e-02   0.752 0.452118    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5627.8  on 4872  degrees of freedom
## AIC: 5677.8
## 
## Number of Fisher Scoring iterations: 4

MSTATUS has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + TRAVTIME + 
##     CAR_USE + BLUEBOOK + TIF + REVOKED + MVR_PTS + CAR_AGE + 
##     URBANICITY + Clerical + Doctor + `Home Maker` + Lawyer + 
##     Manager + Professional + `z_Blue Collar` + Pickup + `Sports Car` + 
##     Van + homekids_age + income_homeval_educ + sex_redcar_suv, 
##     family = binomial(link = "logit"), data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.0399  -0.8039  -0.7530   1.5001   1.8630  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -8.584e-01  2.008e-01  -4.274 1.92e-05 ***
## KIDSDRIV             6.457e-02  7.473e-02   0.864 0.387540    
## YOJ                  9.459e-03  9.395e-03   1.007 0.314017    
## TRAVTIME             2.466e-03  2.072e-03   1.190 0.234094    
## CAR_USE             -9.334e-02  9.313e-02  -1.002 0.316231    
## BLUEBOOK            -1.678e-06  4.532e-06  -0.370 0.711243    
## TIF                 -3.416e-03  7.936e-03  -0.430 0.666853    
## REVOKED              6.138e-02  9.854e-02   0.623 0.533344    
## MVR_PTS             -1.354e-02  1.554e-02  -0.871 0.383680    
## CAR_AGE             -4.946e-03  7.426e-03  -0.666 0.505391    
## URBANICITY          -1.027e-01  8.689e-02  -1.182 0.237226    
## Clerical            -4.693e-01  1.341e-01  -3.499 0.000467 ***
## Doctor              -3.315e-01  2.263e-01  -1.465 0.142871    
## `Home Maker`        -4.238e-01  1.595e-01  -2.657 0.007875 ** 
## Lawyer              -2.797e-01  1.527e-01  -1.832 0.066925 .  
## Manager             -1.589e-01  1.354e-01  -1.173 0.240641    
## Professional        -2.864e-01  1.286e-01  -2.227 0.025933 *  
## `z_Blue Collar`     -1.615e-01  1.161e-01  -1.391 0.164281    
## Pickup              -9.327e-02  1.001e-01  -0.932 0.351212    
## `Sports Car`         1.230e-01  1.087e-01   1.132 0.257587    
## Van                 -8.546e-02  1.232e-01  -0.694 0.487872    
## homekids_age        -1.683e-04  9.144e-04  -0.184 0.853980    
## income_homeval_educ  1.070e-07  8.545e-08   1.252 0.210408    
## sex_redcar_suv       4.182e-02  5.575e-02   0.750 0.453235    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5627.9  on 4873  degrees of freedom
## AIC: 5675.9
## 
## Number of Fisher Scoring iterations: 4

homekids_age has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + TRAVTIME + 
##     CAR_USE + BLUEBOOK + TIF + REVOKED + MVR_PTS + CAR_AGE + 
##     URBANICITY + Clerical + Doctor + `Home Maker` + Lawyer + 
##     Manager + Professional + `z_Blue Collar` + Pickup + `Sports Car` + 
##     Van + income_homeval_educ + sex_redcar_suv, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.0402  -0.8041  -0.7528   1.4982   1.8643  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -8.605e-01  2.005e-01  -4.292 1.77e-05 ***
## KIDSDRIV             5.716e-02  6.294e-02   0.908 0.363843    
## YOJ                  9.160e-03  9.251e-03   0.990 0.322093    
## TRAVTIME             2.475e-03  2.072e-03   1.195 0.232238    
## CAR_USE             -9.305e-02  9.312e-02  -0.999 0.317677    
## BLUEBOOK            -1.654e-06  4.530e-06  -0.365 0.714985    
## TIF                 -3.426e-03  7.936e-03  -0.432 0.665915    
## REVOKED              6.088e-02  9.850e-02   0.618 0.536536    
## MVR_PTS             -1.359e-02  1.554e-02  -0.874 0.381967    
## CAR_AGE             -4.897e-03  7.421e-03  -0.660 0.509320    
## URBANICITY          -1.026e-01  8.689e-02  -1.181 0.237576    
## Clerical            -4.683e-01  1.340e-01  -3.495 0.000475 ***
## Doctor              -3.290e-01  2.259e-01  -1.457 0.145214    
## `Home Maker`        -4.230e-01  1.594e-01  -2.653 0.007970 ** 
## Lawyer              -2.770e-01  1.520e-01  -1.823 0.068325 .  
## Manager             -1.567e-01  1.349e-01  -1.162 0.245354    
## Professional        -2.841e-01  1.280e-01  -2.220 0.026437 *  
## `z_Blue Collar`     -1.597e-01  1.157e-01  -1.380 0.167499    
## Pickup              -9.349e-02  1.000e-01  -0.935 0.350020    
## `Sports Car`         1.221e-01  1.085e-01   1.125 0.260651    
## Van                 -8.542e-02  1.232e-01  -0.693 0.488030    
## income_homeval_educ  1.079e-07  8.531e-08   1.265 0.205965    
## sex_redcar_suv       4.093e-02  5.554e-02   0.737 0.461190    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5627.9  on 4874  degrees of freedom
## AIC: 5673.9
## 
## Number of Fisher Scoring iterations: 4

BLUEBOOK has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + TRAVTIME + 
##     CAR_USE + TIF + REVOKED + MVR_PTS + CAR_AGE + URBANICITY + 
##     Clerical + Doctor + `Home Maker` + Lawyer + Manager + Professional + 
##     `z_Blue Collar` + Pickup + `Sports Car` + Van + income_homeval_educ + 
##     sex_redcar_suv, family = binomial(link = "logit"), data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.0455  -0.8039  -0.7527   1.5009   1.8699  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -8.816e-01  1.920e-01  -4.592 4.39e-06 ***
## KIDSDRIV             5.690e-02  6.294e-02   0.904  0.36594    
## YOJ                  8.939e-03  9.232e-03   0.968  0.33292    
## TRAVTIME             2.477e-03  2.072e-03   1.196  0.23187    
## CAR_USE             -1.006e-01  9.074e-02  -1.109  0.26743    
## TIF                 -3.414e-03  7.936e-03  -0.430  0.66703    
## REVOKED              6.134e-02  9.849e-02   0.623  0.53339    
## MVR_PTS             -1.339e-02  1.553e-02  -0.862  0.38852    
## CAR_AGE             -4.889e-03  7.422e-03  -0.659  0.51003    
## URBANICITY          -1.027e-01  8.689e-02  -1.182  0.23709    
## Clerical            -4.706e-01  1.339e-01  -3.515  0.00044 ***
## Doctor              -3.284e-01  2.258e-01  -1.454  0.14597    
## `Home Maker`        -4.251e-01  1.593e-01  -2.668  0.00762 ** 
## Lawyer              -2.779e-01  1.519e-01  -1.829  0.06737 .  
## Manager             -1.602e-01  1.345e-01  -1.191  0.23359    
## Professional        -2.886e-01  1.274e-01  -2.265  0.02349 *  
## `z_Blue Collar`     -1.596e-01  1.157e-01  -1.379  0.16784    
## Pickup              -8.491e-02  9.719e-02  -0.874  0.38235    
## `Sports Car`         1.281e-01  1.073e-01   1.194  0.23233    
## Van                 -8.668e-02  1.231e-01  -0.704  0.48137    
## income_homeval_educ  1.004e-07  8.279e-08   1.212  0.22544    
## sex_redcar_suv       4.424e-02  5.483e-02   0.807  0.41969    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5628.0  on 4875  degrees of freedom
## AIC: 5672
## 
## Number of Fisher Scoring iterations: 4

TIF has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + TRAVTIME + 
##     CAR_USE + REVOKED + MVR_PTS + CAR_AGE + URBANICITY + Clerical + 
##     Doctor + `Home Maker` + Lawyer + Manager + Professional + 
##     `z_Blue Collar` + Pickup + `Sports Car` + Van + income_homeval_educ + 
##     sex_redcar_suv, family = binomial(link = "logit"), data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.0442  -0.8039  -0.7533   1.5004   1.8658  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -9.008e-01  1.868e-01  -4.822 1.42e-06 ***
## KIDSDRIV             5.744e-02  6.293e-02   0.913  0.36137    
## YOJ                  8.923e-03  9.231e-03   0.967  0.33370    
## TRAVTIME             2.477e-03  2.071e-03   1.196  0.23172    
## CAR_USE             -1.006e-01  9.074e-02  -1.109  0.26761    
## REVOKED              6.258e-02  9.845e-02   0.636  0.52501    
## MVR_PTS             -1.309e-02  1.551e-02  -0.844  0.39891    
## CAR_AGE             -4.923e-03  7.421e-03  -0.663  0.50704    
## URBANICITY          -1.031e-01  8.688e-02  -1.187  0.23522    
## Clerical            -4.706e-01  1.339e-01  -3.515  0.00044 ***
## Doctor              -3.262e-01  2.258e-01  -1.445  0.14858    
## `Home Maker`        -4.229e-01  1.592e-01  -2.656  0.00790 ** 
## Lawyer              -2.767e-01  1.519e-01  -1.822  0.06853 .  
## Manager             -1.604e-01  1.345e-01  -1.193  0.23304    
## Professional        -2.878e-01  1.274e-01  -2.259  0.02389 *  
## `z_Blue Collar`     -1.589e-01  1.157e-01  -1.373  0.16960    
## Pickup              -8.465e-02  9.718e-02  -0.871  0.38374    
## `Sports Car`         1.281e-01  1.073e-01   1.194  0.23230    
## Van                 -8.716e-02  1.231e-01  -0.708  0.47892    
## income_homeval_educ  1.009e-07  8.279e-08   1.218  0.22315    
## sex_redcar_suv       4.428e-02  5.482e-02   0.808  0.41923    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5628.2  on 4876  degrees of freedom
## AIC: 5670.2
## 
## Number of Fisher Scoring iterations: 4

REVOKED has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + TRAVTIME + 
##     CAR_USE + MVR_PTS + CAR_AGE + URBANICITY + Clerical + Doctor + 
##     `Home Maker` + Lawyer + Manager + Professional + `z_Blue Collar` + 
##     Pickup + `Sports Car` + Van + income_homeval_educ + sex_redcar_suv, 
##     family = binomial(link = "logit"), data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.0302  -0.8039  -0.7536   1.5016   1.8617  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -8.968e-01  1.867e-01  -4.804 1.55e-06 ***
## KIDSDRIV             5.884e-02  6.288e-02   0.936 0.349464    
## YOJ                  8.880e-03  9.229e-03   0.962 0.335981    
## TRAVTIME             2.470e-03  2.071e-03   1.193 0.233001    
## CAR_USE             -1.000e-01  9.071e-02  -1.102 0.270260    
## MVR_PTS             -1.284e-02  1.551e-02  -0.828 0.407894    
## CAR_AGE             -4.888e-03  7.420e-03  -0.659 0.510034    
## URBANICITY          -9.744e-02  8.640e-02  -1.128 0.259423    
## Clerical            -4.719e-01  1.338e-01  -3.526 0.000422 ***
## Doctor              -3.285e-01  2.257e-01  -1.455 0.145553    
## `Home Maker`        -4.256e-01  1.591e-01  -2.674 0.007491 ** 
## Lawyer              -2.775e-01  1.519e-01  -1.827 0.067665 .  
## Manager             -1.634e-01  1.344e-01  -1.216 0.224106    
## Professional        -2.891e-01  1.274e-01  -2.270 0.023216 *  
## `z_Blue Collar`     -1.602e-01  1.157e-01  -1.385 0.166158    
## Pickup              -8.396e-02  9.717e-02  -0.864 0.387535    
## `Sports Car`         1.282e-01  1.073e-01   1.195 0.232108    
## Van                 -8.678e-02  1.231e-01  -0.705 0.480791    
## income_homeval_educ  9.851e-08  8.270e-08   1.191 0.233618    
## sex_redcar_suv       4.572e-02  5.477e-02   0.835 0.403864    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5628.6  on 4877  degrees of freedom
## AIC: 5668.6
## 
## Number of Fisher Scoring iterations: 4

CAR_AGE has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + TRAVTIME + 
##     CAR_USE + MVR_PTS + URBANICITY + Clerical + Doctor + `Home Maker` + 
##     Lawyer + Manager + Professional + `z_Blue Collar` + Pickup + 
##     `Sports Car` + Van + income_homeval_educ + sex_redcar_suv, 
##     family = binomial(link = "logit"), data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.0333  -0.8043  -0.7537   1.5048   1.8667  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -9.231e-01  1.824e-01  -5.060  4.2e-07 ***
## KIDSDRIV             5.844e-02  6.288e-02   0.929 0.352690    
## YOJ                  8.844e-03  9.230e-03   0.958 0.338016    
## TRAVTIME             2.461e-03  2.071e-03   1.188 0.234653    
## CAR_USE             -1.020e-01  9.069e-02  -1.125 0.260702    
## MVR_PTS             -1.299e-02  1.551e-02  -0.837 0.402427    
## URBANICITY          -9.952e-02  8.635e-02  -1.153 0.249107    
## Clerical            -4.643e-01  1.334e-01  -3.481 0.000499 ***
## Doctor              -3.315e-01  2.256e-01  -1.469 0.141796    
## `Home Maker`        -4.291e-01  1.591e-01  -2.698 0.006974 ** 
## Lawyer              -2.943e-01  1.497e-01  -1.966 0.049341 *  
## Manager             -1.688e-01  1.341e-01  -1.258 0.208331    
## Professional        -2.919e-01  1.273e-01  -2.293 0.021851 *  
## `z_Blue Collar`     -1.532e-01  1.152e-01  -1.329 0.183733    
## Pickup              -8.215e-02  9.713e-02  -0.846 0.397662    
## `Sports Car`         1.293e-01  1.073e-01   1.206 0.227912    
## Van                 -8.559e-02  1.231e-01  -0.695 0.486781    
## income_homeval_educ  7.887e-08  7.733e-08   1.020 0.307744    
## sex_redcar_suv       4.606e-02  5.477e-02   0.841 0.400395    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5629.1  on 4878  degrees of freedom
## AIC: 5667.1
## 
## Number of Fisher Scoring iterations: 4

Van has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + TRAVTIME + 
##     CAR_USE + MVR_PTS + URBANICITY + Clerical + Doctor + `Home Maker` + 
##     Lawyer + Manager + Professional + `z_Blue Collar` + Pickup + 
##     `Sports Car` + income_homeval_educ + sex_redcar_suv, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.0345  -0.8032  -0.7546   1.5055   1.8724  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -9.391e-01  1.811e-01  -5.187 2.14e-07 ***
## KIDSDRIV             5.896e-02  6.287e-02   0.938 0.348302    
## YOJ                  8.796e-03  9.231e-03   0.953 0.340678    
## TRAVTIME             2.459e-03  2.071e-03   1.187 0.235112    
## CAR_USE             -1.114e-01  8.967e-02  -1.242 0.214111    
## MVR_PTS             -1.311e-02  1.551e-02  -0.845 0.397869    
## URBANICITY          -9.971e-02  8.635e-02  -1.155 0.248176    
## Clerical            -4.670e-01  1.333e-01  -3.503 0.000459 ***
## Doctor              -3.346e-01  2.256e-01  -1.483 0.138030    
## `Home Maker`        -4.302e-01  1.590e-01  -2.705 0.006839 ** 
## Lawyer              -2.962e-01  1.497e-01  -1.979 0.047795 *  
## Manager             -1.708e-01  1.341e-01  -1.273 0.202877    
## Professional        -2.948e-01  1.272e-01  -2.318 0.020463 *  
## `z_Blue Collar`     -1.507e-01  1.152e-01  -1.309 0.190566    
## Pickup              -6.522e-02  9.409e-02  -0.693 0.488212    
## `Sports Car`         1.404e-01  1.061e-01   1.323 0.185698    
## income_homeval_educ  7.787e-08  7.735e-08   1.007 0.314084    
## sex_redcar_suv       5.507e-02  5.329e-02   1.033 0.301440    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5629.6  on 4879  degrees of freedom
## AIC: 5665.6
## 
## Number of Fisher Scoring iterations: 4

Pickup has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + TRAVTIME + 
##     CAR_USE + MVR_PTS + URBANICITY + Clerical + Doctor + `Home Maker` + 
##     Lawyer + Manager + Professional + `z_Blue Collar` + `Sports Car` + 
##     income_homeval_educ + sex_redcar_suv, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.0354  -0.8034  -0.7548   1.5075   1.8581  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -9.611e-01  1.783e-01  -5.390 7.04e-08 ***
## KIDSDRIV             5.883e-02  6.286e-02   0.936 0.349373    
## YOJ                  8.808e-03  9.230e-03   0.954 0.339953    
## TRAVTIME             2.487e-03  2.070e-03   1.201 0.229593    
## CAR_USE             -1.210e-01  8.853e-02  -1.367 0.171539    
## MVR_PTS             -1.340e-02  1.550e-02  -0.864 0.387392    
## URBANICITY          -9.961e-02  8.633e-02  -1.154 0.248581    
## Clerical            -4.695e-01  1.332e-01  -3.525 0.000424 ***
## Doctor              -3.405e-01  2.255e-01  -1.510 0.131051    
## `Home Maker`        -4.319e-01  1.590e-01  -2.717 0.006590 ** 
## Lawyer              -2.989e-01  1.496e-01  -1.998 0.045739 *  
## Manager             -1.730e-01  1.340e-01  -1.291 0.196795    
## Professional        -2.944e-01  1.272e-01  -2.315 0.020613 *  
## `z_Blue Collar`     -1.471e-01  1.150e-01  -1.279 0.200832    
## `Sports Car`         1.529e-01  1.046e-01   1.461 0.143950    
## income_homeval_educ  8.363e-08  7.686e-08   1.088 0.276552    
## sex_redcar_suv       6.356e-02  5.195e-02   1.224 0.221103    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5630.0  on 4880  degrees of freedom
## AIC: 5664
## 
## Number of Fisher Scoring iterations: 4

MVR_PTS has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ KIDSDRIV + YOJ + TRAVTIME + 
##     CAR_USE + URBANICITY + Clerical + Doctor + `Home Maker` + 
##     Lawyer + Manager + Professional + `z_Blue Collar` + `Sports Car` + 
##     income_homeval_educ + sex_redcar_suv, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.0281  -0.8033  -0.7559   1.5076   1.8606  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -9.733e-01  1.777e-01  -5.476 4.34e-08 ***
## KIDSDRIV             5.584e-02  6.275e-02   0.890 0.373517    
## YOJ                  9.091e-03  9.221e-03   0.986 0.324207    
## TRAVTIME             2.445e-03  2.070e-03   1.181 0.237452    
## CAR_USE             -1.256e-01  8.836e-02  -1.422 0.155097    
## URBANICITY          -1.131e-01  8.492e-02  -1.332 0.182828    
## Clerical            -4.693e-01  1.332e-01  -3.524 0.000426 ***
## Doctor              -3.387e-01  2.254e-01  -1.502 0.132972    
## `Home Maker`        -4.288e-01  1.589e-01  -2.698 0.006973 ** 
## Lawyer              -2.984e-01  1.496e-01  -1.995 0.046026 *  
## Manager             -1.669e-01  1.338e-01  -1.247 0.212345    
## Professional        -2.946e-01  1.272e-01  -2.316 0.020546 *  
## `z_Blue Collar`     -1.447e-01  1.150e-01  -1.258 0.208317    
## `Sports Car`         1.479e-01  1.044e-01   1.416 0.156830    
## income_homeval_educ  8.640e-08  7.676e-08   1.126 0.260344    
## sex_redcar_suv       6.203e-02  5.191e-02   1.195 0.232076    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5630.8  on 4881  degrees of freedom
## AIC: 5662.8
## 
## Number of Fisher Scoring iterations: 4

KIDSDRIV has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ YOJ + TRAVTIME + CAR_USE + 
##     URBANICITY + Clerical + Doctor + `Home Maker` + Lawyer + 
##     Manager + Professional + `z_Blue Collar` + `Sports Car` + 
##     income_homeval_educ + sex_redcar_suv, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.0344  -0.8035  -0.7564   1.5106   1.8570  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -9.655e-01  1.775e-01  -5.438 5.38e-08 ***
## YOJ                  9.535e-03  9.211e-03   1.035 0.300619    
## TRAVTIME             2.454e-03  2.070e-03   1.186 0.235778    
## CAR_USE             -1.278e-01  8.831e-02  -1.447 0.147830    
## URBANICITY          -1.141e-01  8.491e-02  -1.344 0.179096    
## Clerical            -4.700e-01  1.332e-01  -3.530 0.000416 ***
## Doctor              -3.480e-01  2.251e-01  -1.546 0.122167    
## `Home Maker`        -4.303e-01  1.589e-01  -2.708 0.006776 ** 
## Lawyer              -3.040e-01  1.494e-01  -2.035 0.041890 *  
## Manager             -1.697e-01  1.338e-01  -1.269 0.204604    
## Professional        -2.992e-01  1.270e-01  -2.355 0.018520 *  
## `z_Blue Collar`     -1.438e-01  1.150e-01  -1.251 0.210998    
## `Sports Car`         1.464e-01  1.044e-01   1.402 0.160977    
## income_homeval_educ  8.546e-08  7.674e-08   1.114 0.265452    
## sex_redcar_suv       6.307e-02  5.189e-02   1.216 0.224163    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5631.6  on 4882  degrees of freedom
## AIC: 5661.6
## 
## Number of Fisher Scoring iterations: 4

YOJ has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ TRAVTIME + CAR_USE + URBANICITY + 
##     Clerical + Doctor + `Home Maker` + Lawyer + Manager + Professional + 
##     `z_Blue Collar` + `Sports Car` + income_homeval_educ + sex_redcar_suv, 
##     family = binomial(link = "logit"), data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -0.9980  -0.8023  -0.7562   1.5101   1.8467  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -8.943e-01  1.633e-01  -5.476 4.36e-08 ***
## TRAVTIME             2.483e-03  2.070e-03   1.200  0.23028    
## CAR_USE             -1.289e-01  8.828e-02  -1.460  0.14440    
## URBANICITY          -1.109e-01  8.483e-02  -1.308  0.19099    
## Clerical            -4.352e-01  1.288e-01  -3.378  0.00073 ***
## Doctor              -3.364e-01  2.249e-01  -1.496  0.13465    
## `Home Maker`        -4.472e-01  1.580e-01  -2.830  0.00466 ** 
## Lawyer              -2.835e-01  1.481e-01  -1.913  0.05569 .  
## Manager             -1.469e-01  1.320e-01  -1.113  0.26585    
## Professional        -2.718e-01  1.243e-01  -2.187  0.02875 *  
## `z_Blue Collar`     -1.114e-01  1.106e-01  -1.007  0.31410    
## `Sports Car`         1.439e-01  1.044e-01   1.378  0.16817    
## income_homeval_educ  9.911e-08  7.557e-08   1.311  0.18970    
## sex_redcar_suv       6.181e-02  5.186e-02   1.192  0.23328    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5632.6  on 4883  degrees of freedom
## AIC: 5660.6
## 
## Number of Fisher Scoring iterations: 4

z_Blue Collar has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ TRAVTIME + CAR_USE + URBANICITY + 
##     Clerical + Doctor + `Home Maker` + Lawyer + Manager + Professional + 
##     `Sports Car` + income_homeval_educ + sex_redcar_suv, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -0.9725  -0.8030  -0.7581   1.5168   1.8513  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -9.599e-01  1.499e-01  -6.402 1.53e-10 ***
## TRAVTIME             2.463e-03  2.070e-03   1.190 0.233999    
## CAR_USE             -1.410e-01  8.749e-02  -1.612 0.107042    
## URBANICITY          -1.098e-01  8.478e-02  -1.295 0.195222    
## Clerical            -3.683e-01  1.108e-01  -3.325 0.000884 ***
## Doctor              -3.038e-01  2.229e-01  -1.363 0.172853    
## `Home Maker`        -3.832e-01  1.450e-01  -2.643 0.008211 ** 
## Lawyer              -2.373e-01  1.412e-01  -1.681 0.092834 .  
## Manager             -9.445e-02  1.217e-01  -0.776 0.437707    
## Professional        -2.144e-01  1.108e-01  -1.935 0.053015 .  
## `Sports Car`         1.432e-01  1.043e-01   1.373 0.169883    
## income_homeval_educ  1.193e-07  7.290e-08   1.637 0.101648    
## sex_redcar_suv       5.969e-02  5.179e-02   1.152 0.249118    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5633.7  on 4884  degrees of freedom
## AIC: 5659.7
## 
## Number of Fisher Scoring iterations: 4

sex_redcar_suv is not significant (p = 0.25) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ TRAVTIME + CAR_USE + URBANICITY + 
##     Clerical + Doctor + `Home Maker` + Lawyer + Manager + Professional + 
##     `Sports Car` + income_homeval_educ, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -0.9737  -0.8025  -0.7608   1.5170   1.8408  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -8.766e-01  1.311e-01  -6.688 2.26e-11 ***
## TRAVTIME             2.452e-03  2.070e-03   1.185 0.236126    
## CAR_USE             -1.633e-01  8.524e-02  -1.916 0.055408 .  
## URBANICITY          -1.116e-01  8.476e-02  -1.317 0.187832    
## Clerical            -3.733e-01  1.106e-01  -3.375 0.000737 ***
## Doctor              -3.054e-01  2.227e-01  -1.371 0.170311    
## `Home Maker`        -3.686e-01  1.444e-01  -2.553 0.010689 *  
## Lawyer              -2.434e-01  1.410e-01  -1.726 0.084261 .  
## Manager             -1.001e-01  1.215e-01  -0.824 0.409991    
## Professional        -2.215e-01  1.106e-01  -2.004 0.045102 *  
## `Sports Car`         1.289e-01  1.036e-01   1.244 0.213418    
## income_homeval_educ  1.145e-07  7.274e-08   1.574 0.115375    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5635.0  on 4885  degrees of freedom
## AIC: 5659
## 
## Number of Fisher Scoring iterations: 4

Manager has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ TRAVTIME + CAR_USE + URBANICITY + 
##     Clerical + Doctor + `Home Maker` + Lawyer + Professional + 
##     `Sports Car` + income_homeval_educ, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -0.9705  -0.8024  -0.7616   1.5177   1.8334  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -9.009e-01  1.278e-01  -7.047 1.83e-12 ***
## TRAVTIME             2.525e-03  2.068e-03   1.221  0.22206    
## CAR_USE             -1.366e-01  7.874e-02  -1.734  0.08286 .  
## URBANICITY          -1.219e-01  8.387e-02  -1.453  0.14609    
## Clerical            -3.469e-01  1.059e-01  -3.276  0.00105 ** 
## Doctor              -2.422e-01  2.092e-01  -1.158  0.24687    
## `Home Maker`        -3.373e-01  1.394e-01  -2.420  0.01552 *  
## Lawyer              -1.931e-01  1.272e-01  -1.518  0.12893    
## Professional        -1.862e-01  1.019e-01  -1.827  0.06776 .  
## `Sports Car`         1.300e-01  1.036e-01   1.255  0.20942    
## income_homeval_educ  9.662e-08  6.962e-08   1.388  0.16519    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5635.7  on 4886  degrees of freedom
## AIC: 5657.7
## 
## Number of Fisher Scoring iterations: 4

Doctor has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ TRAVTIME + CAR_USE + URBANICITY + 
##     Clerical + `Home Maker` + Lawyer + Professional + `Sports Car` + 
##     income_homeval_educ, family = binomial(link = "logit"), data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -0.9652  -0.8035  -0.7636   1.5279   1.8248  
## 
## Coefficients:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)         -9.079e-01  1.278e-01  -7.105 1.21e-12 ***
## TRAVTIME             2.539e-03  2.068e-03   1.228  0.21945    
## CAR_USE             -1.133e-01  7.622e-02  -1.487  0.13711    
## URBANICITY          -1.226e-01  8.387e-02  -1.462  0.14366    
## Clerical            -3.384e-01  1.057e-01  -3.202  0.00137 ** 
## `Home Maker`        -3.223e-01  1.388e-01  -2.321  0.02027 *  
## Lawyer              -1.509e-01  1.220e-01  -1.237  0.21623    
## Professional        -1.633e-01  1.000e-01  -1.632  0.10270    
## `Sports Car`         1.294e-01  1.036e-01   1.250  0.21145    
## income_homeval_educ  6.385e-08  6.401e-08   0.998  0.31850    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5637.0  on 4887  degrees of freedom
## AIC: 5657
## 
## Number of Fisher Scoring iterations: 4

income_homeval_educ has the highest p-value (the least significant effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ TRAVTIME + CAR_USE + URBANICITY + 
##     Clerical + `Home Maker` + Lawyer + Professional + `Sports Car`, 
##     family = binomial(link = "logit"), data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -0.9741  -0.8023  -0.7656   1.5279   1.8242  
## 
## Coefficients:
##               Estimate Std. Error z value Pr(>|z|)    
## (Intercept)  -0.874039   0.123063  -7.102 1.23e-12 ***
## TRAVTIME      0.002489   0.002066   1.204 0.228442    
## CAR_USE      -0.115902   0.076171  -1.522 0.128106    
## URBANICITY   -0.110154   0.082884  -1.329 0.183846    
## Clerical     -0.366179   0.101823  -3.596 0.000323 ***
## `Home Maker` -0.342347   0.137301  -2.493 0.012653 *  
## Lawyer       -0.126110   0.119493  -1.055 0.291256    
## Professional -0.162724   0.100042  -1.627 0.103830    
## `Sports Car`  0.123735   0.103394   1.197 0.231410    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5638.0  on 4888  degrees of freedom
## AIC: 5656
## 
## Number of Fisher Scoring iterations: 4

Lawyer has the highest p-value (the weakest evidence of an effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ TRAVTIME + CAR_USE + URBANICITY + 
##     Clerical + `Home Maker` + Professional + `Sports Car`, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -0.9632  -0.8047  -0.7658   1.5350   1.8145  
## 
## Coefficients:
##               Estimate Std. Error z value Pr(>|z|)    
## (Intercept)  -0.904444   0.119743  -7.553 4.25e-14 ***
## TRAVTIME      0.002484   0.002066   1.202 0.229291    
## CAR_USE      -0.087183   0.071255  -1.224 0.221131    
## URBANICITY   -0.114165   0.082798  -1.379 0.167944    
## Clerical     -0.338605   0.098482  -3.438 0.000585 ***
## `Home Maker` -0.311776   0.134289  -2.322 0.020250 *  
## Professional -0.136250   0.096903  -1.406 0.159713    
## `Sports Car`  0.126405   0.103357   1.223 0.221330    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5639.1  on 4889  degrees of freedom
## AIC: 5655.1
## 
## Number of Fisher Scoring iterations: 4

TRAVTIME has the highest p-value (the weakest evidence of an effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ CAR_USE + URBANICITY + Clerical + 
##     `Home Maker` + Professional + `Sports Car`, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -0.9050  -0.8118  -0.7665   1.5346   1.7838  
## 
## Coefficients:
##              Estimate Std. Error z value Pr(>|z|)    
## (Intercept)  -0.80917    0.08947  -9.044  < 2e-16 ***
## CAR_USE      -0.08358    0.07118  -1.174 0.240259    
## URBANICITY   -0.13172    0.08148  -1.617 0.105957    
## Clerical     -0.33864    0.09847  -3.439 0.000584 ***
## `Home Maker` -0.30812    0.13423  -2.295 0.021711 *  
## Professional -0.13365    0.09687  -1.380 0.167657    
## `Sports Car`  0.12820    0.10333   1.241 0.214694    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5640.6  on 4890  degrees of freedom
## AIC: 5654.6
## 
## Number of Fisher Scoring iterations: 4

CAR_USE has the highest p-value (the weakest evidence of an effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ URBANICITY + Clerical + `Home Maker` + 
##     Professional + `Sports Car`, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -0.8927  -0.7980  -0.7595   1.5474   1.7528  
## 
## Coefficients:
##              Estimate Std. Error z value Pr(>|z|)    
## (Intercept)  -0.85766    0.07949 -10.790  < 2e-16 ***
## URBANICITY   -0.12330    0.08113  -1.520  0.12858    
## Clerical     -0.31278    0.09596  -3.259  0.00112 ** 
## `Home Maker` -0.27429    0.13112  -2.092  0.03646 *  
## Professional -0.11468    0.09550  -1.201  0.22982    
## `Sports Car`  0.14329    0.10254   1.397  0.16229    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5642.0  on 4891  degrees of freedom
## AIC: 5654
## 
## Number of Fisher Scoring iterations: 4

Professional has the highest p-value (the weakest evidence of an effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ URBANICITY + Clerical + `Home Maker` + 
##     `Sports Car`, family = binomial(link = "logit"), data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -0.8849  -0.7909  -0.7909   1.5564   1.7526  
## 
## Coefficients:
##              Estimate Std. Error z value Pr(>|z|)    
## (Intercept)  -0.87977    0.07740 -11.367  < 2e-16 ***
## URBANICITY   -0.12210    0.08111  -1.505  0.13226    
## Clerical     -0.29154    0.09438  -3.089  0.00201 ** 
## `Home Maker` -0.25318    0.12999  -1.948  0.05145 .  
## `Sports Car`  0.14427    0.10252   1.407  0.15937    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5643.4  on 4892  degrees of freedom
## AIC: 5653.4
## 
## Number of Fisher Scoring iterations: 4

Sports Car has the highest p-value (the weakest evidence of an effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ URBANICITY + Clerical + `Home Maker`, 
##     family = binomial(link = "logit"), data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -0.8378  -0.7957  -0.7957   1.5604   1.7443  
## 
## Coefficients:
##              Estimate Std. Error z value Pr(>|z|)    
## (Intercept)  -0.86640    0.07678 -11.285  < 2e-16 ***
## URBANICITY   -0.12129    0.08110  -1.496  0.13476    
## Clerical     -0.28720    0.09431  -3.045  0.00232 ** 
## `Home Maker` -0.23226    0.12906  -1.800  0.07193 .  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5645.4  on 4893  degrees of freedom
## AIC: 5653.4
## 
## Number of Fisher Scoring iterations: 4

URBANICITY has the highest p-value (the weakest evidence of an effect on the target) and will be removed next.

## 
## Call:
## glm(formula = train1$TARGET_FLAG ~ Clerical + `Home Maker`, family = binomial(link = "logit"), 
##     data = train3)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -0.8025  -0.8025  -0.8025   1.6061   1.7242  
## 
## Coefficients:
##              Estimate Std. Error z value Pr(>|z|)    
## (Intercept)  -0.96776    0.03660 -26.440  < 2e-16 ***
## Clerical     -0.26228    0.09273  -2.828  0.00468 ** 
## `Home Maker` -0.20824    0.12798  -1.627  0.10372    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 5657.6  on 4896  degrees of freedom
## Residual deviance: 5647.6  on 4894  degrees of freedom
## AIC: 5653.6
## 
## Number of Fisher Scoring iterations: 4

Home Maker has the highest p-value (the weakest evidence of an effect on the target) and will be removed next.

## (Intercept)    Clerical 
## -0.19144339 -0.04747191

The only variable that remains, Clerical, is associated with a lower likelihood of having a claim. A model containing a single job-category indicator seems very unlikely to be useful for predicting whether a customer will get into an accident.
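
Each elimination step above follows the same mechanics. Below is a minimal sketch of one such step, assuming a fitted glm object named fit (the object name is illustrative, not taken from the report's code); the same idea applies to the linear models later in the report.

pvals <- summary(fit)$coefficients[-1, "Pr(>|z|)"]         # p-values, intercept excluded
worst <- names(which.max(pvals))                           # term with the highest p-value
fit   <- update(fit, as.formula(paste(". ~ . -", worst)))  # refit without that term
summary(fit)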

Prediction from Model 3

## [1] 0.23
## [1] 0.4842399

The cutoff associated with the point on the ROC curve farthest from the diagonal is 0.23, so I will use 0.23 as the cutoff for making predictions. A predicted probability above 0.23 will be assigned a target of one, and a value at or below 0.23 will be assigned a target of zero. The area under the curve is 0.48.
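
A sketch of how the cutoff and area under the curve could be obtained, assuming the pROC package and a vector of fitted probabilities named pred_prob (these names are assumptions, not the ones used to produce the output above):

library(pROC)
roc_obj <- roc(train1$TARGET_FLAG, pred_prob)
coords(roc_obj, "best", ret = "threshold")    # cutoff at the point farthest from the diagonal
auc(roc_obj)                                  # area under the ROC curve
pred_class <- ifelse(pred_prob > 0.23, 1, 0)  # apply the 0.23 cutoff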

Confusion Matrix
##     true
## pred    0    1
##    0  337  147
##    1 2070  710

The model predicted 337 0's that were actually 0 and 147 0's that were actually 1.
The model predicted 2,070 1's that were actually 0 and 710 1's that were actually 1.

##    accuracy     error precision sensitivity specificity        f1
## 1 0.3207721 0.6792279 0.1165687   0.1400083   0.8284714 0.1272178
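
For reference, a minimal sketch of how the confusion matrix and several of these metrics can be computed, treating 0 as the class of interest as in the text (the object name pred_class is assumed); precision and F1 follow the same pattern.

cm <- table(pred = pred_class, true = train1$TARGET_FLAG)
accuracy    <- sum(diag(cm)) / sum(cm)        # (337 + 710) / 3264
error       <- 1 - accuracy
sensitivity <- cm["0", "0"] / sum(cm[, "0"])  # correct 0's / all true 0's
specificity <- cm["1", "1"] / sum(cm[, "1"])  # correct 1's / all true 1's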

Building a Multiple Linear Regression Model to Predict the Cost of a Claim

I will start with logit model 2, since it has the highest accuracy and the highest area under the ROC curve. I will use model 2's variables to build a multiple linear regression model to predict the claim amount. A customer who is predicted not to have a claim (TARGET_FLAG = 0) will be assigned a TARGET_AMT of zero dollars.
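
A minimal sketch of the starting point, assuming train2a holds TARGET_AMT together with the predictors retained from logit model 2 (as the Call below suggests):

amt_fit <- lm(TARGET_AMT ~ ., data = train2a)  # full model before backward elimination
summary(amt_fit)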

## 
## Call:
## lm(formula = train2a$TARGET_AMT ~ ., data = train2a)
## 
## Residuals:
##    Min     1Q Median     3Q    Max 
##  -5217  -1713   -726    397  76398 
## 
## Coefficients:
##                 Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   -3.236e+02  3.101e+02  -1.043 0.296869    
## KIDSDRIV       3.129e+02  1.272e+02   2.459 0.013955 *  
## INCOME        -5.603e-03  1.759e-03  -3.186 0.001453 ** 
## PARENT1        9.679e+02  2.167e+02   4.467 8.11e-06 ***
## MSTATUS       -5.577e+02  1.464e+02  -3.810 0.000141 ***
## EDUCATION     -1.418e+02  6.641e+01  -2.135 0.032828 *  
## TRAVTIME       1.363e+01  3.946e+00   3.454 0.000558 ***
## CAR_USE        7.930e+02  1.564e+02   5.070 4.13e-07 ***
## BLUEBOOK       7.132e-03  9.537e-03   0.748 0.454566    
## TIF           -4.742e+01  1.493e+01  -3.175 0.001506 ** 
## CLM_FREQ       8.311e+01  6.040e+01   1.376 0.168879    
## REVOKED        5.237e+02  1.924e+02   2.722 0.006510 ** 
## MVR_PTS        1.307e+02  3.252e+01   4.020 5.90e-05 ***
## URBANICITY     1.854e+03  1.688e+02  10.981  < 2e-16 ***
## Manager       -9.728e+02  2.040e+02  -4.768 1.92e-06 ***
## Professional  -3.946e+02  1.857e+02  -2.125 0.033635 *  
## `Panel Truck`  7.261e+02  3.045e+02   2.385 0.017138 *  
## Pickup         3.023e+02  2.015e+02   1.500 0.133597    
## `Sports Car`   5.841e+02  2.293e+02   2.548 0.010866 *  
## Van            1.570e+02  2.491e+02   0.630 0.528550    
## z_SUV          4.965e+02  1.711e+02   2.901 0.003735 ** 
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 4322 on 4876 degrees of freedom
## Multiple R-squared:  0.08062,    Adjusted R-squared:  0.07685 
## F-statistic: 21.38 on 20 and 4876 DF,  p-value: < 2.2e-16

Van has the highest p-value (the weakest effect on claim amount) and will be removed.

## 
## Call:
## lm(formula = train2a$TARGET_AMT ~ KIDSDRIV + INCOME + PARENT1 + 
##     MSTATUS + EDUCATION + TRAVTIME + CAR_USE + BLUEBOOK + TIF + 
##     CLM_FREQ + REVOKED + MVR_PTS + URBANICITY + Manager + Professional + 
##     `Panel Truck` + Pickup + `Sports Car` + z_SUV, data = train2a)
## 
## Residuals:
##    Min     1Q Median     3Q    Max 
##  -5206  -1714   -718    385  76344 
## 
## Coefficients:
##                 Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   -3.111e+02  3.095e+02  -1.005 0.314846    
## KIDSDRIV       3.132e+02  1.272e+02   2.462 0.013840 *  
## INCOME        -5.582e-03  1.758e-03  -3.175 0.001510 ** 
## PARENT1        9.623e+02  2.165e+02   4.445 8.98e-06 ***
## MSTATUS       -5.590e+02  1.464e+02  -3.820 0.000135 ***
## EDUCATION     -1.410e+02  6.640e+01  -2.124 0.033744 *  
## TRAVTIME       1.363e+01  3.945e+00   3.454 0.000556 ***
## CAR_USE        8.174e+02  1.515e+02   5.394 7.21e-08 ***
## BLUEBOOK       7.952e-03  9.447e-03   0.842 0.399953    
## TIF           -4.721e+01  1.493e+01  -3.162 0.001575 ** 
## CLM_FREQ       8.377e+01  6.039e+01   1.387 0.165428    
## REVOKED        5.264e+02  1.923e+02   2.737 0.006226 ** 
## MVR_PTS        1.313e+02  3.251e+01   4.038 5.48e-05 ***
## URBANICITY     1.854e+03  1.688e+02  10.983  < 2e-16 ***
## Manager       -9.688e+02  2.039e+02  -4.751 2.08e-06 ***
## Professional  -3.897e+02  1.855e+02  -2.101 0.035729 *  
## `Panel Truck`  6.577e+02  2.845e+02   2.312 0.020827 *  
## Pickup         2.591e+02  1.894e+02   1.367 0.171542    
## `Sports Car`   5.522e+02  2.236e+02   2.470 0.013548 *  
## z_SUV          4.642e+02  1.633e+02   2.843 0.004490 ** 
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 4321 on 4877 degrees of freedom
## Multiple R-squared:  0.08054,    Adjusted R-squared:  0.07696 
## F-statistic: 22.49 on 19 and 4877 DF,  p-value: < 2.2e-16

BLUEBOOK has the highest p-value (the weakest effect on claim amount) and will be removed.

## 
## Call:
## lm(formula = train2a$TARGET_AMT ~ KIDSDRIV + INCOME + PARENT1 + 
##     MSTATUS + EDUCATION + TRAVTIME + CAR_USE + TIF + CLM_FREQ + 
##     REVOKED + MVR_PTS + URBANICITY + Manager + Professional + 
##     `Panel Truck` + Pickup + `Sports Car` + z_SUV, data = train2a)
## 
## Residuals:
##    Min     1Q Median     3Q    Max 
##  -5209  -1719   -723    382  76375 
## 
## Coefficients:
##                 Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   -2.084e+02  2.844e+02  -0.733 0.463712    
## KIDSDRIV       3.160e+02  1.272e+02   2.485 0.012982 *  
## INCOME        -5.215e-03  1.703e-03  -3.062 0.002214 ** 
## PARENT1        9.617e+02  2.165e+02   4.442 9.09e-06 ***
## MSTATUS       -5.577e+02  1.463e+02  -3.811 0.000140 ***
## EDUCATION     -1.380e+02  6.630e+01  -2.082 0.037437 *  
## TRAVTIME       1.363e+01  3.945e+00   3.456 0.000553 ***
## CAR_USE        8.257e+02  1.512e+02   5.461 4.97e-08 ***
## TIF           -4.727e+01  1.493e+01  -3.167 0.001552 ** 
## CLM_FREQ       8.231e+01  6.036e+01   1.364 0.172746    
## REVOKED        5.261e+02  1.923e+02   2.735 0.006253 ** 
## MVR_PTS        1.315e+02  3.251e+01   4.044 5.33e-05 ***
## URBANICITY     1.854e+03  1.688e+02  10.987  < 2e-16 ***
## Manager       -9.666e+02  2.039e+02  -4.740 2.19e-06 ***
## Professional  -3.834e+02  1.854e+02  -2.068 0.038688 *  
## `Panel Truck`  7.375e+02  2.683e+02   2.749 0.005997 ** 
## Pickup         2.282e+02  1.859e+02   1.228 0.219566    
## `Sports Car`   5.206e+02  2.204e+02   2.362 0.018217 *  
## z_SUV          4.330e+02  1.590e+02   2.723 0.006495 ** 
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 4321 on 4878 degrees of freedom
## Multiple R-squared:  0.08041,    Adjusted R-squared:  0.07702 
## F-statistic:  23.7 on 18 and 4878 DF,  p-value: < 2.2e-16

Pickup has the highest p-value (the weakest effect on claim amount) and will be removed.

## 
## Call:
## lm(formula = train2a$TARGET_AMT ~ KIDSDRIV + INCOME + PARENT1 + 
##     MSTATUS + EDUCATION + TRAVTIME + CAR_USE + TIF + CLM_FREQ + 
##     REVOKED + MVR_PTS + URBANICITY + Manager + Professional + 
##     `Panel Truck` + `Sports Car` + z_SUV, data = train2a)
## 
## Residuals:
##    Min     1Q Median     3Q    Max 
##  -5208  -1715   -727    372  76258 
## 
## Coefficients:
##                 Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   -1.410e+02  2.791e+02  -0.505 0.613393    
## KIDSDRIV       3.163e+02  1.272e+02   2.487 0.012919 *  
## INCOME        -5.381e-03  1.698e-03  -3.169 0.001537 ** 
## PARENT1        9.571e+02  2.165e+02   4.422 1.00e-05 ***
## MSTATUS       -5.564e+02  1.463e+02  -3.802 0.000145 ***
## EDUCATION     -1.394e+02  6.629e+01  -2.102 0.035569 *  
## TRAVTIME       1.356e+01  3.945e+00   3.438 0.000591 ***
## CAR_USE        8.699e+02  1.469e+02   5.923 3.38e-09 ***
## TIF           -4.693e+01  1.493e+01  -3.144 0.001675 ** 
## CLM_FREQ       8.238e+01  6.036e+01   1.365 0.172395    
## REVOKED        5.334e+02  1.922e+02   2.775 0.005549 ** 
## MVR_PTS        1.323e+02  3.250e+01   4.071 4.76e-05 ***
## URBANICITY     1.854e+03  1.688e+02  10.987  < 2e-16 ***
## Manager       -9.580e+02  2.038e+02  -4.701 2.66e-06 ***
## Professional  -3.787e+02  1.853e+02  -2.043 0.041069 *  
## `Panel Truck`  6.410e+02  2.565e+02   2.499 0.012487 *  
## `Sports Car`   4.523e+02  2.133e+02   2.121 0.033996 *  
## z_SUV          3.645e+02  1.489e+02   2.448 0.014419 *  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 4321 on 4879 degrees of freedom
## Multiple R-squared:  0.08013,    Adjusted R-squared:  0.07692 
## F-statistic:    25 on 17 and 4879 DF,  p-value: < 2.2e-16

CLM_FREQ has the highest p-value (the weakest effect on claim amount) and will be removed.

## 
## Call:
## lm(formula = train2a$TARGET_AMT ~ KIDSDRIV + INCOME + PARENT1 + 
##     MSTATUS + EDUCATION + TRAVTIME + CAR_USE + TIF + REVOKED + 
##     MVR_PTS + URBANICITY + Manager + Professional + `Panel Truck` + 
##     `Sports Car` + z_SUV, data = train2a)
## 
## Residuals:
##    Min     1Q Median     3Q    Max 
##  -5317  -1714   -737    369  76377 
## 
## Coefficients:
##                 Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   -1.451e+02  2.791e+02  -0.520 0.603117    
## KIDSDRIV       3.246e+02  1.270e+02   2.555 0.010649 *  
## INCOME        -5.514e-03  1.695e-03  -3.253 0.001151 ** 
## PARENT1        9.513e+02  2.164e+02   4.395 1.13e-05 ***
## MSTATUS       -5.688e+02  1.461e+02  -3.894 9.98e-05 ***
## EDUCATION     -1.392e+02  6.630e+01  -2.099 0.035827 *  
## TRAVTIME       1.383e+01  3.941e+00   3.508 0.000455 ***
## CAR_USE        8.809e+02  1.467e+02   6.006 2.04e-09 ***
## TIF           -4.702e+01  1.493e+01  -3.150 0.001641 ** 
## REVOKED        5.349e+02  1.923e+02   2.782 0.005421 ** 
## MVR_PTS        1.483e+02  3.032e+01   4.892 1.03e-06 ***
## URBANICITY     1.906e+03  1.645e+02  11.584  < 2e-16 ***
## Manager       -9.604e+02  2.038e+02  -4.712 2.52e-06 ***
## Professional  -3.815e+02  1.854e+02  -2.058 0.039610 *  
## `Panel Truck`  6.499e+02  2.564e+02   2.534 0.011300 *  
## `Sports Car`   4.706e+02  2.129e+02   2.211 0.027096 *  
## z_SUV          3.708e+02  1.488e+02   2.491 0.012765 *  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 4322 on 4880 degrees of freedom
## Multiple R-squared:  0.07978,    Adjusted R-squared:  0.07676 
## F-statistic: 26.44 on 16 and 4880 DF,  p-value: < 2.2e-16

All of the remaining variables have p-values lower than 0.05, which indicates that they are significant. A p-value below 0.05 indicates strong evidence against the null hypothesis that the variable in question has no effect on the claim amount.

The adjusted R-squared value is about 0.077, so only 7.7% of the variability in claim amount is accounted for by the model. This is a very unimpressive fit.

The F-statistic is 26.44, which is high and indicates that the model as a whole is significant.

The following graphs look for patterns in the residuals.

There are some residuals whose values are very high, indicating large deviations from the model. The residuals follow the line on the Q-Q plot up to a point and then deviate greatly. This is because some claims are for very large amounts and the model does not predict those very high values.
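
The diagnostic plots referred to above can be reproduced with the standard lm plotting method (the model object name is assumed):

par(mfrow = c(2, 2))
plot(amt_fit)         # residuals vs fitted, Q-Q, scale-location, leverage
par(mfrow = c(1, 1))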

Prediction from Model 2a

The root mean square error from model 2a is

## [1] 4907.037

The cost of a claim predicted by model 2a is off, on average, by about $4,907, which is an exorbitant error. The very large claims are not predicted well by the model and are driving the error up.
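
A sketch of the RMSE calculation, with assumed object names:

pred_amt <- predict(amt_fit, newdata = train2a)
sqrt(mean((train2a$TARGET_AMT - pred_amt)^2))  # root mean square error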

Building a Logarithmic Model Using Backward Elimination

To better account for the very high claim amounts, I will try a logarithmic model, in which the log of TARGET_AMT is regressed on the predictors. This model is also built from the variables in logit model 2.
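
A minimal sketch of the log-response model, assuming train2b contains strictly positive TARGET_AMT values (log(0) is undefined, so zero-amount records would have to be handled before fitting); predictions are back-transformed with exp() to return to the dollar scale.

log_fit  <- lm(log(TARGET_AMT) ~ ., data = train2b)
log_pred <- exp(predict(log_fit, newdata = train2b))  # back to dollars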

## 
## Call:
## lm(formula = log(train2b$TARGET_AMT) ~ ., data = train2b)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -19.836  -6.368  -2.433   6.303  27.055 
## 
## Coefficients:
##                 Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   -1.339e+01  6.178e-01 -21.670  < 2e-16 ***
## KIDSDRIV       1.271e+00  2.534e-01   5.015 5.48e-07 ***
## INCOME        -1.826e-05  3.504e-06  -5.211 1.96e-07 ***
## PARENT1        2.025e+00  4.316e-01   4.690 2.80e-06 ***
## MSTATUS       -1.878e+00  2.916e-01  -6.439 1.32e-10 ***
## EDUCATION     -5.609e-01  1.323e-01  -4.240 2.27e-05 ***
## TRAVTIME       4.856e-02  7.860e-03   6.178 7.00e-10 ***
## CAR_USE        2.779e+00  3.116e-01   8.920  < 2e-16 ***
## BLUEBOOK      -7.314e-05  1.900e-05  -3.850  0.00012 ***
## TIF           -1.778e-01  2.975e-02  -5.976 2.45e-09 ***
## CLM_FREQ       5.613e-01  1.203e-01   4.665 3.17e-06 ***
## REVOKED        3.031e+00  3.832e-01   7.910 3.16e-15 ***
## MVR_PTS        4.441e-01  6.478e-02   6.856 7.97e-12 ***
## URBANICITY     6.518e+00  3.362e-01  19.384  < 2e-16 ***
## Manager       -3.063e+00  4.064e-01  -7.537 5.69e-14 ***
## Professional  -8.244e-01  3.699e-01  -2.228  0.02590 *  
## `Panel Truck`  1.761e+00  6.066e-01   2.904  0.00370 ** 
## Pickup         1.209e+00  4.014e-01   3.011  0.00262 ** 
## `Sports Car`   2.825e+00  4.567e-01   6.187 6.65e-10 ***
## Van            1.579e+00  4.962e-01   3.182  0.00147 ** 
## z_SUV          2.191e+00  3.409e-01   6.426 1.44e-10 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 8.609 on 4876 degrees of freedom
## Multiple R-squared:  0.2246, Adjusted R-squared:  0.2214 
## F-statistic: 70.63 on 20 and 4876 DF,  p-value: < 2.2e-16

All of the remaining variables have p-values lower than 0.05, which indicates that they are significant. A p-value below 0.05 indicates strong evidence against the null hypothesis that the variable in question has no effect on the claim amount.

The adjusted R-squared value is 0.22, so about 22% of the variability in the log of the claim amount is accounted for by the model. This is a significant improvement over the last model.

The F-statistic is 70.63, which is high and indicates that the model as a whole is significant.

The following graphs look for patterns in the residuals.

The residuals show a pattern and deviate from the line on the Q-Q plot, both of which indicate that this is not a good model.

Prediction from Model 2b - Logarithmic Model

The root mean square error from model 2b is

## [1] 5219.295

The cost of a claim predicted by model 2b is off, on average, by about $5,219, which is even higher than the error from the previous model. The claims for very large amounts are still not predicted well by the model and keep the error high.

Model Selection

In order to predict TARGET_FLAG, the binary variable that describes whether or not a customer has a claim, I will use model 2. This is a logit model that was built using backward elimination starting with all of the variables.

When comparing the three models, the accuracy is greatest for the second model. The precision is also highest for the second model. (Precision is the ratio of correct predictions of zero to total predictions of zero.) The sensitivity is highest for model 2. (Sensitivity is the ratio of correct predictions of zero to all cases in which the target is zero.) The specificity is greatest for model 3. (Specificity is the ratio of correct predictions of one to all cases in which the target is one.) The F1 score is highest for the second model. (F1 = 2 x Precision x Sensitivity / (Precision + Sensitivity), which balances precision and sensitivity.) The area under the ROC curve is farthest from 0.5 for model 2; the farther the area is from 0.5, the better the model.

I will use model 2 to make predictions for the test data because it has a higher accuracy, precision, sensitivity, F1 score, and area under the ROC curve than the other models.

The predictions for the evaluation set are below:

##    1    2    3    4    5    6    7    8    9   10   11   12   13   14   15 
##    0    0    0    0    0    0    1    1    0    0    0    1    1    0    0 
##   16   17   18   19   20   21   22   23   24   25   26   27   28   29   30 
##    1    1    0    1    1    0    1    0    1    1    1    1    1    0    0 
##   31   32   33   34   35   36   37   38   39   40   41   42   43   44   45 
##    0    1    0    0    0    0    0    0    0    1    0    1    0    1    0 
##   46   47   48   49   50   51   52   53   54   55   56   57   58   59   60 
##    0    0    1    0    1    0    0    1    0    0    0    0    1    0    1 
##   61   62   63   64   65   66   67   68   69   70   71   72   73   74   75 
##    0    0    1    0    0    1    1    1    0    0    0    0    1    1    1 
##   76   77   78   79   80   81   82   83   84   85   86   87   88   89   90 
##    0    1    0    0    0    1    1    1    0    1    1    0    0    0    1 
##   91   92   93   94   95   96   97   98   99  100  101  102  103  104  105 
##    0    0    0    0    0    0    0    0    1    0    1    1    1    1    0 
##  106  107  108  109  110  111  112  113  114  115  116  117  118  119  120 
##    0    0    0    1    0    1    0    0    0    1    0    0    1    1    0 
##  121  122  123  124  125  126  127  128  129  130  131  132  133  134  135 
##    0    1    1    1    0    1    1    0    0    0    0    0    0    0    0 
##  136  137  138  139  140  141  142  143  144  145  146  147  148  149  150 
##    0    1    1    0    0    0    1    0    0    0    1    0    0    0    0 
##  151  152  153  154  155  156  157  158  159  160  161  162  163  164  165 
##    1    1    1    1    0    1    0    0    1    1    0    0    0    0    1 
##  166  167  168  169  170  171  172  173  174  175  176  177  178  179  180 
##    0    0    0    1    0    0    1    0    1    1    1    1    1    1    1 
##  181  182  183  184  185  186  187  188  189  190  191  192  193  194  195 
##    1    0    0    0    0    0    0    0    0    0    1    1    1    0    0 
##  196  197  198  199  200  201  202  203  204  205  206  207  208  209  210 
##    1    1    0    0    0    0    1    0    0    0    0    1    0    0    0 
##  211  212  213  214  215  216  217  218  219  220  221  222  223  224  225 
##    0    0    1    1    0    0    1    0    0    0    0    0    1    1    0 
##  226  227  228  229  230  231  232  233  234  235  236  237  238  239  240 
##    1    1    1    1    0    0    0    1    0    0    0    0    0    0    1 
##  241  242  243  244  245  246  247  248  249  250  251  252  253  254  255 
##    0    0    1    0    0    1    1    1    0    1    1    1    0    1    1 
##  256  257  258  259  260  261  262  263  264  265  266  267  268  269  270 
##    1    0    0    1    0    0    0    0    0    0    0    0    0    1    1 
##  271  272  273  274  275  276  277  278  279  280  281  282  283  284  285 
##    1    0    0    1    0    0    1    0    0    0    0    0    0    1    1 
##  286  287  288  289  290  291  292  293  294  295  296  297  298  299  300 
##    1    0    1    1    1    1    0    0    1    0    1    0    1    0    0 
##  301  302  303  304  305  306  307  308  309  310  311  312  313  314  315 
##    0    0    0    1    1    1    0    1    0    0    1    0    0    1    0 
##  316  317  318  319  320  321  322  323  324  325  326  327  328  329  330 
##    0    0    0    1    0    0    1    0    0    1    1    1    0    1    0 
##  331  332  333  334  335  336  337  338  339  340  341  342  343  344  345 
##    0    0    1    0    0    0    0    1    0    0    1    1    1    1    0 
##  346  347  348  349  350  351  352  353  354  355  356  357  358  359  360 
##    0    0    0    0    0    0    0    1    1    0    1    1    0    0    0 
##  361  362  363  364  365  366  367  368  369  370  371  372  373  374  375 
##    1    0    0    1    0    1    0    1    0    0    0    0    1    0    0 
##  376  377  378  379  380  381  382  383  384  385  386  387  388  389  390 
##    1    0    0    0    0    0    1    0    0    0    1    1    0    0    1 
##  391  392  393  394  395  396  397  398  399  400  401  402  403  404  405 
##    0    0    0    0    0    1    0    1    0    1    1    0    0    0    0 
##  406  407  408  409  410  411  412  413  414  415  416  417  418  419  420 
##    0    0    1    0    0    0    1    0    0    1    1    0    1    0    0 
##  421  422  423  424  425  426  427  428  429  430  431  432  433  434  435 
##    1    1    1    0    0    1    0    0    1    1    1    0    0    0    1 
##  436  437  438  439  440  441  442  443  444  445  446  447  448  449  450 
##    1    0    0    0    0    0    0    0    0    0    0    0    0    1    1 
##  451  452  453  454  455  456  457  458  459  460  461  462  463  464  465 
##    0    0    1    0    1    1    1    1    0    1    0    0    0    1    0 
##  466  467  468  469  470  471  472  473  474  475  476  477  478  479  480 
##    0    1    1    0    0    0    1    0    0    0    0    1    1    0    1 
##  481  482  483  484  485  486  487  488  489  490  491  492  493  494  495 
##    0    0    0    1    1    1    0    1    0    1    1    0    0    0    0 
##  496  497  498  499  500  501  502  503  504  505  506  507  508  509  510 
##    1    0    0    1    0    0    0    1    0    1    0    1    0    0    0 
##  511  512  513  514  515  516  517  518  519  520  521  522  523  524  525 
##    0    0    0    0    0    0    1    1    0    1    0    0    0    0    0 
##  526  527  528  529  530  531  532  533  534  535  536  537  538  539  540 
##    0    0    0    0    0    0    0    0    1    0    0    0    0    0    0 
##  541  542  543  544  545  546  547  548  549  550  551  552  553  554  555 
##    0    0    1    1    0    0    0    1    1    0    0    0    0    1    0 
##  556  557  558  559  560  561  562  563  564  565  566  567  568  569  570 
##    0    1    0    1    0    0    1    0    0    1    0    1    1    1    1 
##  571  572  573  574  575  576  577  578  579  580  581  582  583  584  585 
##    0    0    0    0    0    0    0    0    1    0    0    1    0    1    0 
##  586  587  588  589  590  591  592  593  594  595  596  597  598  599  600 
##    0    0    0    1    1    1    0    0    1    1    1    1    0    0    1 
##  601  602  603  604  605  606  607  608  609  610  611  612  613  614  615 
##    1    0    0    0    1    1    1    0    0    0    0    1    0    0    1 
##  616  617  618  619  620  621  622  623  624  625  626  627  628  629  630 
##    0    0    1    0    1    0    0    0    0    0    1    1    0    0    1 
##  631  632  633  634  635  636  637  638  639  640  641  642  643  644  645 
##    0    0    0    0    0    0    0    1    0    0    0    1    0    0    0 
##  646  647  648  649  650  651  652  653  654  655  656  657  658  659  660 
##    1    1    0    0    0    0    0    1    0    0    0    1    0    0    0 
##  661  662  663  664  665  666  667  668  669  670  671  672  673  674  675 
##    1    1    1    0    0    1    1    0    0    0    0    1    1    0    1 
##  676  677  678  679  680  681  682  683  684  685  686  687  688  689  690 
##    0    0    1    0    0    1    0    1    0    0    0    0    0    0    0 
##  691  692  693  694  695  696  697  698  699  700  701  702  703  704  705 
##    0    0    0    0    0    1    0    1    0    0    0    0    1    0    0 
##  706  707  708  709  710  711  712  713  714  715  716  717  718  719  720 
##    0    1    1    0    0    0    0    1    0    0    0    0    0    0    0 
##  721  722  723  724  725  726  727  728  729  730  731  732  733  734  735 
##    1    0    0    0    0    0    0    0    0    0    1    1    0    0    1 
##  736  737  738  739  740  741  742  743  744  745  746  747  748  749  750 
##    1    0    0    0    0    1    1    1    0    0    1    1    0    0    0 
##  751  752  753  754  755  756  757  758  759  760  761  762  763  764  765 
##    0    0    1    1    0    1    1    0    0    1    0    1    1    1    1 
##  766  767  768  769  770  771  772  773  774  775  776  777  778  779  780 
##    1    1    0    1    0    1    1    0    1    0    1    0    1    0    0 
##  781  782  783  784  785  786  787  788  789  790  791  792  793  794  795 
##    0    1    0    0    0    1    0    0    0    0    0    0    1    0    0 
##  796  797  798  799  800  801  802  803  804  805  806  807  808  809  810 
##    1    0    1    1    1    0    0    0    1    0    0    0    0    0    1 
##  811  812  813  814  815  816  817  818  819  820  821  822  823  824  825 
##    1    0    0    0    0    0    0    1    1    0    1    0    1    1    1 
##  826  827  828  829  830  831  832  833  834  835  836  837  838  839  840 
##    0    0    0    0    0    0    0    1    0    0    0    0    0    0    0 
##  841  842  843  844  845  846  847  848  849  850  851  852  853  854  855 
##    0    0    0    0    0    0    1    1    1    1    1    0    0    0    1 
##  856  857  858  859  860  861  862  863  864  865  866  867  868  869  870 
##    0    0    0    1    0    0    1    0    0    0    0    1    0    0    1 
##  871  872  873  874  875  876  877  878  879  880  881  882  883  884  885 
##    0    1    0    1    1    0    0    0    0    1    1    0    0    0    1 
##  886  887  888  889  890  891  892  893  894  895  896  897  898  899  900 
##    1    1    0    0    0    0    0    0    0    0    1    0    0    1    1 
##  901  902  903  904  905  906  907  908  909  910  911  912  913  914  915 
##    0    0    1    0    0    0    1    0    1    1    1    0    0    0    0 
##  916  917  918  919  920  921  922  923  924  925  926  927  928  929  930 
##    1    1    1    0    1    0    0    0    0    0    0    0    0    1    0 
##  931  932  933  934  935  936  937  938  939  940  941  942  943  944  945 
##    0    1    0    1    0    1    0    1    1    1    1    0    1    0    1 
##  946  947  948  949  950  951  952  953  954  955  956  957  958  959  960 
##    0    0    0    0    1    0    0    0    1    0    0    0    0    1    0 
##  961  962  963  964  965  966  967  968  969  970  971  972  973  974  975 
##    1    1    1    0    0    1    1    0    0    0    1    0    0    1    0 
##  976  977  978  979  980  981  982  983  984  985  986  987  988  989  990 
##    1    1    0    0    1    0    0    1    1    1    1    1    0    0    1 
##  991  992  993  994  995  996  997  998  999 1000 1001 1002 1003 1004 1005 
##    0    0    1    0    1    0    0    0    0    0    1    1    1    0    1 
## 1006 1007 1008 1009 1010 1011 1012 1013 1014 1015 1016 1017 1018 1019 1020 
##    0    0    1    1    0    1    0    0    0    0    1    0    1    1    0 
## 1021 1022 1023 1024 1025 1026 1027 1028 1029 1030 1031 1032 1033 1034 1035 
##    0    0    1    1    1    1    1    0    0    0    0    0    0    0    0 
## 1036 1037 1038 1039 1040 1041 1042 1043 1044 1045 1046 1047 1048 1049 1050 
##    0    0    0    0    0    0    1    1    0    1    0    1    1    1    1 
## 1051 1052 1053 1054 1055 1056 1057 1058 1059 1060 1061 1062 1063 1064 1065 
##    1    1    1    0    1    0    0    0    1    1    0    1    1    0    0 
## 1066 1067 1068 1069 1070 1071 1072 1073 1074 1075 1076 1077 1078 1079 1080 
##    0    0    0    0    1    0    0    0    1    0    0    0    0    1    0 
## 1081 1082 1083 1084 1085 1086 1087 1088 1089 1090 1091 1092 1093 1094 1095 
##    1    1    0    0    1    1    0    0    0    0    0    0    0    1    0 
## 1096 1097 1098 1099 1100 1101 1102 1103 1104 1105 1106 1107 1108 1109 1110 
##    0    0    0    1    1    0    0    1    1    1    1    0    0    0    1 
## 1111 1112 1113 1114 1115 1116 1117 1118 1119 1120 1121 1122 1123 1124 1125 
##    1    1    0    0    0    0    1    1    1    0    1    1    0    0    0 
## 1126 1127 1128 1129 1130 1131 1132 1133 1134 1135 1136 1137 1138 1139 1140 
##    0    1    0    0    0    0    0    1    0    1    0    0    0    0    0 
## 1141 1142 1143 1144 1145 1146 1147 1148 1149 1150 1151 1152 1153 1154 1155 
##    0    0    0    1    1    0    0    1    0    1    1    1    0    1    1 
## 1156 1157 1158 1159 1160 1161 1162 1163 1164 1165 1166 1167 1168 1169 1170 
##    0    0    0    0    0    0    0    0    0    0    1    0    1    1    0 
## 1171 1172 1173 1174 1175 1176 1177 1178 1179 1180 1181 1182 1183 1184 1185 
##    1    1    1    1    0    0    1    0    0    1    0    1    0    1    1 
## 1186 1187 1188 1189 1190 1191 1192 1193 1194 1195 1196 1197 1198 1199 1200 
##    0    0    0    0    0    1    0    0    1    0    0    0    0    1    0 
## 1201 1202 1203 1204 1205 1206 1207 1208 1209 1210 1211 1212 1213 1214 1215 
##    0    0    0    0    0    0    1    0    0    1    0    0    1    0    0 
## 1216 1217 1218 1219 1220 1221 1222 1223 1224 1225 1226 1227 1228 1229 1230 
##    0    1    1    0    0    0    0    1    0    1    0    0    0    0    1 
## 1231 1232 1233 1234 1235 1236 1237 1238 1239 1240 1241 1242 1243 1244 1245 
##    0    0    1    1    0    0    0    1    0    0    1    0    0    0    0 
## 1246 1247 1248 1249 1250 1251 1252 1253 1254 1255 1256 1257 1258 1259 1260 
##    1    1    0    0    0    0    1    0    1    1    0    1    0    1    0 
## 1261 1262 1263 1264 1265 1266 1267 1268 1269 1270 1271 1272 1273 1274 1275 
##    0    0    0    1    0    0    1    0    1    0    0    0    0    0    0 
## 1276 1277 1278 1279 1280 1281 1282 1283 1284 1285 1286 1287 1288 1289 1290 
##    0    1    0    0    0    1    0    0    1    1    0    1    0    0    0 
## 1291 1292 1293 1294 1295 1296 1297 1298 1299 1300 1301 1302 1303 1304 1305 
##    1    0    0    1    1    0    1    0    1    1    0    0    0    0    0 
## 1306 1307 1308 1309 1310 1311 1312 1313 1314 1315 1316 1317 1318 1319 1320 
##    0    1    1    0    1    1    0    1    0    0    0    0    0    0    1 
## 1321 1322 1323 1324 1325 1326 1327 1328 1329 1330 1331 1332 1333 1334 1335 
##    0    1    1    0    0    1    0    1    0    0    0    0    0    0    1 
## 1336 1337 1338 1339 1340 1341 1342 1343 1344 1345 1346 1347 1348 1349 1350 
##    1    0    0    1    0    0    1    0    0    1    0    0    1    0    0 
## 1351 1352 1353 1354 1355 1356 1357 1358 1359 1360 1361 1362 1363 1364 1365 
##    0    1    0    0    0    0    0    1    0    0    0    0    1    0    0 
## 1366 1367 1368 1369 1370 1371 1372 1373 1374 1375 1376 1377 1378 1379 1380 
##    0    1    1    1    1    0    0    0    0    0    1    1    0    0    0 
## 1381 1382 1383 1384 1385 1386 1387 1388 1389 1390 1391 1392 1393 1394 1395 
##    1    1    1    0    0    0    0    0    0    1    1    1    0    1    0 
## 1396 1397 1398 1399 1400 1401 1402 1403 1404 1405 1406 1407 1408 1409 1410 
##    0    0    1    1    0    0    0    1    1    1    0    0    0    0    1 
## 1411 1412 1413 1414 1415 1416 1417 1418 1419 1420 1421 1422 1423 1424 1425 
##    0    0    0    0    0    0    0    0    1    0    0    1    1    0    1 
## 1426 1427 1428 1429 1430 1431 1432 1433 1434 1435 1436 1437 1438 1439 1440 
##    1    0    0    1    0    0    0    0    0    0    1    1    1    0    0 
## 1441 1442 1443 1444 1445 1446 1447 1448 1449 1450 1451 1452 1453 1454 1455 
##    0    1    1    0    0    1    1    0    0    0    0    0    1    0    0 
## 1456 1457 1458 1459 1460 1461 1462 1463 1464 1465 1466 1467 1468 1469 1470 
##    0    1    0    0    0    0    0    0    1    0    1    0    0    0    1 
## 1471 1472 1473 1474 1475 1476 1477 1478 1479 1480 1481 1482 1483 1484 1485 
##    0    0    0    1    1    0    1    0    0    0    1    0    1    0    1 
## 1486 1487 1488 1489 1490 1491 1492 1493 1494 1495 1496 1497 1498 1499 1500 
##    0    0    0    1    0    0    1    0    0    1    1    0    0    0    0 
## 1501 1502 1503 1504 1505 1506 1507 1508 1509 1510 1511 1512 1513 1514 1515 
##    0    1    0    1    0    0    1    0    0    0    0    0    0    0    1 
## 1516 1517 1518 1519 1520 1521 1522 1523 1524 1525 1526 1527 1528 1529 1530 
##    1    0    1    0    0    1    0    0    0    0    1    0    1    1    1 
## 1531 1532 1533 1534 1535 1536 1537 1538 1539 1540 1541 1542 1543 1544 1545 
##    1    1    0    0    1    0    0    1    1    1    0    0    0    1    1 
## 1546 1547 1548 1549 1550 1551 1552 1553 1554 1555 1556 1557 1558 1559 1560 
##    0    0    0    0    0    0    0    0    1    0    0    0    0    1    1 
## 1561 1562 1563 1564 1565 1566 1567 1568 1569 1570 1571 1572 1573 1574 1575 
##    0    1    0    1    1    0    0    0    0    1    1    0    0    1    0 
## 1576 1577 1578 1579 1580 1581 1582 1583 1584 1585 1586 1587 1588 1589 1590 
##    1    1    0    0    0    0    0    0    0    0    0    0    1    0    1 
## 1591 1592 1593 1594 1595 1596 1597 1598 1599 1600 1601 1602 1603 1604 1605 
##    1    1    0    0    0    0    0    0    0    1    1    0    1    1    0 
## 1606 1607 1608 1609 1610 1611 1612 1613 1614 1615 1616 1617 1618 1619 1620 
##    1    0    0    0    1    1    0    0    1    1    1    0    1    0    1 
## 1621 1622 1623 1624 1625 1626 1627 1628 1629 1630 1631 1632 1633 1634 1635 
##    1    0    1    0    0    0    0    0    0    1    0    1    0    1    0 
## 1636 1637 1638 1639 1640 1641 1642 1643 1644 1645 1646 1647 1648 1649 1650 
##    0    1    0    0    0    0    0    0    0    1    0    0    0    0    1 
## 1651 1652 1653 1654 1655 1656 1657 1658 1659 1660 1661 1662 1663 1664 1665 
##    0    0    0    0    1    1    0    0    0    0    1    1    1    1    1 
## 1666 1667 1668 1669 1670 1671 1672 1673 1674 1675 1676 1677 1678 1679 1680 
##    0    1    0    1    0    1    0    1    0    0    0    0    0    0    0 
## 1681 1682 1683 1684 1685 1686 1687 1688 1689 1690 1691 1692 1693 1694 1695 
##    0    1    1    0    1    0    0    1    0    0    0    0    0    1    1 
## 1696 1697 1698 1699 1700 1701 1702 1703 1704 1705 1706 1707 1708 1709 1710 
##    1    0    1    1    0    1    0    0    1    0    0    1    0    1    0 
## 1711 1712 1713 1714 1715 1716 1717 1718 1719 1720 1721 1722 1723 1724 1725 
##    0    0    1    0    1    0    0    0    0    0    0    1    0    1    1 
## 1726 1727 1728 1729 1730 1731 1732 1733 1734 1735 1736 1737 1738 1739 1740 
##    0    0    0    1    1    0    0    0    0    0    0    1    0    0    0 
## 1741 1742 1743 1744 1745 1746 1747 1748 1749 1750 1751 1752 1753 1754 1755 
##    1    0    0    0    1    0    0    0    1    0    0    0    0    1    0 
## 1756 1757 1758 1759 1760 1761 1762 1763 1764 1765 1766 1767 1768 1769 1770 
##    0    0    1    0    0    1    0    0    0    1    1    0    0    0    0 
## 1771 1772 1773 1774 1775 1776 1777 1778 1779 1780 1781 1782 1783 1784 1785 
##    0    0    0    1    1    0    1    0    1    0    0    0    0    1    0 
## 1786 1787 1788 1789 1790 1791 1792 1793 1794 1795 1796 1797 1798 1799 1800 
##    0    0    0    1    1    0    0    0    0    0    1    0    0    0    1 
## 1801 1802 1803 1804 1805 1806 1807 1808 1809 1810 1811 1812 1813 1814 1815 
##    1    0    0    0    0    1    1    1    0    0    0    0    1    0    1 
## 1816 1817 1818 1819 1820 1821 1822 1823 1824 1825 1826 1827 1828 1829 1830 
##    0    0    0    0    0    0    0    0    1    0    1    0    1    0    0 
## 1831 1832 1833 1834 1835 1836 1837 1838 1839 1840 1841 1842 1843 1844 1845 
##    0    0    0    0    1    0    0    1    0    1    0    0    0    0    0 
## 1846 1847 1848 1849 1850 1851 1852 1853 1854 1855 1856 1857 1858 1859 1860 
##    0    0    0    0    0    0    0    1    0    0    0    0    0    0    0 
## 1861 1862 1863 1864 1865 1866 1867 1868 1869 1870 1871 1872 1873 1874 1875 
##    1    0    0    0    0    0    0    0    1    1    1    0    0    0    1 
## 1876 1877 1878 1879 1880 1881 1882 1883 1884 1885 1886 1887 1888 1889 1890 
##    0    0    1    1    1    0    0    0    0    1    0    0    0    1    0 
## 1891 1892 1893 1894 1895 1896 1897 1898 1899 1900 1901 1902 1903 1904 1905 
##    0    0    0    0    1    0    0    0    0    1    0    1    0    1    0 
## 1906 1907 1908 1909 1910 1911 1912 1913 1914 1915 1916 1917 1918 1919 1920 
##    0    0    0    1    1    0    1    0    0    1    0    0    0    0    0 
## 1921 1922 1923 1924 1925 1926 1927 1928 1929 1930 1931 1932 1933 1934 1935 
##    0    1    0    0    0    0    0    0    0    0    0    0    1    1    0 
## 1936 1937 1938 1939 1940 1941 1942 1943 1944 1945 1946 1947 1948 1949 1950 
##    1    0    0    0    0    0    0    0    0    0    0    1    0    1    0 
## 1951 1952 1953 1954 1955 1956 1957 1958 1959 1960 1961 1962 1963 1964 1965 
##    0    0    1    1    0    0    0    0    0    1    1    1    1    0    0 
## 1966 1967 1968 1969 1970 1971 1972 1973 1974 1975 1976 1977 1978 1979 1980 
##    0    0    1    0    0    0    0    1    1    0    0    0    1    1    0 
## 1981 1982 1983 1984 1985 1986 1987 1988 1989 1990 1991 1992 1993 1994 1995 
##    0    1    0    0    0    0    0    0    0    1    0    1    1    1    0 
## 1996 1997 1998 1999 2000 2001 2002 2003 2004 2005 2006 2007 2008 2009 2010 
##    0    1    1    0    0    1    1    1    0    1    0    1    0    1    1 
## 2011 2012 2013 2014 2015 2016 2017 2018 2019 2020 2021 2022 2023 2024 2025 
##    1    0    1    0    0    1    0    1    1    0    0    0    1    0    0 
## 2026 2027 2028 2029 2030 2031 2032 2033 2034 2035 2036 2037 2038 2039 2040 
##    0    0    0    0    1    0    0    0    0    1    1    0    0    1    0 
## 2041 2042 2043 2044 2045 2046 2047 2048 2049 2050 2051 2052 2053 2054 2055 
##    0    0    0    1    0    0    0    1    0    0    1    0    1    0    1 
## 2056 2057 2058 2059 2060 2061 2062 2063 2064 2065 2066 2067 2068 2069 2070 
##    0    0    0    0    0    0    1    1    0    0    0    1    1    0    0 
## 2071 2072 2073 2074 2075 2076 2077 2078 2079 2080 2081 2082 2083 2084 2085 
##    0    0    1    1    0    0    1    0    0    1    0    0    1    0    0 
## 2086 2087 2088 2089 2090 2091 2092 2093 2094 2095 2096 2097 2098 2099 2100 
##    0    1    1    0    1    0    0    1    0    1    1    0    1    1    1 
## 2101 2102 2103 2104 2105 2106 2107 2108 2109 2110 2111 2112 2113 2114 2115 
##    1    1    1    0    0    0    1    0    0    0    1    0    1    0    0 
## 2116 2117 2118 2119 2120 2121 2122 2123 2124 2125 2126 2127 2128 2129 2130 
##    0    1    0    1    0    0    0    1    1    0    0    1    0    0    0 
## 2131 2132 2133 2134 2135 2136 2137 2138 2139 2140 2141 
##    0    0    0    1    1    0    0    0    0    0    0

I will use model 2a to predict the claim amount because that model had the lower root mean square error.
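
A sketch of how the two predictions could be combined for the evaluation set, with assumed object names (eval_data, logit_fit2, amt_fit, and cutoff are illustrative, not taken from the report's code):

flag_prob <- predict(logit_fit2, newdata = eval_data, type = "response")
eval_flag <- ifelse(flag_prob > cutoff, 1, 0)       # cutoff chosen from model 2's ROC curve
eval_amt  <- predict(amt_fit, newdata = eval_data)  # model 2a claim amount
eval_amt[eval_flag == 0] <- 0                       # no predicted claim, so $0 predicted cost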

##        1        2        3        4        5        6        7        8 
##    0.000    0.000    0.000    0.000    0.000    0.000 2369.262 3466.281 
##        9       10       11       12       13       14       15       16 
##    0.000    0.000    0.000 3799.551 4620.172    0.000    0.000 3737.505 
##       17       18       19       20       21       22       23       24 
## 3431.916    0.000 2730.605 2290.083    0.000 1832.988    0.000 2081.918 
##       25       26       27       28       29       30       31       32 
## 2041.693 2304.514 1851.031 3137.272    0.000    0.000    0.000 2082.547 
##       33       34       35       36       37       38       39       40 
##    0.000    0.000    0.000    0.000    0.000    0.000    0.000 2964.835 
##       41       42       43       44       45       46       47       48 
##    0.000 2855.948    0.000 2714.003    0.000    0.000    0.000 1905.791 
##       49       50       51       52       53       54       55       56 
##    0.000 3244.206    0.000    0.000 3395.802    0.000    0.000    0.000 
##       57       58       59       60       61       62       63       64 
##    0.000 2499.167    0.000 2547.963    0.000    0.000 1979.590    0.000 
##       65       66       67       68       69       70       71       72 
##    0.000 1467.866 4016.118 3949.387    0.000    0.000    0.000    0.000 
##       73       74       75       76       77       78       79       80 
## 3708.335 1877.746 2995.758    0.000 2211.200    0.000    0.000    0.000 
##       81       82       83       84       85       86       87       88 
## 2549.262 2122.862 2155.607    0.000 2337.344 3097.596    0.000    0.000 
##       89       90       91       92       93       94       95       96 
##    0.000 3477.726    0.000    0.000    0.000    0.000    0.000    0.000 
##       97       98       99      100      101      102      103      104 
##    0.000    0.000 2009.151    0.000 1958.144 2847.853 2695.691 3108.060 
##      105      106      107      108      109      110      111      112 
##    0.000    0.000    0.000    0.000 2626.702    0.000 2971.429    0.000 
##      113      114      115      116      117      118      119      120 
##    0.000    0.000 3267.391    0.000    0.000 3176.143 2584.099    0.000 
##      121      122      123      124      125      126      127      128 
##    0.000 3007.596 3359.524 2439.260    0.000 2328.712 2060.008    0.000 
##      129      130      131      132      133      134      135      136 
##    0.000    0.000    0.000    0.000    0.000    0.000    0.000    0.000 
##      137      138      139      140      141      142      143      144 
## 3742.517 4184.236    0.000    0.000    0.000 3615.936    0.000    0.000 
##      145      146      147      148      149      150      151      152 
##    0.000 2966.727    0.000    0.000    0.000    0.000 2885.267 2353.918 
##      153      154      155      156      157      158      159      160 
## 4034.475 2214.651    0.000 2579.792    0.000    0.000 2588.501 2626.691 
##      161      162      163      164      165      166      167      168 
##    0.000    0.000    0.000    0.000 3857.852    0.000    0.000    0.000 
##      169      170      171      172      173      174      175      176 
## 2100.014    0.000    0.000 3680.418    0.000 4635.774 2584.969 1807.019 
##      177      178      179      180      181      182      183      184 
## 3454.875 2702.984 3770.963 3269.976 3240.053    0.000    0.000    0.000 
##      185      186      187      188      189      190      191      192 
##    0.000    0.000    0.000    0.000    0.000    0.000 2240.906 3151.826 
##      193      194      195      196      197      198      199      200 
## 3512.474    0.000    0.000 2462.682 2566.310    0.000    0.000    0.000 
##      201      202      203      204      205      206      207      208 
##    0.000 2354.074    0.000    0.000    0.000    0.000 4483.710    0.000 
##      209      210      211      212      213      214      215      216 
##    0.000    0.000    0.000    0.000 2704.046 2882.920    0.000    0.000 
##      217      218      219      220      221      222      223      224 
## 2749.037    0.000    0.000    0.000    0.000    0.000 2517.245 1973.393 
##      225      226      227      228      229      230      231      232 
##    0.000 2025.719 3860.581 1696.430 2327.999    0.000    0.000    0.000 
##      233      234      235      236      237      238      239      240 
## 3922.886    0.000    0.000    0.000    0.000    0.000    0.000 3713.975 
##      241      242      243      244      245      246      247      248 
##    0.000    0.000 3961.794    0.000    0.000 2450.278 1669.442 2000.249 
##      249      250      251      252      253      254      255      256 
##    0.000 2858.901 2137.284 2595.073    0.000 2057.791 1884.467 1767.605 
##      257      258      259      260      261      262      263      264 
##    0.000    0.000 2321.565    0.000    0.000    0.000    0.000    0.000 
##      265      266      267      268      269      270      271      272 
##    0.000    0.000    0.000    0.000 4403.354 2268.714 2891.869    0.000 
##      273      274      275      276      277      278      279      280 
##    0.000 3849.188    0.000    0.000 4172.609    0.000    0.000    0.000 
##      281      282      283      284      285      286      287      288 
##    0.000    0.000    0.000 2184.814 2279.185 1776.982    0.000 1704.613 
##      289      290      291      292      293      294      295      296 
## 1686.425 3470.123 2190.371    0.000    0.000 2269.038    0.000 2213.298 
##      297      298      299      300      301      302      303      304 
##    0.000 2578.011    0.000    0.000    0.000    0.000    0.000 3338.808 
##      305      306      307      308      309      310      311      312 
## 2832.212 2245.462    0.000 2433.617    0.000    0.000 3750.491    0.000 
##      313      314      315      316      317      318      319      320 
##    0.000 4555.508    0.000    0.000    0.000    0.000 2431.446    0.000 
##      321      322      323      324      325      326      327      328 
##    0.000 2178.714    0.000    0.000 3153.108 2399.606 3058.715    0.000 
##      329      330      331      332      333      334      335      336 
## 3001.420    0.000    0.000    0.000 4064.832    0.000    0.000    0.000 
##      337      338      339      340      341      342      343      344 
##    0.000 1978.927    0.000    0.000 2369.255 3251.420 2291.295 3201.887 
##      345      346      347      348      349      350      351      352 
##    0.000    0.000    0.000    0.000    0.000    0.000    0.000    0.000 
##      353      354      355      356      357      358      359      360 
## 4127.216 3955.022    0.000 2783.045 2219.227    0.000    0.000    0.000 
##      361      362      363      364      365      366      367      368 
## (predicted claim amounts for the remaining evaluation-set records, through index 2141: 0.000 where the model predicts no crash, otherwise estimated claim costs of roughly 1,200 to 5,600)

APPENDIX

library(ggplot2)
library(stringr)

insurance <- read.csv("https://raw.githubusercontent.com/swigodsky/Data621/master/insurance_training_data.csv", stringsAsFactors=FALSE)

head(insurance)
nrow(insurance)

library(tidyr)
library(dplyr)

# strip dollar signs and commas, then convert the currency fields to numeric
insur_df <- insurance
insur_df$INCOME <- as.numeric(gsub("\\D+","",insur_df$INCOME))
insur_df$HOME_VAL <- as.numeric(gsub("\\D+","",insur_df$HOME_VAL))
insur_df$BLUEBOOK <- as.numeric(gsub("\\D+","",insur_df$BLUEBOOK))
insur_df$OLDCLAIM <- as.numeric(gsub("\\D+","",insur_df$OLDCLAIM))

# recode binary categorical variables to 0/1
insur_df[insur_df=="No"] <- 0
insur_df[insur_df=="Yes"] <- 1
insur_df[insur_df=="no"] <- 0
insur_df[insur_df=="yes"] <- 1
insur_df$MSTATUS[insur_df$MSTATUS=="z_No"] <- 0
insur_df$SEX[insur_df$SEX=="M"] <- 0
insur_df$SEX[insur_df$SEX=="z_F"] <- 1
insur_df$CAR_USE[insur_df$CAR_USE=="Private"] <- 0
insur_df$CAR_USE[insur_df$CAR_USE=="Commercial"] <- 1
insur_df$URBANICITY[insur_df$URBANICITY=="z_Highly Rural/ Rural"] <- 0
insur_df$URBANICITY[insur_df$URBANICITY=="Highly Urban/ Urban"] <- 1
insur_df$PARENT1 <- as.numeric(insur_df$PARENT1)
insur_df$MSTATUS <- as.numeric(insur_df$MSTATUS)
insur_df$SEX <- as.numeric(insur_df$SEX)
insur_df$CAR_USE <- as.numeric(insur_df$CAR_USE)
insur_df$RED_CAR <- as.numeric(insur_df$RED_CAR)
insur_df$REVOKED <- as.numeric(insur_df$REVOKED)
insur_df$URBANICITY <- as.numeric(insur_df$URBANICITY)

insur_df$EDUCATION[insur_df$EDUCATION == "= cutoff] <- 1

spec_val <- spec(test_df, test)
sens_val <- sens(test_df, test)
  
roc_tester <- rbind(roc_tester, list(o_m_specificity=1-spec_val,sensitivity= sens_val))

#calculating Euclidean distance between point and y=x
a2 <- c(1-spec_val, sens_val)
b2 <- c(0,0)
c2 <- c(1,1)
d2 <- dist2d(a2, b2, c2) # distance of point a2 from the line through b2 and c2
if (d2 > d){
  d <- d2
  cut_off_val <- cutoff
}

#calculating area of trapezoid for each set of data points  
if (cutoff>=0.1){
  num_values = nrow(roc_tester)
  base2 = roc_tester$sensitivity[num_values]
  base1 = roc_tester$sensitivity[num_values-1]
  height2 = roc_tester$o_m_specificity[num_values]
  height1 = roc_tester$o_m_specificity[num_values-1]
  area = .5*(base1+base2)*(height2-height1)
  auc = auc + area
}

}

roc_plot <- ggplot(roc_tester, aes(x = o_m_specificity, y = sensitivity)) +
  geom_point() +
  geom_abline(slope=1) +
  labs(x="False Positive Rate (1-specificity)", y="True Positive Rate (sensitivity)", title="ROC Curve")
  

return(list(roc_plot=roc_plot, auc_val=auc, cut_off_val=cut_off_val))
}
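The dist2d() helper called above is defined earlier in the report and is not reproduced in this excerpt. A minimal sketch of one common implementation is shown below, assuming dist2d(a, b, c) returns the perpendicular distance from point a to the line passing through points b and c (here, the chance line y = x):

# hypothetical dist2d(): perpendicular distance from point a to the line through b and c
dist2d <- function(a, b, c) {
  v1 <- b - c                       # direction vector of the line
  v2 <- a - b                       # vector from a point on the line to a
  m  <- cbind(v1, v2)               # 2x2 matrix; |det| is the area of the parallelogram
  abs(det(m)) / sqrt(sum(v1^2))     # area divided by base length gives the distance
}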

pred_logit1 <- predict(logit1, newdata=test1, type="response")

roc_vals <- roc(pred_logit1, test1)
roc_vals$roc_plot
print(roc_vals$cut_off_val)
print(roc_vals$auc_val)

pred_logit1[pred_logit1>=0.32] <- 1
pred_logit1[pred_logit1<0.32] <- 0

table(pred=round(pred_logit1), true=test1$TARGET_FLAG)

accuracy1 <- acc(round(pred_logit1), test1)
error1 <- err(round(pred_logit1), test1)
precision1 <- prec(round(pred_logit1), test1)
sensitivity1 <- sens(round(pred_logit1), test1)
specificity1 <- spec(round(pred_logit1), test1)
f11 <- f1(round(pred_logit1), test1)

stat1 <- data.frame(list(accuracy=accuracy1, error=error1, precision=precision1, sensitivity=sensitivity1, specificity=specificity1, f1=f11))
print(stat1)
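The acc(), err(), prec(), sens(), spec(), and f1() helpers used above are defined earlier in the report and are not reproduced in this excerpt. For reference, a minimal sketch of the equivalent calculations from a confusion table is given below; the function name, argument names, and argument order are illustrative and may differ from the original helpers:

# illustrative classification metrics computed from 0/1 predictions and true labels
class_metrics <- function(pred, truth) {
  tp <- sum(pred == 1 & truth == 1)
  tn <- sum(pred == 0 & truth == 0)
  fp <- sum(pred == 1 & truth == 0)
  fn <- sum(pred == 0 & truth == 1)
  accuracy    <- (tp + tn) / length(truth)
  precision   <- tp / (tp + fp)
  sensitivity <- tp / (tp + fn)      # true positive rate
  specificity <- tn / (tn + fp)      # true negative rate
  list(accuracy = accuracy, error = 1 - accuracy, precision = precision,
       sensitivity = sensitivity, specificity = specificity,
       f1 = 2 * precision * sensitivity / (precision + sensitivity))
}

# example usage: class_metrics(round(pred_logit1), test1$TARGET_FLAG)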

set.seed(15)
n <- nrow(insur_df)
shuffle_df2 <- insur_df[sample(n),]
train_indeces <- 1:round(0.6*n)
train2 <- shuffle_df2[train_indeces,]
test_indeces <- (round(0.6*n)+1):n
test2 <- shuffle_df2[test_indeces,]

logit2 <- glm(TARGET_FLAG ~ ., data=train2, family=binomial(link="logit"))
logit2 <- update(logit2, .~. -TARGET_AMT, data = train2, family=binomial(link="logit"))
logit2 <- update(logit2, .~. -INDEX, data = train2, family=binomial(link="logit"))
summary(logit2)

logit2 <- update(logit2, .~. -`z_Blue Collar`, data = train2, family=binomial(link="logit"))
summary(logit2)

logit2 <- update(logit2, .~. -`Home Maker`, data = train2, family=binomial(link="logit"))
summary(logit2)

logit2 <- update(logit2, .~. -CAR_AGE, data = train2, family=binomial(link="logit"))
summary(logit2)

logit2 <- update(logit2, .~. -SEX, data = train2, family=binomial(link="logit"))
summary(logit2)

logit2 <- update(logit2, .~. -AGE, data = train2, family=binomial(link="logit"))
summary(logit2)

logit2 <- update(logit2, .~. -Lawyer, data = train2, family=binomial(link="logit"))
summary(logit2)

logit2 <- update(logit2, .~. -RED_CAR, data = train2, family=binomial(link="logit"))
summary(logit2)

logit2 <- update(logit2, .~. -HOME_VAL, data = train2, family=binomial(link="logit"))
summary(logit2)

logit2 <- update(logit2, .~. -Clerical, data = train2, family=binomial(link="logit"))
summary(logit2)

logit2 <- update(logit2, .~. -HOMEKIDS, data = train2, family=binomial(link="logit"))
summary(logit2)

logit2 <- update(logit2, .~. -YOJ, data = train2, family=binomial(link="logit"))
summary(logit2)

logit2 <- update(logit2, .~. -OLDCLAIM, data = train2, family=binomial(link="logit"))
summary(logit2)

logit2 <- update(logit2, .~. -Doctor, data = train2, family=binomial(link="logit"))
summary(logit2)
logitscalar2 <- mean(dlogis(predict(logit2, type="link")))
logitscalar2*coef(logit2)
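The scaling above is the standard average-partial-effects approximation for a logit model: each coefficient is multiplied by the mean logistic density of the linear predictor, giving the approximate change in crash probability for a one-unit change in that predictor. A general-purpose sketch of the same calculation (the helper name is illustrative and not part of the original code):

# approximate average marginal effects for a fitted logistic regression
avg_marginal_effects <- function(fit) {
  mean(dlogis(predict(fit, type = "link"))) * coef(fit)
}
avg_marginal_effects(logit2)   # same quantity as logitscalar2*coef(logit2) above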

pred_logit2 <- predict(logit2, newdata=test2, type="response")

roc_vals2 <- roc(pred_logit2, test2)
roc_vals2$roc_plot
print(roc_vals2$cut_off_val)
print(roc_vals2$auc_val)

pred_logit2[pred_logit2>=0.33] <- 1
pred_logit2[pred_logit2<0.33] <- 0

table(pred=round(pred_logit2), true=test2$TARGET_FLAG)

accuracy2 <- acc(round(pred_logit2), test2)
error2 <- err(round(pred_logit2), test2)
precision2 <- prec(round(pred_logit2), test2)
sensitivity2 <- sens(round(pred_logit2), test2)
specificity2 <- spec(round(pred_logit2), test2)
f12 <- f1(round(pred_logit2), test2)

stat2 <- data.frame(list(accuracy=accuracy2, error=error2, precision=precision2, sensitivity=sensitivity2, specificity=specificity2, f1=f12))
print(stat2)

insur_df3 <- insur_df1

set.seed(1)
n <- nrow(insur_df3)
shuffle_df3 <- insur_df3[sample(n),]
train_indeces <- 1:round(0.6*n)
train3 <- shuffle_df3[train_indeces,]
test_indeces <- (round(0.6*n)+1):n
test3 <- shuffle_df3[test_indeces,]

logit3 <- glm(TARGET_FLAG ~ ., data=train3, family=binomial(link="logit"))
logit3 <- update(logit3, .~. -TARGET_AMT, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -`Panel Truck`, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -oldclaim_freq, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -MSTATUS, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -homekids_age, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -BLUEBOOK, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -TIF, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -REVOKED, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -CAR_AGE, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -Van, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -Pickup, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -MVR_PTS, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -KIDSDRIV, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -YOJ, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -`z_Blue Collar`, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -sex_redcar_suv, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -Manager, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -Doctor, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -income_homeval_educ, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -Lawyer, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -TRAVTIME, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -CAR_USE, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -Professional, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -`Sports Car`, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -URBANICITY, data = train3, family=binomial(link="logit"))
summary(logit3)

logit3 <- update(logit3, .~. -`Home Maker`, data = train3, family=binomial(link="logit"))
logitscalar3 <- mean(dlogis(predict(logit3, type="link")))
logitscalar3*coef(logit3)

pred_logit3 <- predict(logit3, newdata=test3, type="response")

roc_vals3 <- roc(pred_logit3, test3)
roc_vals3$roc_plot
print(roc_vals3$cut_off_val)
print(roc_vals3$auc_val)

pred_logit3[pred_logit3>=0.23] <- 1
pred_logit3[pred_logit3<0.23] <- 0

table(pred=round(pred_logit3), true=test3$TARGET_FLAG)

accuracy3 <- acc(round(pred_logit3), test3)
error3 <- err(round(pred_logit3), test3)
precision3 <- prec(round(pred_logit3), test3)
sensitivity3 <- sens(round(pred_logit3), test3)
specificity3 <- spec(round(pred_logit3), test3)
f13 <- f1(round(pred_logit3), test3)

stat3 <- data.frame(list(accuracy=accuracy3, error=error3, precision=precision3, sensitivity=sensitivity3, specificity=specificity3, f1=f13))
print(stat3)

train2a <- train2
train2a <- subset(train2a, select=-c(INDEX, TARGET_FLAG, `z_Blue Collar`, `Home Maker`, CAR_AGE, SEX, AGE, Lawyer, RED_CAR, HOME_VAL, Clerical, HOMEKIDS, YOJ, Doctor, OLDCLAIM))

claim_lm <- lm(TARGET_AMT ~ ., data=train2a)
summary(claim_lm)

claim_lm <- update(claim_lm, .~. -Van, data = train2a)
summary(claim_lm)

claim_lm <- update(claim_lm, .~. -BLUEBOOK, data = train2a)
summary(claim_lm)

claim_lm <- update(claim_lm, .~. -Pickup, data = train2a)
summary(claim_lm)

claim_lm <- update(claim_lm, .~. -CLM_FREQ, data = train2a)
summary(claim_lm)

plot(fitted(claim_lm), resid(claim_lm))
qqnorm(resid(claim_lm))
qqline(resid(claim_lm))

predictclaim2a <- predict(claim_lm, newdata=test2, type="response")
predictclaim2a[pred_logit2==0] <- 0   # no predicted crash, so no predicted claim cost
predictclaim2a[predictclaim2a<0] <- 0 # a claim amount cannot be negative

error <- predictclaim2a-test2$TARGET_AMT

rmse <- sqrt(mean(error^2))
rmse

train2b <- train2
train2b <- subset(train2b, select=-c(INDEX, TARGET_FLAG, `z_Blue Collar`, `Home Maker`, CAR_AGE, SEX, AGE, Lawyer, RED_CAR, HOME_VAL, Clerical, HOMEKIDS, YOJ, Doctor, OLDCLAIM))
train2b[train2b==0] <- 0.000001   # replace zeros so the log transform is defined
claim_lm2 <- lm(log(TARGET_AMT) ~ ., data=train2b)
summary(claim_lm2)

plot(fitted(claim_lm2), resid(claim_lm2))
qqnorm(resid(claim_lm2))
qqline(resid(claim_lm2))

predictclaim2b <- predict(claim_lm2, newdata=test2, type="response")
predictclaim2b <- exp(predictclaim2b)   # back-transform from the log scale
predictclaim2b[pred_logit2==0] <- 0
predictclaim2b[predictclaim2b<0] <- 0

error2 <- predictclaim2b-test2$TARGET_AMT

rmse2 <- sqrt(mean(error2^2))
rmse2

eval_data <- read.csv('https://raw.githubusercontent.com/swigodsky/Data621/master/insurance-evaluation-data.csv', stringsAsFactors = FALSE)

eval_data$INCOME <- as.numeric(gsub("\\D+","",eval_data$INCOME))
eval_data$HOME_VAL <- as.numeric(gsub("\\D+","",eval_data$HOME_VAL))
eval_data$BLUEBOOK <- as.numeric(gsub("\\D+","",eval_data$BLUEBOOK))
eval_data$OLDCLAIM <- as.numeric(gsub("\\D+","",eval_data$OLDCLAIM))

eval_data[eval_data=="No"] <- 0
eval_data[eval_data=="Yes"] <- 1
eval_data[eval_data=="no"] <- 0
eval_data[eval_data=="yes"] <- 1
eval_data$MSTATUS[eval_data$MSTATUS=="z_No"] <- 0
eval_data$SEX[eval_data$SEX=="M"] <- 0
eval_data$SEX[eval_data$SEX=="z_F"] <- 1
eval_data$CAR_USE[eval_data$CAR_USE=="Private"] <- 0
eval_data$CAR_USE[eval_data$CAR_USE=="Commercial"] <- 1
eval_data$URBANICITY[eval_data$URBANICITY=="z_Highly Rural/ Rural"] <- 0
eval_data$URBANICITY[eval_data$URBANICITY=="Highly Urban/ Urban"] <- 1
eval_data$PARENT1 <- as.numeric(eval_data$PARENT1)
eval_data$MSTATUS <- as.numeric(eval_data$MSTATUS)
eval_data$SEX <- as.numeric(eval_data$SEX)
eval_data$CAR_USE <- as.numeric(eval_data$CAR_USE)
eval_data$RED_CAR <- as.numeric(eval_data$RED_CAR)
eval_data$REVOKED <- as.numeric(eval_data$REVOKED)
eval_data$URBANICITY <- as.numeric(eval_data$URBANICITY)

eval_data$EDUCATION[eval_data$EDUCATION == "
pred_flg[pred_flg>=0.33] <- 1
pred_flg[pred_flg<0.33] <- 0
pred_flg

predclm <- predict(claim_lm, newdata=eval_data, type="response")
predclm[pred_flg==0] <- 0   # no predicted crash, so no predicted claim cost
predclm[predclm<0] <- 0     # a claim amount cannot be negative
predclm