Transfer the data and the .Rmd file to one folder and set the working directory there. If needed, install the R.matlab and caret packages and load them. R.matlab handles the original .mat file (a sketch of that conversion step follows the package loading below); we then read the first 10 rows of the resulting CSV to check the X_TRAIN dataset.
setwd("D:/Data_Science_Projects/Image_Classification")
#install.packages(c('caret'))
library(R.matlab)
## Warning: package 'R.matlab' was built under R version 3.2.3
## R.matlab v3.3.0 (2015-09-22) successfully loaded. See ?R.matlab for help.
##
## Attaching package: 'R.matlab'
##
## The following objects are masked from 'package:base':
##
## getOption, isOpen
library(e1071) # provides skewness(), used later
library(caret)
## Warning: package 'caret' was built under R version 3.2.3
## Loading required package: lattice
## Loading required package: ggplot2
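The .mat-to-CSV conversion itself is not shown in this report. A minimal sketch of how it could look, assuming the MATLAB file is named X_TRAIN.mat and holds a single matrix (both names are placeholders, not confirmed by the original analysis):
# Hypothetical conversion step; the file and element names are assumptions
raw <- readMat("X_TRAIN.mat")        # returns a named list of the MATLAB objects
str(raw, max.level = 1)              # inspect what the .mat file contains
write.csv(as.data.frame(raw[[1]]),   # flatten the first element into a data frame
          "MyData.csv", row.names = FALSE)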
data <- read.csv("Mydata.csv", header=FALSE, nrows=10)
for(i in colnames(data)){
  data[,i] <- as.numeric(data[,i])
}
integer <- as.vector(lapply(data, class) == "integer")
numeric <- as.vector(lapply(data, class) == "numeric")
sum(integer)
## [1] 0
sum(numeric)
## [1] 1163
if (isTRUE(sum(integer)+sum(numeric)==1163)){
  print("You can go on doing the magic tricks, since all variables in the X_TRAIN dataset are numeric or integer")
  rm(integer)
  rm(numeric)
} else print("you are stupid")
## [1] "You can go on doing the magic tricks, since all variables in the X_TRAIN dataset are numeric or integer"
data <- read.csv("MyData.csv", header=TRUE)
for(i in colnames(data)){
  data[,i] <- as.numeric(data[,i])
}
if(isTRUE(sum(as.vector(lapply(data,class)=="numeric"))==1163)){
  print("Go on and process the data")
}
## [1] "Go on and process the data"
rm(i)
data[,1] <- as.factor(data[,1]) # first column is the class label
sum(duplicated(data)==TRUE) #2 records found. Impressive.
## [1] 2
data <- unique(data)
Who would have expected it: two duplicated records were found. These were removed using the handy "unique" function.
sum(is.na(data)) #No missing values, hence no further action on this.
## [1] 0
Benchmark modelling will be used to check each data transformation against a baseline dataset, defined as the data after near-zero-variance filtering, duplicate removal, and the missing-values check (the near-zero-variance step is sketched below).
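The near-zero-variance filtering itself is not shown above; a minimal sketch with caret's nearZeroVar(), using its default frequency and uniqueness cut-offs:
# Drop predictors with (near) zero variance; column 1 is the class label
nzv <- nearZeroVar(data[, -1])
if (length(nzv) > 0) data <- data[, -(nzv + 1)]  # +1 offsets the label column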
fitControl <- trainControl(## 5-fold CV
                           method = "repeatedcv",
                           number = 5,
                           #classProbs = TRUE,
                           ## repeated once
                           repeats = 1)
gbmGrid <- expand.grid(interaction.depth = 2,
                       n.trees = 300,
                       shrinkage = 0.15,
                       n.minobsinnode = 10)
gbmFit <- train(V1~., data = data,
                method = "gbm",
                trControl = fitControl,
                verbose = TRUE,
                tuneGrid = gbmGrid)
## Loading required package: gbm
## Loading required package: survival
##
## Attaching package: 'survival'
##
## The following object is masked from 'package:caret':
##
## cluster
##
## Loading required package: splines
## Loading required package: parallel
## Loaded gbm 2.1.1
## Loading required package: plyr
## (verbose gbm traces omitted: in each of the five CV folds and the final
## model fit, train deviance falls from 1.3863 to roughly 0.10-0.13 over the
## 300 boosting iterations)
gbmFit
## Stochastic Gradient Boosting
##
## 5998 samples
## 1162 predictors
## 4 classes: '1', '2', '3', '4'
##
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 1 times)
## Summary of sample sizes: 4799, 4797, 4799, 4798, 4799
## Resampling results
##
## Accuracy Kappa Accuracy SD Kappa SD
## 0.8551177 0.7953708 0.01298251 0.01796699
##
## Tuning parameter 'n.trees' was held constant at a value of 300
## Tuning parameter 'interaction.depth' was held constant at a value of 2
## Tuning parameter 'shrinkage' was held constant at a value of 0.15
## Tuning parameter 'n.minobsinnode' was held constant at a value of 10
##
Trans <- preProcess(data[,2:1163],
                    method = c("center", "scale"),
                    thresh = 0.95,   # only used by method = "pca"; harmless here
                    numUnique = 3,   # only used by method = "BoxCox"; harmless here
                    verbose = TRUE)
## final pre-processing options:
## $center
## (names of all 1162 predictors, from "V74" through "V36756", omitted)
##
## $scale
## (identical to the $center list; omitted)
##
## $ignore
## character(0)
##
##
## Calculating 1162 means for centering
## Calculating 1162 standard deviations for scaling
data[,2:1163] <- predict(Trans, data[,2:1163])
data <- cbind(data[,1], data[,2:1163])  # reassemble (effectively a no-op)
names(data)[1] <- "V1"                  # cbind drops the original column name
gbmFit_SC <- train(V1~., data = data,
                   method = "gbm",
                   trControl = fitControl,
                   verbose = TRUE,
                   tuneGrid = gbmGrid)
## (verbose gbm traces omitted: in each of the five CV folds and the final
## model fit, train deviance falls from 1.3863 to roughly 0.10-0.13 over the
## 300 boosting iterations)
gbmFit_SC
## Stochastic Gradient Boosting
##
## 5998 samples
## 1162 predictors
## 4 classes: '1', '2', '3', '4'
##
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 1 times)
## Summary of sample sizes: 4799, 4799, 4797, 4800, 4797
## Resampling results
##
## Accuracy Kappa Accuracy SD Kappa SD
## 0.8529496 0.7922083 0.004334312 0.006126696
##
## Tuning parameter 'n.trees' was held constant at a value of 300
## Tuning parameter 'interaction.depth' was held constant at a value of 2
## Tuning parameter 'shrinkage' was held constant at a value of 0.15
## Tuning parameter 'n.minobsinnode' was held constant at a value of 10
##
No significant improvement was observed: centering and scaling gave 0.8529496 (SD 0.004334312) versus the baseline's 0.8551177 (SD 0.01298251), i.e. the two models' performance stayed within standard-deviation limits of each other. A sketch of how to make this comparison explicit follows.
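A minimal sketch of the comparison with caret's resamples() (note that a strictly paired comparison would require identical CV folds, e.g. via set.seed() before each train() call or the index argument of trainControl()):
# Collect the cross-validation results of both fits and compare them
resamps <- resamples(list(baseline = gbmFit, scaled = gbmFit_SC))
summary(resamps)        # accuracy and kappa distributions per model
summary(diff(resamps))  # paired differences between the two models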
skewness <- lapply(data[,2:1163], skewness)
skewness <- as.data.frame(as.matrix(skewness))
# sum(), not length(): length(skewness$V1 > 2) is always 1162, whatever the values
if (sum(unlist(skewness$V1) > 2) == 1162){
  print("All variables are skewed, and could do with a BoxCox transformation, don't you think?")
}
## [1] "All variables are skewed, and could do with a BoxCox transformation, don't you think?"
All variables show significant skewness, so a BoxCox transformation can be applied; a GBM with the same parameter tuning is then fitted to check the effect of the transformation. One caveat: after centering and scaling, most columns contain non-positive values, and BoxCoxTrans() estimates no lambda for such columns (it returns them unchanged), so the transformation may be a no-op for many variables.
system.time(for (i in 2:1163){
  Trans <- BoxCoxTrans(data[,i])  # estimates lambda only if the column is all-positive
  data[,i] <- predict(Trans, data[,i])
})
## user system elapsed
## 3.81 0.05 4.07
gbmFit_BC <- train(V1~., data = data,
                   method = "gbm",
                   trControl = fitControl,
                   verbose = TRUE,
                   tuneGrid = gbmGrid)
## (verbose gbm traces omitted: in each of the five CV folds and the final
## model fit, train deviance falls from 1.3863 to roughly 0.10-0.13 over the
## 300 boosting iterations)
gbmFit_BC
## Stochastic Gradient Boosting
##
## 5998 samples
## 1162 predictors
## 4 classes: '1', '2', '3', '4'
##
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 1 times)
## Summary of sample sizes: 4797, 4797, 4799, 4800, 4799
## Resampling results
##
## Accuracy Kappa Accuracy SD Kappa SD
## 0.8514479 0.7901417 0.01437934 0.02041485
##
## Tuning parameter 'n.trees' was held constant at a value of 300
## Tuning parameter 'interaction.depth' was held constant at a value of 2
## Tuning parameter 'shrinkage' was held constant at a value of 0.15
## Tuning parameter 'n.minobsinnode' was held constant at a value of 10
##
Contrary to the hope of an improvement, accuracy dipped slightly to 0.8514479 (SD 0.01437934) versus the 0.8551177 (SD 0.01298251) baseline, and the spread did not shrink: the BoxCox transformation brought no measurable gain here (consistent with many variables being skipped, as noted above).
## Correlations
descrCor <- cor(data[,2:1163])
summary(descrCor[upper.tri(descrCor)])
## Min. 1st Qu. Median Mean 3rd Qu. Max.
## -0.148000 -0.017220 0.004249 0.015190 0.033370 0.850700
highlyCorDescr <- findCorrelation(descrCor, cutoff = .7)
data <- data[,-(highlyCorDescr + 1)] # +1 offset: the indices refer to data[,2:1163]; column 1 is the label
descrCor2 <- cor(data[,2:ncol(data)])
summary(descrCor2[upper.tri(descrCor2)])
## Min. 1st Qu. Median Mean 3rd Qu. Max.
## -0.148000 -0.017240 0.003997 0.014510 0.032580 0.782600
rm(descrCor)
rm(descrCor2)
rm(highlyCorDescr)
rm(i)
rm(Trans)
rm(skewness)
gbmFit_cor <- train(V1~., data = data,
                    method = "gbm",
                    trControl = fitControl,
                    verbose = TRUE,
                    tuneGrid = gbmGrid)
## (verbose gbm traces omitted: in each of the five CV folds and the final
## model fit, train deviance falls from 1.3863 to roughly 0.10-0.13 over the
## 300 boosting iterations)
gbmFit_cor
## Stochastic Gradient Boosting
##
## 5998 samples
## 1138 predictors
## 4 classes: '1', '2', '3', '4'
##
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 1 times)
## Summary of sample sizes: 4800, 4798, 4799, 4798, 4797
## Resampling results
##
## Accuracy Kappa Accuracy SD Kappa SD
## 0.8521195 0.791234 0.01149816 0.01645473
##
## Tuning parameter 'n.trees' was held constant at a value of 300
## Tuning parameter 'interaction.depth' was held constant at a value of 2
## Tuning parameter 'shrinkage' was held constant at a value of 0.15
## Tuning parameter 'n.minobsinnode' was held constant at a value of 10
##
Again a marginal change with stable results: dropping 24 highly correlated predictors (1138 remain) gave accuracy 0.8521195 (SD 0.01149816), within the standard-deviation limits of the other runs. A consolidated comparison of the four benchmark models is sketched below. Thanks for now; we will talk again tomorrow.
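As a closing sketch, the four fits can be compared side by side with caret's resamples(), assuming all four objects are still in memory (the earlier caveat about identical CV folds applies here too):
# Side-by-side view of the four benchmark models
allResamps <- resamples(list(baseline = gbmFit,
                             scaled   = gbmFit_SC,
                             boxcox   = gbmFit_BC,
                             decorr   = gbmFit_cor))
summary(allResamps)   # accuracy and kappa distributions
dotplot(allResamps)   # lattice dotplot of the resampled metrics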