This material is for practising machine learning with the mlr package. If you want to learn more about mlr, please see this link.
In this tutorial, a KNN classification model is built to predict three classes using the diabetes dataset from the mclust package. We will also look at several cross-validation techniques for assessing the model and at tuning hyperparameters to improve it.
# Predicting diabetes
library(mclust) # Diabetes data from this package
library(mlr) # Machine learning in R
head(diabetes) # Dataset
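Before building a model, it is worth a quick look at the data. A minimal sketch (assuming the usual glucose, insulin and sspg columns of mclust's diabetes data):
str(diabetes)          # 145 patients: the class label plus three numeric predictors
table(diabetes$class)  # class balance: Normal, Chemical and Overt cases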
KNN_task<-makeClassifTask(data=diabetes, target="class") # Initialize the task and set response variable
KNN_learner<-makeLearner("classif.knn",par.vals = list("k"=2)) # Choose learner or technique we want to use.
# Train the model
KNN_train<-train(KNN_learner, KNN_task)
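The trained model can already be used for prediction. As a quick sanity check (optimistic, because it reuses the training data), we can predict back on the task and compute the same measures we will use below; the KNN_pred name here is just for illustration:
KNN_pred <- predict(KNN_train, task = KNN_task)    # predictions for every observation in the task
performance(KNN_pred, measures = list(mmce, acc))  # training-set error, not an honest estimate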
# Holdout cross-validation
set.seed(123)
KNN_holdout<-makeResampleDesc(method="Holdout", stratify = T, split=2/3)
KNN_cross_validation <- resample(learner = KNN_learner, task = KNN_task, resampling = KNN_holdout, measures = list(mmce, acc)) # Cross-validation
Resampling: holdout
Measures: mmce acc
[Resample] iter 1: 0.1836735 0.8163265
Aggregated Result: mmce.test.mean=0.1836735,acc.test.mean=0.8163265
# Calculate confusion matrix table
calculateConfusionMatrix(KNN_cross_validation$pred,relative = T)
Relative confusion matrix (normalized by row/column):
predicted
true Chemical Normal Overt -err.-
Chemical 0.67/0.67 0.17/0.08 0.17/0.17 0.33
Normal 0.08/0.17 0.88/0.92 0.04/0.08 0.12
Overt 0.18/0.17 0.00/0.00 0.82/0.75 0.18
-err.- 0.33 0.08 0.25 0.18
Absolute confusion matrix:
predicted
true Chemical Normal Overt -err.-
Chemical 8 2 2 4
Normal 2 23 1 3
Overt 2 0 9 2
-err.- 4 2 3 9
We see that our model has a relatively low overall accuracy of 81.6%, with a mean misclassification error (mmce, simply 1 minus accuracy) of 0.18. The confusion matrix shows that 67% of Chemical diabetes cases were correctly classified, while 17% were misclassified as Normal and another 17% as Overt.
The holdout estimate depends on a single random split of the data, so next we assess the same learner with repeated 10-fold cross-validation.
KNN_task<-makeClassifTask(data=diabetes, target="class") # Initialize the task and set response variable
KNN_learner<-makeLearner("classif.knn",par.vals = list("k"=2)) # Choose learner or technique we want to use.
# Train the model
KNN_train<-train(KNN_learner, KNN_task)
# Repeated 10-fold cross-validation (5 repeats)
set.seed(123)
KNN_10_folds<-makeResampleDesc(method="RepCV", stratify = T, folds=10, reps=5)
KNN_cross_validation <- resample(learner = KNN_learner, task = KNN_task, resampling = KNN_10_folds, measures = list(mmce, acc)) # Cross-validation
Resampling: repeated cross-validation
Measures: mmce acc
[Resample] iter 1: 0.1538462 0.8461538
[Resample] iter 2: 0.3125000 0.6875000
[Resample] iter 3: 0.1250000 0.8750000
[Resample] iter 4: 0.0666667 0.9333333
[Resample] iter 5: 0.0714286 0.9285714
[Resample] iter 6: 0.0000000 1.0000000
[Resample] iter 7: 0.0666667 0.9333333
[Resample] iter 8: 0.0714286 0.9285714
[Resample] iter 9: 0.1428571 0.8571429
[Resample] iter 10: 0.0714286 0.9285714
[Resample] iter 11: 0.0625000 0.9375000
[Resample] iter 12: 0.0000000 1.0000000
[Resample] iter 13: 0.2000000 0.8000000
[Resample] iter 14: 0.0666667 0.9333333
[Resample] iter 15: 0.3076923 0.6923077
[Resample] iter 16: 0.0714286 0.9285714
[Resample] iter 17: 0.0625000 0.9375000
[Resample] iter 18: 0.1333333 0.8666667
[Resample] iter 19: 0.1428571 0.8571429
[Resample] iter 20: 0.1428571 0.8571429
[Resample] iter 21: 0.0666667 0.9333333
[Resample] iter 22: 0.1428571 0.8571429
[Resample] iter 23: 0.0714286 0.9285714
[Resample] iter 24: 0.0714286 0.9285714
[Resample] iter 25: 0.2142857 0.7857143
[Resample] iter 26: 0.1333333 0.8666667
[Resample] iter 27: 0.0666667 0.9333333
[Resample] iter 28: 0.0000000 1.0000000
[Resample] iter 29: 0.1428571 0.8571429
[Resample] iter 30: 0.0000000 1.0000000
[Resample] iter 31: 0.2857143 0.7142857
[Resample] iter 32: 0.0714286 0.9285714
[Resample] iter 33: 0.1538462 0.8461538
[Resample] iter 34: 0.0000000 1.0000000
[Resample] iter 35: 0.2000000 0.8000000
[Resample] iter 36: 0.0666667 0.9333333
[Resample] iter 37: 0.0000000 1.0000000
[Resample] iter 38: 0.2000000 0.8000000
[Resample] iter 39: 0.0666667 0.9333333
[Resample] iter 40: 0.1428571 0.8571429
[Resample] iter 41: 0.0000000 1.0000000
[Resample] iter 42: 0.1428571 0.8571429
[Resample] iter 43: 0.1333333 0.8666667
[Resample] iter 44: 0.1428571 0.8571429
[Resample] iter 45: 0.0714286 0.9285714
[Resample] iter 46: 0.0000000 1.0000000
[Resample] iter 47: 0.1428571 0.8571429
[Resample] iter 48: 0.2857143 0.7142857
[Resample] iter 49: 0.1250000 0.8750000
[Resample] iter 50: 0.1538462 0.8461538
Aggregated Result: mmce.test.mean=0.1113251,acc.test.mean=0.8886749
# Calculate confusion matrix table
calculateConfusionMatrix(KNN_cross_validation$pred,relative = T)
Relative confusion matrix (normalized by row/column):
predicted
true Chemical Normal Overt -err.-
Chemical 0.81/0.76 0.12/0.06 0.07/0.09 0.19
Normal 0.04/0.08 0.96/0.94 0.00/0.00 0.04
Overt 0.18/0.15 0.00/0.00 0.82/0.91 0.18
-err.- 0.24 0.06 0.09 0.11
Absolute confusion matrix:
predicted
true Chemical Normal Overt -err.-
Chemical 145 22 13 35
Normal 16 364 0 16
Overt 29 0 136 29
-err.- 45 22 13 80
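Averaged over the 50 folds, the estimated accuracy rises to about 88.9% (mean misclassification error 0.11), which suggests the single holdout split above was somewhat pessimistic. Normal patients are almost never misclassified, while the Chemical class remains the hardest, with roughly 19% of its cases misclassified. If you want to see how much the estimate varies between folds, the per-fold measures are stored in the resample result (a sketch, assuming the standard measures.test slot):
summary(KNN_cross_validation$measures.test$acc) # spread of the 50 per-fold accuracies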
Next, we will use a grid search with cross-validation to find the best k for the model.
KNN_turning <- makeParamSet(makeDiscreteParam("k", values = 1:10)) # candidate values of k
cvForTuning <- makeResampleDesc("RepCV", folds = 10, reps = 10) # resampling used to score each candidate k
gridSearch <- makeTuneControlGrid() # try every value in the grid
tunedK <- tuneParams("classif.knn", task = KNN_task, resampling = cvForTuning, par.set = KNN_turning, control = gridSearch)
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1116190; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1108571; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0916667; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0890952; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0813810; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0813333; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0770476; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0880952; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0894762; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0860952; time: 0.0 min
[Tune] Result: k=7 : mmce.test.mean=0.0770476
tunedK
Tune result:
Op. pars: k=7
mmce.test.mean=0.0770476
knnTuningData <- generateHyperParsEffectData(tunedK)
plotHyperParsEffect(knnTuningData, x = "k", y = "mmce.test.mean", plot.type = "line") + theme_bw()
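If you prefer a table to the plot, the same information is available in the data slot of the object returned by generateHyperParsEffectData (a minimal sketch, assuming its default structure):
knnTuningData$data # one row per candidate k with its mean mmce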
We now know that k = 7 is the best value for this dataset, so we will build the final KNN model with k = 7.
tunedKnn <- setHyperPars(makeLearner("classif.knn"), par.vals = tunedK$x) # learner with the tuned k
tunedKnnModel <- train(tunedKnn, KNN_task) # final model trained on all the data
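The tuned model can now be used like any other mlr model, for example to predict a few rows of the data (just as an illustration, since these rows were part of the training data):
predict(tunedKnnModel, newdata = diabetes[1:3, ]) # predicted class, with truth since class is present
Note, however, that we chose k using the same data we then report performance on, which can make that estimate slightly optimistic. To account for the tuning step itself, we next use nested cross-validation: k is tuned in an inner cross-validation loop inside each fold of an outer repeated cross-validation loop.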
inner <- makeResampleDesc("CV") # inner loop: k-fold CV (mlr's default of 10 folds) used to choose k
outer <- makeResampleDesc("RepCV", folds = 10, reps = 5) # outer loop: repeated 10-fold CV for evaluation
knnWrapper <- makeTuneWrapper("classif.knn", resampling = inner, par.set = KNN_turning, control = gridSearch) # learner that tunes k internally
cvWithTuning <- resample(knnWrapper, KNN_task, resampling = outer) # nested cross-validation
Resampling: repeated cross-validation
Measures: mmce
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1137363; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1362637; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1054945; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1131868; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0978022; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0978022; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0901099; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1054945; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1054945; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0978022; time: 0.0 min
[Tune] Result: k=7 : mmce.test.mean=0.0901099
[Resample] iter 1: 0.0000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1307692; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0846154; time: 0.0 min
[Tune] Result: k=8 : mmce.test.mean=0.0692308
[Resample] iter 2: 0.1333333
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0923077; time: 0.0 min
[Tune] Result: k=2 : mmce.test.mean=0.0769231
[Resample] iter 3: 0.1333333
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0846154; time: 0.0 min
[Tune] Result: k=6 : mmce.test.mean=0.0769231
[Resample] iter 4: 0.0666667
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1148352; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0686813; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0686813; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0840659; time: 0.0 min
[Tune] Result: k=6 : mmce.test.mean=0.0686813
[Resample] iter 5: 0.1428571
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.1153846; time: 0.0 min
[Tune] Result: k=4 : mmce.test.mean=0.0846154
[Resample] iter 6: 0.1333333
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1362637; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1593407; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1054945; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0912088; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1208791; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0901099; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1054945; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0901099; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.1285714; time: 0.0 min
[Tune] Result: k=7 : mmce.test.mean=0.0901099
[Resample] iter 7: 0.0000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1230769; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1071429; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0769231; time: 0.0 min
[Tune] Result: k=2 : mmce.test.mean=0.0769231
[Resample] iter 8: 0.1428571
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.1153846; time: 0.0 min
[Tune] Result: k=7 : mmce.test.mean=0.0769231
[Resample] iter 9: 0.0666667
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0615385; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0538462; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0692308; time: 0.0 min
[Tune] Result: k=9 : mmce.test.mean=0.0538462
[Resample] iter 10: 0.0714286
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1071429; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0758242; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0989011; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0917582; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0917582; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0994505; time: 0.0 min
[Tune] Result: k=3 : mmce.test.mean=0.0758242
[Resample] iter 11: 0.1428571
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1307692; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1230769; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0846154; time: 0.0 min
[Tune] Result: k=3 : mmce.test.mean=0.0769231
[Resample] iter 12: 0.0000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0917582; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0912088; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0835165; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0829670; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0758242; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0912088; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0758242; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0983516; time: 0.0 min
[Tune] Result: k=9 : mmce.test.mean=0.0758242
[Resample] iter 13: 0.0714286
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0846154; time: 0.0 min
[Tune] Result: k=6 : mmce.test.mean=0.0769231
[Resample] iter 14: 0.0000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.1000000; time: 0.0 min
[Tune] Result: k=8 : mmce.test.mean=0.1000000
[Resample] iter 15: 0.0666667
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1296703; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0989011; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0989011; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.1137363; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1060440; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0906593; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1214286; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1060440; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0983516; time: 0.0 min
[Tune] Result: k=2 : mmce.test.mean=0.0840659
[Resample] iter 16: 0.0000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1065934; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1142857; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0835165; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0758242; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0681319; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0681319; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0835165; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0758242; time: 0.0 min
[Tune] Result: k=8 : mmce.test.mean=0.0681319
[Resample] iter 17: 0.1428571
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1065934; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1065934; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0681319; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0609890; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0758242; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0609890; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0686813; time: 0.0 min
[Tune] Result: k=9 : mmce.test.mean=0.0609890
[Resample] iter 18: 0.1428571
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0846154; time: 0.0 min
[Tune] Result: k=5 : mmce.test.mean=0.0692308
[Resample] iter 19: 0.1333333
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.1230769; time: 0.0 min
[Tune] Result: k=7 : mmce.test.mean=0.0846154
[Resample] iter 20: 0.0000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.0983516; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0835165; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0912088; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0983516; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0829670; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0906593; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0829670; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0983516; time: 0.0 min
[Tune] Result: k=3 : mmce.test.mean=0.0763736
[Resample] iter 21: 0.0714286
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0615385; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0769231; time: 0.0 min
[Tune] Result: k=4 : mmce.test.mean=0.0615385
[Resample] iter 22: 0.1333333
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1461538; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.1000000; time: 0.0 min
[Tune] Result: k=7 : mmce.test.mean=0.0846154
[Resample] iter 23: 0.0666667
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1148352; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1225275; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0989011; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1071429; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0989011; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0758242; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.1060440; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1137363; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1060440; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.1060440; time: 0.0 min
[Tune] Result: k=6 : mmce.test.mean=0.0758242
[Resample] iter 24: 0.0714286
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1060440; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0758242; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0835165; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0681319; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0912088; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0912088; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0758242; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0912088; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0681319; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0906593; time: 0.0 min
[Tune] Result: k=9 : mmce.test.mean=0.0681319
[Resample] iter 25: 0.2142857
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1384615; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1384615; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1230769; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0923077; time: 0.0 min
[Tune] Result: k=9 : mmce.test.mean=0.0692308
[Resample] iter 26: 0.0666667
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1142857; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1060440; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0829670; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0829670; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0906593; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0829670; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0752747; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0829670; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0758242; time: 0.0 min
[Tune] Result: k=8 : mmce.test.mean=0.0752747
[Resample] iter 27: 0.0714286
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0923077; time: 0.0 min
[Tune] Result: k=4 : mmce.test.mean=0.0692308
[Resample] iter 28: 0.1333333
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0846154; time: 0.0 min
[Tune] Result: k=8 : mmce.test.mean=0.0692308
[Resample] iter 29: 0.0000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1142857; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0912088; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0994505; time: 0.0 min
[Tune] Result: k=3 : mmce.test.mean=0.0763736
[Resample] iter 30: 0.0714286
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1071429; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0917582; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0686813; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0917582; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0989011; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0989011; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.1142857; time: 0.0 min
[Tune] Result: k=5 : mmce.test.mean=0.0686813
[Resample] iter 31: 0.1428571
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0917582; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0917582; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0917582; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0840659; time: 0.0 min
[Tune] Result: k=7 : mmce.test.mean=0.0692308
[Resample] iter 32: 0.1428571
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1307692; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0846154; time: 0.0 min
[Tune] Result: k=8 : mmce.test.mean=0.0692308
[Resample] iter 33: 0.0000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0917582; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0917582; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0917582; time: 0.0 min
[Tune] Result: k=1 : mmce.test.mean=0.0840659
[Resample] iter 34: 0.2857143
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1230769; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1461538; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.1000000; time: 0.0 min
[Tune] Result: k=7 : mmce.test.mean=0.0769231
[Resample] iter 35: 0.0000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0615385; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0686813; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0456044; time: 0.0 min
[Tune] Result: k=10 : mmce.test.mean=0.0456044
[Resample] iter 36: 0.2142857
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1071429; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1142857; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0917582; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1148352; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1142857; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0989011; time: 0.0 min
[Tune] Result: k=3 : mmce.test.mean=0.0840659
[Resample] iter 37: 0.0714286
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1307692; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0923077; time: 0.0 min
[Tune] Result: k=4 : mmce.test.mean=0.0769231
[Resample] iter 38: 0.0666667
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1307692; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.1000000; time: 0.0 min
[Tune] Result: k=7 : mmce.test.mean=0.0769231
[Resample] iter 39: 0.0000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0692308; time: 0.0 min
[Tune] Result: k=9 : mmce.test.mean=0.0692308
[Resample] iter 40: 0.2000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1065934; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0912088; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0989011; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1142857; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.1065934; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0758242; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0989011; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0912088; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0989011; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0912088; time: 0.0 min
[Tune] Result: k=6 : mmce.test.mean=0.0758242
[Resample] iter 41: 0.0000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.0983516; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0983516; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0906593; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1060440; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0835165; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0912088; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0835165; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0912088; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0989011; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.1065934; time: 0.0 min
[Tune] Result: k=5 : mmce.test.mean=0.0835165
[Resample] iter 42: 0.0000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1071429; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1071429; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.1000000; time: 0.0 min
[Tune] Result: k=7 : mmce.test.mean=0.0994505
[Resample] iter 43: 0.0714286
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1307692; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1230769; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1076923; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.1153846; time: 0.0 min
[Tune] Result: k=2 : mmce.test.mean=0.1000000
[Resample] iter 44: 0.1333333
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1148352; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1307692; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.1071429; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0917582; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1071429; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0994505; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0763736; time: 0.0 min
[Tune] Result: k=10 : mmce.test.mean=0.0763736
[Resample] iter 45: 0.0000000
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1384615; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1384615; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1307692; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0923077; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0692308; time: 0.0 min
[Tune] Result: k=10 : mmce.test.mean=0.0692308
[Resample] iter 46: 0.0666667
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.0917582; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1219780; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0686813; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0763736; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0686813; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0840659; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0686813; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0840659; time: 0.0 min
[Tune] Result: k=7 : mmce.test.mean=0.0686813
[Resample] iter 47: 0.1428571
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0923077; time: 0.0 min
[Tune] Result: k=7 : mmce.test.mean=0.0769231
[Resample] iter 48: 0.1333333
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0615385; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0615385; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0692308; time: 0.0 min
[Tune] Result: k=6 : mmce.test.mean=0.0615385
[Resample] iter 49: 0.2666667
[Tune] Started tuning learner classif.knn for parameter set:
With control class: TuneControlGrid
Imputation value: 1
[Tune-x] 1: k=1
[Tune-y] 1: mmce.test.mean=0.1153846; time: 0.0 min
[Tune-x] 2: k=2
[Tune-y] 2: mmce.test.mean=0.1230769; time: 0.0 min
[Tune-x] 3: k=3
[Tune-y] 3: mmce.test.mean=0.1000000; time: 0.0 min
[Tune-x] 4: k=4
[Tune-y] 4: mmce.test.mean=0.0846154; time: 0.0 min
[Tune-x] 5: k=5
[Tune-y] 5: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 6: k=6
[Tune-y] 6: mmce.test.mean=0.0692308; time: 0.0 min
[Tune-x] 7: k=7
[Tune-y] 7: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 8: k=8
[Tune-y] 8: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 9: k=9
[Tune-y] 9: mmce.test.mean=0.0769231; time: 0.0 min
[Tune-x] 10: k=10
[Tune-y] 10: mmce.test.mean=0.0923077; time: 0.0 min
[Tune] Result: k=6 : mmce.test.mean=0.0692308
[Resample] iter 50: 0.0000000
Aggregated Result: mmce.test.mean=0.0885714
cvWithTuning
Resample Result
Task: diabetes
Learner: classif.knn.tuned
Aggr perf: mmce.test.mean=0.0885714
Runtime: 35.7693
calculateConfusionMatrix(cvWithTuning$pred,relative = T)
Relative confusion matrix (normalized by row/column):
predicted
true Chemical Normal Overt -err.-
Chemical 0.84/0.81 0.13/0.06 0.03/0.03 0.16
Normal 0.03/0.06 0.97/0.94 0.00/0.00 0.03
Overt 0.15/0.13 0.00/0.00 0.85/0.97 0.15
-err.- 0.19 0.06 0.03 0.09
Absolute confusion matrix:
predicted
true Chemical Normal Overt -err.-
Chemical 151 24 5 29
Normal 11 369 0 11
Overt 24 0 141 24
-err.- 35 24 5 64
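With tuning wrapped inside the outer cross-validation loop, the estimated mean misclassification error is about 0.089 (roughly 91% accuracy), slightly higher than the non-nested estimate of 0.077, as expected. The confusion matrix shows the familiar pattern: Normal patients are classified almost perfectly, about 13% of Chemical cases are labelled Normal, and about 15% of Overt cases are labelled Chemical.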