The chunk option below tells R Markdown to keep knitting and show the output even if a chunk throws an error.
knitr::opts_chunk$set(error = TRUE,cache.extra = knitr::rand_seed)
Let's load the required packages. (Model-specific packages such as randomForest, xgboost, fastAdaboost and C50 only need to be installed; caret loads them on demand when train() is called.)
library("caret", "skimr")
## Loading required package: lattice
## Loading required package: ggplot2
## Registered S3 methods overwritten by 'ggplot2':
## method from
## [.quosures rlang
## c.quosures rlang
## print.quosures rlang
library("RANN", "randomForest", "fastAdaboost")
## Warning: package 'RANN' was built under R version 3.6.1
## Warning in library("RANN", "randomForest", "fastAdaboost"): 'fastAdaboost'
## not found on search path, using pos = 2
library("gbm", "xgboost")
## Warning: package 'gbm' was built under R version 3.6.1
## Loaded gbm 2.1.5
library("caretEnsemble", "C50")
## Warning: package 'caretEnsemble' was built under R version 3.6.3
##
## Attaching package: 'caretEnsemble'
## The following object is masked from 'package:ggplot2':
##
## autoplot
library("earth")
## Warning: package 'earth' was built under R version 3.6.3
## Loading required package: Formula
## Loading required package: plotmo
## Warning: package 'plotmo' was built under R version 3.6.3
## Loading required package: plotrix
## Loading required package: TeachingDemos
## Warning: package 'TeachingDemos' was built under R version 3.6.3
Import Dataset
orange <- read.csv('https://raw.githubusercontent.com/selva86/datasets/master/orange_juice_withmissing.csv')
Let's look at the structure of the data frame.
str(orange)
## 'data.frame': 1070 obs. of 18 variables:
## $ Purchase : Factor w/ 2 levels "CH","MM": 1 1 1 2 1 1 1 1 1 1 ...
## $ WeekofPurchase: int 237 239 245 227 228 230 232 234 235 238 ...
## $ StoreID : int 1 1 1 1 7 7 7 7 7 7 ...
## $ PriceCH : num 1.75 1.75 1.86 1.69 1.69 1.69 1.69 1.75 1.75 1.75 ...
## $ PriceMM : num 1.99 1.99 2.09 1.69 1.69 1.99 1.99 1.99 1.99 1.99 ...
## $ DiscCH : num 0 0 0.17 0 0 0 0 0 0 0 ...
## $ DiscMM : num 0 0.3 0 0 0 0 0.4 0.4 0.4 0.4 ...
## $ SpecialCH : int 0 0 0 0 0 0 1 1 0 0 ...
## $ SpecialMM : int 0 1 0 0 0 1 1 0 0 0 ...
## $ LoyalCH : num 0.5 0.6 0.68 0.4 0.957 ...
## $ SalePriceMM : num 1.99 1.69 2.09 1.69 1.69 1.99 1.59 1.59 1.59 1.59 ...
## $ SalePriceCH : num 1.75 1.75 1.69 1.69 1.69 1.69 1.69 1.75 1.75 1.75 ...
## $ PriceDiff : num 0.24 -0.06 0.4 0 0 0.3 -0.1 -0.16 -0.16 -0.16 ...
## $ Store7 : Factor w/ 2 levels "No","Yes": 1 1 1 1 2 2 2 2 2 2 ...
## $ PctDiscMM : num 0 0.151 0 0 0 ...
## $ PctDiscCH : num 0 0 0.0914 0 0 ...
## $ ListPriceDiff : num 0.24 0.24 0.23 0 0 0.3 0.3 0.24 0.24 0.24 ...
## $ STORE : int 1 1 1 1 0 0 0 0 0 0 ...
head(orange[, 1:10])
## Purchase WeekofPurchase StoreID PriceCH PriceMM DiscCH DiscMM SpecialCH
## 1 CH 237 1 1.75 1.99 0.00 0.0 0
## 2 CH 239 1 1.75 1.99 0.00 0.3 0
## 3 CH 245 1 1.86 2.09 0.17 0.0 0
## 4 MM 227 1 1.69 1.69 0.00 0.0 0
## 5 CH 228 7 1.69 1.69 0.00 0.0 0
## 6 CH 230 7 1.69 1.99 0.00 0.0 0
## SpecialMM LoyalCH
## 1 0 0.500000
## 2 1 0.600000
## 3 0 0.680000
## 4 0 0.400000
## 5 0 0.956535
## 6 1 0.965228
We will split the data into training (80%) and test (20%) sets using caret's createDataPartition() function. It preserves the proportion of the categories of the Y variable, which we pass as the first argument.
Setting the seed
set.seed(100)
trainRowNumbers <- createDataPartition(orange$Purchase, p=0.8, list=FALSE)
createDataPartition() takes the Y variable of the source dataset as input and, as the p argument, the percentage of data that should go into training. It returns the row numbers that should form the training dataset. You also need to set list=FALSE to prevent the result from being returned as a list.
trainData <- orange[trainRowNumbers,]
testData <- orange[-trainRowNumbers,]
x = trainData[, 2:18]
y = trainData$Purchase
Let's look at a summary of the data using skimr.
library(skimr)
## Warning: package 'skimr' was built under R version 3.6.3
skimmed <- skim(trainData)
a <- skimmed[, c(1:4,6,8,9:10,12,14:15)]
a
| Name                   | trainData |
|---|---|
| Number of rows         | 857 |
| Number of columns      | 18 |
| Column type frequency: |     |
| factor                 | 2 |
| numeric                | 16 |
| Group variables        | None |
Variable type: factor
| skim_variable | n_missing | complete_rate | n_unique |
|---|---|---|---|
| Purchase | 0 | 1 | 2 |
| Store7 | 0 | 1 | 2 |
Variable type: numeric
| skim_variable | n_missing | complete_rate | mean | sd | p0 | p50 | p100 | hist |
|---|---|---|---|---|---|---|---|---|
| WeekofPurchase | 0 | 1.00 | 254.16 | 15.64 | 227.00 | 256.00 | 278.00 | ▇▅▅▇▇ |
| StoreID | 1 | 1.00 | 4.01 | 2.33 | 1.00 | 3.00 | 7.00 | ▇▅▃▁▇ |
| PriceCH | 0 | 1.00 | 1.87 | 0.10 | 1.69 | 1.86 | 2.09 | ▅▂▇▆▁ |
| PriceMM | 2 | 1.00 | 2.08 | 0.14 | 1.69 | 2.09 | 2.29 | ▂▁▃▇▆ |
| DiscCH | 1 | 1.00 | 0.05 | 0.12 | 0.00 | 0.00 | 0.50 | ▇▁▁▁▁ |
| DiscMM | 4 | 1.00 | 0.13 | 0.22 | 0.00 | 0.00 | 0.80 | ▇▁▁▁▁ |
| SpecialCH | 2 | 1.00 | 0.15 | 0.36 | 0.00 | 0.00 | 1.00 | ▇▁▁▁▂ |
| SpecialMM | 5 | 0.99 | 0.17 | 0.37 | 0.00 | 0.00 | 1.00 | ▇▁▁▁▂ |
| LoyalCH | 3 | 1.00 | 0.56 | 0.31 | 0.00 | 0.60 | 1.00 | ▅▃▆▆▇ |
| SalePriceMM | 5 | 0.99 | 1.96 | 0.26 | 1.19 | 2.09 | 2.29 | ▁▂▂▂▇ |
| SalePriceCH | 1 | 1.00 | 1.81 | 0.15 | 1.39 | 1.86 | 2.09 | ▂▁▇▇▅ |
| PriceDiff | 0 | 1.00 | 0.14 | 0.27 | -0.67 | 0.23 | 0.64 | ▁▂▃▇▂ |
| PctDiscMM | 4 | 1.00 | 0.06 | 0.10 | 0.00 | 0.00 | 0.40 | ▇▁▁▁▁ |
| PctDiscCH | 2 | 1.00 | 0.03 | 0.06 | 0.00 | 0.00 | 0.25 | ▇▁▁▁▁ |
| ListPriceDiff | 0 | 1.00 | 0.22 | 0.11 | 0.00 | 0.24 | 0.44 | ▂▃▅▇▁ |
| STORE | 2 | 1.00 | 1.59 | 1.43 | 0.00 | 2.00 | 4.00 | ▇▃▅▅▃ |
Let's impute the missing values using k-nearest neighbors (knnImpute).
To predict the missing values with k-Nearest Neighbors using preProcess():
Set method='knnImpute' and apply it to the training data. This creates a preprocessing model. Then call predict() on that model, passing the same training data as the newdata argument.
# Create the knn imputation model on the training data
preProcess_missingdata_model <- preProcess(trainData, method='knnImpute')
preProcess_missingdata_model
## Created from 827 samples and 18 variables
##
## Pre-processing:
## - centered (16)
## - ignored (2)
## - 5 nearest neighbor imputation (16)
## - scaled (16)
The above output shows the various preprocessing steps done in the process of knn imputation.
That is, it has centered (subtract by mean) 16 variables, ignored 2, used k=5 (considered 5 nearest neighbors) to predict the missing values and finally scaled (divide by standard deviation) 16 variables.
Let’s now use this model to predict the missing values in trainData.
# Use the imputation model to predict the values of missing data points
library(RANN) # required for knnImpute
trainData <- predict(preProcess_missingdata_model, newdata = trainData)
anyNA(trainData)
## [1] FALSE
All the missing values are successfully imputed.
Let's create one-hot encodings (dummy variables) for the categorical variables. You should ensure the dummyVars model is built on the training data alone, and that the same model is then used to create the dummy variables on the test data.
In caret, one-hot encodings can be created using dummyVars(). Let's do the one-hot encoding.
dummies_model <- dummyVars(Purchase ~ ., data=trainData)
Create the dummy variables using predict. The Y variable (Purchase) will not be present in trainData_mat.
trainData_mat <- predict(dummies_model, newdata = trainData)
## Warning in model.frame.default(Terms, newdata, na.action = na.action, xlev
## = object$lvls): variable 'Purchase' is not a factor
Convert the result back to a data frame.
trainData <- data.frame(trainData_mat)
See the structure of the new dataset
str(trainData)
## 'data.frame': 857 obs. of 18 variables:
## $ WeekofPurchase: num -1.097 -0.969 -0.586 -1.737 -1.673 ...
## $ StoreID : num -1.29 -1.29 -1.29 -1.29 1.29 ...
## $ PriceCH : num -1.1422 -1.1422 -0.0592 -1.7329 -1.7329 ...
## $ PriceMM : num -0.6795 -0.6795 0.0498 -2.8676 -2.8676 ...
## $ DiscCH : num -0.444 -0.444 0.981 -0.444 -0.444 ...
## $ DiscMM : num -0.578 0.793 -0.578 -0.578 -0.578 ...
## $ SpecialCH : num -0.425 -0.425 -0.425 -0.425 -0.425 ...
## $ SpecialMM : num -0.447 2.235 -0.447 -0.447 -0.447 ...
## $ LoyalCH : num -0.211 0.116 0.378 -0.539 1.284 ...
## $ SalePriceMM : num 0.13 -1.037 0.519 -1.037 -1.037 ...
## $ SalePriceCH : num -0.432 -0.432 -0.843 -0.843 -0.843 ...
## $ PriceDiff : num 0.352 -0.744 0.936 -0.525 -0.525 ...
## $ Store7.No : num 1 1 1 1 0 0 0 0 0 0 ...
## $ Store7.Yes : num 0 0 0 0 1 1 1 1 1 1 ...
## $ PctDiscMM : num -0.587 0.861 -0.587 -0.587 -0.587 ...
## $ PctDiscCH : num -0.44 -0.44 1 -0.44 -0.44 ...
## $ ListPriceDiff : num 0.21 0.21 0.118 -2.012 -2.012 ...
## $ STORE : num -0.412 -0.412 -0.412 -0.412 -1.111 ...
In the above case, we had one categorical variable, Store7, with 2 categories. It was one-hot encoded to produce two new columns: Store7.No and Store7.Yes.
The following transformations are available in caret's preProcess():
- range: normalize values so they range between 0 and 1
- center: subtract the mean
- scale: divide by the standard deviation
- BoxCox: remove skewness, leading to normality (values must be > 0)
- YeoJohnson: like BoxCox, but works for negative values
- expoTrans: exponential transformation, works for negative values
- pca: replace with principal components
- ica: replace with independent components
- spatialSign: project the data onto a unit circle
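Several of these methods can be combined in a single preProcess() call. A minimal sketch (not used in this tutorial; the object name is illustrative):
# Hypothetical: center and scale the predictors, then replace them with principal components
preProcess_combo_model <- preProcess(trainData, method = c('center', 'scale', 'pca'))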
For our problem, let's convert all the numeric variables to range between 0 and 1 by setting method='range' in preProcess().
preProcess_range_model <- preProcess(trainData, method='range')
trainData <- predict(preProcess_range_model, newdata = trainData)
Let's append the Y variable. Remember, we created the two variables x and y above.
trainData$Purchase <- y
str(trainData)
## 'data.frame': 857 obs. of 19 variables:
## $ WeekofPurchase: num 0.1961 0.2353 0.3529 0 0.0196 ...
## $ StoreID : num 0 0 0 0 1 1 1 1 1 1 ...
## $ PriceCH : num 0.15 0.15 0.425 0 0 ...
## $ PriceMM : num 0.5 0.5 0.667 0 0 ...
## $ DiscCH : num 0 0 0.34 0 0 0 0 0 0 0.54 ...
## $ DiscMM : num 0 0.375 0 0 0 0 0.5 0.5 0.5 0 ...
## $ SpecialCH : num 0 0 0 0 0 0 1 0 0 0 ...
## $ SpecialMM : num 0 1 0 0 0 1 0 0 0 0 ...
## $ LoyalCH : num 0.5 0.6 0.68 0.4 0.957 ...
## $ SalePriceMM : num 0.727 0.455 0.818 0.455 0.455 ...
## $ SalePriceCH : num 0.514 0.514 0.429 0.429 0.429 ...
## $ PriceDiff : num 0.695 0.466 0.817 0.511 0.511 ...
## $ Store7.No : num 1 1 1 1 0 0 0 0 0 0 ...
## $ Store7.Yes : num 0 0 0 0 1 1 1 1 1 1 ...
## $ PctDiscMM : num 0 0.375 0 0 0 ...
## $ PctDiscCH : num 0 0 0.362 0 0 ...
## $ ListPriceDiff : num 0.545 0.545 0.523 0 0 ...
## $ STORE : num 0.25 0.25 0.25 0.25 0 0 0 0 0 0 ...
## $ Purchase : Factor w/ 2 levels "CH","MM": 1 1 1 2 1 1 1 1 1 1 ...
apply(trainData[, 1:10], 2, FUN=function(x){c('min'=min(x), 'max'=max(x))}) # check the first 10 predictors; a check over all 18 predictor columns is sketched below
## WeekofPurchase StoreID PriceCH PriceMM DiscCH DiscMM SpecialCH
## min 0 0 0 0 0 0 0
## max 1 1 1 1 1 1 1
## SpecialMM LoyalCH SalePriceMM
## min 0 0 0
## max 1 1 1
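The same check can be extended to all 18 predictor columns (a quick sketch; this output is not part of the original document):
# Columns 1:18 are the numeric predictors; column 19 is the factor Purchase
apply(trainData[, 1:18], 2, FUN=function(x){c('min'=min(x), 'max'=max(x))})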
A simple common-sense approach: if you group an X variable by the categories of Y, a significant mean shift among the groups is a strong indicator (if not the only indicator) that X will have a significant role in predicting Y.
You can see this shift visually using box plots and density plots.
In fact, caret's featurePlot() function makes this very convenient.
Simply set the x and y parameters and set plot='box'. You can additionally adjust the label font size (using strip) and let the scales be free, as in the plot below.
featurePlot(x = trainData[, 1:18],
y = trainData$Purchase,
plot = "box",
strip=strip.custom(par.strip.text=list(cex=.7)),
scales = list(x = list(relation="free"),
y = list(relation="free")))
Consider, for example, the LoyalCH subplot, which measures the customer's loyalty score for the CH brand. The means and the placement of the two boxes are glaringly different.
Just by seeing that, I am pretty sure LoyalCH is going to be a significant predictor of Y.
Let’s do a similar exercise with density plots.
In this case, for a variable to be important, I would expect the density curves to be significantly different for the 2 classes, both in terms of their height and their placement.
featurePlot(x = trainData[, 1:18],
y = trainData$Purchase,
plot = "density",
strip=strip.custom(par.strip.text=list(cex=.7)),
scales = list(x = list(relation="free"),
y = list(relation="free")))
Take a look at the density curves of the two categories for ‘LoyalCH’, ‘STORE’, ‘StoreID’, ‘WeekofPurchase’. Are they different?
Having visualised the relationships between X and Y, we can only say which variables are likely to be important for predicting Y. It may not be wise to conclude which variables are NOT important.
Sometimes variables with an uninteresting pattern can help explain certain aspects of Y that the visually important variables may not.
So to be safe, let’s not arrive at conclusions about excluding variables prematurely.
A good choice of selecting the important features is the recursive feature elimination (RFE).
So how does recursive feature elimination work?
RFE works in 3 broad steps:
Step 1: Build a ML model on a training dataset and estimate the feature importances on the test dataset.
Step 2: Giving priority to the most important variables, iterate by building models on subsets of the given sizes, that is, subgroups of the most important predictors determined in step 1. The ranking of the predictors is recalculated in each iteration.
Step 3: The model performances are compared across different subset sizes to arrive at the optimal number and list of final predictors.
It can be implemented using the rfe() function and you have the flexibility to control what algorithm rfe uses and how it cross validates by defining the rfeControl().
set.seed(100)
options(warn=-1)
subsets <- c(1:5, 10, 15, 18)
ctrl <- rfeControl(functions = rfFuncs,
method = "repeatedcv",
repeats = 5,
verbose = FALSE)
lmProfile <- rfe(x=trainData[, 1:18], y=trainData$Purchase,
sizes = subsets,
rfeControl = ctrl)
lmProfile
##
## Recursive feature selection
##
## Outer resampling method: Cross-Validated (10 fold, repeated 5 times)
##
## Resampling performance over subset size:
##
## Variables Accuracy Kappa AccuracySD KappaSD Selected
## 1 0.7815 0.5340 0.03321 0.07347
## 2 0.8210 0.6219 0.04125 0.08631
## 3 0.8243 0.6296 0.04133 0.08696 *
## 4 0.8112 0.6036 0.04315 0.09059
## 5 0.8119 0.6048 0.04300 0.09022
## 10 0.8131 0.6050 0.04039 0.08581
## 15 0.8133 0.6048 0.03993 0.08518
## 18 0.8115 0.6017 0.04319 0.09164
##
## The top 3 variables (out of 3):
## LoyalCH, PriceDiff, StoreID
Apart from the x and y datasets, RFE also takes two important parameters.
sizes and rfeControl. The sizes argument determines the model sizes (the numbers of most important features) that rfe should consider; in the above case, it iterates over models of sizes 1 to 5, 10, 15 and 18. rfeControl, defined above via rfeControl(), controls which algorithm is used to estimate the importances and how the cross validation is done.
From the above output, a model size of 3 with LoyalCH, PriceDiff and StoreID seems to achieve the optimal accuracy.
That means that, out of the 18 features, a model with just 3 features outperformed many larger models. Interesting, isn't it? Can you explain why?
However, it is not a mandate that only including these 3 variables will always give high accuracy over larger sized models.
That's because the rfe() we just ran is specific to the random-forest-based rfFuncs.
Since ML algorithms have their own ways of learning the relationship between x and y, it is not wise to neglect the other predictors, especially when there is evidence that the rest of the variables contain information that helps explain the relationship between x and y.
Also, since the training dataset isn't very large, the other predictors may not have had the chance to show their worth.
In the next step, we will train the actual model on trainData.
Let’s train a Multivariate Adaptive Regression Splines (MARS) model by setting the method=‘earth’.
The MARS algorithm was named as ‘earth’ in R because of a possible trademark conflict with Salford Systems. May be a rumor. Or not.
modelLookup('earth')
## model parameter label forReg forClass probModel
## 1 earth nprune #Terms TRUE TRUE TRUE
## 2 earth degree Product Degree TRUE TRUE TRUE
Set the seed for reproducibility
set.seed(100)
Train the model using earth and predict on the training data itself
model_mars = train(Purchase ~ ., data=trainData, method='earth')
fitted <- predict(model_mars)
But you may ask: how is using train() different from calling the algorithm's function directly?
The difference is that, besides building the model, train() does several other things, such as:
- Cross validating the model
- Tuning the hyperparameters for optimal model performance
- Choosing the optimal model based on a given evaluation metric
- Preprocessing the predictors (what we did so far using preProcess())
The train() function also accepts the arguments used by the algorithm specified in the method argument.
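As a minimal sketch (not run in this document), train() can even take over the preprocessing step itself through its preProcess argument:
# Hypothetical call: train() centers and scales the predictors internally before fitting the MARS model
model_sketch = train(Purchase ~ ., data=trainData, method='earth', preProcess=c('center', 'scale'))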
Now let’s see what the train() has generated.
model_mars
## Multivariate Adaptive Regression Spline
##
## 857 samples
## 18 predictor
## 2 classes: 'CH', 'MM'
##
## No pre-processing
## Resampling: Bootstrapped (25 reps)
## Summary of sample sizes: 857, 857, 857, 857, 857, 857, ...
## Resampling results across tuning parameters:
##
## nprune Accuracy Kappa
## 2 0.8116999 0.5969106
## 9 0.8234148 0.6245781
## 17 0.8105738 0.5975440
##
## Tuning parameter 'degree' was held constant at a value of 1
## Accuracy was used to select the optimal model using the largest value.
## The final values used for the model were nprune = 9 and degree = 1.
You can see the Accuracy and Kappa for the various values of the hyperparameters, nprune and degree. And it says 'Resampling: Bootstrapped (25 reps)' with a summary of sample sizes.
Looks like train() has already done a basic cross validation and hyper parameter tuning. And that is the default behaviour.
The chosen model and its parameters are reported in the last 2 lines of the output.
When we used model_mars to predict the Y, this final model was automatically used by predict() to compute the predictions.
Plotting the model shows how the various iterations of hyperparameter search performed.
plot(model_mars, main="Model Accuracies with MARS")
Excellent, since MARS supports computing variable importances, let’s extract the variable importances using varImp() to understand which variables came out to be useful.
varimp_mars <- varImp(model_mars)
plot(varimp_mars, main="Variable Importance with MARS")
As suspected, LoyalCH was the most used variable, followed by PriceDiff and StoreID.
A default MARS model has been selected.
Now in order to use the model to predict on new data, the data has to be preprocessed and transformed just the way we did on the training data.
Thanks to caret, all the information required for pre-processing is stored in the respective preProcess models and the dummyVars model.
If you recall, we did the pre-processing in the following sequence:
Missing Value imputation –> One-Hot Encoding –> Range Normalization
You need to pass the testData through these models in the same sequence:
preProcess_missingdata_model –> dummies_model –> preProcess_range_model
# Step 1: Impute missing values
testData2 <- predict(preProcess_missingdata_model, testData)
# Step 2: Create one-hot encodings (dummy variables)
testData3 <- predict(dummies_model, testData2)
# Step 3: Transform the features to range between 0 and 1
testData4 <- predict(preProcess_range_model, testData3)
# View
head(testData4[, 1:10])
## WeekofPurchase StoreID PriceCH PriceMM DiscCH DiscMM SpecialCH
## 7 0.09803922 1.0000000 0.000 0.5000000 0 0.5 1
## 11 0.25490196 1.0000000 0.425 0.6666667 0 0.0 0
## 18 0.80392157 0.1666667 0.425 0.8166667 0 0.0 0
## 21 0.58823529 1.0000000 0.425 0.8166667 0 0.0 0
## 33 0.94117647 0.1666667 0.675 0.8166667 0 1.0 0
## 35 0.47058824 0.3333333 0.750 0.9000000 0 0.0 0
## SpecialMM LoyalCH SalePriceMM
## 7 1 0.9722332 0.3636364
## 11 0 0.9886583 0.8181818
## 18 1 0.4000146 0.9000000
## 21 0 0.6000274 0.9000000
## 33 1 0.6800325 0.1727273
## 35 0 0.5440238 0.9454545
# Predict on testData
predicted <- predict(model_mars, testData4)
head(predicted)
## [1] CH CH CH CH MM CH
## Levels: CH MM
The confusion matrix is a tabular representation comparing the predictions (data) against the actuals (reference). By setting mode='everything', most of the common classification evaluation metrics are computed.
# Compute the confusion matrix
confusionMatrix(reference = testData$Purchase, data = predicted, mode='everything', positive='MM')
## Confusion Matrix and Statistics
##
## Reference
## Prediction CH MM
## CH 114 26
## MM 16 57
##
## Accuracy : 0.8028
## 95% CI : (0.743, 0.854)
## No Information Rate : 0.6103
## P-Value [Acc > NIR] : 1.281e-09
##
## Kappa : 0.5762
##
## Mcnemar's Test P-Value : 0.1649
##
## Sensitivity : 0.6867
## Specificity : 0.8769
## Pos Pred Value : 0.7808
## Neg Pred Value : 0.8143
## Precision : 0.7808
## Recall : 0.6867
## F1 : 0.7308
## Prevalence : 0.3897
## Detection Rate : 0.2676
## Detection Prevalence : 0.3427
## Balanced Accuracy : 0.7818
##
## 'Positive' Class : MM
##
You have an overall accuracy of 80.28%.
There are two main ways to do hyperparameter tuning using train():
1. Set tuneLength: it defines the number of unique values per tuning parameter that caret will consider while forming the hyperparameter combinations; caret automatically determines which values each parameter takes.
2. Define tuneGrid: alternately, if you want to explicitly control which values are considered for each parameter, you can define a tuneGrid and pass it to train().
Let's see an example of both these approaches, but first let's set up the trainControl().
The train() function takes a trControl argument that accepts the output of trainControl().
Inside trainControl() you can specify:
- The cross validation method to use
- How the results should be summarised, via a summary function
The cross validation method can be one of:
- 'boot': bootstrap sampling
- 'boot632': bootstrap sampling with 63.2% bias correction applied
- 'optimism_boot': the optimism bootstrap estimator
- 'boot_all': all boot methods
- 'cv': k-fold cross validation
- 'repeatedcv': repeated k-fold cross validation
- 'oob': out-of-bag cross validation
- 'LOOCV': leave-one-out cross validation
- 'LGOCV': leave-group-out cross validation
The summaryFunction can be twoClassSummary if Y is binary, or multiClassSummary if Y has more than 2 categories.
By setting classProbs=TRUE, probability scores are generated instead of directly predicting the class based on a predetermined cutoff of 0.5.
# Define the training control
fitControl <- trainControl(
method = 'cv', # k-fold cross validation
number = 5, # number of folds
savePredictions = 'final', # saves predictions for optimal tuning parameter
classProbs = T, # should class probabilities be returned
summaryFunction=twoClassSummary # results summary function
)
Let’s take the train() function we used before, plus, additionally set the tuneLength, trControl and metric.
# Step 1: Tune hyper parameters by setting tuneLength
set.seed(100)
model_mars2 = train(Purchase ~ ., data=trainData, method='earth', tuneLength = 5, metric='ROC', trControl = fitControl)
model_mars2
## Multivariate Adaptive Regression Spline
##
## 857 samples
## 18 predictor
## 2 classes: 'CH', 'MM'
##
## No pre-processing
## Resampling: Cross-Validated (5 fold)
## Summary of sample sizes: 685, 686, 685, 686, 686
## Resampling results across tuning parameters:
##
## nprune ROC Sens Spec
## 2 0.8837092 0.8757143 0.7094075
## 5 0.9025000 0.8795421 0.7513795
## 9 0.8929800 0.8719048 0.7423338
## 13 0.8930665 0.8719048 0.7393035
## 17 0.8930665 0.8719048 0.7393035
##
## Tuning parameter 'degree' was held constant at a value of 1
## ROC was used to select the optimal model using the largest value.
## The final values used for the model were nprune = 5 and degree = 1.
# Step 2: Predict on testData and Compute the confusion matrix
predicted2 <- predict(model_mars2, testData4)
confusionMatrix(reference = testData$Purchase, data = predicted2, mode='everything', positive='MM')
## Confusion Matrix and Statistics
##
## Reference
## Prediction CH MM
## CH 113 27
## MM 17 56
##
## Accuracy : 0.7934
## 95% CI : (0.7328, 0.8457)
## No Information Rate : 0.6103
## P-Value [Acc > NIR] : 8.319e-09
##
## Kappa : 0.556
##
## Mcnemar's Test P-Value : 0.1748
##
## Sensitivity : 0.6747
## Specificity : 0.8692
## Pos Pred Value : 0.7671
## Neg Pred Value : 0.8071
## Precision : 0.7671
## Recall : 0.6747
## F1 : 0.7179
## Prevalence : 0.3897
## Detection Rate : 0.2629
## Detection Prevalence : 0.3427
## Balanced Accuracy : 0.7720
##
## 'Positive' Class : MM
##
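The fitted model can also return class probabilities instead of hard class labels. A small sketch (this output is not shown in the original document):
# Predicted probability of each class for the test data
predicted2_prob <- predict(model_mars2, testData4, type='prob')
head(predicted2_prob)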
# Step 1: Define the tuneGrid
marsGrid <- expand.grid(nprune = c(2, 4, 6, 8, 10),
degree = c(1, 2, 3))
# Step 2: Tune hyper parameters by setting tuneGrid
set.seed(100)
model_mars3 = train(Purchase ~ ., data=trainData, method='earth', metric='ROC', tuneGrid = marsGrid, trControl = fitControl)
model_mars3
## Multivariate Adaptive Regression Spline
##
## 857 samples
## 18 predictor
## 2 classes: 'CH', 'MM'
##
## No pre-processing
## Resampling: Cross-Validated (5 fold)
## Summary of sample sizes: 685, 686, 685, 686, 686
## Resampling results across tuning parameters:
##
## degree nprune ROC Sens Spec
## 1 2 0.8837092 0.8757143 0.7094075
## 1 4 0.9011512 0.8718315 0.7633198
## 1 6 0.9022966 0.8795421 0.7424242
## 1 8 0.8986413 0.8757143 0.7483492
## 1 10 0.8938458 0.8719048 0.7453641
## 2 2 0.8212388 0.8663553 0.6260968
## 2 4 0.9028221 0.8776374 0.7693351
## 2 6 0.9001307 0.8565201 0.7782451
## 2 8 0.8942283 0.8546520 0.7812754
## 2 10 0.8941091 0.8546337 0.7753053
## 3 2 0.8773872 0.8297802 0.7604251
## 3 4 0.9034469 0.8776007 0.7753053
## 3 6 0.9026720 0.8660806 0.7631389
## 3 8 0.8997006 0.8603846 0.7601085
## 3 10 0.8983994 0.8584982 0.7780642
##
## ROC was used to select the optimal model using the largest value.
## The final values used for the model were nprune = 4 and degree = 3.
# Step 3: Predict on testData and Compute the confusion matrix
predicted3 <- predict(model_mars3, testData4)
confusionMatrix(reference = testData$Purchase, data = predicted3, mode='everything', positive='MM')
## Confusion Matrix and Statistics
##
## Reference
## Prediction CH MM
## CH 113 27
## MM 17 56
##
## Accuracy : 0.7934
## 95% CI : (0.7328, 0.8457)
## No Information Rate : 0.6103
## P-Value [Acc > NIR] : 8.319e-09
##
## Kappa : 0.556
##
## Mcnemar's Test P-Value : 0.1748
##
## Sensitivity : 0.6747
## Specificity : 0.8692
## Pos Pred Value : 0.7671
## Neg Pred Value : 0.8071
## Precision : 0.7671
## Recall : 0.6747
## F1 : 0.7179
## Prevalence : 0.3897
## Detection Rate : 0.2629
## Detection Prevalence : 0.3427
## Balanced Accuracy : 0.7720
##
## 'Positive' Class : MM
##
Caret provides the resamples() function, to which you can pass multiple trained machine learning models and evaluate them collectively.
Let's first train some more algorithms; a sketch of the resamples() comparison appears at the end of this section.
set.seed(100)
# Train the model using adaboost
model_adaboost = train(Purchase ~ ., data=trainData, method='adaboost', tuneLength=2, trControl = fitControl)
model_adaboost
## AdaBoost Classification Trees
##
## 857 samples
## 18 predictor
## 2 classes: 'CH', 'MM'
##
## No pre-processing
## Resampling: Cross-Validated (5 fold)
## Summary of sample sizes: 685, 686, 685, 686, 686
## Resampling results across tuning parameters:
##
## nIter method ROC Sens Spec
## 50 Adaboost.M1 0.8783495 0.8298535 0.7543193
## 50 Real adaboost 0.6881750 0.8412454 0.7363636
## 100 Adaboost.M1 0.8768766 0.8317399 0.7602895
## 100 Real adaboost 0.6831206 0.8393407 0.7393487
##
## ROC was used to select the optimal model using the largest value.
## The final values used for the model were nIter = 50 and method
## = Adaboost.M1.
set.seed(100)
# Train the model using rf
model_rf = train(Purchase ~ ., data=trainData, method='rf', tuneLength=5, trControl = fitControl)
model_rf
## Random Forest
##
## 857 samples
## 18 predictor
## 2 classes: 'CH', 'MM'
##
## No pre-processing
## Resampling: Cross-Validated (5 fold)
## Summary of sample sizes: 685, 686, 685, 686, 686
## Resampling results across tuning parameters:
##
## mtry ROC Sens Spec
## 2 0.8711563 0.8660989 0.6615106
## 6 0.8871323 0.8565751 0.7333333
## 10 0.8867648 0.8527656 0.7573496
## 14 0.8862704 0.8565751 0.7602895
## 18 0.8850728 0.8508608 0.7723202
##
## ROC was used to select the optimal model using the largest value.
## The final value used for the model was mtry = 6.
set.seed(100)
# Train the model using xgbDART
model_xgbDART = train(Purchase ~ ., data=trainData, method='xgbDART', tuneLength=5, trControl = fitControl, verbose=F)
model_xgbDART
## eXtreme Gradient Boosting
##
## 857 samples
## 18 predictor
## 2 classes: 'CH', 'MM'
##
## No pre-processing
## Resampling: Cross-Validated (5 fold)
## Summary of sample sizes: 685, 686, 685, 686, 686
## Resampling results across tuning parameters:
##
## max_depth eta rate_drop skip_drop subsample colsample_bytree
## 1 0.3 0.01 0.05 0.500 0.6
## 1 0.3 0.01 0.05 0.500 0.8
## ... (output truncated: the full tuning grid spans several hundred combinations of max_depth, eta, rate_drop, skip_drop, subsample and colsample_bytree; the remaining rows and the resampling results are omitted here)
## 3 0.4 0.01 0.95 0.875 0.8
## 3 0.4 0.01 0.95 0.875 0.8
## 3 0.4 0.01 0.95 0.875 0.8
## 3 0.4 0.01 0.95 1.000 0.6
## 3 0.4 0.01 0.95 1.000 0.6
## 3 0.4 0.01 0.95 1.000 0.6
## 3 0.4 0.01 0.95 1.000 0.6
## 3 0.4 0.01 0.95 1.000 0.6
## 3 0.4 0.01 0.95 1.000 0.8
## 3 0.4 0.01 0.95 1.000 0.8
## 3 0.4 0.01 0.95 1.000 0.8
## 3 0.4 0.01 0.95 1.000 0.8
## 3 0.4 0.01 0.95 1.000 0.8
## 3 0.4 0.50 0.05 0.500 0.6
## 3 0.4 0.50 0.05 0.500 0.6
## 3 0.4 0.50 0.05 0.500 0.6
## 3 0.4 0.50 0.05 0.500 0.6
## 3 0.4 0.50 0.05 0.500 0.6
## 3 0.4 0.50 0.05 0.500 0.8
## 3 0.4 0.50 0.05 0.500 0.8
## 3 0.4 0.50 0.05 0.500 0.8
## 3 0.4 0.50 0.05 0.500 0.8
## 3 0.4 0.50 0.05 0.500 0.8
## 3 0.4 0.50 0.05 0.625 0.6
## 3 0.4 0.50 0.05 0.625 0.6
## 3 0.4 0.50 0.05 0.625 0.6
## 3 0.4 0.50 0.05 0.625 0.6
## 3 0.4 0.50 0.05 0.625 0.6
## 3 0.4 0.50 0.05 0.625 0.8
## 3 0.4 0.50 0.05 0.625 0.8
## 3 0.4 0.50 0.05 0.625 0.8
## 3 0.4 0.50 0.05 0.625 0.8
## 3 0.4 0.50 0.05 0.625 0.8
## 3 0.4 0.50 0.05 0.750 0.6
## 3 0.4 0.50 0.05 0.750 0.6
## 3 0.4 0.50 0.05 0.750 0.6
## 3 0.4 0.50 0.05 0.750 0.6
## 3 0.4 0.50 0.05 0.750 0.6
## 3 0.4 0.50 0.05 0.750 0.8
## 3 0.4 0.50 0.05 0.750 0.8
## 3 0.4 0.50 0.05 0.750 0.8
## 3 0.4 0.50 0.05 0.750 0.8
## 3 0.4 0.50 0.05 0.750 0.8
## 3 0.4 0.50 0.05 0.875 0.6
## 3 0.4 0.50 0.05 0.875 0.6
## 3 0.4 0.50 0.05 0.875 0.6
## 3 0.4 0.50 0.05 0.875 0.6
## 3 0.4 0.50 0.05 0.875 0.6
## 3 0.4 0.50 0.05 0.875 0.8
## 3 0.4 0.50 0.05 0.875 0.8
## 3 0.4 0.50 0.05 0.875 0.8
## 3 0.4 0.50 0.05 0.875 0.8
## 3 0.4 0.50 0.05 0.875 0.8
## 3 0.4 0.50 0.05 1.000 0.6
## 3 0.4 0.50 0.05 1.000 0.6
## 3 0.4 0.50 0.05 1.000 0.6
## 3 0.4 0.50 0.05 1.000 0.6
## 3 0.4 0.50 0.05 1.000 0.6
## 3 0.4 0.50 0.05 1.000 0.8
## 3 0.4 0.50 0.05 1.000 0.8
## 3 0.4 0.50 0.05 1.000 0.8
## 3 0.4 0.50 0.05 1.000 0.8
## 3 0.4 0.50 0.05 1.000 0.8
## 3 0.4 0.50 0.95 0.500 0.6
## 3 0.4 0.50 0.95 0.500 0.6
## 3 0.4 0.50 0.95 0.500 0.6
## 3 0.4 0.50 0.95 0.500 0.6
## 3 0.4 0.50 0.95 0.500 0.6
## 3 0.4 0.50 0.95 0.500 0.8
## 3 0.4 0.50 0.95 0.500 0.8
## 3 0.4 0.50 0.95 0.500 0.8
## 3 0.4 0.50 0.95 0.500 0.8
## 3 0.4 0.50 0.95 0.500 0.8
## 3 0.4 0.50 0.95 0.625 0.6
## 3 0.4 0.50 0.95 0.625 0.6
## 3 0.4 0.50 0.95 0.625 0.6
## 3 0.4 0.50 0.95 0.625 0.6
## 3 0.4 0.50 0.95 0.625 0.6
## 3 0.4 0.50 0.95 0.625 0.8
## 3 0.4 0.50 0.95 0.625 0.8
## 3 0.4 0.50 0.95 0.625 0.8
## 3 0.4 0.50 0.95 0.625 0.8
## 3 0.4 0.50 0.95 0.625 0.8
## 3 0.4 0.50 0.95 0.750 0.6
## 3 0.4 0.50 0.95 0.750 0.6
## 3 0.4 0.50 0.95 0.750 0.6
## 3 0.4 0.50 0.95 0.750 0.6
## 3 0.4 0.50 0.95 0.750 0.6
## 3 0.4 0.50 0.95 0.750 0.8
## 3 0.4 0.50 0.95 0.750 0.8
## 3 0.4 0.50 0.95 0.750 0.8
## 3 0.4 0.50 0.95 0.750 0.8
## 3 0.4 0.50 0.95 0.750 0.8
## 3 0.4 0.50 0.95 0.875 0.6
## 3 0.4 0.50 0.95 0.875 0.6
## 3 0.4 0.50 0.95 0.875 0.6
## 3 0.4 0.50 0.95 0.875 0.6
## 3 0.4 0.50 0.95 0.875 0.6
## 3 0.4 0.50 0.95 0.875 0.8
## 3 0.4 0.50 0.95 0.875 0.8
## 3 0.4 0.50 0.95 0.875 0.8
## 3 0.4 0.50 0.95 0.875 0.8
## 3 0.4 0.50 0.95 0.875 0.8
## 3 0.4 0.50 0.95 1.000 0.6
## 3 0.4 0.50 0.95 1.000 0.6
## 3 0.4 0.50 0.95 1.000 0.6
## 3 0.4 0.50 0.95 1.000 0.6
## 3 0.4 0.50 0.95 1.000 0.6
## 3 0.4 0.50 0.95 1.000 0.8
## 3 0.4 0.50 0.95 1.000 0.8
## 3 0.4 0.50 0.95 1.000 0.8
## 3 0.4 0.50 0.95 1.000 0.8
## 3 0.4 0.50 0.95 1.000 0.8
## 4 0.3 0.01 0.05 0.500 0.6
## 4 0.3 0.01 0.05 0.500 0.6
## 4 0.3 0.01 0.05 0.500 0.6
## 4 0.3 0.01 0.05 0.500 0.6
## 4 0.3 0.01 0.05 0.500 0.6
## 4 0.3 0.01 0.05 0.500 0.8
## 4 0.3 0.01 0.05 0.500 0.8
## 4 0.3 0.01 0.05 0.500 0.8
## 4 0.3 0.01 0.05 0.500 0.8
## 4 0.3 0.01 0.05 0.500 0.8
## 4 0.3 0.01 0.05 0.625 0.6
## 4 0.3 0.01 0.05 0.625 0.6
## 4 0.3 0.01 0.05 0.625 0.6
## 4 0.3 0.01 0.05 0.625 0.6
## 4 0.3 0.01 0.05 0.625 0.6
## 4 0.3 0.01 0.05 0.625 0.8
## 4 0.3 0.01 0.05 0.625 0.8
## 4 0.3 0.01 0.05 0.625 0.8
## 4 0.3 0.01 0.05 0.625 0.8
## 4 0.3 0.01 0.05 0.625 0.8
## 4 0.3 0.01 0.05 0.750 0.6
## 4 0.3 0.01 0.05 0.750 0.6
## 4 0.3 0.01 0.05 0.750 0.6
## 4 0.3 0.01 0.05 0.750 0.6
## 4 0.3 0.01 0.05 0.750 0.6
## 4 0.3 0.01 0.05 0.750 0.8
## 4 0.3 0.01 0.05 0.750 0.8
## 4 0.3 0.01 0.05 0.750 0.8
## 4 0.3 0.01 0.05 0.750 0.8
## 4 0.3 0.01 0.05 0.750 0.8
## 4 0.3 0.01 0.05 0.875 0.6
## 4 0.3 0.01 0.05 0.875 0.6
## 4 0.3 0.01 0.05 0.875 0.6
## 4 0.3 0.01 0.05 0.875 0.6
## 4 0.3 0.01 0.05 0.875 0.6
## 4 0.3 0.01 0.05 0.875 0.8
## 4 0.3 0.01 0.05 0.875 0.8
## 4 0.3 0.01 0.05 0.875 0.8
## 4 0.3 0.01 0.05 0.875 0.8
## 4 0.3 0.01 0.05 0.875 0.8
## 4 0.3 0.01 0.05 1.000 0.6
## 4 0.3 0.01 0.05 1.000 0.6
## 4 0.3 0.01 0.05 1.000 0.6
## 4 0.3 0.01 0.05 1.000 0.6
## 4 0.3 0.01 0.05 1.000 0.6
## 4 0.3 0.01 0.05 1.000 0.8
## 4 0.3 0.01 0.05 1.000 0.8
## 4 0.3 0.01 0.05 1.000 0.8
## 4 0.3 0.01 0.05 1.000 0.8
## 4 0.3 0.01 0.05 1.000 0.8
## 4 0.3 0.01 0.95 0.500 0.6
## 4 0.3 0.01 0.95 0.500 0.6
## 4 0.3 0.01 0.95 0.500 0.6
## 4 0.3 0.01 0.95 0.500 0.6
## 4 0.3 0.01 0.95 0.500 0.6
## 4 0.3 0.01 0.95 0.500 0.8
## 4 0.3 0.01 0.95 0.500 0.8
## 4 0.3 0.01 0.95 0.500 0.8
## 4 0.3 0.01 0.95 0.500 0.8
## 4 0.3 0.01 0.95 0.500 0.8
## 4 0.3 0.01 0.95 0.625 0.6
## 4 0.3 0.01 0.95 0.625 0.6
## 4 0.3 0.01 0.95 0.625 0.6
## 4 0.3 0.01 0.95 0.625 0.6
## 4 0.3 0.01 0.95 0.625 0.6
## 4 0.3 0.01 0.95 0.625 0.8
## 4 0.3 0.01 0.95 0.625 0.8
## 4 0.3 0.01 0.95 0.625 0.8
## 4 0.3 0.01 0.95 0.625 0.8
## 4 0.3 0.01 0.95 0.625 0.8
## 4 0.3 0.01 0.95 0.750 0.6
## 4 0.3 0.01 0.95 0.750 0.6
## 4 0.3 0.01 0.95 0.750 0.6
## 4 0.3 0.01 0.95 0.750 0.6
## 4 0.3 0.01 0.95 0.750 0.6
## 4 0.3 0.01 0.95 0.750 0.8
## 4 0.3 0.01 0.95 0.750 0.8
## 4 0.3 0.01 0.95 0.750 0.8
## 4 0.3 0.01 0.95 0.750 0.8
## 4 0.3 0.01 0.95 0.750 0.8
## 4 0.3 0.01 0.95 0.875 0.6
## 4 0.3 0.01 0.95 0.875 0.6
## 4 0.3 0.01 0.95 0.875 0.6
## 4 0.3 0.01 0.95 0.875 0.6
## 4 0.3 0.01 0.95 0.875 0.6
## 4 0.3 0.01 0.95 0.875 0.8
## 4 0.3 0.01 0.95 0.875 0.8
## 4 0.3 0.01 0.95 0.875 0.8
## 4 0.3 0.01 0.95 0.875 0.8
## 4 0.3 0.01 0.95 0.875 0.8
## 4 0.3 0.01 0.95 1.000 0.6
## 4 0.3 0.01 0.95 1.000 0.6
## 4 0.3 0.01 0.95 1.000 0.6
## 4 0.3 0.01 0.95 1.000 0.6
## 4 0.3 0.01 0.95 1.000 0.6
## 4 0.3 0.01 0.95 1.000 0.8
## 4 0.3 0.01 0.95 1.000 0.8
## 4 0.3 0.01 0.95 1.000 0.8
## 4 0.3 0.01 0.95 1.000 0.8
## 4 0.3 0.01 0.95 1.000 0.8
## 4 0.3 0.50 0.05 0.500 0.6
## 4 0.3 0.50 0.05 0.500 0.6
## 4 0.3 0.50 0.05 0.500 0.6
## 4 0.3 0.50 0.05 0.500 0.6
## 4 0.3 0.50 0.05 0.500 0.6
## 4 0.3 0.50 0.05 0.500 0.8
## 4 0.3 0.50 0.05 0.500 0.8
## 4 0.3 0.50 0.05 0.500 0.8
## 4 0.3 0.50 0.05 0.500 0.8
## 4 0.3 0.50 0.05 0.500 0.8
## 4 0.3 0.50 0.05 0.625 0.6
## 4 0.3 0.50 0.05 0.625 0.6
## 4 0.3 0.50 0.05 0.625 0.6
## 4 0.3 0.50 0.05 0.625 0.6
## 4 0.3 0.50 0.05 0.625 0.6
## 4 0.3 0.50 0.05 0.625 0.8
## 4 0.3 0.50 0.05 0.625 0.8
## 4 0.3 0.50 0.05 0.625 0.8
## 4 0.3 0.50 0.05 0.625 0.8
## 4 0.3 0.50 0.05 0.625 0.8
## 4 0.3 0.50 0.05 0.750 0.6
## 4 0.3 0.50 0.05 0.750 0.6
## 4 0.3 0.50 0.05 0.750 0.6
## 4 0.3 0.50 0.05 0.750 0.6
## 4 0.3 0.50 0.05 0.750 0.6
## 4 0.3 0.50 0.05 0.750 0.8
## 4 0.3 0.50 0.05 0.750 0.8
## 4 0.3 0.50 0.05 0.750 0.8
## 4 0.3 0.50 0.05 0.750 0.8
## 4 0.3 0.50 0.05 0.750 0.8
## 4 0.3 0.50 0.05 0.875 0.6
## 4 0.3 0.50 0.05 0.875 0.6
## 4 0.3 0.50 0.05 0.875 0.6
## 4 0.3 0.50 0.05 0.875 0.6
## 4 0.3 0.50 0.05 0.875 0.6
## 4 0.3 0.50 0.05 0.875 0.8
## 4 0.3 0.50 0.05 0.875 0.8
## 4 0.3 0.50 0.05 0.875 0.8
## 4 0.3 0.50 0.05 0.875 0.8
## 4 0.3 0.50 0.05 0.875 0.8
## 4 0.3 0.50 0.05 1.000 0.6
## 4 0.3 0.50 0.05 1.000 0.6
## 4 0.3 0.50 0.05 1.000 0.6
## 4 0.3 0.50 0.05 1.000 0.6
## 4 0.3 0.50 0.05 1.000 0.6
## 4 0.3 0.50 0.05 1.000 0.8
## 4 0.3 0.50 0.05 1.000 0.8
## 4 0.3 0.50 0.05 1.000 0.8
## 4 0.3 0.50 0.05 1.000 0.8
## 4 0.3 0.50 0.05 1.000 0.8
## 4 0.3 0.50 0.95 0.500 0.6
## 4 0.3 0.50 0.95 0.500 0.6
## 4 0.3 0.50 0.95 0.500 0.6
## 4 0.3 0.50 0.95 0.500 0.6
## 4 0.3 0.50 0.95 0.500 0.6
## 4 0.3 0.50 0.95 0.500 0.8
## 4 0.3 0.50 0.95 0.500 0.8
## 4 0.3 0.50 0.95 0.500 0.8
## 4 0.3 0.50 0.95 0.500 0.8
## 4 0.3 0.50 0.95 0.500 0.8
## 4 0.3 0.50 0.95 0.625 0.6
## 4 0.3 0.50 0.95 0.625 0.6
## 4 0.3 0.50 0.95 0.625 0.6
## 4 0.3 0.50 0.95 0.625 0.6
## 4 0.3 0.50 0.95 0.625 0.6
## 4 0.3 0.50 0.95 0.625 0.8
## 4 0.3 0.50 0.95 0.625 0.8
## 4 0.3 0.50 0.95 0.625 0.8
## 4 0.3 0.50 0.95 0.625 0.8
## 4 0.3 0.50 0.95 0.625 0.8
## 4 0.3 0.50 0.95 0.750 0.6
## 4 0.3 0.50 0.95 0.750 0.6
## 4 0.3 0.50 0.95 0.750 0.6
## 4 0.3 0.50 0.95 0.750 0.6
## 4 0.3 0.50 0.95 0.750 0.6
## 4 0.3 0.50 0.95 0.750 0.8
## 4 0.3 0.50 0.95 0.750 0.8
## 4 0.3 0.50 0.95 0.750 0.8
## 4 0.3 0.50 0.95 0.750 0.8
## 4 0.3 0.50 0.95 0.750 0.8
## 4 0.3 0.50 0.95 0.875 0.6
## 4 0.3 0.50 0.95 0.875 0.6
## 4 0.3 0.50 0.95 0.875 0.6
## 4 0.3 0.50 0.95 0.875 0.6
## 4 0.3 0.50 0.95 0.875 0.6
## 4 0.3 0.50 0.95 0.875 0.8
## 4 0.3 0.50 0.95 0.875 0.8
## 4 0.3 0.50 0.95 0.875 0.8
## 4 0.3 0.50 0.95 0.875 0.8
## 4 0.3 0.50 0.95 0.875 0.8
## 4 0.3 0.50 0.95 1.000 0.6
## 4 0.3 0.50 0.95 1.000 0.6
## 4 0.3 0.50 0.95 1.000 0.6
## 4 0.3 0.50 0.95 1.000 0.6
## 4 0.3 0.50 0.95 1.000 0.6
## 4 0.3 0.50 0.95 1.000 0.8
## 4 0.3 0.50 0.95 1.000 0.8
## 4 0.3 0.50 0.95 1.000 0.8
## 4 0.3 0.50 0.95 1.000 0.8
## 4 0.3 0.50 0.95 1.000 0.8
## 4 0.4 0.01 0.05 0.500 0.6
## 4 0.4 0.01 0.05 0.500 0.6
## 4 0.4 0.01 0.05 0.500 0.6
## 4 0.4 0.01 0.05 0.500 0.6
## 4 0.4 0.01 0.05 0.500 0.6
## 4 0.4 0.01 0.05 0.500 0.8
## 4 0.4 0.01 0.05 0.500 0.8
## 4 0.4 0.01 0.05 0.500 0.8
## 4 0.4 0.01 0.05 0.500 0.8
## 4 0.4 0.01 0.05 0.500 0.8
## 4 0.4 0.01 0.05 0.625 0.6
## 4 0.4 0.01 0.05 0.625 0.6
## 4 0.4 0.01 0.05 0.625 0.6
## 4 0.4 0.01 0.05 0.625 0.6
## 4 0.4 0.01 0.05 0.625 0.6
## 4 0.4 0.01 0.05 0.625 0.8
## 4 0.4 0.01 0.05 0.625 0.8
## 4 0.4 0.01 0.05 0.625 0.8
## 4 0.4 0.01 0.05 0.625 0.8
## 4 0.4 0.01 0.05 0.625 0.8
## 4 0.4 0.01 0.05 0.750 0.6
## 4 0.4 0.01 0.05 0.750 0.6
## 4 0.4 0.01 0.05 0.750 0.6
## 4 0.4 0.01 0.05 0.750 0.6
## 4 0.4 0.01 0.05 0.750 0.6
## 4 0.4 0.01 0.05 0.750 0.8
## 4 0.4 0.01 0.05 0.750 0.8
## 4 0.4 0.01 0.05 0.750 0.8
## 4 0.4 0.01 0.05 0.750 0.8
## 4 0.4 0.01 0.05 0.750 0.8
## 4 0.4 0.01 0.05 0.875 0.6
## 4 0.4 0.01 0.05 0.875 0.6
## 4 0.4 0.01 0.05 0.875 0.6
## 4 0.4 0.01 0.05 0.875 0.6
## 4 0.4 0.01 0.05 0.875 0.6
## 4 0.4 0.01 0.05 0.875 0.8
## 4 0.4 0.01 0.05 0.875 0.8
## 4 0.4 0.01 0.05 0.875 0.8
## 4 0.4 0.01 0.05 0.875 0.8
## 4 0.4 0.01 0.05 0.875 0.8
## 4 0.4 0.01 0.05 1.000 0.6
## 4 0.4 0.01 0.05 1.000 0.6
## 4 0.4 0.01 0.05 1.000 0.6
## 4 0.4 0.01 0.05 1.000 0.6
## 4 0.4 0.01 0.05 1.000 0.6
## 4 0.4 0.01 0.05 1.000 0.8
## 4 0.4 0.01 0.05 1.000 0.8
## 4 0.4 0.01 0.05 1.000 0.8
## 4 0.4 0.01 0.05 1.000 0.8
## 4 0.4 0.01 0.05 1.000 0.8
## 4 0.4 0.01 0.95 0.500 0.6
## 4 0.4 0.01 0.95 0.500 0.6
## 4 0.4 0.01 0.95 0.500 0.6
## 4 0.4 0.01 0.95 0.500 0.6
## 4 0.4 0.01 0.95 0.500 0.6
## 4 0.4 0.01 0.95 0.500 0.8
## 4 0.4 0.01 0.95 0.500 0.8
## 4 0.4 0.01 0.95 0.500 0.8
## 4 0.4 0.01 0.95 0.500 0.8
## 4 0.4 0.01 0.95 0.500 0.8
## 4 0.4 0.01 0.95 0.625 0.6
## 4 0.4 0.01 0.95 0.625 0.6
## 4 0.4 0.01 0.95 0.625 0.6
## 4 0.4 0.01 0.95 0.625 0.6
## 4 0.4 0.01 0.95 0.625 0.6
## 4 0.4 0.01 0.95 0.625 0.8
## 4 0.4 0.01 0.95 0.625 0.8
## 4 0.4 0.01 0.95 0.625 0.8
## 4 0.4 0.01 0.95 0.625 0.8
## 4 0.4 0.01 0.95 0.625 0.8
## 4 0.4 0.01 0.95 0.750 0.6
## 4 0.4 0.01 0.95 0.750 0.6
## 4 0.4 0.01 0.95 0.750 0.6
## 4 0.4 0.01 0.95 0.750 0.6
## 4 0.4 0.01 0.95 0.750 0.6
## 4 0.4 0.01 0.95 0.750 0.8
## 4 0.4 0.01 0.95 0.750 0.8
## 4 0.4 0.01 0.95 0.750 0.8
## 4 0.4 0.01 0.95 0.750 0.8
## 4 0.4 0.01 0.95 0.750 0.8
## 4 0.4 0.01 0.95 0.875 0.6
## 4 0.4 0.01 0.95 0.875 0.6
## 4 0.4 0.01 0.95 0.875 0.6
## 4 0.4 0.01 0.95 0.875 0.6
## 4 0.4 0.01 0.95 0.875 0.6
## 4 0.4 0.01 0.95 0.875 0.8
## 4 0.4 0.01 0.95 0.875 0.8
## 4 0.4 0.01 0.95 0.875 0.8
## 4 0.4 0.01 0.95 0.875 0.8
## 4 0.4 0.01 0.95 0.875 0.8
## 4 0.4 0.01 0.95 1.000 0.6
## 4 0.4 0.01 0.95 1.000 0.6
## 4 0.4 0.01 0.95 1.000 0.6
## 4 0.4 0.01 0.95 1.000 0.6
## 4 0.4 0.01 0.95 1.000 0.6
## 4 0.4 0.01 0.95 1.000 0.8
## 4 0.4 0.01 0.95 1.000 0.8
## 4 0.4 0.01 0.95 1.000 0.8
## 4 0.4 0.01 0.95 1.000 0.8
## 4 0.4 0.01 0.95 1.000 0.8
## 4 0.4 0.50 0.05 0.500 0.6
## 4 0.4 0.50 0.05 0.500 0.6
## 4 0.4 0.50 0.05 0.500 0.6
## 4 0.4 0.50 0.05 0.500 0.6
## 4 0.4 0.50 0.05 0.500 0.6
## 4 0.4 0.50 0.05 0.500 0.8
## 4 0.4 0.50 0.05 0.500 0.8
## 4 0.4 0.50 0.05 0.500 0.8
## 4 0.4 0.50 0.05 0.500 0.8
## 4 0.4 0.50 0.05 0.500 0.8
## 4 0.4 0.50 0.05 0.625 0.6
## 4 0.4 0.50 0.05 0.625 0.6
## 4 0.4 0.50 0.05 0.625 0.6
## 4 0.4 0.50 0.05 0.625 0.6
## 4 0.4 0.50 0.05 0.625 0.6
## 4 0.4 0.50 0.05 0.625 0.8
## 4 0.4 0.50 0.05 0.625 0.8
## 4 0.4 0.50 0.05 0.625 0.8
## 4 0.4 0.50 0.05 0.625 0.8
## 4 0.4 0.50 0.05 0.625 0.8
## 4 0.4 0.50 0.05 0.750 0.6
## 4 0.4 0.50 0.05 0.750 0.6
## 4 0.4 0.50 0.05 0.750 0.6
## 4 0.4 0.50 0.05 0.750 0.6
## 4 0.4 0.50 0.05 0.750 0.6
## 4 0.4 0.50 0.05 0.750 0.8
## 4 0.4 0.50 0.05 0.750 0.8
## 4 0.4 0.50 0.05 0.750 0.8
## 4 0.4 0.50 0.05 0.750 0.8
## 4 0.4 0.50 0.05 0.750 0.8
## 4 0.4 0.50 0.05 0.875 0.6
## 4 0.4 0.50 0.05 0.875 0.6
## 4 0.4 0.50 0.05 0.875 0.6
## 4 0.4 0.50 0.05 0.875 0.6
## 4 0.4 0.50 0.05 0.875 0.6
## 4 0.4 0.50 0.05 0.875 0.8
## 4 0.4 0.50 0.05 0.875 0.8
## 4 0.4 0.50 0.05 0.875 0.8
## 4 0.4 0.50 0.05 0.875 0.8
## 4 0.4 0.50 0.05 0.875 0.8
## 4 0.4 0.50 0.05 1.000 0.6
## 4 0.4 0.50 0.05 1.000 0.6
## 4 0.4 0.50 0.05 1.000 0.6
## 4 0.4 0.50 0.05 1.000 0.6
## 4 0.4 0.50 0.05 1.000 0.6
## 4 0.4 0.50 0.05 1.000 0.8
## 4 0.4 0.50 0.05 1.000 0.8
## 4 0.4 0.50 0.05 1.000 0.8
## 4 0.4 0.50 0.05 1.000 0.8
## 4 0.4 0.50 0.05 1.000 0.8
## 4 0.4 0.50 0.95 0.500 0.6
## 4 0.4 0.50 0.95 0.500 0.6
## 4 0.4 0.50 0.95 0.500 0.6
## 4 0.4 0.50 0.95 0.500 0.6
## 4 0.4 0.50 0.95 0.500 0.6
## 4 0.4 0.50 0.95 0.500 0.8
## 4 0.4 0.50 0.95 0.500 0.8
## 4 0.4 0.50 0.95 0.500 0.8
## 4 0.4 0.50 0.95 0.500 0.8
## 4 0.4 0.50 0.95 0.500 0.8
## 4 0.4 0.50 0.95 0.625 0.6
## 4 0.4 0.50 0.95 0.625 0.6
## 4 0.4 0.50 0.95 0.625 0.6
## 4 0.4 0.50 0.95 0.625 0.6
## 4 0.4 0.50 0.95 0.625 0.6
## 4 0.4 0.50 0.95 0.625 0.8
## 4 0.4 0.50 0.95 0.625 0.8
## 4 0.4 0.50 0.95 0.625 0.8
## 4 0.4 0.50 0.95 0.625 0.8
## 4 0.4 0.50 0.95 0.625 0.8
## 4 0.4 0.50 0.95 0.750 0.6
## 4 0.4 0.50 0.95 0.750 0.6
## 4 0.4 0.50 0.95 0.750 0.6
## 4 0.4 0.50 0.95 0.750 0.6
## 4 0.4 0.50 0.95 0.750 0.6
## 4 0.4 0.50 0.95 0.750 0.8
## 4 0.4 0.50 0.95 0.750 0.8
## 4 0.4 0.50 0.95 0.750 0.8
## 4 0.4 0.50 0.95 0.750 0.8
## 4 0.4 0.50 0.95 0.750 0.8
## 4 0.4 0.50 0.95 0.875 0.6
## 4 0.4 0.50 0.95 0.875 0.6
## 4 0.4 0.50 0.95 0.875 0.6
## 4 0.4 0.50 0.95 0.875 0.6
## 4 0.4 0.50 0.95 0.875 0.6
## 4 0.4 0.50 0.95 0.875 0.8
## 4 0.4 0.50 0.95 0.875 0.8
## 4 0.4 0.50 0.95 0.875 0.8
## 4 0.4 0.50 0.95 0.875 0.8
## 4 0.4 0.50 0.95 0.875 0.8
## 4 0.4 0.50 0.95 1.000 0.6
## 4 0.4 0.50 0.95 1.000 0.6
## 4 0.4 0.50 0.95 1.000 0.6
## 4 0.4 0.50 0.95 1.000 0.6
## 4 0.4 0.50 0.95 1.000 0.6
## 4 0.4 0.50 0.95 1.000 0.8
## 4 0.4 0.50 0.95 1.000 0.8
## 4 0.4 0.50 0.95 1.000 0.8
## 4 0.4 0.50 0.95 1.000 0.8
## 4 0.4 0.50 0.95 1.000 0.8
## 5 0.3 0.01 0.05 0.500 0.6
## 5 0.3 0.01 0.05 0.500 0.6
## 5 0.3 0.01 0.05 0.500 0.6
## 5 0.3 0.01 0.05 0.500 0.6
## 5 0.3 0.01 0.05 0.500 0.6
## 5 0.3 0.01 0.05 0.500 0.8
## 5 0.3 0.01 0.05 0.500 0.8
## 5 0.3 0.01 0.05 0.500 0.8
## 5 0.3 0.01 0.05 0.500 0.8
## 5 0.3 0.01 0.05 0.500 0.8
## 5 0.3 0.01 0.05 0.625 0.6
## 5 0.3 0.01 0.05 0.625 0.6
## 5 0.3 0.01 0.05 0.625 0.6
## 5 0.3 0.01 0.05 0.625 0.6
## 5 0.3 0.01 0.05 0.625 0.6
## 5 0.3 0.01 0.05 0.625 0.8
## 5 0.3 0.01 0.05 0.625 0.8
## 5 0.3 0.01 0.05 0.625 0.8
## 5 0.3 0.01 0.05 0.625 0.8
## 5 0.3 0.01 0.05 0.625 0.8
## 5 0.3 0.01 0.05 0.750 0.6
## 5 0.3 0.01 0.05 0.750 0.6
## 5 0.3 0.01 0.05 0.750 0.6
## 5 0.3 0.01 0.05 0.750 0.6
## 5 0.3 0.01 0.05 0.750 0.6
## 5 0.3 0.01 0.05 0.750 0.8
## 5 0.3 0.01 0.05 0.750 0.8
## 5 0.3 0.01 0.05 0.750 0.8
## 5 0.3 0.01 0.05 0.750 0.8
## 5 0.3 0.01 0.05 0.750 0.8
## 5 0.3 0.01 0.05 0.875 0.6
## 5 0.3 0.01 0.05 0.875 0.6
## 5 0.3 0.01 0.05 0.875 0.6
## 5 0.3 0.01 0.05 0.875 0.6
## 5 0.3 0.01 0.05 0.875 0.6
## 5 0.3 0.01 0.05 0.875 0.8
## 5 0.3 0.01 0.05 0.875 0.8
## 5 0.3 0.01 0.05 0.875 0.8
## 5 0.3 0.01 0.05 0.875 0.8
## 5 0.3 0.01 0.05 0.875 0.8
## 5 0.3 0.01 0.05 1.000 0.6
## 5 0.3 0.01 0.05 1.000 0.6
## 5 0.3 0.01 0.05 1.000 0.6
## 5 0.3 0.01 0.05 1.000 0.6
## 5 0.3 0.01 0.05 1.000 0.6
## 5 0.3 0.01 0.05 1.000 0.8
## 5 0.3 0.01 0.05 1.000 0.8
## 5 0.3 0.01 0.05 1.000 0.8
## 5 0.3 0.01 0.05 1.000 0.8
## 5 0.3 0.01 0.05 1.000 0.8
## 5 0.3 0.01 0.95 0.500 0.6
## 5 0.3 0.01 0.95 0.500 0.6
## 5 0.3 0.01 0.95 0.500 0.6
## 5 0.3 0.01 0.95 0.500 0.6
## 5 0.3 0.01 0.95 0.500 0.6
## 5 0.3 0.01 0.95 0.500 0.8
## 5 0.3 0.01 0.95 0.500 0.8
## 5 0.3 0.01 0.95 0.500 0.8
## 5 0.3 0.01 0.95 0.500 0.8
## 5 0.3 0.01 0.95 0.500 0.8
## 5 0.3 0.01 0.95 0.625 0.6
## 5 0.3 0.01 0.95 0.625 0.6
## 5 0.3 0.01 0.95 0.625 0.6
## 5 0.3 0.01 0.95 0.625 0.6
## 5 0.3 0.01 0.95 0.625 0.6
## 5 0.3 0.01 0.95 0.625 0.8
## 5 0.3 0.01 0.95 0.625 0.8
## 5 0.3 0.01 0.95 0.625 0.8
## 5 0.3 0.01 0.95 0.625 0.8
## 5 0.3 0.01 0.95 0.625 0.8
## 5 0.3 0.01 0.95 0.750 0.6
## 5 0.3 0.01 0.95 0.750 0.6
## 5 0.3 0.01 0.95 0.750 0.6
## 5 0.3 0.01 0.95 0.750 0.6
## 5 0.3 0.01 0.95 0.750 0.6
## 5 0.3 0.01 0.95 0.750 0.8
## 5 0.3 0.01 0.95 0.750 0.8
## 5 0.3 0.01 0.95 0.750 0.8
## 5 0.3 0.01 0.95 0.750 0.8
## 5 0.3 0.01 0.95 0.750 0.8
## 5 0.3 0.01 0.95 0.875 0.6
## 5 0.3 0.01 0.95 0.875 0.6
## 5 0.3 0.01 0.95 0.875 0.6
## 5 0.3 0.01 0.95 0.875 0.6
## 5 0.3 0.01 0.95 0.875 0.6
## 5 0.3 0.01 0.95 0.875 0.8
## 5 0.3 0.01 0.95 0.875 0.8
## 5 0.3 0.01 0.95 0.875 0.8
## 5 0.3 0.01 0.95 0.875 0.8
## 5 0.3 0.01 0.95 0.875 0.8
## 5 0.3 0.01 0.95 1.000 0.6
## 5 0.3 0.01 0.95 1.000 0.6
## 5 0.3 0.01 0.95 1.000 0.6
## 5 0.3 0.01 0.95 1.000 0.6
## 5 0.3 0.01 0.95 1.000 0.6
## 5 0.3 0.01 0.95 1.000 0.8
## 5 0.3 0.01 0.95 1.000 0.8
## 5 0.3 0.01 0.95 1.000 0.8
## 5 0.3 0.01 0.95 1.000 0.8
## 5 0.3 0.01 0.95 1.000 0.8
## 5 0.3 0.50 0.05 0.500 0.6
## 5 0.3 0.50 0.05 0.500 0.6
## 5 0.3 0.50 0.05 0.500 0.6
## 5 0.3 0.50 0.05 0.500 0.6
## 5 0.3 0.50 0.05 0.500 0.6
## 5 0.3 0.50 0.05 0.500 0.8
## 5 0.3 0.50 0.05 0.500 0.8
## 5 0.3 0.50 0.05 0.500 0.8
## 5 0.3 0.50 0.05 0.500 0.8
## 5 0.3 0.50 0.05 0.500 0.8
## 5 0.3 0.50 0.05 0.625 0.6
## 5 0.3 0.50 0.05 0.625 0.6
## 5 0.3 0.50 0.05 0.625 0.6
## 5 0.3 0.50 0.05 0.625 0.6
## 5 0.3 0.50 0.05 0.625 0.6
## 5 0.3 0.50 0.05 0.625 0.8
## 5 0.3 0.50 0.05 0.625 0.8
## 5 0.3 0.50 0.05 0.625 0.8
## 5 0.3 0.50 0.05 0.625 0.8
## 5 0.3 0.50 0.05 0.625 0.8
## 5 0.3 0.50 0.05 0.750 0.6
## 5 0.3 0.50 0.05 0.750 0.6
## 5 0.3 0.50 0.05 0.750 0.6
## 5 0.3 0.50 0.05 0.750 0.6
## 5 0.3 0.50 0.05 0.750 0.6
## 5 0.3 0.50 0.05 0.750 0.8
## 5 0.3 0.50 0.05 0.750 0.8
## 5 0.3 0.50 0.05 0.750 0.8
## 5 0.3 0.50 0.05 0.750 0.8
## 5 0.3 0.50 0.05 0.750 0.8
## 5 0.3 0.50 0.05 0.875 0.6
## 5 0.3 0.50 0.05 0.875 0.6
## 5 0.3 0.50 0.05 0.875 0.6
## 5 0.3 0.50 0.05 0.875 0.6
## 5 0.3 0.50 0.05 0.875 0.6
## 5 0.3 0.50 0.05 0.875 0.8
## 5 0.3 0.50 0.05 0.875 0.8
## 5 0.3 0.50 0.05 0.875 0.8
## 5 0.3 0.50 0.05 0.875 0.8
## 5 0.3 0.50 0.05 0.875 0.8
## 5 0.3 0.50 0.05 1.000 0.6
## 5 0.3 0.50 0.05 1.000 0.6
## 5 0.3 0.50 0.05 1.000 0.6
## 5 0.3 0.50 0.05 1.000 0.6
## 5 0.3 0.50 0.05 1.000 0.6
## 5 0.3 0.50 0.05 1.000 0.8
## 5 0.3 0.50 0.05 1.000 0.8
## 5 0.3 0.50 0.05 1.000 0.8
## 5 0.3 0.50 0.05 1.000 0.8
## 5 0.3 0.50 0.05 1.000 0.8
## 5 0.3 0.50 0.95 0.500 0.6
## 5 0.3 0.50 0.95 0.500 0.6
## 5 0.3 0.50 0.95 0.500 0.6
## 5 0.3 0.50 0.95 0.500 0.6
## 5 0.3 0.50 0.95 0.500 0.6
## 5 0.3 0.50 0.95 0.500 0.8
## 5 0.3 0.50 0.95 0.500 0.8
## 5 0.3 0.50 0.95 0.500 0.8
## 5 0.3 0.50 0.95 0.500 0.8
## 5 0.3 0.50 0.95 0.500 0.8
## 5 0.3 0.50 0.95 0.625 0.6
## 5 0.3 0.50 0.95 0.625 0.6
## 5 0.3 0.50 0.95 0.625 0.6
## 5 0.3 0.50 0.95 0.625 0.6
## 5 0.3 0.50 0.95 0.625 0.6
## 5 0.3 0.50 0.95 0.625 0.8
## 5 0.3 0.50 0.95 0.625 0.8
## 5 0.3 0.50 0.95 0.625 0.8
## 5 0.3 0.50 0.95 0.625 0.8
## 5 0.3 0.50 0.95 0.625 0.8
## 5 0.3 0.50 0.95 0.750 0.6
## 5 0.3 0.50 0.95 0.750 0.6
## 5 0.3 0.50 0.95 0.750 0.6
## 5 0.3 0.50 0.95 0.750 0.6
## 5 0.3 0.50 0.95 0.750 0.6
## 5 0.3 0.50 0.95 0.750 0.8
## 5 0.3 0.50 0.95 0.750 0.8
## 5 0.3 0.50 0.95 0.750 0.8
## 5 0.3 0.50 0.95 0.750 0.8
## 5 0.3 0.50 0.95 0.750 0.8
## 5 0.3 0.50 0.95 0.875 0.6
## 5 0.3 0.50 0.95 0.875 0.6
## 5 0.3 0.50 0.95 0.875 0.6
## 5 0.3 0.50 0.95 0.875 0.6
## 5 0.3 0.50 0.95 0.875 0.6
## 5 0.3 0.50 0.95 0.875 0.8
## 5 0.3 0.50 0.95 0.875 0.8
## 5 0.3 0.50 0.95 0.875 0.8
## 5 0.3 0.50 0.95 0.875 0.8
## 5 0.3 0.50 0.95 0.875 0.8
## 5 0.3 0.50 0.95 1.000 0.6
## 5 0.3 0.50 0.95 1.000 0.6
## 5 0.3 0.50 0.95 1.000 0.6
## 5 0.3 0.50 0.95 1.000 0.6
## 5 0.3 0.50 0.95 1.000 0.6
## 5 0.3 0.50 0.95 1.000 0.8
## 5 0.3 0.50 0.95 1.000 0.8
## 5 0.3 0.50 0.95 1.000 0.8
## 5 0.3 0.50 0.95 1.000 0.8
## 5 0.3 0.50 0.95 1.000 0.8
## 5 0.4 0.01 0.05 0.500 0.6
## 5 0.4 0.01 0.05 0.500 0.6
## 5 0.4 0.01 0.05 0.500 0.6
## 5 0.4 0.01 0.05 0.500 0.6
## 5 0.4 0.01 0.05 0.500 0.6
## 5 0.4 0.01 0.05 0.500 0.8
## 5 0.4 0.01 0.05 0.500 0.8
## 5 0.4 0.01 0.05 0.500 0.8
## 5 0.4 0.01 0.05 0.500 0.8
## 5 0.4 0.01 0.05 0.500 0.8
## 5 0.4 0.01 0.05 0.625 0.6
## 5 0.4 0.01 0.05 0.625 0.6
## 5 0.4 0.01 0.05 0.625 0.6
## 5 0.4 0.01 0.05 0.625 0.6
## 5 0.4 0.01 0.05 0.625 0.6
## 5 0.4 0.01 0.05 0.625 0.8
## 5 0.4 0.01 0.05 0.625 0.8
## 5 0.4 0.01 0.05 0.625 0.8
## 5 0.4 0.01 0.05 0.625 0.8
## 5 0.4 0.01 0.05 0.625 0.8
## 5 0.4 0.01 0.05 0.750 0.6
## 5 0.4 0.01 0.05 0.750 0.6
## 5 0.4 0.01 0.05 0.750 0.6
## 5 0.4 0.01 0.05 0.750 0.6
## 5 0.4 0.01 0.05 0.750 0.6
## 5 0.4 0.01 0.05 0.750 0.8
## 5 0.4 0.01 0.05 0.750 0.8
## 5 0.4 0.01 0.05 0.750 0.8
## 5 0.4 0.01 0.05 0.750 0.8
## 5 0.4 0.01 0.05 0.750 0.8
## 5 0.4 0.01 0.05 0.875 0.6
## 5 0.4 0.01 0.05 0.875 0.6
## 5 0.4 0.01 0.05 0.875 0.6
## 5 0.4 0.01 0.05 0.875 0.6
## 5 0.4 0.01 0.05 0.875 0.6
## 5 0.4 0.01 0.05 0.875 0.8
## 5 0.4 0.01 0.05 0.875 0.8
## 5 0.4 0.01 0.05 0.875 0.8
## 5 0.4 0.01 0.05 0.875 0.8
## 5 0.4 0.01 0.05 0.875 0.8
## 5 0.4 0.01 0.05 1.000 0.6
## 5 0.4 0.01 0.05 1.000 0.6
## 5 0.4 0.01 0.05 1.000 0.6
## 5 0.4 0.01 0.05 1.000 0.6
## 5 0.4 0.01 0.05 1.000 0.6
## 5 0.4 0.01 0.05 1.000 0.8
## 5 0.4 0.01 0.05 1.000 0.8
## 5 0.4 0.01 0.05 1.000 0.8
## 5 0.4 0.01 0.05 1.000 0.8
## 5 0.4 0.01 0.05 1.000 0.8
## 5 0.4 0.01 0.95 0.500 0.6
## 5 0.4 0.01 0.95 0.500 0.6
## 5 0.4 0.01 0.95 0.500 0.6
## 5 0.4 0.01 0.95 0.500 0.6
## 5 0.4 0.01 0.95 0.500 0.6
## 5 0.4 0.01 0.95 0.500 0.8
## 5 0.4 0.01 0.95 0.500 0.8
## 5 0.4 0.01 0.95 0.500 0.8
## 5 0.4 0.01 0.95 0.500 0.8
## 5 0.4 0.01 0.95 0.500 0.8
## 5 0.4 0.01 0.95 0.625 0.6
## 5 0.4 0.01 0.95 0.625 0.6
## 5 0.4 0.01 0.95 0.625 0.6
## 5 0.4 0.01 0.95 0.625 0.6
## 5 0.4 0.01 0.95 0.625 0.6
## 5 0.4 0.01 0.95 0.625 0.8
## 5 0.4 0.01 0.95 0.625 0.8
## 5 0.4 0.01 0.95 0.625 0.8
## 5 0.4 0.01 0.95 0.625 0.8
## 5 0.4 0.01 0.95 0.625 0.8
## 5 0.4 0.01 0.95 0.750 0.6
## 5 0.4 0.01 0.95 0.750 0.6
## 5 0.4 0.01 0.95 0.750 0.6
## 5 0.4 0.01 0.95 0.750 0.6
## 5 0.4 0.01 0.95 0.750 0.6
## 5 0.4 0.01 0.95 0.750 0.8
## 5 0.4 0.01 0.95 0.750 0.8
## 5 0.4 0.01 0.95 0.750 0.8
## 5 0.4 0.01 0.95 0.750 0.8
## 5 0.4 0.01 0.95 0.750 0.8
## 5 0.4 0.01 0.95 0.875 0.6
## 5 0.4 0.01 0.95 0.875 0.6
## 5 0.4 0.01 0.95 0.875 0.6
## 5 0.4 0.01 0.95 0.875 0.6
## 5 0.4 0.01 0.95 0.875 0.6
## 5 0.4 0.01 0.95 0.875 0.8
## 5 0.4 0.01 0.95 0.875 0.8
## 5 0.4 0.01 0.95 0.875 0.8
## 5 0.4 0.01 0.95 0.875 0.8
## 5 0.4 0.01 0.95 0.875 0.8
## 5 0.4 0.01 0.95 1.000 0.6
## 5 0.4 0.01 0.95 1.000 0.6
## 5 0.4 0.01 0.95 1.000 0.6
## 5 0.4 0.01 0.95 1.000 0.6
## 5 0.4 0.01 0.95 1.000 0.6
## 5 0.4 0.01 0.95 1.000 0.8
## 5 0.4 0.01 0.95 1.000 0.8
## 5 0.4 0.01 0.95 1.000 0.8
## 5 0.4 0.01 0.95 1.000 0.8
## 5 0.4 0.01 0.95 1.000 0.8
## 5 0.4 0.50 0.05 0.500 0.6
## 5 0.4 0.50 0.05 0.500 0.6
## 5 0.4 0.50 0.05 0.500 0.6
## 5 0.4 0.50 0.05 0.500 0.6
## 5 0.4 0.50 0.05 0.500 0.6
## 5 0.4 0.50 0.05 0.500 0.8
## 5 0.4 0.50 0.05 0.500 0.8
## 5 0.4 0.50 0.05 0.500 0.8
## 5 0.4 0.50 0.05 0.500 0.8
## 5 0.4 0.50 0.05 0.500 0.8
## 5 0.4 0.50 0.05 0.625 0.6
## 5 0.4 0.50 0.05 0.625 0.6
## 5 0.4 0.50 0.05 0.625 0.6
## 5 0.4 0.50 0.05 0.625 0.6
## 5 0.4 0.50 0.05 0.625 0.6
## 5 0.4 0.50 0.05 0.625 0.8
## 5 0.4 0.50 0.05 0.625 0.8
## 5 0.4 0.50 0.05 0.625 0.8
## 5 0.4 0.50 0.05 0.625 0.8
## 5 0.4 0.50 0.05 0.625 0.8
## 5 0.4 0.50 0.05 0.750 0.6
## 5 0.4 0.50 0.05 0.750 0.6
## 5 0.4 0.50 0.05 0.750 0.6
## 5 0.4 0.50 0.05 0.750 0.6
## 5 0.4 0.50 0.05 0.750 0.6
## 5 0.4 0.50 0.05 0.750 0.8
## 5 0.4 0.50 0.05 0.750 0.8
## 5 0.4 0.50 0.05 0.750 0.8
## 5 0.4 0.50 0.05 0.750 0.8
## 5 0.4 0.50 0.05 0.750 0.8
## 5 0.4 0.50 0.05 0.875 0.6
## 5 0.4 0.50 0.05 0.875 0.6
## 5 0.4 0.50 0.05 0.875 0.6
## 5 0.4 0.50 0.05 0.875 0.6
## 5 0.4 0.50 0.05 0.875 0.6
## 5 0.4 0.50 0.05 0.875 0.8
## 5 0.4 0.50 0.05 0.875 0.8
## 5 0.4 0.50 0.05 0.875 0.8
## 5 0.4 0.50 0.05 0.875 0.8
## 5 0.4 0.50 0.05 0.875 0.8
## 5 0.4 0.50 0.05 1.000 0.6
## 5 0.4 0.50 0.05 1.000 0.6
## 5 0.4 0.50 0.05 1.000 0.6
## 5 0.4 0.50 0.05 1.000 0.6
## 5 0.4 0.50 0.05 1.000 0.6
## 5 0.4 0.50 0.05 1.000 0.8
## 5 0.4 0.50 0.05 1.000 0.8
## 5 0.4 0.50 0.05 1.000 0.8
## 5 0.4 0.50 0.05 1.000 0.8
## 5 0.4 0.50 0.05 1.000 0.8
## 5 0.4 0.50 0.95 0.500 0.6
## 5 0.4 0.50 0.95 0.500 0.6
## 5 0.4 0.50 0.95 0.500 0.6
## 5 0.4 0.50 0.95 0.500 0.6
## 5 0.4 0.50 0.95 0.500 0.6
## 5 0.4 0.50 0.95 0.500 0.8
## 5 0.4 0.50 0.95 0.500 0.8
## 5 0.4 0.50 0.95 0.500 0.8
## 5 0.4 0.50 0.95 0.500 0.8
## 5 0.4 0.50 0.95 0.500 0.8
## 5 0.4 0.50 0.95 0.625 0.6
## 5 0.4 0.50 0.95 0.625 0.6
## 5 0.4 0.50 0.95 0.625 0.6
## 5 0.4 0.50 0.95 0.625 0.6
## 5 0.4 0.50 0.95 0.625 0.6
## 5 0.4 0.50 0.95 0.625 0.8
## 5 0.4 0.50 0.95 0.625 0.8
## 5 0.4 0.50 0.95 0.625 0.8
## 5 0.4 0.50 0.95 0.625 0.8
## 5 0.4 0.50 0.95 0.625 0.8
## 5 0.4 0.50 0.95 0.750 0.6
## 5 0.4 0.50 0.95 0.750 0.6
## 5 0.4 0.50 0.95 0.750 0.6
## 5 0.4 0.50 0.95 0.750 0.6
## 5 0.4 0.50 0.95 0.750 0.6
## 5 0.4 0.50 0.95 0.750 0.8
## 5 0.4 0.50 0.95 0.750 0.8
## 5 0.4 0.50 0.95 0.750 0.8
## 5 0.4 0.50 0.95 0.750 0.8
## 5 0.4 0.50 0.95 0.750 0.8
## 5 0.4 0.50 0.95 0.875 0.6
## 5 0.4 0.50 0.95 0.875 0.6
## 5 0.4 0.50 0.95 0.875 0.6
## 5 0.4 0.50 0.95 0.875 0.6
## 5 0.4 0.50 0.95 0.875 0.6
## 5 0.4 0.50 0.95 0.875 0.8
## 5 0.4 0.50 0.95 0.875 0.8
## 5 0.4 0.50 0.95 0.875 0.8
## 5 0.4 0.50 0.95 0.875 0.8
## 5 0.4 0.50 0.95 0.875 0.8
## 5 0.4 0.50 0.95 1.000 0.6
## 5 0.4 0.50 0.95 1.000 0.6
## 5 0.4 0.50 0.95 1.000 0.6
## 5 0.4 0.50 0.95 1.000 0.6
## 5 0.4 0.50 0.95 1.000 0.6
## 5 0.4 0.50 0.95 1.000 0.8
## 5 0.4 0.50 0.95 1.000 0.8
## 5 0.4 0.50 0.95 1.000 0.8
## 5 0.4 0.50 0.95 1.000 0.8
## 5 0.4 0.50 0.95 1.000 0.8
## nrounds ROC Sens Spec
## 50 0.9022676 0.8623443 0.7753053
## 100 0.9005646 0.8680769 0.7604251
## 150 0.8966960 0.8623077 0.7602895
## 200 0.8997248 0.8604212 0.7692899
## 250 0.9010432 0.8699634 0.7663501
## [... resampled ROC, Sens and Spec for the remaining candidate models truncated for readability; across the candidates shown, ROC stayed roughly in the 0.87-0.91 range ...]
## 100 0.8866563 0.8546703 0.7512438
## 150 0.8833146 0.8450733 0.7632745
## 200 0.8812530 0.8489011 0.7543193
## 250 0.8817983 0.8546337 0.7602895
## 50 0.8977597 0.8527839 0.7722750
## 100 0.8928935 0.8584249 0.7782451
## 150 0.8901484 0.8545971 0.7782904
## 200 0.8864524 0.8584066 0.7812754
## 250 0.8845046 0.8527106 0.7812754
## 50 0.8952368 0.8642308 0.7693804
## 100 0.8888513 0.8527289 0.7513342
## 150 0.8861552 0.8546703 0.7572592
## 200 0.8845189 0.8546337 0.7602442
## 250 0.8834301 0.8527473 0.7632745
## 50 0.8963655 0.8604212 0.7452736
## 100 0.8905856 0.8622711 0.7393487
## 150 0.8882560 0.8584432 0.7542741
## 200 0.8841910 0.8526923 0.7542741
## 250 0.8813249 0.8488462 0.7512438
## 50 0.8957051 0.8699817 0.7602895
## 100 0.8887429 0.8584615 0.7572592
## 150 0.8851786 0.8546337 0.7542741
## 200 0.8834418 0.8527106 0.7572592
## 250 0.8811000 0.8508059 0.7482587
## 50 0.8923999 0.8566117 0.7572592
## 100 0.8846507 0.8642491 0.7363184
## 150 0.8782088 0.8527473 0.7482587
## 200 0.8727905 0.8470513 0.7452736
## 250 0.8755993 0.8565751 0.7483944
## 50 0.8914390 0.8700000 0.7602442
## 100 0.8809877 0.8565934 0.7601990
## 150 0.8714182 0.8469780 0.7722750
## 200 0.8728648 0.8412821 0.7752601
## 250 0.8726736 0.8546886 0.7693351
## 50 0.8922463 0.8508425 0.7811850
## 100 0.8808406 0.8546337 0.7631841
## 150 0.8777764 0.8431136 0.7452284
## 200 0.8754632 0.8469780 0.7632293
## 250 0.8709370 0.8412454 0.7602895
## 50 0.8961866 0.8527656 0.7632293
## 100 0.8850736 0.8527473 0.7692447
## 150 0.8815276 0.8584249 0.7783356
## 200 0.8770541 0.8527473 0.7633650
## 250 0.8759557 0.8527473 0.7603799
## 50 0.8984948 0.8470513 0.7601538
## 100 0.8929711 0.8527473 0.7752148
## 150 0.8853408 0.8508425 0.7631389
## 200 0.8800651 0.8546703 0.7692447
## 250 0.8793314 0.8546703 0.7633198
## 50 0.8963168 0.8603846 0.7752601
## 100 0.8852772 0.8546337 0.7753053
## 150 0.8802687 0.8508059 0.7572592
## 200 0.8766413 0.8431868 0.7543193
## 250 0.8755225 0.8451465 0.7483492
## 50 0.8946247 0.8546337 0.7692899
## 100 0.8859513 0.8585531 0.7752601
## 150 0.8788196 0.8489194 0.7722750
## 200 0.8761971 0.8470147 0.7633198
## 250 0.8738559 0.8431502 0.7573496
## 50 0.8922631 0.8489011 0.7692447
## 100 0.8873546 0.8508242 0.7691542
## 150 0.8806679 0.8527473 0.7631389
## 200 0.8788328 0.8470330 0.7662596
## 250 0.8757568 0.8489194 0.7752148
## 50 0.8963413 0.8642308 0.7752148
## 100 0.8871961 0.8546520 0.7601990
## 150 0.8823717 0.8527473 0.7392583
## 200 0.8766180 0.8431502 0.7512438
## 250 0.8736270 0.8431502 0.7453189
## 50 0.8958121 0.8603846 0.7572592
## 100 0.8889433 0.8527106 0.7542289
## 150 0.8829527 0.8469963 0.7452284
## 200 0.8793505 0.8546337 0.7543193
## 250 0.8777454 0.8507692 0.7573044
## 50 0.9014029 0.8719048 0.7393487
## 100 0.9057907 0.8661905 0.7662596
## 150 0.9067420 0.8776374 0.7633198
## 200 0.9063387 0.8681319 0.7632745
## 250 0.9056374 0.8700183 0.7782904
## 50 0.9043467 0.8757326 0.7602895
## 100 0.9057327 0.8756960 0.7723202
## 150 0.9057941 0.8699267 0.7842153
## 200 0.9057562 0.8737912 0.7722298
## 250 0.9040369 0.8623626 0.7872004
## 50 0.9020455 0.8834066 0.7481682
## 100 0.9064803 0.8814469 0.7661692
## 150 0.9079998 0.8833700 0.7752148
## 200 0.9047554 0.8737912 0.7781999
## 250 0.9042908 0.8680769 0.7692447
## 50 0.9063203 0.8833516 0.7872456
## 100 0.9105354 0.8680952 0.7962913
## 150 0.9055989 0.8681136 0.7812302
## 200 0.9068125 0.8680952 0.7812302
## 250 0.9044994 0.8719231 0.7782451
## 50 0.9028033 0.8814469 0.7483944
## 100 0.9046564 0.8776190 0.7514247
## 150 0.9064157 0.8719048 0.7693351
## 200 0.9055778 0.8757326 0.7722750
## 250 0.9028380 0.8604029 0.7663048
## 50 0.9049296 0.8871978 0.7753053
## 100 0.9060946 0.8757326 0.7752148
## 150 0.9054661 0.8738095 0.7812302
## 200 0.9037400 0.8718864 0.7782451
## 250 0.9020370 0.8737729 0.7781999
## 50 0.9046361 0.8738095 0.7483944
## 100 0.9055536 0.8757143 0.7513342
## 150 0.9062790 0.8738278 0.7663501
## 200 0.9052312 0.8738095 0.7663953
## 250 0.9025589 0.8719231 0.7692899
## 50 0.9019149 0.8814652 0.7663048
## 100 0.9033784 0.8776374 0.7722750
## 150 0.9042343 0.8776923 0.7723654
## 200 0.9051507 0.8738462 0.7722750
## 250 0.9048360 0.8757509 0.7723202
## 50 0.8987154 0.8776740 0.7393035
## 100 0.9051770 0.8776740 0.7812754
## 150 0.9061157 0.8757692 0.7753505
## 200 0.9048055 0.8738645 0.7813207
## 250 0.9051275 0.8680769 0.7902759
## 50 0.9037164 0.8834066 0.7782904
## 100 0.9035712 0.8795604 0.7932610
## 150 0.9036987 0.8757326 0.7873360
## 200 0.9042342 0.8757509 0.7843057
## 250 0.9023537 0.8719231 0.7872456
## 50 0.8886373 0.8527289 0.7512438
## 100 0.8843333 0.8489011 0.7511986
## 150 0.8788514 0.8393407 0.7753053
## 200 0.8726672 0.8393590 0.7662596
## 250 0.8688701 0.8278755 0.7513342
## 50 0.8912983 0.8585165 0.7662596
## 100 0.8770703 0.8508608 0.7633650
## 150 0.8763964 0.8546703 0.7543193
## 200 0.8719815 0.8527656 0.7663048
## 250 0.8733837 0.8412637 0.7483492
## 50 0.8970443 0.8413187 0.7662144
## 100 0.8856779 0.8623260 0.7632745
## 150 0.8811856 0.8546703 0.7482135
## 200 0.8800375 0.8451099 0.7542741
## 250 0.8751373 0.8374542 0.7572592
## 50 0.8866683 0.8508425 0.7632745
## 100 0.8793307 0.8565751 0.7572592
## 150 0.8772693 0.8508425 0.7572592
## 200 0.8729576 0.8412821 0.7573496
## 250 0.8674468 0.8469963 0.7453189
## 50 0.8957153 0.8642125 0.7632293
## 100 0.8870295 0.8565751 0.7632293
## 150 0.8799249 0.8527473 0.7634102
## 200 0.8769179 0.8546703 0.7633650
## 250 0.8753701 0.8412637 0.7543193
## 50 0.8939400 0.8680403 0.7454093
## 100 0.8855220 0.8489194 0.7752148
## 150 0.8802581 0.8527106 0.7692899
## 200 0.8721584 0.8411905 0.7662596
## 250 0.8698814 0.8431136 0.7454093
## 50 0.8956893 0.8641941 0.7423338
## 100 0.8880378 0.8527473 0.7573044
## 150 0.8850304 0.8489194 0.7602895
## 200 0.8809376 0.8412637 0.7632745
## 250 0.8754761 0.8336081 0.7663048
## 50 0.8949885 0.8584982 0.7513342
## 100 0.8865440 0.8565751 0.7512890
## 150 0.8809755 0.8527656 0.7482135
## 200 0.8767875 0.8527289 0.7513342
## 250 0.8755378 0.8431868 0.7453189
## 50 0.8970250 0.8642125 0.7603799
## 100 0.8897231 0.8623077 0.7542289
## 150 0.8848989 0.8526923 0.7542741
## 200 0.8817817 0.8527289 0.7632745
## 250 0.8770224 0.8431685 0.7632293
## 50 0.8952683 0.8699634 0.7632745
## 100 0.8868673 0.8584615 0.7542741
## 150 0.8823517 0.8642125 0.7483039
## 200 0.8794621 0.8469780 0.7513342
## 250 0.8773153 0.8450733 0.7543193
## 50 0.8900919 0.8604396 0.7811850
## 100 0.8812344 0.8489560 0.7542289
## 150 0.8808785 0.8489194 0.7752601
## 200 0.8742125 0.8354579 0.7722298
## 250 0.8734197 0.8393223 0.7781547
## 50 0.8784033 0.8393407 0.7481230
## 100 0.8765038 0.8374359 0.7512438
## 150 0.8705244 0.8412271 0.7513342
## 200 0.8746766 0.8527839 0.7513342
## 250 0.8741763 0.8565751 0.7423790
## 50 0.8887991 0.8603480 0.7332881
## 100 0.8772524 0.8565201 0.7334690
## 150 0.8799630 0.8641941 0.7454093
## 200 0.8805642 0.8527473 0.7544098
## 250 0.8787634 0.8547070 0.7573948
## 50 0.8913117 0.8585165 0.7663048
## 100 0.8815842 0.8546703 0.7602442
## 150 0.8805052 0.8489194 0.7542289
## 200 0.8762391 0.8451099 0.7721845
## 250 0.8763450 0.8489194 0.7542741
## 50 0.8863470 0.8584249 0.7423790
## 100 0.8814835 0.8585165 0.7453189
## 150 0.8805925 0.8565568 0.7543193
## 200 0.8772595 0.8566300 0.7572592
## 250 0.8769479 0.8470696 0.7632293
## 50 0.8897641 0.8489744 0.7813659
## 100 0.8808908 0.8584799 0.7693351
## 150 0.8769263 0.8603480 0.7542741
## 200 0.8775046 0.8584799 0.7632293
## 250 0.8804301 0.8642125 0.7543193
## 50 0.8927520 0.8604396 0.7542289
## 100 0.8866877 0.8546154 0.7751244
## 150 0.8826159 0.8565568 0.7573044
## 200 0.8782982 0.8431502 0.7633198
## 250 0.8781474 0.8508425 0.7632745
## 50 0.8895559 0.8642308 0.7452284
## 100 0.8818590 0.8527473 0.7511986
## 150 0.8801838 0.8546703 0.7573044
## 200 0.8778894 0.8508425 0.7602895
## 250 0.8757715 0.8450916 0.7573044
## 50 0.8877414 0.8470330 0.7632745
## 100 0.8846326 0.8470147 0.7572592
## 150 0.8809462 0.8546337 0.7512890
## 200 0.8792321 0.8469780 0.7542289
## 250 0.8775981 0.8489377 0.7542289
## 50 0.8928489 0.8527473 0.7602895
## 100 0.8856138 0.8584982 0.7483039
## 150 0.8824219 0.8412821 0.7512438
## 200 0.8818684 0.8527473 0.7392583
## 250 0.8791472 0.8527656 0.7422886
## 50 0.8890225 0.8470147 0.7572592
## 100 0.8831233 0.8603480 0.7513795
## 150 0.8768519 0.8469780 0.7334690
## 200 0.8726718 0.8469963 0.7364541
## 250 0.8704253 0.8431502 0.7424242
## 50 0.8860038 0.8604762 0.7602442
## 100 0.8756685 0.8374176 0.7483944
## 150 0.8692305 0.8317033 0.7633198
## 200 0.8656254 0.8412454 0.7453641
## 250 0.8658771 0.8316850 0.7483944
## 50 0.8859798 0.8584615 0.7603347
## 100 0.8781492 0.8392857 0.7513342
## 150 0.8759609 0.8393040 0.7573496
## 200 0.8712627 0.8451099 0.7573496
## 250 0.8681694 0.8470330 0.7543193
## 50 0.8863604 0.8604212 0.7662596
## 100 0.8778141 0.8623077 0.7424242
## 150 0.8714625 0.8489011 0.7424695
## 200 0.8667616 0.8546154 0.7454093
## 250 0.8690850 0.8527473 0.7424242
## 50 0.8885438 0.8776740 0.7542741
## 100 0.8836776 0.8585165 0.7602442
## 150 0.8771611 0.8526923 0.7543193
## 200 0.8737143 0.8469780 0.7633198
## 250 0.8713235 0.8393223 0.7632745
## 50 0.8808934 0.8565751 0.7573496
## 100 0.8765657 0.8565751 0.7393939
## 150 0.8741980 0.8508608 0.7483039
## 200 0.8704876 0.8431685 0.7483944
## 250 0.8685941 0.8374725 0.7513342
## 50 0.8889830 0.8565751 0.7722750
## 100 0.8816603 0.8622527 0.7722298
## 150 0.8754828 0.8431685 0.7692899
## 200 0.8723021 0.8469780 0.7633198
## 250 0.8716049 0.8412454 0.7573496
## 50 0.8871544 0.8584432 0.7572592
## 100 0.8779673 0.8489011 0.7601990
## 150 0.8757220 0.8545971 0.7543193
## 200 0.8736745 0.8565751 0.7483492
## 250 0.8700821 0.8527473 0.7453641
## 50 0.8925941 0.8565385 0.7482135
## 100 0.8826698 0.8432051 0.7572592
## 150 0.8799777 0.8489560 0.7632745
## 200 0.8759904 0.8432234 0.7543193
## 250 0.8735034 0.8412821 0.7572592
## 50 0.8916015 0.8584799 0.7572592
## 100 0.8837756 0.8642125 0.7603347
## 150 0.8791715 0.8508242 0.7602442
## 200 0.8766093 0.8584615 0.7662596
## 250 0.8720530 0.8488828 0.7572139
## 50 0.8989426 0.8718864 0.7393487
## 100 0.9000381 0.8776374 0.7781999
## 150 0.8979999 0.8661722 0.7811398
## 200 0.8984003 0.8623443 0.7722750
## 250 0.8985796 0.8661905 0.7633198
## 50 0.9055431 0.8642491 0.7813207
## 100 0.9054220 0.8718681 0.7722750
## 150 0.9031344 0.8776190 0.7692899
## 200 0.9030008 0.8661722 0.7781547
## 250 0.9030296 0.8642308 0.7722298
## 50 0.9062409 0.8852564 0.7573948
## 100 0.9064552 0.8776374 0.7512890
## 150 0.9063123 0.8737912 0.7514247
## 200 0.9038202 0.8757143 0.7603799
## 250 0.9054796 0.8661722 0.7812302
## 50 0.9037062 0.8814652 0.7813659
## 100 0.9016744 0.8757326 0.7783808
## 150 0.9041861 0.8681136 0.7843510
## 200 0.9049163 0.8699817 0.7692447
## 250 0.9055121 0.8585531 0.7691995
## 50 0.9005515 0.8680586 0.7393939
## 100 0.9011278 0.8738095 0.7513342
## 150 0.9009682 0.8719414 0.7544098
## 200 0.8995403 0.8661722 0.7543645
## 250 0.9011842 0.8661355 0.7573044
## 50 0.9046325 0.8718864 0.7663048
## 100 0.9013882 0.8756777 0.7573496
## 150 0.9014438 0.8776557 0.7692899
## 200 0.9027194 0.8814286 0.7663048
## 250 0.9017595 0.8756593 0.7692899
## 50 0.9043265 0.8795055 0.7573496
## 100 0.9020824 0.8718498 0.7513795
## 150 0.9026756 0.8718681 0.7751696
## 200 0.9032253 0.8756593 0.7692447
## 250 0.9030990 0.8660989 0.7722750
## 50 0.9044820 0.8795421 0.7603799
## 100 0.9023841 0.8776557 0.7632745
## 150 0.9025300 0.8757875 0.7662596
## 200 0.9018402 0.8738828 0.7692899
## 250 0.9013715 0.8719231 0.7722750
## 50 0.9051852 0.8871978 0.7663501
## 100 0.9039248 0.8776374 0.7723202
## 150 0.9064993 0.8699817 0.7842605
## 200 0.9055792 0.8699634 0.7842605
## 250 0.9040388 0.8642125 0.7842605
## 50 0.9037330 0.8814103 0.7663048
## 100 0.9048765 0.8795055 0.7722750
## 150 0.9024496 0.8718681 0.7782451
## 200 0.9025541 0.8757143 0.7782451
## 250 0.9015541 0.8738278 0.7782451
## 50 0.8919482 0.8489560 0.7721845
## 100 0.8778024 0.8469414 0.7513795
## 150 0.8728197 0.8393773 0.7483039
## 200 0.8711981 0.8355311 0.7393939
## 250 0.8689293 0.8374725 0.7423790
## 50 0.8835833 0.8355128 0.7632745
## 100 0.8765139 0.8527656 0.7603799
## 150 0.8752542 0.8489011 0.7453641
## 200 0.8775431 0.8469780 0.7544098
## 250 0.8792401 0.8508425 0.7603347
## 50 0.8872311 0.8566484 0.7572592
## 100 0.8792590 0.8470330 0.7692447
## 150 0.8743964 0.8451465 0.7662596
## 200 0.8693502 0.8469963 0.7723202
## 250 0.8699554 0.8431502 0.7573948
## 50 0.8903448 0.8584615 0.7721393
## 100 0.8848256 0.8565751 0.7662596
## 150 0.8819107 0.8488828 0.7781999
## 200 0.8707319 0.8316667 0.7662144
## 250 0.8718247 0.8412821 0.7453189
## 50 0.8900381 0.8565934 0.7541836
## 100 0.8809140 0.8604029 0.7543193
## 150 0.8741797 0.8451099 0.7603347
## 200 0.8682944 0.8489194 0.7573496
## 250 0.8662013 0.8508059 0.7573496
## 50 0.8915678 0.8565201 0.7722750
## 100 0.8817129 0.8546337 0.7483492
## 150 0.8778763 0.8508242 0.7632745
## 200 0.8751543 0.8393407 0.7572592
## 250 0.8719228 0.8374176 0.7512890
## 50 0.8866928 0.8546703 0.7601990
## 100 0.8795376 0.8316850 0.7691995
## 150 0.8746141 0.8412271 0.7782451
## 200 0.8728071 0.8393590 0.7662596
## 250 0.8732301 0.8393590 0.7662596
## 50 0.8935613 0.8527656 0.7632293
## 100 0.8803338 0.8508242 0.7692447
## 150 0.8771587 0.8546520 0.7573044
## 200 0.8725200 0.8470147 0.7602895
## 250 0.8706603 0.8489011 0.7453189
## 50 0.8906950 0.8565751 0.7572592
## 100 0.8845555 0.8623443 0.7632293
## 150 0.8800773 0.8546337 0.7601538
## 200 0.8747896 0.8450183 0.7602442
## 250 0.8707223 0.8431136 0.7542289
## 50 0.8919754 0.8641758 0.7721845
## 100 0.8838793 0.8489011 0.7662144
## 150 0.8769868 0.8508242 0.7512438
## 200 0.8735857 0.8451099 0.7662596
## 250 0.8707503 0.8489194 0.7692899
## 50 0.8888194 0.8412637 0.7602442
## 100 0.8874671 0.8565934 0.7663048
## 150 0.8806617 0.8489377 0.7692447
## 200 0.8820639 0.8546886 0.7602895
## 250 0.8780724 0.8565568 0.7512438
## 50 0.8797186 0.8604396 0.7333786
## 100 0.8768057 0.8527289 0.7243781
## 150 0.8759925 0.8450549 0.7513342
## 200 0.8751595 0.8527473 0.7574401
## 250 0.8741172 0.8507875 0.7454093
## 50 0.8907242 0.8585165 0.7423338
## 100 0.8832072 0.8527473 0.7572592
## 150 0.8807545 0.8508425 0.7602895
## 200 0.8797619 0.8489377 0.7633198
## 250 0.8758855 0.8412637 0.7603347
## 50 0.8901194 0.8508425 0.7783356
## 100 0.8846228 0.8584615 0.7633650
## 150 0.8781091 0.8642125 0.7573948
## 200 0.8780487 0.8604212 0.7573496
## 250 0.8742555 0.8623443 0.7424242
## 50 0.8871096 0.8546154 0.7572592
## 100 0.8826730 0.8470330 0.7603347
## 150 0.8759945 0.8489011 0.7633198
## 200 0.8725763 0.8451282 0.7603347
## 250 0.8741795 0.8413004 0.7663048
## 50 0.8878480 0.8566300 0.7661692
## 100 0.8791040 0.8451099 0.7692899
## 150 0.8771921 0.8374176 0.7692899
## 200 0.8749290 0.8413004 0.7543645
## 250 0.8739550 0.8355495 0.7573496
## 50 0.8903441 0.8527289 0.7722750
## 100 0.8825686 0.8508425 0.7633650
## 150 0.8789874 0.8565934 0.7663048
## 200 0.8764304 0.8393773 0.7633650
## 250 0.8767079 0.8432051 0.7722750
## 50 0.8881585 0.8623077 0.7572592
## 100 0.8796982 0.8508425 0.7453641
## 150 0.8767766 0.8470330 0.7483492
## 200 0.8754835 0.8527289 0.7483039
## 250 0.8748248 0.8489194 0.7483944
## 50 0.8922152 0.8527289 0.7632745
## 100 0.8855003 0.8412637 0.7602895
## 150 0.8811104 0.8431502 0.7422886
## 200 0.8787900 0.8469963 0.7483039
## 250 0.8769939 0.8508425 0.7423338
## 50 0.8953328 0.8642308 0.7662144
## 100 0.8867430 0.8546520 0.7602442
## 150 0.8830603 0.8469780 0.7572139
## 200 0.8786514 0.8469963 0.7543645
## 250 0.8774226 0.8412271 0.7453189
## 50 0.8897409 0.8680220 0.7602895
## 100 0.8768703 0.8508242 0.7631841
## 150 0.8770454 0.8489011 0.7544098
## 200 0.8739825 0.8469780 0.7573948
## 250 0.8708040 0.8336630 0.7782904
## 50 0.8866858 0.8642491 0.7541836
## 100 0.8783295 0.8527473 0.7452736
## 150 0.8743873 0.8470330 0.7363184
## 200 0.8737080 0.8431319 0.7303935
## 250 0.8709281 0.8393223 0.7393939
## 50 0.8859216 0.8488828 0.7603347
## 100 0.8793205 0.8451099 0.7423338
## 150 0.8717163 0.8450549 0.7662596
## 200 0.8734390 0.8488462 0.7573044
## 250 0.8710569 0.8469597 0.7393487
## 50 0.8846685 0.8641941 0.7573496
## 100 0.8715969 0.8584799 0.7483492
## 150 0.8694763 0.8527473 0.7394392
## 200 0.8675972 0.8508242 0.7573044
## 250 0.8671734 0.8470330 0.7483039
## 50 0.8944017 0.8604212 0.7541836
## 100 0.8798579 0.8547070 0.7453189
## 150 0.8797917 0.8470147 0.7602442
## 200 0.8755086 0.8374359 0.7572592
## 250 0.8727125 0.8374542 0.7512438
## 50 0.8904027 0.8603846 0.7662144
## 100 0.8796541 0.8565385 0.7393939
## 150 0.8741908 0.8431319 0.7512890
## 200 0.8707887 0.8374542 0.7483492
## 250 0.8688357 0.8431502 0.7543193
## 50 0.8891390 0.8603663 0.7542289
## 100 0.8800915 0.8641941 0.7661692
## 150 0.8729973 0.8546337 0.7542289
## 200 0.8700464 0.8412454 0.7482587
## 250 0.8658977 0.8450733 0.7452736
## 50 0.8875487 0.8489377 0.7601990
## 100 0.8768144 0.8566117 0.7573044
## 150 0.8728485 0.8584799 0.7543193
## 200 0.8726744 0.8508425 0.7543645
## 250 0.8688960 0.8432234 0.7513342
## 50 0.8894893 0.8565934 0.7542289
## 100 0.8813171 0.8470147 0.7512890
## 150 0.8775589 0.8431685 0.7542289
## 200 0.8730319 0.8450733 0.7423790
## 250 0.8701966 0.8450733 0.7513342
## 50 0.8924847 0.8604029 0.7631841
## 100 0.8832363 0.8489377 0.7511533
## 150 0.8792281 0.8489194 0.7362280
## 200 0.8765903 0.8469963 0.7422886
## 250 0.8734428 0.8489194 0.7362732
## 50 0.8970404 0.8680037 0.7393035
## 100 0.9004187 0.8699817 0.7423338
## 150 0.8984220 0.8680220 0.7483039
## 200 0.8966519 0.8565751 0.7662144
## 250 0.8963005 0.8565751 0.7542289
## 50 0.9058860 0.8680220 0.7662596
## 100 0.9022618 0.8622894 0.7631841
## 150 0.9002770 0.8661172 0.7631389
## 200 0.9007463 0.8603846 0.7511533
## 250 0.8989982 0.8622894 0.7542741
## 50 0.8924444 0.8680769 0.7363184
## 100 0.8959743 0.8680769 0.7603347
## 150 0.8957327 0.8699634 0.7633198
## 200 0.8958560 0.8642308 0.7663048
## 250 0.8995072 0.8660989 0.7633198
## 50 0.9022862 0.8738278 0.7752148
## 100 0.9028077 0.8604579 0.7722298
## 150 0.9039749 0.8699817 0.7691995
## 200 0.9017711 0.8584982 0.7721845
## 250 0.9013106 0.8623260 0.7722750
## 50 0.9000986 0.8699634 0.7453189
## 100 0.8975419 0.8661905 0.7602895
## 150 0.8968892 0.8604029 0.7602895
## 200 0.8988841 0.8642674 0.7602442
## 250 0.8991782 0.8661905 0.7632745
## 50 0.9004727 0.8738462 0.7632293
## 100 0.9022661 0.8623443 0.7722298
## 150 0.9017958 0.8642125 0.7781999
## 200 0.8997614 0.8661355 0.7751696
## 250 0.9010833 0.8718498 0.7632293
## 50 0.8979430 0.8738645 0.7332429
## 100 0.9014116 0.8738278 0.7483039
## 150 0.8995188 0.8662271 0.7722750
## 200 0.8986310 0.8661722 0.7573044
## 250 0.8983972 0.8680952 0.7663048
## 50 0.9036091 0.8795788 0.7572592
## 100 0.9025644 0.8757326 0.7662144
## 150 0.9010122 0.8719048 0.7782451
## 200 0.9014008 0.8718864 0.7692899
## 250 0.9006846 0.8718864 0.7722750
## 50 0.8952945 0.8775641 0.7334690
## 100 0.8967264 0.8776190 0.7424242
## 150 0.9010812 0.8680403 0.7573948
## 200 0.9000211 0.8699634 0.7513795
## 250 0.9001154 0.8680586 0.7513795
## 50 0.9002607 0.8680769 0.7692447
## 100 0.9004769 0.8642491 0.7722750
## 150 0.8993697 0.8604029 0.7692447
## 200 0.8998830 0.8584615 0.7753053
## 250 0.8985949 0.8565385 0.7812754
## 50 0.8916821 0.8470330 0.7631841
## 100 0.8792006 0.8431685 0.7542289
## 150 0.8765462 0.8412821 0.7543193
## 200 0.8722590 0.8431685 0.7692447
## 250 0.8716735 0.8451099 0.7662596
## 50 0.8873113 0.8566117 0.7783356
## 100 0.8788868 0.8317216 0.7782451
## 150 0.8704513 0.8508242 0.7483039
## 200 0.8721783 0.8412454 0.7453641
## 250 0.8684489 0.8393407 0.7423790
## 50 0.8866495 0.8527656 0.7692447
## 100 0.8819730 0.8451099 0.7632745
## 150 0.8768193 0.8451465 0.7663048
## 200 0.8714877 0.8431868 0.7572592
## 250 0.8718520 0.8393773 0.7632293
## 50 0.8888100 0.8508608 0.7722298
## 100 0.8811921 0.8508608 0.7572592
## 150 0.8768332 0.8470330 0.7483944
## 200 0.8711288 0.8450916 0.7423338
## 250 0.8668104 0.8431502 0.7422886
## 50 0.8877303 0.8470147 0.7661692
## 100 0.8783352 0.8451465 0.7572139
## 150 0.8736950 0.8431502 0.7692447
## 200 0.8746923 0.8470147 0.7663048
## 250 0.8717939 0.8393590 0.7512890
## 50 0.8815308 0.8546337 0.7663048
## 100 0.8781544 0.8450916 0.7662596
## 150 0.8751599 0.8450733 0.7603347
## 200 0.8707283 0.8450916 0.7573044
## 250 0.8705534 0.8431868 0.7453189
## 50 0.8901825 0.8565018 0.7722750
## 100 0.8837577 0.8507875 0.7483492
## 150 0.8790613 0.8488828 0.7542741
## 200 0.8740440 0.8526923 0.7542741
## 250 0.8728066 0.8412637 0.7542741
## 50 0.8867965 0.8565568 0.7601990
## 100 0.8785130 0.8450916 0.7632745
## 150 0.8737038 0.8413004 0.7513342
## 200 0.8702877 0.8432234 0.7602895
## 250 0.8684581 0.8431868 0.7572139
## 50 0.8892420 0.8566117 0.7512890
## 100 0.8849944 0.8546703 0.7392583
## 150 0.8793574 0.8450916 0.7452736
## 200 0.8752137 0.8508425 0.7452284
## 250 0.8717114 0.8450916 0.7452284
## 50 0.8930991 0.8680403 0.7632293
## 100 0.8816946 0.8565568 0.7512438
## 150 0.8769190 0.8527106 0.7572592
## 200 0.8736829 0.8431685 0.7482587
## 250 0.8715129 0.8393590 0.7512438
## 50 0.8784170 0.8470147 0.7813207
## 100 0.8742641 0.8373993 0.7573496
## 150 0.8688575 0.8335714 0.7363636
## 200 0.8705020 0.8431685 0.7482587
## 250 0.8655376 0.8374359 0.7393487
## 50 0.8742191 0.8566484 0.7573948
## 100 0.8812693 0.8546703 0.7872456
## 150 0.8757881 0.8603663 0.7633198
## 200 0.8772089 0.8469780 0.7512438
## 250 0.8788544 0.8546154 0.7453189
## 50 0.8775910 0.8488645 0.7393035
## 100 0.8755816 0.8393590 0.7661692
## 150 0.8720395 0.8508242 0.7572139
## 200 0.8694513 0.8469963 0.7572139
## 250 0.8673287 0.8450549 0.7573044
## 50 0.8778399 0.8565751 0.7363636
## 100 0.8744992 0.8508242 0.7571687
## 150 0.8754400 0.8450916 0.7453641
## 200 0.8759045 0.8545971 0.7483039
## 250 0.8748757 0.8527106 0.7602895
## 50 0.8826453 0.8508791 0.7662144
## 100 0.8794444 0.8431136 0.7692899
## 150 0.8767193 0.8412088 0.7662596
## 200 0.8758361 0.8316850 0.7573044
## 250 0.8744058 0.8355311 0.7602442
## 50 0.8898352 0.8546337 0.7933062
## 100 0.8858517 0.8565568 0.7752148
## 150 0.8823154 0.8546520 0.7662596
## 200 0.8799322 0.8527473 0.7602442
## 250 0.8767385 0.8489194 0.7572592
## 50 0.8856128 0.8584432 0.7573496
## 100 0.8814342 0.8680586 0.7663501
## 150 0.8754125 0.8489744 0.7633198
## 200 0.8731248 0.8546703 0.7573044
## 250 0.8726722 0.8527656 0.7573496
## 50 0.8805634 0.8392674 0.7483492
## 100 0.8732768 0.8393223 0.7423790
## 150 0.8713660 0.8393590 0.7484396
## 200 0.8707312 0.8317033 0.7424695
## 250 0.8712700 0.8317399 0.7513795
## 50 0.8830609 0.8565751 0.7512890
## 100 0.8803246 0.8527839 0.7333786
## 150 0.8769465 0.8604029 0.7363184
## 200 0.8766005 0.8489377 0.7363184
## 250 0.8746223 0.8508425 0.7422886
## 50 0.8910715 0.8527289 0.7511081
## 100 0.8823130 0.8546520 0.7573044
## 150 0.8797159 0.8546520 0.7422433
## 200 0.8773686 0.8546520 0.7483039
## 250 0.8779464 0.8489377 0.7632745
## 50 0.8837347 0.8489011 0.7662596
## 100 0.8775324 0.8393223 0.7273632
## 150 0.8674268 0.8373993 0.7392583
## 200 0.8654540 0.8412088 0.7245138
## 250 0.8637086 0.8316667 0.7393939
## 50 0.8823099 0.8432234 0.7632293
## 100 0.8741839 0.8546520 0.7663501
## 150 0.8724687 0.8527473 0.7692899
## 200 0.8682513 0.8469780 0.7573044
## 250 0.8678432 0.8336264 0.7603347
## 50 0.8770281 0.8622894 0.7572139
## 100 0.8705788 0.8450916 0.7451832
## 150 0.8693476 0.8508425 0.7482135
## 200 0.8660866 0.8642125 0.7423338
## 250 0.8656446 0.8527473 0.7662596
## 50 0.8763387 0.8565751 0.7542289
## 100 0.8699971 0.8470330 0.7483492
## 150 0.8734221 0.8508608 0.7483944
## 200 0.8704181 0.8450733 0.7573496
## 250 0.8684326 0.8412637 0.7574401
## 50 0.8856893 0.8604212 0.7514247
## 100 0.8768567 0.8527106 0.7453641
## 150 0.8714320 0.8450916 0.7334238
## 200 0.8673499 0.8374359 0.7424695
## 250 0.8659422 0.8298168 0.7453641
## 50 0.8814734 0.8431868 0.7541836
## 100 0.8788953 0.8507875 0.7452736
## 150 0.8732835 0.8374359 0.7632293
## 200 0.8707823 0.8393040 0.7572592
## 250 0.8693743 0.8374359 0.7423338
## 50 0.8801832 0.8489011 0.7752148
## 100 0.8738248 0.8565568 0.7423338
## 150 0.8693021 0.8412637 0.7393487
## 200 0.8660255 0.8355495 0.7422886
## 250 0.8662843 0.8431685 0.7513342
## 50 0.8795277 0.8470147 0.7632745
## 100 0.8746376 0.8508242 0.7541836
## 150 0.8728819 0.8565568 0.7752601
## 200 0.8669284 0.8431868 0.7633198
## 250 0.8665962 0.8412637 0.7632745
## 50 0.8868094 0.8489560 0.7602895
## 100 0.8787609 0.8584799 0.7452736
## 150 0.8764190 0.8565751 0.7632745
## 200 0.8728396 0.8508425 0.7512890
## 250 0.8712629 0.8451099 0.7543193
## 50 0.8872138 0.8584799 0.7573496
## 100 0.8813231 0.8489377 0.7572139
## 150 0.8755630 0.8450733 0.7483039
## 200 0.8724714 0.8393407 0.7423338
## 250 0.8714968 0.8432051 0.7423338
## 50 0.9023953 0.8757326 0.7573044
## 100 0.9013012 0.8737912 0.7601990
## 150 0.8999264 0.8757326 0.7691995
## 200 0.8982393 0.8738462 0.7781999
## 250 0.8947661 0.8661355 0.7662144
## 50 0.8998515 0.8757143 0.7661692
## 100 0.9018768 0.8795055 0.7662596
## 150 0.8972140 0.8737729 0.7632293
## 200 0.8954167 0.8623626 0.7572592
## 250 0.8921756 0.8642857 0.7692447
## 50 0.8994746 0.8680769 0.7573948
## 100 0.8987952 0.8699817 0.7573496
## 150 0.8978289 0.8623077 0.7603799
## 200 0.8969905 0.8738095 0.7573948
## 250 0.8969031 0.8699634 0.7542289
## 50 0.9008577 0.8661538 0.7753053
## 100 0.9000727 0.8623260 0.7752601
## 150 0.9007949 0.8680586 0.7662596
## 200 0.8991649 0.8661538 0.7722750
## 250 0.8977139 0.8680220 0.7782451
## 50 0.8995552 0.8661722 0.7602442
## 100 0.8924869 0.8584982 0.7512438
## 150 0.8976685 0.8565751 0.7512890
## 200 0.8973497 0.8680769 0.7481682
## 250 0.8970828 0.8623443 0.7572139
## 50 0.9001187 0.8795421 0.7573948
## 100 0.8978692 0.8700000 0.7812302
## 150 0.8958599 0.8623077 0.7722750
## 200 0.8964532 0.8565568 0.7693351
## 250 0.8953355 0.8603846 0.7631841
## 50 0.8969620 0.8642491 0.7512890
## 100 0.8990333 0.8623626 0.7602442
## 150 0.8982202 0.8661355 0.7513342
## 200 0.8992106 0.8623260 0.7752601
## 250 0.8975997 0.8623260 0.7602442
## 50 0.9042504 0.8776190 0.7721845
## 100 0.9011750 0.8661172 0.7632745
## 150 0.8993297 0.8660806 0.7631841
## 200 0.8981778 0.8622711 0.7512890
## 250 0.8990055 0.8681136 0.7662144
## 50 0.9013858 0.8718864 0.7483944
## 100 0.9010850 0.8623443 0.7603347
## 150 0.8989393 0.8700000 0.7753053
## 200 0.8984954 0.8642491 0.7692899
## 250 0.8992685 0.8585165 0.7663048
## 50 0.9026289 0.8699451 0.7722298
## 100 0.9002446 0.8546703 0.7872908
## 150 0.8994478 0.8642308 0.7752601
## 200 0.8980593 0.8604029 0.7752601
## 250 0.8990062 0.8565751 0.7781999
## 50 0.8830761 0.8431319 0.7723202
## 100 0.8774863 0.8508242 0.7603347
## 150 0.8643435 0.8374542 0.7514247
## 200 0.8652907 0.8431502 0.7633198
## 250 0.8647798 0.8431868 0.7483944
## 50 0.8873341 0.8508425 0.7721393
## 100 0.8681256 0.8451282 0.7542289
## 150 0.8723909 0.8412821 0.7573496
## 200 0.8687951 0.8393407 0.7632745
## 250 0.8649716 0.8355495 0.7573496
## 50 0.8762144 0.8584982 0.7693351
## 100 0.8730275 0.8316850 0.7602895
## 150 0.8709469 0.8374908 0.7483039
## 200 0.8739240 0.8316667 0.7632745
## 250 0.8701430 0.8393223 0.7542741
## 50 0.8811414 0.8527839 0.7602895
## 100 0.8696061 0.8279487 0.7543193
## 150 0.8675381 0.8317399 0.7573044
## 200 0.8657860 0.8278571 0.7662596
## 250 0.8627527 0.8259890 0.7632745
## 50 0.8778467 0.8604396 0.7273180
## 100 0.8710884 0.8489560 0.7543193
## 150 0.8668517 0.8393407 0.7393487
## 200 0.8649527 0.8412271 0.7303483
## 250 0.8638634 0.8335714 0.7274084
## 50 0.8833469 0.8566117 0.7573044
## 100 0.8780638 0.8470513 0.7512890
## 150 0.8715818 0.8336264 0.7542289
## 200 0.8688697 0.8317399 0.7393487
## 250 0.8709504 0.8374908 0.7513342
## 50 0.8837397 0.8565934 0.7751696
## 100 0.8764221 0.8527289 0.7752148
## 150 0.8728472 0.8432234 0.7692447
## 200 0.8699599 0.8450916 0.7602895
## 250 0.8643902 0.8374725 0.7483492
## 50 0.8868684 0.8584615 0.7541384
## 100 0.8755817 0.8565385 0.7363184
## 150 0.8691344 0.8527289 0.7363184
## 200 0.8678363 0.8489011 0.7184532
## 250 0.8646672 0.8470147 0.7244686
## 50 0.8884235 0.8527839 0.7631841
## 100 0.8780903 0.8451099 0.7632293
## 150 0.8740795 0.8565751 0.7512890
## 200 0.8690316 0.8489194 0.7393487
## 250 0.8679314 0.8508059 0.7483492
## 50 0.8848396 0.8622894 0.7512438
## 100 0.8769898 0.8489011 0.7421981
## 150 0.8744224 0.8546337 0.7453189
## 200 0.8695859 0.8469963 0.7423338
## 250 0.8669502 0.8450733 0.7423338
## 50 0.8877291 0.8623443 0.7663048
## 100 0.8793624 0.8585348 0.7693804
## 150 0.8798835 0.8622894 0.7513342
## 200 0.8771280 0.8469963 0.7513795
## 250 0.8792331 0.8489377 0.7603347
## 50 0.8894063 0.8642125 0.7631389
## 100 0.8818900 0.8546703 0.7603347
## 150 0.8779751 0.8527473 0.7721845
## 200 0.8742383 0.8412454 0.7633650
## 250 0.8721525 0.8374176 0.7542289
## 50 0.8814247 0.8546154 0.7692899
## 100 0.8732408 0.8431868 0.7692447
## 150 0.8676809 0.8508242 0.7393939
## 200 0.8681098 0.8470147 0.7423790
## 250 0.8688881 0.8508242 0.7542741
## 50 0.8830222 0.8622527 0.7602442
## 100 0.8805080 0.8641941 0.7782451
## 150 0.8755291 0.8565201 0.7573948
## 200 0.8755339 0.8545788 0.7513795
## 250 0.8750112 0.8527656 0.7483944
## 50 0.8822973 0.8508425 0.7602895
## 100 0.8773947 0.8565751 0.7662144
## 150 0.8735756 0.8489011 0.7572592
## 200 0.8735622 0.8412454 0.7632745
## 250 0.8723373 0.8412637 0.7663048
## 50 0.8887761 0.8489744 0.7662144
## 100 0.8790819 0.8393590 0.7601538
## 150 0.8778854 0.8374542 0.7572592
## 200 0.8753661 0.8450916 0.7632293
## 250 0.8741763 0.8470147 0.7662144
## 50 0.8823064 0.8527656 0.7542289
## 100 0.8783611 0.8546520 0.7691542
## 150 0.8757730 0.8507875 0.7512438
## 200 0.8745122 0.8489194 0.7512890
## 250 0.8730258 0.8508242 0.7512890
## 50 0.8789902 0.8450549 0.7632293
## 100 0.8739803 0.8527289 0.7543193
## 150 0.8737517 0.8565018 0.7513342
## 200 0.8703914 0.8507875 0.7542741
## 250 0.8698497 0.8488828 0.7513342
## 50 0.8876901 0.8470330 0.7542289
## 100 0.8806994 0.8470147 0.7602442
## 150 0.8753201 0.8469780 0.7483039
## 200 0.8732224 0.8393407 0.7512438
## 250 0.8729004 0.8469963 0.7512890
## 50 0.8836256 0.8507875 0.7602442
## 100 0.8786395 0.8507692 0.7512890
## 150 0.8770907 0.8450733 0.7632293
## 200 0.8758617 0.8450549 0.7512438
## 250 0.8741077 0.8450733 0.7542289
## 50 0.8834266 0.8584982 0.7573044
## 100 0.8741377 0.8374359 0.7602442
## 150 0.8701130 0.8489011 0.7513342
## 200 0.8720321 0.8489194 0.7602895
## 250 0.8709490 0.8412821 0.7483492
## 50 0.8780693 0.8527473 0.7393487
## 100 0.8741685 0.8470330 0.7454093
## 150 0.8715532 0.8470513 0.7483944
## 200 0.8686566 0.8412454 0.7334690
## 250 0.8655891 0.8431685 0.7453641
## 50 0.8890524 0.8565568 0.7663048
## 100 0.8776407 0.8546886 0.7513342
## 150 0.8752446 0.8393407 0.7633650
## 200 0.8709428 0.8412454 0.7513795
## 250 0.8666706 0.8374359 0.7573496
## 50 0.8774660 0.8450916 0.7632745
## 100 0.8715580 0.8470147 0.7513342
## 150 0.8674038 0.8317033 0.7364089
## 200 0.8672951 0.8431685 0.7364541
## 250 0.8639214 0.8393773 0.7423790
## 50 0.8824590 0.8508059 0.7573496
## 100 0.8736536 0.8508425 0.7513795
## 150 0.8709980 0.8508425 0.7424242
## 200 0.8693420 0.8451099 0.7454545
## 250 0.8672711 0.8451099 0.7364541
## 50 0.8794360 0.8527473 0.7573948
## 100 0.8737919 0.8412454 0.7453641
## 150 0.8676873 0.8431685 0.7513795
## 200 0.8659551 0.8355128 0.7543645
## 250 0.8631998 0.8354945 0.7423338
## 50 0.8811973 0.8508425 0.7393487
## 100 0.8760425 0.8469780 0.7453189
## 150 0.8705412 0.8297802 0.7453189
## 200 0.8672588 0.8317033 0.7423338
## 250 0.8632150 0.8374359 0.7363184
## 50 0.8803065 0.8488828 0.7512438
## 100 0.8742372 0.8508059 0.7542741
## 150 0.8705745 0.8470330 0.7632745
## 200 0.8674182 0.8451282 0.7602442
## 250 0.8657279 0.8470147 0.7573044
## 50 0.8831349 0.8604396 0.7481682
## 100 0.8744576 0.8489377 0.7453189
## 150 0.8702501 0.8393407 0.7452736
## 200 0.8683797 0.8431868 0.7333333
## 250 0.8670860 0.8431685 0.7393487
## 50 0.8865179 0.8508242 0.7752601
## 100 0.8809430 0.8488828 0.7632293
## 150 0.8745882 0.8450733 0.7543193
## 200 0.8712932 0.8412821 0.7543193
## 250 0.8678527 0.8431868 0.7453641
## 50 0.8994716 0.8852747 0.7453189
## 100 0.8985598 0.8738095 0.7604251
## 150 0.8999706 0.8642125 0.7723654
## 200 0.8979904 0.8718498 0.7692899
## 250 0.8958586 0.8584615 0.7632745
## 50 0.8989618 0.8719048 0.7602442
## 100 0.8986877 0.8661722 0.7572139
## 150 0.8984534 0.8661538 0.7542289
## 200 0.8948263 0.8603846 0.7542741
## 250 0.8927468 0.8565568 0.7631389
## 50 0.8922353 0.8585348 0.7542289
## 100 0.8920457 0.8604029 0.7362732
## 150 0.8929866 0.8566484 0.7661692
## 200 0.8940779 0.8604579 0.7722298
## 250 0.8931889 0.8680586 0.7722298
## 50 0.9005604 0.8699817 0.7571687
## 100 0.9004037 0.8738095 0.7572139
## 150 0.9007259 0.8757326 0.7512438
## 200 0.9001792 0.8718864 0.7601538
## 250 0.8994765 0.8699817 0.7572139
## 50 0.8979247 0.8642308 0.7753957
## 100 0.8950762 0.8661722 0.7663048
## 150 0.8936211 0.8604029 0.7692899
## 200 0.8919832 0.8604212 0.7722298
## 250 0.8918582 0.8584982 0.7631389
## 50 0.8999869 0.8623626 0.7692447
## 100 0.8985092 0.8662088 0.7632293
## 150 0.8964042 0.8642857 0.7692899
## 200 0.8977175 0.8642857 0.7752601
## 250 0.8936792 0.8642491 0.7543645
## 50 0.8899471 0.8661905 0.7213930
## 100 0.8940986 0.8661355 0.7602895
## 150 0.8929632 0.8719048 0.7573044
## 200 0.8933486 0.8776374 0.7483492
## 250 0.8941339 0.8719048 0.7573496
## 50 0.8971047 0.8566300 0.7691995
## 100 0.8940891 0.8528205 0.7632293
## 150 0.8967497 0.8508242 0.7751696
## 200 0.8951255 0.8546520 0.7662596
## 250 0.8934843 0.8507692 0.7721845
## 50 0.8961759 0.8623443 0.7573496
## 100 0.8968755 0.8585165 0.7453641
## 150 0.8967732 0.8547253 0.7483039
## 200 0.8954391 0.8565934 0.7632293
## 250 0.8961738 0.8546886 0.7572592
## 50 0.9004084 0.8700366 0.7842153
## 100 0.8976826 0.8604212 0.7901854
## 150 0.9005504 0.8565934 0.7932157
## 200 0.8982233 0.8603846 0.7872456
## 250 0.8982713 0.8565385 0.7872456
## 50 0.8797413 0.8375092 0.7542741
## 100 0.8695472 0.8469780 0.7633198
## 150 0.8734476 0.8393773 0.7542289
## 200 0.8699895 0.8336630 0.7543193
## 250 0.8683112 0.8393773 0.7633198
## 50 0.8814818 0.8508791 0.7543645
## 100 0.8755894 0.8508791 0.7603347
## 150 0.8715152 0.8374359 0.7483039
## 200 0.8693438 0.8374176 0.7484396
## 250 0.8691852 0.8335897 0.7514247
## 50 0.8873517 0.8565934 0.7752601
## 100 0.8783835 0.8603114 0.7602895
## 150 0.8757367 0.8507875 0.7543193
## 200 0.8725937 0.8526923 0.7453641
## 250 0.8695096 0.8431319 0.7393939
## 50 0.8704465 0.8374725 0.7331524
## 100 0.8678792 0.8374725 0.7452736
## 150 0.8650683 0.8450916 0.7393035
## 200 0.8645129 0.8374725 0.7304387
## 250 0.8630957 0.8393773 0.7304387
## 50 0.8850824 0.8546337 0.7632745
## 100 0.8777037 0.8565385 0.7482135
## 150 0.8735050 0.8527656 0.7542289
## 200 0.8695435 0.8469780 0.7542289
## 250 0.8681773 0.8489011 0.7542289
## 50 0.8796711 0.8604212 0.7513795
## 100 0.8749532 0.8527656 0.7722750
## 150 0.8701555 0.8508425 0.7513342
## 200 0.8665724 0.8432051 0.7603347
## 250 0.8659603 0.8546703 0.7573496
## 50 0.8833842 0.8565385 0.7512890
## 100 0.8754564 0.8450733 0.7453189
## 150 0.8727099 0.8470513 0.7422886
## 200 0.8693073 0.8489194 0.7482135
## 250 0.8673301 0.8450916 0.7363184
## 50 0.8818488 0.8412637 0.7602442
## 100 0.8765453 0.8412454 0.7662596
## 150 0.8727144 0.8355861 0.7632293
## 200 0.8687754 0.8355678 0.7572592
## 250 0.8663143 0.8317033 0.7542289
## 50 0.8854178 0.8508791 0.7571687
## 100 0.8785232 0.8566117 0.7512438
## 150 0.8728445 0.8489377 0.7542741
## 200 0.8708430 0.8451282 0.7542741
## 250 0.8672598 0.8374542 0.7453189
## 50 0.8865798 0.8527289 0.7722298
## 100 0.8781106 0.8546520 0.7722298
## 150 0.8726990 0.8489011 0.7632745
## 200 0.8715535 0.8489011 0.7543193
## 250 0.8688860 0.8489011 0.7483492
## 50 0.8770681 0.8471062 0.7603799
## 100 0.8753030 0.8584799 0.7544098
## 150 0.8682391 0.8547253 0.7394392
## 200 0.8637657 0.8412821 0.7424695
## 250 0.8603620 0.8469963 0.7215287
## 50 0.8710405 0.8412637 0.7483944
## 100 0.8672137 0.8317399 0.7424242
## 150 0.8702027 0.8336630 0.7663501
## 200 0.8696113 0.8393590 0.7573948
## 250 0.8680919 0.8317033 0.7484396
## 50 0.8695246 0.8450733 0.7364089
## 100 0.8640185 0.8469963 0.7273632
## 150 0.8679572 0.8508608 0.7274989
## 200 0.8642530 0.8508059 0.7244686
## 250 0.8655142 0.8393590 0.7364089
## 50 0.8751667 0.8394505 0.7601990
## 100 0.8713914 0.8490110 0.7424242
## 150 0.8691415 0.8374725 0.7393939
## 200 0.8673165 0.8470147 0.7273632
## 250 0.8686905 0.8431502 0.7244233
## 50 0.8738782 0.8451282 0.7571687
## 100 0.8701387 0.8451099 0.7603347
## 150 0.8712606 0.8527656 0.7572592
## 200 0.8680673 0.8412821 0.7453189
## 250 0.8669093 0.8432418 0.7483039
## 50 0.8770996 0.8393590 0.7573948
## 100 0.8724528 0.8489560 0.7393487
## 150 0.8719448 0.8450916 0.7453641
## 200 0.8686164 0.8508425 0.7364089
## 250 0.8705508 0.8432051 0.7483492
## 50 0.8794449 0.8623260 0.7483039
## 100 0.8728118 0.8623260 0.7482587
## 150 0.8699129 0.8508242 0.7512890
## 200 0.8702331 0.8546520 0.7513342
## 250 0.8689824 0.8642125 0.7542741
## 50 0.8770685 0.8527839 0.7632745
## 100 0.8716272 0.8432051 0.7632745
## 150 0.8690363 0.8489377 0.7483039
## 200 0.8692877 0.8412821 0.7393487
## 250 0.8683396 0.8374725 0.7513342
## 50 0.8802888 0.8432234 0.7601085
## 100 0.8750908 0.8432418 0.7541836
## 150 0.8710799 0.8432234 0.7452736
## 200 0.8694411 0.8432418 0.7423338
## 250 0.8683844 0.8374908 0.7453189
## 50 0.8804912 0.8393590 0.7661692
## 100 0.8737376 0.8470147 0.7482587
## 150 0.8692439 0.8432234 0.7542289
## 200 0.8702088 0.8432051 0.7602442
## 250 0.8698425 0.8432051 0.7572592
## 50 0.8775321 0.8451099 0.7482587
## 100 0.8731859 0.8430952 0.7543193
## 150 0.8676031 0.8355128 0.7513342
## 200 0.8588527 0.8296703 0.7483039
## 250 0.8612000 0.8354579 0.7364089
## 50 0.8737839 0.8431685 0.7424695
## 100 0.8645341 0.8451282 0.7304839
## 150 0.8648606 0.8412454 0.7394392
## 200 0.8614928 0.8393407 0.7364541
## 250 0.8604951 0.8374542 0.7214835
## 50 0.8768091 0.8412637 0.7572592
## 100 0.8664046 0.8354945 0.7453641
## 150 0.8646503 0.8336081 0.7512890
## 200 0.8644002 0.8202198 0.7513342
## 250 0.8644406 0.8259524 0.7454093
## 50 0.8877706 0.8680769 0.7663953
## 100 0.8760375 0.8565201 0.7573496
## 150 0.8697155 0.8526923 0.7513795
## 200 0.8680377 0.8450549 0.7574401
## 250 0.8661135 0.8489194 0.7484396
## 50 0.8773289 0.8374176 0.7512890
## 100 0.8754226 0.8373993 0.7692899
## 150 0.8705515 0.8335897 0.7603799
## 200 0.8688579 0.8393590 0.7543645
## 250 0.8646681 0.8317216 0.7364541
## 50 0.8737916 0.8412271 0.7632293
## 100 0.8678924 0.8412271 0.7602442
## 150 0.8664223 0.8336081 0.7482587
## 200 0.8625669 0.8297985 0.7512890
## 250 0.8629390 0.8355128 0.7422886
## 50 0.8782811 0.8546703 0.7573044
## 100 0.8727999 0.8469963 0.7573948
## 150 0.8680397 0.8393956 0.7454093
## 200 0.8650327 0.8393773 0.7513795
## 250 0.8620202 0.8413187 0.7423790
## 50 0.8823816 0.8489194 0.7572592
## 100 0.8763608 0.8451099 0.7513342
## 150 0.8705871 0.8393590 0.7453641
## 200 0.8691218 0.8413004 0.7423790
## 250 0.8667398 0.8413004 0.7423790
## 50 0.8764932 0.8546337 0.7571687
## 100 0.8700723 0.8412637 0.7662144
## 150 0.8680677 0.8508425 0.7662144
## 200 0.8644899 0.8546703 0.7602895
## 250 0.8628553 0.8508608 0.7632745
## 50 0.8817830 0.8450916 0.7661692
## 100 0.8721521 0.8450916 0.7511986
## 150 0.8669104 0.8412637 0.7543193
## 200 0.8651709 0.8450733 0.7452736
## 250 0.8625632 0.8431502 0.7482587
## 50 0.8952037 0.8719048 0.7601990
## 100 0.8932406 0.8661538 0.7572592
## 150 0.8955434 0.8661722 0.7662144
## 200 0.8931886 0.8547253 0.7573044
## 250 0.8914122 0.8471062 0.7633198
## 50 0.8961244 0.8719048 0.7572139
## 100 0.8976430 0.8604212 0.7662596
## 150 0.8964706 0.8585165 0.7781547
## 200 0.8950333 0.8680586 0.7691995
## 250 0.8947592 0.8547070 0.7663048
## 50 0.8972721 0.8757326 0.7422433
## 100 0.8962095 0.8699817 0.7452736
## 150 0.8973086 0.8603663 0.7632293
## 200 0.8932872 0.8642125 0.7662596
## 250 0.8915983 0.8546337 0.7631841
## 50 0.8998791 0.8680769 0.7483492
## 100 0.8993949 0.8489927 0.7781999
## 150 0.9006832 0.8547070 0.7933062
## 200 0.8997544 0.8566117 0.7842605
## 250 0.8958447 0.8623260 0.7752148
## 50 0.8928804 0.8565751 0.7544098
## 100 0.8947232 0.8699634 0.7603799
## 150 0.8944689 0.8623260 0.7544098
## 200 0.8949724 0.8623260 0.7572139
## 250 0.8920646 0.8642491 0.7573044
## 50 0.8998714 0.8700183 0.7691995
## 100 0.9010975 0.8700000 0.7692447
## 150 0.8977229 0.8623260 0.7662596
## 200 0.8996038 0.8623077 0.7602442
## 250 0.8940410 0.8508242 0.7661692
## 50 0.8943206 0.8546886 0.7332881
## 100 0.8958092 0.8546886 0.7513342
## 150 0.8928899 0.8508608 0.7453641
## 200 0.8923191 0.8527839 0.7603347
## 250 0.8927129 0.8470330 0.7692447
## 50 0.8991240 0.8585165 0.7661692
## 100 0.8978409 0.8470696 0.7661239
## 150 0.8977739 0.8642491 0.7751696
## 200 0.8986578 0.8584799 0.7661692
## 250 0.8966045 0.8642125 0.7781999
## 50 0.8891436 0.8680220 0.7362732
## 100 0.8916415 0.8565568 0.7661692
## 150 0.8966919 0.8603663 0.7572592
## 200 0.8969733 0.8623077 0.7661692
## 250 0.8954180 0.8604029 0.7661692
## 50 0.8967958 0.8623626 0.7812302
## 100 0.8961671 0.8604212 0.7751696
## 150 0.8951324 0.8585165 0.7662596
## 200 0.8935044 0.8604212 0.7632745
## 250 0.8939127 0.8623260 0.7752601
## 50 0.8782792 0.8565385 0.7482135
## 100 0.8669508 0.8489560 0.7303483
## 150 0.8623355 0.8432418 0.7274536
## 200 0.8601272 0.8527473 0.7154681
## 250 0.8628910 0.8412637 0.7213930
## 50 0.8764102 0.8451648 0.7511533
## 100 0.8702740 0.8508974 0.7273632
## 150 0.8609770 0.8374908 0.7363184
## 200 0.8642983 0.8355311 0.7393939
## 250 0.8606192 0.8279121 0.7274989
## 50 0.8785666 0.8336447 0.7512890
## 100 0.8711072 0.8394139 0.7663048
## 150 0.8652961 0.8355311 0.7423338
## 200 0.8644856 0.8336081 0.7512438
## 250 0.8630115 0.8393407 0.7393939
## 50 0.8849852 0.8432601 0.7663048
## 100 0.8725629 0.8394322 0.7482135
## 150 0.8682983 0.8355861 0.7452736
## 200 0.8625135 0.8336813 0.7422886
## 250 0.8603550 0.8317216 0.7393035
## 50 0.8798803 0.8451282 0.7451832
## 100 0.8721061 0.8489377 0.7572592
## 150 0.8675435 0.8393773 0.7572592
## 200 0.8645163 0.8469963 0.7482587
## 250 0.8630265 0.8431685 0.7512890
## 50 0.8759145 0.8565934 0.7393939
## 100 0.8703855 0.8374725 0.7393487
## 150 0.8655337 0.8335897 0.7363636
## 200 0.8616231 0.8259524 0.7303935
## 250 0.8607297 0.8412454 0.7333786
## 50 0.8804850 0.8469963 0.7452736
## 100 0.8713843 0.8469963 0.7483492
## 150 0.8655535 0.8241026 0.7423790
## 200 0.8640429 0.8374725 0.7453641
## 250 0.8623116 0.8412821 0.7453641
## 50 0.8843128 0.8547070 0.7722298
## 100 0.8751368 0.8450733 0.7542741
## 150 0.8726502 0.8469780 0.7512438
## 200 0.8718774 0.8470147 0.7452736
## 250 0.8713912 0.8527473 0.7393035
## 50 0.8773917 0.8451099 0.7573044
## 100 0.8694966 0.8451099 0.7542741
## 150 0.8660852 0.8489011 0.7423338
## 200 0.8637313 0.8412821 0.7483039
## 250 0.8623694 0.8432051 0.7453189
## 50 0.8854024 0.8546703 0.7752601
## 100 0.8819418 0.8528022 0.7722298
## 150 0.8760252 0.8393773 0.7752601
## 200 0.8734408 0.8450916 0.7692899
## 250 0.8703675 0.8489194 0.7692899
##
## Tuning parameter 'gamma' was held constant at a value of 0
##
## Tuning parameter 'min_child_weight' was held constant at a value of 1
## ROC was used to select the optimal model using the largest value.
## The final values used for the model were nrounds = 100, max_depth = 3,
## eta = 0.3, gamma = 0, subsample = 0.625, colsample_bytree =
## 0.8, rate_drop = 0.5, skip_drop = 0.05 and min_child_weight = 1.
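The selected combination can also be read directly off the fitted object instead of the long printed grid; a quick check, assuming model_xgbDART is the train object whose output is shown above:
# Best hyperparameter combination chosen by caret (largest ROC)
model_xgbDART$bestTune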
set.seed(100)
# Train the model using SVM with a radial basis kernel
model_svmRadial = train(Purchase ~ ., data=trainData, method='svmRadial', tuneLength=15, trControl = fitControl)
model_svmRadial
## Support Vector Machines with Radial Basis Function Kernel
##
## 857 samples
## 18 predictor
## 2 classes: 'CH', 'MM'
##
## No pre-processing
## Resampling: Cross-Validated (5 fold)
## Summary of sample sizes: 685, 686, 685, 686, 686
## Resampling results across tuning parameters:
##
## C ROC Sens Spec
## 0.25 0.8968213 0.8795055 0.7274084
## 0.50 0.8980530 0.8776007 0.7214835
## 1.00 0.8977832 0.8776190 0.7334238
## 2.00 0.8934719 0.8718681 0.7303483
## 4.00 0.8915500 0.8794689 0.7154229
## 8.00 0.8868855 0.8890293 0.6825418
## 16.00 0.8823947 0.8870696 0.6854817
## 32.00 0.8767745 0.8889744 0.6583899
## 64.00 0.8600145 0.8889744 0.6524197
## 128.00 0.8486717 0.8813370 0.6494346
## 256.00 0.8413847 0.8832784 0.6284487
## 512.00 0.8313846 0.8871062 0.6196744
## 1024.00 0.8198163 0.8909524 0.6136137
## 2048.00 0.8143498 0.8986081 0.5598372
## 4096.00 0.8113379 0.9024725 0.5388964
##
## Tuning parameter 'sigma' was held constant at a value of 0.06525857
## ROC was used to select the optimal model using the largest value.
## The final values used for the model were sigma = 0.06525857 and C = 0.5.
# Compare model performances using resamples()
models_compare <- resamples(list(ADABOOST=model_adaboost, RF=model_rf, XGBDART=model_xgbDART, MARS=model_mars3, SVM=model_svmRadial))
# Summary of the models' performances
summary(models_compare)
##
## Call:
## summary.resamples(object = models_compare)
##
## Models: ADABOOST, RF, XGBDART, MARS, SVM
## Number of resamples: 5
##
## ROC
## Min. 1st Qu. Median Mean 3rd Qu. Max. NA's
## ADABOOST 0.8525253 0.8772245 0.8828932 0.8783495 0.8852878 0.8938166 0
## RF 0.8691198 0.8697618 0.8932262 0.8871323 0.8997868 0.9037669 0
## XGBDART 0.8878788 0.9026263 0.9111585 0.9105354 0.9199716 0.9310419 0
## MARS 0.8808081 0.8943743 0.9044776 0.9034469 0.9146411 0.9229334 0
## SVM 0.8712843 0.8823192 0.9022033 0.8980530 0.9166188 0.9178394 0
##
## Sens
## Min. 1st Qu. Median Mean 3rd Qu. Max. NA's
## ADABOOST 0.7714286 0.8076923 0.8380952 0.8298535 0.8653846 0.8666667 0
## RF 0.8076923 0.8380952 0.8666667 0.8565751 0.8761905 0.8942308 0
## XGBDART 0.8190476 0.8461538 0.8761905 0.8680952 0.8952381 0.9038462 0
## MARS 0.8365385 0.8476190 0.8952381 0.8776007 0.9038462 0.9047619 0
## SVM 0.8173077 0.8666667 0.8857143 0.8776007 0.8952381 0.9230769 0
##
## Spec
## Min. 1st Qu. Median Mean 3rd Qu. Max. NA's
## ADABOOST 0.6969697 0.7014925 0.7462687 0.7543193 0.8059701 0.8208955 0
## RF 0.6666667 0.6716418 0.7462687 0.7333333 0.7761194 0.8059701 0
## XGBDART 0.7575758 0.7910448 0.8059701 0.7962913 0.8059701 0.8208955 0
## MARS 0.7272727 0.7761194 0.7761194 0.7753053 0.7910448 0.8059701 0
## SVM 0.6716418 0.6969697 0.7164179 0.7214835 0.7313433 0.7910448 0
Let's plot the resamples output to compare the models.
# Draw box plots to compare models
scales <- list(x=list(relation="free"), y=list(relation="free"))
bwplot(models_compare, scales=scales)
The xgbDART model appears to be the best performing model overall because of its high ROC. But if you need a model that predicts the positives better, you might want to consider MARS, given its high sensitivity.
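If you prefer to pick the winner programmatically rather than reading it off the summary table, the mean resampled ROC can be extracted from the summary object; a minimal sketch using the models_compare object created above:
# Mean resampled ROC for each model
mean_roc <- summary(models_compare)$statistics$ROC[, "Mean"]
# Name of the model with the largest mean ROC
names(which.max(mean_roc))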
So far we have predictions from multiple individual models. To get them, we had to run the train() function once for each model, store the fitted models and pass them to resamples(). But is there a way to run all of these algorithms in one go?
The caretEnsemble package lets you do just that.
All you have to do is put the names of all the algorithms you want to run in a vector and pass it to caretEnsemble::caretList() instead of caret::train().
library(caretEnsemble)
# Stacking Algorithms - Run multiple algos in one call.
trainControl <- trainControl(method="repeatedcv",
number=10,
repeats=3,
savePredictions=TRUE,
classProbs=TRUE)
algorithmList <- c('rf', 'adaboost', 'earth', 'xgbDART', 'svmRadial')
set.seed(100)
models <- caretList(Purchase ~ ., data=trainData, trControl=trainControl, methodList=algorithmList)
results <- resamples(models)
summary(results)
##
## Call:
## summary.resamples(object = results)
##
## Models: rf, adaboost, earth, xgbDART, svmRadial
## Number of resamples: 30
##
## Accuracy
## Min. 1st Qu. Median Mean 3rd Qu. Max. NA's
## rf 0.7011494 0.7813611 0.8245554 0.8148761 0.8488372 0.8823529 0
## adaboost 0.7126437 0.7764706 0.8117647 0.8079169 0.8367305 0.9058824 0
## earth 0.7529412 0.8117647 0.8304378 0.8311071 0.8501403 0.9069767 0
## xgbDART 0.7790698 0.8235294 0.8372093 0.8447774 0.8720930 0.9302326 0
## svmRadial 0.7647059 0.7930233 0.8245554 0.8261035 0.8567613 0.8953488 0
##
## Kappa
## Min. 1st Qu. Median Mean 3rd Qu. Max. NA's
## rf 0.3656758 0.5412023 0.6239376 0.6106210 0.6816051 0.7523310 0
## adaboost 0.4059000 0.5332939 0.6054203 0.5973623 0.6539334 0.7973778 0
## earth 0.4883921 0.5879658 0.6400500 0.6410969 0.6839406 0.8073908 0
## xgbDART 0.5437483 0.6133688 0.6583877 0.6715447 0.7278216 0.8555431 0
## svmRadial 0.4959444 0.5598055 0.6190147 0.6280087 0.6897242 0.7774583 0
Plot the resamples output to compare the models.
# Box plots to compare models
scales <- list(x=list(relation="free"), y=list(relation="free"))
bwplot(results, scales=scales)
The predictions of these individual models can also be combined into a single, stacked prediction using caretStack(). You just need to make sure you don't use the same trainControl you used to build the base models.
# Create the trainControl
set.seed(101)
stackControl <- trainControl(method="repeatedcv",
number=10,
repeats=3,
savePredictions=TRUE,
classProbs=TRUE)
# Ensemble the predictions of `models` to form a new combined prediction based on glm
stack.glm <- caretStack(models, method="glm", metric="Accuracy", trControl=stackControl)
print(stack.glm)
## A glm ensemble of 5 base models: rf, adaboost, earth, xgbDART, svmRadial
##
## Ensemble results:
## Generalized Linear Model
##
## 2571 samples
## 5 predictor
## 2 classes: 'CH', 'MM'
##
## No pre-processing
## Resampling: Cross-Validated (10 fold, repeated 3 times)
## Summary of sample sizes: 2314, 2314, 2313, 2314, 2315, 2313, ...
## Resampling results:
##
## Accuracy Kappa
## 0.8445367 0.6708299
A point to consider: The ensembles tend to perform better if the predictions are less correlated with each other.
So you may want to try passing different types of models, both high and low performing, rather than sticking only to high-accuracy models, when building the caretStack.
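One way to check this before stacking is to look at how correlated the base models' resampled results are, using caret's modelCor() on the resamples object computed earlier; values close to 1 suggest the models behave very similarly and add little diversity to the ensemble:
# Pairwise correlation of the models' resampled performance
# (uses the first resampling metric, Accuracy here, by default)
modelCor(results)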
# Predict on testData
stack_predicteds <- predict(stack.glm, newdata=testData4)
head(stack_predicteds)
## [1] CH CH CH CH MM CH
## Levels: CH MM
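To quantify how well the stacked model does on the held-out data, the stacked predictions can be compared against the true labels; a minimal sketch, assuming testData is the original 20% split and still contains the Purchase column:
# Confusion matrix of stacked predictions vs. observed classes
confusionMatrix(data = stack_predicteds, reference = testData$Purchase)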
That’s it