Machine learning (ML) develops algorithms to identify patterns in data (unsupervised ML) or make predictions and inferences (supervised ML).

Supervised ML trains the machine to learn from prior examples to predict either a categorical outcome (classification) or a numeric outcome (regression), or to infer the relationships between the outcome and its explanatory variables.

Two early forms of supervised ML are ordinary least squares (OLS) linear regression and generalized linear models (GLM) such as Poisson regression and logistic regression. These methods have been extended with more advanced linear methods, including stepwise selection, regularization (ridge, lasso, elastic net), principal components regression, and partial least squares. With greater computing capacity, non-linear models are now in use, including polynomial regression, step functions, splines, and generalized additive models (GAM). Decision trees (with bagging, random forests, and boosting) are additional options for regression and classification, and support vector machines are an additional option for classification.

These notes cover regularization. Regularization is a set of methods to manage the bias-variance trade-off problem in linear regression.

Background

The multiple linear regression model expresses the observed variable \(Y\) as a linear function of the predictors \(X\), \(Y = X \beta + \epsilon\), where \(\epsilon \sim N(0, \sigma^2)\). The OLS approach estimates the coefficients by minimizing the loss function,

\[L_{OLS} = ||y - X \hat{\beta}||^2.\]

The resulting estimate for the coefficients is

\[\hat{\beta}_{OLS} = (X'X)^{-1}(X'Y).\]

There are two important characteristics of any estimator: its bias and its variance. For OLS, these are

\[Bias(\hat{\beta}_{OLS}) = E(\hat{\beta}_{OLS}) - \beta\] \[Var(\hat{\beta}_{OLS}) = \sigma^2(X'X)^{-1}\]

where the unknown variance \(\sigma^2\) is estimated from the residuals

\[\hat\sigma^2 = \frac{\hat{\epsilon}' \hat{\epsilon}}{n - k}.\]
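As a quick check of these formulas, here is a minimal sketch that computes the OLS estimate, \(\hat\sigma^2\), and the coefficient standard errors by matrix algebra (illustration only; I use two arbitrarily chosen predictors from the mtcars data set that the examples below also use). You can compare the result against coef(summary(lm(mpg ~ wt + hp, data = mtcars))).

X <- cbind(1, as.matrix(mtcars[, c("wt", "hp")]))      # intercept column plus two arbitrary predictors
y <- mtcars$mpg
beta_hat <- solve(t(X) %*% X) %*% t(X) %*% y           # (X'X)^{-1} X'Y
resid <- y - X %*% beta_hat                            # residuals
sigma2_hat <- sum(resid^2) / (nrow(X) - ncol(X))       # residual sum of squares / (n - k)
se_beta <- sqrt(diag(sigma2_hat * solve(t(X) %*% X)))  # from sigma^2 (X'X)^{-1}
cbind(beta_hat, se_beta)                               # compare with coef(summary(lm(mpg ~ wt + hp, mtcars)))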

The OLS estimator is unbiased, but it can have a large variance when the predictor variables are highly correlated with each other, or when there are many predictors (notice how \(\hat{\sigma}^2\) increases as \(k \rightarrow n\)). Regularization is an approach that reduces variance at the cost of introducing some bias. Stepwise selection balances this tradeoff by eliminating variables, but this throws away information. Regularization keeps all the predictors, but shrinks the coefficient values to reduce variance.
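The variance inflation from correlated predictors is easy to see in a small simulation (hypothetical data, not part of the examples below): the same model fit with a nearly collinear second predictor produces much larger coefficient standard errors.

set.seed(1)
n <- 100
x1 <- rnorm(n)
x2_indep <- rnorm(n)                  # independent of x1
x2_coll  <- x1 + rnorm(n, sd = 0.05)  # nearly collinear with x1
y <- 1 + 2 * x1 + rnorm(n)
summary(lm(y ~ x1 + x2_indep))$coefficients[, "Std. Error"]  # modest standard errors
summary(lm(y ~ x1 + x2_coll))$coefficients[, "Std. Error"]   # inflated standard errors for x1 and x2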

Ridge

The Ridge approach estimates the linear model coefficients by minimizing an augmented loss function that adds a penalty, scaled by \(\lambda\), on the magnitude of the coefficient estimates,

\[L_{ridge} = ||y - X \hat{\beta}||^2 + \lambda||\hat\beta||^2.\]

The resulting estimate for the coefficients is

\[\hat{\beta}_{ridge} = (X'X + \lambda I)^{-1}(X'Y).\]

As \(\lambda \rightarrow 0\), ridge regression approaches OLS. The bias and variance of the ridge estimator are

\[Bias(\hat{\beta}_{ridge}) = -\lambda (X'X + \lambda I)^{-1} \beta\] \[Var(\hat{\beta}_{ridge}) = \sigma^2(X'X + \lambda I)^{-1}X'X(X'X + \lambda I)^{-1}\]

The estimator bias increases with \(\lambda\) and the estimator variance decreases with \(\lambda\). The optimal \(\lambda\) is the one that minimizes some cross-validated criterion, usually the root mean squared error (RMSE), though you can also try the Akaike or Bayesian Information Criterion (AIC or BIC), or Rsquared.
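Before turning to the caret workflow, here is a minimal sketch of the ridge closed form itself (illustration only; I use three arbitrarily chosen, standardized mtcars predictors and a centered response so no intercept is needed). As \(\lambda\) grows, the coefficients shrink toward zero.

X <- scale(as.matrix(mtcars[, c("wt", "hp", "disp")]))           # standardized predictors (arbitrary choice)
y <- mtcars$mpg - mean(mtcars$mpg)                               # centered response, so no intercept needed
ridge_beta <- function(lambda) {
  solve(t(X) %*% X + lambda * diag(ncol(X))) %*% t(X) %*% y      # (X'X + lambda I)^{-1} X'Y
}
sapply(c(0, 1, 10, 100), function(l) round(ridge_beta(l), 3))    # columns shrink as lambda grows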

Example

Using the mtcars data set included with base R, I will predict mpg from the other variables using the glmnet method inside the caret package. glmnet fits generalized linear models via penalized maximum likelihood and can handle ridge, lasso, and elastic net.

To compare this ridge regression model with the lasso and elastic net models, I’ll create a training and validation set.

library(tidyverse)  # for data cleaning
library(caret)  # for workflow

data("mtcars")

set.seed(123)
partition <- createDataPartition(mtcars$mpg, p = 0.6, list = FALSE)
train <- mtcars[partition, ]
valid <- mtcars[-partition, ]
rm(partition)

I’ll also create a trainControl object to reuse in the three models I will eventually fit (ridge, lasso, and elastic net). Creating a trainControl object is useful not just because it is a reusable code chunk, but because it guarantees the models use the same observations in the cross-validation folds.

model.trControl <- trainControl(method = "cv",
                           number = 5,  # number of folds, usually 10, but 5 for smaller data sets
                           index = createFolds(train$mpg, k = 5),  # specify the fold index
                           verboseIter = FALSE,  # don't print the results to the log
                           savePredictions = "final")  # saves predictions for the optimal tuning parameters

The first step is to perform cross-validation (I’ll use 5 folds instead of 10 because the data set is small) to select the optimal value for \(\lambda\). I’ll specify alpha = 0 in a tuning grid for ridge regression (the following sections will reveal how alpha distinguishes ridge, lasso, and elastic net).

set.seed(1234)
ridge.model <- train(
  mpg ~ .,
  data = train,
  method = "glmnet",
  metric = "RMSE",  # Choose from RMSE, RSquared, AIC, BIC, ...others?
  preProcess = c("zv",  # remove and variables with zero variance (a single value)
                 "medianImpute",   # impute NAs with median (then try knnImpute!)
                 "center", "scale",  # standardize data for linear models "it just works better."
                 "pca"  # reduce dimensions of nearly-zero variance columns
                 ),  
  tuneGrid = expand.grid(
    .alpha = 0,  # optimize a ridge regression
    .lambda = seq(0, 10, length = 100)),  # create range by experiment to find metric's local min
  trControl = model.trControl
  )
ridge.model
## glmnet 
## 
## 21 samples
## 10 predictors
## 
## Pre-processing: median imputation (10), centered (10), scaled
##  (10), principal component signal extraction (10) 
## Resampling: Cross-Validated (5 fold) 
## Summary of sample sizes: 6, 3, 4, 3, 5 
## Resampling results across tuning parameters:
## 
##   lambda      RMSE      Rsquared   MAE     
##    0.0000000  2.842346  0.8335761  2.226618
##    0.1010101  2.842346  0.8335761  2.226618
##    0.2020202  2.842346  0.8335761  2.226618
##    0.3030303  2.859627  0.8335761  2.241724
##    0.4040404  2.888292  0.8335761  2.263432
##    0.5050505  2.916358  0.8335761  2.284131
##    0.6060606  2.947597  0.8335761  2.304810
##    0.7070707  2.974429  0.8335761  2.321993
##    0.8080808  3.000703  0.8335761  2.342673
##    0.9090909  3.023804  0.8335761  2.357964
##    1.0101010  3.047025  0.8335761  2.372200
##    1.1111111  3.070365  0.8335761  2.385677
##    1.2121212  3.093814  0.8335761  2.398812
##    1.3131313  3.117282  0.8335761  2.415169
##    1.4141414  3.140762  0.8335761  2.432679
##    1.5151515  3.164282  0.8335761  2.449407
##    1.6161616  3.187796  0.8335761  2.465393
##    1.7171717  3.211213  0.8335761  2.480595
##    1.8181818  3.234584  0.8335761  2.495162
##    1.9191919  3.257885  0.8335761  2.509154
##    2.0202020  3.280962  0.8335761  2.522402
##    2.1212121  3.304038  0.8335761  2.535290
##    2.2222222  3.326849  0.8335761  2.547471
##    2.3232323  3.349620  0.8335761  2.559369
##    2.4242424  3.372110  0.8335761  2.571945
##    2.5252525  3.394469  0.8335761  2.584626
##    2.6262626  3.416658  0.8335761  2.597561
##    2.7272727  3.438548  0.8335761  2.610580
##    2.8282828  3.460257  0.8335761  2.623367
##    2.9292929  3.481771  0.8335761  2.635771
##    3.0303030  3.503036  0.8335761  2.649253
##    3.1313131  3.524098  0.8335761  2.663431
##    3.2323232  3.544896  0.8335761  2.678349
##    3.3333333  3.565465  0.8335761  2.694312
##    3.4343434  3.585811  0.8335761  2.709857
##    3.5353535  3.605919  0.8335761  2.724935
##    3.6363636  3.625729  0.8335761  2.739618
##    3.7373737  3.645325  0.8335761  2.754575
##    3.8383838  3.664807  0.8335761  2.770479
##    3.9393939  3.683844  0.8335761  2.785749
##    4.0404040  3.702706  0.8335761  2.800800
##    4.1414141  3.721349  0.8335761  2.816135
##    4.2424242  3.739817  0.8335761  2.831552
##    4.3434343  3.757919  0.8335761  2.847069
##    4.4444444  3.775808  0.8335761  2.862282
##    4.5454545  3.793502  0.8335761  2.877180
##    4.6464646  3.811066  0.8335761  2.891821
##    4.7474747  3.828196  0.8335761  2.905968
##    4.8484848  3.845124  0.8335761  2.919862
##    4.9494949  3.861883  0.8335761  2.933499
##    5.0505051  3.878616  0.8335761  2.947003
##    5.1515152  3.894864  0.8335761  2.959977
##    5.2525253  3.910915  0.8335761  2.973793
##    5.3535354  3.926789  0.8335761  2.988131
##    5.4545455  3.942472  0.8335761  3.002211
##    5.5555556  3.958144  0.8335761  3.016200
##    5.6565657  3.973326  0.8335761  3.029641
##    5.7575758  3.988339  0.8335761  3.042855
##    5.8585859  4.003151  0.8335761  3.055851
##    5.9595960  4.017812  0.8335761  3.068645
##    6.0606061  4.032512  0.8335761  3.081404
##    6.1616162  4.046831  0.8335761  3.093760
##    6.2626263  4.060818  0.8335761  3.105752
##    6.3636364  4.074638  0.8335761  3.117569
##    6.4646465  4.088363  0.8335761  3.129260
##    6.5656566  4.101952  0.8335761  3.140910
##    6.6666667  4.115563  0.8335761  3.152984
##    6.7676768  4.128796  0.8335761  3.164673
##    6.8686869  4.141715  0.8335761  3.176052
##    6.9696970  4.154493  0.8335761  3.187251
##    7.0707071  4.167186  0.8335761  3.198327
##    7.1717172  4.179718  0.8335761  3.209221
##    7.2727273  4.192288  0.8335761  3.220108
##    7.3737374  4.204710  0.8335761  3.230836
##    7.4747475  4.216656  0.8335761  3.241109
##    7.5757576  4.228540  0.8335761  3.251491
##    7.6767677  4.240180  0.8335761  3.261692
##    7.7777778  4.251806  0.8335761  3.271846
##    7.8787879  4.263301  0.8335761  3.281854
##    7.9797980  4.274831  0.8335761  3.291863
##    8.0808081  4.286233  0.8335761  3.301739
##    8.1818182  4.297225  0.8335761  3.311467
##    8.2828283  4.308072  0.8335761  3.321274
##    8.3838384  4.318775  0.8335761  3.330906
##    8.4848485  4.329392  0.8335761  3.340424
##    8.5858586  4.339926  0.8335761  3.349846
##    8.6868687  4.350426  0.8335761  3.359215
##    8.7878788  4.360953  0.8335761  3.368742
##    8.8888889  4.371292  0.8335761  3.378239
##    8.9898990  4.381218  0.8335761  3.387328
##    9.0909091  4.391073  0.8335761  3.396338
##    9.1919192  4.400798  0.8335761  3.405208
##    9.2929293  4.410429  0.8335761  3.414136
##    9.3939394  4.420017  0.8335761  3.423010
##    9.4949495  4.429538  0.8335761  3.431808
##    9.5959596  4.439079  0.8335761  3.440606
##    9.6969697  4.448553  0.8335761  3.449334
##    9.7979798  4.457786  0.8335761  3.457832
##    9.8989899  4.466709  0.8335761  3.466027
##   10.0000000  4.475619  0.8335761  3.474195
## 
## Tuning parameter 'alpha' was held constant at a value of 0
## RMSE was used to select the optimal model using the smallest value.
## The final values used for the model were alpha = 0 and lambda = 0.2020202.

The printout of the model shows the RMSE, Rsquared, and mean absolute error (MAE) values at each lambda specified in the tuning grid. The final three lines summarize what happened. It did not tune alpha because I held it at 0 for ridge regression; it optimized using RMSE; and the optimal tuning values (at the minimum RMSE) were alpha = 0 and lambda = 0.20. You can see the RMSE minimum in the plot.

plot(ridge.model, main = "Ridge Regression Parameter Tuning", xlab = "lambda")

Make predictions on the validation data set and save the performance metrics for comparison to other models.

(ridge.perf <- postResample(pred = predict(ridge.model, newdata = valid), 
                            obs = valid$mpg))
##      RMSE  Rsquared       MAE 
## 2.9941023 0.8251066 2.4183197

On average, this model will miss the true value of mpg by about 2.99 mpg (RMSE) or 2.42 mpg (MAE). The model explains about 83% of the variation in mpg.

Lasso

Like ridge, lasso adds a penalty for coefficients, but instead of penalizing the sum of squared coefficients (L2 penalty), lasso penalizes the sum of absolute values (L1 penalty). As a result, for high values of \(\lambda\), coefficients can be zeroed under lasso.

The loss function for lasso is

\[L_{lasso} = ||y - X \hat{\beta}||^2 + \lambda||\hat\beta||_1.\]
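A minimal sketch with glmnet called directly (outside of caret; lambda values chosen arbitrarily) shows this zeroing: as lambda grows, more of the lasso coefficients are exactly zero.

library(glmnet)
x <- as.matrix(mtcars[, -1])          # all predictors
y <- mtcars$mpg
lasso.fit <- glmnet(x, y, alpha = 1)  # alpha = 1 is the lasso
coef(lasso.fit, s = c(0.1, 1, 5))     # larger lambda (s) leaves more coefficients at exactly zero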

Example

Continuing with prediction of mpg from the other variables in the mtcars data set, follow the same steps as before, but with lasso regression. This time specify parameter alpha = 1 for lasso (it was 0 for ridge, and for elastic net it will be something in between and require optimization). The preprocessing again centers and scales the predictor matrix.

set.seed(123)
lasso.model <- train(
  mpg ~ .,
  data = train,
  method = "glmnet",
  metric = "RMSE",  # Choose from RMSE, RSquared, AIC, BIC, ...others?
  preProcess = c("zv",  # remove and variables with zero variance (a single value)
                 "medianImpute",   # impute NAs with median (then try knnImpute!)
                 "center", "scale",  # standardize data for linear models "it just works better."
                 "pca"  # reduce dimensions of nearly-zero variance columns
                 ),  
  tuneGrid = expand.grid(
    .alpha = 1,  # optimize a lasso regression
    .lambda = seq(0, 10, length = 100)),  # create range by experiment to find metric's local min
  trControl = model.trControl
  )
## Warning in nominalTrainWorkflow(x = x, y = y, wts = weights, info =
## trainInfo, : There were missing values in resampled performance measures.
lasso.model
## glmnet 
## 
## 21 samples
## 10 predictors
## 
## Pre-processing: median imputation (10), centered (10), scaled
##  (10), principal component signal extraction (10) 
## Resampling: Cross-Validated (5 fold) 
## Summary of sample sizes: 6, 3, 4, 3, 5 
## Resampling results across tuning parameters:
## 
##   lambda      RMSE      Rsquared   MAE     
##    0.0000000  2.802329  0.8377750  2.212591
##    0.1010101  2.828988  0.8380361  2.230948
##    0.2020202  2.877310  0.8378302  2.263547
##    0.3030303  2.919480  0.8379567  2.300429
##    0.4040404  2.970206  0.8364965  2.348052
##    0.5050505  3.029649  0.8327822  2.398501
##    0.6060606  3.098036  0.8257891  2.448949
##    0.7070707  3.175586  0.8139409  2.501825
##    0.8080808  3.252849  0.7993631  2.551406
##    0.9090909  3.308850  0.7937066  2.583490
##    1.0101010  3.360316  0.7905247  2.619102
##    1.1111111  3.409772  0.7897443  2.654206
##    1.2121212  3.461994  0.7899803  2.691823
##    1.3131313  3.519119  0.7900000  2.737249
##    1.4141414  3.580936  0.7898012  2.784358
##    1.5151515  3.647195  0.7893836  2.831858
##    1.6161616  3.717621  0.7887488  2.890751
##    1.7171717  3.791920  0.7878999  2.952317
##    1.8181818  3.869796  0.7868417  3.019730
##    1.9191919  3.950956  0.7855807  3.091818
##    2.0202020  4.035117  0.7841246  3.163986
##    2.1212121  4.107993  0.7706469  3.226742
##    2.2222222  4.158882  0.7695640  3.267462
##    2.3232323  4.200636  0.7693146  3.299494
##    2.4242424  4.242347  0.7691816  3.332593
##    2.5252525  4.283940  0.7691816  3.364881
##    2.6262626  4.326916  0.7691816  3.397169
##    2.7272727  4.371230  0.7691816  3.429457
##    2.8282828  4.416843  0.7691816  3.461862
##    2.9292929  4.463716  0.7691816  3.498668
##    3.0303030  4.511815  0.7691816  3.537095
##    3.1313131  4.555402  0.8505127  3.575671
##    3.2323232  4.592293  0.8505127  3.609137
##    3.3333333  4.630202  0.8505127  3.642602
##    3.4343434  4.669103  0.8505127  3.676068
##    3.5353535  4.708967  0.8505127  3.709533
##    3.6363636  4.749768  0.8505127  3.742999
##    3.7373737  4.791480  0.8505127  3.776464
##    3.8383838  4.834074  0.8505127  3.809930
##    3.9393939  4.877522  0.8505127  3.843396
##    4.0404040  4.921797  0.8505127  3.876861
##    4.1414141  4.966870  0.8505127  3.911844
##    4.2424242  5.012713  0.8505127  3.948932
##    4.3434343  5.059295  0.8505127  3.986020
##    4.4444444  5.106589  0.8505127  4.023972
##    4.5454545  5.154565  0.8505127  4.064769
##    4.6464646  5.203196  0.8505127  4.106322
##    4.7474747  5.252453  0.8505127  4.147875
##    4.8484848  5.302309  0.8505127  4.189427
##    4.9494949  5.352737  0.8505127  4.232095
##    5.0505051  5.403710  0.8505127  4.275946
##    5.1515152  5.446825  0.8470432  4.312132
##    5.2525253  5.479758  0.8470432  4.338927
##    5.3535354  5.513112  0.8470432  4.369221
##    5.4545455  5.546869  0.8470432  4.401106
##    5.5555556  5.581007  0.8470432  4.432991
##    5.6565657  5.615508  0.8470432  4.465670
##    5.7575758  5.650353  0.8470432  4.499695
##    5.8585859  5.685526  0.8470432  4.533720
##    5.9595960  5.721009  0.8470432  4.567746
##    6.0606061  5.739679  0.7846698  4.587158
##    6.1616162  5.752921  0.7846698  4.601717
##    6.2626263  5.766399  0.7846698  4.616276
##    6.3636364  5.780102  0.7846698  4.630834
##    6.4646465  5.794019  0.7846698  4.645393
##    6.5656566  5.808138  0.7846698  4.659952
##    6.6666667  5.822449  0.7846698  4.674511
##    6.7676768  5.836943  0.7846698  4.689070
##    6.8686869  5.851610  0.7846698  4.703628
##    6.9696970  5.866442  0.7846698  4.718187
##    7.0707071  5.881431  0.7846698  4.732746
##    7.1717172  5.896568  0.7846698  4.747305
##    7.2727273  5.911847  0.7846698  4.761864
##    7.3737374  5.927261  0.7846698  4.776423
##    7.4747475  5.942803  0.7846698  4.790981
##    7.5757576  5.958468  0.7846698  4.805540
##    7.6767677  5.974248  0.7846698  4.820099
##    7.7777778  5.990140  0.7846698  4.834658
##    7.8787879  6.006137  0.7846698  4.849217
##    7.9797980  6.019297        NaN  4.861125
##    8.0808081  6.019297        NaN  4.861125
##    8.1818182  6.019297        NaN  4.861125
##    8.2828283  6.019297        NaN  4.861125
##    8.3838384  6.019297        NaN  4.861125
##    8.4848485  6.019297        NaN  4.861125
##    8.5858586  6.019297        NaN  4.861125
##    8.6868687  6.019297        NaN  4.861125
##    8.7878788  6.019297        NaN  4.861125
##    8.8888889  6.019297        NaN  4.861125
##    8.9898990  6.019297        NaN  4.861125
##    9.0909091  6.019297        NaN  4.861125
##    9.1919192  6.019297        NaN  4.861125
##    9.2929293  6.019297        NaN  4.861125
##    9.3939394  6.019297        NaN  4.861125
##    9.4949495  6.019297        NaN  4.861125
##    9.5959596  6.019297        NaN  4.861125
##    9.6969697  6.019297        NaN  4.861125
##    9.7979798  6.019297        NaN  4.861125
##    9.8989899  6.019297        NaN  4.861125
##   10.0000000  6.019297        NaN  4.861125
## 
## Tuning parameter 'alpha' was held constant at a value of 1
## RMSE was used to select the optimal model using the smallest value.
## The final values used for the model were alpha = 1 and lambda = 0.

The summary output shows the model did not tune alpha because I held it at 1 for lasso regression; it optimized using RMSE; and the optimal tuning values (at the minimum RMSE) were alpha = 1 and lambda = 0. You can see the RMSE minimum on the plot.

plot(lasso.model, main = "Lasso Regression Parameter Tuning", xlab = "lambda")

Make predictions on the validation data set and save the performance metrics for comparison to the other models.

(lasso.perf <- postResample(pred = predict(lasso.model, newdata = valid), 
                            obs = valid$mpg))
##     RMSE Rsquared      MAE 
## 3.085846 0.825303 2.473863

On average, this model will miss the true value of mpg by about 3.09 mpg (RMSE) or 2.47 mpg (MAE), so slightly worse than ridge on both the RMSE measure and the MAE measure on the validation set. The model explains about 83% of the variation in mpg, about the same as ridge.

Elastic Net

Elastic Net combines the penalties of ridge and lasso to get the best of both worlds. The loss function for elastic net is

\[L_{enet} = \frac{||y - X \hat{\beta}||^2}{2n} + \lambda \left( \frac{1 - \alpha}{2}||\hat\beta||_2^2 + \alpha||\hat\beta||_1 \right).\]

In this loss function, new parameter \(\alpha\) is a “mixing” parameter that balances the two approaches. You can see that if \(\alpha\) is zero, you are back to ridge regression, and if \(\alpha\) is one, you are back to lasso.
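Again as a minimal sketch with glmnet called directly (fixed, arbitrarily chosen lambda = 1), you can see the mixing at work: alpha = 0 keeps every coefficient (ridge), alpha = 1 zeroes several (lasso), and an intermediate alpha typically lands in between.

library(glmnet)
x <- as.matrix(mtcars[, -1])
y <- mtcars$mpg
sapply(c(0, 0.5, 1), function(a) {
  fit <- glmnet(x, y, alpha = a, lambda = 1)  # same penalty, different mixing parameter
  sum(coef(fit) != 0) - 1                     # count of non-zero coefficients (excluding the intercept)
})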

Example

Continuing with prediction of mpg from the other variables in the mtcars data set, follow the same steps as before, but with elastic net regression. This time there are two parameters to optimize: \(\lambda\) and \(\alpha\).

The two tuning parameters are alpha (the mixing percentage) and lambda (the regularization penalty).

Reuse the trainControl object to do 5-fold cross-validation.

set.seed(12345)
elnet.model <- train(
  mpg ~ .,
  data = train,
  method = "glmnet",
  metric = "RMSE",  # Choose from RMSE, RSquared, AIC, BIC, ...others?
  tuneLength = 25,  # Do not choose values - let the algorithm search for them
  preProcess = c("zv",  # remove any variables with zero variance (a single value)
                 "medianImpute",   # impute NAs with median (then try knnImpute!)
                 "center", "scale",  # standardize data for linear models "it just works better."
                 "pca"  # reduce dimensions of nearly-zero variance columns
                 ),  
  trControl = model.trControl
  )
## Warning in nominalTrainWorkflow(x = x, y = y, wts = weights, info =
## trainInfo, : There were missing values in resampled performance measures.
elnet.model
## glmnet 
## 
## 21 samples
## 10 predictors
## 
## Pre-processing: median imputation (10), centered (10), scaled
##  (10), principal component signal extraction (10) 
## Resampling: Cross-Validated (5 fold) 
## Summary of sample sizes: 6, 3, 4, 3, 5 
## Resampling results across tuning parameters:
## 
##   alpha   lambda      RMSE      Rsquared   MAE     
##   0.1000  0.01253587  2.822712  0.8341958  2.230540
##   0.1000  0.01786490  2.822712  0.8341958  2.230540
##   0.1000  0.02545932  2.823246  0.8341929  2.230932
##   0.1000  0.03628215  2.824826  0.8341845  2.232082
##   0.1000  0.05170579  2.826812  0.8341810  2.233017
##   0.1000  0.07368606  2.828682  0.8342004  2.233252
##   0.1000  0.10501020  2.837067  0.8342416  2.238171
##   0.1000  0.14965031  2.849107  0.8342998  2.246714
##   0.1000  0.21326706  2.866419  0.8343816  2.259429
##   0.1000  0.30392747  2.887506  0.8346111  2.274015
##   0.1000  0.43312785  2.914886  0.8350383  2.290835
##   0.1000  0.61725166  2.955561  0.8356329  2.313589
##   0.1000  0.87964701  3.017429  0.8363307  2.357754
##   0.1000  1.25358731  3.112282  0.8369538  2.418199
##   0.1000  1.78649065  3.251146  0.8376264  2.510958
##   0.1000  2.54593263  3.448256  0.8380655  2.617976
##   0.1000  3.62821544  3.712419  0.8373171  2.820310
##   0.1000  5.17057958  4.039417  0.8321723  3.094375
##   0.1000  7.36860686  4.411330  0.8093686  3.415163
##   0.1375  0.01253587  2.821570  0.8343939  2.229711
##   0.1375  0.01786490  2.821570  0.8343939  2.229711
##   0.1375  0.02545932  2.822253  0.8343891  2.230221
##   0.1375  0.03628215  2.823929  0.8343775  2.231430
##   0.1375  0.05170579  2.825759  0.8343765  2.232211
##   0.1375  0.07368606  2.828207  0.8344048  2.232907
##   0.1375  0.10501020  2.836675  0.8344610  2.237850
##   0.1375  0.14965031  2.848850  0.8345401  2.246486
##   0.1375  0.21326706  2.866393  0.8346503  2.259299
##   0.1375  0.30392747  2.886845  0.8349884  2.273129
##   0.1375  0.43312785  2.914222  0.8355638  2.289664
##   0.1375  0.61725166  2.955999  0.8362743  2.314058
##   0.1375  0.87964701  3.022104  0.8368865  2.362684
##   0.1375  1.25358731  3.120921  0.8375593  2.426367
##   0.1375  1.78649065  3.266846  0.8380478  2.521659
##   0.1375  2.54593263  3.475889  0.8375163  2.633937
##   0.1375  3.62821544  3.758347  0.8331213  2.858038
##   0.1375  5.17057958  4.110993  0.8134593  3.163432
##   0.1375  7.36860686  4.478935  0.7905112  3.477971
##   0.1750  0.01253587  2.819193  0.8346377  2.227605
##   0.1750  0.01786490  2.819193  0.8346377  2.227605
##   0.1750  0.02545932  2.819776  0.8346326  2.228046
##   0.1750  0.03628215  2.821488  0.8346178  2.229332
##   0.1750  0.05170579  2.823062  0.8346231  2.229859
##   0.1750  0.07368606  2.826300  0.8346621  2.231186
##   0.1750  0.10501020  2.834855  0.8347330  2.236156
##   0.1750  0.14965031  2.847173  0.8348323  2.244891
##   0.1750  0.21326706  2.864954  0.8349699  2.257800
##   0.1750  0.30392747  2.886222  0.8353594  2.272238
##   0.1750  0.43312785  2.913711  0.8360693  2.288522
##   0.1750  0.61725166  2.958852  0.8366788  2.315935
##   0.1750  0.87964701  3.027387  0.8373474  2.367863
##   0.1750  1.25358731  3.130584  0.8379449  2.434994
##   0.1750  1.78649065  3.284487  0.8378995  2.533477
##   0.1750  2.54593263  3.507223  0.8353044  2.660129
##   0.1750  3.62821544  3.810970  0.8229858  2.907738
##   0.1750  5.17057958  4.171380  0.7938581  3.222794
##   0.1750  7.36860686  4.536740  0.7900153  3.532974
##   0.2125  0.01253587  2.817619  0.8348555  2.226296
##   0.2125  0.01786490  2.817619  0.8348555  2.226296
##   0.2125  0.02545932  2.818567  0.8348458  2.227025
##   0.2125  0.03628215  2.820352  0.8348277  2.228376
##   0.2125  0.05170579  2.821914  0.8348352  2.228886
##   0.2125  0.07368606  2.825285  0.8348828  2.230302
##   0.2125  0.10501020  2.833930  0.8349681  2.235298
##   0.2125  0.14965031  2.846394  0.8350870  2.244130
##   0.2125  0.21326706  2.864427  0.8352507  2.257143
##   0.2125  0.30392747  2.885663  0.8357239  2.271364
##   0.2125  0.43312785  2.915006  0.8364042  2.288312
##   0.2125  0.61725166  2.961951  0.8370393  2.319794
##   0.2125  0.87964701  3.033163  0.8377043  2.374176
##   0.2125  1.25358731  3.141380  0.8380692  2.445602
##   0.2125  1.78649065  3.304352  0.8370343  2.547625
##   0.2125  2.54593263  3.542637  0.8307640  2.692841
##   0.2125  3.62821544  3.869739  0.8035089  2.967475
##   0.2125  5.17057958  4.215889  0.7896983  3.265714
##   0.2125  7.36860686  4.603234  0.7891009  3.593574
##   0.2500  0.01253587  2.818050  0.8349978  2.226766
##   0.2500  0.01786490  2.818050  0.8349978  2.226766
##   0.2500  0.02545932  2.818953  0.8349873  2.227469
##   0.2500  0.03628215  2.820858  0.8349659  2.228873
##   0.2500  0.05170579  2.822592  0.8349706  2.229548
##   0.2500  0.07368606  2.825652  0.8350241  2.230687
##   0.2500  0.10501020  2.834389  0.8351236  2.235710
##   0.2500  0.14965031  2.847006  0.8352615  2.244643
##   0.2500  0.21326706  2.865295  0.8354500  2.257755
##   0.2500  0.30392747  2.885240  0.8360738  2.270540
##   0.2500  0.43312785  2.916825  0.8366832  2.289115
##   0.2500  0.61725166  2.965344  0.8373520  2.324474
##   0.2500  0.87964701  3.039520  0.8379474  2.380967
##   0.2500  1.25358731  3.153236  0.8378927  2.456459
##   0.2500  1.78649065  3.326504  0.8352649  2.562324
##   0.2500  2.54593263  3.582613  0.8228285  2.727140
##   0.2500  3.62821544  3.910352  0.7937906  3.010308
##   0.2500  5.17057958  4.262642  0.7900140  3.310668
##   0.2500  7.36860686  4.679775  0.7865756  3.658138
##   0.2875  0.01253587  2.815248  0.8352611  2.224287
##   0.2875  0.01786490  2.815248  0.8352611  2.224287
##   0.2875  0.02545932  2.816346  0.8352470  2.225151
##   0.2875  0.03628215  2.818282  0.8352223  2.226634
##   0.2875  0.05170579  2.819767  0.8352364  2.227055
##   0.2875  0.07368606  2.823512  0.8353019  2.228730
##   0.2875  0.10501020  2.832341  0.8354152  2.233780
##   0.2875  0.14965031  2.845116  0.8355715  2.242814
##   0.2875  0.21326706  2.863680  0.8357834  2.256037
##   0.2875  0.30392747  2.885925  0.8363201  2.270734
##   0.2875  0.43312785  2.918766  0.8369410  2.290783
##   0.2875  0.61725166  2.968986  0.8376138  2.329215
##   0.2875  0.87964701  3.046424  0.8380609  2.387902
##   0.2875  1.25358731  3.166419  0.8373587  2.467783
##   0.2875  1.78649065  3.351254  0.8323494  2.577822
##   0.2875  2.54593263  3.627032  0.8101199  2.764473
##   0.2875  3.62821544  3.944081  0.7901196  3.043788
##   0.2875  5.17057958  4.316510  0.7895247  3.358651
##   0.2875  7.36860686  4.762853  0.7707323  3.722696
##   0.3250  0.01253587  2.814341  0.8354226  2.223584
##   0.3250  0.01786490  2.814341  0.8354226  2.223584
##   0.3250  0.02545932  2.815579  0.8354052  2.224568
##   0.3250  0.03628215  2.817604  0.8353772  2.226113
##   0.3250  0.05170579  2.818932  0.8353990  2.226369
##   0.3250  0.07368606  2.823117  0.8354755  2.228377
##   0.3250  0.10501020  2.832044  0.8356023  2.233456
##   0.3250  0.14965031  2.844981  0.8357764  2.242593
##   0.3250  0.21326706  2.863825  0.8360104  2.256133
##   0.3250  0.30392747  2.887025  0.8365225  2.271198
##   0.3250  0.43312785  2.920857  0.8371751  2.292493
##   0.3250  0.61725166  2.972974  0.8378201  2.334105
##   0.3250  0.87964701  3.053979  0.8380346  2.395061
##   0.3250  1.25358731  3.180800  0.8364190  2.479396
##   0.3250  1.78649065  3.378584  0.8279394  2.594837
##   0.3250  2.54593263  3.667427  0.7970312  2.804010
##   0.3250  3.62821544  3.977853  0.7899258  3.075394
##   0.3250  5.17057958  4.377410  0.7882408  3.409659
##   0.3250  7.36860686  4.827067  0.7691816  3.773404
##   0.3625  0.01253587  2.813324  0.8355992  2.222724
##   0.3625  0.01786490  2.813324  0.8355992  2.222724
##   0.3625  0.02545932  2.814660  0.8355789  2.223796
##   0.3625  0.03628215  2.816764  0.8355476  2.225406
##   0.3625  0.05170579  2.818009  0.8355753  2.225570
##   0.3625  0.07368606  2.822451  0.8356617  2.227761
##   0.3625  0.10501020  2.831478  0.8358018  2.232871
##   0.3625  0.14965031  2.844585  0.8359931  2.242112
##   0.3625  0.21326706  2.863901  0.8362310  2.256210
##   0.3625  0.30392747  2.888192  0.8367147  2.271678
##   0.3625  0.43312785  2.923085  0.8373849  2.294292
##   0.3625  0.61725166  2.977232  0.8379668  2.339064
##   0.3625  0.87964701  3.062201  0.8378489  2.402448
##   0.3625  1.25358731  3.196624  0.8349886  2.492343
##   0.3625  1.78649065  3.408630  0.8216496  2.617653
##   0.3625  2.54593263  3.693435  0.7932769  2.831530
##   0.3625  3.62821544  4.015660  0.7899986  3.109303
##   0.3625  5.17057958  4.446290  0.7861797  3.463833
##   0.3625  7.36860686  4.891414  0.7691816  3.827463
##   0.4000  0.01253587  2.812193  0.8357924  2.221715
##   0.4000  0.01786490  2.812193  0.8357924  2.221715
##   0.4000  0.02545932  2.813591  0.8357696  2.222848
##   0.4000  0.03628215  2.815766  0.8357349  2.224525
##   0.4000  0.05170579  2.816991  0.8357665  2.224660
##   0.4000  0.07368606  2.821548  0.8358618  2.226915
##   0.4000  0.10501020  2.830677  0.8360148  2.232130
##   0.4000  0.14965031  2.843959  0.8362226  2.241605
##   0.4000  0.21326706  2.864335  0.8364090  2.256397
##   0.4000  0.30392747  2.889437  0.8368958  2.272179
##   0.4000  0.43312785  2.925458  0.8375696  2.296578
##   0.4000  0.61725166  2.981803  0.8380500  2.344126
##   0.4000  0.87964701  3.071078  0.8374865  2.410741
##   0.4000  1.25358731  3.213874  0.8329768  2.506400
##   0.4000  1.78649065  3.441751  0.8128555  2.645888
##   0.4000  2.54593263  3.718357  0.7904403  2.854369
##   0.4000  3.62821544  4.058493  0.7896751  3.144840
##   0.4000  5.17057958  4.522863  0.7717762  3.536770
##   0.4000  7.36860686  4.960336  0.7691816  3.884501
##   0.4375  0.01253587  2.810943  0.8360035  2.220559
##   0.4375  0.01786490  2.810943  0.8360035  2.220559
##   0.4375  0.02545932  2.812373  0.8359787  2.221728
##   0.4375  0.03628215  2.814613  0.8359407  2.223474
##   0.4375  0.05170579  2.815876  0.8359739  2.223637
##   0.4375  0.07368606  2.820427  0.8360769  2.225860
##   0.4375  0.10501020  2.829661  0.8362426  2.231180
##   0.4375  0.14965031  2.843122  0.8364661  2.240930
##   0.4375  0.21326706  2.864856  0.8365717  2.256544
##   0.4375  0.30392747  2.890753  0.8370657  2.272697
##   0.4375  0.43312785  2.927990  0.8377273  2.298911
##   0.4375  0.61725166  2.986699  0.8380651  2.349556
##   0.4375  0.87964701  3.080626  0.8369268  2.419644
##   0.4375  1.25358731  3.232630  0.8302725  2.520892
##   0.4375  1.78649065  3.475248  0.8023922  2.673924
##   0.4375  2.54593263  3.742608  0.7897529  2.875405
##   0.4375  3.62821544  4.106517  0.7889560  3.187981
##   0.4375  5.17057958  4.577889  0.7694699  3.587649
##   0.4375  7.36860686  5.026593  0.8505127  3.939822
##   0.4750  0.01253587  2.812414  0.8360165  2.221949
##   0.4750  0.01786490  2.812619  0.8360128  2.222118
##   0.4750  0.02545932  2.814187  0.8359841  2.223410
##   0.4750  0.03628215  2.816414  0.8359456  2.225154
##   0.4750  0.05170579  2.817211  0.8360015  2.224832
##   0.4750  0.07368606  2.821652  0.8361109  2.226937
##   0.4750  0.10501020  2.830994  0.8362889  2.232414
##   0.4750  0.14965031  2.844640  0.8365275  2.242391
##   0.4750  0.21326706  2.867530  0.8365661  2.258633
##   0.4750  0.30392747  2.892145  0.8372238  2.273234
##   0.4750  0.43312785  2.930670  0.8378572  2.301281
##   0.4750  0.61725166  2.991933  0.8380066  2.355660
##   0.4750  0.87964701  3.090887  0.8361454  2.428761
##   0.4750  1.25358731  3.252969  0.8267370  2.535842
##   0.4750  1.78649065  3.501422  0.7952423  2.695120
##   0.4750  2.54593263  3.768554  0.7899763  2.898489
##   0.4750  3.62821544  4.160026  0.7878472  3.233743
##   0.4750  5.17057958  4.625412  0.7691816  3.626050
##   0.4750  7.36860686  5.088642  0.8505127  3.992930
##   0.5125  0.01253587  2.811068  0.8362484  2.220643
##   0.5125  0.01786490  2.811217  0.8362455  2.220767
##   0.5125  0.02545932  2.812832  0.8362145  2.222106
##   0.5125  0.03628215  2.815186  0.8361696  2.223996
##   0.5125  0.05170579  2.816022  0.8362277  2.223702
##   0.5125  0.07368606  2.820258  0.8363420  2.225613
##   0.5125  0.10501020  2.829710  0.8365320  2.231368
##   0.5125  0.14965031  2.843681  0.8367724  2.241533
##   0.5125  0.21326706  2.868052  0.8367195  2.258733
##   0.5125  0.30392747  2.893611  0.8373699  2.273788
##   0.5125  0.43312785  2.933513  0.8379578  2.304529
##   0.5125  0.61725166  2.997484  0.8378682  2.361857
##   0.5125  0.87964701  3.101910  0.8351136  2.438119
##   0.5125  1.25358731  3.275028  0.8221958  2.551317
##   0.5125  1.78649065  3.518738  0.7935127  2.707625
##   0.5125  2.54593263  3.797601  0.7900069  2.926602
##   0.5125  3.62821544  4.219168  0.7863587  3.288704
##   0.5125  5.17057958  4.674616  0.7691816  3.664462
##   0.5125  7.36860686  5.155082  0.8505127  4.051294
##   0.5500  0.01253587  2.807915  0.8364928  2.217829
##   0.5500  0.01786490  2.808324  0.8364846  2.218172
##   0.5500  0.02545932  2.809985  0.8364512  2.219559
##   0.5500  0.03628215  2.812389  0.8364029  2.221524
##   0.5500  0.05170579  2.813386  0.8364576  2.221381
##   0.5500  0.07368606  2.818673  0.8365909  2.224104
##   0.5500  0.10501020  2.828237  0.8367926  2.230138
##   0.5500  0.14965031  2.842812  0.8370084  2.240654
##   0.5500  0.21326706  2.868191  0.8369034  2.258456
##   0.5500  0.30392747  2.895152  0.8375038  2.274873
##   0.5500  0.43312785  2.936502  0.8380269  2.308447
##   0.5500  0.61725166  3.003391  0.8376431  2.368176
##   0.5500  0.87964701  3.113762  0.8337963  2.447759
##   0.5500  1.25358731  3.298851  0.8164588  2.567325
##   0.5500  1.78649065  3.537661  0.7908195  2.721039
##   0.5500  2.54593263  3.829822  0.7898423  2.956772
##   0.5500  3.62821544  4.284165  0.7845028  3.348735
##   0.5500  5.17057958  4.727071  0.7691816  3.704827
##   0.5500  7.36860686  5.225639  0.8505127  4.112522
##   0.5875  0.01253587  2.809711  0.8365016  2.219414
##   0.5875  0.01786490  2.810032  0.8364949  2.219685
##   0.5875  0.02545932  2.811741  0.8364591  2.221121
##   0.5875  0.03628215  2.813924  0.8364197  2.222886
##   0.5875  0.05170579  2.814719  0.8364874  2.222524
##   0.5875  0.07368606  2.819676  0.8366240  2.224950
##   0.5875  0.10501020  2.829357  0.8368370  2.231266
##   0.5875  0.14965031  2.844858  0.8369973  2.242322
##   0.5875  0.21326706  2.869928  0.8369526  2.259617
##   0.5875  0.30392747  2.896758  0.8376245  2.276132
##   0.5875  0.43312785  2.939662  0.8380629  2.312426
##   0.5875  0.61725166  3.009698  0.8373231  2.374648
##   0.5875  0.87964701  3.126425  0.8321665  2.457650
##   0.5875  1.25358731  3.324470  0.8092998  2.583828
##   0.5875  1.78649065  3.554316  0.7900164  2.736808
##   0.5875  2.54593263  3.865385  0.7894810  2.988471
##   0.5875  3.62821544  4.345453  0.7704319  3.402954
##   0.5875  5.17057958  4.782520  0.7691816  3.747011
##   0.5875  7.36860686  5.301717  0.8505127  4.179989
##   0.6250  0.01253587  2.806360  0.8367683  2.216464
##   0.6250  0.01786490  2.806918  0.8367563  2.216937
##   0.6250  0.02545932  2.808674  0.8367181  2.218422
##   0.6250  0.03628215  2.811194  0.8366651  2.220467
##   0.6250  0.05170579  2.811989  0.8367365  2.220091
##   0.6250  0.07368606  2.817868  0.8368919  2.223301
##   0.6250  0.10501020  2.827667  0.8371158  2.229817
##   0.6250  0.14965031  2.844101  0.8372191  2.241415
##   0.6250  0.21326706  2.870795  0.8370703  2.259959
##   0.6250  0.30392747  2.898453  0.8377316  2.277870
##   0.6250  0.43312785  2.943004  0.8380642  2.316471
##   0.6250  0.61725166  3.016359  0.8369059  2.381230
##   0.6250  0.87964701  3.139902  0.8301767  2.467758
##   0.6250  1.25358731  3.348394  0.8021712  2.598701
##   0.6250  1.78649065  3.571654  0.7897622  2.753671
##   0.6250  2.54593263  3.904405  0.7889278  3.021594
##   0.6250  3.62821544  4.383962  0.7694549  3.434458
##   0.6250  5.17057958  4.830374  0.8505127  3.785204
##   0.6250  7.36860686  5.384035  0.8505127  4.256578
##   0.6625  0.01253587  2.808143  0.8367779  2.218004
##   0.6625  0.01786490  2.808587  0.8367680  2.218382
##   0.6625  0.02545932  2.810390  0.8367274  2.219915
##   0.6625  0.03628215  2.812515  0.8366924  2.221587
##   0.6625  0.05170579  2.813311  0.8367675  2.221198
##   0.6625  0.07368606  2.818763  0.8369244  2.224155
##   0.6625  0.10501020  2.828684  0.8371590  2.230833
##   0.6625  0.14965031  2.846062  0.8372036  2.242979
##   0.6625  0.21326706  2.871705  0.8371823  2.260314
##   0.6625  0.30392747  2.900225  0.8378254  2.279960
##   0.6625  0.43312785  2.946502  0.8380297  2.320558
##   0.6625  0.61725166  3.023377  0.8363771  2.387910
##   0.6625  0.87964701  3.154302  0.8277673  2.478246
##   0.6625  1.25358731  3.368584  0.7966071  2.614438
##   0.6625  1.78649065  3.590096  0.7899351  2.770636
##   0.6625  2.54593263  3.947031  0.7881812  3.060239
##   0.6625  3.62821544  4.418681  0.7691816  3.463163
##   0.6625  5.17057958  4.876802  0.8505127  3.824284
##   0.6625  7.36860686  5.471718  0.8505127  4.337901
##   0.7000  0.01253587  2.805327  0.8370339  2.215584
##   0.7000  0.01786490  2.805991  0.8370187  2.216153
##   0.7000  0.02545932  2.807841  0.8369757  2.217734
##   0.7000  0.03628215  2.809664  0.8369587  2.219033
##   0.7000  0.05170579  2.810460  0.8370374  2.218631
##   0.7000  0.07368606  2.816744  0.8372127  2.222333
##   0.7000  0.10501020  2.826790  0.8374576  2.229175
##   0.7000  0.14965031  2.845121  0.8374421  2.241871
##   0.7000  0.21326706  2.872647  0.8372885  2.260673
##   0.7000  0.30392747  2.902068  0.8379044  2.282064
##   0.7000  0.43312785  2.950184  0.8379558  2.324821
##   0.7000  0.61725166  3.030835  0.8357282  2.394760
##   0.7000  0.87964701  3.169631  0.8249020  2.489637
##   0.7000  1.25358731  3.382166  0.7945995  2.625538
##   0.7000  1.78649065  3.610243  0.7900134  2.788116
##   0.7000  2.54593263  3.993346  0.7872439  3.105155
##   0.7000  3.62821544  4.452958  0.7691816  3.490820
##   0.7000  5.17057958  4.925737  0.8505127  3.865008
##   0.7000  7.36860686  5.560022  0.8470432  4.421610
##   0.7375  0.01253587  2.806084  0.8370874  2.216095
##   0.7375  0.01786490  2.806956  0.8370668  2.216846
##   0.7375  0.02545932  2.808854  0.8370214  2.218476
##   0.7375  0.03628215  2.810990  0.8369875  2.220138
##   0.7375  0.05170579  2.811789  0.8370699  2.219723
##   0.7375  0.07368606  2.817563  0.8372446  2.223106
##   0.7375  0.10501020  2.827921  0.8374820  2.230224
##   0.7375  0.14965031  2.847028  0.8374222  2.243361
##   0.7375  0.21326706  2.873630  0.8373885  2.261044
##   0.7375  0.30392747  2.904003  0.8379685  2.284202
##   0.7375  0.43312785  2.954046  0.8378424  2.329313
##   0.7375  0.61725166  3.038674  0.8349524  2.401717
##   0.7375  0.87964701  3.185857  0.8214998  2.501265
##   0.7375  1.25358731  3.394788  0.7931990  2.635884
##   0.7375  1.78649065  3.632341  0.7899960  2.807608
##   0.7375  2.54593263  4.043470  0.7861275  3.151807
##   0.7375  3.62821544  4.489353  0.7691816  3.521335
##   0.7375  5.17057958  4.978559  0.8505127  3.908374
##   0.7375  7.36860686  5.624510  0.8470432  4.483354
##   0.7750  0.01253587  2.806121  0.8370874  2.216265
##   0.7750  0.01786490  2.807192  0.8370614  2.217192
##   0.7750  0.02545932  2.809137  0.8370136  2.218870
##   0.7750  0.03628215  2.810986  0.8369976  2.220179
##   0.7750  0.05170579  2.811787  0.8370834  2.219751
##   0.7750  0.07368606  2.818329  0.8372762  2.223824
##   0.7750  0.10501020  2.829088  0.8374975  2.231271
##   0.7750  0.14965031  2.848895  0.8374000  2.244801
##   0.7750  0.21326706  2.874649  0.8374825  2.261705
##   0.7750  0.30392747  2.906012  0.8380174  2.286356
##   0.7750  0.43312785  2.958080  0.8376854  2.333860
##   0.7750  0.61725166  3.046958  0.8340321  2.408833
##   0.7750  0.87964701  3.203141  0.8174914  2.513262
##   0.7750  1.25358731  3.408440  0.7912533  2.647047
##   0.7750  1.78649065  3.656362  0.7898823  2.828429
##   0.7750  2.54593263  4.097422  0.7848274  3.200498
##   0.7750  3.62821544  4.528098  0.7691816  3.553928
##   0.7750  5.17057958  5.034403  0.8505127  3.956029
##   0.7750  7.36860686  5.692866  0.8470432  4.547892
##   0.8125  0.01253587  2.803613  0.8373580  2.213980
##   0.8125  0.01786490  2.804542  0.8373348  2.214789
##   0.8125  0.02545932  2.806535  0.8372846  2.216516
##   0.8125  0.03628215  2.808038  0.8372864  2.217506
##   0.8125  0.05170579  2.809027  0.8373786  2.217255
##   0.8125  0.07368606  2.816102  0.8375864  2.221801
##   0.8125  0.10501020  2.827459  0.8377726  2.229680
##   0.8125  0.14965031  2.847780  0.8376543  2.243488
##   0.8125  0.21326706  2.875707  0.8375701  2.262388
##   0.8125  0.30392747  2.908112  0.8380500  2.288540
##   0.8125  0.43312785  2.962318  0.8374834  2.338485
##   0.8125  0.61725166  3.055655  0.8329625  2.416070
##   0.8125  0.87964701  3.221402  0.8127968  2.525524
##   0.8125  1.25358731  3.420190  0.7904274  2.656405
##   0.8125  1.78649065  3.682400  0.7896725  2.850115
##   0.8125  2.54593263  4.154248  0.7717496  3.253884
##   0.8125  3.62821544  4.569097  0.7691816  3.588014
##   0.8125  5.17057958  5.094576  0.8505127  4.008771
##   0.8125  7.36860686  5.767344  0.7846698  4.617286
##   0.8500  0.01253587  2.804241  0.8374166  2.214419
##   0.8500  0.01786490  2.805362  0.8373880  2.215398
##   0.8500  0.02545932  2.807403  0.8373354  2.217173
##   0.8500  0.03628215  2.809338  0.8373169  2.218563
##   0.8500  0.05170579  2.810145  0.8374095  2.218238
##   0.8500  0.07368606  2.816822  0.8376174  2.222523
##   0.8500  0.10501020  2.828820  0.8377642  2.230820
##   0.8500  0.14965031  2.849625  0.8376274  2.244887
##   0.8500  0.21326706  2.876803  0.8376513  2.263079
##   0.8500  0.30392747  2.910290  0.8380662  2.290744
##   0.8500  0.43312785  2.966732  0.8372331  2.343162
##   0.8500  0.61725166  3.064817  0.8317211  2.423467
##   0.8500  0.87964701  3.240774  0.8073168  2.540153
##   0.8500  1.25358731  3.432332  0.7897672  2.665902
##   0.8500  1.78649065  3.710510  0.7893663  2.873132
##   0.8500  2.54593263  4.197032  0.7697098  3.294707
##   0.8500  3.62821544  4.612961  0.7691816  3.623927
##   0.8500  5.17057958  5.158489  0.8505127  4.064108
##   0.8500  7.36860686  5.793703  0.7846698  4.645065
##   0.8875  0.01253587  2.804344  0.8374166  2.214539
##   0.8875  0.01786490  2.805649  0.8373826  2.215684
##   0.8875  0.02545932  2.807737  0.8373275  2.217508
##   0.8875  0.03628215  2.809336  0.8373268  2.218591
##   0.8875  0.05170579  2.810272  0.8374248  2.218436
##   0.8875  0.07368606  2.817504  0.8376481  2.223205
##   0.8875  0.10501020  2.830147  0.8377548  2.231923
##   0.8875  0.14965031  2.851444  0.8375981  2.246251
##   0.8875  0.21326706  2.877939  0.8377257  2.263781
##   0.8875  0.30392747  2.912560  0.8380650  2.292977
##   0.8875  0.43312785  2.971353  0.8369326  2.347916
##   0.8875  0.61725166  3.074424  0.8303001  2.430997
##   0.8875  0.87964701  3.257843  0.8024569  2.552324
##   0.8875  1.25358731  3.444207  0.7897491  2.675389
##   0.8875  1.78649065  3.740764  0.7889648  2.900699
##   0.8875  2.54593263  4.220369  0.7694735  3.314077
##   0.8875  3.62821544  4.643972  0.8505127  3.651401
##   0.8875  5.17057958  5.227347  0.8505127  4.123029
##   0.8875  7.36860686  5.822766  0.7846698  4.674832
##   0.9250  0.01253587  2.801299  0.8377198  2.211822
##   0.9250  0.01786490  2.802782  0.8376804  2.213128
##   0.9250  0.02545932  2.804918  0.8376228  2.215001
##   0.9250  0.03628215  2.806261  0.8376398  2.215767
##   0.9250  0.05170579  2.807765  0.8377509  2.216124
##   0.9250  0.07368606  2.815078  0.8379817  2.221006
##   0.9250  0.10501020  2.828371  0.8380475  2.230145
##   0.9250  0.14965031  2.850167  0.8378696  2.244737
##   0.9250  0.21326706  2.879114  0.8377934  2.264503
##   0.9250  0.30392747  2.914913  0.8380461  2.295231
##   0.9250  0.43312785  2.976169  0.8365778  2.352733
##   0.9250  0.61725166  3.084519  0.8286760  2.438691
##   0.9250  0.87964701  3.272197  0.7986964  2.561713
##   0.9250  1.25358731  3.456751  0.7898843  2.684968
##   0.9250  1.78649065  3.773209  0.7884680  2.931625
##   0.9250  2.54593263  4.244859  0.7692341  3.334162
##   0.9250  3.62821544  4.676149  0.8505127  3.679663
##   0.9250  5.17057958  5.300834  0.8505127  4.185775
##   0.9250  7.36860686  5.854292  0.7846698  4.706273
##   0.9625  0.01253587  2.802824  0.8377388  2.213052
##   0.9625  0.01786490  2.804350  0.8376974  2.214401
##   0.9625  0.02545932  2.806534  0.8376373  2.216322
##   0.9625  0.03628215  2.807558  0.8376720  2.216802
##   0.9625  0.05170579  2.808378  0.8377743  2.216694
##   0.9625  0.07368606  2.815732  0.8380118  2.221658
##   0.9625  0.10501020  2.829679  0.8380359  2.231220
##   0.9625  0.14965031  2.851983  0.8378353  2.246190
##   0.9625  0.21326706  2.880329  0.8378540  2.265708
##   0.9625  0.30392747  2.917359  0.8380084  2.297878
##   0.9625  0.43312785  2.981190  0.8361667  2.357620
##   0.9625  0.61725166  3.095092  0.8268340  2.446532
##   0.9625  0.87964701  3.285857  0.7952665  2.570103
##   0.9625  1.25358731  3.470289  0.7899729  2.697466
##   0.9625  1.78649065  3.807943  0.7878782  2.963602
##   0.9625  2.54593263  4.268382  0.7691816  3.352738
##   0.9625  3.62821544  4.710300  0.8505127  3.709303
##   0.9625  5.17057958  5.380008  0.8505127  4.254790
##   0.9625  7.36860686  5.888812  0.7846698  4.739863
##   1.0000  0.01253587  2.802329  0.8377750  2.212591
##   1.0000  0.01786490  2.803822  0.8377338  2.213914
##   1.0000  0.02545932  2.806054  0.8376712  2.215885
##   1.0000  0.03628215  2.807558  0.8376817  2.216821
##   1.0000  0.05170579  2.808880  0.8377962  2.217171
##   1.0000  0.07368606  2.816358  0.8380416  2.222280
##   1.0000  0.10501020  2.830964  0.8380232  2.232267
##   1.0000  0.14965031  2.853783  0.8377985  2.247714
##   1.0000  0.21326706  2.881585  0.8379075  2.267114
##   1.0000  0.30392747  2.919893  0.8379515  2.300841
##   1.0000  0.43312785  2.986420  0.8356949  2.362580
##   1.0000  0.61725166  3.106174  0.8247510  2.454539
##   1.0000  0.87964701  3.294092  0.7945416  2.575282
##   1.0000  1.25358731  3.484862  0.7900147  2.710471
##   1.0000  1.78649065  3.844996  0.7871959  2.997113
##   1.0000  2.54593263  4.292627  0.7691816  3.371492
##   1.0000  3.62821544  4.746443  0.8505127  3.740299
##   1.0000  5.17057958  5.453007  0.8470432  4.317125
##   1.0000  7.36860686  5.926475  0.7846698  4.775683
## 
## RMSE was used to select the optimal model using the smallest value.
## The final values used for the model were alpha = 0.925 and lambda
##  = 0.01253587.

The summary output shows the model optimized using RMSE, and the optimal tuning values (at the minimum RMSE) were alpha = 0.925 and lambda = 0.0125, so the mix is mostly lasso (92.5%) with a small ridge component, and the penalty lambda is small. You can see the RMSE minimum on the plot. Alpha is on the horizontal axis and the different lambdas are shown as separate series.

plot(elnet.model, main = "Elastic Net Regression Parameter Tuning")

Make predictions on the validation data set and save the performance metrics for comparison to the other models.

(elnet.perf <- postResample(pred = predict(elnet.model, newdata = valid), 
                            obs = valid$mpg))
##      RMSE  Rsquared       MAE 
## 3.0860034 0.8252866 2.4747096

On average, this model will miss the true value of mpg by about 3.09 mpg (RMSE) or 2.47 mpg (MAE), essentially the same as lasso and slightly worse than ridge. The model explains about 83% of the variation in mpg, about the same as ridge and lasso.

Comments

Here are the results of the three regularization methods.

rbind(ridge.perf, lasso.perf, elnet.perf)
##                RMSE  Rsquared      MAE
## ridge.perf 2.994102 0.8251066 2.418320
## lasso.perf 3.085846 0.8253030 2.473863
## elnet.perf 3.086003 0.8252866 2.474710

Based on RMSE and MAE on the validation set, ridge was the winner today, with lasso and elastic net essentially tied just behind it; lasso pulled out a marginal victory on Rsquared. In general, lasso performs well when there are only a few significant parameters and ridge performs well when many parameters are significant. You won’t know which model is best in advance because you won’t know how many parameters are significant. Just try all three models.

You can also compare the models by resampling.

model.resamples <- resamples(list(Ridge = ridge.model,
                                  Lasso = lasso.model,
                                  ELNet = elnet.model))
summary(model.resamples)
## 
## Call:
## summary.resamples(object = model.resamples)
## 
## Models: Ridge, Lasso, ELNet 
## Number of resamples: 5 
## 
## MAE 
##           Min.  1st Qu.   Median     Mean  3rd Qu.     Max. NA's
## Ridge 1.672594 1.739341 2.270152 2.226618 2.666318 2.784682    0
## Lasso 1.723395 1.785833 2.333312 2.212591 2.548349 2.672063    0
## ELNet 1.723414 1.788598 2.329284 2.211822 2.546796 2.671019    0
## 
## RMSE 
##           Min.  1st Qu.   Median     Mean  3rd Qu.     Max. NA's
## Ridge 2.147348 2.345400 2.722408 2.842346 3.483502 3.513071    0
## Lasso 2.257599 2.323704 2.746668 2.802329 3.305164 3.378510    0
## ELNet 2.257534 2.326217 2.742413 2.801299 3.304119 3.376210    0
## 
## Rsquared 
##            Min.   1st Qu.    Median      Mean   3rd Qu.      Max. NA's
## Ridge 0.7524620 0.8141057 0.8574526 0.8335761 0.8694066 0.8744536    0
## Lasso 0.7703154 0.8136039 0.8574518 0.8377750 0.8717754 0.8757286    0
## ELNet 0.7703154 0.8136493 0.8574518 0.8377198 0.8715664 0.8756163    0

You want the smallest mean RMSE, and a small range of RMSEs. Lasso and elastic net had essentially the same mean, slightly smaller than ridge’s, and a relatively small range. Boxplots are a common way to visualize this information.

bwplot(model.resamples, metric = "RMSE", main = "Model Comparison on Resamples")

Now that you have identified the best-performing model, capture its tuning parameters and refit the model with those values held fixed.

set.seed(123)
final.model <- train(
  mpg ~ .,
  data = train,
  method = "glmnet",
  metric = "RMSE",  # Choose from RMSE, RSquared, AIC, BIC, ...others?
  preProcess = c("medianImpute",   # impute NAs with median (then try knnImpute!)
                 "center", "scale"),  # standardize data for linear models "it just works better."
  tuneGrid = data.frame(
    .alpha = lasso.model$bestTune$alpha,  # optimized hyperparameters
    .lambda = lasso.model$bestTune$lambda),  # optimized hyperparameters
  trControl = model.trControl
  )
final.model
## glmnet 
## 
## 21 samples
## 10 predictors
## 
## Pre-processing: median imputation (10), centered (10), scaled (10) 
## Resampling: Cross-Validated (5 fold) 
## Summary of sample sizes: 6, 3, 4, 3, 5 
## Resampling results:
## 
##   RMSE      Rsquared   MAE     
##   4.209802  0.6480416  3.416083
## 
## Tuning parameter 'alpha' was held constant at a value of 1
## 
## Tuning parameter 'lambda' was held constant at a value of 0

The model is ready to predict on new data!

Reference

Regularization: Ridge, Lasso and Elastic Net by Michael Oleszak in DataCamp. https://www.datacamp.com/community/tutorials/tutorial-ridge-lasso-elastic-net

Machine Learning Toolbox in DataCamp.