R Markdown

library(AppliedPredictiveModeling)
library(mlbench)
library(ggplot2)
library(dplyr)
## 
## Attaching package: 'dplyr'
## The following objects are masked from 'package:stats':
## 
##     filter, lag
## The following objects are masked from 'package:base':
## 
##     intersect, setdiff, setequal, union
library(corrplot)
## corrplot 0.94 loaded
library(purrr)
library(tidyr)
library(fpp3)
## Registered S3 method overwritten by 'tsibble':
##   method               from 
##   as_tibble.grouped_df dplyr
## ── Attaching packages ──────────────────────────────────────────── fpp3 1.0.0 ──
## ✔ tibble      3.2.1     ✔ feasts      0.3.2
## ✔ lubridate   1.9.3     ✔ fable       0.3.4
## ✔ tsibble     1.1.5     ✔ fabletools  0.4.2
## ✔ tsibbledata 0.4.1
## ── Conflicts ───────────────────────────────────────────────── fpp3_conflicts ──
## ✖ lubridate::date()    masks base::date()
## ✖ dplyr::filter()      masks stats::filter()
## ✖ tsibble::intersect() masks base::intersect()
## ✖ tsibble::interval()  masks lubridate::interval()
## ✖ dplyr::lag()         masks stats::lag()
## ✖ tsibble::setdiff()   masks base::setdiff()
## ✖ tsibble::union()     masks base::union()
library(forecast)
## Registered S3 method overwritten by 'quantmod':
##   method            from
##   as.zoo.data.frame zoo
library(latex2exp)
library(caret)
## Loading required package: lattice
## 
## Attaching package: 'caret'
## The following objects are masked from 'package:fabletools':
## 
##     MAE, RMSE
## The following object is masked from 'package:purrr':
## 
##     lift

6.2. Developing a model to predict permeability (see Sect. 1.4) could save significant resources for a pharmaceutical company, while at the same time more rapidly identifying molecules that have a sufficient permeability to become a drug:

(a) Start R and use these commands to load the data:

library(AppliedPredictiveModeling)
data(permeability)

The matrix fingerprints contains the 1,107 binary molecular predictors for the 165 compounds, while permeability contains the permeability response.

(b) The fingerprint predictors indicate the presence or absence of substructures of a molecule and are often sparse, meaning that relatively few of the molecules contain each substructure. Filter out the predictors that have low frequencies using the nearZeroVar function from the caret package. How many predictors are left for modeling?

dim(fingerprints)
## [1]  165 1107
fingerprints <- fingerprints[, -nearZeroVar(fingerprints)]

dim(fingerprints)
## [1] 165 388

Of the original 1,107 predictors, 388 remain for modeling after the near-zero-variance predictors are removed.
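
For reference, nearZeroVar can also report the frequency-ratio and percent-unique statistics it uses to flag predictors. A minimal sketch, meant to be run on the original matrix (i.e., before the reassignment above):

# diagnostics behind the near-zero-variance filter
nzv_stats <- nearZeroVar(fingerprints, saveMetrics = TRUE)
head(nzv_stats)    # freqRatio, percentUnique, zeroVar, nzv for each predictor
sum(nzv_stats$nzv) # number of predictors flagged as near-zero variance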

(c) Split the data into a training and a test set, pre-process the data, and tune a PLS model. How many latent variables are optimal and what is the corresponding resampled estimate of R2?

set.seed(624)

# index for training
index <- createDataPartition(permeability, p = .8, list = FALSE)

# train 
train_perm <- permeability[index, ]
train_fp <- fingerprints[index, ]
# test
test_perm <- permeability[-index, ]
test_fp <- fingerprints[-index, ]

# 10-fold cross-validation to make reasonable estimates
ctrl <- trainControl(method = "cv", number = 10)

plsTune <- train(train_fp, train_perm, method = "pls", metric = "Rsquared",
             tuneLength = 20, trControl = ctrl, preProc = c("center", "scale"))

plot(plsTune) 

plsTune
## Partial Least Squares 
## 
## 133 samples
## 388 predictors
## 
## Pre-processing: centered (388), scaled (388) 
## Resampling: Cross-Validated (10 fold) 
## Summary of sample sizes: 118, 119, 120, 120, 121, 120, ... 
## Resampling results across tuning parameters:
## 
##   ncomp  RMSE      Rsquared   MAE      
##    1     13.25656  0.2953172  10.202424
##    2     11.90191  0.4662745   8.710959
##    3     12.05579  0.4624570   9.271134
##    4     12.03297  0.4793759   9.314195
##    5     12.06645  0.4870447   8.986610
##    6     11.82964  0.5012972   8.788270
##    7     11.91363  0.5011672   9.153386
##    8     11.79990  0.4960881   9.119966
##    9     11.81946  0.4959475   9.301787
##   10     11.85288  0.4924486   9.176136
##   11     11.79654  0.5025443   9.128199
##   12     11.62869  0.5131115   8.965070
##   13     11.78348  0.5080595   8.920097
##   14     12.01377  0.4935108   9.101865
##   15     12.09297  0.4862359   9.131109
##   16     12.10087  0.4953868   9.053161
##   17     12.38093  0.4847366   9.218277
##   18     12.59348  0.4768971   9.402569
##   19     12.61895  0.4807222   9.338592
##   20     12.77045  0.4682401   9.549745
## 
## Rsquared was used to select the optimal model using the largest value.
## The final value used for the model was ncomp = 12.

The optimal tuning uses 12 latent variables (ncomp = 12), with a corresponding resampled R2 of 0.5131115.
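
The selected tuning parameter and its resampled metrics can also be pulled from the train object directly rather than read off the table; a small sketch using standard caret accessors:

plsTune$bestTune                                          # ncomp chosen by resampling
subset(plsTune$results, ncomp == plsTune$bestTune$ncomp)  # resampled RMSE, Rsquared, MAE at that ncomp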

(d) Predict the response for the test set. What is the test set estimate of R2?

fp_predict <- predict(plsTune, test_fp)

postResample(fp_predict, test_perm)
##       RMSE   Rsquared        MAE 
## 11.4895371  0.4741832  9.3113125

The test set estimate of R2 is 0.4741832.
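
For reference, postResample's Rsquared is the squared correlation between the predicted and observed values, so the same numbers should be reproducible by hand:

sqrt(mean((fp_predict - test_perm)^2)) # RMSE
cor(fp_predict, test_perm)^2           # Rsquared (squared correlation)
mean(abs(fp_predict - test_perm))      # MAE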

(e) Try building other models discussed in this chapter. Do any have better predictive performance?

#enet
set.seed(624)

# grid of penalties
enetGrid <- expand.grid(.lambda = c(0, 0.01, .1), .fraction = seq(.05, 1, length = 20))

# tuning penalized regression model
enetTune <- train(train_fp, train_perm, method = "enet",
                  tuneGrid = enetGrid, trControl = ctrl, preProc = c("center", "scale"))
## Warning: model fit failed for Fold07: lambda=0.00, fraction=1 Error in if (zmin < gamhat) { : missing value where TRUE/FALSE needed
## Warning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo,
## : There were missing values in resampled performance measures.
plot(enetTune)

enetTune
## Elasticnet 
## 
## 133 samples
## 388 predictors
## 
## Pre-processing: centered (388), scaled (388) 
## Resampling: Cross-Validated (10 fold) 
## Summary of sample sizes: 121, 118, 119, 121, 119, 120, ... 
## Resampling results across tuning parameters:
## 
##   lambda  fraction  RMSE      Rsquared   MAE      
##   0.00    0.05      12.53022  0.4195116   9.219840
##   0.00    0.10      11.83087  0.4482164   8.465318
##   0.00    0.15      11.94876  0.4475951   8.738893
##   0.00    0.20      11.98463  0.4484245   8.890257
##   0.00    0.25      11.79106  0.4640731   8.861128
##   0.00    0.30      11.66486  0.4742758   8.848355
##   0.00    0.35      11.72130  0.4749181   8.912993
##   0.00    0.40      11.90989  0.4695334   8.998669
##   0.00    0.45      12.30625  0.4529253   9.193462
##   0.00    0.50      12.68544  0.4362691   9.340897
##   0.00    0.55      13.00847  0.4214423   9.471613
##   0.00    0.60      13.29088  0.4089528   9.590155
##   0.00    0.65      13.44531  0.4004275   9.646439
##   0.00    0.70      13.61111  0.3907010   9.737890
##   0.00    0.75      13.82251  0.3811970   9.865902
##   0.00    0.80      13.94077  0.3755526   9.951804
##   0.00    0.85      14.00446  0.3726513  10.006198
##   0.00    0.90      14.15593  0.3660339  10.096242
##   0.00    0.95      14.35377  0.3580099  10.182112
##   0.00    1.00      14.48818  0.3547683  10.231509
##   0.01    0.05      12.91277  0.3962042   9.210662
##   0.01    0.10      14.49519  0.3998262  10.363051
##   0.01    0.15      16.02608  0.4115378  11.216868
##   0.01    0.20      17.63731  0.4189926  12.226957
##   0.01    0.25      19.54774  0.4096559  13.401832
##   0.01    0.30      21.65430  0.3895637  14.606572
##   0.01    0.35      23.60756  0.3748645  15.739320
##   0.01    0.40      25.60311  0.3611846  16.904536
##   0.01    0.45      27.57928  0.3504770  18.090540
##   0.01    0.50      29.55483  0.3408871  19.320808
##   0.01    0.55      31.52744  0.3320255  20.544216
##   0.01    0.60      33.47925  0.3280021  21.731405
##   0.01    0.65      35.46538  0.3231206  22.929380
##   0.01    0.70      37.43374  0.3183345  24.106466
##   0.01    0.75      39.37733  0.3156452  25.290183
##   0.01    0.80      41.31187  0.3139363  26.468389
##   0.01    0.85      43.29420  0.3110890  27.711822
##   0.01    0.90      45.37309  0.3072853  28.998914
##   0.01    0.95      47.41765  0.3041533  30.254470
##   0.01    1.00      49.32673  0.3034756  31.415359
##   0.10    0.05      12.55808  0.4101914   9.475823
##   0.10    0.10      12.00149  0.4328341   8.560342
##   0.10    0.15      12.08155  0.4311345   8.705317
##   0.10    0.20      12.22936  0.4267428   8.969534
##   0.10    0.25      12.11297  0.4349270   8.932117
##   0.10    0.30      12.04598  0.4378837   8.978154
##   0.10    0.35      12.04834  0.4369033   9.003314
##   0.10    0.40      12.08992  0.4341602   9.037805
##   0.10    0.45      12.16888  0.4291846   9.099988
##   0.10    0.50      12.25788  0.4233290   9.143842
##   0.10    0.55      12.32683  0.4187918   9.177390
##   0.10    0.60      12.41168  0.4130270   9.253065
##   0.10    0.65      12.47522  0.4078620   9.287724
##   0.10    0.70      12.53494  0.4039400   9.335168
##   0.10    0.75      12.58185  0.4016082   9.363731
##   0.10    0.80      12.61736  0.4001737   9.380885
##   0.10    0.85      12.65290  0.3987804   9.398010
##   0.10    0.90      12.69026  0.3974567   9.419637
##   0.10    0.95      12.73757  0.3953611   9.442814
##   0.10    1.00      12.77339  0.3939567   9.457362
## 
## RMSE was used to select the optimal model using the smallest value.
## The final values used for the model were fraction = 0.3 and lambda = 0.
enet_predict <- predict(enetTune, test_fp)

postResample(enet_predict, test_perm)
##       RMSE   Rsquared        MAE 
## 11.9836978  0.4157479  9.7314675

On the test set, the elastic net gives an R2 of 0.4157479 and an RMSE of 11.98, both slightly worse than the PLS model (R2 = 0.4741832, RMSE = 11.49).

#lars
set.seed(624)

larsTune <- train(train_fp, train_perm, method = "lars", metric = "Rsquared",
                    tuneLength = 20, trControl = ctrl, preProc = c("center", "scale"))

plot(larsTune)

larsTune
## Least Angle Regression 
## 
## 133 samples
## 388 predictors
## 
## Pre-processing: centered (388), scaled (388) 
## Resampling: Cross-Validated (10 fold) 
## Summary of sample sizes: 121, 118, 119, 121, 119, 120, ... 
## Resampling results across tuning parameters:
## 
##   fraction  RMSE      Rsquared   MAE      
##   0.05      12.30086  0.4056660   9.001982
##   0.10      12.08986  0.4228748   9.100262
##   0.15      12.33176  0.4026815   9.271797
##   0.20      12.70958  0.3698133   9.280733
##   0.25      12.89994  0.3553727   9.406290
##   0.30      13.10546  0.3404480   9.608259
##   0.35      13.43131  0.3366925   9.918288
##   0.40      14.20635  0.3158513  10.558290
##   0.45      14.93059  0.2954834  11.010885
##   0.50      15.54776  0.2796546  11.404569
##   0.55      16.25747  0.2640206  11.901090
##   0.60      16.97790  0.2517277  12.392314
##   0.65      17.79551  0.2426847  12.947625
##   0.70      18.82949  0.2231103  13.605212
##   0.75      19.67668  0.2115004  14.065491
##   0.80      20.42285  0.2113425  14.529247
##   0.85      21.28116  0.2098379  15.015162
##   0.90      21.86257  0.2089126  15.394677
##   0.95      22.41030  0.2066592  15.798343
##   1.00      22.94251  0.2058338  16.195296
## 
## Rsquared was used to select the optimal model using the largest value.
## The final value used for the model was fraction = 0.1.
lars_predict <- predict(larsTune, test_fp)

postResample(lars_predict, test_perm)
##       RMSE   Rsquared        MAE 
## 11.6906258  0.4342625  9.6425013

Least Angle Regression also performs slightly worse than PLS on the test set: its R2 (0.4342625) is lower and its RMSE (11.69) is higher.

(f) Would you recommend any of your models to replace the permeability laboratory experiment?

Based on the test-set results, I would recommend the PLS model: it had the highest R2 and the lowest RMSE of the three models. That said, with a test-set R2 of only about 0.47, I would use it to screen candidate molecules rather than to fully replace the permeability laboratory experiment.
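
To back up that judgment visually, one option is an observed-versus-predicted plot for the test set; a sketch using ggplot2 (already loaded above):

data.frame(observed = as.numeric(test_perm),
           predicted = as.numeric(fp_predict)) %>%
  ggplot(aes(x = predicted, y = observed)) +
  geom_point() +
  geom_abline(slope = 1, intercept = 0, linetype = "dashed") + # perfect-prediction line
  labs(title = "PLS model: observed vs. predicted permeability (test set)")

Points scattered far from the dashed line, especially at high permeability, would argue against replacing the assay outright.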

6.3. A chemical manufacturing process for a pharmaceutical product was discussed in Sect. 1.4. In this problem, the objective is to understand the relationship between biological measurements of the raw materials (predictors), measurements of the manufacturing process (predictors), and the response of product yield. Biological predictors cannot be changed but can be used to assess the quality of the raw material before processing. On the other hand, manufacturing process predictors can be changed in the manufacturing process. Improving product yield by 1% will boost revenue by approximately one hundred thousand dollars per batch:

(a) Start R and use these commands to load the data:

library(AppliedPredictiveModeling)
data(ChemicalManufacturingProcess)

The matrix processPredictors contains the 57 predictors (12 describing the input biological material and 45 describing the process predictors) for the 176 manufacturing runs. yield contains the percent yield for each run.

(b) A small percentage of cells in the predictor set contain missing values. Use an imputation function to fill in these missing values (e.g., see Sect. 3.8).

#find missing values
sum(is.na(ChemicalManufacturingProcess))
## [1] 106
missing <- preProcess(ChemicalManufacturingProcess, method = "bagImpute")
Chemical <- predict(missing, ChemicalManufacturingProcess)

sum(is.na(Chemical))
## [1] 0

There are 106 missing values in ChemicalManufacturingProcess. Bagged trees were used to impute the data.
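
Before imputing, it can be useful to see which predictors actually contain the missing values; a quick sketch on the raw data:

na_counts <- colSums(is.na(ChemicalManufacturingProcess))
na_counts[na_counts > 0] # predictors with at least one missing cell, and how many

Other preProcess imputation methods (for example "knnImpute" or "medianImpute") could be swapped in the same way; bagged trees were used here because they can use the relationships between predictors when filling in values.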

(c) Split the data into a training and a test set, pre-process the data, and tune a model of your choice from this chapter. What is the optimal value of the performance metric?

# filtering low frequencies
Chemical <- Chemical[, -nearZeroVar(Chemical)]

set.seed(624)

# index for training
index <- createDataPartition(Chemical$Yield, p = .8, list = FALSE)

# train 
train_chem <- Chemical[index, ]

# test
test_chem <- Chemical[-index, ]

#pls

set.seed(624)

plsTune <- train(Yield ~ ., Chemical , method = "pls", 
             tuneLength = 20, trControl = ctrl, preProc = c("center", "scale"))

plot(plsTune) 

plsTune
## Partial Least Squares 
## 
## 176 samples
##  56 predictor
## 
## Pre-processing: centered (56), scaled (56) 
## Resampling: Cross-Validated (10 fold) 
## Summary of sample sizes: 160, 157, 158, 159, 158, 159, ... 
## Resampling results across tuning parameters:
## 
##   ncomp  RMSE      Rsquared   MAE     
##    1     1.436891  0.4568805  1.147590
##    2     1.872742  0.4711564  1.185897
##    3     1.292614  0.5633698  1.020010
##    4     1.480526  0.5381868  1.085319
##    5     1.707358  0.5156812  1.131007
##    6     1.821904  0.4903840  1.156300
##    7     2.006142  0.4802835  1.211850
##    8     2.092370  0.4622998  1.253598
##    9     2.220647  0.4485854  1.290999
##   10     2.322021  0.4410360  1.315081
##   11     2.446697  0.4264842  1.352393
##   12     2.475260  0.4206118  1.367989
##   13     2.464162  0.4197124  1.377783
##   14     2.418661  0.4227280  1.364288
##   15     2.375812  0.4242080  1.350175
##   16     2.368363  0.4267259  1.337946
##   17     2.386174  0.4254577  1.339398
##   18     2.376321  0.4271412  1.334877
##   19     2.400843  0.4255697  1.348122
##   20     2.422785  0.4231162  1.359970
## 
## RMSE was used to select the optimal model using the smallest value.
## The final value used for the model was ncomp = 3.

The optimal PLS tuning has 3 components, with a resampled R2 of 0.5633698.

#enet

set.seed(624)

enetTune <- train(Yield ~ ., Chemical , method = "enet", 
                  tuneGrid = enetGrid, trControl = ctrl, preProc = c("center", "scale"))
plot(enetTune)

enetTune
## Elasticnet 
## 
## 176 samples
##  56 predictor
## 
## Pre-processing: centered (56), scaled (56) 
## Resampling: Cross-Validated (10 fold) 
## Summary of sample sizes: 160, 157, 158, 159, 158, 159, ... 
## Resampling results across tuning parameters:
## 
##   lambda  fraction  RMSE      Rsquared   MAE      
##   0.00    0.05      1.263954  0.6225860  1.0294423
##   0.00    0.10      1.171467  0.6195893  0.9388695
##   0.00    0.15      1.285779  0.6043223  0.9537550
##   0.00    0.20      1.455526  0.5547255  1.0089814
##   0.00    0.25      1.743737  0.4948981  1.1067214
##   0.00    0.30      1.939587  0.4680604  1.1795594
##   0.00    0.35      1.908244  0.4619370  1.1924860
##   0.00    0.40      1.850254  0.4606806  1.1877837
##   0.00    0.45      1.794856  0.4594057  1.1771407
##   0.00    0.50      2.101187  0.4481459  1.2573220
##   0.00    0.55      2.407435  0.4725106  1.3304398
##   0.00    0.60      2.744013  0.5217990  1.4001566
##   0.00    0.65      3.225543  0.4863491  1.5339631
##   0.00    0.70      3.771484  0.4747361  1.6693508
##   0.00    0.75      4.094773  0.4681231  1.7557563
##   0.00    0.80      4.135408  0.4589587  1.7764653
##   0.00    0.85      4.225706  0.4419317  1.8076852
##   0.00    0.90      4.318549  0.4257147  1.8358006
##   0.00    0.95      4.415063  0.4120784  1.8636370
##   0.00    1.00      4.501021  0.4019904  1.8870890
##   0.01    0.05      1.536442  0.5973476  1.2453271
##   0.01    0.10      1.309336  0.6258493  1.0618555
##   0.01    0.15      1.201950  0.6195671  0.9778981
##   0.01    0.20      1.176854  0.6175939  0.9523896
##   0.01    0.25      1.161852  0.6236987  0.9319051
##   0.01    0.30      1.175811  0.6198597  0.9301295
##   0.01    0.35      1.246551  0.6032199  0.9540424
##   0.01    0.40      1.296291  0.5952774  0.9647967
##   0.01    0.45      1.362599  0.5626257  0.9909640
##   0.01    0.50      1.460207  0.5489330  1.0197566
##   0.01    0.55      1.600805  0.5172511  1.0731396
##   0.01    0.60      1.793769  0.4868596  1.1322410
##   0.01    0.65      1.885961  0.4712944  1.1672050
##   0.01    0.70      1.953361  0.4609687  1.1927301
##   0.01    0.75      2.016199  0.4534714  1.2156241
##   0.01    0.80      2.076369  0.4480579  1.2356802
##   0.01    0.85      2.143715  0.4433120  1.2568727
##   0.01    0.90      2.179751  0.4401770  1.2700016
##   0.01    0.95      2.129338  0.4398117  1.2610389
##   0.01    1.00      2.043851  0.4421279  1.2426724
##   0.10    0.05      1.649315  0.5363459  1.3354459
##   0.10    0.10      1.487591  0.6095911  1.2076167
##   0.10    0.15      1.354079  0.6236486  1.0996757
##   0.10    0.20      1.258077  0.6219261  1.0220867
##   0.10    0.25      1.200809  0.6201101  0.9778990
##   0.10    0.30      1.181463  0.6179041  0.9615112
##   0.10    0.35      1.171335  0.6193380  0.9468400
##   0.10    0.40      1.165095  0.6229576  0.9424421
##   0.10    0.45      1.167185  0.6235463  0.9390511
##   0.10    0.50      1.212130  0.6118562  0.9527442
##   0.10    0.55      1.310136  0.6005375  0.9787766
##   0.10    0.60      1.383384  0.5963431  0.9968869
##   0.10    0.65      1.442303  0.5805640  1.0192090
##   0.10    0.70      1.511715  0.5573960  1.0504530
##   0.10    0.75      1.598893  0.5315716  1.0836263
##   0.10    0.80      1.707558  0.5079105  1.1181461
##   0.10    0.85      1.772034  0.4960139  1.1387951
##   0.10    0.90      1.812766  0.4895472  1.1528253
##   0.10    0.95      1.846664  0.4841682  1.1650245
##   0.10    1.00      1.873437  0.4803403  1.1741747
## 
## RMSE was used to select the optimal model using the smallest value.
## The final values used for the model were fraction = 0.25 and lambda = 0.01.

The optimal elastic net model has fraction = 0.25 and lambda = 0.01, with a resampled R2 of 0.6236987.

#lars 

set.seed(624)

larsTune <- train(Yield ~ ., Chemical , method = "lars", metric = "Rsquared",
                    tuneLength = 20, trControl = ctrl, preProc = c("center", "scale"))

plot(larsTune)

larsTune
## Least Angle Regression 
## 
## 176 samples
##  56 predictor
## 
## Pre-processing: centered (56), scaled (56) 
## Resampling: Cross-Validated (10 fold) 
## Summary of sample sizes: 160, 157, 158, 159, 158, 159, ... 
## Resampling results across tuning parameters:
## 
##   fraction  RMSE      Rsquared   MAE     
##   0.05      1.268059  0.6252093  1.037325
##   0.10      1.157751  0.6229682  0.936117
##   0.15      1.163964  0.6233226  0.922333
##   0.20      1.425304  0.5567018  1.005712
##   0.25      1.717206  0.4959408  1.101890
##   0.30      1.936760  0.4676172  1.176116
##   0.35      1.922115  0.4611422  1.188942
##   0.40      1.875680  0.4585226  1.187174
##   0.45      1.838067  0.4577825  1.183074
##   0.50      1.731858  0.4637976  1.161242
##   0.55      1.508554  0.4924491  1.104212
##   0.60      1.286634  0.5676612  1.027343
##   0.65      1.637094  0.4985405  1.138608
##   0.70      2.069278  0.4782574  1.247884
##   0.75      2.478866  0.4698927  1.355965
##   0.80      2.854003  0.4595411  1.460073
##   0.85      3.231161  0.4425963  1.561937
##   0.90      3.619450  0.4262587  1.662857
##   0.95      4.056912  0.4124876  1.774808
##   1.00      4.501021  0.4019904  1.887089
## 
## Rsquared was used to select the optimal model using the largest value.
## The final value used for the model was fraction = 0.05.

The optimal LARS model has fraction = 0.05, with a resampled R2 of 0.6252093.

#lm
lm_model <- lm(Yield ~ ., Chemical)
summary(lm_model)
## 
## Call:
## lm(formula = Yield ~ ., data = Chemical)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -2.17844 -0.53656 -0.02842  0.50526  2.00415 
## 
## Coefficients: (1 not defined because of singularities)
##                          Estimate Std. Error t value Pr(>|t|)    
## (Intercept)             4.234e+00  8.608e+01   0.049  0.96085    
## BiologicalMaterial01    2.483e-01  3.342e-01   0.743  0.45900    
## BiologicalMaterial02   -1.120e-01  1.281e-01  -0.874  0.38375    
## BiologicalMaterial03    1.636e-01  2.354e-01   0.695  0.48843    
## BiologicalMaterial04   -1.044e-01  5.235e-01  -0.199  0.84233    
## BiologicalMaterial05    1.513e-01  1.061e-01   1.426  0.15641    
## BiologicalMaterial06    3.336e-03  3.014e-01   0.011  0.99119    
## BiologicalMaterial08    3.808e-01  6.358e-01   0.599  0.55034    
## BiologicalMaterial09   -8.180e-01  1.370e+00  -0.597  0.55162    
## BiologicalMaterial10    7.954e-02  1.367e+00   0.058  0.95370    
## BiologicalMaterial11   -8.954e-02  8.230e-02  -1.088  0.27874    
## BiologicalMaterial12    3.493e-01  6.346e-01   0.551  0.58300    
## ManufacturingProcess01  6.695e-02  9.596e-02   0.698  0.48672    
## ManufacturingProcess02  1.343e-02  4.311e-02   0.311  0.75601    
## ManufacturingProcess03 -3.377e+00  5.103e+00  -0.662  0.50934    
## ManufacturingProcess04  6.282e-02  2.940e-02   2.137  0.03464 *  
## ManufacturingProcess05  7.326e-04  3.859e-03   0.190  0.84974    
## ManufacturingProcess06  3.261e-02  4.341e-02   0.751  0.45401    
## ManufacturingProcess07 -1.810e-01  2.126e-01  -0.851  0.39623    
## ManufacturingProcess08 -6.282e-02  2.522e-01  -0.249  0.80374    
## ManufacturingProcess09  2.614e-01  1.812e-01   1.443  0.15176    
## ManufacturingProcess10 -1.166e-01  5.742e-01  -0.203  0.83950    
## ManufacturingProcess11  1.942e-01  7.132e-01   0.272  0.78590    
## ManufacturingProcess12  3.761e-05  1.013e-04   0.371  0.71120    
## ManufacturingProcess13 -2.670e-01  3.843e-01  -0.695  0.48859    
## ManufacturingProcess14  3.058e-04  1.115e-02   0.027  0.97816    
## ManufacturingProcess15  1.972e-03  8.903e-03   0.222  0.82506    
## ManufacturingProcess16 -4.937e-05  3.190e-04  -0.155  0.87728    
## ManufacturingProcess17 -1.402e-01  3.011e-01  -0.466  0.64240    
## ManufacturingProcess18  4.245e-03  4.450e-03   0.954  0.34211    
## ManufacturingProcess19 -2.233e-03  7.301e-03  -0.306  0.76021    
## ManufacturingProcess20 -4.517e-03  4.721e-03  -0.957  0.34062    
## ManufacturingProcess21         NA         NA      NA       NA    
## ManufacturingProcess22 -1.666e-02  4.209e-02  -0.396  0.69299    
## ManufacturingProcess23 -4.181e-02  8.289e-02  -0.504  0.61495    
## ManufacturingProcess24 -1.931e-02  2.340e-02  -0.825  0.41100    
## ManufacturingProcess25 -6.493e-03  1.365e-02  -0.476  0.63506    
## ManufacturingProcess26  6.101e-03  1.041e-02   0.586  0.55909    
## ManufacturingProcess27 -7.061e-03  7.781e-03  -0.907  0.36601    
## ManufacturingProcess28 -7.882e-02  3.094e-02  -2.547  0.01212 *  
## ManufacturingProcess29  1.393e+00  8.961e-01   1.555  0.12261    
## ManufacturingProcess30 -3.693e-01  6.233e-01  -0.592  0.55463    
## ManufacturingProcess31  4.783e-02  1.203e-01   0.398  0.69168    
## ManufacturingProcess32  3.333e-01  6.833e-02   4.877 3.34e-06 ***
## ManufacturingProcess33 -4.068e-01  1.286e-01  -3.164  0.00197 ** 
## ManufacturingProcess34 -1.496e+00  2.753e+00  -0.543  0.58792    
## ManufacturingProcess35 -1.879e-02  1.765e-02  -1.064  0.28926    
## ManufacturingProcess36  2.833e+02  3.132e+02   0.904  0.36765    
## ManufacturingProcess37 -6.935e-01  2.889e-01  -2.401  0.01789 *  
## ManufacturingProcess38 -1.900e-01  2.417e-01  -0.786  0.43333    
## ManufacturingProcess39  7.077e-02  1.307e-01   0.542  0.58907    
## ManufacturingProcess40  4.605e-01  6.545e+00   0.070  0.94403    
## ManufacturingProcess41  2.549e-01  4.736e+00   0.054  0.95716    
## ManufacturingProcess42  4.372e-02  2.102e-01   0.208  0.83557    
## ManufacturingProcess43  2.268e-01  1.182e-01   1.919  0.05741 .  
## ManufacturingProcess44 -4.385e-01  1.186e+00  -0.370  0.71222    
## ManufacturingProcess45  9.547e-01  5.444e-01   1.754  0.08204 .  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 1.039 on 120 degrees of freedom
## Multiple R-squared:  0.7826, Adjusted R-squared:  0.683 
## F-statistic: 7.854 on 55 and 120 DF,  p-value: < 2.2e-16

The ordinary linear regression model has a Multiple R2 of 0.7826 and an Adjusted R2 of 0.683.

#ridge
set.seed(624)

## Define the candidate set of values
ridgeGrid <- data.frame(.lambda = seq(0, .1, length = 15))

ridgeTune <- train(Yield ~ ., Chemical , method = "ridge",
                     tuneGrid = ridgeGrid, trControl = ctrl, preProc = c("center", "scale"))

plot(ridgeTune)

ridgeTune
## Ridge Regression 
## 
## 176 samples
##  56 predictor
## 
## Pre-processing: centered (56), scaled (56) 
## Resampling: Cross-Validated (10 fold) 
## Summary of sample sizes: 160, 157, 158, 159, 158, 159, ... 
## Resampling results across tuning parameters:
## 
##   lambda       RMSE      Rsquared   MAE     
##   0.000000000  4.501021  0.4019904  1.887089
##   0.007142857  1.951471  0.4493837  1.224896
##   0.014285714  2.093121  0.4429601  1.247997
##   0.021428571  2.098017  0.4475006  1.242244
##   0.028571429  2.077066  0.4519785  1.232565
##   0.035714286  2.050861  0.4560167  1.222873
##   0.042857143  2.024673  0.4596597  1.213933
##   0.050000000  1.999988  0.4629748  1.206382
##   0.057142857  1.977156  0.4660169  1.200121
##   0.064285714  1.956157  0.4688278  1.194388
##   0.071428571  1.936855  0.4714397  1.189526
##   0.078571429  1.919085  0.4738781  1.185097
##   0.085714286  1.902686  0.4761635  1.180984
##   0.092857143  1.887513  0.4783129  1.177421
##   0.100000000  1.873437  0.4803403  1.174175
## 
## RMSE was used to select the optimal model using the smallest value.
## The final value used for the model was lambda = 0.1.

The optimal ridge model has lambda = 0.1, with a resampled R2 of 0.4803403.

(d) Predict the response for the test set. What is the value of the performance metric and how does this compare with the resampled performance metric on the training set?

The ordinary linear regression model had the highest apparent R2, but with 56 correlated predictors and only 176 runs it is prone to overfitting (one coefficient could not even be estimated because of singularities), and its fit was not cross-validated. Among the cross-validated models, LARS had the highest resampled R2 (0.6252), so it was chosen for the test-set prediction.
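
Because the four tuned models were fit with the same trainControl object and the same seed, their cross-validation folds should line up, and caret's resamples() gives a side-by-side comparison (a sketch; the lm fit is excluded because it is not a train object):

model_resamples <- resamples(list(PLS = plsTune, ENet = enetTune,
                                  LARS = larsTune, Ridge = ridgeTune))
summary(model_resamples)                      # RMSE, Rsquared, MAE across the 10 folds
dotplot(model_resamples, metric = "Rsquared") # visual comparison of resampled R2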

The test-set R2 is 0.718, which is higher than the resampled training estimate of 0.625 (in part because the models in this section were tuned on the full Chemical data rather than on train_chem alone, so the test rows were not truly held out).

larspredict <- predict(larsTune, test_chem[ ,-1])

postResample(larspredict, test_chem[ ,1])
##     RMSE Rsquared      MAE 
## 1.399505 0.718109 1.095894

(e) Which predictors are most important in the model you have trained? Do either the biological or process predictors dominate the list?

varImp(larsTune)
## loess r-squared variable importance
## 
##   only 20 most important variables shown (out of 56)
## 
##                        Overall
## ManufacturingProcess32  100.00
## ManufacturingProcess13   90.02
## BiologicalMaterial06     84.56
## ManufacturingProcess36   76.03
## ManufacturingProcess17   74.88
## BiologicalMaterial03     73.53
## ManufacturingProcess09   70.37
## BiologicalMaterial12     67.98
## BiologicalMaterial02     65.33
## ManufacturingProcess31   60.38
## ManufacturingProcess06   58.03
## ManufacturingProcess33   49.39
## BiologicalMaterial11     48.11
## BiologicalMaterial04     47.13
## ManufacturingProcess11   42.47
## BiologicalMaterial08     41.88
## BiologicalMaterial01     39.14
## ManufacturingProcess12   33.02
## ManufacturingProcess30   32.91
## BiologicalMaterial09     32.41

The five most important predictors in the model are ManufacturingProcess32, ManufacturingProcess13, BiologicalMaterial06, ManufacturingProcess36, and ManufacturingProcess17. Process predictors dominate the list: among the top 20, the ratio of process to biological predictors is 11:9.
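
That 11:9 split can be verified programmatically from the importance scores; a small sketch (the grepl pattern assumes the naming convention visible in the output above):

imp <- varImp(larsTune)$importance
top20 <- rownames(imp)[order(imp$Overall, decreasing = TRUE)][1:20]
table(ifelse(grepl("^ManufacturingProcess", top20), "Process", "Biological"))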

(f) Explore the relationships between each of the top predictors and the response. How could this information be helpful in improving yield in future runs of the manufacturing process?

top10 <- varImp(larsTune)$importance %>%
  arrange(-Overall) %>%
  head(10)


Chemical %>%
  select(c("Yield", row.names(top10))) %>%
  cor() %>%
  corrplot()

According to the correlation plot, ManufacturingProcess32 has the strongest positive correlation with Yield, while three of the top ten predictors are negatively correlated with it. This information can guide future runs of the manufacturing process: the process predictors can be adjusted, so increasing the settings that are positively correlated with yield and reducing those that are negatively correlated should improve it. The biological predictors cannot be changed, but their relationships with yield can be used to assess the quality of incoming raw material before processing.
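
To look beyond pairwise correlations, the top predictors can be plotted against Yield directly; a sketch using tidyr and ggplot2 (both loaded above):

Chemical %>%
  select(c("Yield", row.names(top10))) %>%
  pivot_longer(-Yield, names_to = "predictor", values_to = "value") %>%
  ggplot(aes(x = value, y = Yield)) +
  geom_point(alpha = 0.5) +
  geom_smooth(method = "loess", se = FALSE) + # loess trend per predictor
  facet_wrap(~ predictor, scales = "free_x")

Monotone trends in these panels confirm the direction of the correlations; curved trends would suggest the relationships are not fully captured by a linear model.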