A. Start R and use these commands to load the data: library(AppliedPredictiveModeling) followed by data(permeability). The matrix fingerprints contains the 1,107 binary molecular predictors for the 165 compounds, while permeability contains the permeability response.
library(AppliedPredictiveModeling)
library(knitr)
data(permeability)
summary(permeability)
## permeability
## Min. : 0.06
## 1st Qu.: 1.55
## Median : 4.91
## Mean :12.24
## 3rd Qu.:15.47
## Max. :55.60
This pharmaceutical data set was used to develop a model for predicting compounds’ permeability. In short, permeability is the measure of a molecule’s ability to cross a membrane.
permeability : permeability values for each compound.
fingerprints : a matrix of binary fingerprint indicator variables.
# the response is right-skewed (mean 12.2 vs. median 4.9), as the histogram shows
hist(permeability)
summary(apply(fingerprints, 2, mean))
## Min. 1st Qu. Median Mean 3rd Qu. Max.
## 0.000000 0.006061 0.024242 0.154767 0.181818 1.000000
str(permeability)
## num [1:165, 1] 12.52 1.12 19.41 1.73 1.68 ...
## - attr(*, "dimnames")=List of 2
## ..$ : chr [1:165] "1" "2" "3" "4" ...
## ..$ : chr "permeability"
str(fingerprints)
## num [1:165, 1:1107] 0 0 0 0 0 0 0 0 0 0 ...
## - attr(*, "dimnames")=List of 2
## ..$ : chr [1:165] "1" "2" "3" "4" ...
## ..$ : chr [1:1107] "X1" "X2" "X3" "X4" ...
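The column means above quantify the sparsity: the median substructure is present in only about 2.4% of compounds. As a quick illustrative check (the 10% threshold here is arbitrary, not taken from the text), we can compute the fraction of predictors present in fewer than 10% of compounds:
# share of fingerprint predictors present in fewer than 10% of compounds
# (an illustrative threshold, not the criterion nearZeroVar uses)
mean(colMeans(fingerprints) < 0.10)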
B. The fingerprint predictors indicate the presence or absence of substructures of a molecule and are often sparse, meaning that relatively few of the molecules contain each substructure. Filter out the predictors that have low frequencies using the nearZeroVar function from the caret package. How many predictors are left for modeling?
library(caret)
near.zero.var <- nearZeroVar(fingerprints)
near.zero.var
## [1] 7 8 9 10 13 14 17 18 19 22 23 24 30 31
## [15] 32 33 34 45 77 81 82 83 84 85 89 90 91 92
## [29] 95 100 104 105 106 107 109 110 112 113 114 115 116 117
## [43] 119 120 122 123 124 128 131 132 134 135 136 137 139 140
## [57] 144 145 147 148 149 151 155 160 161 164 165 166 216 217
## [71] 218 219 220 222 243 252 259 273 275 277 282 283 287 288
## [85] 289 292 346 347 348 349 350 351 352 353 354 363 364 365
## [99] 369 375 379 384 391 393 397 399 402 404 405 407 408 409
## [113] 410 411 412 413 414 415 416 417 418 419 420 421 422 423
## [127] 424 425 426 427 428 429 430 431 432 433 434 435 436 437
## [141] 438 439 440 441 442 443 444 445 446 447 448 449 450 451
## [155] 452 453 454 455 456 457 458 459 460 461 462 463 464 465
## [169] 466 467 468 469 470 471 472 473 474 475 476 477 478 479
## [183] 480 481 482 483 484 485 486 487 488 489 490 491 492 493
## [197] 494 495 498 500 501 502 513 523 525 526 527 528 530 531
## [211] 532 533 534 535 536 537 538 539 540 541 542 543 544 545
## [225] 546 547 548 550 552 555 562 563 564 566 567 569 570 572
## [239] 575 578 579 580 581 582 583 584 585 586 587 588 589 596
## [253] 605 606 607 608 609 610 611 612 614 615 616 617 618 619
## [267] 620 622 623 624 625 626 627 628 629 630 631 632 633 634
## [281] 635 636 637 638 639 640 641 642 643 644 645 646 647 648
## [295] 649 650 651 652 653 654 655 656 657 658 659 660 661 662
## [309] 663 664 665 666 667 668 669 670 671 672 673 674 675 676
## [323] 677 678 680 681 682 683 684 685 686 687 688 689 690 691
## [337] 692 693 694 695 696 697 706 707 708 709 710 711 712 713
## [351] 714 715 716 717 718 720 721 722 723 724 725 726 727 728
## [365] 729 730 731 734 735 736 737 738 739 740 741 742 743 744
## [379] 745 746 747 748 749 756 757 758 759 760 761 762 763 764
## [393] 765 766 767 768 769 770 771 772 777 778 779 781 783 784
## [407] 785 786 787 788 789 790 791 794 796 797 799 802 803 804
## [421] 807 808 809 810 811 814 815 816 817 818 819 820 821 822
## [435] 823 824 825 826 827 828 829 830 831 832 833 834 835 836
## [449] 837 838 839 840 841 842 843 844 845 846 847 848 849 850
## [463] 851 852 853 854 855 856 857 858 859 860 861 862 863 864
## [477] 865 866 867 868 869 870 871 872 873 874 875 876 877 878
## [491] 879 880 881 882 883 884 885 886 887 888 889 890 891 892
## [505] 893 894 895 896 897 898 899 900 901 902 903 904 905 906
## [519] 907 908 909 910 911 912 913 914 915 916 917 918 919 920
## [533] 921 922 923 924 925 926 927 928 929 930 931 932 933 934
## [547] 935 936 937 938 939 940 941 942 943 944 945 946 947 948
## [561] 949 950 951 952 953 954 955 956 957 958 959 960 961 962
## [575] 963 964 965 966 967 968 969 970 971 972 973 974 975 976
## [589] 977 978 979 980 981 982 983 984 985 986 987 988 989 990
## [603] 991 992 993 994 995 996 997 998 999 1000 1001 1002 1003 1004
## [617] 1005 1006 1007 1008 1009 1010 1011 1012 1013 1014 1015 1016 1017 1018
## [631] 1019 1020 1021 1022 1023 1024 1025 1026 1027 1028 1029 1030 1031 1032
## [645] 1033 1034 1035 1036 1037 1038 1039 1040 1041 1042 1043 1044 1045 1046
## [659] 1047 1048 1049 1050 1051 1052 1053 1054 1055 1056 1057 1058 1059 1060
## [673] 1061 1062 1063 1064 1065 1066 1067 1068 1069 1070 1071 1072 1073 1074
## [687] 1075 1076 1077 1078 1079 1080 1081 1082 1083 1084 1085 1086 1087 1088
## [701] 1089 1090 1091 1092 1093 1094 1095 1096 1097 1098 1099 1100 1101 1102
## [715] 1103 1104 1105 1106 1107
length(near.zero.var)
## [1] 719
fingerprints.nz <- fingerprints[,-near.zero.var]
str(fingerprints.nz)
## num [1:165, 1:388] 0 0 0 0 0 0 0 0 0 0 ...
## - attr(*, "dimnames")=List of 2
## ..$ : chr [1:165] "1" "2" "3" "4" ...
## ..$ : chr [1:388] "X1" "X2" "X3" "X4" ...
nearZeroVar flags 719 predictors with zero or near-zero variance; after removing them, 388 predictors are left for modeling.
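To see why each predictor was flagged, nearZeroVar can return its filtering metrics (frequency ratio and percent of unique values) rather than column indices; a brief sketch:
# inspect the filtering criteria behind the near-zero-variance flags
nzv.metrics <- nearZeroVar(fingerprints, saveMetrics = TRUE)
head(nzv.metrics[nzv.metrics$nzv, ])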
C. Split the data into a training and a test set, pre-process the data, and tune a PLS model. How many latent variables are optimal and what is the corresponding resampled estimate of \(R^2\) ?
fingerprints.df <- as.data.frame(fingerprints)
fingerprints.df$permeability <- as.vector(permeability)
z.var <- nearZeroVar(fingerprints.df)
fingerprints.df <- fingerprints.df[,-z.var]
sample.size <- floor(0.8 * nrow(fingerprints.df))
train_ind <- sample(seq_len(nrow(fingerprints.df)), size = sample.size)
trainingData <- fingerprints.df[train_ind,]
testingData <- fingerprints.df[-train_ind,]
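Note that no random seed is set above, so this split is not reproducible. A seeded, outcome-stratified alternative using caret's createDataPartition (a sketch; the seed value 123 is arbitrary):
set.seed(123)  # any fixed seed makes the split reproducible
in.train <- createDataPartition(fingerprints.df$permeability, p = 0.8, list = FALSE)
trainingData <- fingerprints.df[in.train, ]
testingData <- fingerprints.df[-in.train, ]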
First, fitting a PLS model with the plsr function from the pls package. Note that this call fits on the full data and reports training variance explained; cross-validated tuning follows below with caret.
library(pls)
plsFit <- plsr(permeability ~ ., data = fingerprints.df)
summary(plsFit)
## Data: X dimension: 165 388
## Y dimension: 165 1
## Fit method: kernelpls
## Number of components considered: 164
## TRAINING: % variance explained
## 1 comps 2 comps 3 comps 4 comps 5 comps 6 comps
## X 29.17 42.33 49.34 52.71 60.37 66.18
## permeability 26.69 47.90 54.71 62.75 66.37 70.18
## 7 comps 8 comps 9 comps 10 comps 11 comps 12 comps
## X 68.87 71.80 73.87 75.68 78.30 80.31
## permeability 72.85 74.36 76.34 78.09 79.14 80.26
## 13 comps 14 comps 15 comps 16 comps 17 comps 18 comps
## X 82.70 84.13 85.17 86.76 87.79 89.11
## permeability 81.06 82.04 83.08 83.80 84.79 85.39
## 19 comps 20 comps 21 comps 22 comps 23 comps 24 comps
## X 90.06 90.8 91.34 91.91 92.52 92.84
## permeability 85.94 86.4 86.92 87.29 87.59 88.11
## 25 comps 26 comps 27 comps 28 comps 29 comps 30 comps
## X 93.39 93.88 94.14 94.48 94.70 94.95
## permeability 88.44 88.77 89.15 89.39 89.72 89.94
## 31 comps 32 comps 33 comps 34 comps 35 comps 36 comps
## X 95.18 95.45 95.69 95.93 96.19 96.44
## permeability 90.09 90.20 90.32 90.44 90.52 90.59
## 37 comps 38 comps 39 comps 40 comps 41 comps 42 comps
## X 96.64 96.84 97.08 97.22 97.38 97.58
## permeability 90.68 90.74 90.78 90.84 90.88 90.90
## 43 comps 44 comps 45 comps 46 comps 47 comps 48 comps
## X 97.72 97.85 97.97 98.09 98.17 98.29
## permeability 90.93 90.96 90.99 91.01 91.03 91.04
## 49 comps 50 comps 51 comps 52 comps 53 comps 54 comps
## X 98.40 98.49 98.55 98.64 98.7 98.79
## permeability 91.06 91.07 91.08 91.09 91.1 91.10
## 55 comps 56 comps 57 comps 58 comps 59 comps 60 comps
## X 98.85 98.91 98.97 99.02 99.10 99.15
## permeability 91.11 91.11 91.12 91.13 91.13 91.14
## 61 comps 62 comps 63 comps 64 comps 65 comps 66 comps
## X 99.19 99.25 99.29 99.34 99.38 99.43
## permeability 91.14 91.15 91.15 91.16 91.16 91.16
## 67 comps 68 comps 69 comps 70 comps 71 comps 72 comps
## X 99.46 99.49 99.53 99.57 99.60 99.62
## permeability 91.16 91.16 91.16 91.16 91.16 91.16
## 73 comps 74 comps 75 comps 76 comps 77 comps 78 comps
## X 99.64 99.67 99.69 99.72 99.74 99.76
## permeability 91.17 91.17 91.17 91.17 91.17 91.17
## 79 comps 80 comps 81 comps 82 comps 83 comps 84 comps
## X 99.78 99.79 99.80 99.82 99.83 99.86
## permeability 91.17 91.17 91.17 91.17 91.17 91.17
## 85 comps 86 comps 87 comps 88 comps 89 comps 90 comps
## X 99.87 99.89 99.90 99.90 99.91 99.92
## permeability 91.17 91.17 91.17 91.17 91.17 91.17
## 91 comps 92 comps 93 comps 94 comps 95 comps 96 comps
## X 99.93 99.94 99.95 99.95 99.96 99.97
## permeability 91.17 91.17 91.17 91.17 91.17 91.17
## 97 comps 98 comps 99 comps 100 comps 101 comps
## X 99.97 99.98 99.98 99.99 99.99
## permeability 91.17 91.17 91.17 91.17 91.17
## 102 comps 103 comps 104 comps 105 comps 106 comps
## X 99.99 100.00 100.00 100.00 100.25
## permeability 91.17 91.17 91.17 91.17 91.17
## 107 comps 108 comps 109 comps 110 comps 111 comps
## X 100.51 100.76 101.02 101.27 101.52
## permeability 91.17 91.17 91.17 91.17 91.17
## 112 comps 113 comps 114 comps 115 comps 116 comps
## X 101.78 102.03 102.28 102.54 102.79
## permeability 91.17 91.17 91.17 91.17 91.17
## 117 comps 118 comps 119 comps 120 comps 121 comps
## X 103.05 103.30 103.55 103.81 104.06
## permeability 91.17 91.17 91.17 91.17 91.17
## 122 comps 123 comps 124 comps 125 comps 126 comps
## X 104.32 104.57 104.82 105.08 105.33
## permeability 91.17 91.17 91.17 91.17 91.17
## 127 comps 128 comps 129 comps 130 comps 131 comps
## X 105.58 105.84 106.09 106.35 106.60
## permeability 91.17 91.17 91.17 91.17 91.17
## 132 comps 133 comps 134 comps 135 comps 136 comps
## X 106.85 107.11 107.36 107.62 107.87
## permeability 91.17 91.17 91.17 91.17 91.17
## 137 comps 138 comps 139 comps 140 comps 141 comps
## X 108.12 108.38 108.63 108.88 109.14
## permeability 91.17 91.17 91.17 91.17 91.17
## 142 comps 143 comps 144 comps 145 comps 146 comps
## X 109.39 109.64 109.90 110.15 110.41
## permeability 91.17 91.17 91.17 91.17 91.17
## 147 comps 148 comps 149 comps 150 comps 151 comps
## X 110.66 110.91 111.17 111.42 111.68
## permeability 91.17 91.17 91.17 91.17 91.17
## 152 comps 153 comps 154 comps 155 comps 156 comps
## X 111.93 112.18 112.44 112.69 112.94
## permeability 91.17 91.17 91.17 91.17 91.17
## 157 comps 158 comps 159 comps 160 comps 161 comps
## X 113.20 113.45 113.70 113.96 114.21
## permeability 91.17 91.17 91.17 91.17 91.17
## 162 comps 163 comps 164 comps
## X 114.47 114.72 114.97
## permeability 91.17 91.17 91.17
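The pls package can also cross-validate directly; a minimal sketch (assuming the same fingerprints.df), where RMSEP reports the cross-validated error for each number of components:
plsFitCV <- plsr(permeability ~ ., data = fingerprints.df, validation = "CV")
# cross-validated RMSE as a function of the number of components
RMSEP(plsFitCV)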
Tuning the PLS model with caret using 10-fold cross-validation:
ctrl <- trainControl(method = "cv", number = 10)
# NOTE: this fit runs on the 33-row test split; tuning on trainingData would be
# the conventional choice and would change the resampled estimates below
X <- testingData[1:(length(testingData)-1)]
plsMethod <- train(X, testingData$permeability,
method = "pls",
tuneLength = 20,
trControl = ctrl,
preProc = c("center", "scale"))
plsMethod
## Partial Least Squares
##
## 33 samples
## 388 predictors
##
## Pre-processing: centered (388), scaled (388)
## Resampling: Cross-Validated (10 fold)
## Summary of sample sizes: 29, 29, 31, 29, 29, 31, ...
## Resampling results across tuning parameters:
##
## ncomp RMSE Rsquared MAE
## 1 13.53500 0.6279140 11.815554
## 2 11.48603 0.7465416 9.584754
## 3 12.05091 0.7645225 10.242449
## 4 12.73615 0.7378955 10.697076
## 5 13.50482 0.6745886 11.229611
## 6 13.30592 0.6981185 11.062638
## 7 13.45536 0.6647987 11.319391
## 8 13.38612 0.6357261 11.208354
## 9 13.22214 0.6358357 11.007189
## 10 13.01841 0.6516851 10.817281
## 11 13.37235 0.6332951 11.052051
## 12 13.60255 0.6305407 11.266179
## 13 14.15938 0.6176653 11.706129
## 14 14.78292 0.6024111 12.267921
## 15 15.19304 0.5823713 12.734724
## 16 15.43440 0.5818953 12.915601
## 17 15.50335 0.5744263 12.941436
## 18 15.66640 0.5710076 13.065728
## 19 15.86698 0.5755740 13.117679
## 20 15.81150 0.5831844 12.972991
##
## RMSE was used to select the optimal model using the smallest value.
## The final value used for the model was ncomp = 2.
plot(plsMethod)
The optimal model uses two latent variables, with a corresponding resampled estimate of \(R^2\) of about 0.75.
D. Predict the response for the test set. What is the test set estimate of \(R^2\) ?
# Using our testing data set
# Splitting the test set into X predictors and Y response variable
X_test <- testingData[1:(length(testingData)-1)]
Y_test <- testingData[length(testingData)]
perm_predict <- predict(plsMethod, X_test, ncomp = 2)
# Displaying the predicted permeability values for the test set
print(perm_predict)
## [1] 27.73448622 -0.95184101 30.41234385 11.14259126 9.49469674
## [6] 8.37481643 4.95711940 9.93082123 3.17458663 4.53379284
## [11] 13.65721067 14.33205659 41.67577587 12.78377117 12.17474011
## [16] 12.79893565 12.00669542 32.12378038 15.04759864 -4.05436210
## [21] 29.85750071 -1.04089483 17.69665156 13.73476941 49.58358833
## [26] 0.02398996 -0.66669017 10.60281460 4.09101907 -0.81166428
## [31] 12.90579945 5.16175920 -3.88575902
RMSE <- function(m, o) {
sqrt(mean((m-o)^2))
}
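This helper computes \(\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(m_i - o_i)^2}\) for predictions m against observations o. Applying it to the PLS predictions should reproduce the RMSE that defaultSummary reports below:
RMSE(perm_predict, Y_test$permeability)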
pls.eval = data.frame(obs = Y_test$permeability, pred=perm_predict)
defaultSummary(pls.eval)
## RMSE Rsquared MAE
## 9.288478 0.650053 6.477876
The test set estimate of \(R^2\) is about 0.65.
plot(Y_test$permeability, perm_predict, main="Test Dataset", xlab="Observed", ylab="Tuned PLS Predicted")
abline(0,1,col="red")
E. Try building other models discussed in this chapter. Do any have better predictive performance?
X_train <- trainingData[1:(length(trainingData)-1)]
Y_train <- trainingData$permeability
lm_perm <- train(X_train, Y_train,
method = "lm",
trControl = ctrl,
preProc = c("center", "scale"))
summary(lm_perm)
##
## Call:
## lm(formula = .outcome ~ ., data = dat)
##
## Residuals:
## Min 1Q Median 3Q Max
## -15.117 -1.667 0.000 1.532 19.632
##
## Coefficients: (295 not defined because of singularities)
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 12.201326 0.730440 16.704 <2e-16 ***
## X1 13.834999 11.733672 1.179 0.2457
## X2 21.464236 20.648833 1.039 0.3051
## X3 NA NA NA NA
## X4 NA NA NA NA
## X5 NA NA NA NA
## X6 3.656612 1.382918 2.644 0.0118 *
## X11 4.668439 6.489324 0.719 0.4763
## X12 5.489597 7.323174 0.750 0.4581
## X15 -0.538502 8.811724 -0.061 0.9516
## X16 8.500228 6.936305 1.225 0.2279
## X20 NA NA NA NA
## X21 NA NA NA NA
## X25 -6.059902 4.365420 -1.388 0.1732
## X26 NA NA NA NA
## X27 NA NA NA NA
## X28 NA NA NA NA
## X29 NA NA NA NA
## X35 -5.658394 7.653079 -0.739 0.4642
## X36 -1.235050 4.741452 -0.260 0.7959
## X37 NA NA NA NA
## X38 NA NA NA NA
## X39 NA NA NA NA
## X40 NA NA NA NA
## X41 -13.510669 10.486201 -1.288 0.2054
## X42 NA NA NA NA
## X43 NA NA NA NA
## X44 NA NA NA NA
## X46 NA NA NA NA
## X47 NA NA NA NA
## X48 -18.573614 22.609233 -0.822 0.4165
## X49 NA NA NA NA
## X50 NA NA NA NA
## X51 NA NA NA NA
## X52 NA NA NA NA
## X53 NA NA NA NA
## X54 NA NA NA NA
## X55 NA NA NA NA
## X56 NA NA NA NA
## X57 NA NA NA NA
## X58 NA NA NA NA
## X59 NA NA NA NA
## X60 NA NA NA NA
## X61 NA NA NA NA
## X62 NA NA NA NA
## X63 NA NA NA NA
## X64 NA NA NA NA
## X65 NA NA NA NA
## X66 NA NA NA NA
## X67 NA NA NA NA
## X68 NA NA NA NA
## X69 NA NA NA NA
## X70 NA NA NA NA
## X71 NA NA NA NA
## X72 NA NA NA NA
## X73 NA NA NA NA
## X74 NA NA NA NA
## X75 NA NA NA NA
## X76 NA NA NA NA
## X78 NA NA NA NA
## X79 NA NA NA NA
## X80 NA NA NA NA
## X86 21.064066 23.300738 0.904 0.3717
## X87 -42.008766 41.743498 -1.006 0.3206
## X88 -36.116832 21.512387 -1.679 0.1014
## X93 -1.975131 1.371537 -1.440 0.1580
## X94 3.348386 10.076152 0.332 0.7415
## X96 15.593719 11.444751 1.363 0.1811
## X97 NA NA NA NA
## X98 25.212296 24.533139 1.028 0.3106
## X99 2.422645 3.964607 0.611 0.5448
## X101 NA NA NA NA
## X102 54.647263 29.939438 1.825 0.0758 .
## X103 11.959718 11.743476 1.018 0.3149
## X108 NA NA NA NA
## X111 -28.695241 25.604586 -1.121 0.2694
## X118 -3.338429 3.917884 -0.852 0.3995
## X121 9.429614 16.655211 0.566 0.5746
## X125 4.802593 21.142408 0.227 0.8215
## X126 2.528163 7.353497 0.344 0.7329
## X127 NA NA NA NA
## X129 NA NA NA NA
## X130 NA NA NA NA
## X133 NA NA NA NA
## X138 NA NA NA NA
## X141 2.005725 1.635770 1.226 0.2277
## X142 NA NA NA NA
## X143 NA NA NA NA
## X146 -2.003843 12.407064 -0.162 0.8725
## X150 -19.973876 20.505723 -0.974 0.3362
## X152 NA NA NA NA
## X153 24.269598 26.848811 0.904 0.3717
## X154 NA NA NA NA
## X156 0.008679 9.061822 0.001 0.9992
## X157 21.682694 18.756074 1.156 0.2549
## X158 -11.963053 11.473862 -1.043 0.3037
## X159 6.564412 9.016197 0.728 0.4710
## X162 NA NA NA NA
## X163 NA NA NA NA
## X167 NA NA NA NA
## X168 NA NA NA NA
## X169 NA NA NA NA
## X170 NA NA NA NA
## X171 NA NA NA NA
## X172 NA NA NA NA
## X173 NA NA NA NA
## X174 NA NA NA NA
## X175 NA NA NA NA
## X176 NA NA NA NA
## X177 NA NA NA NA
## X178 NA NA NA NA
## X179 NA NA NA NA
## X180 NA NA NA NA
## X181 NA NA NA NA
## X182 -6.326705 13.016808 -0.486 0.6297
## X183 NA NA NA NA
## X184 NA NA NA NA
## X185 NA NA NA NA
## X186 NA NA NA NA
## X187 NA NA NA NA
## X188 NA NA NA NA
## X189 NA NA NA NA
## X190 NA NA NA NA
## X191 NA NA NA NA
## X192 NA NA NA NA
## X193 NA NA NA NA
## X194 NA NA NA NA
## X195 NA NA NA NA
## X196 NA NA NA NA
## X197 NA NA NA NA
## X198 NA NA NA NA
## X199 NA NA NA NA
## X200 NA NA NA NA
## X201 NA NA NA NA
## X202 NA NA NA NA
## X203 NA NA NA NA
## X204 NA NA NA NA
## X205 NA NA NA NA
## X206 NA NA NA NA
## X207 NA NA NA NA
## X208 NA NA NA NA
## X209 NA NA NA NA
## X210 NA NA NA NA
## X211 NA NA NA NA
## X212 NA NA NA NA
## X213 NA NA NA NA
## X214 NA NA NA NA
## X215 NA NA NA NA
## X221 -2.074925 12.304227 -0.169 0.8670
## X223 NA NA NA NA
## X224 NA NA NA NA
## X225 -1.910817 12.779890 -0.150 0.8819
## X226 -10.378450 5.874072 -1.767 0.0853 .
## X227 NA NA NA NA
## X228 NA NA NA NA
## X229 25.625690 23.414370 1.094 0.2806
## X230 -5.810589 28.839763 -0.201 0.8414
## X231 2.784588 6.126463 0.455 0.6520
## X232 NA NA NA NA
## X233 NA NA NA NA
## X234 NA NA NA NA
## X235 -16.771504 9.972703 -1.682 0.1008
## X236 -32.200308 24.486598 -1.315 0.1964
## X237 14.563627 13.824897 1.053 0.2988
## X238 16.095068 12.759270 1.261 0.2148
## X239 NA NA NA NA
## X240 NA NA NA NA
## X241 -44.085201 33.171029 -1.329 0.1918
## X242 -0.079533 17.417917 -0.005 0.9964
## X244 NA NA NA NA
## X245 NA NA NA NA
## X246 NA NA NA NA
## X247 10.842707 16.812896 0.645 0.5229
## X248 7.504517 19.287310 0.389 0.6994
## X249 NA NA NA NA
## X250 NA NA NA NA
## X251 NA NA NA NA
## X253 NA NA NA NA
## X254 NA NA NA NA
## X255 NA NA NA NA
## X256 NA NA NA NA
## X257 -20.535160 24.074728 -0.853 0.3990
## X258 -0.911920 4.394725 -0.208 0.8367
## X260 -1.029639 27.906753 -0.037 0.9708
## X261 NA NA NA NA
## X262 NA NA NA NA
## X263 NA NA NA NA
## X264 NA NA NA NA
## X265 NA NA NA NA
## X266 NA NA NA NA
## X267 NA NA NA NA
## X268 NA NA NA NA
## X269 NA NA NA NA
## X270 6.329786 17.610793 0.359 0.7213
## X271 NA NA NA NA
## X272 0.221458 9.603005 0.023 0.9817
## X274 NA NA NA NA
## X276 NA NA NA NA
## X278 0.044330 1.454983 0.030 0.9759
## X279 NA NA NA NA
## X280 0.883086 4.222721 0.209 0.8355
## X281 NA NA NA NA
## X284 NA NA NA NA
## X285 NA NA NA NA
## X286 NA NA NA NA
## X290 NA NA NA NA
## X291 NA NA NA NA
## X293 2.159571 4.460649 0.484 0.6311
## X294 6.031601 10.679083 0.565 0.5755
## X295 -8.460028 7.968742 -1.062 0.2951
## X296 NA NA NA NA
## X297 NA NA NA NA
## X298 NA NA NA NA
## X299 NA NA NA NA
## X300 NA NA NA NA
## X301 NA NA NA NA
## X302 NA NA NA NA
## X303 NA NA NA NA
## X304 NA NA NA NA
## X305 NA NA NA NA
## X306 -1.766998 3.442560 -0.513 0.6107
## X307 NA NA NA NA
## X308 NA NA NA NA
## X309 NA NA NA NA
## X310 NA NA NA NA
## X311 0.976782 7.740414 0.126 0.9002
## X312 15.299292 21.657918 0.706 0.4842
## X313 -19.661322 12.191072 -1.613 0.1151
## X314 NA NA NA NA
## X315 -0.108279 3.562865 -0.030 0.9759
## X316 -2.151000 9.576013 -0.225 0.8235
## X317 NA NA NA NA
## X318 NA NA NA NA
## X319 11.378339 18.196364 0.625 0.5355
## X320 -7.026621 12.257690 -0.573 0.5699
## X321 NA NA NA NA
## X322 -19.815227 19.529544 -1.015 0.3167
## X323 NA NA NA NA
## X324 NA NA NA NA
## X325 NA NA NA NA
## X326 NA NA NA NA
## X327 NA NA NA NA
## X328 NA NA NA NA
## X329 -5.998249 7.810725 -0.768 0.4473
## X330 NA NA NA NA
## X331 NA NA NA NA
## X332 NA NA NA NA
## X333 NA NA NA NA
## X334 4.506381 4.824594 0.934 0.3562
## X335 NA NA NA NA
## X336 NA NA NA NA
## X337 -8.396982 6.468497 -1.298 0.2021
## X338 21.485246 10.966360 1.959 0.0575 .
## X339 NA NA NA NA
## X340 -7.392453 4.995863 -1.480 0.1472
## X341 NA NA NA NA
## X342 16.083789 7.984789 2.014 0.0511 .
## X343 NA NA NA NA
## X344 NA NA NA NA
## X345 -1.837614 1.861091 -0.987 0.3297
## X355 NA NA NA NA
## X356 NA NA NA NA
## X357 -16.973893 10.955279 -1.549 0.1296
## X358 -12.339087 5.140001 -2.401 0.0214 *
## X359 -10.397825 8.123597 -1.280 0.2083
## X360 NA NA NA NA
## X361 -2.073047 3.234586 -0.641 0.5254
## X362 NA NA NA NA
## X366 NA NA NA NA
## X367 NA NA NA NA
## X368 NA NA NA NA
## X370 21.518380 18.434124 1.167 0.2504
## X371 -6.927469 17.694641 -0.392 0.6976
## X372 NA NA NA NA
## X373 NA NA NA NA
## X374 -4.575232 4.250700 -1.076 0.2886
## X376 0.245594 5.318452 0.046 0.9634
## X377 NA NA NA NA
## X378 NA NA NA NA
## X380 NA NA NA NA
## X381 NA NA NA NA
## X382 NA NA NA NA
## X383 NA NA NA NA
## X385 NA NA NA NA
## X386 NA NA NA NA
## X387 NA NA NA NA
## X388 NA NA NA NA
## X389 NA NA NA NA
## X390 NA NA NA NA
## X392 NA NA NA NA
## X394 NA NA NA NA
## X395 NA NA NA NA
## X396 NA NA NA NA
## X398 NA NA NA NA
## X400 NA NA NA NA
## X401 NA NA NA NA
## X403 NA NA NA NA
## X406 NA NA NA NA
## X496 -6.155974 11.466714 -0.537 0.5945
## X497 4.086848 4.271574 0.957 0.3447
## X499 NA NA NA NA
## X503 3.020280 7.545785 0.400 0.6912
## X504 NA NA NA NA
## X505 NA NA NA NA
## X506 NA NA NA NA
## X507 -7.920595 11.025582 -0.718 0.4769
## X508 NA NA NA NA
## X509 NA NA NA NA
## X510 NA NA NA NA
## X511 NA NA NA NA
## X512 NA NA NA NA
## X514 NA NA NA NA
## X515 NA NA NA NA
## X516 NA NA NA NA
## X517 NA NA NA NA
## X518 NA NA NA NA
## X519 NA NA NA NA
## X520 NA NA NA NA
## X521 NA NA NA NA
## X522 NA NA NA NA
## X524 NA NA NA NA
## X529 NA NA NA NA
## X549 NA NA NA NA
## X551 NA NA NA NA
## X553 NA NA NA NA
## X554 NA NA NA NA
## X556 NA NA NA NA
## X557 NA NA NA NA
## X558 NA NA NA NA
## X559 NA NA NA NA
## X560 NA NA NA NA
## X561 NA NA NA NA
## X565 NA NA NA NA
## X568 NA NA NA NA
## X571 NA NA NA NA
## X573 NA NA NA NA
## X574 NA NA NA NA
## X576 NA NA NA NA
## X577 NA NA NA NA
## X590 NA NA NA NA
## X591 NA NA NA NA
## X592 6.453555 3.236564 1.994 0.0534 .
## X593 NA NA NA NA
## X594 NA NA NA NA
## X595 NA NA NA NA
## X597 NA NA NA NA
## X598 NA NA NA NA
## X599 NA NA NA NA
## X600 NA NA NA NA
## X601 NA NA NA NA
## X602 NA NA NA NA
## X603 NA NA NA NA
## X604 NA NA NA NA
## X613 NA NA NA NA
## X621 NA NA NA NA
## X679 NA NA NA NA
## X698 1.142567 2.637509 0.433 0.6673
## X699 NA NA NA NA
## X700 NA NA NA NA
## X701 NA NA NA NA
## X702 NA NA NA NA
## X703 NA NA NA NA
## X704 0.843834 3.781683 0.223 0.8246
## X705 NA NA NA NA
## X719 NA NA NA NA
## X732 -0.410655 1.625016 -0.253 0.8019
## X733 NA NA NA NA
## X750 0.823378 2.614192 0.315 0.7545
## X751 NA NA NA NA
## X752 NA NA NA NA
## X753 NA NA NA NA
## X754 NA NA NA NA
## X755 NA NA NA NA
## X773 NA NA NA NA
## X774 NA NA NA NA
## X775 NA NA NA NA
## X776 NA NA NA NA
## X780 NA NA NA NA
## X782 NA NA NA NA
## X792 NA NA NA NA
## X793 NA NA NA NA
## X795 NA NA NA NA
## X798 NA NA NA NA
## X800 NA NA NA NA
## X801 NA NA NA NA
## X805 NA NA NA NA
## X806 NA NA NA NA
## X812 NA NA NA NA
## X813 NA NA NA NA
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 8.392 on 38 degrees of freedom
## Multiple R-squared: 0.9155, Adjusted R-squared: 0.7086
## F-statistic: 4.426 on 93 and 38 DF, p-value: 9.624e-07
lm_predict <- predict(lm_perm, X_test)
## Warning in predict.lm(modelFit, newdata): prediction from a rank-deficient
## fit may be misleading
pls.eval = data.frame(obs = Y_test$permeability, pred=lm_predict)
defaultSummary(pls.eval)
## RMSE Rsquared MAE
## 33.25844030 0.07634425 21.78797139
The rank-deficient warning reflects having far more predictors (388) than training samples, so an ordinary least squares fit is unreliable here and its test-set performance is poor. Penalized regression handles this situation better.
library(elasticnet)
ridgeGrid <- data.frame(.lambda = seq(0, .1, length=15))
ridge_perm <- train(X_train, Y_train,
method = "ridge",
trControl = ctrl,
tuneGrid = ridgeGrid,
preProc = c("center", "scale"))
summary(ridge_perm)
## Length Class Mode
## call 4 -none- call
## actions 205 -none- list
## allset 388 -none- numeric
## beta.pure 79540 -none- numeric
## vn 388 -none- character
## mu 1 -none- numeric
## normx 388 -none- numeric
## meanx 388 -none- numeric
## lambda 1 -none- numeric
## L1norm 205 -none- numeric
## penalty 205 -none- numeric
## df 205 -none- numeric
## Cp 205 -none- numeric
## sigma2 1 -none- numeric
## xNames 388 -none- character
## problemType 1 -none- character
## tuneValue 1 data.frame list
## obsLevels 1 -none- logical
## param 0 -none- list
# caret's predict.train uses the tuned final model; the s and mode arguments
# (which belong to elasticnet's own predict method) are redundant here
ridge.predict <- predict(ridge_perm, X_test, s = 1, mode = "fraction")
pls.eval = data.frame(obs = Y_test$permeability, pred=ridge.predict)
defaultSummary(pls.eval)
## RMSE Rsquared MAE
## 13.6366765 0.4755431 9.7728500
enetGrid <- expand.grid(.lambda = c(0, 0.01, .1),
.fraction = seq(.05, 1, length = 20))
enetTune <- train(X_train, Y_train,
method = "enet",
tuneGrid = enetGrid,
trControl = ctrl,
preProc = c("center", "scale"))
summary(enetTune)
## Length Class Mode
## call 4 -none- call
## actions 205 -none- list
## allset 388 -none- numeric
## beta.pure 79540 -none- numeric
## vn 388 -none- character
## mu 1 -none- numeric
## normx 388 -none- numeric
## meanx 388 -none- numeric
## lambda 1 -none- numeric
## L1norm 205 -none- numeric
## penalty 205 -none- numeric
## df 205 -none- numeric
## Cp 205 -none- numeric
## sigma2 1 -none- numeric
## xNames 388 -none- character
## problemType 1 -none- character
## tuneValue 2 data.frame list
## obsLevels 1 -none- logical
## param 0 -none- list
lasso_predict <- predict(enetTune, X_test)
pls.eval = data.frame(obs = Y_test, pred=lasso_predict)
colnames(pls.eval) <- c("obs", "pred")
defaultSummary(pls.eval)
## RMSE Rsquared MAE
## 13.199244 0.339427 8.550310
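Collecting the test-set metrics from the defaultSummary outputs above into a single table makes the comparison explicit:
# test-set performance of each model, copied from the outputs above
results <- rbind(
  PLS        = c(RMSE = 9.29,  Rsquared = 0.650, MAE = 6.48),
  Linear     = c(RMSE = 33.26, Rsquared = 0.076, MAE = 21.79),
  Ridge      = c(RMSE = 13.64, Rsquared = 0.476, MAE = 9.77),
  ElasticNet = c(RMSE = 13.20, Rsquared = 0.339, MAE = 8.55))
results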
F. Would you recommend any of your models to replace the permeability laboratory experiment?
The tuned PLS model has the best test-set RMSE (9.29) and \(R^2\) (0.65) of the models tried, so it would be the one to recommend as a candidate to replace the laboratory experiment.
A. Start R and use these commands to load the data.
This data set contains information about a chemical manufacturing process, in which the goal is to understand the relationship between the process and the resulting final product yield. Raw material in this process is put through a sequence of 27 steps to generate the final pharmaceutical product. The starting material is generated from a biological unit and has a range of quality and characteristics. The objective in this project was to develop a model to predict percent yield of the manufacturing process. The data set consisted of 177 samples of biological material for which 57 characteristics were measured.
#help("ChemicalManufacturingProcess")
data("ChemicalManufacturingProcess")
ChemicalManufacturingProcess: a data frame with columns for the outcome (Yield) and the predictors (BiologicalMaterial01 through BiologicalMaterial12 and ManufacturingProcess01 through ManufacturingProcess45).
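A quick structural check before summarizing (a sketch; the expected column count follows from the description above):
dim(ChemicalManufacturingProcess)                   # samples x (Yield + 57 predictors)
sum(!complete.cases(ChemicalManufacturingProcess))  # rows with at least one missing cell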
summary(ChemicalManufacturingProcess)
## Yield BiologicalMaterial01 BiologicalMaterial02
## Min. :35.25 Min. :4.580 Min. :46.87
## 1st Qu.:38.75 1st Qu.:5.978 1st Qu.:52.68
## Median :39.97 Median :6.305 Median :55.09
## Mean :40.18 Mean :6.411 Mean :55.69
## 3rd Qu.:41.48 3rd Qu.:6.870 3rd Qu.:58.74
## Max. :46.34 Max. :8.810 Max. :64.75
##
## BiologicalMaterial03 BiologicalMaterial04 BiologicalMaterial05
## Min. :56.97 Min. : 9.38 Min. :13.24
## 1st Qu.:64.98 1st Qu.:11.24 1st Qu.:17.23
## Median :67.22 Median :12.10 Median :18.49
## Mean :67.70 Mean :12.35 Mean :18.60
## 3rd Qu.:70.43 3rd Qu.:13.22 3rd Qu.:19.90
## Max. :78.25 Max. :23.09 Max. :24.85
##
## BiologicalMaterial06 BiologicalMaterial07 BiologicalMaterial08
## Min. :40.60 Min. :100.0 Min. :15.88
## 1st Qu.:46.05 1st Qu.:100.0 1st Qu.:17.06
## Median :48.46 Median :100.0 Median :17.51
## Mean :48.91 Mean :100.0 Mean :17.49
## 3rd Qu.:51.34 3rd Qu.:100.0 3rd Qu.:17.88
## Max. :59.38 Max. :100.8 Max. :19.14
##
## BiologicalMaterial09 BiologicalMaterial10 BiologicalMaterial11
## Min. :11.44 Min. :1.770 Min. :135.8
## 1st Qu.:12.60 1st Qu.:2.460 1st Qu.:143.8
## Median :12.84 Median :2.710 Median :146.1
## Mean :12.85 Mean :2.801 Mean :147.0
## 3rd Qu.:13.13 3rd Qu.:2.990 3rd Qu.:149.6
## Max. :14.08 Max. :6.870 Max. :158.7
##
## BiologicalMaterial12 ManufacturingProcess01 ManufacturingProcess02
## Min. :18.35 Min. : 0.00 Min. : 0.00
## 1st Qu.:19.73 1st Qu.:10.80 1st Qu.:19.30
## Median :20.12 Median :11.40 Median :21.00
## Mean :20.20 Mean :11.21 Mean :16.68
## 3rd Qu.:20.75 3rd Qu.:12.15 3rd Qu.:21.50
## Max. :22.21 Max. :14.10 Max. :22.50
## NA's :1 NA's :3
## ManufacturingProcess03 ManufacturingProcess04 ManufacturingProcess05
## Min. :1.47 Min. :911.0 Min. : 923.0
## 1st Qu.:1.53 1st Qu.:928.0 1st Qu.: 986.8
## Median :1.54 Median :934.0 Median : 999.2
## Mean :1.54 Mean :931.9 Mean :1001.7
## 3rd Qu.:1.55 3rd Qu.:936.0 3rd Qu.:1008.9
## Max. :1.60 Max. :946.0 Max. :1175.3
## NA's :15 NA's :1 NA's :1
## ManufacturingProcess06 ManufacturingProcess07 ManufacturingProcess08
## Min. :203.0 Min. :177.0 Min. :177.0
## 1st Qu.:205.7 1st Qu.:177.0 1st Qu.:177.0
## Median :206.8 Median :177.0 Median :178.0
## Mean :207.4 Mean :177.5 Mean :177.6
## 3rd Qu.:208.7 3rd Qu.:178.0 3rd Qu.:178.0
## Max. :227.4 Max. :178.0 Max. :178.0
## NA's :2 NA's :1 NA's :1
## ManufacturingProcess09 ManufacturingProcess10 ManufacturingProcess11
## Min. :38.89 Min. : 7.500 Min. : 7.500
## 1st Qu.:44.89 1st Qu.: 8.700 1st Qu.: 9.000
## Median :45.73 Median : 9.100 Median : 9.400
## Mean :45.66 Mean : 9.179 Mean : 9.386
## 3rd Qu.:46.52 3rd Qu.: 9.550 3rd Qu.: 9.900
## Max. :49.36 Max. :11.600 Max. :11.500
## NA's :9 NA's :10
## ManufacturingProcess12 ManufacturingProcess13 ManufacturingProcess14
## Min. : 0.0 Min. :32.10 Min. :4701
## 1st Qu.: 0.0 1st Qu.:33.90 1st Qu.:4828
## Median : 0.0 Median :34.60 Median :4856
## Mean : 857.8 Mean :34.51 Mean :4854
## 3rd Qu.: 0.0 3rd Qu.:35.20 3rd Qu.:4882
## Max. :4549.0 Max. :38.60 Max. :5055
## NA's :1 NA's :1
## ManufacturingProcess15 ManufacturingProcess16 ManufacturingProcess17
## Min. :5904 Min. : 0 Min. :31.30
## 1st Qu.:6010 1st Qu.:4561 1st Qu.:33.50
## Median :6032 Median :4588 Median :34.40
## Mean :6039 Mean :4566 Mean :34.34
## 3rd Qu.:6061 3rd Qu.:4619 3rd Qu.:35.10
## Max. :6233 Max. :4852 Max. :40.00
##
## ManufacturingProcess18 ManufacturingProcess19 ManufacturingProcess20
## Min. : 0 Min. :5890 Min. : 0
## 1st Qu.:4813 1st Qu.:6001 1st Qu.:4553
## Median :4835 Median :6022 Median :4582
## Mean :4810 Mean :6028 Mean :4556
## 3rd Qu.:4862 3rd Qu.:6050 3rd Qu.:4610
## Max. :4971 Max. :6146 Max. :4759
##
## ManufacturingProcess21 ManufacturingProcess22 ManufacturingProcess23
## Min. :-1.8000 Min. : 0.000 Min. :0.000
## 1st Qu.:-0.6000 1st Qu.: 3.000 1st Qu.:2.000
## Median :-0.3000 Median : 5.000 Median :3.000
## Mean :-0.1642 Mean : 5.406 Mean :3.017
## 3rd Qu.: 0.0000 3rd Qu.: 8.000 3rd Qu.:4.000
## Max. : 3.6000 Max. :12.000 Max. :6.000
## NA's :1 NA's :1
## ManufacturingProcess24 ManufacturingProcess25 ManufacturingProcess26
## Min. : 0.000 Min. : 0 Min. : 0
## 1st Qu.: 4.000 1st Qu.:4832 1st Qu.:6020
## Median : 8.000 Median :4855 Median :6047
## Mean : 8.834 Mean :4828 Mean :6016
## 3rd Qu.:14.000 3rd Qu.:4877 3rd Qu.:6070
## Max. :23.000 Max. :4990 Max. :6161
## NA's :1 NA's :5 NA's :5
## ManufacturingProcess27 ManufacturingProcess28 ManufacturingProcess29
## Min. : 0 Min. : 0.000 Min. : 0.00
## 1st Qu.:4560 1st Qu.: 0.000 1st Qu.:19.70
## Median :4587 Median :10.400 Median :19.90
## Mean :4563 Mean : 6.592 Mean :20.01
## 3rd Qu.:4609 3rd Qu.:10.750 3rd Qu.:20.40
## Max. :4710 Max. :11.500 Max. :22.00
## NA's :5 NA's :5 NA's :5
## ManufacturingProcess30 ManufacturingProcess31 ManufacturingProcess32
## Min. : 0.000 Min. : 0.00 Min. :143.0
## 1st Qu.: 8.800 1st Qu.:70.10 1st Qu.:155.0
## Median : 9.100 Median :70.80 Median :158.0
## Mean : 9.161 Mean :70.18 Mean :158.5
## 3rd Qu.: 9.700 3rd Qu.:71.40 3rd Qu.:162.0
## Max. :11.200 Max. :72.50 Max. :173.0
## NA's :5 NA's :5
## ManufacturingProcess33 ManufacturingProcess34 ManufacturingProcess35
## Min. :56.00 Min. :2.300 Min. :463.0
## 1st Qu.:62.00 1st Qu.:2.500 1st Qu.:490.0
## Median :64.00 Median :2.500 Median :495.0
## Mean :63.54 Mean :2.494 Mean :495.6
## 3rd Qu.:65.00 3rd Qu.:2.500 3rd Qu.:501.5
## Max. :70.00 Max. :2.600 Max. :522.0
## NA's :5 NA's :5 NA's :5
## ManufacturingProcess36 ManufacturingProcess37 ManufacturingProcess38
## Min. :0.01700 Min. :0.000 Min. :0.000
## 1st Qu.:0.01900 1st Qu.:0.700 1st Qu.:2.000
## Median :0.02000 Median :1.000 Median :3.000
## Mean :0.01957 Mean :1.014 Mean :2.534
## 3rd Qu.:0.02000 3rd Qu.:1.300 3rd Qu.:3.000
## Max. :0.02200 Max. :2.300 Max. :3.000
## NA's :5
## ManufacturingProcess39 ManufacturingProcess40 ManufacturingProcess41
## Min. :0.000 Min. :0.00000 Min. :0.00000
## 1st Qu.:7.100 1st Qu.:0.00000 1st Qu.:0.00000
## Median :7.200 Median :0.00000 Median :0.00000
## Mean :6.851 Mean :0.01771 Mean :0.02371
## 3rd Qu.:7.300 3rd Qu.:0.00000 3rd Qu.:0.00000
## Max. :7.500 Max. :0.10000 Max. :0.20000
## NA's :1 NA's :1
## ManufacturingProcess42 ManufacturingProcess43 ManufacturingProcess44
## Min. : 0.00 Min. : 0.0000 Min. :0.000
## 1st Qu.:11.40 1st Qu.: 0.6000 1st Qu.:1.800
## Median :11.60 Median : 0.8000 Median :1.900
## Mean :11.21 Mean : 0.9119 Mean :1.805
## 3rd Qu.:11.70 3rd Qu.: 1.0250 3rd Qu.:1.900
## Max. :12.10 Max. :11.0000 Max. :2.100
##
## ManufacturingProcess45
## Min. :0.000
## 1st Qu.:2.100
## Median :2.200
## Mean :2.138
## 3rd Qu.:2.300
## Max. :2.600
##
B. A small percentage of cells in the predictor set contain missing values. Use an imputation function to fill in these missing values (e.g., see Sec 3.8).
Counting the missing values in each column with is.na:
# avoid naming the result is.na, which would mask base R's is.na()
na.counts <- sort(colSums(is.na(ChemicalManufacturingProcess)))
na.counts[na.counts > 0]
## ManufacturingProcess01 ManufacturingProcess04 ManufacturingProcess05
## 1 1 1
## ManufacturingProcess07 ManufacturingProcess08 ManufacturingProcess12
## 1 1 1
## ManufacturingProcess14 ManufacturingProcess22 ManufacturingProcess23
## 1 1 1
## ManufacturingProcess24 ManufacturingProcess40 ManufacturingProcess41
## 1 1 1
## ManufacturingProcess06 ManufacturingProcess02 ManufacturingProcess25
## 2 3 5
## ManufacturingProcess26 ManufacturingProcess27 ManufacturingProcess28
## 5 5 5
## ManufacturingProcess29 ManufacturingProcess30 ManufacturingProcess31
## 5 5 5
## ManufacturingProcess33 ManufacturingProcess34 ManufacturingProcess35
## 5 5 5
## ManufacturingProcess36 ManufacturingProcess10 ManufacturingProcess11
## 5 9 10
## ManufacturingProcess03
## 15
Using the knnImputation function from the DMwR package to fill in the missing values:
library(DMwR)
# note: the data frame has 58 columns (Yield + 57 predictors), so selecting
# columns 1:57 keeps Yield but drops the last predictor, ManufacturingProcess45
knn.df <- knnImputation(ChemicalManufacturingProcess[, 1:57], k = 3, meth = "weighAvg")
anyNA(knn.df)
## [1] FALSE
anyNA confirms that no missing values remain in the data set after KNN imputation.
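caret offers an equivalent route via preProcess; a minimal sketch (note that caret's knnImpute also centers and scales the predictors as part of its distance computation):
# alternative imputation with caret; knnImpute centers/scales as a side effect
pp <- preProcess(ChemicalManufacturingProcess, method = "knnImpute")
imputed <- predict(pp, ChemicalManufacturingProcess)
anyNA(imputed)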
C. Split the data into a training and a test set, pre-process the data, and tune a model of your choice from this chapter. What is the optimal value of the performance metric?
near_zero <- nearZeroVar(knn.df)
knn.df <- knn.df[,-near_zero]
library(caret)
inTraining <- createDataPartition(knn.df$Yield, p = 0.80, list=FALSE)
training <- knn.df[ inTraining,]
testing <- knn.df[-inTraining,]
X <- training[,2:(length(training))]
Y <- training$Yield
X_test <- testing[,2:(length(testing))]
Y_test <- testing$Yield
Using 10-fold cross-validation repeated ten times:
fitControl <- trainControl(## 10-fold CV
method = "repeatedcv",
number = 10,
## repeated ten times
repeats = 10)
Now fitting a linear model to the training data:
lm.fit <- train(Yield ~ ., data = training,
method = "lm",
trControl = fitControl,
preProcess = c("center", "scale"))
lm.fit
## Linear Regression
##
## 144 samples
## 55 predictor
##
## Pre-processing: centered (55), scaled (55)
## Resampling: Cross-Validated (10 fold, repeated 10 times)
## Summary of sample sizes: 129, 128, 130, 129, 130, 130, ...
## Resampling results:
##
## RMSE Rsquared MAE
## 9.29491 0.4044213 3.290245
##
## Tuning parameter 'intercept' was held constant at a value of TRUE
Using the predict method to evaluate performance on the test set:
lm.predict <- predict(lm.fit, X_test)
## Warning in predict.lm(modelFit, newdata): prediction from a rank-deficient
## fit may be misleading
pls.eval = data.frame(obs = Y_test, pred=lm.predict)
defaultSummary(pls.eval)
## RMSE Rsquared MAE
## 1.3198930 0.4653987 1.1201833
Regularization via the elastic net (a lambda of 0 corresponds to the lasso):
enetGrid <- expand.grid(.lambda = c(0, 0.01, .1),
.fraction = seq(.05, 1, length = 20))
enetTune <- train(X, Y,
method = "enet",
tuneGrid = enetGrid,
trControl = ctrl,
preProc = c("center", "scale"))
enetTune
## Elasticnet
##
## 144 samples
## 55 predictor
##
## Pre-processing: centered (55), scaled (55)
## Resampling: Cross-Validated (10 fold)
## Summary of sample sizes: 128, 131, 130, 128, 130, 130, ...
## Resampling results across tuning parameters:
##
## lambda fraction RMSE Rsquared MAE
## 0.00 0.05 1.267568 0.5773498 1.0333933
## 0.00 0.10 1.245904 0.5771644 0.9909232
## 0.00 0.15 1.603025 0.5518980 1.1001737
## 0.00 0.20 2.017602 0.4968594 1.2122255
## 0.00 0.25 2.436960 0.4564900 1.3262209
## 0.00 0.30 2.615256 0.4501060 1.3730862
## 0.00 0.35 2.557211 0.4550027 1.3622971
## 0.00 0.40 2.625839 0.4298842 1.3911245
## 0.00 0.45 2.712819 0.4193243 1.4198133
## 0.00 0.50 2.610139 0.4338638 1.3853878
## 0.00 0.55 2.529627 0.4563470 1.3486733
## 0.00 0.60 2.470223 0.4520518 1.3376444
## 0.00 0.65 2.319759 0.4638618 1.3048197
## 0.00 0.70 2.605125 0.4097010 1.3979327
## 0.00 0.75 2.993461 0.3987008 1.5078069
## 0.00 0.80 3.466449 0.3934683 1.6402375
## 0.00 0.85 3.956242 0.3890953 1.7753019
## 0.00 0.90 4.493759 0.3857713 1.9207788
## 0.00 0.95 5.035470 0.3824601 2.0664589
## 0.00 1.00 5.495694 0.3793959 2.1894346
## 0.01 0.05 1.537730 0.5680963 1.2543244
## 0.01 0.10 1.311655 0.5812928 1.0677783
## 0.01 0.15 1.267092 0.5598840 1.0173916
## 0.01 0.20 1.260399 0.5738845 0.9985975
## 0.01 0.25 1.456039 0.5604871 1.0594364
## 0.01 0.30 1.556327 0.5572234 1.0851620
## 0.01 0.35 1.832595 0.5123193 1.1592086
## 0.01 0.40 2.068111 0.4952529 1.2239767
## 0.01 0.45 2.249060 0.4856299 1.2696817
## 0.01 0.50 2.373197 0.4767219 1.2988207
## 0.01 0.55 2.491371 0.4695680 1.3268874
## 0.01 0.60 2.559322 0.4645187 1.3447704
## 0.01 0.65 2.590099 0.4621749 1.3581962
## 0.01 0.70 2.604266 0.4594933 1.3705879
## 0.01 0.75 2.638845 0.4559254 1.3868735
## 0.01 0.80 2.831025 0.4441539 1.4432474
## 0.01 0.85 3.003629 0.4368871 1.4925628
## 0.01 0.90 3.175878 0.4317501 1.5408519
## 0.01 0.95 3.337160 0.4276237 1.5864057
## 0.01 1.00 3.483110 0.4242291 1.6278024
## 0.10 0.05 1.660825 0.5185046 1.3618525
## 0.10 0.10 1.496576 0.5727020 1.2211029
## 0.10 0.15 1.360328 0.5846325 1.1041772
## 0.10 0.20 1.280772 0.5654795 1.0463485
## 0.10 0.25 1.290977 0.5514714 1.0270911
## 0.10 0.30 1.296489 0.5583142 1.0111196
## 0.10 0.35 1.524950 0.5548659 1.0713126
## 0.10 0.40 1.621519 0.5549864 1.1035364
## 0.10 0.45 1.692705 0.5487950 1.1272976
## 0.10 0.50 1.813530 0.5232364 1.1624173
## 0.10 0.55 1.915764 0.5086601 1.1875928
## 0.10 0.60 2.025293 0.5019054 1.2121947
## 0.10 0.65 2.115212 0.4965486 1.2377294
## 0.10 0.70 2.208748 0.4878848 1.2685630
## 0.10 0.75 2.284164 0.4821458 1.2902155
## 0.10 0.80 2.337279 0.4784020 1.3055405
## 0.10 0.85 2.384429 0.4758255 1.3186464
## 0.10 0.90 2.455484 0.4732922 1.3391387
## 0.10 0.95 2.549578 0.4709969 1.3654884
## 0.10 1.00 2.635868 0.4688566 1.3884230
##
## RMSE was used to select the optimal model using the smallest value.
## The final values used for the model were fraction = 0.1 and lambda = 0.
lasso.predict <- predict(enetTune, X_test)
pls.eval = data.frame(obs = Y_test, pred=lasso.predict)
defaultSummary(pls.eval)
## RMSE Rsquared MAE
## 1.0689234 0.6177378 0.8395732
Ridge regression:
ridgeGrid <- data.frame(.lambda = seq(0, .1, length=15))
ridgeFit <- train(X, Y,
method = "ridge",
trControl = ctrl,
tuneGrid = ridgeGrid,
preProc = c("center", "scale"))
ridgeFit
## Ridge Regression
##
## 144 samples
## 55 predictor
##
## Pre-processing: centered (55), scaled (55)
## Resampling: Cross-Validated (10 fold)
## Summary of sample sizes: 131, 129, 131, 128, 129, 131, ...
## Resampling results across tuning parameters:
##
## lambda RMSE Rsquared MAE
## 0.000000000 9.467052 0.4927922 3.266832
## 0.007142857 4.457858 0.4253499 1.959005
## 0.014285714 3.778705 0.4450981 1.760447
## 0.021428571 3.429546 0.4563373 1.657652
## 0.028571429 3.203716 0.4639114 1.590590
## 0.035714286 3.041872 0.4696115 1.542654
## 0.042857143 2.918692 0.4742146 1.505713
## 0.050000000 2.821093 0.4781088 1.476507
## 0.057142857 2.741489 0.4815091 1.453700
## 0.064285714 2.675123 0.4845443 1.434822
## 0.071428571 2.618835 0.4872967 1.418701
## 0.078571429 2.570431 0.4898213 1.404730
## 0.085714286 2.528334 0.4921565 1.392485
## 0.092857143 2.491375 0.4943301 1.381654
## 0.100000000 2.458669 0.4963629 1.371997
##
## RMSE was used to select the optimal model using the smallest value.
## The final value used for the model was lambda = 0.1.
ridge_predict <- predict(ridgeFit, X_test, s = 1, mode = "fraction")
pls.eval = data.frame(obs = Y_test, pred=ridge_predict)
defaultSummary(pls.eval)
## RMSE Rsquared MAE
## 1.1938323 0.5483699 0.9773565
plsMethod <- train(X, Y,
method = "pls",
tuneLength = 20,
trControl = ctrl,
preProc = c("center", "scale"))
summary(plsMethod)
## Data: X dimension: 144 55
## Y dimension: 144 1
## Fit method: oscorespls
## Number of components considered: 1
## TRAINING: % variance explained
## 1 comps
## X 18.17
## .outcome 48.05
plot(plsMethod)
# caret's predict.train uses the tuned final model (1 component here); the
# ncomp argument is redundant
pls_predict <- predict(plsMethod, X_test, ncomp = 3)
pls.eval = data.frame(obs = Y_test, pred=pls_predict)
defaultSummary(pls.eval)
## RMSE Rsquared MAE
## 1.3936165 0.3466422 1.1501436
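Again collecting the test-set metrics from the defaultSummary outputs above:
# test-set performance of each model, copied from the outputs above
results <- rbind(
  Linear = c(RMSE = 1.320, Rsquared = 0.465, MAE = 1.120),
  Lasso  = c(RMSE = 1.069, Rsquared = 0.618, MAE = 0.840),
  Ridge  = c(RMSE = 1.194, Rsquared = 0.548, MAE = 0.977),
  PLS    = c(RMSE = 1.394, Rsquared = 0.347, MAE = 1.150))
results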
The lasso (the elastic net solution with lambda = 0 and fraction = 0.1) performs best, with the lowest test RMSE (1.07) and the highest \(R^2\) (0.62).
D. Predict the response for the test set. What is the value of the performance metric and how does this compare with the resampled performance metric on the training set?
Below are the predicted values. The lasso's test-set RMSE (1.07) is slightly better than its resampled training estimate (1.25 at fraction = 0.1, lambda = 0), so the model generalizes well to the held-out samples.
lasso.predict
## 12 24 26 37 42 44 54 57
## 42.45368 41.87467 36.89473 42.44589 40.50263 41.46830 41.05846 39.50399
## 58 60 62 68 72 80 83 98
## 40.04354 40.42921 39.04089 40.97283 41.34523 39.25583 40.71175 40.12717
## 99 101 106 110 111 112 115 127
## 38.26270 38.10791 39.04598 38.19135 39.45739 39.88781 39.97919 40.26538
## 131 133 139 140 162 168 169 171
## 40.42099 40.53798 39.48247 38.56638 39.72675 38.01674 39.91505 40.66493
E. Which predictors are most important in the model you have trained? Do either the biological or process predictors dominate the list?
We continue with enet for the lasso fit, using lambda = 0 (the pure lasso) and the tuned fraction of 0.1:
# refit the lasso path on the training predictors
lassoModel <- enet(x = as.matrix(X), y = Y,
lambda = 0.0, normalize = TRUE)
# extract the coefficients at 10% of the maximum L1 norm (the tuned fraction)
lassoCoef <- predict(lassoModel, newx = as.matrix(X_test),
s=.1, mode = "fraction", type = "coefficients")
sort(lassoCoef$coefficients)
## ManufacturingProcess36 ManufacturingProcess37 ManufacturingProcess13
## -2.346544e+02 -3.680208e-01 -1.576435e-01
## ManufacturingProcess03 ManufacturingProcess17 ManufacturingProcess07
## -1.334910e-01 -1.204765e-01 -4.940365e-02
## ManufacturingProcess23 ManufacturingProcess35 BiologicalMaterial01
## -9.149275e-03 -2.184691e-03 0.000000e+00
## BiologicalMaterial02 BiologicalMaterial04 BiologicalMaterial06
## 0.000000e+00 0.000000e+00 0.000000e+00
## BiologicalMaterial08 BiologicalMaterial09 BiologicalMaterial10
## 0.000000e+00 0.000000e+00 0.000000e+00
## BiologicalMaterial11 BiologicalMaterial12 ManufacturingProcess01
## 0.000000e+00 0.000000e+00 0.000000e+00
## ManufacturingProcess02 ManufacturingProcess05 ManufacturingProcess08
## 0.000000e+00 0.000000e+00 0.000000e+00
## ManufacturingProcess10 ManufacturingProcess11 ManufacturingProcess12
## 0.000000e+00 0.000000e+00 0.000000e+00
## ManufacturingProcess14 ManufacturingProcess16 ManufacturingProcess18
## 0.000000e+00 0.000000e+00 0.000000e+00
## ManufacturingProcess20 ManufacturingProcess21 ManufacturingProcess22
## 0.000000e+00 0.000000e+00 0.000000e+00
## ManufacturingProcess24 ManufacturingProcess25 ManufacturingProcess26
## 0.000000e+00 0.000000e+00 0.000000e+00
## ManufacturingProcess27 ManufacturingProcess28 ManufacturingProcess29
## 0.000000e+00 0.000000e+00 0.000000e+00
## ManufacturingProcess30 ManufacturingProcess31 ManufacturingProcess33
## 0.000000e+00 0.000000e+00 0.000000e+00
## ManufacturingProcess38 ManufacturingProcess39 ManufacturingProcess40
## 0.000000e+00 0.000000e+00 0.000000e+00
## ManufacturingProcess41 ManufacturingProcess44 ManufacturingProcess19
## 0.000000e+00 0.000000e+00 3.356058e-05
## ManufacturingProcess15 ManufacturingProcess04 BiologicalMaterial03
## 2.353684e-03 9.602038e-03 1.888770e-02
## ManufacturingProcess06 ManufacturingProcess43 BiologicalMaterial05
## 2.176282e-02 2.558353e-02 8.677082e-02
## ManufacturingProcess42 ManufacturingProcess32 ManufacturingProcess09
## 9.386690e-02 1.310633e-01 4.130517e-01
## ManufacturingProcess34
## 2.362190e+00
list.coef <- lassoCoef$coefficients
kable(sort(list.coef[list.coef != 0]))
| Predictor | Coefficient |
|---|---|
| ManufacturingProcess36 | -234.6544318 |
| ManufacturingProcess37 | -0.3680208 |
| ManufacturingProcess13 | -0.1576435 |
| ManufacturingProcess03 | -0.1334910 |
| ManufacturingProcess17 | -0.1204765 |
| ManufacturingProcess07 | -0.0494036 |
| ManufacturingProcess23 | -0.0091493 |
| ManufacturingProcess35 | -0.0021847 |
| ManufacturingProcess19 | 0.0000336 |
| ManufacturingProcess15 | 0.0023537 |
| ManufacturingProcess04 | 0.0096020 |
| BiologicalMaterial03 | 0.0188877 |
| ManufacturingProcess06 | 0.0217628 |
| ManufacturingProcess43 | 0.0255835 |
| BiologicalMaterial05 | 0.0867708 |
| ManufacturingProcess42 | 0.0938669 |
| ManufacturingProcess32 | 0.1310633 |
| ManufacturingProcess09 | 0.4130517 |
| ManufacturingProcess34 | 2.3621901 |
Manufacturing process predictors dominate the list: of the 19 predictors with non-zero lasso coefficients, only two (BiologicalMaterial03 and BiologicalMaterial05) are biological material variables.
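caret's generic varImp offers another view of predictor importance from the tuned object; a quick sketch:
# model-agnostic importance ranking from the tuned caret model
plot(varImp(enetTune), top = 15)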
F. Explore the relationships between each of the top predictors and the response. How could this information be helpful in improving yield in future runs of the manufacturing process?
ManufacturingProcess36 has by far the largest coefficient in magnitude, and it is negative. To improve yield in future runs, the process settings with negative coefficients (ManufacturingProcess36, 37, 13, 03, 17) could be decreased where feasible, and those with positive coefficients (ManufacturingProcess34, 09, 32, 42) increased. Because these are process variables rather than fixed biological inputs, they are potentially controllable.
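A sketch of how these relationships could be explored directly (predictor names taken from the coefficient table above; knn.df is the imputed data set):
top.predictors <- c("ManufacturingProcess36", "ManufacturingProcess34",
                    "ManufacturingProcess09", "ManufacturingProcess32")
# scatter plot of each top predictor against Yield, with a simple trend line
par(mfrow = c(2, 2))
for (p in top.predictors) {
  plot(knn.df[[p]], knn.df$Yield, xlab = p, ylab = "Yield")
  abline(lm(knn.df$Yield ~ knn.df[[p]]), col = "red")
}
# correlations summarize the direction and strength of each relationship
cor(knn.df[, top.predictors], knn.df$Yield)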