Kuhn and Johnson
From K&J: "Developing a model to predict permeability could save significant resources for a pharmaceutical company, while at the same time more rapidly identifying molecules that have a sufficient permeability to become a drug."
6.2 a)
library(AppliedPredictiveModeling)
library(elasticnet)
library(ggplot2)
library(caret)
library(tidyverse)
library(kableExtra)
data(permeability)
head(permeability)
## permeability
## 1 12.520
## 2 1.120
## 3 19.405
## 4 1.730
## 5 1.680
## 6 0.510
class(permeability)
## [1] "matrix" "array"
summary(permeability)
## permeability
## Min. : 0.06
## 1st Qu.: 1.55
## Median : 4.91
## Mean :12.24
## 3rd Qu.:15.47
## Max. :55.60
The permeability data set contains permeability values for 165 compounds, stored as a numeric matrix. There are no missing values in this matrix, and the permeability values range from 0.06 to 55.60.
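A quick check (not shown in the original output) confirms the no-missing-values claim; note that data(permeability) also loads the companion fingerprints predictor matrix used below:
sum(is.na(permeability))   # expected 0: no missing permeability values
sum(is.na(fingerprints))   # expected 0: the fingerprint predictors are complete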
# drop predictors with near-zero variance (caret::nearZeroVar)
class(fingerprints)
## [1] "matrix" "array"
dim(fingerprints)
## [1] 165 1107
f <- fingerprints[, -nearZeroVar(fingerprints)]
class(f)
## [1] "matrix" "array"
dim(f)
## [1] 165 388
The fingerprints matrix originally contains 1107 predictors. After filtering out the near-zero-variance predictors, 388 remain for modeling.
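nearZeroVar() can also return its diagnostics rather than just the column indices; a sketch using its saveMetrics argument:
nzv <- nearZeroVar(fingerprints, saveMetrics = TRUE)
head(nzv)      # per-predictor freqRatio, percentUnique, zeroVar, and nzv flags
sum(nzv$nzv)   # number of predictors flagged near-zero variance (1107 - 388 = 719)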
set.seed(565)
#add permeability data to f before we split
f <- cbind(data.frame(permeability),f)
dim(f)
## [1] 165 389
#split 80/20
n <- floor(0.80 * nrow(f))
idx <- sample(seq_len(nrow(f)), size = n)
train <- f[idx, ]
test <- f[-idx, ]
dim(train)
## [1] 132 389
dim(test)
## [1] 33 389
## corrplot 0.84 loaded
## [1] 279
## [1] 110
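The chunk that produced the two counts above was not echoed in the output. A plausible reconstruction, assuming caret's findCorrelation() with a 0.90 cutoff (the cutoff value is an assumption), is:
library(corrplot)                            # loaded by the hidden chunk, per the message above
corr_mat <- cor(train[, -1])                 # correlations among predictors; column 1 is permeability
high_corr <- findCorrelation(corr_mat, cutoff = 0.90)  # assumed cutoff
length(high_corr)                            # 279 highly correlated predictors flagged
filtered_train <- train[, -(high_corr + 1)]  # +1 offset preserves the permeability column
ncol(filtered_train)                         # 110 columns remain (permeability + 109 predictors)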
dim(filtered_train)
## [1] 132 110
dim(test)
## [1] 33 389
There are now 132 observations in the training data and 33 observations in the testing data.
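As an aside, caret's createDataPartition() builds splits stratified on the outcome, which tends to keep the permeability distributions of train and test similar; a sketch of that alternative (train2 and test2 are hypothetical names, not used below):
idx2 <- createDataPartition(f$permeability, p = 0.80, list = FALSE)
train2 <- f[idx2, ]   # alternative stratified training set
test2  <- f[-idx2, ]  # alternative stratified test set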
#train the pls model
pls.model <- train(filtered_train[,-1],
train$permeability,
method = "pls",
tuneLength = 20,
trControl = trainControl(method = "cv"))
pls.model$bestTune
## ncomp
## 5 5
summary(pls.model)
## Data: X dimension: 132 109
## Y dimension: 132 1
## Fit method: oscorespls
## Number of components considered: 5
## TRAINING: % variance explained
## 1 comps 2 comps 3 comps 4 comps 5 comps
## X 16.33 26.04 34.17 42.15 47.39
## .outcome 44.35 57.57 66.10 72.08 74.84
# optimal number of components and the corresponding R-squared
print(which.max(pls.model$results$Rsquared))
## [1] 5
print(max(pls.model$results$Rsquared))
## [1] 0.5581383
The PLS summary shows that with 1 component, 16.33% of the variance in X and 44.35% of the variance in the outcome is explained. With 5 components, the model explains 47.39% of the variance in X and 74.84% of the variance in the outcome, so the 5-component model is a clear improvement over the 1-component model. Although up to 20 components were tuned, cross-validation selected 5 as optimal, so summary() reports no more than 5.
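These variance-explained percentages can also be extracted programmatically; a sketch, assuming the pls package's explvar() helper applied to the final model stored by caret:
library(pls)
# cumulative % of X variance explained per component; should match the summary above
cumsum(explvar(pls.model$finalModel))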
#output results
kable(pls.model$results)
ncomp | RMSE | Rsquared | MAE | RMSESD | RsquaredSD | MAESD |
---|---|---|---|---|---|---|
1 | 12.30282 | 0.4377951 | 8.956515 | 3.219503 | 0.2402805 | 2.178193 |
2 | 11.51310 | 0.5232750 | 8.710935 | 3.628012 | 0.2328060 | 2.276950 |
3 | 11.46227 | 0.5257648 | 8.862881 | 3.549764 | 0.2161606 | 2.424577 |
4 | 11.38302 | 0.5445444 | 8.663366 | 3.685221 | 0.2262313 | 2.314093 |
5 | 11.18529 | 0.5581383 | 8.515418 | 3.367247 | 0.2033848 | 2.331825 |
6 | 11.31163 | 0.5545029 | 8.624500 | 2.970595 | 0.1798164 | 1.762387 |
7 | 11.67501 | 0.5270360 | 8.784805 | 3.137917 | 0.2094167 | 2.140556 |
8 | 12.26412 | 0.4979940 | 9.262433 | 3.039613 | 0.2126070 | 2.210452 |
9 | 12.51929 | 0.4888166 | 9.457361 | 3.129536 | 0.2183111 | 2.381762 |
10 | 12.66223 | 0.4842596 | 9.551346 | 2.951570 | 0.2123899 | 2.149070 |
11 | 13.10742 | 0.4585698 | 9.778093 | 2.875405 | 0.2175529 | 2.220109 |
12 | 13.16479 | 0.4627268 | 9.900486 | 2.826145 | 0.2039360 | 2.170976 |
13 | 13.51318 | 0.4594261 | 10.081289 | 2.510701 | 0.1897588 | 2.175870 |
14 | 13.78783 | 0.4418945 | 10.417810 | 2.620078 | 0.2006759 | 2.343141 |
15 | 14.17663 | 0.4268529 | 10.774998 | 2.519321 | 0.1955183 | 2.324342 |
16 | 14.53325 | 0.4118195 | 10.966815 | 2.350513 | 0.1943018 | 2.305982 |
17 | 14.89244 | 0.3967179 | 11.223915 | 2.297252 | 0.1906188 | 2.353945 |
18 | 15.21620 | 0.3834179 | 11.581818 | 2.313119 | 0.1863851 | 2.254695 |
19 | 15.47223 | 0.3728959 | 11.700360 | 2.273206 | 0.1884778 | 2.195879 |
20 | 15.84762 | 0.3610087 | 11.890933 | 2.167690 | 0.1827425 | 2.062334 |
We trained the PLS model with up to 20 candidate numbers of components, using 10-fold cross-validation. As the number of components increases from 1 to 5, the cross-validated R-squared improves and the RMSE decreases; beyond 5 components the fit degrades again. The best fit is therefore at 5 components, with a cross-validated R-squared of 0.5581 and an RMSE of 11.185, consistent with bestTune above.
plot(pls.model$results$ncomp, pls.model$results$Rsquared,
     xlab = "N of components", ylab = "Rsquared")
plot(pls.model$results$ncomp, pls.model$results$RMSE,
     xlab = "N of components", ylab = "RMSE")
# predict.train uses the CV-selected model (ncomp = 5); an ncomp argument is not needed
output <- predict(pls.model, newdata = test)
summary(output)
## Min. 1st Qu. Median Mean 3rd Qu. Max.
## -3.371 1.592 6.942 9.059 16.809 32.819
print(output)
## [1] -3.3714224 5.6552798 24.7296020 5.3394667 -1.4358364 5.4586035
## [7] 11.2532310 2.1056401 19.4762712 4.4828810 -3.0761819 10.7463131
## [13] 3.7829678 0.5569995 1.5919443 13.1645725 18.8741581 15.2626003
## [19] 21.1874554 10.9334413 17.3307620 -2.4308862 -2.1016842 10.6472924
## [25] 24.9803390 6.9419064 32.8191361 17.3680267 -1.4210725 16.8089030
## [31] 11.8354884 -2.3810734 1.8349102
## index of obs of min and max value
print(which.min(output))
## [1] 1
which.max(output)
## [1] 27
All 33 predicted permeability values for the test set are printed above. The 1st observation has the lowest predicted permeability at -3.371, and the 27th observation has the highest at 32.82. Several predictions are negative, which is physically impossible for permeability; these are artifacts of the linear model.
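Since permeability cannot be negative, one simple post-processing step (an assumption on our part, not part of the original analysis) is to floor the predictions at zero:
# hypothetical fix: truncate negative predictions at 0
output_clamped <- pmax(output, 0)
summary(output_clamped)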
postResample(pred = output, obs = test$permeability)
## RMSE Rsquared MAE
## 12.1878487 0.2606434 8.4767044
The test-set estimate of R-squared is 0.26, with an RMSE of 12.19 and an MAE of 8.48. This is considerably worse than the cross-validated R-squared of 0.56, suggesting that the PLS model does not generalize especially well to the held-out compounds.
Ordinary Linear Regression
class(train)
## [1] "data.frame"
perm_train <- filtered_train[, 1]
head(perm_train)
## [1] 3.800 32.375 2.710 19.405 8.620 8.585
length(perm_train)
## [1] 132
length(filtered_train)
## [1] 110
lm_permeability <- lm(perm_train ~ ., data = data.frame(filtered_train))
predict_olr<- predict(lm_permeability, data.frame(test))
## Warning in predict.lm(lm_permeability, data.frame(test)): prediction from a
## rank-deficient fit may be misleading
cor(predict_olr, test$permeability) ^ 2
## [1] 1
The squared correlation between the OLS predictions and the observed test permeability is exactly 1. A perfect fit on held-out data should raise suspicion rather than celebration; the model summary below shows why.
summary(lm_permeability)
## Warning in summary.lm(lm_permeability): essentially perfect fit: summary may be
## unreliable
##
## Call:
## lm(formula = perm_train ~ ., data = data.frame(filtered_train))
##
## Residuals:
## Min 1Q Median 3Q Max
## -2.931e-14 -6.598e-16 0.000e+00 6.594e-16 2.931e-14
##
## Coefficients: (21 not defined because of singularities)
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 3.216e-14 2.306e-14 1.394e+00 0.170545
## permeability 1.000e+00 1.273e-16 7.856e+15 < 2e-16 ***
## X1 -4.027e-15 1.498e-14 -2.690e-01 0.789428
## X6 -8.321e-16 2.379e-15 -3.500e-01 0.728244
## X11 6.747e-15 2.210e-14 3.050e-01 0.761690
## X12 -1.994e-15 5.477e-14 -3.600e-02 0.971131
## X15 2.715e-15 2.392e-14 1.140e-01 0.910173
## X25 3.399e-14 7.350e-15 4.625e+00 3.56e-05 ***
## X35 -1.946e-14 6.149e-14 -3.160e-01 0.753253
## X36 8.819e-15 3.364e-14 2.620e-01 0.794473
## X55 -1.316e-15 9.074e-15 -1.450e-01 0.885402
## X80 -3.188e-14 2.161e-14 -1.475e+00 0.147601
## X88 -3.821e-14 1.081e-13 -3.540e-01 0.725425
## X93 8.653e-16 2.654e-15 3.260e-01 0.746003
## X94 1.722e-14 4.726e-14 3.640e-01 0.717389
## X96 -1.232e-14 1.055e-13 -1.170e-01 0.907544
## X97 2.403e-14 4.372e-14 5.500e-01 0.585494
## X98 -2.067e-14 4.330e-14 -4.770e-01 0.635597
## X99 2.547e-15 3.145e-14 8.100e-02 0.935849
## X101 6.526e-15 3.345e-14 1.950e-01 0.846261
## X103 -2.170e-14 3.254e-14 -6.670e-01 0.508454
## X111 1.052e-14 8.366e-14 1.260e-01 0.900499
## X118 -4.486e-15 9.078e-15 -4.940e-01 0.623817
## X121 -1.511e-14 7.113e-14 -2.120e-01 0.832767
## X126 5.387e-17 2.415e-14 2.000e-03 0.998230
## X130 2.124e-15 4.876e-14 4.400e-02 0.965465
## X138 -3.362e-15 7.137e-15 -4.710e-01 0.640047
## X141 2.835e-15 2.319e-15 1.223e+00 0.228305
## X143 -3.319e-14 1.681e-14 -1.974e+00 0.054997 .
## X146 5.894e-14 1.238e-13 4.760e-01 0.636604
## X158 1.352e-14 1.988e-14 6.800e-01 0.500395
## X207 -1.990e-15 1.463e-14 -1.360e-01 0.892443
## X224 -3.751e-15 9.181e-15 -4.090e-01 0.684940
## X225 1.589e-14 3.492e-14 4.550e-01 0.651337
## X228 4.528e-17 8.134e-15 6.000e-03 0.995584
## X229 -2.433e-14 3.090e-14 -7.870e-01 0.435496
## X230 -2.528e-14 5.405e-14 -4.680e-01 0.642479
## X234 2.572e-15 7.976e-15 3.220e-01 0.748744
## X235 9.453e-15 1.430e-14 6.610e-01 0.512058
## X237 1.221e-14 2.142e-14 5.700e-01 0.571657
## X238 7.460e-15 3.002e-14 2.490e-01 0.804942
## X250 -2.616e-14 9.105e-14 -2.870e-01 0.775242
## X251 5.398e-14 9.988e-14 5.400e-01 0.591749
## X254 3.139e-14 4.542e-14 6.910e-01 0.493332
## X257 3.338e-16 1.057e-13 3.000e-03 0.997496
## X258 3.874e-15 3.676e-14 1.050e-01 0.916568
## X264 NA NA NA NA
## X265 -4.083e-14 4.230e-14 -9.650e-01 0.339952
## X266 3.842e-15 2.885e-14 1.330e-01 0.894696
## X267 1.200e-15 5.340e-14 2.200e-02 0.982178
## X269 -5.322e-15 1.897e-14 -2.810e-01 0.780437
## X272 9.754e-15 7.059e-14 1.380e-01 0.890771
## X276 5.918e-15 3.031e-14 1.950e-01 0.846143
## X278 -1.269e-15 3.041e-15 -4.170e-01 0.678652
## X294 7.256e-16 8.494e-15 8.500e-02 0.932334
## X298 7.053e-16 7.909e-15 8.900e-02 0.929372
## X302 NA NA NA NA
## X303 3.438e-14 3.821e-14 9.000e-01 0.373306
## X310 -2.373e-14 9.280e-14 -2.560e-01 0.799455
## X314 7.994e-15 2.191e-14 3.650e-01 0.717073
## X315 -6.642e-16 6.485e-15 -1.020e-01 0.918907
## X316 -4.458e-15 1.668e-14 -2.670e-01 0.790556
## X319 1.768e-14 3.793e-14 4.660e-01 0.643579
## X329 -6.351e-15 3.083e-14 -2.060e-01 0.837780
## X336 8.728e-15 3.378e-14 2.580e-01 0.797399
## X337 -9.225e-15 1.415e-14 -6.520e-01 0.517867
## X340 8.267e-15 1.131e-14 7.310e-01 0.468870
## X341 4.723e-16 1.322e-14 3.600e-02 0.971669
## X345 -9.678e-16 5.691e-15 -1.700e-01 0.865783
## X358 -4.834e-15 9.367e-15 -5.160e-01 0.608512
## X361 3.006e-16 8.768e-15 3.400e-02 0.972817
## X362 7.450e-15 1.378e-14 5.410e-01 0.591545
## X366 1.714e-14 3.400e-14 5.040e-01 0.616732
## X367 -1.098e-14 3.363e-14 -3.270e-01 0.745584
## X368 NA NA NA NA
## X370 -9.907e-15 1.176e-14 -8.420e-01 0.404436
## X372 8.006e-15 1.211e-14 6.610e-01 0.511982
## X374 3.439e-15 6.871e-15 5.010e-01 0.619276
## X394 NA NA NA NA
## X496 -2.035e-15 7.969e-15 -2.550e-01 0.799671
## X503 -9.714e-16 2.330e-14 -4.200e-02 0.966939
## X507 1.184e-14 2.826e-14 4.190e-01 0.677387
## X508 1.133e-14 4.844e-14 2.340e-01 0.816212
## X509 1.550e-14 2.644e-14 5.860e-01 0.560749
## X510 -1.512e-14 3.398e-14 -4.450e-01 0.658634
## X511 NA NA NA NA
## X512 -1.821e-15 1.522e-14 -1.200e-01 0.905361
## X519 -4.292e-15 3.049e-14 -1.410e-01 0.888724
## X520 NA NA NA NA
## X551 -7.264e-15 2.179e-14 -3.330e-01 0.740557
## X561 NA NA NA NA
## X565 NA NA NA NA
## X568 NA NA NA NA
## X573 NA NA NA NA
## X574 -4.768e-16 2.079e-14 -2.300e-02 0.981811
## X577 NA NA NA NA
## X590 NA NA NA NA
## X592 -5.027e-15 8.962e-15 -5.610e-01 0.577829
## X600 NA NA NA NA
## X602 NA NA NA NA
## X613 NA NA NA NA
## X679 NA NA NA NA
## X698 -3.524e-14 8.371e-15 -4.210e+00 0.000132 ***
## X704 -6.775e-15 1.755e-14 -3.860e-01 0.701477
## X705 NA NA NA NA
## X732 2.875e-16 4.936e-15 5.800e-02 0.953834
## X750 2.648e-15 7.771e-15 3.410e-01 0.734959
## X773 NA NA NA NA
## X782 NA NA NA NA
## X805 NA NA NA NA
## X812 NA NA NA NA
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 6.751e-15 on 42 degrees of freedom
## Multiple R-squared: 1, Adjusted R-squared: 1
## F-statistic: 8.165e+30 on 89 and 42 DF, p-value: < 2.2e-16
Both warnings point to the same problem: the permeability response was left in filtered_train (column 1), so lm() used the response as one of its own predictors. The fit is therefore degenerate: the coefficient on permeability is exactly 1, the residuals are at machine precision, R-squared is 1, and 21 coefficients are undefined because of singularities among the linearly dependent fingerprint columns, which also trigger the rank-deficient prediction warning.
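A corrected fit, as a sketch, names the response explicitly in the formula so that the permeability column is used only as the outcome (the rank-deficiency warning may still appear because of collinear fingerprint predictors):
# corrected OLS benchmark: response on the left-hand side only
lm_fixed <- lm(permeability ~ ., data = data.frame(filtered_train))
predict_fixed <- predict(lm_fixed, data.frame(test))
# honest test-set RMSE / R-squared / MAE for the OLS model (output not shown)
postResample(pred = predict_fixed, obs = test$permeability)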