Using cv.glmnet, we created a model that used the following variables for prediction:

- Postcode
- Age group 1, year 2016 (whereas Priya's model includes all years and age groups)
- PHN code
- 40 distinct SEIFA variables, ranging from maximums and minimums to scores, deciles and percentages
Hastie and Qian write in the glmnet vignette that glmnet is a package that fits a generalized linear model via penalized maximum likelihood. The regularization path is computed for the lasso or elastic-net penalty at a grid of values for the regularization parameter lambda. The algorithm is extremely fast and can exploit sparsity in the input matrix x. It fits linear, logistic, multinomial, Poisson, and Cox regression models, and can also fit multi-response linear regression. A variety of predictions can be made from the fitted models.
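For the Gaussian case used below, the criterion being minimized is the elastic-net objective from the vignette (glmnet's default alpha = 1 reduces it to the pure lasso penalty):

$$\min_{\beta_0,\,\beta}\;\frac{1}{2N}\sum_{i=1}^{N}\left(y_i-\beta_0-x_i^{\top}\beta\right)^2+\lambda\left[\frac{1-\alpha}{2}\lVert\beta\rVert_2^2+\alpha\lVert\beta\rVert_1\right]$$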
We used sparse input-matrix formats. The authors of the package don't encourage users to extract the components of the fitted object directly. Instead, various methods are provided for the object, such as plot, print, coef and predict, that enable us to execute those tasks more elegantly.
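For instance, the recommended accessors look like this (a sketch applied to the cv.glmnet object named fit that we create further down; output not shown):

plot(fit) #cross-validation curve: error against log(lambda)
print(fit) #summary including lambda.min and lambda.1se
coef(fit, s = "lambda.min") #sparse coefficient vector at the best lambda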
#import data
all_seifa<-read.csv("../cleaned_data/balanced.csv")
#QUESTION: what is the "target" variable in this data - how was 0 or 1 calculated in this cleaned dataset? (e.g. was the cut-off 93%?)
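A quick look at the class balance of the cleaned file is one first step toward the question above (a sketch; output not shown):

table(all_seifa$target) #counts of 0s and 1s in the cleaned data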
all_seifa$postcode=as.factor(all_seifa$postcode)
#Time is a double? Also unsure which of the scores this will use
str(all_seifa)
## 'data.frame': 40247 obs. of 51 variables:
## $ X : int 1 2 3 4 5 6 7 8 9 10 ...
## $ postcode : Factor w/ 2538 levels "810","812","820",..: 51 2255 2012 623 1426 1390 2106 1666 2042 1129 ...
## $ year : int 2014 2014 2012 2016 2016 2015 2012 2012 2015 2014 ...
## $ age : int 2 1 2 5 1 2 1 5 2 2 ...
## $ pc_immun : Factor w/ 9 levels "<70.0","70.0-74.9",..: 4 9 8 9 8 6 5 5 9 9 ...
## $ caution : num 0.0602 0.00775 0.79615 0.02196 0.34722 ...
## $ pc_immun_class: num 2.939 0.656 7.768 0.384 9.408 ...
## $ PHN_code : Factor w/ 31 levels "PHN101","PHN102",..: 1 28 25 7 18 18 26 22 25 15 ...
## $ PHN_number : num 266 274 297 340 370 ...
## $ Time : num 2012 2009 2012 2015 2016 ...
## $ IEO_MAXS : num 1283 843 1135 942 1176 ...
## $ IEO_MINS : num 1052 979 892 888 730 ...
## $ IEO_RWAD : num 8.22 5.95 4.57 4.02 6.18 ...
## $ IEO_RWAP : num 61.5 83.9 62.2 1.4 36 ...
## $ IEO_RWAR : num 2067 2316 940 1195 2201 ...
## $ IEO_RWSD : num 9.43 6.67 1.01 5.4 4.32 ...
## $ IEO_RWSP : num 64.5 43.9 37.9 14 65 ...
## $ IEO_RWSR : num 595.461 279.496 -0.283 -214.477 229.106 ...
## $ IEO_SCORE : num 1244 1021 1000 974 1018 ...
## $ IEO_URP : num -6703 -1033 -1917 4545 20405 ...
## $ IER_MAXS : num 1165 1158 1034 1025 1151 ...
## $ IER_MINS : num 692 1099 919 789 814 ...
## $ IER_RWAD : num 5.34 8.41 3.39 4.34 1.56 ...
## $ IER_RWAP : num 47.2 88.1 35.9 20.1 67.2 ...
## $ IER_RWAR : num 1514 2478 417 1393 1784 ...
## $ IER_RWSD : num 8.057 7.021 6.809 0.555 4.918 ...
## $ IER_RWSP : num 51.9 58.4 10.7 10.1 36.5 ...
## $ IER_RWSR : num 195 193 304 144 446 ...
## $ IER_SCORE : num 967 1020 902 873 1019 ...
## $ IER_URP : num 4430 -1251 -3554 2853 13425 ...
## $ IRSAD_MAXS : num 1088 1012 989 1000 1060 ...
## $ IRSAD_MINS : num 980 1021 979 951 959 ...
## $ IRSAD_RWAD : num 10.69 12.29 6.47 5.05 6.11 ...
## $ IRSAD_RWAP : num 100.1 91.2 11.8 50.7 105.2 ...
## $ IRSAD_RWAR : num 2292 2361 968 662 2994 ...
## $ IRSAD_RWSD : num 4.655 8.644 1.21 0.844 8.263 ...
## $ IRSAD_RWSP : num 72.7 12 45.8 33.6 72.7 ...
## $ IRSAD_RWSR : num 283.83 152.63 8.76 186.11 474.64 ...
## $ IRSAD_SCORE : num 1046 917 1008 867 1116 ...
## $ IRSAD_URP : num 10752 1239 -2486 5418 16029 ...
## $ IRSD_MAXS : num 1079 1013 1109 930 1058 ...
## $ IRSD_MINS : num 1081 1124 868 721 909 ...
## $ IRSD_RWAD : num 8.08 4.59 3.72 1.18 7.65 ...
## $ IRSD_RWAP : num 115.5 58.2 32.2 50.9 66.1 ...
## $ IRSD_RWAR : num 2817 1143 1826 -537 1260 ...
## $ IRSD_RWSD : num 8.87 2.77 3.75 1.03 7.67 ...
## $ IRSD_RWSP : num 114.2 73.4 72.5 40.2 60 ...
## $ IRSD_RWSR : num 493.5 36.6 252.3 174.8 184.5 ...
## $ IRSD_SCORE : num 996 992 940 1048 890 ...
## $ IRSD_URP : num 1740.3 -4610.9 4894.4 -57.5 11583.5 ...
## $ target : int 0 0 0 0 0 0 0 0 0 0 ...
ignore_cols=c('pc_immun','caution','pc_immun_class','Time', 'PHN_number','state','X')
#these are the variables we removed from this model: pc_immun, caution, pc_immun_class and Time (because we turned % immunisation into the binary target at the 93% cut-off), plus state, X and PHN_number (which duplicates PHN_code)
colnames(all_seifa)
## [1] "X" "postcode" "year" "age"
## [5] "pc_immun" "caution" "pc_immun_class" "PHN_code"
## [9] "PHN_number" "Time" "IEO_MAXS" "IEO_MINS"
## [13] "IEO_RWAD" "IEO_RWAP" "IEO_RWAR" "IEO_RWSD"
## [17] "IEO_RWSP" "IEO_RWSR" "IEO_SCORE" "IEO_URP"
## [21] "IER_MAXS" "IER_MINS" "IER_RWAD" "IER_RWAP"
## [25] "IER_RWAR" "IER_RWSD" "IER_RWSP" "IER_RWSR"
## [29] "IER_SCORE" "IER_URP" "IRSAD_MAXS" "IRSAD_MINS"
## [33] "IRSAD_RWAD" "IRSAD_RWAP" "IRSAD_RWAR" "IRSAD_RWSD"
## [37] "IRSAD_RWSP" "IRSAD_RWSR" "IRSAD_SCORE" "IRSAD_URP"
## [41] "IRSD_MAXS" "IRSD_MINS" "IRSD_RWAD" "IRSD_RWAP"
## [45] "IRSD_RWAR" "IRSD_RWSD" "IRSD_RWSP" "IRSD_RWSR"
## [49] "IRSD_SCORE" "IRSD_URP" "target"
#SEIFA question - have we used all four indexes here? I can't see which one is used in the final model later on
d<-all_seifa[ , -which(names(all_seifa) %in% ignore_cols)]
all_seifa=d
head(all_seifa)
## postcode year age PHN_code IEO_MAXS IEO_MINS IEO_RWAD IEO_RWAP
## 1 2031 2014 2 PHN101 1282.5580 1052.3344 8.219062 61.489128
## 2 6288 2014 1 PHN503 843.2888 978.6085 5.950701 83.946861
## 3 5461 2012 2 PHN402 1134.7840 891.7537 4.570225 62.198543
## 4 2844 2016 5 PHN107 941.5900 887.7535 4.017014 1.399589
## 5 4163 2016 1 PHN302 1176.3550 730.2627 6.180369 35.992205
## 6 4108 2015 2 PHN302 998.7844 886.1625 7.835038 74.052448
## IEO_RWAR IEO_RWSD IEO_RWSP IEO_RWSR IEO_SCORE IEO_URP IER_MAXS
## 1 2066.9023 9.433339 64.53066 595.4605357 1243.8130 -6703.0709 1165.4785
## 2 2315.8615 6.668361 43.87983 279.4955453 1020.5613 -1033.2677 1157.6153
## 3 939.8147 1.007194 37.88064 -0.2832382 999.5188 -1917.0994 1034.3796
## 4 1194.7279 5.401566 14.03503 -214.4774793 974.3584 4544.8917 1024.9919
## 5 2200.9659 4.317274 65.01334 229.1063162 1017.8092 20404.9311 1151.1901
## 6 1686.5457 3.059073 100.41271 297.5696231 1057.4833 -402.5839 993.2622
## IER_MINS IER_RWAD IER_RWAP IER_RWAR IER_RWSD IER_RWSP IER_RWSR
## 1 692.4078 5.340914 47.166235 1513.9571 8.057129 51.88421 194.8135
## 2 1099.0738 8.406186 88.092620 2477.5750 7.020786 58.42236 193.3070
## 3 919.2138 3.388100 35.865074 417.2709 6.809197 10.72630 304.1847
## 4 789.2449 4.338550 20.093235 1393.0437 0.555161 10.12725 144.0718
## 5 813.9470 1.555715 67.233748 1783.6962 4.918363 36.50759 446.3327
## 6 730.9176 5.425435 2.674985 465.7348 2.967113 17.85700 -215.0338
## IER_SCORE IER_URP IRSAD_MAXS IRSAD_MINS IRSAD_RWAD IRSAD_RWAP
## 1 967.2174 4430.197 1087.9257 979.6162 10.690053 100.11546
## 2 1019.7959 -1250.667 1012.2310 1020.9575 12.290321 91.15573
## 3 902.3195 -3554.009 989.3820 978.9226 6.472938 11.78414
## 4 873.0239 2853.484 999.8912 950.6086 5.047616 50.65590
## 5 1018.6768 13424.522 1060.4731 959.4616 6.110582 105.24057
## 6 944.6573 -1099.972 1011.1898 801.0303 3.651693 13.72347
## IRSAD_RWAR IRSAD_RWSD IRSAD_RWSP IRSAD_RWSR IRSAD_SCORE IRSAD_URP
## 1 2292.0045 4.6549689 72.69122 283.833462 1046.1634 10752.335
## 2 2361.3491 8.6439831 12.02550 152.632408 916.9593 1239.283
## 3 968.3934 1.2101682 45.83918 8.758847 1007.5826 -2485.996
## 4 661.6716 0.8444275 33.55211 186.109428 866.7834 5417.512
## 5 2993.5845 8.2625236 72.69949 474.641902 1116.1132 16028.861
## 6 692.8747 6.2891912 107.22795 173.869898 1009.8786 7627.589
## IRSD_MAXS IRSD_MINS IRSD_RWAD IRSD_RWAP IRSD_RWAR IRSD_RWSD IRSD_RWSP
## 1 1078.8797 1080.5934 8.078312 115.47239 2816.9949 8.872875 114.21666
## 2 1012.9604 1123.6807 4.591699 58.15358 1143.4930 2.767785 73.40351
## 3 1109.3975 868.1662 3.718510 32.24376 1826.4373 3.745975 72.51636
## 4 930.3463 721.4900 1.176803 50.87903 -537.0945 1.028908 40.21812
## 5 1057.5958 908.8387 7.647484 66.07355 1259.9307 7.674904 59.98450
## 6 992.8472 977.7346 6.780901 21.82252 1006.3311 5.446750 35.93161
## IRSD_RWSR IRSD_SCORE IRSD_URP target
## 1 493.51981 996.1682 1740.29491 0
## 2 36.55822 992.4139 -4610.92385 0
## 3 252.30547 939.9822 4894.39478 0
## 4 174.75564 1048.2552 -57.54919 0
## 5 184.47234 890.1873 11583.52291 0
## 6 171.23714 933.0474 11995.80406 0
#remove rows containing NA values
library(dplyr)
##
## Attaching package: 'dplyr'
## The following objects are masked from 'package:stats':
##
## filter, lag
## The following objects are masked from 'package:base':
##
## intersect, setdiff, setequal, union
all_seifa=na.omit(all_seifa)
## we only need age group 1 and year 2016
all_seifa <- filter(all_seifa, age == 1 & year == 2016)
View(all_seifa)
## creating two tables - train and test - adding a src column to each with the value 'train' or 'test' to record which dataset each row came from, then binding the two datasets back together (called all)
#all_seifa$target<-as.factor(all_seifa$target)
#QUESTION - the as.factor line above is commented out in Priya's code - is that because it didn't work? Can we remove it?
library(caret) #caret provides createDataPartition and confusionMatrix, used below (loaded here in case it isn't already)
trainIndex = createDataPartition(all_seifa$target,
p=0.7, list=FALSE, times=1)
#all_seifa$postcode=as.numeric(all_seifa$postcode)
train = all_seifa[trainIndex,]
test = all_seifa[-trainIndex,]
train$src='train'
test$src='test'
all=rbind(train,test)
colnames(all)
## [1] "postcode" "year" "age" "PHN_code" "IEO_MAXS"
## [6] "IEO_MINS" "IEO_RWAD" "IEO_RWAP" "IEO_RWAR" "IEO_RWSD"
## [11] "IEO_RWSP" "IEO_RWSR" "IEO_SCORE" "IEO_URP" "IER_MAXS"
## [16] "IER_MINS" "IER_RWAD" "IER_RWAP" "IER_RWAR" "IER_RWSD"
## [21] "IER_RWSP" "IER_RWSR" "IER_SCORE" "IER_URP" "IRSAD_MAXS"
## [26] "IRSAD_MINS" "IRSAD_RWAD" "IRSAD_RWAP" "IRSAD_RWAR" "IRSAD_RWSD"
## [31] "IRSAD_RWSP" "IRSAD_RWSR" "IRSAD_SCORE" "IRSAD_URP" "IRSD_MAXS"
## [36] "IRSD_MINS" "IRSD_RWAD" "IRSD_RWAP" "IRSD_RWAR" "IRSD_RWSD"
## [41] "IRSD_RWSP" "IRSD_RWSR" "IRSD_SCORE" "IRSD_URP" "target"
## [46] "src"
#split dataframe into train and test, then drop the src and target columns (positions 46 and 45)
train1=all[all$src=='train',-c(46,45)]
test1=all[all$src=='test',-c(46,45)]
train_y=all[all$src=='train','target']
test_y=all[all$src=='test','target']
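Dropping columns by numeric position is fragile if the column order ever changes; a name-based equivalent of the four lines above (a sketch, same result):

train1 = subset(all, src == 'train', select = -c(target, src))
test1 = subset(all, src == 'test', select = -c(target, src))
train_y = all$target[all$src == 'train']
test_y = all$target[all$src == 'test']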
library(Matrix)
#build sparse model matrices; each factor expands into indicator columns
train_sparse <- sparse.model.matrix(~.,train1)
test_sparse <- sparse.model.matrix(~.,test1)
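With 2,538 postcode levels, that factor expansion makes the design matrix very wide - the str(fit) output below shows 2,610 columns. A quick sanity check (a sketch; output not shown):

dim(train_sparse) #rows x columns; roughly 1,895 x 2,610 after factor expansion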
#fitting the model to the training data, then returning predictions for the test set at the best lambda
fit <- cv.glmnet(train_sparse,train_y,nfolds=3)
pred <- predict(fit, test_sparse,type="response",s=fit$lambda.min)
str(fit)
## List of 10
## $ lambda : num [1:98] 0.181 0.173 0.165 0.158 0.15 ...
## $ cvm : num [1:98] 0.249 0.247 0.244 0.242 0.239 ...
## $ cvsd : num [1:98] 0.00278 0.00326 0.00313 0.003 0.00292 ...
## $ cvup : num [1:98] 0.252 0.25 0.247 0.245 0.242 ...
## $ cvlo : num [1:98] 0.246 0.244 0.241 0.239 0.236 ...
## $ nzero : Named int [1:98] 0 1 1 2 3 3 3 4 5 5 ...
## ..- attr(*, "names")= chr [1:98] "s0" "s1" "s2" "s3" ...
## $ name : Named chr "Mean-Squared Error"
## ..- attr(*, "names")= chr "mse"
## $ glmnet.fit:List of 12
## ..$ a0 : Named num [1:100] 0.541 0.534 0.528 0.521 0.514 ...
## .. ..- attr(*, "names")= chr [1:100] "s0" "s1" "s2" "s3" ...
## ..$ beta :Formal class 'dgCMatrix' [package "Matrix"] with 6 slots
## .. .. ..@ i : int [1:33301] 2609 2609 2589 2609 2579 2589 2609 2579 2589 2609 ...
## .. .. ..@ p : int [1:101] 0 0 1 2 4 7 10 13 17 22 ...
## .. .. ..@ Dim : int [1:2] 2610 100
## .. .. ..@ Dimnames:List of 2
## .. .. .. ..$ : chr [1:2610] "(Intercept)" "postcode812" "postcode820" "postcode822" ...
## .. .. .. ..$ : chr [1:100] "s0" "s1" "s2" "s3" ...
## .. .. ..@ x : num [1:33301] 5.26e-07 1.03e-06 2.60e-07 1.29e-06 7.23e-08 ...
## .. .. ..@ factors : list()
## ..$ df : int [1:100] 0 1 1 2 3 3 3 4 5 5 ...
## ..$ dim : int [1:2] 2610 100
## ..$ lambda : num [1:100] 0.181 0.173 0.165 0.158 0.15 ...
## ..$ dev.ratio: num [1:100] 0 0.0118 0.0225 0.033 0.0428 ...
## ..$ nulldev : num 470
## ..$ npasses : int 997
## ..$ jerr : int 0
## ..$ offset : logi FALSE
## ..$ call : language glmnet(x = train_sparse, y = train_y)
## ..$ nobs : int 1895
## ..- attr(*, "class")= chr [1:2] "elnet" "glmnet"
## $ lambda.min: num 0.00504
## $ lambda.1se: num 0.0101
## - attr(*, "class")= chr "cv.glmnet"
str(pred)
## num [1:812, 1] 0.708 0.142 1.066 -0.139 0.178 ...
## - attr(*, "dimnames")=List of 2
## ..$ : chr [1:812] "1" "9" "10" "14" ...
## ..$ : chr "1"
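Note that cv.glmnet defaults to a Gaussian (least-squares) fit - the call recorded in str(fit) above is plain glmnet(x = train_sparse, y = train_y) - which is why some predictions fall outside [0, 1] (e.g. 1.066 and -0.139). A logistic fit is one alternative we could try (a sketch, not the model used in this report); its predictions are genuine probabilities, so a conventional 0.5 cut-off would apply:

fit_bin = cv.glmnet(train_sparse, train_y, family = "binomial", type.measure = "auc", nfolds = 3)
pred_bin = predict(fit_bin, test_sparse, type = "response", s = "lambda.min") #probabilities in [0, 1]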
#if the predicted score is at least 0.93 then classify as 1, otherwise 0 (this reuses the 93% immunisation cut-off as a score threshold, even though the Gaussian predictions are not probabilities)
prediction <- ifelse(pred >= 0.93, 1, 0)
table(prediction)
## prediction
## 0 1
## 698 114
#698 zeros and 114 ones in the run shown (648 and 164 on an earlier run) - the split changes on each run because the partition and folds are random
confusionMatrix(as.factor(prediction),as.factor(test_y))
## Confusion Matrix and Statistics
##
## Reference
## Prediction 0 1
## 0 356 342
## 1 6 108
##
## Accuracy : 0.5714
## 95% CI : (0.5366, 0.6058)
## No Information Rate : 0.5542
## P-Value [Acc > NIR] : 0.1703
##
## Kappa : 0.2048
## Mcnemar's Test P-Value : <2e-16
##
## Sensitivity : 0.9834
## Specificity : 0.2400
## Pos Pred Value : 0.5100
## Neg Pred Value : 0.9474
## Prevalence : 0.4458
## Detection Rate : 0.4384
## Detection Prevalence : 0.8596
## Balanced Accuracy : 0.6117
##
## 'Positive' Class : 0
##
#in the run shown: 356 true zeros and 342 incorrect zeros - but this changes on each run (earlier runs showed 366/282 and 354/260)
#and 6 incorrect ones and 108 correct ones in the run shown (earlier runs showed 16/148 and 8/190)
View(test_y)
#accuracy is 0.5714 in the run shown; earlier runs gave 0.633, 0.7044 and 0.67
This seems challenging: the algorithm is fast and effective, but also a little bit of a "black box", so we can't see exactly what it is doing.
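Part of that opacity is simply run-to-run randomness: the data partition and the cross-validation folds are drawn fresh each time. Fixing the RNG seed before each random step would make the numbers above reproducible (a sketch; the seed value is arbitrary):

set.seed(42) #fix the seed so createDataPartition selects the same rows each run
trainIndex = createDataPartition(all_seifa$target, p=0.7, list=FALSE, times=1)
set.seed(42) #and again before cv.glmnet so the fold assignment repeats too
fit = cv.glmnet(train_sparse, train_y, nfolds=3)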
#changing the threshold to see how the model changes
prediction <- ifelse(pred >= 0.95, 1, 0)
table(prediction)
## prediction
## 0 1
## 716 96
#716 zeros and 96 ones in the run shown (656 and 156 on an earlier run)
confusionMatrix(as.factor(prediction),as.factor(test_y))
## Confusion Matrix and Statistics
##
## Reference
## Prediction 0 1
## 0 357 359
## 1 5 91
##
## Accuracy : 0.5517
## 95% CI : (0.5168, 0.5863)
## No Information Rate : 0.5542
## P-Value [Acc > NIR] : 0.5705
##
## Kappa : 0.172
## Mcnemar's Test P-Value : <2e-16
##
## Sensitivity : 0.9862
## Specificity : 0.2022
## Pos Pred Value : 0.4986
## Neg Pred Value : 0.9479
## Prevalence : 0.4458
## Detection Rate : 0.4397
## Detection Prevalence : 0.8818
## Balanced Accuracy : 0.5942
##
## 'Positive' Class : 0
##
#in the run shown: 357 true zeros and 359 incorrect zeros - but it changes on each run
#and 5 incorrect ones and 91 correct ones in the run shown - also changing per run
#plotting the ROC curve (AUC is computed below)
library(ROCR)
## Loading required package: gplots
##
## Attaching package: 'gplots'
## The following object is masked from 'package:stats':
##
## lowess
ROCRpred <- prediction(pred, test_y) #ROCR's prediction() uses the continuous scores, not the thresholded classes
ROCRperf <- performance(ROCRpred, 'tpr','fpr')
plot(ROCRperf, colorize = TRUE, text.adj = c(-0.2,1.7))
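Rather than hand-picking 0.93 or 0.95, the ROC object itself can suggest a cutoff, for example the point maximizing TPR minus FPR (Youden's J; a sketch using the slots of ROCRperf):

tpr = ROCRperf@y.values[[1]] #true positive rate at each cutoff
fpr = ROCRperf@x.values[[1]] #false positive rate at each cutoff
cutoffs = ROCRperf@alpha.values[[1]]
cutoffs[which.max(tpr - fpr)] #score threshold that maximizes TPR - FPR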
#(printing ROCRperf itself would dump all 813 false-positive-rate, true-positive-rate and cutoff coordinates behind the curve, so that output is omitted here)
auc_ROCR <- performance(ROCRpred, measure = "auc")
auc_ROCR <- auc_ROCR@y.values[[1]]
auc_ROCR
## [1] 0.9383917
#AUC is 0.9383917 in the run shown; earlier runs gave 0.9370936 and 0.9676059
#QUESTION: how does this model tell us which variables matter most?
This model uses just age group 1 and year 2016 (unlike Priya's original model), and cv.glmnet fits a whole sequence of models along the lambda grid, using cross-validation to select the best-fitting one.
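To answer the question above: the variables the model "chose" are exactly those with non-zero coefficients at the selected lambda; everything the lasso penalty shrank to zero was dropped. A sketch of pulling them out of the fitted object:

cf = as.matrix(coef(fit, s = "lambda.min")) #dense one-column matrix of coefficients
nz = cf[cf != 0, , drop = FALSE] #keep only the variables the penalty retained
head(nz[order(-abs(nz[, 1])), , drop = FALSE]) #largest absolute effects first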