Predictive Modeling

knitr::opts_chunk$set(echo = TRUE, fig.align = "center", fig.height = 7, fig.width = 7, warning = FALSE, message = FALSE, include = TRUE)

Chapter 1

  1. Based on the data you already have,
  2. build models,
  3. compare the models' performance,
  4. fine-tune each model, and
  5. select the model that predicts new values best.

    • ex) insurance fraud prediction, disease diagnosis, Google Prediction API.
  • Predictive analysis aims to predict current or future values without any particular theory or assumptions in advance.

    • Main question: how large is the difference between the predicted and the actual values?
    • In other words, a data-driven approach.
  • Explanatory analysis is analysis grounded in theory.

    • Main question: is my hypothesis correct? Is the evidence sufficient or insufficient?
    • In other words, a theory-driven approach that uses hypothesis-testing procedures.
  • The two approaches have different goals.

    • Predictive analysis: the goal is to predict future or unknown values.
    • Explanatory analysis: the goal is to find a hypothesis that explains the data at hand well.
  • A good explanatory analysis is not necessarily a good predictive analysis.
  • A good predictive analysis is not necessarily a good explanatory analysis.
  • Predictive and explanatory analyses should each be carried out with their own purpose in mind; they are not interchangeable.

Chapter 2

  • The predictive-analysis workflow

    1. Data preprocessing

      • When a model is built from many variables, unsuitable data may slip in, or the raw variables may be measured on different scales.
      • Removing unsuitable data can improve the model's predictive power.

        • Examples of unsuitable data

          1. Near-zero variance (the variance of a variable is close to 0, i.e., the responses are essentially fixed at one value)
          2. Variables that are very highly correlated with one another
          3. Skewed variable distributions (ex: income) -> analysis is easier when the distribution is symmetric.
          4. Variable standardization -> when scales differ, standardizing the variables onto a common scale makes analysis easier.
    2. Data splitting

      • When building predictive models, you generally need data for checking predictive accuracy in order to judge how good a model is.
      • The data used to assess predictive accuracy must be prepared separately from the data used for fitting.
      • If it is hard to collect new validation data quickly, hold out part of the collected data from model building and keep it for validation.
    3. Model fitting

      • Choose the model parameters that minimize the error on the training data.
      • This is the part carried out by the computer.
    4. Model evaluation (model selection)

      • Compare several models and choose the best one.
      • There can be several criteria for choosing a model (a small numeric sketch follows this list).

        • Model-selection criteria

          1. Mean Squared Error
          2. Sum of Absolute Errors
          3. Accuracy (for discrete outcomes)
          4. AUC
    5. Final model selection and final prediction

      • Use the model judged best during model selection to predict the validation data.
      • Evaluate the predictive power of the final model.
      • Compare it with the accuracy of the model currently in use, then make a decision.
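
  • A minimal sketch of the first two criteria above, computed on made-up actual/predicted vectors (the numbers are illustrative only):

actual    <- c(3.1, 2.8, 4.0, 5.2, 3.9)
predicted <- c(2.9, 3.0, 4.3, 5.0, 4.1)

mean((actual - predicted)^2)  # Mean Squared Error
sum(abs(actual - predicted))  # Sum of Absolute Errors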

Issues in Predictive Modeling

  1. Overfitting

    • Because we build models that explain noisy data well, overfitting occurs easily.
    • The more complex the model, the better it can capture complex patterns in the training data.
    • But the more complex the model, the more likely it is to err when predicting new data. -> overfitting
    • Finding a model with an appropriate level of complexity is important.

    • Cross-validation & resampling methods are used for this.

  2. Imbalanced data

    • Many analysis methods work well when each class contains a similar number of observations.
    • But in some situations it is hard to collect data with balanced class sizes.

      • ex) survival prediction for terminal cancer patients = few survivors.
      • ex) insurance fraud prediction = fraudsters are only a tiny fraction of all policyholders.
    • How can imbalanced data be handled? (a small sketch follows this list)

      1. Adjusting the cut-off point

        • If A vastly outnumbers B, using a threshold higher than p(A) > 0.5 makes it easier to classify B correctly.
        • Moving the cut-off to P(A) > 0.7 introduces some errors in classifying A, but B is classified better.
      2. Class weights

        • When training the model, assign a larger penalty when predictions of the minority class are wrong.
        • If A vastly outnumbers B, penalize errors on B more heavily.
      3. Sampling methods

        • Balance the class sizes by increasing the minority-class data or artificially reducing the majority-class data.
        • Sampling methods include over-sampling, under-sampling, and the SMOTE algorithm (over-sampling + under-sampling).
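
  • A minimal sketch of the sampling approach using caret's downSample()/upSample() on a made-up imbalanced data set (variable names and class counts are illustrative; SMOTE would require an additional package such as themis):

library(caret)

set.seed(1)
imb <- data.frame(x1 = rnorm(200), x2 = rnorm(200),
                  y  = factor(c(rep("A", 190), rep("B", 10))))
table(imb$y) # 190 vs 10 -> imbalanced

# under-sampling: randomly drop majority-class rows until the classes match
down <- downSample(x = imb[, c("x1", "x2")], y = imb$y, yname = "y")
table(down$y)

# over-sampling: resample minority-class rows with replacement until the classes match
up <- upSample(x = imb[, c("x1", "x2")], y = imb$y, yname = "y")
table(up$y)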

Chapter 3

  • Predictive analysis involves building many models.

    1. Building different kinds of models.

      • ex) linear regression, regression trees, artificial neural networks, etc.
    2. Building many variants of the same kind of model.

      • ex) the values K can take in K-NN.
  • After building many models, which one is better?

    • Evaluation scores (Evaluation Metrics / Performance Profile)

      • To identify a good model, we need a score that says how good each model is (a small worked example follows this list).

        • ex) Accuracy, per-class Accuracy, Kappa, AUC ...
      1. Accuracy : (TP + TN) / (TP + FP + FN + TN)

      2. Kappa : Kappa = (O - E) / (1 - E)

        • O : observed accuracy, E : expected accuracy
        • Kappa = 1 means the model predicts perfectly.

      3. per-class Accuracy : sensitivity and specificity

        • Sensitivity : TP / (TP + FN), Specificity : TN / (FP + TN)
        • Sensitivity matters more for some models and specificity for others.
        • These indices should be considered together with Accuracy.

      4. AUC (Area Under the Curve)

        • To evaluate the accuracy of a test that predicts a binary outcome, two measures are used: sensitivity and specificity.
        • Sensitivity covers the cases where a 1 is predicted as 1; specificity covers the cases where a 0 is predicted as 0.
        • TPR (true positive rate) = sensitivity
          • the proportion of 1 cases correctly predicted as 1 (ex: diagnosing cancer in a cancer patient).
        • FPR (false positive rate) = 1 - specificity
          • the proportion of 0 cases incorrectly predicted as 1 (ex: diagnosing cancer in someone who does not have it).
        • TPR and FPR trade off: lowering the cut-off raises TPR but also raises FPR (i.e., lowers specificity).
        • Both TPR and FPR have to be measured while continuously varying a threshold (cut-off).

        • Model performance has to be judged across all these TPR/FPR combinations, and the ROC curve is the graph that shows them at a glance.
        • The ROC curve is a visualization that makes it easier to decide which point to use as the cut-off.
        • AUC is the area under the ROC curve.
        • It lies between 0 and 1, and the closer to 1, the better the model.
        • A model is only meaningful when both sensitivity and specificity are close to 1.
        • Because the x-axis is 1 - specificity, the curve shows how fast sensitivity increases relative to how fast specificity decreases.
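
  • A minimal worked example of these metrics on made-up binary predictions, assuming caret and the pROC package are available (all values below are illustrative only):

library(caret)
library(pROC)

set.seed(1)
truth    <- factor(sample(c("pos", "neg"), 100, replace = TRUE, prob = c(0.3, 0.7)),
                   levels = c("pos", "neg"))
prob_pos <- ifelse(truth == "pos", rbeta(100, 4, 2), rbeta(100, 2, 4)) # fake predicted probabilities
pred     <- factor(ifelse(prob_pos > 0.5, "pos", "neg"), levels = c("pos", "neg"))

# Accuracy, Kappa = (O - E)/(1 - E), sensitivity and specificity in one call
confusionMatrix(pred, truth, positive = "pos")

# ROC curve and AUC computed from the predicted probabilities
roc_obj <- roc(response = truth, predictor = prob_pos, levels = c("neg", "pos"))
auc(roc_obj)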

Chapter 4

  • Classification and regression

    • Typical questions

      1. What will today's high temperature be?
      2. What satisfaction score will customers give this product?
      3. How much revenue will each store make?
      4. What will jeonse (housing lease) prices be in each region of the country?
      5. Is the email I received spam?
      6. Is this person applying for insurance actually a fraudster?
      7. Is the tumor in this MRI image malignant or benign?
      8. Will today's weather be clear or cloudy?
      • Questions 1-4: the answer is a continuous number rather than yes/no.
      • Questions 5-8: the answer is yes/no, or one of several groups, rather than a continuous number.
      • In other words, problems split into classification & regression depending on what kind of value is predicted.
    • The nature of the problem depends on the value being predicted.

      • Predicting a score? -> regression
      • Predicting a group? -> classification

      • The methods used change with the type of problem (see the sketch after this list).
      • The tuning parameters also change with the problem.

    • Continuous outcomes

      • linear regression / partial least squares
      • penalized regression (LASSO, ridge, elastic net)
      • neural network
      • multivariate adaptive regression splines
      • support vector machine
      • K-nearest neighbors
      • regression tree / random forest ... and more.
    • Discrete / categorical outcomes

      • linear discriminant analysis / quadratic discriminant analysis
      • logistic regression
      • nearest shrunken centroids
      • neural network
      • flexible discriminant analysis
      • support vector machine
      • naive Bayes
      • classification tree / random forest ... and more.
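
  • A minimal sketch of how the same caret interface covers both problem types, using built-in data sets purely for illustration (the methods "lm" and "knn" are arbitrary choices here):

library(caret)

set.seed(1)
# regression: the outcome (mpg) is continuous, so RMSE/MAE are reported
reg_fit <- train(mpg ~ ., data = mtcars, method = "lm")

# classification: the outcome (Species) is a factor, so Accuracy/Kappa are reported
cls_fit <- train(Species ~ ., data = iris, method = "knn")

reg_fit$results
cls_fit$results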

Chapter 5

  • CARET (classification and regression training)

    • caret is a package that helps you fit and compare a wide variety of models through a single, consistent interface.

DATA Load

setwd("C:/Users/LG/Documents/R code, file")

library(caret)
library(corrplot)

colname <- c('Class','Alcohol', 'Malic acid', 'Ash', 'Alcalinity of ash',
             'Magnesium', 'Total phenols', 'Flavanoids', 'Nonflavanoid phenols',
             'Proanthocyanisns', 'Color intensity', 'Hue', 'OD280', 'Proline')

wdata <- read.table("wine.data.txt", header = FALSE, sep = ",", col.names = colname)

head(wdata)
##   Class Alcohol Malic.acid  Ash Alcalinity.of.ash Magnesium Total.phenols
## 1     1   14.23       1.71 2.43              15.6       127          2.80
## 2     1   13.20       1.78 2.14              11.2       100          2.65
## 3     1   13.16       2.36 2.67              18.6       101          2.80
## 4     1   14.37       1.95 2.50              16.8       113          3.85
## 5     1   13.24       2.59 2.87              21.0       118          2.80
## 6     1   14.20       1.76 2.45              15.2       112          3.27
##   Flavanoids Nonflavanoid.phenols Proanthocyanisns Color.intensity  Hue
## 1       3.06                 0.28             2.29            5.64 1.04
## 2       2.76                 0.26             1.28            4.38 1.05
## 3       3.24                 0.30             2.81            5.68 1.03
## 4       3.49                 0.24             2.18            7.80 0.86
## 5       2.69                 0.39             1.82            4.32 1.04
## 6       3.39                 0.34             1.97            6.75 1.05
##   OD280 Proline
## 1  3.92    1065
## 2  3.40    1050
## 3  3.17    1185
## 4  3.45    1480
## 5  2.93     735
## 6  2.85    1450
View(wdata)
str(wdata)
## 'data.frame':    178 obs. of  14 variables:
##  $ Class               : int  1 1 1 1 1 1 1 1 1 1 ...
##  $ Alcohol             : num  14.2 13.2 13.2 14.4 13.2 ...
##  $ Malic.acid          : num  1.71 1.78 2.36 1.95 2.59 1.76 1.87 2.15 1.64 1.35 ...
##  $ Ash                 : num  2.43 2.14 2.67 2.5 2.87 2.45 2.45 2.61 2.17 2.27 ...
##  $ Alcalinity.of.ash   : num  15.6 11.2 18.6 16.8 21 15.2 14.6 17.6 14 16 ...
##  $ Magnesium           : int  127 100 101 113 118 112 96 121 97 98 ...
##  $ Total.phenols       : num  2.8 2.65 2.8 3.85 2.8 3.27 2.5 2.6 2.8 2.98 ...
##  $ Flavanoids          : num  3.06 2.76 3.24 3.49 2.69 3.39 2.52 2.51 2.98 3.15 ...
##  $ Nonflavanoid.phenols: num  0.28 0.26 0.3 0.24 0.39 0.34 0.3 0.31 0.29 0.22 ...
##  $ Proanthocyanisns    : num  2.29 1.28 2.81 2.18 1.82 1.97 1.98 1.25 1.98 1.85 ...
##  $ Color.intensity     : num  5.64 4.38 5.68 7.8 4.32 6.75 5.25 5.05 5.2 7.22 ...
##  $ Hue                 : num  1.04 1.05 1.03 0.86 1.04 1.05 1.02 1.06 1.08 1.01 ...
##  $ OD280               : num  3.92 3.4 3.17 3.45 2.93 2.85 3.58 3.58 2.85 3.55 ...
##  $ Proline             : int  1065 1050 1185 1480 735 1450 1290 1295 1045 1045 ...
unique(wdata$Class)
## [1] 1 2 3
wdata$Class <- as.factor(wdata$Class)
sapply(wdata, class)
##                Class              Alcohol           Malic.acid 
##             "factor"            "numeric"            "numeric" 
##                  Ash    Alcalinity.of.ash            Magnesium 
##            "numeric"            "numeric"            "integer" 
##        Total.phenols           Flavanoids Nonflavanoid.phenols 
##            "numeric"            "numeric"            "numeric" 
##     Proanthocyanisns      Color.intensity                  Hue 
##            "numeric"            "numeric"            "numeric" 
##                OD280              Proline 
##            "numeric"            "integer"
levels(wdata$Class) <- c('w1', 'w2', 'w3') # w1, w2, w3 correspond to wine types 1, 2, 3
table(wdata$Class) # check the number of observations per class
## 
## w1 w2 w3 
## 59 71 48

DATA Preprocessing

  1. missing values
  2. skewness (asymmetry of a distribution)
  3. near-zero variance (variables whose variance is close to 0)
  4. highly correlated variables
  5. n < p (when n < p, use dimension-reduction techniques)

1. Missing values

x <- as.data.frame(matrix(1:10, 2))
x[c(1,2),c(1,2,3)] <- NA
x
##   V1 V2 V3 V4 V5
## 1 NA NA NA  7  9
## 2 NA NA NA  8 10
is.na(x)
##        V1   V2   V3    V4    V5
## [1,] TRUE TRUE TRUE FALSE FALSE
## [2,] TRUE TRUE TRUE FALSE FALSE
which(is.na(x))
## [1] 1 2 3 4 5 6
sum(is.na(x))
## [1] 6
y <- preProcess(x, method = "knnImpute") # imputation: estimate and fill in missing (NA) values (see the sketch at the end of this subsection)
y
## Created from 0 samples and 5 variables
## 
## Pre-processing:
##   - centered (5)
##   - ignored (0)
##   - 5 nearest neighbor imputation (5)
##   - scaled (5)
which(is.na(y))
## named integer(0)
sum(is.na(y))
## [1] 0
head(is.na(wdata))
##      Class Alcohol Malic.acid   Ash Alcalinity.of.ash Magnesium
## [1,] FALSE   FALSE      FALSE FALSE             FALSE     FALSE
## [2,] FALSE   FALSE      FALSE FALSE             FALSE     FALSE
## [3,] FALSE   FALSE      FALSE FALSE             FALSE     FALSE
## [4,] FALSE   FALSE      FALSE FALSE             FALSE     FALSE
## [5,] FALSE   FALSE      FALSE FALSE             FALSE     FALSE
## [6,] FALSE   FALSE      FALSE FALSE             FALSE     FALSE
##      Total.phenols Flavanoids Nonflavanoid.phenols Proanthocyanisns
## [1,]         FALSE      FALSE                FALSE            FALSE
## [2,]         FALSE      FALSE                FALSE            FALSE
## [3,]         FALSE      FALSE                FALSE            FALSE
## [4,]         FALSE      FALSE                FALSE            FALSE
## [5,]         FALSE      FALSE                FALSE            FALSE
## [6,]         FALSE      FALSE                FALSE            FALSE
##      Color.intensity   Hue OD280 Proline
## [1,]           FALSE FALSE FALSE   FALSE
## [2,]           FALSE FALSE FALSE   FALSE
## [3,]           FALSE FALSE FALSE   FALSE
## [4,]           FALSE FALSE FALSE   FALSE
## [5,]           FALSE FALSE FALSE   FALSE
## [6,]           FALSE FALSE FALSE   FALSE
sum(is.na(wdata)) # TRUE = 1, FALSE = 0; a sum of 0 means there are no NA values
## [1] 0
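
  • Note that preProcess() only stores the imputation recipe; predict() produces the imputed values. A minimal sketch, with a few NAs injected into a copy of wdata purely for illustration:

wdata_na <- wdata
wdata_na$Alcohol[c(5, 20)] <- NA # hypothetical missing entries

impute_pp <- preProcess(wdata_na[, -1], method = "knnImpute")
wdata_imp <- predict(impute_pp, wdata_na[, -1]) # centered/scaled data with the NAs filled in
sum(is.na(wdata_imp)) # should now be 0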

2. skewness (asymmetry of a distribution)

library(e1071) # provides the skewness() function

skewValues <- apply(wdata[,-1], 2, skewness) # compute skewness for each column, excluding the first column (Class)
skewValues 
##              Alcohol           Malic.acid                  Ash 
##          -0.05061790           1.02219461          -0.17373239 
##    Alcalinity.of.ash            Magnesium        Total.phenols 
##           0.20946966           1.07975154           0.08518385 
##           Flavanoids Nonflavanoid.phenols     Proanthocyanisns 
##           0.02491801           0.44259293           0.50845402 
##      Color.intensity                  Hue                OD280 
##           0.85400055           0.02073713          -0.30212593 
##              Proline 
##           0.75492943
# skewness between -1/2 and 1/2 = approximately symmetric
# skewness < -1/2 or skewness > 1/2 = skewed

hist(wdata$Alcohol, main = "approximately symmetric(Alcohol)")

hist(wdata$Nonflavanoid.phenols, main = "moderately skewed(Nonflav)")

hist(wdata$Magnesium, main = "Highly skewed(Mg)")

# one way to reduce skewness = log transform (when the tail extends to the right)
wdata$log_Magnesium <- log(wdata$Magnesium)
skewness(wdata$log_Magnesium)
## [1] 0.5913459
# another way to reduce skewness = Box-Cox transform (find the lambda value that makes the data symmetric and transform with it)
bctrans_Mg <- BoxCoxTrans(wdata$Magnesium)
bctrans_Mg
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   70.00   88.00   98.00   99.74  107.00  162.00 
## 
## Largest/Smallest: 2.31 
## Sample Skewness: 1.08 
## 
## Estimated Lambda: -1.4
wdata$bc_Magnesium <- predict(bctrans_Mg, wdata$Magnesium) # use predict() to apply the transformation to the data points
skewness(wdata$bc_Magnesium)
## [1] 0.0174775
hist(wdata$bc_Magnesium, main = "after box-cox transform")

  • The transformed variable is on a different scale from the original variable.

3. near-zero variance (variables whose variance is close to 0)

xx <- matrix(c(c(rep(0,99),1), rnorm(100,0,1), c(rep(1,99),2)), ncol = 3) 
head(xx)
##      [,1]       [,2] [,3]
## [1,]    0 -0.7556101    1
## [2,]    0  1.0176219    1
## [3,]    0  0.7023171    1
## [4,]    0  0.7834270    1
## [5,]    0 -0.3510557    1
## [6,]    0 -0.9261012    1
nearZeroVar(xx) #return column numbers
## [1] 1 3
# according to nearZeroVar, columns 1 and 3 have variance close to 0

nearZeroVar(wdata[,-1]) # if there are no near-zero-variance variables, the result is integer(0)
## integer(0)

4. Highly correlated variables

head(wdata)
##   Class Alcohol Malic.acid  Ash Alcalinity.of.ash Magnesium Total.phenols
## 1    w1   14.23       1.71 2.43              15.6       127          2.80
## 2    w1   13.20       1.78 2.14              11.2       100          2.65
## 3    w1   13.16       2.36 2.67              18.6       101          2.80
## 4    w1   14.37       1.95 2.50              16.8       113          3.85
## 5    w1   13.24       2.59 2.87              21.0       118          2.80
## 6    w1   14.20       1.76 2.45              15.2       112          3.27
##   Flavanoids Nonflavanoid.phenols Proanthocyanisns Color.intensity  Hue
## 1       3.06                 0.28             2.29            5.64 1.04
## 2       2.76                 0.26             1.28            4.38 1.05
## 3       3.24                 0.30             2.81            5.68 1.03
## 4       3.49                 0.24             2.18            7.80 0.86
## 5       2.69                 0.39             1.82            4.32 1.04
## 6       3.39                 0.34             1.97            6.75 1.05
##   OD280 Proline log_Magnesium bc_Magnesium
## 1  3.92    1065      4.844187    0.7134756
## 2  3.40    1050      4.605170    0.7131536
## 3  3.17    1185      4.615121    0.7131693
## 4  3.45    1480      4.727388    0.7133317
## 5  2.93     735      4.770685    0.7133878
## 6  2.85    1450      4.718499    0.7133197
cor_mat <- cor(wdata[,-1]) # build the correlation matrix
head(cor_mat)
##                       Alcohol  Malic.acid       Ash Alcalinity.of.ash
## Alcohol            1.00000000  0.09439694 0.2115446       -0.31023514
## Malic.acid         0.09439694  1.00000000 0.1640455        0.28850040
## Ash                0.21154460  0.16404547 1.0000000        0.44336719
## Alcalinity.of.ash -0.31023514  0.28850040 0.4433672        1.00000000
## Magnesium          0.27079823 -0.05457510 0.2865867       -0.08333309
## Total.phenols      0.28910112 -0.33516700 0.1289795       -0.32111332
##                     Magnesium Total.phenols Flavanoids
## Alcohol            0.27079823     0.2891011  0.2368149
## Malic.acid        -0.05457510    -0.3351670 -0.4110066
## Ash                0.28658669     0.1289795  0.1150773
## Alcalinity.of.ash -0.08333309    -0.3211133 -0.3513699
## Magnesium          1.00000000     0.2144012  0.1957838
## Total.phenols      0.21440123     1.0000000  0.8645635
##                   Nonflavanoid.phenols Proanthocyanisns Color.intensity
## Alcohol                     -0.1559295      0.136697912      0.54636420
## Malic.acid                   0.2929771     -0.220746187      0.24898534
## Ash                          0.1862304      0.009651935      0.25888726
## Alcalinity.of.ash            0.3619217     -0.197326836      0.01873198
## Magnesium                   -0.2562940      0.236440610      0.19995001
## Total.phenols               -0.4499353      0.612413084     -0.05513642
##                           Hue        OD280    Proline log_Magnesium
## Alcohol           -0.07174720  0.072343187  0.6437200    0.29854282
## Malic.acid        -0.56129569 -0.368710428 -0.1920106   -0.04113269
## Ash               -0.07466689  0.003911231  0.2236263    0.31266782
## Alcalinity.of.ash -0.27395522 -0.276768549 -0.4405969   -0.09312921
## Magnesium          0.05539820  0.066003936  0.3933508    0.99456693
## Total.phenols      0.43368134  0.699949365  0.4981149    0.22562548
##                   bc_Magnesium
## Alcohol             0.32695674
## Malic.acid         -0.02281699
## Ash                 0.33699248
## Alcalinity.of.ash  -0.10537018
## Magnesium           0.97164493
## Total.phenols       0.23462843
library(corrplot)
corrplot(cor_mat, order = 'hclust') # plot the correlation matrix, ordering variables by hierarchical clustering

library(caret)
highCor <- findCorrelation(cor_mat, cutoff = 0.9, names = T)
highCor
## [1] "bc_Magnesium" "Magnesium"
wwdata <- subset(wdata, select = -c(Magnesium, log_Magnesium)) # use subset() to drop the highly correlated variables
head(wwdata)
##   Class Alcohol Malic.acid  Ash Alcalinity.of.ash Total.phenols Flavanoids
## 1    w1   14.23       1.71 2.43              15.6          2.80       3.06
## 2    w1   13.20       1.78 2.14              11.2          2.65       2.76
## 3    w1   13.16       2.36 2.67              18.6          2.80       3.24
## 4    w1   14.37       1.95 2.50              16.8          3.85       3.49
## 5    w1   13.24       2.59 2.87              21.0          2.80       2.69
## 6    w1   14.20       1.76 2.45              15.2          3.27       3.39
##   Nonflavanoid.phenols Proanthocyanisns Color.intensity  Hue OD280 Proline
## 1                 0.28             2.29            5.64 1.04  3.92    1065
## 2                 0.26             1.28            4.38 1.05  3.40    1050
## 3                 0.30             2.81            5.68 1.03  3.17    1185
## 4                 0.24             2.18            7.80 0.86  3.45    1480
## 5                 0.39             1.82            4.32 1.04  2.93     735
## 6                 0.34             1.97            6.75 1.05  2.85    1450
##   bc_Magnesium
## 1    0.7134756
## 2    0.7131536
## 3    0.7131693
## 4    0.7133317
## 5    0.7133878
## 6    0.7133197

5. The n < p case (when n < p, use dimension-reduction techniques)

trans <- preProcess(wdata[,-1], method = c('knnImpute', 'center', 'scale', 'BoxCox', 'pca'), pcaComp = 4)
# preProcess() can apply all of the methods used so far in a single step
# handle missing values, normalize by centering & scaling, apply Box-Cox, then run PCA
# reduce the dimension by keeping 4 PCA components
head(trans)
## $dim
## [1] 178  15
## 
## $bc
## $bc$Alcohol
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   11.03   12.36   13.05   13.00   13.68   14.83 
## 
## Largest/Smallest: 1.34 
## Sample Skewness: -0.0506 
## 
## Estimated Lambda: 1.3 
## 
## 
## $bc$Malic.acid
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   0.740   1.603   1.865   2.336   3.083   5.800 
## 
## Largest/Smallest: 7.84 
## Sample Skewness: 1.02 
## 
## Estimated Lambda: -0.3 
## 
## 
## $bc$Ash
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   1.360   2.210   2.360   2.367   2.558   3.230 
## 
## Largest/Smallest: 2.38 
## Sample Skewness: -0.174 
## 
## Estimated Lambda: 1.4 
## 
## 
## $bc$Alcalinity.of.ash
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   10.60   17.20   19.50   19.49   21.50   30.00 
## 
## Largest/Smallest: 2.83 
## Sample Skewness: 0.209 
## 
## Estimated Lambda: 0.7 
## 
## 
## $bc$Magnesium
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   70.00   88.00   98.00   99.74  107.00  162.00 
## 
## Largest/Smallest: 2.31 
## Sample Skewness: 1.08 
## 
## Estimated Lambda: -1.4 
## 
## 
## $bc$Total.phenols
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   0.980   1.742   2.355   2.295   2.800   3.880 
## 
## Largest/Smallest: 3.96 
## Sample Skewness: 0.0852 
## 
## Estimated Lambda: 0.7 
## 
## 
## $bc$Flavanoids
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   0.340   1.205   2.135   2.029   2.875   5.080 
## 
## Largest/Smallest: 14.9 
## Sample Skewness: 0.0249 
## 
## Estimated Lambda: 0.7 
## 
## 
## $bc$Nonflavanoid.phenols
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##  0.1300  0.2700  0.3400  0.3619  0.4375  0.6600 
## 
## Largest/Smallest: 5.08 
## Sample Skewness: 0.443 
## 
## Estimated Lambda: 0.3 
## 
## 
## $bc$Proanthocyanisns
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   0.410   1.250   1.555   1.591   1.950   3.580 
## 
## Largest/Smallest: 8.73 
## Sample Skewness: 0.508 
## 
## Estimated Lambda: 0.6 
## 
## 
## $bc$Color.intensity
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   1.280   3.220   4.690   5.058   6.200  13.000 
## 
## Largest/Smallest: 10.2 
## Sample Skewness: 0.854 
## 
## Estimated Lambda: 0.1 
## With fudge factor, Lambda = 0 will be used for transformations
## 
## 
## $bc$Hue
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##  0.4800  0.7825  0.9650  0.9574  1.1200  1.7100 
## 
## Largest/Smallest: 3.56 
## Sample Skewness: 0.0207 
## 
## Estimated Lambda: 0.9 
## With fudge factor, no transformation is applied
## 
## 
## $bc$OD280
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   1.270   1.938   2.780   2.612   3.170   4.000 
## 
## Largest/Smallest: 3.15 
## Sample Skewness: -0.302 
## 
## Estimated Lambda: 1.4 
## 
## 
## $bc$Proline
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   278.0   500.5   673.5   746.9   985.0  1680.0 
## 
## Largest/Smallest: 6.04 
## Sample Skewness: 0.755 
## 
## Estimated Lambda: -0.1 
## With fudge factor, Lambda = 0 will be used for transformations
## 
## 
## $bc$log_Magnesium
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   4.248   4.477   4.585   4.593   4.673   5.088 
## 
## Largest/Smallest: 1.2 
## Sample Skewness: 0.591 
## 
## Estimated Lambda: -2 
## 
## 
## $bc$bc_Magnesium
## Box-Cox Transformation
## 
## 178 data points used to estimate Lambda
## 
## Input data summary:
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##  0.7124  0.7129  0.7131  0.7131  0.7133  0.7137 
## 
## Largest/Smallest: 1 
## Sample Skewness: 0.0175 
## 
## Estimated Lambda: -2 
## 
## 
## 
## $yj
## NULL
## 
## $et
## NULL
## 
## $invHyperbolicSine
## NULL
## 
## $mean
##              Alcohol           Malic.acid                  Ash 
##           20.8349902            0.6436856            1.6804717 
##    Alcalinity.of.ash            Magnesium        Total.phenols 
##            9.9608513            0.7131139            1.1064097 
##           Flavanoids Nonflavanoid.phenols     Proanthocyanisns 
##            0.8477921           -0.9068837            0.5003586 
##      Color.intensity                  Hue                OD280 
##            1.5180315            0.9574494            2.0825291 
##              Proline        log_Magnesium         bc_Magnesium 
##            6.5303028            0.4762375           -0.4832237
ppred <- predict(trans, wdata[,-1])
head(ppred)
##         PC1         PC2        PC3        PC4
## 1 -4.292639 -0.89344254 -0.0719308  0.3523860
## 2 -2.004350  1.01652954 -1.5158263  1.4783491
## 3 -2.449463  0.03447965 -0.1929337 -1.5525552
## 4 -4.237889 -0.93565834 -1.3063505 -0.8888534
## 5 -1.976779 -1.45633401  1.6768344 -1.1496907
## 6 -3.474553 -0.86015023 -1.2307021 -0.1180490

Data Splitting 1

set.seed(1234)

length_data <- dim(wdata)[1]
length_data
## [1] 178
TrainingIndex <- sample(1:length_data, round(length_data*0.8)) # draw 80% of the data
head(TrainingIndex)
## [1]  21 111 108 110 150 177
  • A way to split the data without using caret functions.

Data Splitting 2 (caret::createDataPartition)

library(caret)

set.seed(1234)

TrainingIndex <- createDataPartition(y = wdata$Class, p = 0.8, times = 1, list = F) 
# y : the outcome used for stratified splitting, p : fraction of the data to use, times : how many partitions to create, list : whether to return a list

Train_dat <- wdata[TrainingIndex,]
Test_dat <- wdata[-TrainingIndex,]
head(Train_dat)
##   Class Alcohol Malic.acid  Ash Alcalinity.of.ash Magnesium Total.phenols
## 1    w1   14.23       1.71 2.43              15.6       127          2.80
## 2    w1   13.20       1.78 2.14              11.2       100          2.65
## 5    w1   13.24       2.59 2.87              21.0       118          2.80
## 6    w1   14.20       1.76 2.45              15.2       112          3.27
## 7    w1   14.39       1.87 2.45              14.6        96          2.50
## 8    w1   14.06       2.15 2.61              17.6       121          2.60
##   Flavanoids Nonflavanoid.phenols Proanthocyanisns Color.intensity  Hue
## 1       3.06                 0.28             2.29            5.64 1.04
## 2       2.76                 0.26             1.28            4.38 1.05
## 5       2.69                 0.39             1.82            4.32 1.04
## 6       3.39                 0.34             1.97            6.75 1.05
## 7       2.52                 0.30             1.98            5.25 1.02
## 8       2.51                 0.31             1.25            5.05 1.06
##   OD280 Proline log_Magnesium bc_Magnesium
## 1  3.92    1065      4.844187    0.7134756
## 2  3.40    1050      4.605170    0.7131536
## 5  2.93     735      4.770685    0.7133878
## 6  2.85    1450      4.718499    0.7133197
## 7  3.58    1290      4.564348    0.7130871
## 8  3.58    1295      4.795791    0.7134188
head(Test_dat)
##    Class Alcohol Malic.acid  Ash Alcalinity.of.ash Magnesium Total.phenols
## 3     w1   13.16       2.36 2.67              18.6       101          2.80
## 4     w1   14.37       1.95 2.50              16.8       113          3.85
## 15    w1   14.38       1.87 2.38              12.0       102          3.30
## 16    w1   13.63       1.81 2.70              17.2       112          2.85
## 24    w1   12.85       1.60 2.52              17.8        95          2.48
## 31    w1   13.73       1.50 2.70              22.5       101          3.00
##    Flavanoids Nonflavanoid.phenols Proanthocyanisns Color.intensity  Hue
## 3        3.24                 0.30             2.81            5.68 1.03
## 4        3.49                 0.24             2.18            7.80 0.86
## 15       3.64                 0.29             2.96            7.50 1.20
## 16       2.91                 0.30             1.46            7.30 1.28
## 24       2.37                 0.26             1.46            3.93 1.09
## 31       3.25                 0.29             2.38            5.70 1.19
##    OD280 Proline log_Magnesium bc_Magnesium
## 3   3.17    1185      4.615121    0.7131693
## 4   3.45    1480      4.727388    0.7133317
## 15  3.00    1547      4.624973    0.7131846
## 16  2.88    1310      4.718499    0.7133197
## 24  3.63    1015      4.553877    0.7130694
## 31  2.71    1285      4.615121    0.7131693

Data Splitting 3 (caret::maxDissim)

tmp <- sample(1:dim(wdata)[1], 10) # select just 10 of the 178 rows

base <- wdata[tmp,] # the reference data against which the most dissimilar observations are selected
pool <- wdata[-tmp,]

maxdiss <- maxDissim(base, pool, n = 140)
# collect the observations that are farthest from (most dissimilar to) the data we already have = maximum dissimilarity sampling
head(maxdiss)
## [1] 16 33 25 52  6 17
train_data <- wdata[maxdiss,]
test_data <- wdata[-maxdiss,]
  • Not well suited to very large data sets.

Data Splitting 4 (caret::createFolds)

Fold_5 <- createFolds(wdata$Class, k = 5, list = T) 
# create 5 folds and return them as a list
# building train/test sets per fold and averaging the results gives a more reliable estimate
Fold_5
## $Fold1
##  [1]  12  13  23  25  28  29  32  34  39  46  57  68  69  75  76  79  84
## [18]  91  93  99 103 112 114 118 128 135 141 144 159 163 164 168 173 176
## 
## $Fold2
##  [1]   9  14  15  22  27  33  47  51  54  55  56  58  67  74  77  82  95
## [18] 101 102 104 105 108 110 116 119 122 132 137 142 145 149 158 160 161
## [35] 169 172
## 
## $Fold3
##  [1]   1   2   6  19  20  21  26  40  42  43  48  52  63  64  65  72  73
## [18]  89  90  96 109 115 117 123 126 129 133 138 146 147 151 154 157 162
## [35] 167 170
## 
## $Fold4
##  [1]   5   8  10  16  24  30  31  38  44  45  49  53  66  70  71  83  87
## [18]  88  92 100 106 107 113 120 121 124 127 140 148 153 166 171 174 175
## [35] 177 178
## 
## $Fold5
##  [1]   3   4   7  11  17  18  35  36  37  41  50  59  60  61  62  78  80
## [18]  81  85  86  94  97  98 111 125 130 131 134 136 139 143 150 152 155
## [35] 156 165
test_data_fold <- wdata[Fold_5[[1]],]
head(test_data_fold)
##    Class Alcohol Malic.acid  Ash Alcalinity.of.ash Magnesium Total.phenols
## 12    w1   14.12       1.48 2.32              16.8        95          2.20
## 13    w1   13.75       1.73 2.41              16.0        89          2.60
## 23    w1   13.71       1.86 2.36              16.6       101          2.61
## 25    w1   13.50       1.81 2.61              20.0        96          2.53
## 28    w1   13.30       1.72 2.14              17.0        94          2.40
## 29    w1   13.87       1.90 2.80              19.4       107          2.95
##    Flavanoids Nonflavanoid.phenols Proanthocyanisns Color.intensity  Hue
## 12       2.43                 0.26             1.57            5.00 1.17
## 13       2.76                 0.29             1.81            5.60 1.15
## 23       2.88                 0.27             1.69            3.80 1.11
## 25       2.61                 0.28             1.66            3.52 1.12
## 28       2.19                 0.27             1.35            3.95 1.02
## 29       2.97                 0.37             1.76            4.50 1.25
##    OD280 Proline log_Magnesium bc_Magnesium
## 12  2.82    1280      4.553877    0.7130694
## 13  2.90    1320      4.488636    0.7129530
## 23  4.00    1035      4.615121    0.7131693
## 25  3.82     845      4.564348    0.7130871
## 28  2.77    1285      4.543295    0.7130512
## 29  3.40     915      4.672829    0.7132560
train_data_fold <- wdata[-Fold_5[[1]],] 
head(train_data_fold)
##   Class Alcohol Malic.acid  Ash Alcalinity.of.ash Magnesium Total.phenols
## 1    w1   14.23       1.71 2.43              15.6       127          2.80
## 2    w1   13.20       1.78 2.14              11.2       100          2.65
## 3    w1   13.16       2.36 2.67              18.6       101          2.80
## 4    w1   14.37       1.95 2.50              16.8       113          3.85
## 5    w1   13.24       2.59 2.87              21.0       118          2.80
## 6    w1   14.20       1.76 2.45              15.2       112          3.27
##   Flavanoids Nonflavanoid.phenols Proanthocyanisns Color.intensity  Hue
## 1       3.06                 0.28             2.29            5.64 1.04
## 2       2.76                 0.26             1.28            4.38 1.05
## 3       3.24                 0.30             2.81            5.68 1.03
## 4       3.49                 0.24             2.18            7.80 0.86
## 5       2.69                 0.39             1.82            4.32 1.04
## 6       3.39                 0.34             1.97            6.75 1.05
##   OD280 Proline log_Magnesium bc_Magnesium
## 1  3.92    1065      4.844187    0.7134756
## 2  3.40    1050      4.605170    0.7131536
## 3  3.17    1185      4.615121    0.7131693
## 4  3.45    1480      4.727388    0.7133317
## 5  2.93     735      4.770685    0.7133878
## 6  2.85    1450      4.718499    0.7133197

Data Splitting 5 (caret::createResample)

resamp_dat <- createResample(wdata$Class, times = 1)
resamp_dat
## $Resample1
##   [1]   1   1   1   4   6   7   9   9   9  10  14  15  16  17  18  19  20
##  [18]  20  21  23  24  24  25  26  26  28  30  32  34  35  35  36  36  37
##  [35]  39  39  42  44  46  48  49  49  50  52  53  55  55  56  59  61  61
##  [52]  62  64  66  66  67  67  68  70  70  71  72  74  75  75  75  76  76
##  [69]  79  82  84  87  87  90  92  92  93  93  94  96  98  99 100 100 101
##  [86] 102 103 103 106 106 106 107 107 108 108 108 108 110 111 112 113 117
## [103] 117 118 118 118 118 118 119 119 120 121 122 123 123 124 126 127 129
## [120] 129 131 131 131 132 133 133 133 134 135 135 136 136 136 136 137 138
## [137] 139 139 139 140 140 142 143 144 146 146 148 149 151 153 155 157 157
## [154] 158 160 162 163 164 165 166 166 166 167 169 169 169 171 171 171 173
## [171] 174 174 175 175 176 176 178 178
# sampling with replacement is allowed (similar to bootstrapping)

testIndex <- !(1:dim(wdata)[1]) %in% unique(resamp_dat[[1]])
# unique() keeps only the non-duplicated indices
# %in% asks whether each of 1:dim(wdata)[1] appears in unique(resamp_dat[[1]]); the leading ! flags the rows that were NOT drawn

head(testIndex)
## [1] FALSE  TRUE  TRUE FALSE  TRUE FALSE
  • A method used when the data set is relatively small.

Chapter 6

Model Fitting 1 (Elastic Net)

  • Elastic Net

    • A regression model plus a small amount of bias added to the regression coefficients, slightly reducing the fit (a model that explains the training data a little less well).
    • Because of the variance the data can carry, this can predict better than the model fit most closely to the training data.

    - lambda and alpha are tuning parameters (parameters the analyst has to search over and adjust).
    - lambda : how much bias (penalty) to mix into the regression coefficients; alpha : the relative weight of the two kinds of penalty.
library(caret)

#control object
controlObject <- trainControl(method = "repeatedcv",
                              repeats = 5,
                              number = 10,
                              classProbs = T)
#trainControl : specifies what is controlled during training
  #repeatedcv = run cross-validation several times
  #repeats = how many times the cross-validation is repeated
  #number = how many folds the data are split into for cross-validation
  #classProbs = whether to return class probabilities in addition to predicted classes

#tuning parameters
Elst_Grid <- expand.grid(.alpha = seq(0,1,0.1), # 0 to 1 in steps of 0.1 gives 11 values
                         .lambda = seq(0.01, 1, length = 50)) # 50 values between 0.01 and 1
# a model is fit for every (lambda, alpha) pair
dim(Elst_Grid)
## [1] 550   2
Elst_Model <- train(Class ~ ., 
                    data = Train_dat, 
                    method = "glmnet",
                    tuneGrid = Elst_Grid,
                    preProc = c("center", "scale"),
                    metric = "Kappa",
                    trControl = controlObject)
sapply(Train_dat, class)
##                Class              Alcohol           Malic.acid 
##             "factor"            "numeric"            "numeric" 
##                  Ash    Alcalinity.of.ash            Magnesium 
##            "numeric"            "numeric"            "integer" 
##        Total.phenols           Flavanoids Nonflavanoid.phenols 
##            "numeric"            "numeric"            "numeric" 
##     Proanthocyanisns      Color.intensity                  Hue 
##            "numeric"            "numeric"            "numeric" 
##                OD280              Proline        log_Magnesium 
##            "numeric"            "integer"            "numeric" 
##         bc_Magnesium 
##            "numeric"
#Class ~ . (outcome ~ predictors); . means every other variable in the data is used as a predictor
#tuneGrid = how the tuning parameters are supplied; here a grid of all (lambda, alpha) pairs
#preProc = preprocessing applied to the data
#metric = the value used as the evaluation criterion
Elst_Model
## glmnet 
## 
## 144 samples
##  15 predictor
##   3 classes: 'w1', 'w2', 'w3' 
## 
## Pre-processing: centered (15), scaled (15) 
## Resampling: Cross-Validated (10 fold, repeated 5 times) 
## Summary of sample sizes: 130, 129, 130, 130, 129, 129, ... 
## Resampling results across tuning parameters:
## 
##   alpha  lambda      Accuracy   Kappa     
##   0.0    0.01000000  0.9764469  0.96438797
##   0.0    0.03020408  0.9764469  0.96438797
##   0.0    0.05040816  0.9750183  0.96225082
##   0.0    0.07061224  0.9763516  0.96434698
##   0.0    0.09081633  0.9750183  0.96237347
##   0.0    0.11102041  0.9736850  0.96036005
##   0.0    0.13122449  0.9738901  0.96064751
##   0.0    0.15142857  0.9753187  0.96281806
##   0.0    0.17163265  0.9767473  0.96498860
##   0.0    0.19183673  0.9767473  0.96498860
##   0.0    0.21204082  0.9767473  0.96498860
##   0.0    0.23224490  0.9767473  0.96498860
##   0.0    0.25244898  0.9767473  0.96498860
##   0.0    0.27265306  0.9767473  0.96498860
##   0.0    0.29285714  0.9767473  0.96498860
##   0.0    0.31306122  0.9753187  0.96288408
##   0.0    0.33326531  0.9753187  0.96288408
##   0.0    0.35346939  0.9753187  0.96288408
##   0.0    0.37367347  0.9753187  0.96288408
##   0.0    0.39387755  0.9753187  0.96288408
##   0.0    0.41408163  0.9738901  0.96058021
##   0.0    0.43428571  0.9738901  0.96058021
##   0.0    0.45448980  0.9738901  0.96058021
##   0.0    0.47469388  0.9738901  0.96058021
##   0.0    0.49489796  0.9738901  0.96058021
##   0.0    0.51510204  0.9738901  0.96058021
##   0.0    0.53530612  0.9738901  0.96058021
##   0.0    0.55551020  0.9738901  0.96058021
##   0.0    0.57571429  0.9738901  0.96058021
##   0.0    0.59591837  0.9738901  0.96058021
##   0.0    0.61612245  0.9738901  0.96058021
##   0.0    0.63632653  0.9738901  0.96058021
##   0.0    0.65653061  0.9725568  0.95852541
##   0.0    0.67673469  0.9725568  0.95852541
##   0.0    0.69693878  0.9725568  0.95852541
##   0.0    0.71714286  0.9725568  0.95852541
##   0.0    0.73734694  0.9725568  0.95852541
##   0.0    0.75755102  0.9725568  0.95852541
##   0.0    0.77775510  0.9712234  0.95647062
##   0.0    0.79795918  0.9712234  0.95647062
##   0.0    0.81816327  0.9712234  0.95647062
##   0.0    0.83836735  0.9712234  0.95647062
##   0.0    0.85857143  0.9712234  0.95647062
##   0.0    0.87877551  0.9697949  0.95431677
##   0.0    0.89897959  0.9697949  0.95431677
##   0.0    0.91918367  0.9697949  0.95431677
##   0.0    0.93938776  0.9697949  0.95431677
##   0.0    0.95959184  0.9711282  0.95631677
##   0.0    0.97979592  0.9711282  0.95631677
##   0.0    1.00000000  0.9711282  0.95631677
##   0.1    0.01000000  0.9750183  0.96225056
##   0.1    0.03020408  0.9820806  0.97296898
##   0.1    0.05040816  0.9849377  0.97731007
##   0.1    0.07061224  0.9849377  0.97727589
##   0.1    0.09081633  0.9836044  0.97526247
##   0.1    0.11102041  0.9836044  0.97526247
##   0.1    0.13122449  0.9836044  0.97526247
##   0.1    0.15142857  0.9822711  0.97328896
##   0.1    0.17163265  0.9809377  0.97127553
##   0.1    0.19183673  0.9809377  0.97127553
##   0.1    0.21204082  0.9796044  0.96926211
##   0.1    0.23224490  0.9796044  0.96926211
##   0.1    0.25244898  0.9796044  0.96926211
##   0.1    0.27265306  0.9796044  0.96926211
##   0.1    0.29285714  0.9796044  0.96926211
##   0.1    0.31306122  0.9796044  0.96926211
##   0.1    0.33326531  0.9796044  0.96926211
##   0.1    0.35346939  0.9796044  0.96926211
##   0.1    0.37367347  0.9796044  0.96926211
##   0.1    0.39387755  0.9796044  0.96926211
##   0.1    0.41408163  0.9796044  0.96926211
##   0.1    0.43428571  0.9796044  0.96926211
##   0.1    0.45448980  0.9796044  0.96926211
##   0.1    0.47469388  0.9796044  0.96926211
##   0.1    0.49489796  0.9796044  0.96926211
##   0.1    0.51510204  0.9796044  0.96926211
##   0.1    0.53530612  0.9796044  0.96926211
##   0.1    0.55551020  0.9796044  0.96926211
##   0.1    0.57571429  0.9796044  0.96926211
##   0.1    0.59591837  0.9795092  0.96909157
##   0.1    0.61612245  0.9808425  0.97105165
##   0.1    0.63632653  0.9808425  0.97105165
##   0.1    0.65653061  0.9808425  0.97105165
##   0.1    0.67673469  0.9808425  0.97105165
##   0.1    0.69693878  0.9808425  0.97105165
##   0.1    0.71714286  0.9794139  0.96888111
##   0.1    0.73734694  0.9793187  0.96868735
##   0.1    0.75755102  0.9806520  0.97068735
##   0.1    0.77775510  0.9806520  0.97068735
##   0.1    0.79795918  0.9792234  0.96848942
##   0.1    0.81816327  0.9792234  0.96848942
##   0.1    0.83836735  0.9807619  0.97079030
##   0.1    0.85857143  0.9780000  0.96658166
##   0.1    0.87877551  0.9763663  0.96406090
##   0.1    0.89897959  0.9736996  0.95992297
##   0.1    0.91918367  0.9710330  0.95581338
##   0.1    0.93938776  0.9683663  0.95168943
##   0.1    0.95959184  0.9670330  0.94960629
##   0.1    0.97979592  0.9670330  0.94960629
##   0.1    1.00000000  0.9654945  0.94726395
##   0.2    0.01000000  0.9777802  0.96644276
##   0.2    0.03020408  0.9863663  0.97948061
##   0.2    0.05040816  0.9863663  0.97948061
##   0.2    0.07061224  0.9849377  0.97727589
##   0.2    0.09081633  0.9849377  0.97727589
##   0.2    0.11102041  0.9836044  0.97526247
##   0.2    0.13122449  0.9836044  0.97526247
##   0.2    0.15142857  0.9836044  0.97526247
##   0.2    0.17163265  0.9836044  0.97526247
##   0.2    0.19183673  0.9836044  0.97526247
##   0.2    0.21204082  0.9836044  0.97526247
##   0.2    0.23224490  0.9836044  0.97526247
##   0.2    0.25244898  0.9836044  0.97526247
##   0.2    0.27265306  0.9836044  0.97526247
##   0.2    0.29285714  0.9836044  0.97526247
##   0.2    0.31306122  0.9836044  0.97526247
##   0.2    0.33326531  0.9835092  0.97510862
##   0.2    0.35346939  0.9835092  0.97506780
##   0.2    0.37367347  0.9848425  0.97708123
##   0.2    0.39387755  0.9834139  0.97492738
##   0.2    0.41408163  0.9778755  0.96644129
##   0.2    0.43428571  0.9805421  0.97040138
##   0.2    0.45448980  0.9777802  0.96613834
##   0.2    0.47469388  0.9748132  0.96157378
##   0.2    0.49489796  0.9750183  0.96177755
##   0.2    0.51510204  0.9735897  0.95960701
##   0.2    0.53530612  0.9666227  0.94885448
##   0.2    0.55551020  0.9623223  0.94230078
##   0.2    0.57571429  0.9623223  0.94230078
##   0.2    0.59591837  0.9623223  0.94230078
##   0.2    0.61612245  0.9623223  0.94230078
##   0.2    0.63632653  0.9593553  0.93773289
##   0.2    0.65653061  0.9564982  0.93321676
##   0.2    0.67673469  0.9524029  0.92695061
##   0.2    0.69693878  0.9496410  0.92271185
##   0.2    0.71714286  0.9425788  0.91146022
##   0.2    0.73734694  0.9331502  0.89684034
##   0.2    0.75755102  0.9304835  0.89260475
##   0.2    0.77775510  0.9235311  0.88153663
##   0.2    0.79795918  0.9195311  0.87520077
##   0.2    0.81816327  0.9111355  0.86186598
##   0.2    0.83836735  0.9042784  0.85125953
##   0.2    0.85857143  0.9001832  0.84479622
##   0.2    0.87877551  0.9001832  0.84479622
##   0.2    0.89897959  0.8934212  0.83422605
##   0.2    0.91918367  0.8906593  0.82983776
##   0.2    0.93938776  0.8892308  0.82741174
##   0.2    0.95959184  0.8821832  0.81648103
##   0.2    0.97979592  0.8726593  0.80137263
##   0.2    1.00000000  0.8631355  0.78617821
##   0.3    0.01000000  0.9819707  0.97283864
##   0.3    0.03020408  0.9863663  0.97948061
##   0.3    0.05040816  0.9863663  0.97948061
##   0.3    0.07061224  0.9863663  0.97948061
##   0.3    0.09081633  0.9849377  0.97727589
##   0.3    0.11102041  0.9849377  0.97727589
##   0.3    0.13122449  0.9849377  0.97727589
##   0.3    0.15142857  0.9849377  0.97727589
##   0.3    0.17163265  0.9835092  0.97512204
##   0.3    0.19183673  0.9835092  0.97512204
##   0.3    0.21204082  0.9821758  0.97308123
##   0.3    0.23224490  0.9795092  0.96899959
##   0.3    0.25244898  0.9723516  0.95799685
##   0.3    0.27265306  0.9723516  0.95799685
##   0.3    0.29285714  0.9721465  0.95761369
##   0.3    0.31306122  0.9749084  0.96181842
##   0.3    0.33326531  0.9734799  0.95958384
##   0.3    0.35346939  0.9690696  0.95283741
##   0.3    0.37367347  0.9678462  0.95088697
##   0.3    0.39387755  0.9664176  0.94869973
##   0.3    0.41408163  0.9623223  0.94235165
##   0.3    0.43428571  0.9609890  0.94031084
##   0.3    0.45448980  0.9609890  0.94031084
##   0.3    0.47469388  0.9595604  0.93803442
##   0.3    0.49489796  0.9538315  0.92918699
##   0.3    0.51510204  0.9469744  0.91833321
##   0.3    0.53530612  0.9426740  0.91164879
##   0.3    0.55551020  0.9371502  0.90304705
##   0.3    0.57571429  0.9290549  0.89002362
##   0.3    0.59591837  0.9208645  0.87728217
##   0.3    0.61612245  0.9153260  0.86830418
##   0.3    0.63632653  0.9098974  0.85992693
##   0.3    0.65653061  0.9032308  0.84942404
##   0.3    0.67673469  0.8977070  0.84078941
##   0.3    0.69693878  0.8907546  0.82988187
##   0.3    0.71714286  0.8780879  0.80997555
##   0.3    0.73734694  0.8686593  0.79503335
##   0.3    0.75755102  0.8533114  0.77068907
##   0.3    0.77775510  0.8319194  0.73614242
##   0.3    0.79795918  0.8152381  0.70858632
##   0.3    0.81816327  0.8043810  0.69111730
##   0.3    0.83836735  0.7897509  0.66817803
##   0.3    0.85857143  0.7563883  0.61364888
##   0.3    0.87877551  0.7113480  0.54009850
##   0.3    0.89897959  0.6602711  0.45487347
##   0.3    0.91918367  0.6298755  0.40365818
##   0.3    0.93938776  0.6021319  0.35664895
##   0.3    0.95959184  0.5908938  0.33745684
##   0.3    0.97979592  0.5588645  0.28376219
##   0.3    1.00000000  0.5134066  0.20582164
##   0.4    0.01000000  0.9819707  0.97287178
##   0.4    0.03020408  0.9863663  0.97948061
##   0.4    0.05040816  0.9863663  0.97948061
##   0.4    0.07061224  0.9863663  0.97948061
##   0.4    0.09081633  0.9863663  0.97948061
##   0.4    0.11102041  0.9849377  0.97732677
##   0.4    0.13122449  0.9849377  0.97732677
##   0.4    0.15142857  0.9795092  0.96898209
##   0.4    0.17163265  0.9780806  0.96675987
##   0.4    0.19183673  0.9737802  0.96020157
##   0.4    0.21204082  0.9722418  0.95781842
##   0.4    0.23224490  0.9708132  0.95562375
##   0.4    0.25244898  0.9690696  0.95283741
##   0.4    0.27265306  0.9676410  0.95068357
##   0.4    0.29285714  0.9691795  0.95298408
##   0.4    0.31306122  0.9664176  0.94878942
##   0.4    0.33326531  0.9664176  0.94878942
##   0.4    0.35346939  0.9650842  0.94669231
##   0.4    0.37367347  0.9609890  0.94036732
##   0.4    0.39387755  0.9609890  0.94036732
##   0.4    0.41408163  0.9609890  0.94036732
##   0.4    0.43428571  0.9583223  0.93620200
##   0.4    0.45448980  0.9499267  0.92309860
##   0.4    0.47469388  0.9386740  0.90526524
##   0.4    0.49489796  0.9288645  0.88975268
##   0.4    0.51510204  0.9178974  0.87271667
##   0.4    0.53530612  0.9057070  0.85353029
##   0.4    0.55551020  0.9003736  0.84511328
##   0.4    0.57571429  0.8823736  0.81682458
##   0.4    0.59591837  0.8627253  0.78557728
##   0.4    0.61612245  0.8405201  0.74995425
##   0.4    0.63632653  0.8194286  0.71539150
##   0.4    0.65653061  0.7973333  0.68013633
##   0.4    0.67673469  0.7436923  0.59329876
##   0.4    0.69693878  0.6922051  0.50786009
##   0.4    0.71714286  0.6356996  0.41335222
##   0.4    0.73734694  0.5992747  0.35114155
##   0.4    0.75755102  0.5809744  0.32101272
##   0.4    0.77775510  0.5191355  0.21548627
##   0.4    0.79795918  0.4757582  0.14158179
##   0.4    0.81816327  0.4358095  0.07160202
##   0.4    0.83836735  0.4163663  0.03649075
##   0.4    0.85857143  0.4037949  0.01444100
##   0.4    0.87877551  0.3956044  0.00000000
##   0.4    0.89897959  0.3956044  0.00000000
##   0.4    0.91918367  0.3956044  0.00000000
##   0.4    0.93938776  0.3956044  0.00000000
##   0.4    0.95959184  0.3956044  0.00000000
##   0.4    0.97979592  0.3956044  0.00000000
##   0.4    1.00000000  0.3956044  0.00000000
##   0.5    0.01000000  0.9806374  0.97087178
##   0.5    0.03020408  0.9848278  0.97717973
##   0.5    0.05040816  0.9863663  0.97948061
##   0.5    0.07061224  0.9849377  0.97732677
##   0.5    0.09081633  0.9836044  0.97528595
##   0.5    0.11102041  0.9794139  0.96880069
##   0.5    0.13122449  0.9780806  0.96675987
##   0.5    0.15142857  0.9722418  0.95785923
##   0.5    0.17163265  0.9681465  0.95162375
##   0.5    0.19183673  0.9677363  0.95087823
##   0.5    0.21204082  0.9677363  0.95083741
##   0.5    0.23224490  0.9623223  0.94263664
##   0.5    0.25244898  0.9608938  0.94048280
##   0.5    0.27265306  0.9608938  0.94048280
##   0.5    0.29285714  0.9623223  0.94259601
##   0.5    0.31306122  0.9650842  0.94674879
##   0.5    0.33326531  0.9650842  0.94674879
##   0.5    0.35346939  0.9609890  0.94036732
##   0.5    0.37367347  0.9609890  0.94036732
##   0.5    0.39387755  0.9568938  0.93403145
##   0.5    0.41408163  0.9459267  0.91676021
##   0.5    0.43428571  0.9319267  0.89483581
##   0.5    0.45448980  0.9124689  0.86409435
##   0.5    0.47469388  0.9016117  0.84692517
##   0.5    0.49489796  0.8640586  0.78787819
##   0.5    0.51510204  0.8375531  0.74536601
##   0.5    0.53530612  0.8139048  0.70666117
##   0.5    0.55551020  0.7562637  0.61364175
##   0.5    0.57571429  0.6826007  0.49209124
##   0.5    0.59591837  0.6166520  0.38059410
##   0.5    0.61612245  0.5937509  0.34148323
##   0.5    0.63632653  0.5312308  0.23612501
##   0.5    0.65653061  0.4870110  0.16121107
##   0.5    0.67673469  0.4357143  0.07114795
##   0.5    0.69693878  0.4052234  0.01724802
##   0.5    0.71714286  0.3956044  0.00000000
##   0.5    0.73734694  0.3956044  0.00000000
##   0.5    0.75755102  0.3956044  0.00000000
##   0.5    0.77775510  0.3956044  0.00000000
##   0.5    0.79795918  0.3956044  0.00000000
##   0.5    0.81816327  0.3956044  0.00000000
##   0.5    0.83836735  0.3956044  0.00000000
##   0.5    0.85857143  0.3956044  0.00000000
##   0.5    0.87877551  0.3956044  0.00000000
##   0.5    0.89897959  0.3956044  0.00000000
##   0.5    0.91918367  0.3956044  0.00000000
##   0.5    0.93938776  0.3956044  0.00000000
##   0.5    0.95959184  0.3956044  0.00000000
##   0.5    0.97979592  0.3956044  0.00000000
##   0.5    1.00000000  0.3956044  0.00000000
##   0.6    0.01000000  0.9806374  0.97087178
##   0.6    0.03020408  0.9849377  0.97731007
##   0.6    0.05040816  0.9835092  0.97515622
##   0.6    0.07061224  0.9808425  0.97102291
##   0.6    0.09081633  0.9779853  0.96663015
##   0.6    0.11102041  0.9738901  0.96039467
##   0.6    0.13122449  0.9693846  0.95353484
##   0.6    0.15142857  0.9622125  0.94259482
##   0.6    0.17163265  0.9621172  0.94236025
##   0.6    0.19183673  0.9593553  0.93818228
##   0.6    0.21204082  0.9595604  0.93849677
##   0.6    0.23224490  0.9595604  0.93844217
##   0.6    0.25244898  0.9595604  0.93844217
##   0.6    0.27265306  0.9595604  0.93844217
##   0.6    0.29285714  0.9595604  0.93844217
##   0.6    0.31306122  0.9623223  0.94249783
##   0.6    0.33326531  0.9583223  0.93625773
##   0.6    0.35346939  0.9528938  0.92788105
##   0.6    0.37367347  0.9402125  0.90788560
##   0.6    0.39387755  0.9234212  0.88156630
##   0.6    0.41408163  0.8943590  0.83560225
##   0.6    0.43428571  0.8572967  0.77690644
##   0.6    0.45448980  0.8096044  0.69988815
##   0.6    0.47469388  0.7505348  0.60404430
##   0.6    0.49489796  0.6607912  0.45496458
##   0.6    0.51510204  0.6124762  0.37290087
##   0.6    0.53530612  0.5619267  0.28879807
##   0.6    0.55551020  0.4998974  0.18342904
##   0.6    0.57571429  0.4425714  0.08359287
##   0.6    0.59591837  0.4052234  0.01724802
##   0.6    0.61612245  0.3956044  0.00000000
##   0.6    0.63632653  0.3956044  0.00000000
##   0.6    0.65653061  0.3956044  0.00000000
##   0.6    0.67673469  0.3956044  0.00000000
##   0.6    0.69693878  0.3956044  0.00000000
##   0.6    0.71714286  0.3956044  0.00000000
##   0.6    0.73734694  0.3956044  0.00000000
##   0.6    0.75755102  0.3956044  0.00000000
##   0.6    0.77775510  0.3956044  0.00000000
##   0.6    0.79795918  0.3956044  0.00000000
##   0.6    0.81816327  0.3956044  0.00000000
##   0.6    0.83836735  0.3956044  0.00000000
##   0.6    0.85857143  0.3956044  0.00000000
##   0.6    0.87877551  0.3956044  0.00000000
##   0.6    0.89897959  0.3956044  0.00000000
##   0.6    0.91918367  0.3956044  0.00000000
##   0.6    0.93938776  0.3956044  0.00000000
##   0.6    0.95959184  0.3956044  0.00000000
##   0.6    0.97979592  0.3956044  0.00000000
##   0.6    1.00000000  0.3956044  0.00000000
##   0.7    0.01000000  0.9806374  0.97087178
##   0.7    0.03020408  0.9849377  0.97731007
##   0.7    0.05040816  0.9794139  0.96885237
##   0.7    0.07061224  0.9751136  0.96232926
##   0.7    0.09081633  0.9681465  0.95176578
##   0.7    0.11102041  0.9596557  0.93890086
##   0.7    0.13122449  0.9565788  0.93415443
##   0.7    0.15142857  0.9510549  0.92573917
##   0.7    0.17163265  0.9497216  0.92369854
##   0.7    0.19183673  0.9510549  0.92569854
##   0.7    0.21204082  0.9525934  0.92799905
##   0.7    0.23224490  0.9566886  0.93415562
##   0.7    0.25244898  0.9582271  0.93645614
##   0.7    0.27265306  0.9595604  0.93844217
##   0.7    0.29285714  0.9554652  0.93213097
##   0.7    0.31306122  0.9527985  0.92790650
##   0.7    0.33326531  0.9443077  0.91440396
##   0.7    0.35346939  0.9209597  0.87770155
##   0.7    0.37367347  0.8721538  0.80067258
##   0.7    0.39387755  0.8093993  0.69987214
##   0.7    0.41408163  0.7389963  0.58473805
##   0.7    0.43428571  0.6445861  0.42664360
##   0.7    0.45448980  0.6095092  0.36762757
##   0.7    0.47469388  0.5380879  0.24876257
##   0.7    0.49489796  0.4787253  0.14712163
##   0.7    0.51510204  0.4079853  0.02182319
##   0.7    0.53530612  0.3956044  0.00000000
##   0.7    0.55551020  0.3956044  0.00000000
##   0.7    0.57571429  0.3956044  0.00000000
##   0.7    0.59591837  0.3956044  0.00000000
##   0.7    0.61612245  0.3956044  0.00000000
##   0.7    0.63632653  0.3956044  0.00000000
##   0.7    0.65653061  0.3956044  0.00000000
##   0.7    0.67673469  0.3956044  0.00000000
##   0.7    0.69693878  0.3956044  0.00000000
##   0.7    0.71714286  0.3956044  0.00000000
##   0.7    0.73734694  0.3956044  0.00000000
##   0.7    0.75755102  0.3956044  0.00000000
##   0.7    0.77775510  0.3956044  0.00000000
##   0.7    0.79795918  0.3956044  0.00000000
##   0.7    0.81816327  0.3956044  0.00000000
##   0.7    0.83836735  0.3956044  0.00000000
##   0.7    0.85857143  0.3956044  0.00000000
##   0.7    0.87877551  0.3956044  0.00000000
##   0.7    0.89897959  0.3956044  0.00000000
##   0.7    0.91918367  0.3956044  0.00000000
##   0.7    0.93938776  0.3956044  0.00000000
##   0.7    0.95959184  0.3956044  0.00000000
##   0.7    0.97979592  0.3956044  0.00000000
##   0.7    1.00000000  0.3956044  0.00000000
##   0.8    0.01000000  0.9833993  0.97500918
##   0.8    0.03020408  0.9819707  0.97285534
##   0.8    0.05040816  0.9736850  0.96019186
##   0.8    0.07061224  0.9723516  0.95819186
##   0.8    0.09081633  0.9582271  0.93676345
##   0.8    0.11102041  0.9524835  0.92803100
##   0.8    0.13122449  0.9469597  0.91965812
##   0.8    0.15142857  0.9456264  0.91761748
##   0.8    0.17163265  0.9470549  0.91973845
##   0.8    0.19183673  0.9470549  0.91973845
##   0.8    0.21204082  0.9483883  0.92173845
##   0.8    0.23224490  0.9525934  0.92799905
##   0.8    0.25244898  0.9553553  0.93216959
##   0.8    0.27265306  0.9484982  0.92158099
##   0.8    0.29285714  0.9432747  0.91328157
##   0.8    0.31306122  0.9266886  0.88706693
##   0.8    0.33326531  0.8747253  0.80469179
##   0.8    0.35346939  0.7911795  0.67038542
##   0.8    0.37367347  0.6992674  0.51829921
##   0.8    0.39387755  0.6374432  0.41415024
##   0.8    0.41408163  0.5896703  0.33524553
##   0.8    0.43428571  0.4984689  0.18127477
##   0.8    0.45448980  0.4094139  0.02453341
##   0.8    0.47469388  0.3956044  0.00000000
##   0.8    0.49489796  0.3956044  0.00000000
##   0.8    0.51510204  0.3956044  0.00000000
##   0.8    0.53530612  0.3956044  0.00000000
##   0.8    0.55551020  0.3956044  0.00000000
##   0.8    0.57571429  0.3956044  0.00000000
##   0.8    0.59591837  0.3956044  0.00000000
##   0.8    0.61612245  0.3956044  0.00000000
##   0.8    0.63632653  0.3956044  0.00000000
##   0.8    0.65653061  0.3956044  0.00000000
##   0.8    0.67673469  0.3956044  0.00000000
##   0.8    0.69693878  0.3956044  0.00000000
##   0.8    0.71714286  0.3956044  0.00000000
##   0.8    0.73734694  0.3956044  0.00000000
##   0.8    0.75755102  0.3956044  0.00000000
##   0.8    0.77775510  0.3956044  0.00000000
##   0.8    0.79795918  0.3956044  0.00000000
##   0.8    0.81816327  0.3956044  0.00000000
##   0.8    0.83836735  0.3956044  0.00000000
##   0.8    0.85857143  0.3956044  0.00000000
##   0.8    0.87877551  0.3956044  0.00000000
##   0.8    0.89897959  0.3956044  0.00000000
##   0.8    0.91918367  0.3956044  0.00000000
##   0.8    0.93938776  0.3956044  0.00000000
##   0.8    0.95959184  0.3956044  0.00000000
##   0.8    0.97979592  0.3956044  0.00000000
##   0.8    1.00000000  0.3956044  0.00000000
##   0.9    0.01000000  0.9806374  0.97087178
##   0.9    0.03020408  0.9736850  0.96019186
##   0.9    0.05040816  0.9736850  0.96019186
##   0.9    0.07061224  0.9582271  0.93680427
##   0.9    0.09081633  0.9524835  0.92803100
##   0.9    0.11102041  0.9469597  0.91965812
##   0.9    0.13122449  0.9456264  0.91761748
##   0.9    0.15142857  0.9440879  0.91535769
##   0.9    0.17163265  0.9440879  0.91535769
##   0.9    0.19183673  0.9456264  0.91761748
##   0.9    0.21204082  0.9498315  0.92391800
##   0.9    0.23224490  0.9512601  0.92593139
##   0.9    0.25244898  0.9457363  0.91744111
##   0.9    0.27265306  0.9361978  0.90235764
##   0.9    0.29285714  0.8958974  0.83868733
##   0.9    0.31306122  0.7984615  0.68262374
##   0.9    0.33326531  0.6951722  0.51215909
##   0.9    0.35346939  0.6377289  0.41493271
##   0.9    0.37367347  0.5965275  0.34687506
##   0.9    0.39387755  0.4927399  0.17154474
##   0.9    0.41408163  0.3996996  0.00751290
##   0.9    0.43428571  0.3956044  0.00000000
##   0.9    0.45448980  0.3956044  0.00000000
##   0.9    0.47469388  0.3956044  0.00000000
##   0.9    0.49489796  0.3956044  0.00000000
##   0.9    0.51510204  0.3956044  0.00000000
##   0.9    0.53530612  0.3956044  0.00000000
##   0.9    0.55551020  0.3956044  0.00000000
##   0.9    0.57571429  0.3956044  0.00000000
##   0.9    0.59591837  0.3956044  0.00000000
##   0.9    0.61612245  0.3956044  0.00000000
##   0.9    0.63632653  0.3956044  0.00000000
##   0.9    0.65653061  0.3956044  0.00000000
##   0.9    0.67673469  0.3956044  0.00000000
##   0.9    0.69693878  0.3956044  0.00000000
##   0.9    0.71714286  0.3956044  0.00000000
##   0.9    0.73734694  0.3956044  0.00000000
##   0.9    0.75755102  0.3956044  0.00000000
##   0.9    0.77775510  0.3956044  0.00000000
##   0.9    0.79795918  0.3956044  0.00000000
##   0.9    0.81816327  0.3956044  0.00000000
##   0.9    0.83836735  0.3956044  0.00000000
##   0.9    0.85857143  0.3956044  0.00000000
##   0.9    0.87877551  0.3956044  0.00000000
##   0.9    0.89897959  0.3956044  0.00000000
##   0.9    0.91918367  0.3956044  0.00000000
##   0.9    0.93938776  0.3956044  0.00000000
##   0.9    0.95959184  0.3956044  0.00000000
##   0.9    0.97979592  0.3956044  0.00000000
##   0.9    1.00000000  0.3956044  0.00000000
##   1.0    0.01000000  0.9793040  0.96887178
##   1.0    0.03020408  0.9736850  0.96019186
##   1.0    0.05040816  0.9694945  0.95388523
##   1.0    0.07061224  0.9524835  0.92807182
##   1.0    0.09081633  0.9481832  0.92157654
##   1.0    0.11102041  0.9440879  0.91535769
##   1.0    0.13122449  0.9426593  0.91322028
##   1.0    0.15142857  0.9411209  0.91100013
##   1.0    0.17163265  0.9425495  0.91313754
##   1.0    0.19183673  0.9454212  0.91735769
##   1.0    0.21204082  0.9469597  0.91959054
##   1.0    0.23224490  0.9401978  0.90895176
##   1.0    0.25244898  0.9205495  0.87793576
##   1.0    0.27265306  0.8717729  0.80061646
##   1.0    0.29285714  0.7590549  0.61829876
##   1.0    0.31306122  0.6417289  0.42220667
##   1.0    0.33326531  0.6249524  0.39360701
##   1.0    0.35346939  0.5303077  0.23630206
##   1.0    0.37367347  0.4025568  0.01195734
##   1.0    0.39387755  0.3956044  0.00000000
##   1.0    0.41408163  0.3956044  0.00000000
##   1.0    0.43428571  0.3956044  0.00000000
##   1.0    0.45448980  0.3956044  0.00000000
##   1.0    0.47469388  0.3956044  0.00000000
##   1.0    0.49489796  0.3956044  0.00000000
##   1.0    0.51510204  0.3956044  0.00000000
##   1.0    0.53530612  0.3956044  0.00000000
##   1.0    0.55551020  0.3956044  0.00000000
##   1.0    0.57571429  0.3956044  0.00000000
##   1.0    0.59591837  0.3956044  0.00000000
##   1.0    0.61612245  0.3956044  0.00000000
##   1.0    0.63632653  0.3956044  0.00000000
##   1.0    0.65653061  0.3956044  0.00000000
##   1.0    0.67673469  0.3956044  0.00000000
##   1.0    0.69693878  0.3956044  0.00000000
##   1.0    0.71714286  0.3956044  0.00000000
##   1.0    0.73734694  0.3956044  0.00000000
##   1.0    0.75755102  0.3956044  0.00000000
##   1.0    0.77775510  0.3956044  0.00000000
##   1.0    0.79795918  0.3956044  0.00000000
##   1.0    0.81816327  0.3956044  0.00000000
##   1.0    0.83836735  0.3956044  0.00000000
##   1.0    0.85857143  0.3956044  0.00000000
##   1.0    0.87877551  0.3956044  0.00000000
##   1.0    0.89897959  0.3956044  0.00000000
##   1.0    0.91918367  0.3956044  0.00000000
##   1.0    0.93938776  0.3956044  0.00000000
##   1.0    0.95959184  0.3956044  0.00000000
##   1.0    0.97979592  0.3956044  0.00000000
##   1.0    1.00000000  0.3956044  0.00000000
## 
## Kappa was used to select the optimal model using the largest value.
## The final values used for the model were alpha = 0.4 and lambda
##  = 0.09081633.
Elst_Model$results
##     alpha     lambda  Accuracy      Kappa AccuracySD    KappaSD
## 1     0.0 0.01000000 0.9764469 0.96438797 0.03583989 0.05408062
## 2     0.0 0.03020408 0.9764469 0.96438797 0.03583989 0.05408062
## 3     0.0 0.05040816 0.9750183 0.96225082 0.03906555 0.05891609
## 4     0.0 0.07061224 0.9763516 0.96434698 0.03875020 0.05834470
## 5     0.0 0.09081633 0.9750183 0.96237347 0.04132219 0.06211283
## 6     0.0 0.11102041 0.9736850 0.96036005 0.04157445 0.06249871
## 7     0.0 0.13122449 0.9738901 0.96064751 0.04134432 0.06217704
## 8     0.0 0.15142857 0.9753187 0.96281806 0.04097889 0.06160459
## 9     0.0 0.17163265 0.9767473 0.96498860 0.04055885 0.06094794
## 10    0.0 0.19183673 0.9767473 0.96498860 0.04055885 0.06094794
## 11    0.0 0.21204082 0.9767473 0.96498860 0.04055885 0.06094794
## 12    0.0 0.23224490 0.9767473 0.96498860 0.04055885 0.06094794
## 13    0.0 0.25244898 0.9767473 0.96498860 0.04055885 0.06094794
## 14    0.0 0.27265306 0.9767473 0.96498860 0.04055885 0.06094794
## 15    0.0 0.29285714 0.9767473 0.96498860 0.04055885 0.06094794
## 16    0.0 0.31306122 0.9753187 0.96288408 0.04577949 0.06863523
## 17    0.0 0.33326531 0.9753187 0.96288408 0.04577949 0.06863523
## 18    0.0 0.35346939 0.9753187 0.96288408 0.04577949 0.06863523
## 19    0.0 0.37367347 0.9753187 0.96288408 0.04577949 0.06863523
## 20    0.0 0.39387755 0.9753187 0.96288408 0.04577949 0.06863523
## 21    0.0 0.41408163 0.9738901 0.96058021 0.04134432 0.06237300
## 22    0.0 0.43428571 0.9738901 0.96058021 0.04134432 0.06237300
## 23    0.0 0.45448980 0.9738901 0.96058021 0.04134432 0.06237300
## 24    0.0 0.47469388 0.9738901 0.96058021 0.04134432 0.06237300
## 25    0.0 0.49489796 0.9738901 0.96058021 0.04134432 0.06237300
## 26    0.0 0.51510204 0.9738901 0.96058021 0.04134432 0.06237300
## 27    0.0 0.53530612 0.9738901 0.96058021 0.04134432 0.06237300
## 28    0.0 0.55551020 0.9738901 0.96058021 0.04134432 0.06237300
## 29    0.0 0.57571429 0.9738901 0.96058021 0.04134432 0.06237300
## 30    0.0 0.59591837 0.9738901 0.96058021 0.04134432 0.06237300
## 31    0.0 0.61612245 0.9738901 0.96058021 0.04134432 0.06237300
## 32    0.0 0.63632653 0.9738901 0.96058021 0.04134432 0.06237300
## 33    0.0 0.65653061 0.9725568 0.95852541 0.04155953 0.06273910
## 34    0.0 0.67673469 0.9725568 0.95852541 0.04155953 0.06273910
## 35    0.0 0.69693878 0.9725568 0.95852541 0.04155953 0.06273910
## 36    0.0 0.71714286 0.9725568 0.95852541 0.04155953 0.06273910
## 37    0.0 0.73734694 0.9725568 0.95852541 0.04155953 0.06273910
## 38    0.0 0.75755102 0.9725568 0.95852541 0.04155953 0.06273910
## 39    0.0 0.77775510 0.9712234 0.95647062 0.04173018 0.06303477
## 40    0.0 0.79795918 0.9712234 0.95647062 0.04173018 0.06303477
## 41    0.0 0.81816327 0.9712234 0.95647062 0.04173018 0.06303477
## 42    0.0 0.83836735 0.9712234 0.95647062 0.04173018 0.06303477
## 43    0.0 0.85857143 0.9712234 0.95647062 0.04173018 0.06303477
## 44    0.0 0.87877551 0.9697949 0.95431677 0.04194701 0.06335611
## 45    0.0 0.89897959 0.9697949 0.95431677 0.04194701 0.06335611
## 46    0.0 0.91918367 0.9697949 0.95431677 0.04194701 0.06335611
## 47    0.0 0.93938776 0.9697949 0.95431677 0.04194701 0.06335611
## 48    0.0 0.95959184 0.9711282 0.95631677 0.04182374 0.06318461
## 49    0.0 0.97979592 0.9711282 0.95631677 0.04182374 0.06318461
## 50    0.0 1.00000000 0.9711282 0.95631677 0.04182374 0.06318461
## 51    0.1 0.01000000 0.9750183 0.96225056 0.03630249 0.05475215
## 52    0.1 0.03020408 0.9820806 0.97296898 0.03341590 0.05031813
## 53    0.1 0.05040816 0.9849377 0.97731007 0.03169436 0.04765944
## 54    0.1 0.07061224 0.9849377 0.97727589 0.03169436 0.04772283
## 55    0.1 0.09081633 0.9836044 0.97526247 0.03244126 0.04885476
## 56    0.1 0.11102041 0.9836044 0.97526247 0.03244126 0.04885476
## 57    0.1 0.13122449 0.9836044 0.97526247 0.03244126 0.04885476
## 58    0.1 0.15142857 0.9822711 0.97328896 0.03575075 0.05370894
## 59    0.1 0.17163265 0.9809377 0.97127553 0.03631478 0.05456727
## 60    0.1 0.19183673 0.9809377 0.97127553 0.03631478 0.05456727
## 61    0.1 0.21204082 0.9796044 0.96926211 0.03682095 0.05533761
## 62    0.1 0.23224490 0.9796044 0.96926211 0.03682095 0.05533761
## 63    0.1 0.25244898 0.9796044 0.96926211 0.03682095 0.05533761
## 64    0.1 0.27265306 0.9796044 0.96926211 0.03682095 0.05533761
## 65    0.1 0.29285714 0.9796044 0.96926211 0.03682095 0.05533761
## 66    0.1 0.31306122 0.9796044 0.96926211 0.03682095 0.05533761
## 67    0.1 0.33326531 0.9796044 0.96926211 0.03682095 0.05533761
## 68    0.1 0.35346939 0.9796044 0.96926211 0.03682095 0.05533761
## 69    0.1 0.37367347 0.9796044 0.96926211 0.03682095 0.05533761
## 70    0.1 0.39387755 0.9796044 0.96926211 0.03682095 0.05533761
## 71    0.1 0.41408163 0.9796044 0.96926211 0.03682095 0.05533761
## 72    0.1 0.43428571 0.9796044 0.96926211 0.03682095 0.05533761
## 73    0.1 0.45448980 0.9796044 0.96926211 0.03682095 0.05533761
## 74    0.1 0.47469388 0.9796044 0.96926211 0.03682095 0.05533761
## 75    0.1 0.49489796 0.9796044 0.96926211 0.03682095 0.05533761
## 76    0.1 0.51510204 0.9796044 0.96926211 0.03682095 0.05533761
## 77    0.1 0.53530612 0.9796044 0.96926211 0.03682095 0.05533761
## 78    0.1 0.55551020 0.9796044 0.96926211 0.03682095 0.05533761
## 79    0.1 0.57571429 0.9796044 0.96926211 0.03682095 0.05533761
## 80    0.1 0.59591837 0.9795092 0.96909157 0.03694900 0.05556808
## 81    0.1 0.61612245 0.9808425 0.97105165 0.03386832 0.05107652
## 82    0.1 0.63632653 0.9808425 0.97105165 0.03386832 0.05107652
## 83    0.1 0.65653061 0.9808425 0.97105165 0.03386832 0.05107652
## 84    0.1 0.67673469 0.9808425 0.97105165 0.03386832 0.05107652
## 85    0.1 0.69693878 0.9808425 0.97105165 0.03386832 0.05107652
## 86    0.1 0.71714286 0.9794139 0.96888111 0.03454347 0.05211662
## 87    0.1 0.73734694 0.9793187 0.96868735 0.03195709 0.04839178
## 88    0.1 0.75755102 0.9806520 0.97068735 0.03138493 0.04755427
## 89    0.1 0.77775510 0.9806520 0.97068735 0.03138493 0.04755427
## 90    0.1 0.79795918 0.9792234 0.96848942 0.03210369 0.04869323
## 91    0.1 0.81816327 0.9792234 0.96848942 0.03210369 0.04869323
## 92    0.1 0.83836735 0.9807619 0.97079030 0.03118823 0.04736546
## 93    0.1 0.85857143 0.9780000 0.96658166 0.03549026 0.05385170
## 94    0.1 0.87877551 0.9763663 0.96406090 0.03629964 0.05513786
## 95    0.1 0.89897959 0.9736996 0.95992297 0.03930141 0.05994006
## 96    0.1 0.91918367 0.9710330 0.95581338 0.03969407 0.06058291
## 97    0.1 0.93938776 0.9683663 0.95168943 0.04211331 0.06443388
## 98    0.1 0.95959184 0.9670330 0.94960629 0.04424644 0.06782398
## 99    0.1 0.97979592 0.9670330 0.94960629 0.04424644 0.06782398
## 100   0.1 1.00000000 0.9654945 0.94726395 0.04441378 0.06807000
## 101   0.2 0.01000000 0.9777802 0.96644276 0.03544109 0.05342589
## 102   0.2 0.03020408 0.9863663 0.97948061 0.03069593 0.04611674
## 103   0.2 0.05040816 0.9863663 0.97948061 0.03069593 0.04611674
## 104   0.2 0.07061224 0.9849377 0.97727589 0.03169436 0.04772283
## 105   0.2 0.09081633 0.9849377 0.97727589 0.03169436 0.04772283
## 106   0.2 0.11102041 0.9836044 0.97526247 0.03244126 0.04885476
## 107   0.2 0.13122449 0.9836044 0.97526247 0.03244126 0.04885476
## 108   0.2 0.15142857 0.9836044 0.97526247 0.03244126 0.04885476
## 109   0.2 0.17163265 0.9836044 0.97526247 0.03244126 0.04885476
## 110   0.2 0.19183673 0.9836044 0.97526247 0.03244126 0.04885476
## 111   0.2 0.21204082 0.9836044 0.97526247 0.03244126 0.04885476
## 112   0.2 0.23224490 0.9836044 0.97526247 0.03244126 0.04885476
## 113   0.2 0.25244898 0.9836044 0.97526247 0.03244126 0.04885476
## 114   0.2 0.27265306 0.9836044 0.97526247 0.03244126 0.04885476
## 115   0.2 0.29285714 0.9836044 0.97526247 0.03244126 0.04885476
## 116   0.2 0.31306122 0.9836044 0.97526247 0.03244126 0.04885476
## 117   0.2 0.33326531 0.9835092 0.97510862 0.03259846 0.04910806
## 118   0.2 0.35346939 0.9835092 0.97506780 0.03259846 0.04917257
## 119   0.2 0.37367347 0.9848425 0.97708123 0.03185931 0.04805644
## 120   0.2 0.39387755 0.9834139 0.97492738 0.03579263 0.05398153
## 121   0.2 0.41408163 0.9778755 0.96644129 0.03814717 0.05769601
## 122   0.2 0.43428571 0.9805421 0.97040138 0.03472095 0.05273747
## 123   0.2 0.45448980 0.9777802 0.96613834 0.03581776 0.05449436
## 124   0.2 0.47469388 0.9748132 0.96157378 0.03693597 0.05625672
## 125   0.2 0.49489796 0.9750183 0.96177755 0.03906555 0.05976411
## 126   0.2 0.51510204 0.9735897 0.95960701 0.03943761 0.06031583
## 127   0.2 0.53530612 0.9666227 0.94885448 0.05145964 0.07878380
## 128   0.2 0.55551020 0.9623223 0.94230078 0.05348016 0.08180787
## 129   0.2 0.57571429 0.9623223 0.94230078 0.05348016 0.08180787
## 130   0.2 0.59591837 0.9623223 0.94230078 0.05348016 0.08180787
## 131   0.2 0.61612245 0.9623223 0.94230078 0.05348016 0.08180787
## 132   0.2 0.63632653 0.9593553 0.93773289 0.05336555 0.08164776
## 133   0.2 0.65653061 0.9564982 0.93321676 0.05494588 0.08433408
## 134   0.2 0.67673469 0.9524029 0.92695061 0.05407304 0.08298450
## 135   0.2 0.69693878 0.9496410 0.92271185 0.05499145 0.08437834
## 136   0.2 0.71714286 0.9425788 0.91146022 0.06097929 0.09460454
## 137   0.2 0.73734694 0.9331502 0.89684034 0.06587990 0.10201031
## 138   0.2 0.75755102 0.9304835 0.89260475 0.06982926 0.10822153
## 139   0.2 0.77775510 0.9235311 0.88153663 0.07436424 0.11630270
## 140   0.2 0.79795918 0.9195311 0.87520077 0.07672816 0.12006022
## 141   0.2 0.81816327 0.9111355 0.86186598 0.08240467 0.12962019
## 142   0.2 0.83836735 0.9042784 0.85125953 0.07963751 0.12552097
## 143   0.2 0.85857143 0.9001832 0.84479622 0.07874132 0.12417743
## 144   0.2 0.87877551 0.9001832 0.84479622 0.07874132 0.12417743
## 145   0.2 0.89897959 0.8934212 0.83422605 0.07505114 0.11858854
## 146   0.2 0.91918367 0.8906593 0.82983776 0.07348732 0.11616245
## 147   0.2 0.93938776 0.8892308 0.82741174 0.07483410 0.11858960
## 148   0.2 0.95959184 0.8821832 0.81648103 0.07522813 0.11897890
## 149   0.2 0.97979592 0.8726593 0.80137263 0.07755345 0.12266776
## 150   0.2 1.00000000 0.8631355 0.78617821 0.07730749 0.12256649
## 151   0.3 0.01000000 0.9819707 0.97283864 0.03360396 0.05054149
## 152   0.3 0.03020408 0.9863663 0.97948061 0.03069593 0.04611674
## 153   0.3 0.05040816 0.9863663 0.97948061 0.03069593 0.04611674
## 154   0.3 0.07061224 0.9863663 0.97948061 0.03069593 0.04611674
## 155   0.3 0.09081633 0.9849377 0.97727589 0.03169436 0.04772283
## 156   0.3 0.11102041 0.9849377 0.97727589 0.03169436 0.04772283
## 157   0.3 0.13122449 0.9849377 0.97727589 0.03169436 0.04772283
## 158   0.3 0.15142857 0.9849377 0.97727589 0.03169436 0.04772283
## 159   0.3 0.17163265 0.9835092 0.97512204 0.03259846 0.04908701
## 160   0.3 0.19183673 0.9835092 0.97512204 0.03259846 0.04908701
## 161   0.3 0.21204082 0.9821758 0.97308123 0.03326674 0.05014146
## 162   0.3 0.23224490 0.9795092 0.96899959 0.03440673 0.05194160
## 163   0.3 0.25244898 0.9723516 0.95799685 0.03955139 0.05991493
## 164   0.3 0.27265306 0.9723516 0.95799685 0.03955139 0.05991493
## 165   0.3 0.29285714 0.9721465 0.95761369 0.03978379 0.06034123
## 166   0.3 0.31306122 0.9749084 0.96181842 0.03920635 0.05945302
## 167   0.3 0.33326531 0.9734799 0.95958384 0.04243951 0.06444108
## 168   0.3 0.35346939 0.9690696 0.95283741 0.04855742 0.07380422
## 169   0.3 0.37367347 0.9678462 0.95088697 0.04788554 0.07302986
## 170   0.3 0.39387755 0.9664176 0.94869973 0.05009560 0.07638092
## 171   0.3 0.41408163 0.9623223 0.94235165 0.05348016 0.08177531
## 172   0.3 0.43428571 0.9609890 0.94031084 0.05335253 0.08158031
## 173   0.3 0.45448980 0.9609890 0.94031084 0.05335253 0.08158031
## 174   0.3 0.47469388 0.9595604 0.93803442 0.05324283 0.08146870
## 175   0.3 0.49489796 0.9538315 0.92918699 0.05601661 0.08590147
## 176   0.3 0.51510204 0.9469744 0.91833321 0.06232862 0.09636251
## 177   0.3 0.53530612 0.9426740 0.91164879 0.06096068 0.09428220
## 178   0.3 0.55551020 0.9371502 0.90304705 0.06645324 0.10286670
## 179   0.3 0.57571429 0.9290549 0.89002362 0.07591473 0.11907304
## 180   0.3 0.59591837 0.9208645 0.87728217 0.07636917 0.11966534
## 181   0.3 0.61612245 0.9153260 0.86830418 0.08397752 0.13203116
## 182   0.3 0.63632653 0.9098974 0.85992693 0.08149670 0.12822399
## 183   0.3 0.65653061 0.9032308 0.84942404 0.08314100 0.13087354
## 184   0.3 0.67673469 0.8977070 0.84078941 0.08232834 0.12959530
## 185   0.3 0.69693878 0.8907546 0.82988187 0.08362810 0.13126916
## 186   0.3 0.71714286 0.8780879 0.80997555 0.08158550 0.12848892
## 187   0.3 0.73734694 0.8686593 0.79503335 0.08316934 0.13116634
## 188   0.3 0.75755102 0.8533114 0.77068907 0.08124036 0.12819057
## 189   0.3 0.77775510 0.8319194 0.73614242 0.08341465 0.13330258
## 190   0.3 0.79795918 0.8152381 0.70858632 0.09512800 0.15316768
## 191   0.3 0.81816327 0.8043810 0.69111730 0.09683403 0.15599547
## 192   0.3 0.83836735 0.7897509 0.66817803 0.09736954 0.15563315
## 193   0.3 0.85857143 0.7563883 0.61364888 0.09884004 0.16019166
## 194   0.3 0.87877551 0.7113480 0.54009850 0.09451295 0.15358572
## 195   0.3 0.89897959 0.6602711 0.45487347 0.08784850 0.14306869
## 196   0.3 0.91918367 0.6298755 0.40365818 0.08240518 0.13360916
## 197   0.3 0.93938776 0.6021319 0.35664895 0.07224992 0.11412238
## 198   0.3 0.95959184 0.5908938 0.33745684 0.06466946 0.10041052
## 199   0.3 0.97979592 0.5588645 0.28376219 0.06828455 0.10649635
## 200   0.3 1.00000000 0.5134066 0.20582164 0.06716758 0.10396435
## 201   0.4 0.01000000 0.9819707 0.97287178 0.03360396 0.05048757
## 202   0.4 0.03020408 0.9863663 0.97948061 0.03069593 0.04611674
## 203   0.4 0.05040816 0.9863663 0.97948061 0.03069593 0.04611674
## 204   0.4 0.07061224 0.9863663 0.97948061 0.03069593 0.04611674
## 205   0.4 0.09081633 0.9863663 0.97948061 0.03069593 0.04611674
## 206   0.4 0.11102041 0.9849377 0.97732677 0.03169436 0.04762889
## 207   0.4 0.13122449 0.9849377 0.97732677 0.03169436 0.04762889
## 208   0.4 0.15142857 0.9795092 0.96898209 0.03440673 0.05196898
## 209   0.4 0.17163265 0.9780806 0.96675987 0.03501605 0.05298130
## 210   0.4 0.19183673 0.9737802 0.96020157 0.03922644 0.05937869
## 211   0.4 0.21204082 0.9722418 0.95781842 0.03968292 0.06012331
## 212   0.4 0.23224490 0.9708132 0.95562375 0.04485912 0.06791772
## 213   0.4 0.25244898 0.9690696 0.95283741 0.04855742 0.07380422
## 214   0.4 0.27265306 0.9676410 0.95068357 0.04867944 0.07397100
## 215   0.4 0.29285714 0.9691795 0.95298408 0.04589766 0.06984246
## 216   0.4 0.31306122 0.9664176 0.94878942 0.04604255 0.07004796
## 217   0.4 0.33326531 0.9664176 0.94878942 0.04604255 0.07004796
## 218   0.4 0.35346939 0.9650842 0.94669231 0.04794612 0.07310373
## 219   0.4 0.37367347 0.9609890 0.94036732 0.05335253 0.08144221
## 220   0.4 0.39387755 0.9609890 0.94036732 0.05335253 0.08144221
## 221   0.4 0.41408163 0.9609890 0.94036732 0.05335253 0.08144221
## 222   0.4 0.43428571 0.9583223 0.93620200 0.05631296 0.08609568
## 223   0.4 0.45448980 0.9499267 0.92309860 0.05787097 0.08870893
## 224   0.4 0.47469388 0.9386740 0.90526524 0.07223015 0.11206730
## 225   0.4 0.49489796 0.9288645 0.88975268 0.07608621 0.11909474
## 226   0.4 0.51510204 0.9178974 0.87271667 0.07796695 0.12202638
## 227   0.4 0.53530612 0.9057070 0.85353029 0.08373777 0.13130338
## 228   0.4 0.55551020 0.9003736 0.84511328 0.08711672 0.13658282
## 229   0.4 0.57571429 0.8823736 0.81682458 0.08209621 0.12920070
## 230   0.4 0.59591837 0.8627253 0.78557728 0.07768214 0.12295980
## 231   0.4 0.61612245 0.8405201 0.74995425 0.08337685 0.13278360
## 232   0.4 0.63632653 0.8194286 0.71539150 0.09452696 0.15189778
## 233   0.4 0.65653061 0.7973333 0.68013633 0.09940596 0.15890592
## 234   0.4 0.67673469 0.7436923 0.59329876 0.09146484 0.14783665
## 235   0.4 0.69693878 0.6922051 0.50786009 0.08456675 0.13882469
## 236   0.4 0.71714286 0.6356996 0.41335222 0.07746783 0.12607924
## 237   0.4 0.73734694 0.5992747 0.35114155 0.06731212 0.10706035
## 238   0.4 0.75755102 0.5809744 0.32101272 0.06966842 0.10879159
## 239   0.4 0.77775510 0.5191355 0.21548627 0.06975288 0.10922627
## 240   0.4 0.79795918 0.4757582 0.14158179 0.06988709 0.10443672
## 241   0.4 0.81816327 0.4358095 0.07160202 0.05466458 0.08044736
## 242   0.4 0.83836735 0.4163663 0.03649075 0.04483384 0.06698732
## 243   0.4 0.85857143 0.4037949 0.01444100 0.03286802 0.03963616
## 244   0.4 0.87877551 0.3956044 0.00000000 0.02336401 0.00000000
## 245   0.4 0.89897959 0.3956044 0.00000000 0.02336401 0.00000000
## 246   0.4 0.91918367 0.3956044 0.00000000 0.02336401 0.00000000
## 247   0.4 0.93938776 0.3956044 0.00000000 0.02336401 0.00000000
## 248   0.4 0.95959184 0.3956044 0.00000000 0.02336401 0.00000000
## 249   0.4 0.97979592 0.3956044 0.00000000 0.02336401 0.00000000
## 250   0.4 1.00000000 0.3956044 0.00000000 0.02336401 0.00000000
## 251   0.5 0.01000000 0.9806374 0.97087178 0.03419146 0.05136407
## 252   0.5 0.03020408 0.9848278 0.97717973 0.03190262 0.04790725
## 253   0.5 0.05040816 0.9863663 0.97948061 0.03069593 0.04611674
## 254   0.5 0.07061224 0.9849377 0.97732677 0.03169436 0.04762889
## 255   0.5 0.09081633 0.9836044 0.97528595 0.03244126 0.04880907
## 256   0.5 0.11102041 0.9794139 0.96880069 0.03454347 0.05223709
## 257   0.5 0.13122449 0.9780806 0.96675987 0.03501605 0.05298130
## 258   0.5 0.15142857 0.9722418 0.95785923 0.03968292 0.06008252
## 259   0.5 0.17163265 0.9681465 0.95162375 0.04502944 0.06813515
## 260   0.5 0.19183673 0.9677363 0.95087823 0.04860604 0.07382575
## 261   0.5 0.21204082 0.9677363 0.95083741 0.04860604 0.07385501
## 262   0.5 0.23224490 0.9623223 0.94263664 0.04812390 0.07319717
## 263   0.5 0.25244898 0.9608938 0.94048280 0.04804272 0.07305910
## 264   0.5 0.27265306 0.9608938 0.94048280 0.04804272 0.07305910
## 265   0.5 0.29285714 0.9623223 0.94259601 0.04997315 0.07603087
## 266   0.5 0.31306122 0.9650842 0.94674879 0.04794612 0.07294455
## 267   0.5 0.33326531 0.9650842 0.94674879 0.04794612 0.07294455
## 268   0.5 0.35346939 0.9609890 0.94036732 0.05335253 0.08144221
## 269   0.5 0.37367347 0.9609890 0.94036732 0.05335253 0.08144221
## 270   0.5 0.39387755 0.9568938 0.93403145 0.05613983 0.08582205
## 271   0.5 0.41408163 0.9459267 0.91676021 0.06426462 0.09913102
## 272   0.5 0.43428571 0.9319267 0.89483581 0.07340236 0.11372063
## 273   0.5 0.45448980 0.9124689 0.86409435 0.08517882 0.13318080
## 274   0.5 0.47469388 0.9016117 0.84692517 0.08185726 0.12845915
## 275   0.5 0.49489796 0.8640586 0.78787819 0.07561505 0.11955405
## 276   0.5 0.51510204 0.8375531 0.74536601 0.08282470 0.13127508
## 277   0.5 0.53530612 0.8139048 0.70666117 0.09500065 0.15240489
## 278   0.5 0.55551020 0.7562637 0.61364175 0.09264425 0.14942974
## 279   0.5 0.57571429 0.6826007 0.49209124 0.08548198 0.13877595
## 280   0.5 0.59591837 0.6166520 0.38059410 0.06878793 0.10943198
## 281   0.5 0.61612245 0.5937509 0.34148323 0.06184838 0.09841780
## 282   0.5 0.63632653 0.5312308 0.23612501 0.06975199 0.10886749
## 283   0.5 0.65653061 0.4870110 0.16121107 0.07429637 0.11215802
## 284   0.5 0.67673469 0.4357143 0.07114795 0.05383724 0.08011478
## 285   0.5 0.69693878 0.4052234 0.01724802 0.03542009 0.04338509
## 286   0.5 0.71714286 0.3956044 0.00000000 0.02336401 0.00000000
## 287   0.5 0.73734694 0.3956044 0.00000000 0.02336401 0.00000000
## 288   0.5 0.75755102 0.3956044 0.00000000 0.02336401 0.00000000
## 289   0.5 0.77775510 0.3956044 0.00000000 0.02336401 0.00000000
## 290   0.5 0.79795918 0.3956044 0.00000000 0.02336401 0.00000000
## 291   0.5 0.81816327 0.3956044 0.00000000 0.02336401 0.00000000
## 292   0.5 0.83836735 0.3956044 0.00000000 0.02336401 0.00000000
## 293   0.5 0.85857143 0.3956044 0.00000000 0.02336401 0.00000000
## 294   0.5 0.87877551 0.3956044 0.00000000 0.02336401 0.00000000
## 295   0.5 0.89897959 0.3956044 0.00000000 0.02336401 0.00000000
## 296   0.5 0.91918367 0.3956044 0.00000000 0.02336401 0.00000000
## 297   0.5 0.93938776 0.3956044 0.00000000 0.02336401 0.00000000
## 298   0.5 0.95959184 0.3956044 0.00000000 0.02336401 0.00000000
## 299   0.5 0.97979592 0.3956044 0.00000000 0.02336401 0.00000000
## 300   0.5 1.00000000 0.3956044 0.00000000 0.02336401 0.00000000
## 301   0.6 0.01000000 0.9806374 0.97087178 0.03419146 0.05136407
## 302   0.6 0.03020408 0.9849377 0.97731007 0.03169436 0.04765944
## 303   0.6 0.05040816 0.9835092 0.97515622 0.03259846 0.04902692
## 304   0.6 0.07061224 0.9808425 0.97102291 0.03386832 0.05111962
## 305   0.6 0.09081633 0.9779853 0.96663015 0.03514647 0.05316083
## 306   0.6 0.11102041 0.9738901 0.96039467 0.03908896 0.05913378
## 307   0.6 0.13122449 0.9693846 0.95353484 0.04504763 0.06813754
## 308   0.6 0.15142857 0.9622125 0.94259482 0.05241188 0.07948339
## 309   0.6 0.17163265 0.9621172 0.94236025 0.05071160 0.07704838
## 310   0.6 0.19183673 0.9593553 0.93818228 0.05044978 0.07663340
## 311   0.6 0.21204082 0.9595604 0.93849677 0.04971908 0.07554601
## 312   0.6 0.23224490 0.9595604 0.93844217 0.04971908 0.07561235
## 313   0.6 0.25244898 0.9595604 0.93844217 0.04971908 0.07561235
## 314   0.6 0.27265306 0.9595604 0.93844217 0.04971908 0.07561235
## 315   0.6 0.29285714 0.9595604 0.93844217 0.04971908 0.07561235
## 316   0.6 0.31306122 0.9623223 0.94249783 0.04970387 0.07575141
## 317   0.6 0.33326531 0.9583223 0.93625773 0.05467854 0.08352924
## 318   0.6 0.35346939 0.9528938 0.92788105 0.05689448 0.08702278
## 319   0.6 0.37367347 0.9402125 0.90788560 0.07688502 0.11840645
## 320   0.6 0.39387755 0.9234212 0.88156630 0.08130465 0.12581009
## 321   0.6 0.41408163 0.8943590 0.83560225 0.08340838 0.13069919
## 322   0.6 0.43428571 0.8572967 0.77690644 0.07695091 0.12222953
## 323   0.6 0.45448980 0.8096044 0.69988815 0.09350246 0.14972914
## 324   0.6 0.47469388 0.7505348 0.60404430 0.09199321 0.14857847
## 325   0.6 0.49489796 0.6607912 0.45496458 0.08137368 0.13200192
## 326   0.6 0.51510204 0.6124762 0.37290087 0.06026665 0.09656021
## 327   0.6 0.53530612 0.5619267 0.28879807 0.07139966 0.11029200
## 328   0.6 0.55551020 0.4998974 0.18342904 0.07168132 0.10751918
## 329   0.6 0.57571429 0.4425714 0.08359287 0.05996273 0.09028476
## 330   0.6 0.59591837 0.4052234 0.01724802 0.03542009 0.04338509
## 331   0.6 0.61612245 0.3956044 0.00000000 0.02336401 0.00000000
## 332   0.6 0.63632653 0.3956044 0.00000000 0.02336401 0.00000000
## 333   0.6 0.65653061 0.3956044 0.00000000 0.02336401 0.00000000
## 334   0.6 0.67673469 0.3956044 0.00000000 0.02336401 0.00000000
## 335   0.6 0.69693878 0.3956044 0.00000000 0.02336401 0.00000000
## 336   0.6 0.71714286 0.3956044 0.00000000 0.02336401 0.00000000
## 337   0.6 0.73734694 0.3956044 0.00000000 0.02336401 0.00000000
## 338   0.6 0.75755102 0.3956044 0.00000000 0.02336401 0.00000000
## 339   0.6 0.77775510 0.3956044 0.00000000 0.02336401 0.00000000
## 340   0.6 0.79795918 0.3956044 0.00000000 0.02336401 0.00000000
## 341   0.6 0.81816327 0.3956044 0.00000000 0.02336401 0.00000000
## 342   0.6 0.83836735 0.3956044 0.00000000 0.02336401 0.00000000
## 343   0.6 0.85857143 0.3956044 0.00000000 0.02336401 0.00000000
## 344   0.6 0.87877551 0.3956044 0.00000000 0.02336401 0.00000000
## 345   0.6 0.89897959 0.3956044 0.00000000 0.02336401 0.00000000
## 346   0.6 0.91918367 0.3956044 0.00000000 0.02336401 0.00000000
## 347   0.6 0.93938776 0.3956044 0.00000000 0.02336401 0.00000000
## 348   0.6 0.95959184 0.3956044 0.00000000 0.02336401 0.00000000
## 349   0.6 0.97979592 0.3956044 0.00000000 0.02336401 0.00000000
## 350   0.6 1.00000000 0.3956044 0.00000000 0.02336401 0.00000000
## 351   0.7 0.01000000 0.9806374 0.97087178 0.03419146 0.05136407
## 352   0.7 0.03020408 0.9849377 0.97731007 0.03169436 0.04765944
## 353   0.7 0.05040816 0.9794139 0.96885237 0.03454347 0.05215764
## 354   0.7 0.07061224 0.9751136 0.96232926 0.03618421 0.05466707
## 355   0.7 0.09081633 0.9681465 0.95176578 0.04510078 0.06813286
## 356   0.7 0.11102041 0.9596557 0.93890086 0.05145700 0.07780207
## 357   0.7 0.13122449 0.9565788 0.93415443 0.05576175 0.08444857
## 358   0.7 0.15142857 0.9510549 0.92573917 0.05457471 0.08272205
## 359   0.7 0.17163265 0.9497216 0.92369854 0.05581675 0.08462348
## 360   0.7 0.19183673 0.9510549 0.92569854 0.05621214 0.08523148
## 361   0.7 0.21204082 0.9525934 0.92799905 0.05430275 0.08238913
## 362   0.7 0.23224490 0.9566886 0.93415562 0.05348782 0.08120756
## 363   0.7 0.25244898 0.9582271 0.93645614 0.05130538 0.07796490
## 364   0.7 0.27265306 0.9595604 0.93844217 0.04971908 0.07561235
## 365   0.7 0.29285714 0.9554652 0.93213097 0.05452964 0.08302320
## 366   0.7 0.31306122 0.9527985 0.92790650 0.05531574 0.08440358
## 367   0.7 0.33326531 0.9443077 0.91440396 0.06745174 0.10352234
## 368   0.7 0.35346939 0.9209597 0.87770155 0.08205930 0.12699507
## 369   0.7 0.37367347 0.8721538 0.80067258 0.07986419 0.12668735
## 370   0.7 0.39387755 0.8093993 0.69987214 0.08929627 0.14238691
## 371   0.7 0.41408163 0.7389963 0.58473805 0.09095134 0.14818765
## 372   0.7 0.43428571 0.6445861 0.42664360 0.07888436 0.12695062
## 373   0.7 0.45448980 0.6095092 0.36762757 0.05797076 0.09310852
## 374   0.7 0.47469388 0.5380879 0.24876257 0.07344864 0.11121886
## 375   0.7 0.49489796 0.4787253 0.14712163 0.07565734 0.11487644
## 376   0.7 0.51510204 0.4079853 0.02182319 0.03587806 0.04726816
## 377   0.7 0.53530612 0.3956044 0.00000000 0.02336401 0.00000000
## 378   0.7 0.55551020 0.3956044 0.00000000 0.02336401 0.00000000
## 379   0.7 0.57571429 0.3956044 0.00000000 0.02336401 0.00000000
## 380   0.7 0.59591837 0.3956044 0.00000000 0.02336401 0.00000000
## 381   0.7 0.61612245 0.3956044 0.00000000 0.02336401 0.00000000
## 382   0.7 0.63632653 0.3956044 0.00000000 0.02336401 0.00000000
## 383   0.7 0.65653061 0.3956044 0.00000000 0.02336401 0.00000000
## 384   0.7 0.67673469 0.3956044 0.00000000 0.02336401 0.00000000
## 385   0.7 0.69693878 0.3956044 0.00000000 0.02336401 0.00000000
## 386   0.7 0.71714286 0.3956044 0.00000000 0.02336401 0.00000000
## 387   0.7 0.73734694 0.3956044 0.00000000 0.02336401 0.00000000
## 388   0.7 0.75755102 0.3956044 0.00000000 0.02336401 0.00000000
## 389   0.7 0.77775510 0.3956044 0.00000000 0.02336401 0.00000000
## 390   0.7 0.79795918 0.3956044 0.00000000 0.02336401 0.00000000
## 391   0.7 0.81816327 0.3956044 0.00000000 0.02336401 0.00000000
## 392   0.7 0.83836735 0.3956044 0.00000000 0.02336401 0.00000000
## 393   0.7 0.85857143 0.3956044 0.00000000 0.02336401 0.00000000
## 394   0.7 0.87877551 0.3956044 0.00000000 0.02336401 0.00000000
## 395   0.7 0.89897959 0.3956044 0.00000000 0.02336401 0.00000000
## 396   0.7 0.91918367 0.3956044 0.00000000 0.02336401 0.00000000
## 397   0.7 0.93938776 0.3956044 0.00000000 0.02336401 0.00000000
## 398   0.7 0.95959184 0.3956044 0.00000000 0.02336401 0.00000000
## 399   0.7 0.97979592 0.3956044 0.00000000 0.02336401 0.00000000
## 400   0.7 1.00000000 0.3956044 0.00000000 0.02336401 0.00000000
## 401   0.8 0.01000000 0.9833993 0.97500918 0.03279610 0.04929078
## 402   0.8 0.03020408 0.9819707 0.97285534 0.03360396 0.05051419
## 403   0.8 0.05040816 0.9736850 0.96019186 0.03658937 0.05525026
## 404   0.8 0.07061224 0.9723516 0.95819186 0.03682480 0.05558875
## 405   0.8 0.09081633 0.9582271 0.93676345 0.05130538 0.07755688
## 406   0.8 0.11102041 0.9524835 0.92803100 0.05653579 0.08557122
## 407   0.8 0.13122449 0.9469597 0.91965812 0.05681023 0.08591895
## 408   0.8 0.15142857 0.9456264 0.91761748 0.05790829 0.08760671
## 409   0.8 0.17163265 0.9470549 0.91973845 0.05654592 0.08562288
## 410   0.8 0.19183673 0.9470549 0.91973845 0.05654592 0.08562288
## 411   0.8 0.21204082 0.9483883 0.92173845 0.05699994 0.08631751
## 412   0.8 0.23224490 0.9525934 0.92799905 0.05430275 0.08238913
## 413   0.8 0.25244898 0.9553553 0.93216959 0.05489433 0.08329821
## 414   0.8 0.27265306 0.9484982 0.92158099 0.06337819 0.09628703
## 415   0.8 0.29285714 0.9432747 0.91328157 0.06229423 0.09503839
## 416   0.8 0.31306122 0.9266886 0.88706693 0.07691208 0.11840751
## 417   0.8 0.33326531 0.8747253 0.80469179 0.08570090 0.13545450
## 418   0.8 0.35346939 0.7911795 0.67038542 0.10027565 0.16000178
## 419   0.8 0.37367347 0.6992674 0.51829921 0.08802997 0.14477173
## 420   0.8 0.39387755 0.6374432 0.41415024 0.06165264 0.09851161
## 421   0.8 0.41408163 0.5896703 0.33524553 0.06874850 0.10737661
## 422   0.8 0.43428571 0.4984689 0.18127477 0.07131190 0.10505704
## 423   0.8 0.45448980 0.4094139 0.02453341 0.04071290 0.05707127
## 424   0.8 0.47469388 0.3956044 0.00000000 0.02336401 0.00000000
## 425   0.8 0.49489796 0.3956044 0.00000000 0.02336401 0.00000000
## 426   0.8 0.51510204 0.3956044 0.00000000 0.02336401 0.00000000
## 427   0.8 0.53530612 0.3956044 0.00000000 0.02336401 0.00000000
## 428   0.8 0.55551020 0.3956044 0.00000000 0.02336401 0.00000000
## 429   0.8 0.57571429 0.3956044 0.00000000 0.02336401 0.00000000
## 430   0.8 0.59591837 0.3956044 0.00000000 0.02336401 0.00000000
## 431   0.8 0.61612245 0.3956044 0.00000000 0.02336401 0.00000000
## 432   0.8 0.63632653 0.3956044 0.00000000 0.02336401 0.00000000
## 433   0.8 0.65653061 0.3956044 0.00000000 0.02336401 0.00000000
## 434   0.8 0.67673469 0.3956044 0.00000000 0.02336401 0.00000000
## 435   0.8 0.69693878 0.3956044 0.00000000 0.02336401 0.00000000
## 436   0.8 0.71714286 0.3956044 0.00000000 0.02336401 0.00000000
## 437   0.8 0.73734694 0.3956044 0.00000000 0.02336401 0.00000000
## 438   0.8 0.75755102 0.3956044 0.00000000 0.02336401 0.00000000
## 439   0.8 0.77775510 0.3956044 0.00000000 0.02336401 0.00000000
## 440   0.8 0.79795918 0.3956044 0.00000000 0.02336401 0.00000000
## 441   0.8 0.81816327 0.3956044 0.00000000 0.02336401 0.00000000
## 442   0.8 0.83836735 0.3956044 0.00000000 0.02336401 0.00000000
## 443   0.8 0.85857143 0.3956044 0.00000000 0.02336401 0.00000000
## 444   0.8 0.87877551 0.3956044 0.00000000 0.02336401 0.00000000
## 445   0.8 0.89897959 0.3956044 0.00000000 0.02336401 0.00000000
## 446   0.8 0.91918367 0.3956044 0.00000000 0.02336401 0.00000000
## 447   0.8 0.93938776 0.3956044 0.00000000 0.02336401 0.00000000
## 448   0.8 0.95959184 0.3956044 0.00000000 0.02336401 0.00000000
## 449   0.8 0.97979592 0.3956044 0.00000000 0.02336401 0.00000000
## 450   0.8 1.00000000 0.3956044 0.00000000 0.02336401 0.00000000
## 451   0.9 0.01000000 0.9806374 0.97087178 0.03419146 0.05136407
## 452   0.9 0.03020408 0.9736850 0.96019186 0.03658937 0.05525026
## 453   0.9 0.05040816 0.9736850 0.96019186 0.03658937 0.05525026
## 454   0.9 0.07061224 0.9582271 0.93680427 0.05130538 0.07753657
## 455   0.9 0.09081633 0.9524835 0.92803100 0.05653579 0.08557122
## 456   0.9 0.11102041 0.9469597 0.91965812 0.05681023 0.08591895
## 457   0.9 0.13122449 0.9456264 0.91761748 0.05790829 0.08760671
## 458   0.9 0.15142857 0.9440879 0.91535769 0.06151446 0.09284902
## 459   0.9 0.17163265 0.9440879 0.91535769 0.06151446 0.09284902
## 460   0.9 0.19183673 0.9456264 0.91761748 0.05790829 0.08760671
## 461   0.9 0.21204082 0.9498315 0.92391800 0.05708129 0.08640418
## 462   0.9 0.23224490 0.9512601 0.92593139 0.05558854 0.08432933
## 463   0.9 0.25244898 0.9457363 0.91744111 0.06234345 0.09466749
## 464   0.9 0.27265306 0.9361978 0.90235764 0.06721463 0.10274049
## 465   0.9 0.29285714 0.8958974 0.83868733 0.08488958 0.13227722
## 466   0.9 0.31306122 0.7984615 0.68262374 0.09615091 0.15346495
## 467   0.9 0.33326531 0.6951722 0.51215909 0.08942460 0.14716967
## 468   0.9 0.35346939 0.6377289 0.41493271 0.06730682 0.10846315
## 469   0.9 0.37367347 0.5965275 0.34687506 0.06482916 0.10034061
## 470   0.9 0.39387755 0.4927399 0.17154474 0.07504595 0.11264691
## 471   0.9 0.41408163 0.3996996 0.00751290 0.03056679 0.03015533
## 472   0.9 0.43428571 0.3956044 0.00000000 0.02336401 0.00000000
## 473   0.9 0.45448980 0.3956044 0.00000000 0.02336401 0.00000000
## 474   0.9 0.47469388 0.3956044 0.00000000 0.02336401 0.00000000
## 475   0.9 0.49489796 0.3956044 0.00000000 0.02336401 0.00000000
## 476   0.9 0.51510204 0.3956044 0.00000000 0.02336401 0.00000000
## 477   0.9 0.53530612 0.3956044 0.00000000 0.02336401 0.00000000
## 478   0.9 0.55551020 0.3956044 0.00000000 0.02336401 0.00000000
## 479   0.9 0.57571429 0.3956044 0.00000000 0.02336401 0.00000000
## 480   0.9 0.59591837 0.3956044 0.00000000 0.02336401 0.00000000
## 481   0.9 0.61612245 0.3956044 0.00000000 0.02336401 0.00000000
## 482   0.9 0.63632653 0.3956044 0.00000000 0.02336401 0.00000000
## 483   0.9 0.65653061 0.3956044 0.00000000 0.02336401 0.00000000
## 484   0.9 0.67673469 0.3956044 0.00000000 0.02336401 0.00000000
## 485   0.9 0.69693878 0.3956044 0.00000000 0.02336401 0.00000000
## 486   0.9 0.71714286 0.3956044 0.00000000 0.02336401 0.00000000
## 487   0.9 0.73734694 0.3956044 0.00000000 0.02336401 0.00000000
## 488   0.9 0.75755102 0.3956044 0.00000000 0.02336401 0.00000000
## 489   0.9 0.77775510 0.3956044 0.00000000 0.02336401 0.00000000
## 490   0.9 0.79795918 0.3956044 0.00000000 0.02336401 0.00000000
## 491   0.9 0.81816327 0.3956044 0.00000000 0.02336401 0.00000000
## 492   0.9 0.83836735 0.3956044 0.00000000 0.02336401 0.00000000
## 493   0.9 0.85857143 0.3956044 0.00000000 0.02336401 0.00000000
## 494   0.9 0.87877551 0.3956044 0.00000000 0.02336401 0.00000000
## 495   0.9 0.89897959 0.3956044 0.00000000 0.02336401 0.00000000
## 496   0.9 0.91918367 0.3956044 0.00000000 0.02336401 0.00000000
## 497   0.9 0.93938776 0.3956044 0.00000000 0.02336401 0.00000000
## 498   0.9 0.95959184 0.3956044 0.00000000 0.02336401 0.00000000
## 499   0.9 0.97979592 0.3956044 0.00000000 0.02336401 0.00000000
## 500   0.9 1.00000000 0.3956044 0.00000000 0.02336401 0.00000000
## 501   1.0 0.01000000 0.9793040 0.96887178 0.03471682 0.05214764
## 502   1.0 0.03020408 0.9736850 0.96019186 0.03658937 0.05525026
## 503   1.0 0.05040816 0.9694945 0.95388523 0.04255891 0.06433040
## 504   1.0 0.07061224 0.9524835 0.92807182 0.05653579 0.08555707
## 505   1.0 0.09081633 0.9481832 0.92157654 0.05965758 0.09006805
## 506   1.0 0.11102041 0.9440879 0.91535769 0.06151446 0.09284902
## 507   1.0 0.13122449 0.9426593 0.91322028 0.06101689 0.09208774
## 508   1.0 0.15142857 0.9411209 0.91100013 0.06622628 0.09948458
## 509   1.0 0.17163265 0.9425495 0.91313754 0.06671862 0.10023800
## 510   1.0 0.19183673 0.9454212 0.91735769 0.06199721 0.09358555
## 511   1.0 0.21204082 0.9469597 0.91959054 0.06190098 0.09347435
## 512   1.0 0.23224490 0.9401978 0.90895176 0.06515894 0.09896162
## 513   1.0 0.25244898 0.9205495 0.87793576 0.07441856 0.11472384
## 514   1.0 0.27265306 0.8717729 0.80061646 0.08700028 0.13664051
## 515   1.0 0.29285714 0.7590549 0.61829876 0.10162632 0.16419448
## 516   1.0 0.31306122 0.6417289 0.42220667 0.06575282 0.10573805
## 517   1.0 0.33326531 0.6249524 0.39360701 0.06699824 0.10957725
## 518   1.0 0.35346939 0.5303077 0.23630206 0.07293446 0.10772753
## 519   1.0 0.37367347 0.4025568 0.01195734 0.02976744 0.03639984
## 520   1.0 0.39387755 0.3956044 0.00000000 0.02336401 0.00000000
## 521   1.0 0.41408163 0.3956044 0.00000000 0.02336401 0.00000000
## 522   1.0 0.43428571 0.3956044 0.00000000 0.02336401 0.00000000
## 523   1.0 0.45448980 0.3956044 0.00000000 0.02336401 0.00000000
## 524   1.0 0.47469388 0.3956044 0.00000000 0.02336401 0.00000000
## 525   1.0 0.49489796 0.3956044 0.00000000 0.02336401 0.00000000
## 526   1.0 0.51510204 0.3956044 0.00000000 0.02336401 0.00000000
## 527   1.0 0.53530612 0.3956044 0.00000000 0.02336401 0.00000000
## 528   1.0 0.55551020 0.3956044 0.00000000 0.02336401 0.00000000
## 529   1.0 0.57571429 0.3956044 0.00000000 0.02336401 0.00000000
## 530   1.0 0.59591837 0.3956044 0.00000000 0.02336401 0.00000000
## 531   1.0 0.61612245 0.3956044 0.00000000 0.02336401 0.00000000
## 532   1.0 0.63632653 0.3956044 0.00000000 0.02336401 0.00000000
## 533   1.0 0.65653061 0.3956044 0.00000000 0.02336401 0.00000000
## 534   1.0 0.67673469 0.3956044 0.00000000 0.02336401 0.00000000
## 535   1.0 0.69693878 0.3956044 0.00000000 0.02336401 0.00000000
## 536   1.0 0.71714286 0.3956044 0.00000000 0.02336401 0.00000000
## 537   1.0 0.73734694 0.3956044 0.00000000 0.02336401 0.00000000
## 538   1.0 0.75755102 0.3956044 0.00000000 0.02336401 0.00000000
## 539   1.0 0.77775510 0.3956044 0.00000000 0.02336401 0.00000000
## 540   1.0 0.79795918 0.3956044 0.00000000 0.02336401 0.00000000
## 541   1.0 0.81816327 0.3956044 0.00000000 0.02336401 0.00000000
## 542   1.0 0.83836735 0.3956044 0.00000000 0.02336401 0.00000000
## 543   1.0 0.85857143 0.3956044 0.00000000 0.02336401 0.00000000
## 544   1.0 0.87877551 0.3956044 0.00000000 0.02336401 0.00000000
## 545   1.0 0.89897959 0.3956044 0.00000000 0.02336401 0.00000000
## 546   1.0 0.91918367 0.3956044 0.00000000 0.02336401 0.00000000
## 547   1.0 0.93938776 0.3956044 0.00000000 0.02336401 0.00000000
## 548   1.0 0.95959184 0.3956044 0.00000000 0.02336401 0.00000000
## 549   1.0 0.97979592 0.3956044 0.00000000 0.02336401 0.00000000
## 550   1.0 1.00000000 0.3956044 0.00000000 0.02336401 0.00000000
Elst_Model$bestTune
##     alpha     lambda
## 205   0.4 0.09081633
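
As a quick follow-up (not part of the original analysis), the tuned object can be used directly to predict new cases. A minimal sketch, where New_dat is a hypothetical data frame holding the same 15 predictors as Train_dat, and type = "prob" assumes classProbs = TRUE was set in the trainControl used for this fit:

Elst_Pred <- predict(Elst_Model, newdata = New_dat)                 # predicted classes at the selected alpha and lambda
Elst_Prob <- predict(Elst_Model, newdata = New_dat, type = "prob")  # class probabilities for each of the three classes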

Model fitting 2 (K-Nearest Neighbors): k = how many neighbors should we use?

  • K-Nearest Neighbors

    • To understand someone, look at the friends (neighbors) around them.
    • If my neighbors are good kids, I am probably a good kid.
    • If my neighbors are bad kids, I am probably a bad kid.
    • Tuning parameter = k (this is the name caret's knn method uses)

      • k = how many nearby friends (neighbors) do we consult?
    • Consulting neighbors that are too far away can also introduce errors (a short illustration of the voting idea follows below).
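
Before the caret fit below, a minimal illustration of the voting idea itself (not part of the original analysis; it assumes the class package is installed and that Class sits in the first column of Train_dat, as the sigest() call later on also assumes):

library(class)
toy_x <- scale(Train_dat[, -1])           # standardized predictors; kNN works on distances
knn(train = toy_x, test = toy_x[1:5, ],   # classify 5 points by the majority label
    cl = Train_dat$Class, k = 3)          # of their 3 nearest neighbors in toy_x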

controlObject <- trainControl(method = "repeatedcv",
                              repeats = 5,
                              number = 10,
                              classProbs = T)

Knn_Model <- train(Class ~ ., 
                   data = Train_dat,
                   method = "knn",
                   preProc = c("center", "scale"),
                   metric = "Kappa",
                   tuneGrid = data.frame(.k = 2:11), # with a single tuning parameter, a data.frame can be passed directly
                                                     # in kNN, k can range up to the total number of samples - 1 (N - 1)
                   trControl = controlObject)

Knn_Model
## k-Nearest Neighbors 
## 
## 144 samples
##  15 predictor
##   3 classes: 'w1', 'w2', 'w3' 
## 
## Pre-processing: centered (15), scaled (15) 
## Resampling: Cross-Validated (10 fold, repeated 5 times) 
## Summary of sample sizes: 129, 129, 130, 129, 130, 129, ... 
## Resampling results across tuning parameters:
## 
##   k   Accuracy   Kappa    
##    2  0.9525128  0.9283953
##    3  0.9720513  0.9580119
##    4  0.9539267  0.9308298
##    5  0.9544322  0.9314323
##    6  0.9609890  0.9413209
##    7  0.9610842  0.9413481
##    8  0.9584322  0.9372770
##    9  0.9568938  0.9352185
##   10  0.9528938  0.9290811
##   11  0.9512601  0.9266726
## 
## Kappa was used to select the optimal model using the largest value.
## The final value used for the model was k = 3.
Knn_Model$results
##     k  Accuracy     Kappa AccuracySD    KappaSD
## 1   2 0.9525128 0.9283953 0.05585461 0.08365350
## 2   3 0.9720513 0.9580119 0.04280496 0.06409868
## 3   4 0.9539267 0.9308298 0.05238084 0.07816783
## 4   5 0.9544322 0.9314323 0.05752356 0.08608663
## 5   6 0.9609890 0.9413209 0.05162447 0.07745454
## 6   7 0.9610842 0.9413481 0.04862141 0.07312426
## 7   8 0.9584322 0.9372770 0.05032573 0.07583804
## 8   9 0.9568938 0.9352185 0.05425342 0.08119680
## 9  10 0.9528938 0.9290811 0.05689448 0.08533087
## 10 11 0.9512601 0.9266726 0.05666904 0.08497555
Knn_Model$bestTune
##   k
## 2 3

Model fitting 3 (Linear Discriminant Analysis)

  • Rules for building the criterion that separates the groups

    1. Keep the distances among cases within a group as small as possible (similar cases stay together) = within
    2. Keep the distances between groups as large as possible (so the groups look clearly different) = between
    • Tuning parameter = none (a basic LDA analysis has no tuning parameter)
    • With three or more classes, lda2 is used, and lda2 does take a tuning parameter: the number of discriminant dimensions (see the sketch below).
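
A minimal sketch (not part of the original analysis, assuming the MASS package is installed): the discriminant functions can be inspected directly with MASS::lda, which caret's "lda2" method builds on; with 3 classes there are at most 3 - 1 = 2 discriminant dimensions.

library(MASS)
lda_direct <- lda(Class ~ ., data = Train_dat)
lda_direct$scaling  # coefficients of the (at most 2) linear discriminant functions
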
controlObject <- trainControl(method = "repeatedcv",
                              repeats = 5,
                              number = 10,
                              classProbs = T)

lda_Model <- train(Class ~ ., data = Train_dat,
                   method = "lda2", #집단이 3개 이상이기 때문에 LDA2를 사용함
                   preProc = c("center", "scale"),
                   metric = "Kappa",
                   tuneLength = 2, # tuneLength controls the tuning parameter (dimen)
                                   # at most (number of classes - 1) dimensions are possible
                   trControl = controlObject)

lda_Model
## Linear Discriminant Analysis 
## 
## 144 samples
##  15 predictor
##   3 classes: 'w1', 'w2', 'w3' 
## 
## Pre-processing: centered (15), scaled (15) 
## Resampling: Cross-Validated (10 fold, repeated 5 times) 
## Summary of sample sizes: 130, 130, 129, 129, 129, 130, ... 
## Resampling results across tuning parameters:
## 
##   dimen  Accuracy  Kappa    
##   1      0.908674  0.8617268
##   2      0.993033  0.9895351
## 
## Kappa was used to select the optimal model using the largest value.
## The final value used for the model was dimen = 2.
lda_Model$results
##   dimen Accuracy     Kappa AccuracySD    KappaSD
## 1     1 0.908674 0.8617268 0.06607605 0.09943481
## 2     2 0.993033 0.9895351 0.02507721 0.03753725
lda_Model$bestTune
##   dimen
## 2     2
  • Discrimination works well when the data can be separated linearly.

Model fitting 4 (Neural Network)

  • A model that mimics the activity of neurons: a stimulus is passed in from the input nodes, and a node fires once its threshold is exceeded.
  • By combining many such units in different ways, complex outcomes can be predicted.

    • Tuning parameters = number of hidden units, decay
    • number of hidden units = the number of neurons in the hidden layer that sits between the input and the output
    • decay = weight decay; a parameter that weakens (shrinks) the connection weights during learning so they do not grow too large (a single-network sketch follows below).
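
A minimal sketch (not part of the original analysis, assuming the nnet package is installed): a single network with the same size and decay settings; avNNet below simply fits several such networks and averages their predictions, and it also centers and scales the predictors first.

library(nnet)
set.seed(1)
single_net <- nnet(Class ~ ., data = Train_dat,
                   size = 5, decay = 0.001,  # 5 hidden units, mild weight decay
                   maxit = 2000, trace = FALSE)
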
controlObject <- trainControl(method = "repeatedcv",
                              repeats = 5,
                              number = 10,
                              classProbs = T)

nnetGrid <- expand.grid(.decay = c(0.001, 0.01, 0.1),
                        .size = seq(3,11, by = 2),
                        .bag = FALSE) #bag = bootstrap aggregation

nnetModel <- train(Class ~ .,
                   data = Train_dat,
                   method = "avNNet", #Neural Network를 여려번 실행하고 평균을 내는 방법
                   tuneGrid = nnetGrid,
                   preProc = c("center", "scale"),
                   linout = F, # linear output vs. class output; set linout = TRUE only for regression
                   trace = F,
                   maxit = 2000,
                   trControl = controlObject)

nnetModel
## Model Averaged Neural Network 
## 
## 144 samples
##  15 predictor
##   3 classes: 'w1', 'w2', 'w3' 
## 
## Pre-processing: centered (15), scaled (15) 
## Resampling: Cross-Validated (10 fold, repeated 5 times) 
## Summary of sample sizes: 130, 129, 129, 131, 130, 129, ... 
## Resampling results across tuning parameters:
## 
##   decay  size  Accuracy   Kappa    
##   0.001   3    0.9763663  0.9641526
##   0.001   5    0.9777949  0.9663231
##   0.001   7    0.9777949  0.9663231
##   0.001   9    0.9763663  0.9641526
##   0.001  11    0.9763663  0.9641526
##   0.010   3    0.9763663  0.9641526
##   0.010   5    0.9763663  0.9641526
##   0.010   7    0.9763663  0.9641526
##   0.010   9    0.9763663  0.9641526
##   0.010  11    0.9763663  0.9641526
##   0.100   3    0.9750330  0.9621533
##   0.100   5    0.9750330  0.9621533
##   0.100   7    0.9763663  0.9641526
##   0.100   9    0.9750330  0.9621533
##   0.100  11    0.9750330  0.9621533
## 
## Tuning parameter 'bag' was held constant at a value of FALSE
## Accuracy was used to select the optimal model using the largest value.
## The final values used for the model were size = 5, decay = 0.001 and bag
##  = FALSE.
nnetModel$results
##    decay size   bag  Accuracy     Kappa AccuracySD    KappaSD
## 1  0.001    3 FALSE 0.9763663 0.9641526 0.04164321 0.06315057
## 6  0.010    3 FALSE 0.9763663 0.9641526 0.04164321 0.06315057
## 11 0.100    3 FALSE 0.9750330 0.9621533 0.04404705 0.06678904
## 2  0.001    5 FALSE 0.9777949 0.9663231 0.03858248 0.05848820
## 7  0.010    5 FALSE 0.9763663 0.9641526 0.04164321 0.06315057
## 12 0.100    5 FALSE 0.9750330 0.9621533 0.04404705 0.06678904
## 3  0.001    7 FALSE 0.9777949 0.9663231 0.03858248 0.05848820
## 8  0.010    7 FALSE 0.9763663 0.9641526 0.04164321 0.06315057
## 13 0.100    7 FALSE 0.9763663 0.9641526 0.04164321 0.06315057
## 4  0.001    9 FALSE 0.9763663 0.9641526 0.04164321 0.06315057
## 9  0.010    9 FALSE 0.9763663 0.9641526 0.04164321 0.06315057
## 14 0.100    9 FALSE 0.9750330 0.9621533 0.04404705 0.06678904
## 5  0.001   11 FALSE 0.9763663 0.9641526 0.04164321 0.06315057
## 10 0.010   11 FALSE 0.9763663 0.9641526 0.04164321 0.06315057
## 15 0.100   11 FALSE 0.9750330 0.9621533 0.04404705 0.06678904
nnetModel$bestTune
##   size decay   bag
## 2    5 0.001 FALSE

Model fitting 5 (Support Vector Machine)

  • Finds the decision rule that leaves the widest gap between the classes (the maximum margin).
  • Rather than using all of the data, it relies only on the subset of points that matter for fixing the decision rule.
  • That subset of points = the support vectors

    • Tuning parameters = C, kernel parameter
    • C = how heavily errors are penalized
    • C = equivalently, how much error is tolerated around the optimal hyperplane (a small comparison of C values follows below)
    • kernel parameter = used when the data must be projected into another space because a straight line cannot separate them

      • ex) polynomial kernel, Gaussian kernel, sigmoid kernel …
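
A minimal sketch (not part of the original analysis): the effect of C alone, using kernlab's ksvm with a radial (Gaussian) kernel; a larger C penalizes margin violations more heavily, which typically leaves fewer support vectors.

library(kernlab)
svm_soft <- ksvm(Class ~ ., data = Train_dat, kernel = "rbfdot", C = 0.1)  # tolerant of errors
svm_hard <- ksvm(Class ~ ., data = Train_dat, kernel = "rbfdot", C = 10)   # errors penalized heavily
nSV(svm_soft); nSV(svm_hard)  # number of support vectors in each fit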

library(kernlab)
    
controlObject <- trainControl(method = "repeatedcv",
                              repeats = 5,
                              number = 10,
                              classProbs = T)

sigmaRange <- sigest(as.matrix(Train_dat[,-1])) # sigest estimates a sensible range for sigma; it expects a matrix
sigmaRange
##        90%        50%        10% 
## 0.02199392 0.03961734 0.11910789
svmGrid <- expand.grid(.sigma = sigmaRange,
                       .C = 2^(seq(-5,5,1))) # C runs from 2^-5 to 2^5 (11 values), 3 sigma values, so 11 x 3 = 33 candidates

svmModel <- train(Class ~ ., 
                  data = Train_dat,
                  method = "svmRadial",
                  tuneGrid = svmGrid,
                  preProc = c("center", "scale"),
                  trControl = controlObject)

svmModel
## Support Vector Machines with Radial Basis Function Kernel 
## 
## 144 samples
##  15 predictor
##   3 classes: 'w1', 'w2', 'w3' 
## 
## Pre-processing: centered (15), scaled (15) 
## Resampling: Cross-Validated (10 fold, repeated 5 times) 
## Summary of sample sizes: 129, 129, 130, 129, 130, 130, ... 
## Resampling results across tuning parameters:
## 
##   sigma       C         Accuracy   Kappa    
##   0.02199392   0.03125  0.9248791  0.8875777
##   0.02199392   0.06250  0.9262125  0.8895252
##   0.02199392   0.12500  0.9582418  0.9373101
##   0.02199392   0.25000  0.9767619  0.9646494
##   0.02199392   0.50000  0.9810476  0.9710063
##   0.02199392   1.00000  0.9863810  0.9790778
##   0.02199392   2.00000  0.9878095  0.9812731
##   0.02199392   4.00000  0.9782857  0.9668391
##   0.02199392   8.00000  0.9796190  0.9689424
##   0.02199392  16.00000  0.9797143  0.9691528
##   0.02199392  32.00000  0.9782857  0.9669823
##   0.03961734   0.03125  0.9277363  0.8918458
##   0.03961734   0.06250  0.9276410  0.8916785
##   0.03961734   0.12500  0.9739048  0.9603799
##   0.03961734   0.25000  0.9822857  0.9728124
##   0.03961734   0.50000  0.9820952  0.9724588
##   0.03961734   1.00000  0.9837143  0.9749664
##   0.03961734   2.00000  0.9797143  0.9689929
##   0.03961734   4.00000  0.9838095  0.9753067
##   0.03961734   8.00000  0.9811429  0.9713067
##   0.03961734  16.00000  0.9797143  0.9689929
##   0.03961734  32.00000  0.9810476  0.9709929
##   0.11910789   0.03125  0.9383077  0.9080083
##   0.11910789   0.06250  0.9426081  0.9142504
##   0.11910789   0.12500  0.9693187  0.9536642
##   0.11910789   0.25000  0.9739048  0.9597410
##   0.11910789   0.50000  0.9781905  0.9665844
##   0.11910789   1.00000  0.9821905  0.9726790
##   0.11910789   2.00000  0.9808571  0.9706242
##   0.11910789   4.00000  0.9822857  0.9727948
##   0.11910789   8.00000  0.9821905  0.9727122
##   0.11910789  16.00000  0.9821905  0.9726790
##   0.11910789  32.00000  0.9808571  0.9706242
## 
## Accuracy was used to select the optimal model using the largest value.
## The final values used for the model were sigma = 0.02199392 and C = 2.
svmModel$results
##         sigma        C  Accuracy     Kappa AccuracySD    KappaSD
## 1  0.02199392  0.03125 0.9248791 0.8875777 0.06112341 0.09067715
## 2  0.02199392  0.06250 0.9262125 0.8895252 0.05903518 0.08764781
## 3  0.02199392  0.12500 0.9582418 0.9373101 0.04478321 0.06699974
## 4  0.02199392  0.25000 0.9767619 0.9646494 0.04270722 0.06496163
## 5  0.02199392  0.50000 0.9810476 0.9710063 0.03891834 0.05981994
## 6  0.02199392  1.00000 0.9863810 0.9790778 0.03645662 0.05619784
## 7  0.02199392  2.00000 0.9878095 0.9812731 0.03553287 0.05478812
## 8  0.02199392  4.00000 0.9782857 0.9668391 0.03993562 0.06129960
## 9  0.02199392  8.00000 0.9796190 0.9689424 0.03423114 0.05221619
## 10 0.02199392 16.00000 0.9797143 0.9691528 0.03665663 0.05566125
## 11 0.02199392 32.00000 0.9782857 0.9669823 0.03723718 0.05654285
## 12 0.03961734  0.03125 0.9277363 0.8918458 0.05661319 0.08397571
## 13 0.03961734  0.06250 0.9276410 0.8916785 0.05684417 0.08435362
## 14 0.03961734  0.12500 0.9739048 0.9603799 0.03666206 0.05577122
## 15 0.03961734  0.25000 0.9822857 0.9728124 0.03608701 0.05583038
## 16 0.03961734  0.50000 0.9820952 0.9724588 0.03377549 0.05242683
## 17 0.03961734  1.00000 0.9837143 0.9749664 0.03532296 0.05471693
## 18 0.03961734  2.00000 0.9797143 0.9689929 0.03939486 0.06051432
## 19 0.03961734  4.00000 0.9838095 0.9753067 0.03479377 0.05293475
## 20 0.03961734  8.00000 0.9811429 0.9713067 0.03600896 0.05470235
## 21 0.03961734 16.00000 0.9797143 0.9689929 0.03939486 0.06051432
## 22 0.03961734 32.00000 0.9810476 0.9709929 0.03891834 0.05983627
## 23 0.11910789  0.03125 0.9383077 0.9080083 0.06925756 0.10242085
## 24 0.11910789  0.06250 0.9426081 0.9142504 0.06380271 0.09483704
## 25 0.11910789  0.12500 0.9693187 0.9536642 0.04507944 0.06811710
## 26 0.11910789  0.25000 0.9739048 0.9597410 0.03666206 0.05698000
## 27 0.11910789  0.50000 0.9781905 0.9665844 0.03484813 0.05356513
## 28 0.11910789  1.00000 0.9821905 0.9726790 0.03362788 0.05208088
## 29 0.11910789  2.00000 0.9808571 0.9706242 0.03422370 0.05299960
## 30 0.11910789  4.00000 0.9822857 0.9727948 0.03347934 0.05190284
## 31 0.11910789  8.00000 0.9821905 0.9727122 0.03362788 0.05202865
## 32 0.11910789 16.00000 0.9821905 0.9726790 0.03362788 0.05208088
## 33 0.11910789 32.00000 0.9808571 0.9706242 0.03422370 0.05299960
svmModel$bestTune
##        sigma C
## 7 0.02199392 2

Model fitting 6 (Classification Tree)

  • One of the most widely used models because it is simple, fast, and easy to interpret.
  • It repeatedly splits the data into two regions based on the value of a single variable, searching for the best classification rule.
  • The resulting set of rules looks like a tree (the model branches out like one).
  • Its ease of interpretation is a major reason for its popularity.

    • Tuning parameter = cp
    • cp = complexity parameter; decides how bushy a tree is allowed to grow
    • The more complex the tree, the better it explains the training data, but the more errors it makes when predicting new data -> controlled through cp (see the rpart sketch below)
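
A minimal sketch (not part of the original analysis, assuming the rpart package is installed): fitting one tree directly with rpart and printing its cp table, which shows how the cross-validated error changes as the tree is pruned back with larger cp values.

library(rpart)
set.seed(1)
one_tree <- rpart(Class ~ ., data = Train_dat, method = "class", cp = 0.001)
printcp(one_tree)  # larger cp -> fewer splits (a smaller, more heavily pruned tree)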

controlObject <- trainControl(method = "repeatedcv",
                              repeats = 5,
                              number = 10,
                              classProbs = T)

rpartModel <- train(Class ~ ., 
                    data = Train_dat,
                    method = "rpart", #cp만 조정하면 됨
                    tuneLength = 30,
                    trControl = controlObject)

rpartModel
## CART 
## 
## 144 samples
##  15 predictor
##   3 classes: 'w1', 'w2', 'w3' 
## 
## No pre-processing
## Resampling: Cross-Validated (10 fold, repeated 5 times) 
## Summary of sample sizes: 131, 130, 131, 129, 129, 130, ... 
## Resampling results across tuning parameters:
## 
##   cp          Accuracy   Kappa    
##   0.00000000  0.8701502  0.8010349
##   0.01545779  0.8648168  0.7933550
##   0.03091558  0.8581502  0.7830085
##   0.04637337  0.8481209  0.7683358
##   0.06183115  0.8495495  0.7704897
##   0.07728894  0.8495495  0.7704897
##   0.09274673  0.8495495  0.7704897
##   0.10820452  0.8495495  0.7704897
##   0.12366231  0.8495495  0.7704897
##   0.13912010  0.8495495  0.7704897
##   0.15457788  0.8495495  0.7704897
##   0.17003567  0.8495495  0.7704897
##   0.18549346  0.8495495  0.7704897
##   0.20095125  0.8495495  0.7704897
##   0.21640904  0.8495495  0.7704897
##   0.23186683  0.8495495  0.7704897
##   0.24732461  0.8495495  0.7704897
##   0.26278240  0.8495495  0.7704897
##   0.27824019  0.8495495  0.7704897
##   0.29369798  0.8495495  0.7704897
##   0.30915577  0.8495495  0.7704897
##   0.32461356  0.8495495  0.7704897
##   0.34007134  0.8285018  0.7360282
##   0.35552913  0.8022637  0.6931900
##   0.37098692  0.7938828  0.6794943
##   0.38644471  0.7938828  0.6794943
##   0.40190250  0.7938828  0.6794943
##   0.41736029  0.7938828  0.6794943
##   0.43281807  0.7834432  0.6638090
##   0.44827586  0.6771502  0.4918981
## 
## Accuracy was used to select the optimal model using the largest value.
## The final value used for the model was cp = 0.
rpartModel$results
##            cp  Accuracy     Kappa AccuracySD   KappaSD
## 1  0.00000000 0.8701502 0.8010349 0.08160548 0.1256790
## 2  0.01545779 0.8648168 0.7933550 0.08182412 0.1254311
## 3  0.03091558 0.8581502 0.7830085 0.08304683 0.1276059
## 4  0.04637337 0.8481209 0.7683358 0.08066412 0.1236229
## 5  0.06183115 0.8495495 0.7704897 0.08145578 0.1248445
## 6  0.07728894 0.8495495 0.7704897 0.08145578 0.1248445
## 7  0.09274673 0.8495495 0.7704897 0.08145578 0.1248445
## 8  0.10820452 0.8495495 0.7704897 0.08145578 0.1248445
## 9  0.12366231 0.8495495 0.7704897 0.08145578 0.1248445
## 10 0.13912010 0.8495495 0.7704897 0.08145578 0.1248445
## 11 0.15457788 0.8495495 0.7704897 0.08145578 0.1248445
## 12 0.17003567 0.8495495 0.7704897 0.08145578 0.1248445
## 13 0.18549346 0.8495495 0.7704897 0.08145578 0.1248445
## 14 0.20095125 0.8495495 0.7704897 0.08145578 0.1248445
## 15 0.21640904 0.8495495 0.7704897 0.08145578 0.1248445
## 16 0.23186683 0.8495495 0.7704897 0.08145578 0.1248445
## 17 0.24732461 0.8495495 0.7704897 0.08145578 0.1248445
## 18 0.26278240 0.8495495 0.7704897 0.08145578 0.1248445
## 19 0.27824019 0.8495495 0.7704897 0.08145578 0.1248445
## 20 0.29369798 0.8495495 0.7704897 0.08145578 0.1248445
## 21 0.30915577 0.8495495 0.7704897 0.08145578 0.1248445
## 22 0.32461356 0.8495495 0.7704897 0.08145578 0.1248445
## 23 0.34007134 0.8285018 0.7360282 0.09957340 0.1568791
## 24 0.35552913 0.8022637 0.6931900 0.11407642 0.1815706
## 25 0.37098692 0.7938828 0.6794943 0.11801007 0.1879561
## 26 0.38644471 0.7938828 0.6794943 0.11801007 0.1879561
## 27 0.40190250 0.7938828 0.6794943 0.11801007 0.1879561
## 28 0.41736029 0.7938828 0.6794943 0.11801007 0.1879561
## 29 0.43281807 0.7834432 0.6638090 0.12932181 0.2033859
## 30 0.44827586 0.6771502 0.4918981 0.15962462 0.2585976
rpartModel$bestTune
##   cp
## 1  0
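The cp profile and the final tree can also be inspected visually. A minimal sketch, assuming the rpart.plot package is installed (plot() on a caret train object draws the resampling profile, and rpartModel$finalModel holds the underlying rpart fit):

library(rpart.plot)  # assumption: rpart.plot is installed

# resampled Accuracy as a function of the cp tuning parameter
plot(rpartModel)

# draw the final classification tree chosen by caret (cp = 0 here)
rpart.plot(rpartModel$finalModel)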

Model Fitting 7 (Random Forest)

  • A single tree is very sensitive to the values of the variables.
  • Even a small change in a variable's value, or the addition of a single observation, can change the decision rules.
  • Random Forest builds many sub-trees, each on a subset of the data, and averages (votes over) their results to make the final decision -> much more stable results.

    • Tuning parameter = mtry
    • mtry = the number of predictors randomly considered as split candidates when building the sub-trees.
controlObject <- trainControl(method = "repeatedcv",
                              repeats = 5,
                              number = 10,
                              classProbs = T)

rfGrid <- expand.grid(.mtry = c(4,6,8,10)) # fit models that consider 4, 6, 8, or 10 predictors at each split

rfModel <- train(Class ~.,
                 data = Train_dat,
                 method = "rf",
                 tuneGrid = rfGrid,
                 metric = "Kappa",
                 ntree = 1500, # number of trees to grow
                 importance = T, # compute variable importance (used by varImp() below)
                 trControl = controlObject)

rfModel
## Random Forest 
## 
## 144 samples
##  15 predictor
##   3 classes: 'w1', 'w2', 'w3' 
## 
## No pre-processing
## Resampling: Cross-Validated (10 fold, repeated 5 times) 
## Summary of sample sizes: 130, 130, 129, 131, 129, 129, ... 
## Resampling results across tuning parameters:
## 
##   mtry  Accuracy   Kappa    
##    4    0.9750330  0.9624079
##    6    0.9722711  0.9582377
##    8    0.9695092  0.9540455
##   10    0.9654139  0.9477513
## 
## Kappa was used to select the optimal model using the largest value.
## The final value used for the model was mtry = 4.
rfModel$results
##   mtry  Accuracy     Kappa AccuracySD    KappaSD
## 1    4 0.9750330 0.9624079 0.03903395 0.05867040
## 2    6 0.9722711 0.9582377 0.04216884 0.06330115
## 3    8 0.9695092 0.9540455 0.04878541 0.07342329
## 4   10 0.9654139 0.9477513 0.05250155 0.07957595
rfModel$bestTune
##   mtry
## 1    4
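Because importance = T was passed when fitting the random forest, the relative importance of each predictor can be extracted. A minimal sketch using caret's varImp():

rfImp <- varImp(rfModel, scale = TRUE) # permutation-based variable importance from the fitted forest
rfImp
plot(rfImp, top = 10) # plot the 10 most important predictors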

Model Fitting 8 (Naive Bayes)

  • A method based on Bayes' rule (conditional probability).

    • It estimates the probability that Y = Class given the predictors X.
    • Computing this probability while accounting for dependence among the X variables is difficult.

    • Naive Bayes therefore assumes the X variables are mutually independent and computes the probability under that assumption.
    • Because the dependence among the X variables is ignored, the computation is much simpler.
controlObject <- trainControl(method = "repeatedcv",
                              repeats = 5,
                              number = 10,
                              classProbs = T)

nbtune <- expand.grid(.fL = 1:5,        # fL = Laplace correction: when an estimated probability is 0 or near 0 the estimate is unstable, so a small count is added
                      .usekernel = T,   # usekernel = use a kernel density estimate for the predictors to improve performance
                      .adjust = 0:5)    # adjust = bandwidth adjustment for the kernel density estimate

nbModel <- train(Class ~.,
                 data = Train_dat,
                 method = "nb",
                 tuneGrid = nbtune,
                 trControl = controlObject)

nbModel
## Naive Bayes 
## 
## 144 samples
##  15 predictor
##   3 classes: 'w1', 'w2', 'w3' 
## 
## No pre-processing
## Resampling: Cross-Validated (10 fold, repeated 5 times) 
## Summary of sample sizes: 130, 130, 130, 129, 131, 129, ... 
## Resampling results across tuning parameters:
## 
##   fL  adjust  Accuracy   Kappa    
##   1   0             NaN        NaN
##   1   1       0.9724615  0.9583648
##   1   2       0.9724615  0.9585568
##   1   3       0.9611136  0.9414534
##   1   4       0.9320513  0.8977184
##   1   5       0.9057509  0.8587426
##   2   0             NaN        NaN
##   2   1       0.9724615  0.9583648
##   2   2       0.9724615  0.9585568
##   2   3       0.9611136  0.9414534
##   2   4       0.9320513  0.8977184
##   2   5       0.9057509  0.8587426
##   3   0             NaN        NaN
##   3   1       0.9724615  0.9583648
##   3   2       0.9724615  0.9585568
##   3   3       0.9611136  0.9414534
##   3   4       0.9320513  0.8977184
##   3   5       0.9057509  0.8587426
##   4   0             NaN        NaN
##   4   1       0.9724615  0.9583648
##   4   2       0.9724615  0.9585568
##   4   3       0.9611136  0.9414534
##   4   4       0.9320513  0.8977184
##   4   5       0.9057509  0.8587426
##   5   0             NaN        NaN
##   5   1       0.9724615  0.9583648
##   5   2       0.9724615  0.9585568
##   5   3       0.9611136  0.9414534
##   5   4       0.9320513  0.8977184
##   5   5       0.9057509  0.8587426
## 
## Tuning parameter 'usekernel' was held constant at a value of TRUE
## Accuracy was used to select the optimal model using the largest value.
## The final values used for the model were fL = 1, usekernel = TRUE
##  and adjust = 1.
nbModel$results
##    fL usekernel adjust  Accuracy     Kappa AccuracySD    KappaSD
## 1   1      TRUE      0       NaN       NaN         NA         NA
## 2   1      TRUE      1 0.9724615 0.9583648 0.04165659 0.06311470
## 3   1      TRUE      2 0.9724615 0.9585568 0.03412057 0.05135909
## 4   1      TRUE      3 0.9611136 0.9414534 0.04261127 0.06407348
## 5   1      TRUE      4 0.9320513 0.8977184 0.05538807 0.08310758
## 6   1      TRUE      5 0.9057509 0.8587426 0.06667366 0.09908958
## 7   2      TRUE      0       NaN       NaN         NA         NA
## 8   2      TRUE      1 0.9724615 0.9583648 0.04165659 0.06311470
## 9   2      TRUE      2 0.9724615 0.9585568 0.03412057 0.05135909
## 10  2      TRUE      3 0.9611136 0.9414534 0.04261127 0.06407348
## 11  2      TRUE      4 0.9320513 0.8977184 0.05538807 0.08310758
## 12  2      TRUE      5 0.9057509 0.8587426 0.06667366 0.09908958
## 13  3      TRUE      0       NaN       NaN         NA         NA
## 14  3      TRUE      1 0.9724615 0.9583648 0.04165659 0.06311470
## 15  3      TRUE      2 0.9724615 0.9585568 0.03412057 0.05135909
## 16  3      TRUE      3 0.9611136 0.9414534 0.04261127 0.06407348
## 17  3      TRUE      4 0.9320513 0.8977184 0.05538807 0.08310758
## 18  3      TRUE      5 0.9057509 0.8587426 0.06667366 0.09908958
## 19  4      TRUE      0       NaN       NaN         NA         NA
## 20  4      TRUE      1 0.9724615 0.9583648 0.04165659 0.06311470
## 21  4      TRUE      2 0.9724615 0.9585568 0.03412057 0.05135909
## 22  4      TRUE      3 0.9611136 0.9414534 0.04261127 0.06407348
## 23  4      TRUE      4 0.9320513 0.8977184 0.05538807 0.08310758
## 24  4      TRUE      5 0.9057509 0.8587426 0.06667366 0.09908958
## 25  5      TRUE      0       NaN       NaN         NA         NA
## 26  5      TRUE      1 0.9724615 0.9583648 0.04165659 0.06311470
## 27  5      TRUE      2 0.9724615 0.9585568 0.03412057 0.05135909
## 28  5      TRUE      3 0.9611136 0.9414534 0.04261127 0.06407348
## 29  5      TRUE      4 0.9320513 0.8977184 0.05538807 0.08310758
## 30  5      TRUE      5 0.9057509 0.8587426 0.06667366 0.09908958
nbModel$bestTune
##   fL usekernel adjust
## 2  1      TRUE      1
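Because classProbs = T was set in trainControl, the fitted models can also return posterior class probabilities rather than only hard class labels. A minimal sketch (Test_dat is the held-out test set used in the final prediction below):

nbProbs <- predict(nbModel, newdata = Test_dat, type = "prob") # posterior probability of each class
head(nbProbs)

nbClass <- predict(nbModel, newdata = Test_dat) # hard predictions: the class with the largest posterior
head(nbClass)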

Model Selection

  1. Within-model selection = choose among fits of the same model according to the tuning parameter.
  2. Between-model selection = choose the most appropriate model among the different types of models that were fitted.

Within-Model Selection

whichonepct <- best(rfModel$results, metric = "Kappa", maximize = T) # index of the tuning-parameter row with the largest Kappa
rfModel$results[whichonepct, ]
##   mtry Accuracy     Kappa AccuracySD   KappaSD
## 1    4 0.975033 0.9624079 0.03903395 0.0586704
whichTwoPct <- tolerance(rfModel$results, metric = "Kappa", tol = 2, maximize = TRUE) # index of the simplest model whose Kappa is within 2% of the best
rfModel$results[whichTwoPct, ]
##   mtry Accuracy     Kappa AccuracySD   KappaSD
## 1    4 0.975033 0.9624079 0.03903395 0.0586704

Between-Model Selection

# comparing models this way is only valid when they were all trained with the same trainControl (the same resampling scheme)
allResamples <- resamples(list("Elastic Net" = Elst_Model,
                               "K-NN" = Knn_Model,
                               "lda" = lda_Model,
                               "svm" = svmModel,
                               "Neural network" = nnetModel,
                               "tree" = rpartModel,
                               "Random Forest" = rfModel,
                               "naive bayes" = nbModel))


summary(allResamples)
## 
## Call:
## summary.resamples(object = allResamples)
## 
## Models: Elastic Net, K-NN, lda, svm, Neural network, tree, Random Forest, naive bayes 
## Number of resamples: 50 
## 
## Accuracy 
##                     Min.   1st Qu.    Median      Mean   3rd Qu. Max. NA's
## Elastic Net    0.8666667 1.0000000 1.0000000 0.9863663 1.0000000    1    0
## K-NN           0.8461538 0.9333333 1.0000000 0.9720513 1.0000000    1    0
## lda            0.8666667 1.0000000 1.0000000 0.9930330 1.0000000    1    0
## svm            0.8571429 1.0000000 1.0000000 0.9878095 1.0000000    1    0
## Neural network 0.8571429 0.9333333 1.0000000 0.9777949 1.0000000    1    0
## tree           0.6153846 0.8000000 0.8666667 0.8701502 0.9321429    1    0
## Random Forest  0.8571429 0.9333333 1.0000000 0.9750330 1.0000000    1    0
## naive bayes    0.8571429 0.9333333 1.0000000 0.9724615 1.0000000    1    0
## 
## Kappa 
##                     Min.   1st Qu.   Median      Mean   3rd Qu. Max. NA's
## Elastic Net    0.8013245 1.0000000 1.000000 0.9794806 1.0000000    1    0
## K-NN           0.7719298 0.8994966 1.000000 0.9580119 1.0000000    1    0
## lda            0.8026316 1.0000000 1.000000 0.9895351 1.0000000    1    0
## svm            0.7704918 1.0000000 1.000000 0.9812731 1.0000000    1    0
## Neural network 0.7812500 0.8994966 1.000000 0.9663231 1.0000000    1    0
## tree           0.4144144 0.6964561 0.793812 0.8010349 0.8962276    1    0
## Random Forest  0.7846154 0.9000000 1.000000 0.9624079 1.0000000    1    0
## naive bayes    0.7812500 0.9000000 1.000000 0.9583648 1.0000000    1    0
# draw plots to identify the best model before running the final prediction
parallelplot(allResamples, metric = "Kappa") # shows each resample's Kappa across the models

trellis.par.set(caretTheme())
bwplot(allResamples, layout = c(2, 1))

dotplot(allResamples, layout = c(2, 1))
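Beyond visual comparison, the resampled performance of the models can be compared more formally. A minimal sketch using caret's diff() method for resamples objects, which computes paired differences between every pair of models:

modelDiffs <- diff(allResamples, metric = "Kappa") # paired differences in resampled Kappa
summary(modelDiffs) # mean differences and p-values for each pair of models
dotplot(modelDiffs) # visualize the distribution of the differences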

#final prediction
options(scipen=999)

predicted <- predict(lda_Model, Test_dat) # predict the test data with lda_Model, the model chosen above
confusionMatrix(predicted, Test_dat[,1])
## Confusion Matrix and Statistics
## 
##           Reference
## Prediction w1 w2 w3
##         w1 11  0  0
##         w2  0 13  0
##         w3  0  1  9
## 
## Overall Statistics
##                                           
##                Accuracy : 0.9706          
##                  95% CI : (0.8467, 0.9993)
##     No Information Rate : 0.4118          
##     P-Value [Acc > NIR] : 0.00000000000392
##                                           
##                   Kappa : 0.9554          
##                                           
##  Mcnemar's Test P-Value : NA              
## 
## Statistics by Class:
## 
##                      Class: w1 Class: w2 Class: w3
## Sensitivity             1.0000    0.9286    1.0000
## Specificity             1.0000    1.0000    0.9600
## Pos Pred Value          1.0000    1.0000    0.9000
## Neg Pred Value          1.0000    0.9524    1.0000
## Prevalence              0.3235    0.4118    0.2647
## Detection Rate          0.3235    0.3824    0.2647
## Detection Prevalence    0.3235    0.3824    0.2941
## Balanced Accuracy       1.0000    0.9643    0.9800

Bias-Variance Trade-off

  • The error that arises when predicting after training is made up of three components.

    • Error(x) = noise(x) + bias(x)² + variance(x)

      • bias = the tendency to consistently learn the wrong thing because the model does not use all of the information in the data; this corresponds to underfitting.
      • variance = the tendency to learn random patterns unrelated to the true signal, because a highly flexible model is fit to the errors and noise in the data; this corresponds to overfitting.
    • bias and variance behave like a seesaw: as one goes down, the other goes up -> a trade-off relationship.

    • If the center of a target represents the true value, the dots around it represent the predicted values.
    • High bias means the predictions fall far from the true value.
    • High variance means the predictions are widely scattered.

    • The more the model is trained, the greater the model complexity (the x-axis of the complexity-vs-error plot).
    • The more the model is trained (the higher the complexity), the lower the bias but the higher the variance; see the simulation sketch after this list.
    • The less the model is trained (the lower the complexity), the higher the bias and the lower the variance.

      • That is, low bias & high variance = a complex model -> overfitting.

        • High variance means the model has learned even the noise (the model is too complex), so the predicted Y values swing widely with small changes in the input.
      • That is, high bias & low variance = a simple model -> underfitting.

        • High bias means the model does not take enough of the data into account (the model is too simple) and therefore cannot predict accurately.

    • The total error is the sum Variance + Bias² + Noise.
    • In the figure, curve 1 = variance, curve 2 = bias, curve 3 = noise.
    • variance & bias are the components that can be reduced through learning.
    • noise is an irreducible error that represents the fundamental limit.

    • It is best to stop training at the point where the error on the test sample starts to increase.
    • Here the test sample should be understood as a validation set.
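
A small simulation can make the trade-off concrete. The sketch below uses only base R and all object names are illustrative (not from these notes): it fits polynomials of increasing degree to noisy data, and training error keeps falling as complexity grows while test error eventually rises again, which is exactly the point where training should stop.

set.seed(123)

# simulate a nonlinear relationship with noise
n     <- 100
x     <- runif(n, 0, 3)
y     <- sin(2 * x) + rnorm(n, sd = 0.3)
train <- sample(n, 70)                      # 70 observations for training, 30 held out

mse <- function(obs, pred) mean((obs - pred)^2)

degrees  <- 1:12                            # model complexity = polynomial degree
trainErr <- numeric(length(degrees))
testErr  <- numeric(length(degrees))

for (d in degrees) {
  fit         <- lm(y ~ poly(x, d), data = data.frame(x = x[train], y = y[train]))
  trainErr[d] <- mse(y[train], predict(fit))
  testErr[d]  <- mse(y[-train], predict(fit, newdata = data.frame(x = x[-train])))
}

# training error decreases with complexity; test error is U-shaped (bias-variance trade-off)
plot(degrees, testErr, type = "b", col = "red", ylim = range(c(trainErr, testErr)),
     xlab = "model complexity (polynomial degree)", ylab = "MSE")
lines(degrees, trainErr, type = "b", col = "blue")
legend("topright", legend = c("test error", "training error"), col = c("red", "blue"), lty = 1)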