library(tidyverse)
library(openintro)
library(ISLR2)
library(MASS)
library(class)
library(e1071)
Equation 4.32 derived an expression for \(\log\left(\frac{Pr(Y=k|X=x)}{Pr(Y=K|X=x)}\right)\) in the setting where \(p > 1\), so that the mean for the \(k\)th class, \(\mu_k\), is a \(p\)-dimensional vector, and the shared covariance \(\Sigma\) is a \(p \times p\) matrix. However, in the setting with \(p = 1\), (4.32) takes a simpler form, since the means \(\mu_1,\dots,\mu_K\) and the variance \(\sigma^2\) are scalars. In this simpler setting, repeat the calculation in (4.32), and provide expressions for \(a_k\) and \(b_{kj}\) in terms of \(\pi_k\), \(\pi_K\), \(\mu_k\), \(\mu_K\), and \(\sigma^2\).
In the case of p = 1, the normal density for class k is \(f_{k}(x)=\frac{1}{\sqrt{2\pi}\sigma}\exp\left(-\frac{(x-\mu_{k})^{2}}{2\sigma^{2}}\right)\), so
\(\log\left(\frac{Pr(Y=k|X=x)}{Pr(Y=K|X=x)}\right)=\log\left(\frac{\pi_{k}\exp\left(-\frac{(x-\mu_{k})^{2}}{2\sigma^{2}}\right)}{\pi_{K}\exp\left(-\frac{(x-\mu_{K})^{2}}{2\sigma^{2}}\right)}\right)\)
since the \(\frac{1}{\sqrt{2\pi}\sigma}\) factors cancel. Expanding the squares and cancelling the \(x^{2}\) terms, this simplifies to \(\log\left(\frac{\pi_{k}}{\pi_{K}}\right)-\frac{\mu_{k}^{2}-\mu_{K}^{2}}{2\sigma^{2}}+x\cdot\frac{\mu_{k}-\mu_{K}}{\sigma^{2}}\)
therefore,
\(a_{k}=\log\left(\frac{\pi_{k}}{\pi_{K}}\right)-\frac{\mu_{k}^{2}-\mu_{K}^{2}}{2\sigma^{2}}\)
\(b_{k}=\frac{\mu_{k}-\mu_{K}}{\sigma^{2}}\)
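As a quick numerical sanity check of this algebra (a minimal sketch with made-up parameter values, not part of the exercise itself):

# Hypothetical values for the priors, means, and standard deviation
pi_k <- 0.3; pi_K <- 0.7
mu_k <- 1.5; mu_K <- -0.5
sigma <- 2
x <- seq(-3, 3, by = 0.5)
# Log posterior odds computed directly from the normal densities
direct <- log(pi_k * dnorm(x, mu_k, sigma)) - log(pi_K * dnorm(x, mu_K, sigma))
# The linear form a_k + b_k * x derived above
a_k <- log(pi_k / pi_K) - (mu_k^2 - mu_K^2) / (2 * sigma^2)
b_k <- (mu_k - mu_K) / sigma^2
all.equal(direct, a_k + b_k * x)  # should return TRUE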
Work out the detailed forms of \(a_k\), \(b_{kj}\), and \(c_{kjl}\) in (4.33). Your answer should involve \(\pi_k\), \(\pi_K\), \(\mu_k\), \(\mu_K\), \(\Sigma_k\), and \(\Sigma_K\).
The equation in 4.33 is
\(log(\frac{Pr(Y = k|X = x)}{Pr(Y = K|X = x)})=a_{k}+\sum_{j=1}^{p}b_{kj}x_{j}+\sum_{j=1}^{p}\sum_{l=1}^{p}c_{kjl}x_{j}x_{l}\)
so, expanding the quadratic forms \(-\frac{1}{2}(x-\mu_{k})^{T}\Sigma_{k}^{-1}(x-\mu_{k})+\frac{1}{2}(x-\mu_{K})^{T}\Sigma_{K}^{-1}(x-\mu_{K})\) and matching the constant, linear, and quadratic terms,
\(a_{k}=\log\left(\frac{\pi_{k}}{\pi_{K}}\right)-\frac{1}{2}\log\left(\frac{|\Sigma_{k}|}{|\Sigma_{K}|}\right)-\frac{1}{2}\left(\mu_{k}^{T}\Sigma_{k}^{-1}\mu_{k}-\mu_{K}^{T}\Sigma_{K}^{-1}\mu_{K}\right)\)
\(b_{kj}=\left(\Sigma_{k}^{-1}\mu_{k}-\Sigma_{K}^{-1}\mu_{K}\right)_{j}\)
\(c_{kjl}=-\frac{1}{2}\left(\Sigma_{k}^{-1}-\Sigma_{K}^{-1}\right)_{jl}\)
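A similar numerical check for the quadratic case with p = 2 (a sketch; it assumes the mvtnorm package is installed, since it is not loaded above, and the parameter values are made up):

library(mvtnorm)
pi_k <- 0.4; pi_K <- 0.6
mu_k <- c(1, -1); mu_K <- c(0, 2)
Sigma_k <- matrix(c(2, 0.3, 0.3, 1), 2, 2)
Sigma_K <- matrix(c(1, -0.2, -0.2, 1.5), 2, 2)
x <- c(0.5, 1.2)
# Log posterior odds directly from the multivariate normal densities
direct <- log(pi_k * dmvnorm(x, mu_k, Sigma_k)) - log(pi_K * dmvnorm(x, mu_K, Sigma_K))
# The terms a_k, b_kj, and c_kjl derived above
Sk_inv <- solve(Sigma_k); SK_inv <- solve(Sigma_K)
a_k <- log(pi_k / pi_K) - 0.5 * log(det(Sigma_k) / det(Sigma_K)) -
  0.5 * (t(mu_k) %*% Sk_inv %*% mu_k - t(mu_K) %*% SK_inv %*% mu_K)
b_k <- Sk_inv %*% mu_k - SK_inv %*% mu_K
C_k <- -0.5 * (Sk_inv - SK_inv)
all.equal(as.numeric(direct),
          as.numeric(a_k + t(b_k) %*% x + t(x) %*% C_k %*% x))  # should return TRUE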
This question should be answered using the Weekly data set, which is part of the ISLR2 package. This data is similar in nature to the Smarket data from this chapter’s lab, except that it contains 1,089 weekly returns for 21 years, from the beginning of 1990 to the end of 2010.
(a) Produce some numerical and graphical summaries of the Weekly data. Do there appear to be any patterns?
summary(Weekly)
## Year Lag1 Lag2 Lag3
## Min. :1990 Min. :-18.1950 Min. :-18.1950 Min. :-18.1950
## 1st Qu.:1995 1st Qu.: -1.1540 1st Qu.: -1.1540 1st Qu.: -1.1580
## Median :2000 Median : 0.2410 Median : 0.2410 Median : 0.2410
## Mean :2000 Mean : 0.1506 Mean : 0.1511 Mean : 0.1472
## 3rd Qu.:2005 3rd Qu.: 1.4050 3rd Qu.: 1.4090 3rd Qu.: 1.4090
## Max. :2010 Max. : 12.0260 Max. : 12.0260 Max. : 12.0260
## Lag4 Lag5 Volume Today
## Min. :-18.1950 Min. :-18.1950 Min. :0.08747 Min. :-18.1950
## 1st Qu.: -1.1580 1st Qu.: -1.1660 1st Qu.:0.33202 1st Qu.: -1.1540
## Median : 0.2380 Median : 0.2340 Median :1.00268 Median : 0.2410
## Mean : 0.1458 Mean : 0.1399 Mean :1.57462 Mean : 0.1499
## 3rd Qu.: 1.4090 3rd Qu.: 1.4050 3rd Qu.:2.05373 3rd Qu.: 1.4050
## Max. : 12.0260 Max. : 12.0260 Max. :9.32821 Max. : 12.0260
## Direction
## Down:484
## Up :605
##
##
##
##
pairs(Weekly)
cor(Weekly[, -9])
## Year Lag1 Lag2 Lag3 Lag4
## Year 1.00000000 -0.032289274 -0.03339001 -0.03000649 -0.031127923
## Lag1 -0.03228927 1.000000000 -0.07485305 0.05863568 -0.071273876
## Lag2 -0.03339001 -0.074853051 1.00000000 -0.07572091 0.058381535
## Lag3 -0.03000649 0.058635682 -0.07572091 1.00000000 -0.075395865
## Lag4 -0.03112792 -0.071273876 0.05838153 -0.07539587 1.000000000
## Lag5 -0.03051910 -0.008183096 -0.07249948 0.06065717 -0.075675027
## Volume 0.84194162 -0.064951313 -0.08551314 -0.06928771 -0.061074617
## Today -0.03245989 -0.075031842 0.05916672 -0.07124364 -0.007825873
## Lag5 Volume Today
## Year -0.030519101 0.84194162 -0.032459894
## Lag1 -0.008183096 -0.06495131 -0.075031842
## Lag2 -0.072499482 -0.08551314 0.059166717
## Lag3 0.060657175 -0.06928771 -0.071243639
## Lag4 -0.075675027 -0.06107462 -0.007825873
## Lag5 1.000000000 -0.05851741 0.011012698
## Volume -0.058517414 1.00000000 -0.033077783
## Today 0.011012698 -0.03307778 1.000000000
Based on the scatterplot matrix and the correlation matrix above, the only clear pattern is a strong positive correlation between Year and Volume (about 0.84): trading volume has grown steadily over time. The lag variables show essentially no correlation with each other or with Today.
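A closer look at that relationship (a quick sketch using ggplot2, which is loaded with the tidyverse above; plot not shown here):

# Volume against Year, with a smoother to show the trend
ggplot(Weekly, aes(x = Year, y = Volume)) +
  geom_point(alpha = 0.3) +
  geom_smooth(se = FALSE)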
(b) Use the full data set to perform a logistic regression with Direction as the response and the five lag variables plus Volume as predictors. Use the summary function to print the results. Do any of the predictors appear to be statistically significant? If so, which ones?
glmf=glm(Direction~Lag1+Lag2+Lag3+Lag4+Lag5+Volume,
data = Weekly, family = binomial)
summary(glmf)
##
## Call:
## glm(formula = Direction ~ Lag1 + Lag2 + Lag3 + Lag4 + Lag5 +
## Volume, family = binomial, data = Weekly)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -1.6949 -1.2565 0.9913 1.0849 1.4579
##
## Coefficients:
## Estimate Std. Error z value Pr(>|z|)
## (Intercept) 0.26686 0.08593 3.106 0.0019 **
## Lag1 -0.04127 0.02641 -1.563 0.1181
## Lag2 0.05844 0.02686 2.175 0.0296 *
## Lag3 -0.01606 0.02666 -0.602 0.5469
## Lag4 -0.02779 0.02646 -1.050 0.2937
## Lag5 -0.01447 0.02638 -0.549 0.5833
## Volume -0.02274 0.03690 -0.616 0.5377
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for binomial family taken to be 1)
##
## Null deviance: 1496.2 on 1088 degrees of freedom
## Residual deviance: 1486.4 on 1082 degrees of freedom
## AIC: 1500.4
##
## Number of Fisher Scoring iterations: 4
Lag2 is the only predictor that appears statistically significant (p = 0.0296), and even that evidence is fairly weak.
(c) Compute the confusion matrix and overall fraction of correct predictions. Explain what the confusion matrix is telling you about the types of mistakes made by logistic regression.
glmprob.wk = predict(glmf, type = "response")
glmpred.wk = rep("Down", length(glmprob.wk))
glmpred.wk[glmprob.wk > 0.5] = "Up"
table(glmpred.wk, Weekly$Direction)
##
## glmpred.wk Down Up
## Down 54 48
## Up 430 557
mean(glmpred.wk == Weekly$Direction)
## [1] 0.5610652
Our model correctly predicted 557 of the 605 weeks in which the market went up, but only 54 of the 484 weeks in which it went down. In other words, logistic regression predicts "Up" for the vast majority of weeks, so its mistakes are overwhelmingly false "Up" predictions (430 of them) rather than false "Down" predictions (48). Overall, about 56.11% of the weeks are predicted correctly on the training data.
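Breaking the confusion matrix down by true class makes this asymmetry explicit (a small check using the objects created above):

conf <- table(glmpred.wk, Weekly$Direction)
conf["Up", "Up"] / sum(conf[, "Up"])       # fraction of Up weeks caught: 557/605, about 0.92
conf["Down", "Down"] / sum(conf[, "Down"]) # fraction of Down weeks caught: 54/484, about 0.11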
(d) Now fit the logistic regression model using a training data period from 1990 to 2008, with Lag2 as the only predictor. Compute the confusion matrix and the overall fraction of correct predictions for the held out data (that is, the data from 2009 and 2010).
train = (Weekly$Year < 2009)
weekly09 = Weekly[!train, ]
direction09 = Weekly$Direction[!train]
dim(weekly09)
## [1] 104 9
glm_fit = glm(Direction ~ Lag2, data = Weekly, family = binomial, subset = train)
glm_probability = predict(glm_fit, weekly09, type = "response")
glm_prediction = rep("Down", 104)
glm_prediction[glm_probability > .5] = "Up"
table(glm_prediction, direction09)
## direction09
## glm_prediction Down Up
## Down 9 5
## Up 34 56
Our model correctly predicted that the market would go up in 56 weeks and down in 9 weeks of the 2009–2010 hold-out period, so 62.5% (65 of 104) of the held-out weeks are predicted correctly.
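The 62.5% figure can be confirmed directly from the prediction vector:

mean(glm_prediction == direction09)  # (9 + 56) / 104 = 0.625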
(e) Repeat (d) using LDA.
ldafit = lda(Direction ~ Lag2, data = Weekly, subset = train)
ldafit
## Call:
## lda(Direction ~ Lag2, data = Weekly, subset = train)
##
## Prior probabilities of groups:
## Down Up
## 0.4477157 0.5522843
##
## Group means:
## Lag2
## Down -0.03568254
## Up 0.26036581
##
## Coefficients of linear discriminants:
## LD1
## Lag2 0.4414162
lda.prediction = predict(ldafit, weekly09)
names(lda.prediction)
## [1] "class" "posterior" "x"
ldaclass = lda.prediction$class
table(ldaclass, direction09)
## direction09
## ldaclass Down Up
## Down 9 5
## Up 34 56
The prior probability for Down is 0.4477157 and for Up is 0.5522843. The model tells us that we correctly predicted the market would go up in 56 weeks and down in 9 weeks, the same results as in part (d). So the LDA method produces the same hold-out predictions as the logistic regression.
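That agreement can be verified directly (a quick check using the prediction vectors above; output not shown):

table(ldaclass, glm_prediction)
mean(ldaclass == glm_prediction)  # equals 1 if the two methods agree on every week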
(f) Repeat (d) using QDA.
weeklyqda = qda(Direction ~ Lag2, data = Weekly, subset = train)
weeklyqda
## Call:
## qda(Direction ~ Lag2, data = Weekly, subset = train)
##
## Prior probabilities of groups:
## Down Up
## 0.4477157 0.5522843
##
## Group means:
## Lag2
## Down -0.03568254
## Up 0.26036581
classqda = predict(weeklyqda, weekly09)$class
table(classqda, direction09)
## direction09
## classqda Down Up
## Down 0 0
## Up 43 61
The prior probabilities are the same as for LDA (0.4477157 for Down and 0.5522843 for Up). However, this model predicts "Up" for every week in the hold-out period: it correctly predicts the 61 weeks the market went up but none of the 43 weeks it went down, for an overall accuracy of 58.65%.
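One way to see why QDA never predicts Down here is to inspect its posterior probabilities (a quick check; output not shown):

qda.posterior = predict(weeklyqda, weekly09)$posterior
summary(qda.posterior[, "Up"])  # if every value exceeds 0.5, every week is classified as Up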
(g) Repeat (d) using KNN with K = 1.
trainX = cbind(Weekly$Lag2)[train, ]
testX = cbind(Weekly$Lag2)[!train, ]
direction.train = Weekly$Direction[train]
dim(trainX) = c(985, 1)
dim(testX) = c(104, 1)
set.seed(1)
knnprediction = knn(trainX, testX, direction.train, k = 1)
table(knnprediction, direction09)
## direction09
## knnprediction Down Up
## Down 21 30
## Up 22 31
With K = 1, the model correctly predicted the market would go up in 31 weeks and down in 21 weeks, so only 50% (52 of 104) of the held-out weeks are predicted correctly, no better than random guessing.
(h) Repeat (d) using naive Bayes.
nbayes = naiveBayes(Direction ~ Lag2, data = Weekly, subset = train)
nbayes
##
## Naive Bayes Classifier for Discrete Predictors
##
## Call:
## naiveBayes.default(x = X, y = Y, laplace = laplace)
##
## A-priori probabilities:
## Y
## Down Up
## 0.4477157 0.5522843
##
## Conditional probabilities:
## Lag2
## Y [,1] [,2]
## Down -0.03568254 2.199504
## Up 0.26036581 2.317485
nbayes.class = predict(nbayes, weekly09)
table(nbayes.class, direction09)
## direction09
## nbayes.class Down Up
## Down 0 0
## Up 43 61
Once again, the prior probability for Down is 0.4477157 and for Up is 0.5522843. Like QDA, this model predicts "Up" for every week: it correctly predicts the 61 weeks the market went up and none of the 43 weeks it went down, for an overall accuracy of 58.65%, identical to the QDA model.
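With a single numeric predictor, naiveBayes() here reduces to a Gaussian class-conditional model, so its predictions can be reproduced by hand from the printed priors and the per-class mean and standard deviation of Lag2 shown above (a sketch; it assumes the [,1] and [,2] columns are the mean and standard deviation, which is how naiveBayes reports numeric predictors):

lag2.test = weekly09$Lag2
up.dens   = 0.5522843 * dnorm(lag2.test, mean = 0.26036581, sd = 2.317485)
down.dens = 0.4477157 * dnorm(lag2.test, mean = -0.03568254, sd = 2.199504)
post.up = up.dens / (up.dens + down.dens)
table(ifelse(post.up > 0.5, "Up", "Down"))  # should reproduce the predictions above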
(i) Which of these methods appears to provide the best results on this data?
Logistic regression with Lag2 and LDA both predicted the market correctly 62.5% of the time on the held-out data, the highest of all the models, so those two methods appear to provide the best results.
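Collecting the hold-out accuracies side by side (a small helper using the prediction vectors created above; output not shown):

sapply(list(logistic = glm_prediction, lda = ldaclass, qda = classqda,
            knn1 = knnprediction, naive_bayes = nbayes.class),
       function(pred) mean(pred == direction09))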
(j) Experiment with different combinations of predictors, including possible transformations and interactions, for each of the methods. Report the variables, method, and associated confusion matrix that appears to provide the best results on the held out data. Note that you should also experiment with values for K in the KNN classifier.
glm2 = glm(Direction ~ Lag2:Lag3, data = Weekly, family = binomial, subset = train)
glmprobability2 = predict(glm2, weekly09, type = "response")
glmprediction2 = rep("Down", 104)
glmprediction2[glmprobability2 > .5] = "Up"
table(glmprediction2, direction09)
## direction09
## glmprediction2 Down Up
## Up 34 56
## Down 9 5
Using a logistic regression with the Lag2:Lag3 interaction as the predictor, the model correctly predicted the market would go up in 56 weeks and down in 9 weeks, which is again 62.5% of the held-out weeks.
lda2 = lda(Direction ~ Lag2^2, data = Weekly, subset = train)
lda2
## Call:
## lda(Direction ~ Lag2^2, data = Weekly, subset = train)
##
## Prior probabilities of groups:
## Down Up
## 0.4477157 0.5522843
##
## Group means:
## Lag2
## Down -0.03568254
## Up 0.26036581
##
## Coefficients of linear discriminants:
## LD1
## Lag2 0.4414162
ldapred2 = predict(lda2, weekly09)
names(ldapred2)
## [1] "class" "posterior" "x"
lda2class = ldapred2$class
table(lda2class, direction09)
## direction09
## lda2class Down Up
## Down 9 5
## Up 34 56
Note that inside a model formula Lag2^2 is not a squared term: the ^ operator denotes formula crossing, so Lag2^2 reduces to just Lag2. That is why this fit is identical to the LDA fit in part (e), with 56 Up weeks and 9 Down weeks correct, or 62.5% of the held-out weeks.
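To genuinely include a squared term, the power has to be protected with I(). A sketch of how that would look (lda.sq is a hypothetical name and its output is not shown here):

lda.sq = lda(Direction ~ Lag2 + I(Lag2^2), data = Weekly, subset = train)
lda.sq.class = predict(lda.sq, weekly09)$class
table(lda.sq.class, direction09)
mean(lda.sq.class == direction09)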
qda2 = qda(Direction ~ Lag2:Lag3, data = Weekly, subset = train)
qda2
## Call:
## qda(Direction ~ Lag2:Lag3, data = Weekly, subset = train)
##
## Prior probabilities of groups:
## Down Up
## 0.4477157 0.5522843
##
## Group means:
## Lag2:Lag3
## Down -0.1937158
## Up -0.6405132
classqda2 = predict(qda2, weekly09)$class
table(classqda2, direction09)
## direction09
## classqda2 Down Up
## Down 6 8
## Up 37 53
Using QDA with the Lag2:Lag3 interaction as the predictor, the model correctly predicted the market would go up in 53 weeks and down in 6 weeks, so 56.73% of the held-out weeks are predicted correctly.
Xtrain = cbind(Weekly$Lag2)[train, ]
Xtest = cbind(Weekly$Lag2)[!train, ]
Directiontrain = Weekly$Direction[train]
dim(Xtrain) = c(985, 1)
dim(Xtest) = c(104, 1)
set.seed(1)
knn2 = knn(Xtrain, Xtest, Directiontrain, k = 15)
table(knn2, direction09)
## direction09
## knn2 Down Up
## Down 20 20
## Up 23 41
With K = 15, the KNN model correctly predicted the market would go up in 41 weeks and down in 20 weeks, for 58.65% accuracy on the held-out data.
Xtrain2 = cbind(Weekly$Lag2)[train, ]
Xtest2 = cbind(Weekly$Lag2)[!train, ]
Directiontrain2 = Weekly$Direction[train]
dim(Xtrain2) = c(985, 1)
dim(Xtest2) = c(104, 1)
set.seed(1)
knn3 = knn(Xtrain2, Xtest2, Directiontrain2, k = 25)
table(knn3, direction09)
## direction09
## knn3 Down Up
## Down 19 25
## Up 24 36
With K = 25, the KNN model correctly predicted the market would go up in 36 weeks and down in 19 weeks, for 52.88% accuracy on the held-out data.
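Rather than refitting each K by hand, a small loop over a grid of K values makes the comparison easier (a sketch using the Xtrain/Xtest matrices built above; the exact accuracies depend on the random tie-breaking controlled by set.seed, and output is not shown):

set.seed(1)
k.grid = c(1, 5, 10, 15, 25, 50)
sapply(k.grid, function(k) {
  knn.pred = knn(Xtrain, Xtest, Directiontrain, k = k)
  mean(knn.pred == direction09)
})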
…