Regression Model Quiz 2
- Consider the following data with x as the predictor and y as the outcome. Give a P-value for the two-sided hypothesis test of whether beta1 from a linear regression model is 0 or not.
x <- c(0.61, 0.93, 0.83, 0.35, 0.54, 0.16, 0.91, 0.62, 0.62)
y <- c(0.67, 0.84, 0.6, 0.18, 0.85, 0.47, 1.1, 0.65, 0.36)
fit <- lm(y ~ x)
summary(fit)$coefficients
##              Estimate Std. Error   t value   Pr(>|t|)
## (Intercept) 0.1884572  0.2061290 0.9142681 0.39098029
## x           0.7224211  0.3106531 2.3254912 0.05296439
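The same P-value can be recovered by hand from the standard formulas for the slope and its standard error; a quick sketch in base R (not part of the original quiz code):
beta1 <- cor(y, x) * sd(y) / sd(x)           # slope estimate
beta0 <- mean(y) - beta1 * mean(x)           # intercept estimate
e <- y - beta0 - beta1 * x                   # residuals
sigma <- sqrt(sum(e^2) / (length(x) - 2))    # residual standard deviation
se1 <- sigma / sqrt(sum((x - mean(x))^2))    # standard error of the slope
2 * pt(abs(beta1 / se1), df = length(x) - 2, lower.tail = FALSE)
## [1] 0.05296439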
- Consider the previous problem, give the estimate of the residual standard deviation
summary(fit)$sigma
## [1] 0.2229981
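Equivalently, by hand: the residual variance estimate is the sum of squared residuals divided by n - 2, so its square root reproduces sigma.
sqrt(sum(resid(fit)^2) / (length(x) - 2))
## [1] 0.2229981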
- In the mtcars data set, fit a linear regression model of weight (predictor) on mpg (outcome). Get a 95% confidence interval for the expected mpg at the average weight. What is the lower endpoint?
data(mtcars)
x <- mtcars$wt
y <- mtcars$mpg
fit <- lm(y ~ x)
predict(fit, data.frame(x = mean(x)), interval = "confidence")
## fit lwr upr
## 1 20.09062 18.99098 21.19027
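For intuition, the interval can be rebuilt by hand. The standard error of the fitted mean at x0 is sigma * sqrt(1/n + (x0 - xbar)^2 / Sxx), which collapses to sigma / sqrt(n) at x0 = xbar; a sketch using the fit from above:
n <- length(x)
se <- summary(fit)$sigma / sqrt(n)   # SE of the fitted mean at the average weight
unname(predict(fit, data.frame(x = mean(x)))) + c(-1, 1) * qt(0.975, df = fit$df.residual) * se
## [1] 18.99098 21.19027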
- Refer to the previous question. Read the help file for mtcars. What is the weight coefficient interpreted as?
The estimated expected change in mpg per 1000 lbs increase in weight.
- Consider again the mtcars data set and a linear regression model with mpg as predicted by weight (1000 lbs). A new car is coming weighing 3000 pounds. Construct a 95% prediction interval for its mpg. What is the upper endpoint?
predict(fit, data.frame(x = 3), interval = "prediction")
## fit lwr upr
## 1 21.25171 14.92987 27.57355
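By hand, the prediction interval adds an extra 1 inside the standard error to account for the new observation's own variability; a sketch using the same fit:
x0 <- 3
sxx <- sum((x - mean(x))^2)
se <- summary(fit)$sigma * sqrt(1 + 1/length(x) + (x0 - mean(x))^2 / sxx)
unname(predict(fit, data.frame(x = x0))) + c(-1, 1) * qt(0.975, df = fit$df.residual) * se
## [1] 14.92987 27.57355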
- Consider again the mtcars data set and a linear regression model with mpg as predicted by weight (in 1000 lbs). A “short” ton is defined as 2000 lbs. Construct a 95% confidence interval for the expected change in mpg per 1 short ton increase in weight. Give the lower endpoint.
fit2 <- lm(y ~ I(x/2))
sumcoef <- summary(fit2)$coefficients
sumcoef[2, 1] + c(-1, 1) * qt(0.975, df = fit2$df.residual) * sumcoef[2, 2]
## [1] -12.97262 -8.40527
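The same endpoints drop out of confint() on the original fit: a one short ton increase is 2 units of x, so the per-1000-lb slope interval simply doubles.
confint(fit)[2, ] * 2
##     2.5 %   97.5 %
## -12.97262 -8.40527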
- If my X from a linear regression is measured in centimeters and I convert it to meters, what would happen to the slope coefficient?
It would be multiplied by 100, since a one-meter increase is a 100-centimeter increase. The mtcars fit demonstrates the effect of dividing x by 100:
fit$coefficients
## (Intercept) x
## 37.285126 -5.344472
fit2 <- lm(y ~ I(x/100))
fit2$coefficients
## (Intercept) I(x/100)
## 37.28513 -534.44716
- I have an outcome, Y, and a predictor, X, and fit a linear regression model with Y = beta0 + beta1*X + e to obtain estimates beta0_h and beta1_h. What would be the consequence for the subsequent slope and intercept if I were to refit the model with a new regressor, X + c, for some constant c?
The slope is unchanged and the new intercept becomes beta0_h - c*beta1_h, as demonstrated below with c = 10:
c <- 10
b0 <- fit$coefficients[1]
b1 <- fit$coefficients[2]
b0 - b1*c
## (Intercept)
## 90.72984
fit3 <- lm(y ~ I(x+c))
fit3$coefficients
## (Intercept) I(x + c)
## 90.729842 -5.344472
- Refer back to the mtcars data set with mpg as an outcome and weight (wt) as the predictor. About what is the ratio of the sum of the squared errors when comparing a model with just an intercept (denominator) to the model with the intercept and slope (numerator)?
# In an intercept-only model every fitted value Yi_h equals Y_bar, so that model's SSE is sum((y - mean(y))^2).
data(mtcars)
y <- mtcars$mpg
x <- mtcars$wt
fit_car <- lm(y ~ x)
sum(resid(fit_car)^2) / sum((y - mean(y))^2)
## [1] 0.2471672
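Since R^2 = 1 - SSE/SStot for a model with an intercept, this ratio is just 1 - R^2:
1 - summary(fit_car)$r.squared
## [1] 0.2471672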
- Do the residuals always have to sum to 0 in linear regression?
No. The residuals are guaranteed to sum to 0 only when an intercept is included in the model.
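A quick check with mtcars: with an intercept the residuals sum to zero up to floating-point error, while a regression through the origin (dropping the intercept via - 1) generally does not:
sum(resid(lm(mpg ~ wt, data = mtcars)))      # intercept included: sums to ~0
sum(resid(lm(mpg ~ wt - 1, data = mtcars)))  # through the origin: need not sum to 0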