x <- c(0.61, 0.93, 0.83, 0.35, 0.54, 0.16, 0.91, 0.62, 0.62)
y <- c(0.67, 0.84, 0.6, 0.18, 0.85, 0.47, 1.1, 0.65, 0.36)
Give a P-value for the two-sided hypothesis test of whether \(\beta_1\) from a linear regression model is 0 or not.
Answer: fitting a linear model with x as the predictor and y as the outcome, the p-value for the slope can be read from the coefficients table; in simple linear regression it equals the F-test p-value shown on the bottom line of the summary.
asw <- lm(y ~ x)
summary(asw)
##
## Call:
## lm(formula = y ~ x)
##
## Residuals:
## Min 1Q Median 3Q Max
## -0.27636 -0.18807 0.01364 0.16595 0.27143
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 0.1885 0.2061 0.914 0.391
## x 0.7224 0.3107 2.325 0.053 .
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 0.223 on 7 degrees of freedom
## Multiple R-squared: 0.4358, Adjusted R-squared: 0.3552
## F-statistic: 5.408 on 1 and 7 DF, p-value: 0.05296
Other answer choices: 2.325; 0.391
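Rather than reading the value off the printed table, the slope p-value can also be extracted directly from the model fitted above; a minimal sketch using the coefficient matrix returned by summary():
coef(summary(asw))["x", "Pr(>|t|)"]  # about 0.053, identical to the F-test p-value in simple regression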
Answer: the residual standard error is reported near the bottom of the summary output above: 0.223 on 7 degrees of freedom.
Other answer choices: 0.3552; 0.05296
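The same quantity can be pulled out programmatically, since summary.lm() stores the residual standard error in its sigma component:
summary(asw)$sigma  # 0.223, matching the 'Residual standard error' line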
Answer: using the predict method for linear models (see ?predict.lm) we can set interval = "confidence". Note that the new data (here the mean weight) must be supplied as a data.frame.
data(mtcars)
head(mtcars)
## mpg cyl disp hp drat wt qsec vs am gear carb
## Mazda RX4 21.0 6 160 110 3.90 2.620 16.46 0 1 4 4
## Mazda RX4 Wag 21.0 6 160 110 3.90 2.875 17.02 0 1 4 4
## Datsun 710 22.8 4 108 93 3.85 2.320 18.61 1 1 4 1
## Hornet 4 Drive 21.4 6 258 110 3.08 3.215 19.44 1 0 3 1
## Hornet Sportabout 18.7 8 360 175 3.15 3.440 17.02 0 0 3 2
## Valiant 18.1 6 225 105 2.76 3.460 20.22 1 0 3 1
carslm <- lm(mpg ~ wt, data = mtcars)
meanvalue <- data.frame(wt = mean(mtcars$wt))
predict(carslm, meanvalue, interval = "confidence")
## fit lwr upr
## 1 20.09062 18.99098 21.19027
Other answer choices: -4.00; 21.190; -6.486
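As an optional cross-check (not required by the question), the interval can be reproduced from the usual formula: at the mean of the predictor, the standard error of the fitted mean reduces to \(\hat{\sigma}/\sqrt{n}\).
n <- nrow(mtcars)                   # 32 cars
sigma_hat <- summary(carslm)$sigma  # residual standard error
fit_mean <- predict(carslm, meanvalue)
fit_mean + c(-1, 1) * qt(0.975, df = n - 2) * sigma_hat / sqrt(n)  # reproduces 18.99 and 21.19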
Answer: from the help page (?mtcars) we know that wt is the car's weight in thousands of pounds. Since the model uses wt as the predictor of miles per gallon, the coefficient is the estimated expected change in mpg per 1,000 lb increase in weight.
Other answer choices: "It can’t be interpreted without further information"; "The estimated 1,000 lb change in weight per 1 mpg increase."
Answer: a simple application of predict, giving wt = 3 since wt is the weight in thousands of pounds.
newcarweight <- data.frame(wt = 3)
predict(carslm, newcarweight, interval = "prediction")
## fit lwr upr
## 1 21.25171 14.92987 27.57355
Other answer choices: -5.77; 21.25; 14.93
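Again as an optional cross-check, the prediction interval follows from the standard error for a new observation, \(\hat{\sigma}\sqrt{1 + 1/n + (x_0 - \bar{x})^2/\sum_i (x_i - \bar{x})^2}\):
x0 <- 3
xbar <- mean(mtcars$wt)
sxx <- sum((mtcars$wt - xbar)^2)
n <- nrow(mtcars)
sigma_hat <- summary(carslm)$sigma
se_new <- sigma_hat * sqrt(1 + 1/n + (x0 - xbar)^2 / sxx)
predict(carslm, newcarweight) + c(-1, 1) * qt(0.975, df = n - 2) * se_new  # reproduces 14.93 and 27.57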
Answer: in this case we rescale the previous predictor, dividing wt by 2.
newpredictor <- lm(mpg ~ I(wt / 2), data = mtcars)
newsummary <- summary(newpredictor)$coefficients
newsummary
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 37.28513 1.877627 19.857575 8.241799e-19
## I(wt/2) -10.68894 1.118202 -9.559044 1.293959e-10
confint(newpredictor)
## 2.5 % 97.5 %
## (Intercept) 33.45050 41.11975
## I(wt/2) -12.97262 -8.40527
Other answer choices: -6.486; -9.000
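Since wt/2 is simply a rescaling of wt, the new slope and its confidence interval should be exactly twice those of the original model; a quick consistency check against the output above:
2 * coef(carslm)["wt"]       # -10.69, matches the I(wt/2) estimate
2 * confint(carslm)["wt", ]  # -12.97 to -8.41, matches confint(newpredictor)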
Answer: when you multiply a regression predictor by a factor, the slope is divided by that factor. So multiplying x by 1/100 divides \(\beta_1\) by 1/100, which is the same as multiplying it by 100.
Other answer choices: "It would get multiplied by 10"; "It would get divided by 10"
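A small demonstration with the x and y data from the first question, using the factor 1/100 mentioned in the answer:
coef(lm(y ~ x))[2]           # original slope
coef(lm(y ~ I(x / 100)))[2]  # same fit, slope multiplied by 100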
Answer: if you only shift the predictor by a constant c, the slope remains the same, and the intercept changes from \(\widehat{\beta}_0\) to \(\widehat{\beta}_0 - c\widehat{\beta}_1\) (the constant times the slope, with opposite sign).
Other answer choices: "The new slope would be \(\widehat{\beta}_1 + c\)"; "The new slope would be \(c\widehat{\beta}_1\)"
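A short demonstration of the shift, again with the x and y data from the first question; the constant c0 = 5 is an arbitrary value chosen only for illustration:
c0 <- 5                  # arbitrary shift constant
coef(lm(y ~ x))          # original intercept b0 and slope b1
coef(lm(y ~ I(x + c0)))  # slope unchanged; intercept becomes b0 - c0 * b1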
Answer: the required ratio is the sum of squared errors of the model with intercept and slope (numerator) divided by that of the intercept-only model (denominator), which comes out to about 0.25.
InterSlope <- lm(mpg ~ wt, data = mtcars)
JustInter <- lm(mpg ~ 1, data = mtcars)
num <- sum((predict(InterSlope) - mtcars$mpg)^2)
den <- sum((predict(JustInter) - mtcars$mpg)^2)
num / den
## [1] 0.2471672
Other answer choices: 0.75; 4.00; 0.50
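Equivalently, this ratio is one minus the R-squared of the model with the slope, which gives the same value:
1 - summary(InterSlope)$r.squared  # about 0.247, i.e. roughly 0.25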
Answer: if an intercept is included in the model, the least-squares residuals always sum to zero (this follows from the normal equations); without an intercept this need not hold.
Other answer choices: "If an intercept is included, the residuals most likely won’t sum to zero"; "The residuals must always sum to zero."
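A quick numerical illustration with the mtcars fit from above: with an intercept the residuals sum to (numerically) zero, while a fit without an intercept is not constrained that way.
sum(resid(carslm))                           # essentially zero, up to floating-point error
sum(resid(lm(mpg ~ wt - 1, data = mtcars)))  # generally not zero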