Consider the data set given below
x <- c(0.18, -1.54, 0.42, 0.95)
And weights given by
w <- c(2, 1, 3, 1)
Give the value of $\mu$ that minimizes the least squares equation $\sum_{i=1}^{n} w_i (x_i - \mu)^2$.
0.1471 1.077 0.0025 0.300
x <- c(0.18, -1.54, 0.42, 0.95)
w <- c(2, 1, 3, 1)
minu <- sum(x * w) / sum(w)      # weighted mean: the minimizer of the weighted sum of squares
final <- sum(w * (x - minu)^2)   # the minimized value itself
c(minu, final)
## [1] 0.1471429 3.7165429
mu <- c(0.1471, 1.077, 0.0025, 0.300)
for (v in mu) {
  print(c(v, sum(w * (x - v)^2)))
}
## [1] 0.147100 3.716543
## [1] 1.077000 9.768983
## [1] 0.002500 3.862994
## [1] 0.3000 3.8801
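As a cross-check (a sketch, not part of the original solution; fit is just an illustrative name), an intercept-only weighted least squares fit recovers the same weighted mean:
fit <- lm(x ~ 1, weights = w)   # weighted intercept-only model minimizes sum(w * (x - mu)^2)
coef(fit)                       # the fitted intercept should match minu, roughly 0.1471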
Consider the following data set
x <- c(0.8, 0.47, 0.51, 0.73, 0.36, 0.58, 0.57, 0.85, 0.44, 0.42)
y <- c(1.39, 0.72, 1.55, 0.48, 1.19, -1.59, 1.23, -0.65, 1.49, 0.05)
Fit the regression through the origin and get the slope treating y as the outcome and x as the regressor. (Hint: do not center the data since we want regression through the origin, not through the means of the data.)
0.8263 -0.04462 -1.713 0.59915
x <- c(0.8, 0.47, 0.51, 0.73, 0.36, 0.58, 0.57, 0.85, 0.44, 0.42)
y <- c(1.39, 0.72, 1.55, 0.48, 1.19, -1.59, 1.23, -0.65, 1.49, 0.05)
x <- c(x, x * -1)   # mirror the data through the origin so that
y <- c(y, y * -1)   # both x and y have mean exactly zero
mean(y)
## [1] 0
plot(x,y)
cor(x, y) * sd(y) / sd(x)   # with zero means, this equals the through-origin slope
## [1] 0.8262517
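As a further check (a sketch; not part of the original walk-through), the through-origin slope has the closed form $\sum x_i y_i / \sum x_i^2$, and lm() fits through the origin when the intercept is suppressed; both should give roughly 0.826, and the mirroring above leaves this slope unchanged:
sum(x * y) / sum(x^2)    # closed-form slope with no intercept
coef(lm(y ~ x - 1))      # lm() with the intercept removed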
Do data(mtcars) from the datasets package and fit the regression model with mpg as the outcome and weight as the predictor. Give the slope coefficient.
-5.344 0.5591 -9.559 30.2851
data(mtcars)
head(mtcars)
## mpg cyl disp hp drat wt qsec vs am gear carb
## Mazda RX4 21.0 6 160 110 3.90 2.620 16.46 0 1 4 4
## Mazda RX4 Wag 21.0 6 160 110 3.90 2.875 17.02 0 1 4 4
## Datsun 710 22.8 4 108 93 3.85 2.320 18.61 1 1 4 1
## Hornet 4 Drive 21.4 6 258 110 3.08 3.215 19.44 1 0 3 1
## Hornet Sportabout 18.7 8 360 175 3.15 3.440 17.02 0 0 3 2
## Valiant 18.1 6 225 105 2.76 3.460 20.22 1 0 3 1
x <- mtcars$wt    # predictor: weight (1000 lbs)
y <- mtcars$mpg   # outcome: miles per gallon
cor(x, y) * sd(y) / sd(x)   # slope = Cor(Y, X) * SD(Y) / SD(X)
## [1] -5.344472
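The same slope can be read off an lm() fit directly (a sketch for verification; the coefficient on wt should again be roughly -5.34):
coef(lm(mpg ~ wt, data = mtcars))   # second element is the slope on weight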
Consider data with an outcome (Y) and a predictor (X). The standard deviation of the predictor is one half that of the outcome. The correlation between the two variables is 0.5. What value would the slope coefficient be for the regression model with Y as the outcome and X as the predictor?
4 1 0.25 3
sx <- 1 / 2    # SD of the predictor is half the SD of the outcome
sy <- 1
rho <- 0.5     # correlation between X and Y
rho * sy / sx  # slope = Cor(Y, X) * SD(Y) / SD(X)
## [1] 1
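In symbols: the slope is $Cor(Y,X)\,SD(Y)/SD(X) = 0.5 \times (1 / 0.5) = 1$, since the predictor's standard deviation is half the outcome's.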
Students were given two hard tests and scores were normalized to have empirical mean 0 and variance 1. The correlation between the scores on the two tests was 0.4. What would be the expected score on Quiz 2 for a student who had a normalized score of 1.5 on Quiz 1?
0.16 0.4 0.6 1.0
1.5 * .4
## [1] 0.6
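Because both scores are normalized to mean 0 and variance 1, the regression line has intercept 0 and slope equal to the correlation, so the expected Quiz 2 score is $0.4 \times 1.5 = 0.6$.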
Consider the data given by the following
x <- c(8.58, 10.46, 9.01, 9.64, 8.86)
What is the value of the first measurement if x were normalized (to have mean 0 and variance 1)?
8.86 -0.9719 9.31 8.58
x <- c(8.58, 10.46, 9.01, 9.64, 8.86)
zx <- (x - mean(x)) / sd(x)   # standardize: subtract the mean, divide by the SD
zx[1]
## [1] -0.9718658
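Base R's scale() does the same centering and scaling, so it offers a quick check (a sketch):
scale(x)[1]   # first element of the normalized vector, roughly -0.972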
Consider the following data set (used above as well). What is the intercept for fitting the model with x as the predictor and y as the outcome?
x <- c(0.8, 0.47, 0.51, 0.73, 0.36, 0.58, 0.57, 0.85, 0.44, 0.42)
y <- c(1.39, 0.72, 1.55, 0.48, 1.19, -1.59, 1.23, -0.65, 1.49, 0.05)
2.105 1.252 1.567 -1.713
x <- c(0.8, 0.47, 0.51, 0.73, 0.36, 0.58, 0.57, 0.85, 0.44, 0.42)
y <- c(1.39, 0.72, 1.55, 0.48, 1.19, -1.59, 1.23, -0.65, 1.49, 0.05)
plot(x, y)   # x on the horizontal axis, since x is the predictor
b1 <- cor(x, y) * sd(y) / sd(x)   # slope
b0 <- mean(y) - b1 * mean(x)      # intercept: the fitted line passes through the means
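To see the actual numbers, lm() returns both coefficients (a sketch for verification; the intercept should be roughly 1.567, and -1.713 among the choices is the slope, not the intercept):
coef(lm(y ~ x))   # (Intercept) first, then the slope on x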
You know that both the predictor and response have mean 0. What can be said about the intercept when you fit a linear regression?
It is undefined as you have to divide by zero. It must be identically 0. It must be exactly one. Nothing about the intercept can be said from the information given.
b1 <- 1000000000000    # pick any slope at all
b0 <- 0 - b1 * 0       # intercept = mean(y) - slope * mean(x) = 0 when both means are zero
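A quick simulated check (a sketch; the seed, sample size, and variable names here are illustrative, not from the original): once both variables are centered, the fitted intercept is zero up to round-off.
set.seed(1)
xs <- rnorm(100)             # hypothetical predictor
ys <- 2 * xs + rnorm(100)    # hypothetical response
xc <- xs - mean(xs)          # centered predictor, mean 0
yc <- ys - mean(ys)          # centered response, mean 0
coef(lm(yc ~ xc))[1]         # intercept is essentially 0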
Consider the data given by
x <- c(0.8, 0.47, 0.51, 0.73, 0.36, 0.58, 0.57, 0.85, 0.44, 0.42)
What value minimizes the sum of the squared distances between these points and that value?
0.573 0.8 0.36 0.44
x <- c(0.8, 0.47, 0.51, 0.73, 0.36, 0.58, 0.57, 0.85, 0.44, 0.42)
mean(x)
## [1] 0.573
for (u in c(0.573, 0.8, 0.36, 0.44)) {
  SSE <- sum((x - u)^2)
  print(c(u, SSE))
}
## [1] 0.57300 0.25401
## [1] 0.8000 0.7693
## [1] 0.3600 0.7077
## [1] 0.4400 0.4309
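Why the mean wins (a short derivation): setting the derivative to zero, $\frac{d}{d\mu}\sum_{i=1}^{n}(x_i-\mu)^2 = -2\sum_{i=1}^{n}(x_i-\mu) = 0$, which gives $\mu = \bar{x}$, the sample mean.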
Let the slope having fit Y as the outcome and X as the predictor be denoted as β1. Let the slope from fitting X as the outcome and Y as the predictor be denoted as γ1. Suppose that you divide β1 by γ1; in other words consider β1/γ1. What is this ratio always equal to?
Var(Y)/Var(X) 2SD(Y)/SD(X) 1 Cor(Y,X)
x <- c(0.8, 0.47, 0.51, 0.73, 0.36, 0.58, 0.57, 0.85, 0.44, 0.42)
y <- c(1.39, 0.72, 1.55, 0.48, 1.19, -1.59, 1.23, -0.65, 1.49, 0.05)
beta <- cor(x, y) * sd(y) / sd(x)    # slope from regressing Y on X
gamma <- cor(x, y) * sd(x) / sd(y)   # slope from regressing X on Y
beta / gamma
## [1] 38.39077
var(y)/var(x)
## [1] 38.39077
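Why the ratio is always Var(Y)/Var(X): since $\beta_1 = Cor(Y,X)\,SD(Y)/SD(X)$ and $\gamma_1 = Cor(Y,X)\,SD(X)/SD(Y)$, the correlations cancel and $\beta_1/\gamma_1 = SD(Y)^2/SD(X)^2 = Var(Y)/Var(X)$.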