\[\hat{weight} = 120.07 - 1.93 \times parity\]
The slope indicates that babies who are not first born (parity = 1) weigh, on average, 1.93 ounces less than first-born babies.
For first born (parity = 0): \(\hat{weight} = 120.07\) ounces
For not first born (parity = 1): \(\hat{weight} = 120.07 - 1.93 \times 1 = 118.14\) ounces
Since the p-value is 0.1052, which is larger than 0.05, we fail to reject the null hypothesis; the data do not provide convincing evidence of a statistically significant relationship between birth weight and parity.
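These estimates could be reproduced from the data; a minimal sketch, assuming the babies data set from the openintro package with birth weight (ounces) in bwt and the first-born indicator in parity coded 0/1:

library(openintro)
# refit the simple regression of birth weight on parity
m_parity <- lm(bwt ~ parity, data = babies)
summary(m_parity)    # intercept and slope as quoted above, plus the p-value for parity
# fitted weights for first born (parity = 0) and not first born (parity = 1)
predict(m_parity, newdata = data.frame(parity = c(0, 1)))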
\[\hat{absence} = 18.93 - 9.11 \times eth + 3.1 \times sex + 2.15 \times lrn\]
Holding all other variables constant:
Non-aboriginal students miss 9.11 fewer days on average.
Male students miss 3.1 more days on average.
Slow learners miss 2.15 more days on average.
\[\hat{absence} = 18.93 - 9.11 \times 0 + 3.1 \times 1 + 2.15 \times 1 = 24.18\]
\[residual = 2 - 24.18 = -22.18\]
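The same prediction and residual can be reproduced directly from the fitted coefficients (a small arithmetic check in R; the observed value of 2 days is the one used above):

# prediction for a student with eth = 0, sex = 1, lrn = 1
absence_hat <- 18.93 - 9.11 * 0 + 3.1 * 1 + 2.15 * 1   # 24.18 days
2 - absence_hat                                        # residual = observed - predicted = -22.18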
\[R^2 = 1 - \frac{Var(residuals)}{Var(y)}\] \[R^2 = 1 - \frac{240.57}{264.17} = 0.0893\]
Adjusted \(R^2 = 1 - \frac{Var(residuals)}{Var(y)} \times \frac{n-1}{n-k-1} = 1 - \frac{240.57}{264.17} \times \frac{146-1}{146-3-1} = 0.07\)
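Both quantities can be verified from the variances reported above (a quick arithmetic check in R, using only numbers already given):

var_res <- 240.57                               # variance of the residuals
var_y   <- 264.17                               # variance of days absent
n <- 146; k <- 3                                # observations, predictors
1 - var_res / var_y                             # R^2, about 0.0893
1 - (var_res / var_y) * (n - 1) / (n - k - 1)   # adjusted R^2, about 0.07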
Because the adjusted \(R^2\) increases when learner status is removed, learner status should be the first variable dropped from the model.
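One way to carry out this backward-elimination step is sketched below (assuming a data frame absent with columns days, eth, sex, and lrn coded as above; this data frame is not part of the original output):

# adjusted R^2 for the full model and for each model with one variable removed
full   <- lm(days ~ eth + sex + lrn, data = absent)
no_eth <- lm(days ~ sex + lrn,       data = absent)
no_sex <- lm(days ~ eth + lrn,       data = absent)
no_lrn <- lm(days ~ eth + sex,       data = absent)
sapply(list(full = full, no_eth = no_eth, no_sex = no_sex, no_lrn = no_lrn),
       function(m) summary(m)$adj.r.squared)
# drop the variable whose removal yields the highest adjusted R^2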
We observed 8 damaged O-rings at temperatures below 63 degrees Fahrenheit and 3 damaged O-rings above 70 degrees Fahrenheit. Low temperature appears to be associated with O-ring damage.
The summary table consists of the intercept, which is the estimated log odds of damage when the temperature is zero degrees, and the slope for temperature: for each one-degree increase in temperature, the log odds of O-ring damage decrease by 0.2162. The p-value indicates how likely an association this strong would be if temperature had no true relationship with O-ring damage.
\[ln(\frac{p_i}{1 - p_i}) = 11.6630 - 0.2162 \times Temperature\]
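Coefficients of this form come from a logistic regression fit; a sketch, assuming a data frame orings_df with a 0/1 damage indicator damaged and a temperature column (this data frame is not shown in the original analysis):

# logistic regression of O-ring damage on launch temperature
fit <- glm(damaged ~ temperature, data = orings_df, family = binomial)
summary(fit)   # coefficients correspond to the intercept and slope quoted above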
Yes. A p-value reported as 0.0000 provides very strong evidence of an association between low temperature and O-ring damage. Since O-rings are critical parts, I believe the concerns regarding O-rings are justified.
\[ln(\frac{\hat{p}}{1 - \hat{p}}) = 11.6630 - 0.2162 \times temperature\]
Solving the equation for \(\hat{p}\), we obtain
\[\hat{p} = \frac{e^{11.6630 - 0.2162 \times temperature}}{1 + e^{11.6630 - 0.2162 \times temperature}}\]
# probability of O-ring damage at temperature t (degrees Fahrenheit), from the fitted model
p <- function(t){
  exp(11.6630 - 0.2162*t) / (1 + exp(11.6630 - 0.2162*t))
}
p(51) #temperature = 51
## [1] 0.6540297
p(53) #temperature = 53
## [1] 0.5509228
p(55) #temperature = 55
## [1] 0.4432456
t <- c(53,57,58,63,66,67,67,67,68,69,70,70,70,70,72,73,75,75,76,76,78,79,81)  # temperatures (F) at which to evaluate the model
prob <- p(t)               # estimated probability of damage at each temperature
df <- data.frame(t, prob)
library(ggplot2)
ggplot(df, aes(x = t, y = prob)) +
  geom_point() +
  geom_smooth()
## `geom_smooth()` using method = 'loess' and formula 'y ~ x'
My concern is that we do not know whether these observations are independent, which is an assumption required for logistic regression.