Data input

x <- c(21, 24, 32, 47, 50, 59, 68, 74, 62, 50, 41, 30)   # average ambient temperature for each month
y <- c(185.79, 214.47, 288.03, 424.84, 454.68, 539.03, 621.55, 675.06, 562.03, 452.93, 369.95, 273.98)   # monthly steam usage
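
As a quick sanity check (not asked for in the question), a scatter plot of steam usage against ambient temperature shows the roughly linear relationship we are about to model:

plot(x, y,
     xlab = "Average ambient temperature",
     ylab = "Steam usage",
     main = "Steam usage vs ambient temperature",
     pch  = 19)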

Fitting our simple linear regression model

model <- lm(y~x)
model
## 
## Call:
## lm(formula = y ~ x)
## 
## Coefficients:
## (Intercept)            x  
##      -6.332        9.208
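
From the output, the fitted line is yhat = -6.332 + 9.208*x. As a small sketch (not part of the original solution), applying these estimated coefficients by hand should reproduce R's fitted values:

coef(model)                                     # intercept and slope estimates
yhat <- coef(model)[1] + coef(model)[2] * x     # fitted values computed by hand
all.equal(unname(yhat), unname(fitted(model)))  # should be TRUE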

Qb) Test for significance of regression

summary(model)
## 
## Call:
## lm(formula = y ~ x)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -2.5629 -1.2581 -0.2550  0.8681  4.0581 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept) -6.33209    1.67005  -3.792  0.00353 ** 
## x            9.20847    0.03382 272.255  < 2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 1.946 on 10 degrees of freedom
## Multiple R-squared:  0.9999, Adjusted R-squared:  0.9999 
## F-statistic: 7.412e+04 on 1 and 10 DF,  p-value: < 2.2e-16

We can see from the above analysis that the p-value for the slope is < 2e-16, which is less than 0.05. Hence we reject the null hypothesis that the slope is zero and conclude that the regression model is significant.
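
For illustration, the same test can be reproduced by hand from the coefficient table: the t statistic is the slope estimate divided by its standard error, and in simple linear regression the F statistic is the square of that t statistic. This sketch only reuses values already shown by summary(model):

est    <- summary(model)$coefficients
t_stat <- est["x", "Estimate"] / est["x", "Std. Error"]           # 272.255, as reported
p_val  <- 2 * pt(abs(t_stat), df = df.residual(model), lower.tail = FALSE)
c(t = t_stat, p = p_val)
t_stat^2                                                          # equals the reported F statistic, about 7.412e+04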

Qd) Construct a 99% prediction interval on steam usage in a month with an average ambient temperature of 58

xnew <- c(58)
pred <- predict(model,data.frame(x=xnew),interval = "prediction", level = 0.99)

pred
##       fit      lwr      upr
## 1 527.759 521.2237 534.2944

Hence the 99% prediction interval is (521.2237, 534.2944): the lower limit is 521.2237 and the upper limit is 534.2944.
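
As a check on where these limits come from, the interval can be rebuilt from the prediction-interval formula yhat0 ± t(0.995, n-2) * s * sqrt(1 + 1/n + (x0 - xbar)^2 / Sxx). This sketch should reproduce the output of predict() above:

x0      <- 58
n       <- length(x)
s       <- summary(model)$sigma                      # residual standard error, 1.946
Sxx     <- sum((x - mean(x))^2)
yhat0   <- unname(coef(model)[1] + coef(model)[2] * x0)
se_pred <- s * sqrt(1 + 1/n + (x0 - mean(x))^2 / Sxx)
tcrit   <- qt(0.995, df = n - 2)                     # two-sided 99% interval uses the 0.995 quantile
c(fit = yhat0, lwr = yhat0 - tcrit * se_pred, upr = yhat0 + tcrit * se_pred)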