library(MASS)
Fit a first-order multiple linear regression model with an interaction term relating clathrate formation to surfactant and time.
--> The given data is:
data <- read.csv("C:\\Users\\18067\\Documents\\Fareeha Imam\\TTU R11767331\\Spring 2023\\SDA\\Assignment 9\\data-table-B8(4).csv")
colnames(data)<-c("x1","x2","y")
data
## x1 x2 y
## 1 0.00 10 7.5
## 2 0.00 50 15.0
## 3 0.00 85 22.0
## 4 0.00 110 28.6
## 5 0.00 140 31.6
## 6 0.00 170 34.0
## 7 0.00 200 35.0
## 8 0.00 230 35.5
## 9 0.00 260 36.5
## 10 0.00 290 38.5
## 11 0.00 10 12.3
## 12 0.00 30 18.0
## 13 0.00 62 20.8
## 14 0.00 90 25.7
## 15 0.00 150 32.5
## 16 0.00 210 34.0
## 17 0.00 270 35.0
## 18 0.02 10 14.4
## 19 0.02 30 19.0
## 20 0.02 60 26.4
## 21 0.02 90 28.5
## 22 0.02 120 29.0
## 23 0.02 210 35.0
## 24 0.02 30 15.1
## 25 0.02 60 26.4
## 26 0.02 120 27.0
## 27 0.02 150 29.0
## 28 0.05 20 21.0
## 29 0.05 40 27.3
## 30 0.05 130 48.5
## 31 0.05 190 50.4
## 32 0.05 250 52.5
## 33 0.05 60 34.4
## 34 0.05 90 46.5
## 35 0.05 120 50.0
## 36 0.05 150 51.9
--> Initialization of the models:
fullmodel  <- lm(y ~ x1 + x2, data)          # main-effects model
fullmodel1 <- lm(y ~ x1 + x2 + x1:x2, data)  # model adding the x1:x2 interaction
summary(fullmodel)
##
## Call:
## lm(formula = y ~ x1 + x2, data = data)
##
## Residuals:
## Min 1Q Median 3Q Max
## -9.7716 -4.1656 0.0802 3.8323 8.3349
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 1.109e+01 1.669e+00 6.642 1.48e-07 ***
## x1 3.501e+02 3.968e+01 8.823 3.38e-10 ***
## x2 1.089e-01 9.983e-03 10.912 1.74e-12 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 4.782 on 33 degrees of freedom
## Multiple R-squared: 0.8415, Adjusted R-squared: 0.8319
## F-statistic: 87.6 on 2 and 33 DF, p-value: 6.316e-14
--> The overall F-test p-value is 6.316e-14, far below 0.05, so the regression is highly significant; both x1 (surfactant) and x2 (time) have significant coefficients.
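--> As a cross-check, the overall p-value can be extracted directly from the fitted object; a minimal sketch:
fstat <- summary(fullmodel)$fstatistic                 # F value with its numerator and denominator df
pf(fstat[1], fstat[2], fstat[3], lower.tail = FALSE)   # reproduces the overall p-value of 6.316e-14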
summary(fullmodel1)
##
## Call:
## lm(formula = y ~ x1 + x2 + x1:x2, data = data)
##
## Residuals:
## Min 1Q Median 3Q Max
## -7.0753 -3.6781 0.4395 3.1321 8.8448
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 12.50128 1.89347 6.602 1.92e-07 ***
## x1 256.73740 73.72914 3.482 0.00146 **
## x2 0.09879 0.01193 8.281 1.84e-09 ***
## x1:x2 0.76127 0.51026 1.492 0.14551
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 4.696 on 32 degrees of freedom
## Multiple R-squared: 0.8518, Adjusted R-squared: 0.8379
## F-statistic: 61.31 on 3 and 32 DF, p-value: 2.318e-13
Check for model adequacy, perform any transformation, and repeat part a) if deemed necessary.
--> Checking the model adequacy with the usual residual diagnostics:
plot(fullmodel)
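--> The diagnostic plots themselves are not reproduced here. A minimal sketch of the adequacy check, using boxcox() from MASS (loaded above) to see whether a power transformation of y is warranted; a profile peaking near lambda = 1 indicates no transformation is needed:
par(mfrow = c(2, 2))
plot(fullmodel)            # residuals vs fitted, normal Q-Q, scale-location, leverage
par(mfrow = c(1, 1))

bc <- boxcox(fullmodel, lambda = seq(-2, 2, 0.1))   # Box-Cox profile log-likelihood
bc$x[which.max(bc$y)]                               # lambda maximizing the profile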
Test for the significance of the regression (using ANOVA). What do you conclude?
anova(fullmodel, fullmodel1)   # partial F-test for the x1:x2 interaction
## Analysis of Variance Table
##
## Model 1: y ~ x1 + x2
## Model 2: y ~ x1 + x2 + x1:x2
## Res.Df RSS Df Sum of Sq F Pr(>F)
## 1 33 754.74
## 2 32 705.66 1 49.084 2.2259 0.1455
--> The p-value for the interaction term (0.1455) is greater than 0.05, so x1:x2 is not significant and can be dropped.
We therefore keep the reduced model fullmodel <- lm(y ~ x1 + x2, data).
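--> As a check, the partial F statistic in the ANOVA table can be reproduced from the two residual sums of squares; a minimal sketch:
rss_reduced <- sum(resid(fullmodel)^2)     # 754.74 on 33 df
rss_full    <- sum(resid(fullmodel1)^2)    # 705.66 on 32 df
F_partial   <- ((rss_reduced - rss_full) / 1) / (rss_full / 32)
F_partial                                  # about 2.23, matching the ANOVA table
pf(F_partial, 1, 32, lower.tail = FALSE)   # about 0.1455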
Test for the significance of the regression parameters, eliminating those that are deemed not significant. What is your final model?
bestmodel <- lm(y ~ x1 + x2, data)   # final model: main effects of x1 and x2 only
summary(bestmodel)
##
## Call:
## lm(formula = y ~ x1 + x2, data = data)
##
## Residuals:
## Min 1Q Median 3Q Max
## -9.7716 -4.1656 0.0802 3.8323 8.3349
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 1.109e+01 1.669e+00 6.642 1.48e-07 ***
## x1 3.501e+02 3.968e+01 8.823 3.38e-10 ***
## x2 1.089e-01 9.983e-03 10.912 1.74e-12 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 4.782 on 33 degrees of freedom
## Multiple R-squared: 0.8415, Adjusted R-squared: 0.8319
## F-statistic: 87.6 on 2 and 33 DF, p-value: 6.316e-14
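--> The final model is y-hat = 11.09 + 350.1 x1 + 0.1089 x2, with R-squared = 0.8415. As an illustration of how it would be used, a minimal sketch predicting clathrate formation at a hypothetical setting (x1 = 0.03, x2 = 100 are illustrative values, not from the data):
newpoint <- data.frame(x1 = 0.03, x2 = 100)                      # hypothetical surfactant level and time
predict(bestmodel, newdata = newpoint, interval = "confidence")  # fitted value with 95% confidence interval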