Statistical Learning Lab-4

CHAPTER - 3

Question-8

library(ISLR)

(a)

# View the first few rows of the Auto dataset
head(Auto)
##   mpg cylinders displacement horsepower weight acceleration year origin
## 1  18         8          307        130   3504         12.0   70      1
## 2  15         8          350        165   3693         11.5   70      1
## 3  18         8          318        150   3436         11.0   70      1
## 4  16         8          304        150   3433         12.0   70      1
## 5  17         8          302        140   3449         10.5   70      1
## 6  15         8          429        198   4341         10.0   70      1
##                        name
## 1 chevrolet chevelle malibu
## 2         buick skylark 320
## 3        plymouth satellite
## 4             amc rebel sst
## 5               ford torino
## 6          ford galaxie 500
# Fit the linear regression model
lm_fit <- lm(mpg ~ horsepower, data = Auto)

# Print summary of the regression model
summary(lm_fit)
## 
## Call:
## lm(formula = mpg ~ horsepower, data = Auto)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -13.5710  -3.2592  -0.3435   2.7630  16.9240 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 39.935861   0.717499   55.66   <2e-16 ***
## horsepower  -0.157845   0.006446  -24.49   <2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 4.906 on 390 degrees of freedom
## Multiple R-squared:  0.6059, Adjusted R-squared:  0.6049 
## F-statistic: 599.7 on 1 and 390 DF,  p-value: < 2.2e-16
# Predict mpg for horsepower = 98 with confidence and prediction intervals
predict(lm_fit, newdata = data.frame(horsepower = 98), interval = "confidence")
##        fit      lwr      upr
## 1 24.46708 23.97308 24.96108
predict(lm_fit, newdata = data.frame(horsepower = 98), interval = "prediction")
##        fit     lwr      upr
## 1 24.46708 14.8094 34.12476

Interpretation of the Output:

i. Is there a relationship between the predictor (horsepower) and the response (mpg)?

To determine whether there is a significant relationship between horsepower and mpg, we check the p-value for horsepower in the regression summary (summary(lm_fit)).

  • Here the p-value is < 2e-16, far below 0.05, so horsepower is a statistically significant predictor of mpg.

  • Because this is a simple linear regression, the F-statistic (599.7, p < 2.2e-16) tests the same hypothesis and confirms the relationship; the p-value can also be extracted programmatically, as in the snippet below.
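A minimal sketch of that extraction (assuming lm_fit from above):

# Extract the p-value on the horsepower slope from the coefficient table
summary(lm_fit)$coefficients["horsepower", "Pr(>|t|)"]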

ii. How strong is the relationship between horsepower and mpg?

  • The R-squared value in summary(lm_fit) measures how much of the variation in mpg is explained by horsepower.

  • Here R-squared = 0.6059, so horsepower alone explains about 61% of the variation in mpg, a moderately strong relationship.

  • The remaining ~39% of the variation is left to other factors (weight, displacement, etc.) and noise; see the one-line extraction below.
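A one-line sketch, again assuming lm_fit from above:

# Proportion of the variance in mpg explained by horsepower
summary(lm_fit)$r.squared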

iii. Is the relationship positive or negative?

  • The sign of the horsepower coefficient in the summary output tells us the direction:

    • If negative, mpg decreases as horsepower increases (inverse relationship).

    • If positive, mpg increases as horsepower increases (direct relationship).

  • Here the estimate is -0.1578, so the relationship is negative: each additional unit of horsepower is associated with a drop of about 0.16 mpg.

iv. Predicted mpg for horsepower = 98 with 95% confidence and prediction intervals

From the predict() function output:

  • Predicted mpg: 24.47 for a car with horsepower = 98.

  • 95% Confidence Interval: (23.97, 24.96)

    • We are 95% confident that the mean mpg of all cars with horsepower = 98 falls in this range.

  • 95% Prediction Interval: (14.81, 34.12)

    • If we pick a single car with horsepower = 98, its actual mpg is likely to fall anywhere between 14.81 and 34.12.

    • This much wider range reflects the extra uncertainty in predicting an individual car's mpg compared to estimating the mean; the snippet below compares the two widths directly.
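A minimal sketch (assuming lm_fit from above):

# Widths of the 95% confidence vs. prediction intervals at horsepower = 98
new_pt <- data.frame(horsepower = 98)
ci <- predict(lm_fit, new_pt, interval = "confidence")
pi <- predict(lm_fit, new_pt, interval = "prediction")
unname(ci[, "upr"] - ci[, "lwr"])  # about 1.0 mpg wide
unname(pi[, "upr"] - pi[, "lwr"])  # about 19.3 mpg wide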

(b) Plot the Response and the Predictor

# Scatter plot of mpg vs. horsepower
plot(Auto$horsepower, Auto$mpg, main = "MPG vs. Horsepower",
     xlab = "Horsepower", ylab = "MPG", col = "blue", pch = 19)

# Add regression line
abline(lm_fit, col = "red", lwd = 2)

Interpretation:

  • The scatter plot shows a clear decreasing relationship: mpg falls as horsepower rises, with visible curvature rather than a straight-line trend.

  • The red regression line is the least squares fit; it captures the overall downward trend but misses the curvature at the low and high ends of horsepower (an illustrative quadratic overlay is sketched below).
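As an illustrative check of the curvature discussed in part (c), a quadratic fit can be overlaid on the same scatter plot (hypothetical names lm_quad and hp_grid; assumes the plot above is still the active device):

# Overlay a quadratic fit to visualize the non-linear trend
lm_quad <- lm(mpg ~ poly(horsepower, 2), data = Auto)
hp_grid <- seq(min(Auto$horsepower), max(Auto$horsepower), length.out = 100)
lines(hp_grid, predict(lm_quad, data.frame(horsepower = hp_grid)),
      col = "darkgreen", lwd = 2, lty = 2)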

(c) Diagnostic Plots for Model Evaluation

# Diagnostic plots
par(mfrow = c(2, 2))  # Arrange plots in 2x2 grid
plot(lm_fit)

  • Residuals vs. Fitted Plot:

    • Shows a curved pattern, suggesting non-linearity.

    • Variance increases for larger fitted values.

    • We can consider adding polynomial terms or transforming variables.

  • Q-Q Plot:

    • Residuals mostly follow a normal distribution but show deviations in the tails (outliers).

    • We can use log transformation or robust regression if normality is important.

  • Scale-Location Plot:

    • Residual spread increases, confirming heteroscedasticity (non-constant variance).

    • We can try a log transformation (sketched after this list) or weighted least squares regression.

  • Residuals vs. Leverage Plot:

    • Some points have high leverage, meaning they influence the regression strongly.

    • We can investigate high-leverage points and consider removing extreme outliers.
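A quick sketch of the log-transform remedy mentioned above, refitting with log(mpg) and redrawing the diagnostics (hypothetical object name lm_log):

# Refit with a log-transformed response to reduce curvature and heteroscedasticity
lm_log <- lm(log(mpg) ~ horsepower, data = Auto)
par(mfrow = c(2, 2))
plot(lm_log)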

Question - 10

# View first few rows of the dataset
head(Carseats)
##   Sales CompPrice Income Advertising Population Price ShelveLoc Age Education
## 1  9.50       138     73          11        276   120       Bad  42        17
## 2 11.22       111     48          16        260    83      Good  65        10
## 3 10.06       113     35          10        269    80    Medium  59        12
## 4  7.40       117    100           4        466    97    Medium  55        14
## 5  4.15       141     64           3        340   128       Bad  38        13
## 6 10.81       124    113          13        501    72       Bad  78        16
##   Urban  US
## 1   Yes Yes
## 2   Yes Yes
## 3   Yes Yes
## 4   Yes Yes
## 5   Yes  No
## 6    No Yes

(a).

# Fit the multiple regression model
lm_fit <- lm(Sales ~ Price + Urban + US, data = Carseats)
# Display model summary
summary(lm_fit)
## 
## Call:
## lm(formula = Sales ~ Price + Urban + US, data = Carseats)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -6.9206 -1.6220 -0.0564  1.5786  7.0581 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 13.043469   0.651012  20.036  < 2e-16 ***
## Price       -0.054459   0.005242 -10.389  < 2e-16 ***
## UrbanYes    -0.021916   0.271650  -0.081    0.936    
## USYes        1.200573   0.259042   4.635 4.86e-06 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 2.472 on 396 degrees of freedom
## Multiple R-squared:  0.2393, Adjusted R-squared:  0.2335 
## F-statistic: 41.52 on 3 and 396 DF,  p-value: < 2.2e-16

(b).

  • Intercept (13.04): The estimated baseline Sales when Price = 0, Urban = No, and US = No.

  • Price (-0.0545): Holding the other predictors fixed, Sales fall by about 0.054 units for each one-dollar increase in Price; since Sales is recorded in thousands, that is roughly 54 car seats.

  • Urban (Yes/No): Since Urban is qualitative (categorical), the coefficient (-0.022) is the estimated difference in Sales between urban and non-urban stores; with p = 0.936 it is nowhere near significant.

  • US (Yes/No): The coefficient (1.20) shows the difference in Sales between stores in the US and non-US locations: US stores sell about 1,200 more units, and the effect is highly significant (p = 4.86e-06).

Key Observation: A predictor with a small p-value (< 0.05) significantly impacts Sales; here that holds for Price and US but not for Urban.

(c).

Since Urban and US are categorical variables, R automatically converts them into dummy variables (UrbanYes and USYes), as the contrasts() check below confirms. The model in equation form is:

\[ Sales = \beta_0 + \beta_1 \, Price + \beta_2 \, UrbanYes + \beta_3 \, USYes + \varepsilon \]

with fitted values

\[ \hat{Sales} = 13.04 - 0.054 \, Price - 0.022 \, UrbanYes + 1.20 \, USYes \]

Where:

  • UrbanYes = 1 if the store is in an urban area, 0 otherwise.

  • USYes = 1 if the store is in the US, 0 otherwise.
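A quick sketch of that check:

# Dummy coding R uses for the two qualitative predictors
contrasts(Carseats$Urban)
contrasts(Carseats$US)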

(d).

summary(lm_fit)$coefficients
##                Estimate  Std. Error      t value     Pr(>|t|)
## (Intercept) 13.04346894 0.651012245  20.03567373 3.626602e-62
## Price       -0.05445885 0.005241855 -10.38923205 1.609917e-22
## UrbanYes    -0.02191615 0.271650277  -0.08067781 9.357389e-01
## USYes        1.20057270 0.259041508   4.63467306 4.860245e-06

For Price (p = 1.6e-22) and US (p = 4.9e-06) the p-values are far below 0.05, so we reject H0: βj = 0; both predictors significantly affect Sales. For Urban (p = 0.936) we fail to reject H0. The same test can be read off programmatically, as in the snippet below.
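A sketch, assuming lm_fit is the model from part (a):

# Coefficients (including the intercept) with p-values below 0.05
coefs <- summary(lm_fit)$coefficients
rownames(coefs)[coefs[, "Pr(>|t|)"] < 0.05]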

(e).

# Fit the smaller model with only significant predictors
lm_fit_small <- lm(Sales ~ Price, data = Carseats)

# Display summary of the new model
summary(lm_fit_small)
## 
## Call:
## lm(formula = Sales ~ Price, data = Carseats)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -6.5224 -1.8442 -0.1459  1.6503  7.5108 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 13.641915   0.632812  21.558   <2e-16 ***
## Price       -0.053073   0.005354  -9.912   <2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 2.532 on 398 degrees of freedom
## Multiple R-squared:  0.198,  Adjusted R-squared:  0.196 
## F-statistic: 98.25 on 1 and 398 DF,  p-value: < 2.2e-16

(f).

# Compare models using ANOVA
anova(lm_fit, lm_fit_small)
## Analysis of Variance Table
## 
## Model 1: Sales ~ Price + Urban + US
## Model 2: Sales ~ Price
##   Res.Df    RSS Df Sum of Sq      F    Pr(>F)    
## 1    396 2420.8                                  
## 2    398 2552.2 -2   -131.41 10.748 2.848e-05 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
  • p-value (Pr(>F)) = 2.848e-05 (very small)

    • Since p < 0.05, we reject the null hypothesis that the coefficients on Urban and US are both zero: the larger model fits significantly better than Sales ~ Price alone.

  • F-statistic = 10.748

    • Given that UrbanYes was clearly non-significant on its own (p = 0.936), this joint improvement is driven almost entirely by US. A compact comparison of the two fits is sketched below.
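A side-by-side of the two fits (a sketch; the numbers come from the summaries shown earlier):

# Residual standard error and R-squared for the full and reduced models
c(RSE_full = summary(lm_fit)$sigma, RSE_small = summary(lm_fit_small)$sigma)
c(R2_full = summary(lm_fit)$r.squared, R2_small = summary(lm_fit_small)$r.squared)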

(g).

confint(lm_fit_small)
##                  2.5 %      97.5 %
## (Intercept) 12.3978438 14.88598655
## Price       -0.0635995 -0.04254653
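The interval for Price, (-0.0636, -0.0425), excludes zero, consistent with its strong significance. As a cross-check, the same interval can be rebuilt by hand from the coefficient table (a sketch using the 398 residual degrees of freedom reported above):

# Manual 95% CI for the Price slope: estimate +/- t(0.975, df = 398) * SE
est <- coef(summary(lm_fit_small))["Price", "Estimate"]
se <- coef(summary(lm_fit_small))["Price", "Std. Error"]
est + c(-1, 1) * qt(0.975, df = 398) * se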

(h).

par(mfrow = c(2, 2))  # Arrange plots in a 2x2 grid
plot(lm_fit_small)

# Identify high leverage points
influence.measures(lm_fit_small)
## Influence measures of
##   lm(formula = Sales ~ Price, data = Carseats) :
## 
##        dfb.1_  dfb.Pric     dffit cov.r   cook.d     hat inf
## 1    0.001140  7.84e-03  0.044761 1.004 1.00e-03 0.00258    
## 2    0.061461 -5.47e-02  0.067410 1.009 2.27e-03 0.00731    
## 3    0.022222 -2.00e-02  0.023952 1.013 2.88e-04 0.00823    
## 4   -0.021205  1.72e-02 -0.027673 1.008 3.84e-04 0.00408    
## 5    0.016338 -2.76e-02 -0.060162 1.002 1.81e-03 0.00317    
## 6    0.039753 -3.65e-02  0.041531 1.016 8.64e-04 0.01108   *
## 7   -0.013244  8.35e-03 -0.026661 1.007 3.56e-04 0.00277    
## 8    0.002349  1.62e-02  0.092288 0.991 4.23e-03 0.00258    
## 9    0.001441 -3.57e-03 -0.010903 1.008 5.96e-05 0.00280    
## 10   0.006565 -1.63e-02 -0.049681 1.003 1.23e-03 0.00280    
## 11   0.011423 -8.93e-03  0.016076 1.008 1.30e-04 0.00362    
## 12   0.072428 -6.05e-02  0.089291 1.001 3.98e-03 0.00462    
## 13   0.030874 -4.14e-02 -0.063742 1.005 2.03e-03 0.00433    
## 14   0.053636 -4.71e-02  0.060142 1.009 1.81e-03 0.00647    
## 15   0.008171  7.01e-03  0.075480 0.996 2.84e-03 0.00252    
## 16  -0.052162  6.42e-02  0.083827 1.005 3.51e-03 0.00606    
## 17  -0.001949  1.08e-03 -0.004558 1.008 1.04e-05 0.00265    
## 18  -0.047946  7.17e-02  0.132591 0.984 8.70e-03 0.00353   *
## 19   0.169355 -1.57e-01  0.175152 1.006 1.53e-02 0.01271    
## 20  -0.000465  6.57e-03  0.030582 1.006 4.68e-04 0.00262    
## 21   0.002377 -3.55e-03 -0.006573 1.009 2.17e-05 0.00353    
## 22   0.040841 -2.44e-02  0.088229 0.993 3.87e-03 0.00271    
## 23   0.017662 -2.30e-02 -0.033653 1.009 5.67e-04 0.00470    
## 24  -0.018937  1.13e-02 -0.040910 1.005 8.38e-04 0.00271    
## 25   0.015603 -5.84e-03  0.049737 1.003 1.24e-03 0.00253    
## 26   0.179520 -1.60e-01  0.195666 0.988 1.90e-02 0.00761    
## 27  -0.013966  2.09e-02  0.038621 1.006 7.47e-04 0.00353    
## 28  -0.030107  1.98e-02 -0.056905 1.002 1.62e-03 0.00285    
## 29  -0.107313  8.71e-02 -0.140046 0.985 9.71e-03 0.00408    
## 30  -0.006389  4.83e-03 -0.009586 1.008 4.61e-05 0.00335    
## 31   0.120874 -1.05e-01  0.139415 0.994 9.66e-03 0.00571    
## 32  -0.013284  1.99e-02  0.036736 1.007 6.76e-04 0.00353    
## 33   0.002297 -3.04e-03 -0.004548 1.010 1.04e-05 0.00451    
## 34  -0.011625  1.96e-02  0.042806 1.005 9.17e-04 0.00317    
## 35   0.025350 -4.28e-02 -0.093344 0.994 4.34e-03 0.00317    
## 36   0.051044 -4.19e-02  0.065246 1.004 2.13e-03 0.00425    
## 37   0.009393 -7.34e-03  0.013219 1.008 8.76e-05 0.00362    
## 38  -0.024879  1.39e-02 -0.058191 1.001 1.69e-03 0.00265    
## 39  -0.025030  1.89e-02 -0.037552 1.006 7.06e-04 0.00335    
## 40   0.043985 -5.74e-02 -0.083809 1.002 3.51e-03 0.00470    
## 41   0.021621 -4.19e-02 -0.105722 0.989 5.55e-03 0.00297    
## 42  -0.002487  6.17e-03  0.018824 1.007 1.78e-04 0.00280    
## 43  -0.159540  1.55e-01 -0.159746 1.044 1.28e-02 0.04017   *
## 44   0.026476 -3.68e-02 -0.060289 1.004 1.82e-03 0.00398    
## 45  -0.093736  7.76e-02 -0.117571 0.994 6.88e-03 0.00443    
## 46   0.022628 -3.09e-02 -0.048940 1.006 1.20e-03 0.00415    
## 47   0.105335 -9.73e-02  0.109454 1.012 5.99e-03 0.01188    
## 48  -0.036602  2.31e-02 -0.073681 0.998 2.71e-03 0.00277    
## 49  -0.084423  6.78e-02 -0.112714 0.993 6.32e-03 0.00392    
## 50  -0.114433  1.37e-01  0.167787 0.994 1.40e-02 0.00743    
## 51  -0.067691  4.27e-02 -0.136263 0.975 9.15e-03 0.00277   *
## 52  -0.036185  2.28e-02 -0.072842 0.998 2.65e-03 0.00277    
## 53  -0.007654  1.23e-02  0.025261 1.007 3.20e-04 0.00328    
## 54  -0.000540 -1.09e-03 -0.008105 1.007 3.29e-05 0.00255    
## 55   0.021131 -2.60e-02 -0.033958 1.010 5.78e-04 0.00606    
## 56  -0.038027  4.44e-02  0.052248 1.013 1.37e-03 0.00903    
## 57   0.082277 -7.29e-02  0.090855 1.006 4.13e-03 0.00702    
## 58  -0.019526 -6.63e-03 -0.130203 0.974 8.36e-03 0.00251   *
## 59  -0.039866  2.95e-02 -0.062073 1.002 1.93e-03 0.00323    
## 60  -0.012937  3.58e-03 -0.047271 1.003 1.12e-03 0.00251    
## 61  -0.002348  7.27e-03  0.024938 1.007 3.12e-04 0.00273    
## 62  -0.007180  4.73e-03 -0.013570 1.008 9.23e-05 0.00285    
## 63   0.048567 -6.89e-02 -0.117130 0.991 6.82e-03 0.00382    
## 64   0.003032 -2.33e-03  0.004399 1.009 9.70e-06 0.00348    
## 65  -0.004392  3.18e-03 -0.007125 1.008 2.54e-05 0.00312    
## 66   0.011789 -1.99e-02 -0.043412 1.005 9.43e-04 0.00317    
## 67   0.000918 -7.84e-04  0.001084 1.010 5.89e-07 0.00525    
## 68   0.006782 -9.78e-04  0.029119 1.006 4.25e-04 0.00250    
## 69  -0.075977  1.06e-01  0.173012 0.972 1.47e-02 0.00398   *
## 70  -0.007053  5.59e-03 -0.009656 1.009 4.67e-05 0.00376    
## 71   0.019022 -1.51e-02  0.026040 1.008 3.40e-04 0.00376    
## 72  -0.019814  2.35e-02  0.028627 1.012 4.11e-04 0.00773    
## 73  -0.007450 -3.37e-04 -0.038886 1.005 7.57e-04 0.00250    
## 74   0.061390 -4.45e-02  0.099599 0.992 4.93e-03 0.00312    
## 75   0.002826 -3.79e-03 -0.005835 1.009 1.71e-05 0.00433    
## 76  -0.004917  4.17e-03 -0.005882 1.010 1.73e-05 0.00503    
## 77   0.029858 -2.76e-02  0.031025 1.017 4.82e-04 0.01188   *
## 78  -0.031673  2.74e-02 -0.036531 1.010 6.69e-04 0.00571    
## 79   0.030406 -3.72e-02 -0.047841 1.010 1.15e-03 0.00631    
## 80   0.006910 -5.94e-03  0.008059 1.011 3.26e-05 0.00547    
## 81  -0.049387  4.46e-02 -0.052967 1.012 1.41e-03 0.00855    
## 82  -0.004060  6.86e-03  0.014949 1.008 1.12e-04 0.00317    
## 83  -0.081262  1.05e-01  0.149534 0.987 1.11e-02 0.00491    
## 84  -0.092840  7.76e-02 -0.114455 0.996 6.52e-03 0.00462    
## 85   0.001544 -2.18e-02 -0.101522 0.988 5.12e-03 0.00262    
## 86   0.005456 -2.45e-03  0.015465 1.007 1.20e-04 0.00256    
## 87  -0.023831  3.31e-02  0.054267 1.005 1.47e-03 0.00398    
## 88  -0.020999  4.07e-02  0.102677 0.990 5.24e-03 0.00297    
## 89  -0.009391  4.78e-03 -0.024030 1.007 2.89e-04 0.00260    
## 90   0.000830  1.67e-03  0.012446 1.007 7.76e-05 0.00255    
## 91  -0.041172  3.05e-02 -0.064107 1.002 2.05e-03 0.00323    
## 92  -0.035269  2.32e-02 -0.066660 1.000 2.22e-03 0.00285    
## 93   0.008895 -1.91e-02 -0.052648 1.003 1.39e-03 0.00288    
## 94   0.010052 -7.28e-03  0.016309 1.008 1.33e-04 0.00312    
## 95  -0.023924  2.12e-02 -0.026418 1.012 3.50e-04 0.00702    
## 96   0.004666 -5.60e-03 -0.006950 1.012 2.42e-05 0.00714    
## 97  -0.026572  3.86e-02  0.068341 1.002 2.33e-03 0.00367    
## 98  -0.004494  7.23e-03  0.014833 1.008 1.10e-04 0.00328    
## 99  -0.029383  5.27e-02  0.123080 0.984 7.50e-03 0.00306   *
## 100 -0.034483  2.27e-02 -0.065174 1.000 2.12e-03 0.00285    
## 101 -0.046952  3.21e-02 -0.083873 0.996 3.51e-03 0.00293    
## 102 -0.002536 -2.17e-03 -0.023422 1.006 2.75e-04 0.00252    
## 103 -0.062026  5.04e-02 -0.080945 1.001 3.27e-03 0.00408    
## 104 -0.070420  5.78e-02 -0.090012 1.000 4.04e-03 0.00425    
## 105  0.024232 -3.16e-02 -0.046172 1.008 1.07e-03 0.00470    
## 106 -0.057153  4.64e-02 -0.074586 1.002 2.78e-03 0.00408    
## 107  0.092794 -1.20e-01 -0.170756 0.981 1.44e-02 0.00491   *
## 108  0.006620 -4.17e-03  0.013326 1.008 8.90e-05 0.00277    
## 109 -0.068275  5.06e-02 -0.106308 0.991 5.62e-03 0.00323    
## 110  0.002885 -2.48e-03  0.003364 1.011 5.67e-06 0.00547    
## 111  0.005739  2.60e-04  0.029956 1.006 4.49e-04 0.00250    
## 112 -0.024828  2.94e-02  0.035379 1.012 6.27e-04 0.00804    
## 113  0.001211 -2.60e-03 -0.007169 1.008 2.58e-05 0.00288    
## 114  0.004660 -8.36e-03 -0.019520 1.008 1.91e-04 0.00306    
## 115  0.015510 -1.06e-02  0.027706 1.007 3.85e-04 0.00293    
## 116 -0.011985  1.93e-02  0.039555 1.006 7.83e-04 0.00328    
## 117  0.010699 -1.81e-02 -0.039396 1.006 7.77e-04 0.00317    
## 118  0.001961  3.95e-03  0.029416 1.006 4.33e-04 0.00255    
## 119 -0.014504  1.15e-02 -0.019855 1.008 1.98e-04 0.00376    
## 120 -0.003153  5.32e-03  0.011609 1.008 6.75e-05 0.00317    
## 121 -0.001537  2.30e-03  0.004250 1.009 9.06e-06 0.00353    
## 122  0.073233 -6.40e-02  0.082828 1.006 3.43e-03 0.00621    
## 123 -0.010656  6.72e-03 -0.021451 1.007 2.31e-04 0.00277    
## 124 -0.078783  9.17e-02  0.107094 1.008 5.73e-03 0.00937    
## 125  0.000817  5.62e-03  0.032082 1.006 5.15e-04 0.00258    
## 126 -0.101910  9.70e-02 -0.102900 1.026 5.30e-03 0.02245   *
## 127 -0.047781  6.78e-02  0.115235 0.992 6.60e-03 0.00382    
## 128 -0.003657 -1.65e-04 -0.019090 1.007 1.83e-04 0.00250    
## 129  0.008795 -1.70e-02 -0.043003 1.005 9.26e-04 0.00297    
## 130  0.029743 -3.59e-02 -0.045062 1.010 1.02e-03 0.00685    
## 131 -0.041213  3.74e-02 -0.043809 1.013 9.62e-04 0.00923    
## 132 -0.047097  3.94e-02 -0.058062 1.006 1.69e-03 0.00462    
## 133 -0.039392  5.28e-02  0.081330 1.002 3.30e-03 0.00433    
## 134 -0.016939  1.38e-02 -0.022105 1.009 2.45e-04 0.00408    
## 135  0.025734 -3.85e-02 -0.071166 1.001 2.53e-03 0.00353    
## 136 -0.000426 -2.93e-03 -0.016733 1.007 1.40e-04 0.00258    
## 137 -0.001076 -7.40e-03 -0.042270 1.004 8.94e-04 0.00258    
## 138 -0.001847 -1.58e-03 -0.017064 1.007 1.46e-04 0.00252    
## 139  0.023007 -1.37e-02  0.049702 1.003 1.24e-03 0.00271    
## 140  0.079912 -6.68e-02  0.098517 0.999 4.84e-03 0.00462    
## 141  0.005257 -8.46e-03 -0.017349 1.008 1.51e-04 0.00328    
## 142  0.001356 -2.03e-03 -0.003749 1.009 7.05e-06 0.00353    
## 143 -0.009298  6.73e-03 -0.015085 1.008 1.14e-04 0.00312    
## 144  0.148786 -1.71e-01 -0.194898 0.999 1.89e-02 0.01085    
## 145 -0.003850  1.19e-02  0.040879 1.005 8.36e-04 0.00273    
## 146  0.003972  1.35e-03  0.026488 1.006 3.51e-04 0.00251    
## 147  0.023768 -3.55e-02 -0.065728 1.002 2.16e-03 0.00353    
## 148  0.004244  8.55e-03  0.063646 1.000 2.02e-03 0.00255    
## 149 -0.018102  1.47e-02 -0.023624 1.008 2.80e-04 0.00408    
## 150  0.067960 -5.94e-02  0.076864 1.007 2.95e-03 0.00621    
## 151  0.015753 -4.36e-03  0.057560 1.001 1.66e-03 0.00251    
## 152  0.037533 -2.78e-02  0.058441 1.003 1.71e-03 0.00323    
## 153 -0.004785  8.08e-03  0.017621 1.008 1.56e-04 0.00317    
## 154 -0.006024  7.16e-03  0.008703 1.013 3.80e-05 0.00773    
## 155 -0.007955  4.43e-03 -0.018607 1.007 1.73e-04 0.00265    
## 156 -0.097034  8.98e-02 -0.100584 1.013 5.06e-03 0.01229    
## 157 -0.065531  7.58e-02  0.087353 1.011 3.82e-03 0.01009    
## 158  0.033843 -2.91e-02  0.039470 1.009 7.80e-04 0.00547    
## 159  0.034295 -1.54e-02  0.097206 0.989 4.69e-03 0.00256    
## 160 -0.025403  2.35e-02 -0.026396 1.017 3.49e-04 0.01188   *
## 161 -0.024334  1.24e-02 -0.062268 1.000 1.94e-03 0.00260    
## 162  0.072313 -8.29e-02 -0.093967 1.012 4.42e-03 0.01124    
## 163  0.049190 -5.87e-02 -0.072124 1.009 2.60e-03 0.00743    
## 164 -0.028027  1.92e-02 -0.050065 1.004 1.25e-03 0.00293    
## 165 -0.034532  4.36e-02  0.059781 1.007 1.79e-03 0.00534    
## 166  0.185786 -2.03e-01 -0.212412 1.026 2.25e-02 0.02779   *
## 167 -0.004558  6.02e-03  0.009022 1.010 4.08e-05 0.00451    
## 168 -0.045308  3.82e-02 -0.054983 1.007 1.51e-03 0.00482    
## 169 -0.000393 -1.33e-04 -0.002620 1.008 3.44e-06 0.00251    
## 170  0.069293 -6.29e-02  0.073657 1.011 2.72e-03 0.00923    
## 171  0.001356  1.16e-03  0.012524 1.007 7.86e-05 0.00252    
## 172  0.096648 -9.14e-02  0.098053 1.022 4.81e-03 0.01902   *
## 173  0.010675 -5.94e-03  0.024968 1.007 3.12e-04 0.00265    
## 174  0.002833 -4.78e-03 -0.010432 1.008 5.45e-05 0.00317    
## 175  0.206642 -2.27e-01 -0.239598 1.018 2.86e-02 0.02391   *
## 176 -0.000420  1.94e-03  0.007625 1.008 2.91e-05 0.00267    
## 177 -0.003889  4.54e-03  0.005344 1.014 1.43e-05 0.00903    
## 178  0.039953 -3.34e-02  0.049256 1.007 1.21e-03 0.00462    
## 179  0.042993 -3.85e-02  0.046589 1.012 1.09e-03 0.00791    
## 180  0.001116  5.05e-05  0.005823 1.007 1.70e-05 0.00250    
## 181  0.018550 -2.22e-02 -0.027198 1.012 3.71e-04 0.00743    
## 182 -0.033647  2.87e-02 -0.039719 1.009 7.90e-04 0.00525    
## 183  0.023429 -2.99e-02 -0.041761 1.008 8.73e-04 0.00512    
## 184 -0.044481  3.36e-02 -0.066735 1.002 2.22e-03 0.00335    
## 185  0.028234 -2.29e-02  0.036846 1.007 6.80e-04 0.00408    
## 186  0.023541 -1.55e-02  0.044494 1.004 9.91e-04 0.00285    
## 187 -0.011323  9.94e-03 -0.012696 1.011 8.08e-05 0.00647    
## 188 -0.050918  4.18e-02 -0.065085 1.004 2.12e-03 0.00425    
## 189 -0.020013  1.72e-02 -0.023341 1.010 2.73e-04 0.00547    
## 190  0.054504 -3.95e-02  0.088428 0.996 3.90e-03 0.00312    
## 191  0.008182 -6.30e-03  0.011869 1.008 7.06e-05 0.00348    
## 192 -0.096305  1.07e-01  0.116166 1.019 6.75e-03 0.01713   *
## 193 -0.026001  2.19e-02 -0.031553 1.009 4.99e-04 0.00482    
## 194  0.096058 -7.88e-02  0.122784 0.992 7.49e-03 0.00425    
## 195 -0.002306  3.89e-03  0.008492 1.008 3.61e-05 0.00317    
## 196 -0.024841  1.12e-02 -0.070408 0.998 2.47e-03 0.00256    
## 197  0.025237 -3.58e-02 -0.060864 1.004 1.85e-03 0.00382    
## 198  0.054328 -7.09e-02 -0.103516 0.998 5.34e-03 0.00470    
## 199  0.019559 -3.30e-02 -0.072022 1.000 2.59e-03 0.00317    
## 200  0.002356 -4.56e-03 -0.011519 1.008 6.65e-05 0.00297    
## 201  0.006956 -8.45e-03 -0.010732 1.012 5.77e-05 0.00658    
## 202  0.006476 -9.00e-03 -0.014746 1.009 1.09e-04 0.00398    
## 203  0.020340 -3.15e-02 -0.061079 1.003 1.86e-03 0.00340    
## 204  0.098069 -1.13e-01 -0.130726 1.007 8.53e-03 0.01009    
## 205 -0.004647  1.15e-02  0.035167 1.006 6.19e-04 0.00280    
## 206  0.008923 -1.30e-02 -0.022949 1.008 2.64e-04 0.00367    
## 207  0.005864 -6.72e-03 -0.007621 1.016 2.91e-05 0.01124   *
## 208 -0.005889  4.78e-03 -0.007685 1.009 2.96e-05 0.00408    
## 209 -0.115873  1.08e-01 -0.118919 1.015 7.07e-03 0.01449    
## 210 -0.148066  1.27e-01 -0.172685 0.984 1.47e-02 0.00547   *
## 211  0.005369 -1.66e-02 -0.057013 1.002 1.62e-03 0.00273    
## 212  0.001083  7.45e-03  0.042546 1.004 9.06e-04 0.00258    
## 213  0.051002 -3.60e-02  0.086605 0.996 3.74e-03 0.00302    
## 214 -0.029675  3.82e-02  0.054606 1.007 1.49e-03 0.00491    
## 215 -0.035044  2.31e-02 -0.066235 1.000 2.19e-03 0.00285    
## 216  0.070505 -8.68e-02 -0.113304 1.001 6.40e-03 0.00606    
## 217  0.005177 -6.38e-03 -0.008319 1.011 3.47e-05 0.00606    
## 218 -0.026952  1.37e-02 -0.068966 0.998 2.37e-03 0.00260    
## 219  0.001242  8.54e-03  0.048790 1.003 1.19e-03 0.00258    
## 220  0.011896  5.38e-04  0.062090 1.000 1.92e-03 0.00250    
## 221 -0.009786  2.43e-02  0.074053 0.998 2.74e-03 0.00280    
## 222 -0.017123  1.13e-02 -0.032363 1.006 5.25e-04 0.00285    
## 223 -0.030955  3.79e-02  0.048705 1.010 1.19e-03 0.00631    
## 224  0.012789 -2.75e-02 -0.075693 0.998 2.86e-03 0.00288    
## 225  0.034486 -4.35e-02 -0.059701 1.007 1.78e-03 0.00534    
## 226 -0.083108  7.42e-02 -0.090583 1.007 4.10e-03 0.00761    
## 227 -0.000714  3.28e-03  0.012941 1.007 8.39e-05 0.00267    
## 228  0.006572 -5.06e-03  0.009535 1.008 4.56e-05 0.00348    
## 229 -0.014340  1.63e-02  0.018235 1.018 1.67e-04 0.01246   *
## 230  0.055032 -5.06e-02  0.057493 1.015 1.66e-03 0.01108    
## 231 -0.013209  3.65e-03 -0.048266 1.003 1.17e-03 0.00251    
## 232 -0.001041  4.79e-03  0.018871 1.007 1.78e-04 0.00267    
## 233  0.065259 -4.60e-02  0.110816 0.988 6.09e-03 0.00302    
## 234  0.000704  4.84e-03  0.027659 1.006 3.83e-04 0.00258    
## 235 -0.018114  2.91e-02  0.059781 1.003 1.79e-03 0.00328    
## 236  0.010323 -1.50e-02 -0.026551 1.008 3.53e-04 0.00367    
## 237  0.014589 -9.19e-03  0.029369 1.006 4.32e-04 0.00277    
## 238 -0.037143  5.06e-02  0.080333 1.001 3.22e-03 0.00415    
## 239 -0.007886  1.12e-02  0.019019 1.008 1.81e-04 0.00382    
## 240 -0.007518 -6.45e-03 -0.069448 0.998 2.41e-03 0.00252    
## 241 -0.000953  1.35e-02  0.062673 1.000 1.96e-03 0.00262    
## 242  0.073528 -6.14e-02  0.090647 1.001 4.10e-03 0.00462    
## 243  0.021210 -2.89e-02 -0.045872 1.007 1.05e-03 0.00415    
## 244  0.000140 -7.81e-05  0.000328 1.008 5.40e-08 0.00265    
## 245  0.007533 -5.89e-03  0.010601 1.009 5.63e-05 0.00362    
## 246  0.027586 -2.40e-02  0.031494 1.010 4.97e-04 0.00595    
## 247 -0.049485  4.25e-02 -0.057713 1.008 1.67e-03 0.00547    
## 248  0.014710 -1.74e-02 -0.020962 1.013 2.20e-04 0.00804    
## 249 -0.047087  3.62e-02 -0.068311 1.002 2.33e-03 0.00348    
## 250 -0.007080 -2.40e-03 -0.047212 1.003 1.11e-03 0.00251    
## 251 -0.111159  1.29e-01  0.149590 1.003 1.12e-02 0.00973    
## 252  0.027252 -3.96e-02 -0.070091 1.002 2.45e-03 0.00367    
## 253  0.002606  8.84e-04  0.017375 1.007 1.51e-04 0.00251    
## 254  0.001722 -7.93e-03 -0.031230 1.006 4.88e-04 0.00267    
## 255 -0.019148  3.08e-02  0.063195 1.002 2.00e-03 0.00328    
## 256 -0.053319  4.78e-02 -0.057778 1.011 1.67e-03 0.00791    
## 257  0.034599 -4.26e-02 -0.055602 1.009 1.55e-03 0.00606    
## 258  0.006870 -3.09e-03  0.019472 1.007 1.90e-04 0.00256    
## 259 -0.192976  1.73e-01 -0.209116 0.986 2.16e-02 0.00791    
## 260 -0.054477  4.26e-02 -0.076665 1.001 2.93e-03 0.00362    
## 261 -0.009840  7.57e-03 -0.014276 1.008 1.02e-04 0.00348    
## 262 -0.003590 -3.08e-03 -0.033163 1.005 5.51e-04 0.00252    
## 263  0.002484 -3.61e-03 -0.006389 1.009 2.05e-05 0.00367    
## 264  0.001066 -1.54e-04  0.004579 1.008 1.05e-05 0.00250    
## 265 -0.055403  6.37e-02  0.072573 1.014 2.64e-03 0.01085    
## 266  0.010204 -1.64e-02 -0.033677 1.007 5.68e-04 0.00328    
## 267  0.009910 -4.45e-03  0.028089 1.006 3.95e-04 0.00256    
## 268 -0.013204  5.93e-03 -0.037424 1.005 7.01e-04 0.00256    
## 269 -0.019719  1.39e-02 -0.033484 1.006 5.61e-04 0.00302    
## 270 -0.006699  7.57e-03  0.008362 1.019 3.51e-05 0.01377   *
## 271  0.079971 -6.92e-02  0.092237 1.003 4.25e-03 0.00571    
## 272 -0.028380  1.58e-02 -0.066379 0.999 2.20e-03 0.00265    
## 273  0.128362 -1.20e-01  0.131522 1.014 8.65e-03 0.01496    
## 274  0.027408 -2.41e-02  0.030732 1.011 4.73e-04 0.00647    
## 275 -0.000141 -2.85e-04 -0.002119 1.008 2.25e-06 0.00255    
## 276 -0.000315  4.57e-04  0.000809 1.009 3.28e-07 0.00367    
## 277 -0.001442  2.23e-03  0.004330 1.008 9.40e-06 0.00340    
## 278 -0.002841  6.10e-03  0.016815 1.007 1.42e-04 0.00288    
## 279 -0.039855  4.71e-02  0.056792 1.011 1.62e-03 0.00804    
## 280  0.056711 -6.54e-02 -0.074920 1.013 2.81e-03 0.01046    
## 281  0.061976 -7.58e-02 -0.097515 1.004 4.75e-03 0.00631    
## 282  0.040036 -2.82e-02  0.067985 1.000 2.31e-03 0.00302    
## 283 -0.062568  7.31e-02  0.085968 1.010 3.70e-03 0.00903    
## 284 -0.006157 -2.09e-03 -0.041057 1.004 8.44e-04 0.00251    
## 285 -0.031877  2.62e-02 -0.040747 1.007 8.31e-04 0.00425    
## 286 -0.007749  1.16e-02  0.021429 1.008 2.30e-04 0.00353    
## 287 -0.000716  2.68e-04 -0.002283 1.008 2.61e-06 0.00253    
## 288 -0.118340  1.09e-01 -0.123633 1.009 7.64e-03 0.01108    
## 289 -0.029353  2.38e-02 -0.038307 1.007 7.35e-04 0.00408    
## 290 -0.099100  1.15e-01  0.133361 1.006 8.87e-03 0.00973    
## 291  0.018998 -1.41e-02  0.029581 1.007 4.38e-04 0.00323    
## 292 -0.059270  5.13e-02 -0.068362 1.007 2.34e-03 0.00571    
## 293  0.081115 -7.42e-02  0.085269 1.012 3.64e-03 0.01031    
## 294  0.061439 -5.31e-02  0.070863 1.006 2.51e-03 0.00571    
## 295  0.076043 -6.03e-02  0.104101 0.994 5.39e-03 0.00376    
## 296  0.029071 -3.84e-02 -0.057549 1.006 1.66e-03 0.00451    
## 297 -0.002134  6.61e-03  0.022663 1.007 2.57e-04 0.00273    
## 298 -0.069188  5.01e-02 -0.112250 0.988 6.25e-03 0.00312    
## 299 -0.032689  5.05e-02  0.098163 0.994 4.80e-03 0.00340    
## 300  0.017240 -1.41e-02  0.022036 1.009 2.43e-04 0.00425    
## 301  0.003233 -2.56e-03  0.004426 1.009 9.82e-06 0.00376    
## 302 -0.044657  3.90e-02 -0.050508 1.009 1.28e-03 0.00621    
## 303 -0.021994  1.22e-02 -0.051444 1.003 1.32e-03 0.00265    
## 304  0.028786 -2.28e-02  0.039408 1.007 7.78e-04 0.00376    
## 305 -0.059594  8.28e-02  0.135703 0.986 9.13e-03 0.00398    
## 306 -0.013007  1.89e-02  0.033454 1.007 5.61e-04 0.00367    
## 307  0.018316 -2.60e-02 -0.044172 1.006 9.77e-04 0.00382    
## 308 -0.000702 -4.83e-03 -0.027585 1.006 3.81e-04 0.00258    
## 309 -0.010078  1.95e-02  0.049280 1.004 1.21e-03 0.00297    
## 310  0.059742 -5.37e-02  0.064392 1.011 2.08e-03 0.00823    
## 311 -0.177397  2.00e-01  0.221436 1.001 2.44e-02 0.01377   *
## 312  0.004537 -6.60e-03 -0.011668 1.009 6.82e-05 0.00367    
## 313 -0.003809  5.19e-03  0.008238 1.009 3.40e-05 0.00415    
## 314 -0.080318  7.60e-02 -0.081411 1.023 3.32e-03 0.01957   *
## 315 -0.006349  1.02e-02  0.020953 1.008 2.20e-04 0.00328    
## 316 -0.076339  8.54e-02  0.092888 1.019 4.32e-03 0.01613   *
## 317  0.234960 -2.16e-01  0.245467 0.989 2.98e-02 0.01108   *
## 318  0.000176 -2.37e-04 -0.000364 1.009 6.65e-08 0.00433    
## 319 -0.025712  3.98e-02  0.077210 1.000 2.98e-03 0.00340    
## 320 -0.001198  1.93e-03  0.003954 1.008 7.84e-06 0.00328    
## 321 -0.007373  8.68e-03  0.010371 1.013 5.39e-05 0.00836    
## 322 -0.017090  1.37e-02 -0.022817 1.008 2.61e-04 0.00392    
## 323 -0.043756  5.64e-02  0.080519 1.003 3.24e-03 0.00491    
## 324  0.031590 -2.34e-02  0.049187 1.005 1.21e-03 0.00323    
## 325  0.073206 -8.70e-02 -0.105766 1.006 5.59e-03 0.00773    
## 326  0.048871 -3.54e-02  0.079287 0.998 3.14e-03 0.00312    
## 327  0.002796 -1.29e-02 -0.050696 1.003 1.29e-03 0.00267    
## 328 -0.025802  1.87e-02 -0.041861 1.005 8.77e-04 0.00312    
## 329 -0.036424  1.85e-02 -0.093204 0.991 4.32e-03 0.00260    
## 330  0.061178 -5.29e-02  0.070562 1.006 2.49e-03 0.00571    
## 331 -0.019156  8.60e-03 -0.054297 1.002 1.47e-03 0.00256    
## 332 -0.039269  5.45e-02  0.089422 0.999 3.99e-03 0.00398    
## 333 -0.032496  2.35e-02 -0.052722 1.004 1.39e-03 0.00312    
## 334 -0.000647  7.81e-04  0.000980 1.012 4.82e-07 0.00685    
## 335 -0.049786  4.43e-02 -0.054605 1.010 1.49e-03 0.00731    
## 336 -0.014141  7.87e-03 -0.033075 1.006 5.48e-04 0.00265    
## 337  0.016231 -2.01e-02 -0.026686 1.010 3.57e-04 0.00581    
## 338  0.005825 -4.40e-03  0.008740 1.008 3.83e-05 0.00335    
## 339 -0.037232  2.86e-02 -0.054014 1.004 1.46e-03 0.00348    
## 340 -0.020285  3.93e-02  0.099185 0.991 4.89e-03 0.00297    
## 341 -0.031942  2.73e-02 -0.037706 1.009 7.12e-04 0.00525    
## 342 -0.030087  2.53e-02 -0.036512 1.009 6.68e-04 0.00482    
## 343  0.000926  7.94e-04  0.008552 1.007 3.67e-05 0.00252    
## 344  0.000379 -5.35e-03 -0.024910 1.006 3.11e-04 0.00262    
## 345 -0.006502  1.26e-02  0.031794 1.006 5.06e-04 0.00297    
## 346  0.021588 -2.58e-02 -0.031653 1.012 5.02e-04 0.00743    
## 347 -0.007041  1.51e-02  0.041674 1.005 8.69e-04 0.00288    
## 348 -0.005778  2.59e-03 -0.016376 1.007 1.34e-04 0.00256    
## 349  0.051645 -3.40e-02  0.097612 0.991 4.74e-03 0.00285    
## 350  0.015622 -1.28e-02  0.019969 1.009 2.00e-04 0.00425    
## 351 -0.004192  3.58e-03 -0.004948 1.010 1.23e-05 0.00525    
## 352  0.030390 -2.14e-02  0.051605 1.004 1.33e-03 0.00302    
## 353 -0.007126  3.28e-02  0.129236 0.977 8.24e-03 0.00267   *
## 354  0.016239 -1.38e-02  0.019425 1.010 1.89e-04 0.00503    
## 355  0.012956 -1.58e-02 -0.020385 1.011 2.08e-04 0.00631    
## 356 -0.023522  2.86e-02  0.036292 1.011 6.60e-04 0.00658    
## 357  0.048773 -5.53e-02 -0.061619 1.017 1.90e-03 0.01289   *
## 358  0.142543 -1.31e-01  0.148918 1.006 1.11e-02 0.01108    
## 359 -0.006912 -5.93e-03 -0.063851 0.999 2.04e-03 0.00252    
## 360  0.027840 -4.30e-02 -0.083600 0.998 3.49e-03 0.00340    
## 361  0.006396 -1.77e-03  0.023370 1.006 2.74e-04 0.00251    
## 362  0.007599 -5.50e-03  0.012329 1.008 7.62e-05 0.00312    
## 363 -0.022257  1.24e-02 -0.052057 1.003 1.35e-03 0.00265    
## 364  0.024333 -1.53e-02  0.048983 1.003 1.20e-03 0.00277    
## 365 -0.032513  4.86e-02  0.089913 0.997 4.03e-03 0.00353    
## 366 -0.050869  5.80e-02  0.065130 1.016 2.12e-03 0.01204   *
## 367  0.006037 -8.39e-03 -0.013747 1.009 9.47e-05 0.00398    
## 368  0.200170 -1.90e-01  0.202718 1.015 2.05e-02 0.02013   *
## 369  0.043264 -3.90e-02  0.046400 1.012 1.08e-03 0.00855    
## 370 -0.003493  1.61e-02  0.063346 1.000 2.00e-03 0.00267    
## 371  0.000471  9.48e-04  0.007058 1.008 2.50e-05 0.00255    
## 372 -0.009371  1.82e-02  0.045823 1.004 1.05e-03 0.00297    
## 373 -0.011892  9.55e-03 -0.015877 1.009 1.26e-04 0.00392    
## 374 -0.007222 -3.27e-04 -0.037698 1.005 7.11e-04 0.00250    
## 375  0.004433  3.80e-03  0.040950 1.004 8.39e-04 0.00252    
## 376 -0.002321  5.76e-03  0.017567 1.007 1.55e-04 0.00280    
## 377  0.178543 -1.51e-01  0.213566 0.966 2.24e-02 0.00503   *
## 378  0.000709 -1.52e-03 -0.004198 1.008 8.83e-06 0.00288    
## 379 -0.001618 -3.26e-03 -0.024272 1.006 2.95e-04 0.00255    
## 380 -0.024058  1.59e-02 -0.045471 1.004 1.03e-03 0.00285    
## 381  0.018754 -1.62e-02  0.021631 1.010 2.34e-04 0.00571    
## 382  0.043258 -5.12e-02 -0.061641 1.011 1.90e-03 0.00804    
## 383  0.000700 -9.89e-03 -0.046004 1.004 1.06e-03 0.00262    
## 384 -0.029746  2.76e-02 -0.030764 1.018 4.74e-04 0.01271   *
## 385  0.036590 -1.64e-02  0.103709 0.987 5.34e-03 0.00256    
## 386  0.007150 -1.04e-02 -0.018388 1.008 1.69e-04 0.00367    
## 387 -0.005524  6.33e-03  0.007178 1.016 2.58e-05 0.01124   *
## 388  0.005214 -7.52e-04  0.022387 1.007 2.51e-04 0.00250    
## 389 -0.047884  4.33e-02 -0.051119 1.013 1.31e-03 0.00889    
## 390  0.005324 -3.51e-03  0.010063 1.008 5.08e-05 0.00285    
## 391 -0.018000  9.15e-03 -0.046059 1.004 1.06e-03 0.00260    
## 392  0.002658 -6.59e-03 -0.020116 1.007 2.03e-04 0.00280    
## 393  0.017023 -2.63e-02 -0.051118 1.005 1.31e-03 0.00340    
## 394 -0.000871 -5.99e-03 -0.034221 1.005 5.86e-04 0.00258    
## 395  0.013805 -1.78e-02 -0.025403 1.009 3.23e-04 0.00491    
## 396 -0.034815  5.88e-02  0.128198 0.982 8.13e-03 0.00317   *
## 397 -0.000579 -3.98e-03 -0.022761 1.007 2.60e-04 0.00258    
## 398 -0.070018  8.05e-02  0.091719 1.012 4.21e-03 0.01085    
## 399 -0.056017  4.64e-02 -0.070261 1.004 2.47e-03 0.00443    
## 400  0.001247  8.58e-03  0.048992 1.003 1.20e-03 0.00258
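The rows flagged with an asterisk in the inf column are the observations influence.measures() marks as noteworthy. With one predictor and n = 400, a common leverage cutoff is 2(p + 1)/n = 4/400 = 0.01; hat values such as 0.0402 (obs. 43), 0.0278 (obs. 166), and 0.0239 (obs. 175) exceed it, so the data do contain high-leverage points, though none of them pairs a large hat value with a large Cook's distance. A sketch of flagging them directly:

# High-leverage points: hat values above twice the average leverage (p + 1)/n
hats <- hatvalues(lm_fit_small)
which(hats > 2 * mean(hats))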

Question - 14

(a).

# Set seed for reproducibility
set.seed(1)

# Generate predictors x1 and x2
x1 <- runif(100)
x2 <- 0.5 * x1 + rnorm(100) / 10

# Generate response variable y
y <- 2 + 2 * x1 + 0.3 * x2 + rnorm(100)

# Fit the linear model
lm_fit <- lm(y ~ x1 + x2)

# Display the summary
summary(lm_fit)
## 
## Call:
## lm(formula = y ~ x1 + x2)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -2.8311 -0.7273 -0.0537  0.6338  2.3359 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   2.1305     0.2319   9.188 7.61e-15 ***
## x1            1.4396     0.7212   1.996   0.0487 *  
## x2            1.0097     1.1337   0.891   0.3754    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 1.056 on 97 degrees of freedom
## Multiple R-squared:  0.2088, Adjusted R-squared:  0.1925 
## F-statistic:  12.8 on 2 and 97 DF,  p-value: 1.164e-05
  • Regression Equation (the form of the model used to generate the data):

    \[ y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \varepsilon = 2 + 2 x_1 + 0.3 x_2 + \varepsilon \]

  • Interpretation:

    • The true coefficients are β₀ = 2, β₁ = 2, and β₂ = 0.3. The estimates above (2.13, 1.44, 1.01) recover β₀ well but are far off for β₁ and β₂, a first hint of the collinearity examined in the parts below; the snippet after this list sets them side by side.
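A one-liner to set the estimates next to the truth (a sketch assuming lm_fit as fitted above):

# Fitted coefficients vs. the values used to generate the data
rbind(estimate = coef(lm_fit), truth = c(2, 2, 0.3))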

(b).

# Compute correlation
cor_x1_x2 <- cor(x1, x2)
cor_x1_x2
## [1] 0.8351212
# Scatterplot of x1 vs. x2
plot(x1, x2, main = "Scatterplot of x1 vs. x2", xlab = "x1", ylab = "x2", col = "red", pch = 19)

(c).

# Fit multiple regression model
lm_fit <- lm(y ~ x1 + x2)

# Display summary
summary(lm_fit)
## 
## Call:
## lm(formula = y ~ x1 + x2)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -2.8311 -0.7273 -0.0537  0.6338  2.3359 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   2.1305     0.2319   9.188 7.61e-15 ***
## x1            1.4396     0.7212   1.996   0.0487 *  
## x2            1.0097     1.1337   0.891   0.3754    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 1.056 on 97 degrees of freedom
## Multiple R-squared:  0.2088, Adjusted R-squared:  0.1925 
## F-statistic:  12.8 on 2 and 97 DF,  p-value: 1.164e-05

We can reject H0: β1 = 0 at the 5% level, though only marginally (p = 0.0487). We cannot reject H0: β2 = 0 (p = 0.3754).

(d).

# Fit regression with only x1
lm_x1 <- lm(y ~ x1)

# Display summary
summary(lm_x1)
## 
## Call:
## lm(formula = y ~ x1)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -2.89495 -0.66874 -0.07785  0.59221  2.45560 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   2.1124     0.2307   9.155 8.27e-15 ***
## x1            1.9759     0.3963   4.986 2.66e-06 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 1.055 on 98 degrees of freedom
## Multiple R-squared:  0.2024, Adjusted R-squared:  0.1942 
## F-statistic: 24.86 on 1 and 98 DF,  p-value: 2.661e-06

Yes, we can reject the null hypothesis H0: β1 = 0; on its own, x1 is highly significant (p = 2.66e-06).

(e).

# Fit regression with only x2
lm_x2 <- lm(y ~ x2)

# Display summary
summary(lm_x2)
## 
## Call:
## lm(formula = y ~ x2)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -2.62687 -0.75156 -0.03598  0.72383  2.44890 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   2.3899     0.1949   12.26  < 2e-16 ***
## x2            2.8996     0.6330    4.58 1.37e-05 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 1.072 on 98 degrees of freedom
## Multiple R-squared:  0.1763, Adjusted R-squared:  0.1679 
## F-statistic: 20.98 on 1 and 98 DF,  p-value: 1.366e-05

Yes, we can reject the null hypothesis H0: β1 = 0 for this model as well; on its own, x2 is highly significant (p = 1.37e-05).

(f).

  • In the separate simple regressions (y ~ x1 and y ~ x2), both slopes are statistically significant with very small p-values.

  • In the presence of both predictors (y ~ x1 + x2), β₂ is no longer statistically significant (p = 0.3754), while β₁ is only marginally significant (p = 0.0487).

The results do not contradict each other; they reflect the collinearity between x1 and x2 (r ≈ 0.84 from part (b)). When both variables are included in the model, their shared variance makes it difficult to separate their individual effects, which inflates the standard errors and costs x2 its statistical significance. The sketch below quantifies the inflation.
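With exactly two predictors, the variance inflation factor reduces to VIF = 1 / (1 - r^2) for both of them (a sketch reusing cor_x1_x2 from part (b)):

# VIF for x1 and x2 (identical in a two-predictor model)
1 / (1 - cor_x1_x2^2)  # about 3.3, so standard errors inflate by sqrt(3.3), roughly 1.8x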

(g).

# Add new data point
x1 <- c(x1, 0.1)
x2 <- c(x2, 0.8)
y <- c(y, 6)

# Refit models
lm_fit_new <- lm(y ~ x1 + x2)
lm_x1_new <- lm(y ~ x1)
lm_x2_new <- lm(y ~ x2)

# Display summaries
summary(lm_fit_new)
## 
## Call:
## lm(formula = y ~ x1 + x2)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -2.73348 -0.69318 -0.05263  0.66385  2.30619 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   2.2267     0.2314   9.624 7.91e-16 ***
## x1            0.5394     0.5922   0.911  0.36458    
## x2            2.5146     0.8977   2.801  0.00614 ** 
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 1.075 on 98 degrees of freedom
## Multiple R-squared:  0.2188, Adjusted R-squared:  0.2029 
## F-statistic: 13.72 on 2 and 98 DF,  p-value: 5.564e-06
summary(lm_x1_new)
## 
## Call:
## lm(formula = y ~ x1)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -2.8897 -0.6556 -0.0909  0.5682  3.5665 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   2.2569     0.2390   9.445 1.78e-15 ***
## x1            1.7657     0.4124   4.282 4.29e-05 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 1.111 on 99 degrees of freedom
## Multiple R-squared:  0.1562, Adjusted R-squared:  0.1477 
## F-statistic: 18.33 on 1 and 99 DF,  p-value: 4.295e-05
summary(lm_x2_new)
## 
## Call:
## lm(formula = y ~ x2)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -2.64729 -0.71021 -0.06899  0.72699  2.38074 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   2.3451     0.1912  12.264  < 2e-16 ***
## x2            3.1190     0.6040   5.164 1.25e-06 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 1.074 on 99 degrees of freedom
## Multiple R-squared:  0.2122, Adjusted R-squared:  0.2042 
## F-statistic: 26.66 on 1 and 99 DF,  p-value: 1.253e-06
plot(lm_fit_new, which = 4)  # Cook’s Distance plot

Key Observations from the Cook’s Distance Plot:

  • Observation 101 (new data point) has a high Cook’s distance, indicating it is an influential point.

  • This means it has a strong impact on the regression model and might be pulling the regression coefficients.

Effects on Models:

  1. Multiple Regression (y ~ x1 + x2)

    • The coefficient estimates change substantially: the x1 estimate falls from 1.44 to 0.54 and loses significance (p = 0.36458), while the x2 estimate rises from 1.01 to 2.51 and becomes significant (p = 0.00614). A single point has flipped which predictor appears important.

  2. Simple Regression (y ~ x1)

    • The coefficient for x1 is still significant (p = 4.29e-05), but the fit degrades (R-squared falls from 0.2024 to 0.1562) and the largest residual jumps to 3.57, showing that the new point sits far from this regression line.

  3. Simple Regression (y ~ x2)

    • x2 is still significant (p = 1.25e-06); the mismeasured observation did not distort this model much.

Is the New Observation an Outlier or High-Leverage Point?

  • Outlier: In the multiple regression and in y ~ x2 its residual is not extreme, but in y ~ x1 it produces the largest residual (3.57), so in that model it does behave like an outlier.

  • High-Leverage Point: Yes. The pair (x1 = 0.1, x2 = 0.8) breaks the pattern x2 ≈ 0.5·x1 followed by the other points, so the observation sits far out in predictor space, has a high Cook's distance, and strongly influences the fitted coefficients, as the check below confirms.
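A minimal check of the added point's leverage (assuming lm_fit_new from above; the new point is observation 101):

# Leverage of the added observation vs. the average leverage
hatvalues(lm_fit_new)[101]   # hat value of the new point
mean(hatvalues(lm_fit_new))  # average leverage = (p + 1) / n = 3 / 101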