1 Least Squares In Two Dimensions

## 
## Call:
## lm(formula = y ~ x, data = x.df)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -2.0306 -0.5751 -0.2109  0.5522  2.7050 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   2.6800     0.3273   8.188 6.51e-09 ***
## x             5.0094     0.1124  44.562  < 2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.9189 on 28 degrees of freedom
## Multiple R-squared:  0.9861, Adjusted R-squared:  0.9856 
## F-statistic:  1986 on 1 and 28 DF,  p-value: < 2.2e-16

Since the p-value (< 2.2e-16) is smaller than \(\alpha = 0.05\), we reject \(H_0\) and conclude that there is a significant linear relationship between x and y.
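As a minimal sketch, the summary above can be reproduced along these lines. The data frame name x.df and the formula y ~ x come from the call shown in the output; the simulated data and the object name model are illustrative assumptions, since the original data set is not shown here.

```r
# Sketch only: the original data are not shown, so we simulate a data frame
# x.df with columns x and y of the same size (n = 30, residual df = 28).
set.seed(1)
x.df <- data.frame(x = runif(30, 0, 6))
x.df$y <- 2.7 + 5 * x.df$x + rnorm(30, sd = 0.9)

model <- lm(y ~ x, data = x.df)   # same formula as in the call above
summary(model)                    # prints a table like the one shown

# Decision rule: reject H0 (slope = 0) when the p-value is below alpha.
alpha <- 0.05
coef(summary(model))["x", "Pr(>|t|)"] < alpha   # TRUE -> reject H0
```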

##       a0       a1 
## 2.680009 5.009426

So the OLS estimates are \(a_0 = 2.68\) (intercept) and \(a_1 = 5.01\) (slope); the fitted line is \(\hat{y} = 2.68 + 5.01x\).
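For reference, the same estimates follow from the closed-form OLS formulas. This sketch continues from the simulated x.df and model above, so the numbers only match 2.68 and 5.01 exactly when the original data are used.

```r
# Closed-form OLS estimators for the line y = a0 + a1 * x.
x <- x.df$x
y <- x.df$y
a1 <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)  # slope
a0 <- mean(y) - a1 * mean(x)                                     # intercept
c(a0 = a0, a1 = a1)   # agrees with coef(model); 2.680 and 5.009 on the original data
coef(model)
```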

##           y         x
## y 1.0000000 0.9930235
## x 0.9930235 1.0000000

The correlation between x and y is 0.99, which indicates a very strong positive linear relationship.
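The correlation matrix above comes directly from the data frame; a small sketch, again using the x.df defined earlier:

```r
# Pairwise correlations; for one predictor the sign of r matches the sign of a1.
cor(x.df)             # 2 x 2 correlation matrix as printed above
cor(x.df$x, x.df$y)   # single correlation coefficient
```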

## [1] 0.9188509
## [1] 0.9860958
## [1] "Df"      "Sum Sq"  "Mean Sq" "F value" "Pr(>F)"

The first two values above are the residual standard error (0.9189) and \(R^2\) (0.9861). The ANOVA table ss contains the degrees of freedom, sums of squares, mean squares, the F value, and the p-value.
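Assuming ss was built with anova() on the fitted model (the column names above match that structure), these quantities can be extracted as follows; summary(model)$sigma and summary(model)$r.squared are standard components of an lm summary.

```r
ss <- anova(model)        # ANOVA table: one row for x, one for Residuals
names(ss)                 # "Df" "Sum Sq" "Mean Sq" "F value" "Pr(>F)"
summary(model)$sigma      # residual standard error (0.9189 on the original data)
summary(model)$r.squared  # R-squared (0.9861 on the original data)
```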

## [1] 0.9860958

The \(R^2\) value of the model is 0.986, meaning about 98.6% of the variation in y is explained by the regression on x, so the model fits the data very well.
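For a simple linear regression, \(R^2\) can be recovered in two equivalent ways: from the sum-of-squares decomposition, and as the squared correlation between x and y (0.9930² ≈ 0.9861, consistent with the values above). A sketch, continuing from the model and x.df defined earlier:

```r
sse <- sum(residuals(model)^2)           # residual (error) sum of squares
sst <- sum((x.df$y - mean(x.df$y))^2)    # total sum of squares
1 - sse / sst                            # R^2 from the decomposition
cor(x.df$x, x.df$y)^2                    # r^2, identical for a single predictor
```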

## [1] 1985.773

The F value is 1985.773, which matches the F-statistic in the model summary (rounded there to 1986).
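The F value can be recomputed from the mean squares in the ANOVA table, and with a single predictor it also equals the squared t value of the slope (44.562² ≈ 1985.8). A sketch, assuming the ss and model objects from above:

```r
msr <- ss["x", "Mean Sq"]          # mean square of the regression
mse <- ss["Residuals", "Mean Sq"]  # mean square of the residuals
msr / mse                          # F statistic (1985.773 on the original data)
coef(summary(model))["x", "t value"]^2   # same value: t^2 = F for one predictor
```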

## [1] 0

The probability from the F test is printed as 0: the p-value is so small that it is effectively zero (the summary reports it as < 2.2e-16), so the overall regression is highly significant.
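The probability can also be computed directly from the F(1, 28) distribution. One common way this value ends up as exactly 0 is computing it as 1 - pf(...), where pf() has already rounded to 1 in double precision; using lower.tail = FALSE keeps the actual (extremely small) tail probability. A sketch using the ss table from above:

```r
f <- ss["x", "F value"]                        # observed F statistic
1 - pf(f, df1 = 1, df2 = 28)                   # 0: pf() has rounded to 1
pf(f, df1 = 1, df2 = 28, lower.tail = FALSE)   # the actual, extremely small, p-value
```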