Simple linear regression models the relationship between two variables: one variable (the predictor) tells us what to expect, on average, from the other variable (the response). The general idea of simple linear regression is to use the predictor to estimate the mean value of the response. The relationship is defined as y = a + bx + E, where a is the intercept, b is the slope, E is the error term, x is the predictor variable, and y is the outcome/response variable. The equation essentially describes a straight line through the data, with y-intercept a and slope b.
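Fitting this model in R and printing its summary produces output like the one shown below. The following is a minimal sketch, assuming the data live in a data frame called `dataset` with a predictor column `x` and a response column `y` (the names that appear in the Call line of the output):

```r
# A minimal sketch: fit the simple linear regression y = a + b*x
# ('dataset' is assumed to be a data frame with columns x and y,
#  matching the Call line in the output below)
fit <- lm(y ~ x, data = dataset)

# Print the estimates of a (intercept) and b (slope), the residual summary,
# R-squared, and the F-statistic
summary(fit)
```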
##
## Call:
## lm(formula = y ~ x, data = dataset)
##
## Residuals:
##     Min      1Q  Median      3Q     Max
## -1.4444 -0.8013 -0.2426  0.5978  2.2363
##
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)
## (Intercept)  4.20041    0.56730   7.404 5.16e-06 ***
## x            1.84036    0.07857  23.423 5.13e-12 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.091 on 13 degrees of freedom
## Multiple R-squared: 0.9769, Adjusted R-squared: 0.9751
## F-statistic: 548.7 on 1 and 13 DF, p-value: 5.13e-12
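The Estimate column gives the fitted line, roughly y = 4.20 + 1.84x. As a quick illustration (assuming the `fit` object from the sketch above), the fitted value at a hypothetical new predictor value can be obtained with `predict()`:

```r
# Predicted response at a hypothetical new predictor value x = 5,
# using the coefficients estimated above: about 4.20 + 1.84 * 5 = 13.4
predict(fit, newdata = data.frame(x = 5))
```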