Mingshi Di
2018-03-04 (re-submission)
library(datasets)
data("anscombe")
data <- anscombe
attach(data)
library(fBasics)
colMeans(data)
## x1 x2 x3 x4 y1 y2 y3 y4
## 9.000000 9.000000 9.000000 9.000000 7.500909 7.500909 7.500000 7.500909
colVars(data)
## x1 x2 x3 x4 y1 y2 y3
## 11.000000 11.000000 11.000000 11.000000 4.127269 4.127629 4.122620
## y4
## 4.123249
correlationTest(x1, y1)
##
## Title:
## Pearson's Correlation Test
##
## Test Results:
## PARAMETER:
## Degrees of Freedom: 9
## SAMPLE ESTIMATES:
## Correlation: 0.8164
## STATISTIC:
## t: 4.2415
## P VALUE:
## Alternative Two-Sided: 0.00217
## Alternative Less: 0.9989
## Alternative Greater: 0.001085
## CONFIDENCE INTERVAL:
## Two-Sided: 0.4244, 0.9507
## Less: -1, 0.9388
## Greater: 0.5113, 1
##
## Description:
## Sun Mar 4 17:28:05 2018
correlationTest(x2, y2)
##
## Title:
## Pearson's Correlation Test
##
## Test Results:
## PARAMETER:
## Degrees of Freedom: 9
## SAMPLE ESTIMATES:
## Correlation: 0.8162
## STATISTIC:
## t: 4.2386
## P VALUE:
## Alternative Two-Sided: 0.002179
## Alternative Less: 0.9989
## Alternative Greater: 0.001089
## CONFIDENCE INTERVAL:
## Two-Sided: 0.4239, 0.9506
## Less: -1, 0.9387
## Greater: 0.5109, 1
##
## Description:
## Sun Mar 4 17:28:05 2018
correlationTest(x3, y3)
##
## Title:
## Pearson's Correlation Test
##
## Test Results:
## PARAMETER:
## Degrees of Freedom: 9
## SAMPLE ESTIMATES:
## Correlation: 0.8163
## STATISTIC:
## t: 4.2394
## P VALUE:
## Alternative Two-Sided: 0.002176
## Alternative Less: 0.9989
## Alternative Greater: 0.001088
## CONFIDENCE INTERVAL:
## Two-Sided: 0.4241, 0.9507
## Less: -1, 0.9387
## Greater: 0.511, 1
##
## Description:
## Sun Mar 4 17:28:05 2018
correlationTest(x4, y4)
##
## Title:
## Pearson's Correlation Test
##
## Test Results:
## PARAMETER:
## Degrees of Freedom: 9
## SAMPLE ESTIMATES:
## Correlation: 0.8165
## STATISTIC:
## t: 4.243
## P VALUE:
## Alternative Two-Sided: 0.002165
## Alternative Less: 0.9989
## Alternative Greater: 0.001082
## CONFIDENCE INTERVAL:
## Two-Sided: 0.4246, 0.9507
## Less: -1, 0.9388
## Greater: 0.5115, 1
##
## Description:
## Sun Mar 4 17:28:05 2018
plot(x1, y1)
plot(x2, y2)
plot(x3, y3)
plot(x4, y4)
par(mfrow=c(2,2))
plot(x1, y1, pch = 20, xlim=c(0,16), ylim=c(0,12))
plot(x2, y2, pch = 20, xlim=c(0,16), ylim=c(0,12))
plot(x3, y3, pch = 20, xlim=c(0,16), ylim=c(0,14))
plot(x4, y4, pch = 20, xlim=c(0,22), ylim=c(0,14))
Now fit a linear model to each data set using the lm() function.
Now combine the last two tasks. Create a four-panel scatter plot that shows both the data points and the regression lines. (Hint: the model objects will carry over between chunks!)
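The hint refers to the fact that objects created in one knitr chunk remain available in later chunks of the same document. Below is a minimal sketch of that pattern, assuming the anscombe columns are still attached as above; the object names fit1 through fit4 are illustrative, not required by the assignment. The four-panel answer with explicit axis limits follows.
# Earlier chunk: fit the four models once and keep them as objects.
fit1 <- lm(y1 ~ x1)
fit2 <- lm(y2 ~ x2)
fit3 <- lm(y3 ~ x3)
fit4 <- lm(y4 ~ x4)
# Later chunk: the stored fits carry over, so they can be reused directly.
par(mfrow = c(2, 2))
plot(x1, y1, pch = 20); abline(fit1)
plot(x2, y2, pch = 20); abline(fit2)
plot(x3, y3, pch = 20); abline(fit3)
plot(x4, y4, pch = 20); abline(fit4)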
par(mfrow=c(2,2))
plot(x1, y1, pch = 20, xlim=c(0,16), ylim=c(0,12))
abline(lm(y1~x1))
plot(x2, y2, pch = 20, xlim=c(0,16), ylim=c(0,12))
abline(lm(y2~x2))
plot(x3, y3, pch = 20, xlim=c(0,16), ylim=c(0,14))
abline(lm(y3~x3))
plot(x4, y4, pch = 20, xlim=c(0,22), ylim=c(0,14))
abline(lm(y4~x4))
lm(y1~x1)
##
## Call:
## lm(formula = y1 ~ x1)
##
## Coefficients:
## (Intercept) x1
## 3.0001 0.5001
lm(y2~x2)
##
## Call:
## lm(formula = y2 ~ x2)
##
## Coefficients:
## (Intercept) x2
## 3.001 0.500
lm(y3~x3)
##
## Call:
## lm(formula = y3 ~ x3)
##
## Coefficients:
## (Intercept) x3
## 3.0025 0.4997
lm(y4~x4)
##
## Call:
## lm(formula = y4 ~ x4)
##
## Coefficients:
## (Intercept) x4
## 3.0017 0.4999
These four data sets have very similar statistical properties; however, the plots show that they are drastically different. This highlights the importance of visualization in revealing what summary statistics and fitted models alone do not tell us.
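As a compact check of that claim, the key statistics can be collected into one small table. This is a minimal sketch in base R; the helper name quartet_stats is illustrative, and the code assumes the anscombe data frame loaded above.
# Collect mean, variance, correlation, and fitted coefficients for each set.
quartet_stats <- sapply(1:4, function(i) {
  x <- anscombe[[paste0("x", i)]]
  y <- anscombe[[paste0("y", i)]]
  fit <- lm(y ~ x)
  c(mean_x = mean(x), var_x = var(x),
    mean_y = mean(y), var_y = var(y),
    cor_xy = cor(x, y),
    intercept = unname(coef(fit)[1]), slope = unname(coef(fit)[2]))
})
colnames(quartet_stats) <- paste0("set", 1:4)
round(quartet_stats, 3)  # the four columns are nearly identical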