For Example 19 on Page 79 of the book, carry out the linear regression using R.
x <- c(-0.98, 1.00, 2.02, 3.03, 4.00)
y <- c(2.44, -1.51, -0.47, 2.54, 7.52)
model1 <- lm(y~x)
model1
##
## Call:
## lm(formula = y ~ x)
##
## Coefficients:
## (Intercept) x
## 0.4038 0.9373
summary(model1)
##
## Call:
## lm(formula = y ~ x)
##
## Residuals:
## 1 2 3 4 5
## 2.9547 -2.8511 -2.7671 -0.7037 3.3671
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 0.4038 2.2634 0.178 0.870
## x 0.9373 0.9058 1.035 0.377
##
## Residual standard error: 3.481 on 3 degrees of freedom
## Multiple R-squared: 0.263, Adjusted R-squared: 0.01739
## F-statistic: 1.071 on 1 and 3 DF, p-value: 0.3769
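As a quick visual check (a minimal sketch, not part of the book's solution), the fitted line could be drawn over the data; abline() accepts an lm object directly.
plot(x, y)        # scatter plot of the raw data
abline(model1)    # add the fitted line y = 0.4038 + 0.9373 x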
Implement the nonlinear curve-fitting of Example 20 on Page 83 for the following data.
x <- c(0.10, 0.50, 1.00, 1.50, 2.00, 2.50)
y <- c(0.10, 0.28, 0.40, 0.40, 0.37, 0.32)
df1 <- data.frame(x, y)
eq1 <- function(x, a, b){
  x / (a + b * x^2)  # rational function model from Example 20
}
model2 <- nls(y ~ eq1(x, a, b), data = df1, start = list(a = 1, b = 1))
model2
## Nonlinear regression model
## model: y ~ eq1(x, a, b)
## data: df1
## a b
## 1.485 1.002
## residual sum-of-squares: 0.00121
##
## Number of iterations to convergence: 5
## Achieved convergence tolerance: 3.899e-07
s <- seq(from = 0, to = 2.5, length.out = 50)  # grid covering the full range of x
plot(x, y)
lines(s, predict(model2, list(x = s)))
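As with lm, standard errors and p-values for the nls estimates could be inspected with summary(); its output is omitted here.
summary(model2)  # parameter table with standard errors, t values and p-values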
## Ex.3
Fit the following data with binary y values
x <- c(0.1, 0.5, 1.0, 1.5, 2.0, 2.5)
y <- c(0, 1, 0, 1, 0, 1)
to the nonlinear function \[y = \frac{1}{1 + e^{a+bx}},\]
starting with a = 1 and b = 1.
df2 <- data.frame(x, y)
eq2 <- function(x, a, b){
  1 / (1 + exp(a + b * x))  # logistic-type curve; note the sign convention on a + bx
}
model3 <- nls(y ~ eq2(x, a, b), data = df2, start = list(a = 1, b = 1))
model3
## Nonlinear regression model
## model: y ~ eq2(x, a, b)
## data: df2
## a b
## 0.8294 -0.6520
## residual sum-of-squares: 1.387
##
## Number of iterations to convergence: 6
## Achieved convergence tolerance: 7.342e-06
s <- seq(from = 0, to = 2.5, length.out = 50)  # again cover the full range of x
plot(x, y)
lines(s, predict(model3, list(x = s)))
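For binary responses like these, a common alternative (not used in the exercise) is logistic regression via glm() with the binomial family. Note that glm() parameterizes the curve as \(1/(1 + e^{-(\beta_0 + \beta_1 x)})\), so its coefficients correspond to (-a, -b) in the notation above; model4 is a name introduced here only for illustration.
model4 <- glm(y ~ x, family = binomial, data = df2)  # logistic regression fit
coef(model4)  # intercept and slope on the logit scale (signs flipped relative to a, b)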