5. We have seen that we can fit an SVM with a non-linear kernel in order to perform classification using a non-linear decision boundary. We will now see that we can also obtain a non-linear decision boundary by performing logistic regression using non-linear transformations of the features.

  (a) Generate a data set with n = 500 and p = 2, such that the observations belong to two classes with a quadratic decision boundary between them. For instance, you can do this as follows:
x1 = runif(500) - 0.5
x2 = runif(500) - 0.5
y = 1 * (x1^2 - x2^2 > 0)
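For reproducibility, set.seed() can be called before the runif() draws. A quick tabulation (an optional check, not part of the exercise) confirms that both classes are well represented:

table(y)  # counts of observations in class 0 and class 1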
  (b) Plot the observations, colored according to their class labels. Your plot should display X1 on the x-axis, and X2 on the y-axis.
plot(x1[y == 0], x2[y == 0], col = "purple", xlab = "X1", ylab = "X2", pch = "*")
points(x1[y == 1], x2[y == 1], col = "green", pch = 4)

  (c) Fit a logistic regression model to the data, using X1 and X2 as predictors.
log.fit = glm(y ~ x1 + x2, family = binomial)
summary(log.fit)
## 
## Call:
## glm(formula = y ~ x1 + x2, family = binomial)
## 
## Deviance Residuals: 
##    Min      1Q  Median      3Q     Max  
## -1.239  -1.181   1.123   1.162   1.205  
## 
## Coefficients:
##             Estimate Std. Error z value Pr(>|z|)
## (Intercept) 0.033447   0.089682   0.373    0.709
## x1          0.008086   0.306000   0.026    0.979
## x2          0.217831   0.310963   0.701    0.484
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 693.02  on 499  degrees of freedom
## Residual deviance: 692.53  on 497  degrees of freedom
## AIC: 698.53
## 
## Number of Fisher Scoring iterations: 3
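None of the coefficients is statistically significant, which is expected: the true boundary depends on X1^2 - X2^2, so a model that is linear in X1 and X2 cannot capture it.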
  (d) Apply this model to the training data in order to obtain a predicted class label for each training observation. Plot the observations, colored according to the predicted class labels. The decision boundary should be linear.
data = data.frame(x1 = x1, x2 = x2, y = as.factor(y))
lm.prob = predict(log.fit, data, type = "response")
lm.pred = ifelse(lm.prob > 0.50, 1, 0)
data.pos = data[lm.pred == 1, ]
data.neg = data[lm.pred == 0, ]
plot(data.pos$x1, data.pos$x2, col = "blue", xlab = "X1", ylab = "X2", pch = "+")
points(data.neg$x1, data.neg$x2, col = "red", pch = 4)
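As a quick sanity check (assuming the objects created above are still in the workspace), the training error rate of this linear fit can be computed:

mean(lm.pred != y)  # training error rate of the linear logistic model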

  (e) Now fit a logistic regression model to the data using non-linear functions of X1 and X2 as predictors (e.g. X1^2, X1 × X2, log(X2), and so forth).
lm.fit = glm(y ~ poly(x1, 2) + poly(x2, 2) + I(x1 * x2), data = data, family = binomial)
## Warning: glm.fit: algorithm did not converge
## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred
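These warnings are not surprising: with the squared terms included, the model can separate the two classes almost perfectly, so some fitted probabilities are pushed to 0 or 1. The predicted class labels used in the next part are still usable.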
  (f) Apply this model to the training data in order to obtain a predicted class label for each training observation. Plot the observations, colored according to the predicted class labels. The decision boundary should be obviously non-linear. If it is not, then repeat (a)-(e) until you come up with an example in which the predicted class labels are obviously non-linear.
lm.prob = predict(lm.fit, data, type = "response")
lm.pred = ifelse(lm.prob > 0.5, 1, 0)
data.pos = data[lm.pred == 1, ]
data.neg = data[lm.pred == 0, ]
plot(data.pos$x1, data.pos$x2, col = "blue", xlab = "X1", ylab = "X2", pch = "+")
points(data.neg$x1, data.neg$x2, col = "gray", pch = 4)
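Because the model contains the squared terms that define the true boundary, the predicted labels should trace out the quadratic region X1^2 > X2^2 almost exactly.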

  (g) Fit a support vector classifier to the data with X1 and X2 as predictors. Obtain a class prediction for each training observation. Plot the observations, colored according to the predicted class labels.
library(e1071)

svm.fit = svm(y ~ x1 + x2, data, kernel = "linear", cost = 0.01)
svm.pred2 = predict(svm.fit, data)
data.pos1 = data[svm.pred2 == 1, ]
data.neg1 = data[svm.pred2 == 0, ]
plot(data.pos1$x1, data.pos1$x2, col = "blue", xlab = "X1", ylab = "X2", pch = "+")
points(data.neg1$x1, data.neg1$x2, col = "gray", pch = 4)
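Depending on the simulated data, the support vector classifier with a linear kernel may assign nearly all observations to a single class; tabulating the predictions makes this easy to check:

table(svm.pred2)  # distribution of predicted labels from the linear fit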

  (h) Fit an SVM using a non-linear kernel to the data. Obtain a class prediction for each training observation. Plot the observations, colored according to the predicted class labels.
svm.fit = svm(y ~ x1 + x2, data, kernel = "radial", gamma = 1)
svm.pred = predict(svm.fit, data)
data.pos = data[svm.pred == 1, ]
data.neg = data[svm.pred == 0, ]
plot(data.pos$x1, data.pos$x2, col = "blue", xlab = "X1", ylab = "X2", pch = "+")
points(data.neg$x1, data.neg$x2, col = "gray", pch = 4)
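The e1071 package also supplies a plot method for svm objects; with only two predictors it can draw the fitted decision region directly, which is a convenient (optional) way to confirm that the boundary is non-linear:

plot(svm.fit, data)  # shaded decision regions of the radial-kernel SVM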

  (i) Comment on your results.

Logistic regression with only linear terms and the support vector classifier with a linear kernel both produce linear decision boundaries, so neither can capture the true quadratic boundary between the classes. Adding non-linear transformations of X1 and X2 to the logistic regression, or using an SVM with a radial kernel (tuned through gamma), recovers the non-linear boundary well. The radial-kernel SVM does this without any manual feature engineering, whereas the logistic regression approach requires choosing appropriate non-linear terms by hand.
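As a rough numerical comparison (a sketch that assumes the model objects and prediction vectors created above are still in the workspace), the training error rate of each fit can be computed:

# training error rates of the four fits
log.lin.pred = ifelse(predict(log.fit, data, type = "response") > 0.5, 1, 0)
log.quad.pred = ifelse(predict(lm.fit, data, type = "response") > 0.5, 1, 0)
mean(log.lin.pred != y)    # logistic regression, linear terms only
mean(log.quad.pred != y)   # logistic regression with quadratic terms
mean(svm.pred2 != data$y)  # support vector classifier (linear kernel)
mean(svm.pred != data$y)   # SVM with radial kernel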
