Dual Perceptron and Kernels: Learning non-linear decision boundaries

In this article, a dual implementation of the perceptron, combined with a few non-linear kernels, is used to learn decision boundaries for datasets that are not linearly separable.

  1. The following figure shows the dual perceptron algorithm used; a minimal code sketch of the algorithm appears right after the figure.

[Figure: the dual perceptron algorithm]
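The following is a minimal Python sketch of the standard dual perceptron, in which each training point keeps a mistake count alpha_i and prediction needs only kernel evaluations against the training set. The function names, the bias update, and the epoch-based stopping rule are illustrative assumptions, not the article's exact code.

```python
import numpy as np

def linear_kernel(x, z):
    """Plain dot product; swapping this for a non-linear kernel gives the kernelized variants."""
    return x @ z

def dual_perceptron(X, y, kernel=linear_kernel, max_epochs=100):
    """Dual (kernel) perceptron.

    X is an (n, d) array of training points, y an (n,) array of labels in {-1, +1}.
    Returns the per-example mistake counts alpha and the bias b; the decision
    function is sign(sum_i alpha_i * y_i * K(x_i, x) + b).
    """
    n = X.shape[0]
    alpha = np.zeros(n)
    b = 0.0
    # Precompute the Gram matrix K[i, j] = kernel(x_i, x_j) once.
    K = np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])
    for _ in range(max_epochs):
        mistakes = 0
        for i in range(n):
            # The prediction for x_i uses only the dual coefficients and kernel values.
            if y[i] * (np.sum(alpha * y * K[:, i]) + b) <= 0:
                alpha[i] += 1.0  # one more mistake on example i
                b += y[i]
                mistakes += 1
        if mistakes == 0:  # a full pass with no mistakes: converged
            break
    return alpha, b
```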

  2. A few 2-D training datasets were generated uniformly at random, with binary class labels also assigned at random. The dual implementation of the perceptron was then run with several non-linear kernels, such as the Gaussian kernel (with different bandwidths) and the polynomial kernel (with different degrees), to separate the positive class data points from the negative class. The following figures show the decision boundaries learnt by the perceptron upon convergence; a sketch of the kernel functions appears after the figures.

[Figures: decision boundaries learnt with Gaussian and polynomial kernels]
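Below is a hedged sketch of the two kernels mentioned above and of the dual decision rule they plug into. The default parameter values (degree, offset c, bandwidth sigma) are arbitrary choices for illustration, not the settings used to produce the figures.

```python
import numpy as np

def polynomial_kernel(x, z, degree=3, c=1.0):
    """(x . z + c)^degree; a larger degree gives a more flexible boundary."""
    return (x @ z + c) ** degree

def gaussian_kernel(x, z, sigma=1.0):
    """RBF kernel exp(-||x - z||^2 / (2 sigma^2)); sigma is the bandwidth."""
    return np.exp(-np.sum((x - z) ** 2) / (2.0 * sigma ** 2))

def dual_predict(x, X, y, alpha, b, kernel):
    """Classify a new point x from the learned dual coefficients (alpha, b)."""
    score = sum(alpha[i] * y[i] * kernel(X[i], x) for i in range(len(X))) + b
    return 1 if score > 0 else -1
```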

  3. The next animation shows, iteration by iteration, how the dual implementation converges on a given training dataset; a sketch for generating such frames follows the figure.

[Animation: dual perceptron iterations converging on a training dataset]
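One way such an animation could be produced is to redraw the decision boundary after every epoch of the dual update and save each plot as a frame. The sketch below is self-contained, but the random dataset, the Gaussian bandwidth, and the frame file names are all made up for illustration; it is not the article's actual animation code.

```python
import numpy as np
import matplotlib.pyplot as plt

def gaussian_kernel(x, z, sigma=0.5):
    return np.exp(-np.sum((x - z) ** 2) / (2.0 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 2))    # random 2-D points
y = np.where(rng.random(30) < 0.5, -1, 1)   # random binary labels

n = len(X)
K = np.array([[gaussian_kernel(X[i], X[j]) for j in range(n)] for i in range(n)])
alpha, b = np.zeros(n), 0.0

# Fixed grid on which the current decision function is evaluated for each frame.
xx, yy = np.meshgrid(np.linspace(-1.5, 1.5, 100), np.linspace(-1.5, 1.5, 100))
grid = np.c_[xx.ravel(), yy.ravel()]

for epoch in range(20):
    mistakes = 0
    for i in range(n):
        if y[i] * (np.sum(alpha * y * K[:, i]) + b) <= 0:
            alpha[i] += 1.0
            b += y[i]
            mistakes += 1
    # One frame per epoch: current score surface, zero-level boundary, training points.
    scores = np.array([np.sum(alpha * y * np.array([gaussian_kernel(xi, p) for xi in X])) + b
                       for p in grid]).reshape(xx.shape)
    plt.contourf(xx, yy, scores, levels=20, cmap="bwr", alpha=0.3)
    plt.contour(xx, yy, scores, levels=[0.0], colors="k")
    plt.scatter(X[:, 0], X[:, 1], c=y, cmap="bwr", edgecolors="k")
    plt.title(f"epoch {epoch + 1}, mistakes {mistakes}")
    plt.savefig(f"frame_{epoch:03d}.png")
    plt.clf()
    if mistakes == 0:
        break
```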

  4. The next figure shows how the dual perceptron with a polynomial kernel overfits as the degree of the polynomial increases, while the Gaussian kernel learns a markedly different decision boundary; a comparison sketch follows the figure.

[Figure: decision boundaries for increasing polynomial degree and for the Gaussian kernel]
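To reproduce this kind of comparison, one can retrain the dual perceptron with kernels of increasing flexibility and plot the resulting boundaries side by side. The sketch below reuses dual_perceptron and the kernel functions from the earlier sketches; the dataset, the chosen degrees, the bandwidth, and the output file name are illustrative guesses rather than the article's settings.

```python
import functools
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(40, 2))
y = np.where(rng.random(40) < 0.5, -1, 1)

# Increasingly flexible kernels: low- and high-degree polynomial, plus a narrow Gaussian.
kernels = {
    "poly degree 2": functools.partial(polynomial_kernel, degree=2),
    "poly degree 6": functools.partial(polynomial_kernel, degree=6),
    "gaussian sigma 0.3": functools.partial(gaussian_kernel, sigma=0.3),
}

xx, yy = np.meshgrid(np.linspace(-1.5, 1.5, 100), np.linspace(-1.5, 1.5, 100))
grid = np.c_[xx.ravel(), yy.ravel()]

fig, axes = plt.subplots(1, len(kernels), figsize=(5 * len(kernels), 4))
for ax, (name, kern) in zip(axes, kernels.items()):
    alpha, b = dual_perceptron(X, y, kernel=kern, max_epochs=50)
    scores = np.array([np.sum(alpha * y * np.array([kern(xi, p) for xi in X])) + b
                       for p in grid]).reshape(xx.shape)
    ax.contourf(xx, yy, scores, levels=20, cmap="bwr", alpha=0.3)
    ax.contour(xx, yy, scores, levels=[0.0], colors="k")
    ax.scatter(X[:, 0], X[:, 1], c=y, cmap="bwr", edgecolors="k")
    ax.set_title(name)
fig.savefig("kernel_comparison.png")
```

The higher-degree polynomial and the narrow-bandwidth Gaussian typically carve out much more intricate regions around individual points, which is the overfitting behaviour the figure illustrates.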