Multi-class perceptron problem

What do you do when you have to classify more than two classes using the perceptron, or a similar basic classifier unit? Several approaches exist to solve this; one of the most common is the one-vs-all strategy.

The one-vs-all strategy consists of using several perceptrons, each discriminating one desired trait, so that selecting one class automatically eliminates the rest. In this exercise we use the irisSmall.arff dataset, which is widely used by researchers worldwide as a well-tested example of three linearly separable classes.
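The one-vs-all idea can be sketched in a few lines: one binary scorer per class, with the most confident scorer deciding the label. The scorers below are toy stand-ins for trained perceptrons, and all names here are illustrative, not taken from the exercise.

```python
# One-vs-all sketch: each class gets one "this class vs. the rest"
# scorer; the class whose scorer is most confident wins.

def one_vs_all_predict(scorers, x):
    """scorers: dict mapping class label -> function(x) -> real score."""
    return max(scorers, key=lambda label: scorers[label](x))

# Toy one-feature scorers standing in for trained perceptrons.
scorers = {
    "setosa":     lambda x: -x[0] + 2.0,            # fires for small x[0]
    "virginica":  lambda x:  x[0] - 4.0,            # fires for large x[0]
    "versicolor": lambda x: 1.0 - abs(x[0] - 3.0),  # fires in between
}
```

With a real perceptron, the score would be the pre-activation value w·x + b rather than these hand-picked functions.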

Iris dataset

This dataset consists of three different classes of flowers:

Iris-setosa

Iris-versicolor

Iris-virginica

Each class is described by four attributes that distinguish one flower from another:

sepallength

sepalwidth

petallength

petalwidth

The attributes are contained in a .arff file, a format widely used by the Weka machine-learning system/library. The classes and attributes must be parsed from this file.
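As a sketch of that parsing step, the block below embeds a sample ARFF header mirroring the standard Weka iris declarations (the actual irisSmall.arff contents may differ) and pulls out the attribute names and class labels; `parse_arff_header` is an illustrative helper, not part of the exercise code.

```python
# Hedged sketch: extracting attribute names and class labels from an
# ARFF header. The sample mirrors the standard Weka iris header; the
# real irisSmall.arff may differ.

SAMPLE_ARFF = """\
@RELATION iris
@ATTRIBUTE sepallength NUMERIC
@ATTRIBUTE sepalwidth NUMERIC
@ATTRIBUTE petallength NUMERIC
@ATTRIBUTE petalwidth NUMERIC
@ATTRIBUTE class {Iris-setosa,Iris-versicolor,Iris-virginica}
@DATA
5.1,3.5,1.4,0.2,Iris-setosa
"""

def parse_arff_header(text):
    attributes, classes = [], []
    for line in text.splitlines():
        line = line.strip()
        if line.upper().startswith("@ATTRIBUTE"):
            _, name, kind = line.split(None, 2)
            if kind.startswith("{"):
                # Nominal attribute: its values are the class labels.
                classes = [c.strip() for c in kind.strip("{}").split(",")]
            else:
                attributes.append(name)
    return attributes, classes
```

Everything after the `@DATA` marker is one comma-separated sample per line, with the class label last, so the feature rows can be read with an ordinary CSV parse.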

Neural network topology

In order to solve this problem we make one large assumption: the data provided will concern only those three flowers; that is, no weeds and no flowers other than Iris-setosa, Iris-versicolor and Iris-virginica. The network uses the hardlim (hard-limit) function as its activation, so we only need to feed the four attributes to the first two input neurons.

The second stage, a single neuron that discriminates the third class, receives only the outputs of the first two neurons as its input. Figure 1 shows the network topology.

In this case we feed the real-valued attributes to the input neurons and adjust the weights and biases until the network separates Iris-setosa vs. the rest and Iris-virginica vs. the rest.
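The weight-and-bias adjustment just described is the perceptron learning rule. Below is a minimal sketch with a hardlim activation; the one-feature training data and epoch count are illustrative, not values from irisSmall.arff.

```python
# Sketch of the training loop: hardlim activation plus the perceptron
# learning rule (w += error * x, b += error). Data is a toy
# "setosa vs. rest"-style split on a single feature.

def hardlim(n):
    return 1 if n >= 0 else 0

def train_perceptron(samples, epochs=50):
    """samples: list of (features, target) with target in {0, 1}."""
    w = [0.0] * len(samples[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = hardlim(sum(wi * xi for wi, xi in zip(w, x)) + b)
            e = t - y                                  # error in {-1, 0, 1}
            w = [wi + e * xi for wi, xi in zip(w, x)]  # learning rule
            b += e
    return w, b

data = [([1.0], 1), ([1.5], 1), ([4.0], 0), ([4.5], 0)]
w, b = train_perceptron(data)
```

Because the toy data is linearly separable, the loop converges and the learned w, b classify all four samples correctly; for the real task the features would be the four iris attributes.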

Then we feed those output values, which are now binary, to the third neuron to identify whether the sample is Iris-versicolor or one of the others.
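The final decision can then be read off the three neurons. One way to realize the third neuron, assuming weights (-1, -1) and bias 0 (my choice, not stated in the text), is that it fires exactly when neither first-stage neuron fires, which under the only-three-classes assumption means Iris-versicolor:

```python
# Combining the two binary stage-one outputs into a class label.

def hardlim(n):
    return 1 if n >= 0 else 0

def classify(setosa_out, virginica_out):
    if setosa_out:
        return "Iris-setosa"
    if virginica_out:
        return "Iris-virginica"
    # Third neuron with weights (-1, -1) and bias 0: hardlim fires
    # exactly when both inputs are 0, i.e. "none of the above".
    assert hardlim(-setosa_out - virginica_out) == 1
    return "Iris-versicolor"
```

Note this relies on the stated assumption that only the three iris classes ever appear; an input firing both stage-one neurons would indicate a training problem rather than a fourth class.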