Multilayer perceptrons (MLPs) are a powerful form of neural network that typically use sigmoidal units and backpropagation for learning. When counting layers, the output units count as one layer, and each layer of hidden units counts as an additional layer. The network is fully connected between adjacent layers: every input unit connects to every hidden unit in the first hidden layer, every hidden unit in the first hidden layer connects to every hidden unit in the second hidden layer, and so on, until every hidden unit in the last hidden layer connects to every unit in the output layer. Each hidden-layer unit also has a weight to the always-1 unit (used as a threshold). A two-layer MLP thus has one hidden layer and one output layer of sigmoidal units, in addition to the input layer.
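The layer structure above can be sketched as follows. This is a minimal illustration of the forward pass only, assuming NumPy; the class name, sizes, and initialization range are illustrative choices, not requirements of the assignment:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class TwoLayerMLP:
    """Fully connected two-layer MLP: inputs -> hidden -> output.
    The extra +1 weight column in each layer connects to the always-1 unit."""
    def __init__(self, n_in, n_hidden, n_out, rng=None):
        rng = rng or np.random.default_rng(0)
        # Small random initial weights; the last column is the threshold weight.
        self.w_hidden = rng.uniform(-0.5, 0.5, (n_hidden, n_in + 1))
        self.w_out = rng.uniform(-0.5, 0.5, (n_out, n_hidden + 1))

    def forward(self, x):
        x = np.append(x, 1.0)              # append the always-1 input
        h = sigmoid(self.w_hidden @ x)     # hidden-layer activations
        h = np.append(h, 1.0)              # always-1 unit for the output layer
        return sigmoid(self.w_out @ h)     # output-layer activations

mlp = TwoLayerMLP(n_in=2, n_hidden=3, n_out=1)
print(mlp.forward(np.array([0.0, 1.0])))  # one output value in (0, 1)
```

Backpropagation would then adjust both weight matrices from the output error; only the connectivity is shown here.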
Implement an MLP using your dataset class from assignment 1. Your code should:
Test your code on the labor, promoters-936, coolcars, and your own datasets. In each case, use the first 75% of each class in the dataset for training and the last 25% of each class for testing. Have your code report a confusion matrix for the training set at the end of training and another for the test set. Pick a set of parameters to try for each dataset (you should likely try several different combinations) and write up your results.
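The per-class 75/25 split and the confusion-matrix report might be sketched as below; the function names and the toy labels are illustrative, not part of the assignment:

```python
from collections import defaultdict

def per_class_split(examples, labels, train_frac=0.75):
    """Give the first 75% of each class (in dataset order) to the training
    set and the remaining 25% to the test set."""
    by_class = defaultdict(list)
    for x, y in zip(examples, labels):
        by_class[y].append((x, y))
    train, test = [], []
    for items in by_class.values():
        cut = int(len(items) * train_frac)
        train.extend(items[:cut])
        test.extend(items[cut:])
    return train, test

def confusion_matrix(actual, predicted, classes):
    """Rows index the actual class, columns the predicted class."""
    index = {c: i for i, c in enumerate(classes)}
    m = [[0] * len(classes) for _ in classes]
    for a, p in zip(actual, predicted):
        m[index[a]][index[p]] += 1
    return m

# Toy example: 8 instances, two classes of 4 -> 3 train / 1 test per class.
labels = ['good', 'good', 'good', 'good', 'bad', 'bad', 'bad', 'bad']
train, test = per_class_split(list(range(8)), labels)
print(len(train), len(test))  # prints: 6 2
print(confusion_matrix(['good', 'bad'], ['good', 'good'], ['good', 'bad']))
```

Splitting within each class keeps the class proportions the same in the training and test sets, which matters for small datasets like labor.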
In addition, test your code on the XOR problem with 2 hidden units, using all of the data for both training and testing. Determine how many epochs it appears to take to learn the problem (in most cases).
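A self-contained sketch of the XOR test, assuming NumPy and batch gradient descent with the standard sigmoid delta rule; the learning rate, epoch limit, and seed are illustrative, and whether (and how fast) a 2-hidden-unit network learns XOR depends on the random initialization:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR: all four patterns are used for both training and testing.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1 = rng.uniform(-1, 1, (2, 3))   # 2 hidden units; 2 inputs + always-1 unit
W2 = rng.uniform(-1, 1, (1, 3))   # 1 output unit; 2 hidden + always-1 unit
eta = 0.5                         # illustrative learning rate

Xb = np.hstack([X, np.ones((4, 1))])      # column of ones = always-1 input
for epoch in range(20000):
    # forward pass over the whole batch
    H = sigmoid(Xb @ W1.T)
    Hb = np.hstack([H, np.ones((4, 1))])
    Y = sigmoid(Hb @ W2.T)
    # backward pass: sigmoid derivative is y * (1 - y)
    d_out = (T - Y) * Y * (1 - Y)
    d_hid = (d_out @ W2[:, :2]) * H * (1 - H)
    W2 += eta * d_out.T @ Hb
    W1 += eta * d_hid.T @ Xb
    # stop once every pattern is on the right side of 0.5
    if np.all((Y > 0.5) == (T > 0.5)) and np.max(np.abs(T - Y)) < 0.1:
        print("learned at epoch", epoch)
        break

print(Y.round(3))
```

Running this from several seeds and averaging the stopping epoch is one way to answer the "how many epochs in most cases" question, since some initializations land in local minima.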
You should hand in a documented copy of your code (including your dataset class files). Also create an archive of the code and email it to firstname.lastname@example.org. Make sure to provide a good general description of your code.
In addition, hand in output and a writeup for all of your testing. Present the testing in a way that shows how effective your system is.