Computer Science 8751
Machine Learning

Programming Assignment 3
Cascade Correlation (45 points)
Due Tuesday, November 22, 2005


Determining the correct number of units for a neural network is often difficult. Various approaches have been proposed, many starting with too many units and then pruning units as needed. An alternate approach is to start with a minimal network and add units until some criterion is met. In this assignment we will be extending Weka's MultilayerPerceptron code to implement the Cascade Correlation method.

To Do

You should familiarize yourself with the MLP code in Weka and learn how it can be used to add units to a neural network. You will also need to work out how to make only some of the weights in a network trainable, so that the rest can be frozen. Once you have done this, implement the following method:
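One simple way to restrict learning to a subset of weights is to keep a boolean "trainable" mask alongside the weight array and skip updates for frozen entries. The sketch below is a minimal illustration of that idea, not how Weka's MultilayerPerceptron stores its weights internally; the class and method names here are assumptions for the sketch.

```java
// Illustrative sketch: freeze weights with a trainable mask.
// (Names are hypothetical, not Weka's API.)
public class FrozenWeights {
    double[] weights;
    boolean[] trainable;   // false => weight is frozen

    FrozenWeights(double[] w) {
        weights = w.clone();
        trainable = new boolean[w.length];
        java.util.Arrays.fill(trainable, true);
    }

    // Freeze every weight currently in the network.
    void freezeAll() {
        java.util.Arrays.fill(trainable, false);
    }

    // Apply a gradient step only to the trainable weights.
    void applyUpdate(double[] gradient, double learningRate) {
        for (int i = 0; i < weights.length; i++) {
            if (trainable[i]) {
                weights[i] -= learningRate * gradient[i];
            }
        }
    }
}
```

With this layout, freezing the existing network before adding a unit is just a call to freezeAll(), followed by marking only the new unit's weights trainable.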

Create an initial perceptron network (no hidden units) and train it on
  the data for some fixed number of epochs
Estimate the error using a tuning set
While (the tuning set error has decreased within the last three networks)
  Freeze all of the weights in the existing network
  Add a single hidden unit to the network that has as inputs all of the
     input units and all of the hidden units so far and is connected to
     the output unit
  Train the new unit's weights on the data for some fixed number of epochs
  Estimate the error of the new network using the tuning set
Use the network with the lowest tuning set error
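The loop above can be sketched end to end on plain arrays. This is a minimal, self-contained illustration of the control flow (cascaded hidden units, frozen old weights, tuning-set stopping), not Weka's MultilayerPerceptron; every class, method, and parameter name below is an assumption for the sketch, and the training details (sigmoid units, squared error, fixed learning rate) are simplifying choices.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the cascade training loop.
public class CascadeSketch {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    final int numInputs;
    final List<double[]> hiddenW = new ArrayList<>(); // incoming weights per unit (frozen once trained)
    final List<Double> outW = new ArrayList<>();      // one output weight per input, then per hidden unit
    double outBias = 0.0;

    CascadeSketch(int numInputs) {
        this.numInputs = numInputs;
        for (int i = 0; i < numInputs; i++) outW.add(0.1);
    }

    // Activations of all cascade units; unit k sees the inputs and units 0..k-1.
    double[] hiddenActs(double[] x) {
        double[] h = new double[hiddenW.size()];
        for (int k = 0; k < h.length; k++) {
            double[] v = hiddenW.get(k);
            double s = v[v.length - 1];                                // bias
            for (int i = 0; i < numInputs; i++) s += v[i] * x[i];
            for (int j = 0; j < k; j++) s += v[numInputs + j] * h[j];
            h[k] = sigmoid(s);
        }
        return h;
    }

    double predict(double[] x) {
        double[] h = hiddenActs(x);
        double s = outBias;
        for (int i = 0; i < numInputs; i++) s += outW.get(i) * x[i];
        for (int j = 0; j < h.length; j++) s += outW.get(numInputs + j) * h[j];
        return sigmoid(s);
    }

    // Step 1: train the initial perceptron (output weights only).
    void trainPerceptron(double[][] xs, double[] ts, int epochs, double eta) {
        for (int e = 0; e < epochs; e++)
            for (int n = 0; n < xs.length; n++) {
                double y = predict(xs[n]);
                double delta = (y - ts[n]) * y * (1 - y);
                for (int i = 0; i < numInputs; i++)
                    outW.set(i, outW.get(i) - eta * delta * xs[n][i]);
                outBias -= eta * delta;
            }
    }

    // Freeze the existing network implicitly (its weights are never touched
    // again) and train only the new unit's weights and its output connection.
    void addAndTrainUnit(double[][] xs, double[] ts, int epochs, double eta) {
        int k = hiddenW.size();
        double[] v = new double[numInputs + k + 1];
        for (int i = 0; i < v.length; i++) v[i] = 0.1;
        hiddenW.add(v);
        outW.add(0.1);
        int wIdx = outW.size() - 1;
        for (int e = 0; e < epochs; e++)
            for (int n = 0; n < xs.length; n++) {
                double[] h = hiddenActs(xs[n]);
                double y = predict(xs[n]);
                double dOut = (y - ts[n]) * y * (1 - y);
                double hk = h[k];
                double w = outW.get(wIdx);
                outW.set(wIdx, w - eta * dOut * hk);            // output connection
                double dHid = dOut * w * hk * (1 - hk);         // back through new unit
                for (int i = 0; i < numInputs; i++) v[i] -= eta * dHid * xs[n][i];
                for (int j = 0; j < k; j++) v[numInputs + j] -= eta * dHid * h[j];
                v[v.length - 1] -= eta * dHid;                  // bias of new unit
            }
    }

    static double tuningError(CascadeSketch net, double[][] xs, double[] ts) {
        double sse = 0;
        for (int n = 0; n < xs.length; n++) {
            double d = net.predict(xs[n]) - ts[n];
            sse += d * d;
        }
        return sse / xs.length;
    }

    // Add units while the tuning error has decreased within the last 3 nets;
    // return the tuning-error history so the best network can be identified.
    static List<Double> run(CascadeSketch net, double[][] tr, double[] trT,
                            double[][] tu, double[] tuT, int maxUnits) {
        List<Double> errs = new ArrayList<>();
        net.trainPerceptron(tr, trT, 200, 0.5);
        errs.add(tuningError(net, tu, tuT));
        double best = errs.get(0);
        int sinceBest = 0;
        while (sinceBest < 3 && net.hiddenW.size() < maxUnits) {
            net.addAndTrainUnit(tr, trT, 200, 0.5);
            double e = tuningError(net, tu, tuT);
            errs.add(e);
            if (e < best) { best = e; sinceBest = 0; } else { sinceBest++; }
        }
        return errs;
    }
}
```

In your actual implementation the freezing, unit creation, and training would go through the MultilayerPerceptron code rather than hand-rolled arrays; the point of the sketch is the shape of the loop and that only the newest unit's weights change at each step.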
You should base your method on the MultilayerPerceptron code. You should extract the Java code for MLPs from the weka-src.jar file (which should have been included when you downloaded Weka).

Once you get your code working, test it on the datasets you used in program 1. Report how many hidden units are selected for the best network, and compare the performance of the best network with that of a perceptron learner.

To Hand In

  1. Comment your code and hand in a copy of the code.
  2. Show the results of a perceptron learner and the best Cascade Correlation learner (make sure to indicate how many hidden units were used and report the tuning set error for each of the networks tested).