The Cascade-Correlation Learning Architecture

Scott Fahlman and Christian Lebiere

Part of Advances in Neural Information Processing Systems 2 (NIPS 1989)
ABSTRACT

Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature-detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
1 DESCRIPTION OF CASCADE-CORRELATION

The most important problem preventing the widespread application of artificial neural networks to real-world problems is the slowness of existing learning algorithms such as back-propagation (or "backprop"). One factor contributing to that slowness is what we call the moving target problem: because all of the weights in the network are changing at once, each hidden unit sees a constantly changing environment. Instead of moving quickly to assume useful roles in the overall problem solution, the hidden units engage in a complex dance with much wasted motion. The Cascade-Correlation learning algorithm was developed in an attempt to solve that problem. In the problems we have examined, it learns much faster than back-propagation and solves some other problems as well.
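To make the growth procedure concrete, the following is a minimal NumPy sketch, not the authors' implementation: it uses plain gradient steps where the paper uses Quickprop, trains a single candidate unit rather than a pool of candidates, and the function names (train_outputs, train_candidate, cascade_correlation) are illustrative only. It shows the two alternating phases: training the output weights on the current feature set, then training and freezing one new hidden unit whose activation covaries maximally with the remaining output error.

```python
# Minimal sketch of the Cascade-Correlation growth loop (assumptions noted above).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_outputs(H, T, epochs=500, lr=0.5):
    """Train output weights on the current feature columns H (inputs, bias,
    and any installed hidden units); return weights and residual errors."""
    W = np.zeros((H.shape[1], T.shape[1]))
    for _ in range(epochs):
        Y = sigmoid(H @ W)
        E = Y - T                                  # residual error per output
        W -= lr * H.T @ (E * Y * (1 - Y)) / len(H)
    return W, sigmoid(H @ W) - T

def train_candidate(H, E, epochs=500, lr=0.5):
    """Adjust a candidate unit's input weights to maximize S, the summed
    magnitude of covariance between its activation V and the residual
    error E (the paper's correlation measure, up to normalization)."""
    w = np.random.randn(H.shape[1]) * 0.1
    for _ in range(epochs):
        V = sigmoid(H @ w)
        Vc = V - V.mean()
        Ec = E - E.mean(axis=0)
        sign = np.sign(Vc @ Ec)                    # d|S_o|/dS_o per output o
        grad = H.T @ ((Ec @ sign) * V * (1 - V))   # dS/dw, mean shift ignored
        w += lr * grad / len(H)                    # ascend to maximize S
    return w

def cascade_correlation(X, T, max_hidden=8, tol=0.05):
    H = np.hstack([X, np.ones((len(X), 1))])       # inputs plus bias unit
    for _ in range(max_hidden):
        W, E = train_outputs(H, T)
        if np.mean(E ** 2) < tol:
            break
        w = train_candidate(H, E)                  # tune, then freeze forever
        H = np.hstack([H, sigmoid(H @ w)[:, None]])  # install as new column
    return train_outputs(H, T)[0], H               # final output-weight pass
```

Note how the cascade arises naturally: each installed unit's activations become one more column of H, so every later candidate receives input from all original inputs and all previously frozen units. On a toy problem such as XOR, this sketch should need only one or two installed units before the error tolerance is met.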