Open Access
Reducing Network Depth in the Cascade-Correlation Learning Architecture
Author(s) - Shumeet Baluja, Scott E. Fahlman
Publication year - 1994
Publication title - CiteSeerX (The Pennsylvania State University)
Language(s) - English
Resource type - Reports
DOI - 10.21236/ada289352
Subject(s) - cascade, correlation, architecture, computer science, artificial intelligence, mathematics, geography, engineering, geometry, archaeology, chemical engineering
Abstract - The Cascade-Correlation learning algorithm constructs a multi-layer artificial neural network as it learns to perform a given task. The resulting network's size and topology are chosen specifically for this task. In the resulting 'cascade' networks, each new hidden unit receives incoming connections from all input and pre-existing hidden units. In effect, each new unit adds a new layer to the network. This allows Cascade-Correlation to create complex feature detectors, but it typically results in a network that is deeper, in terms of the longest path from input to output, than is necessary to solve the problem efficiently. In this paper we investigate a simple variation of Cascade-Correlation that will build deep nets if necessary, but that is biased toward minimizing network depth. We demonstrate empirically, across a range of problems, that this simple technique can reduce network depth, often dramatically. However, we show that this technique does not, in general, reduce the total number of weights or improve the generalization ability of the resulting networks.
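The abstract's key structural point is that standard Cascade-Correlation connects each new hidden unit to all inputs and all earlier hidden units, so every unit starts a new layer and depth grows linearly with the number of units. A minimal sketch of that depth bookkeeping, contrasted with a depth-biased variant, is below; the `layer_width` cap is an illustrative assumption for the sketch, not the paper's actual mechanism.

```python
def cascade_depths(n_hidden):
    """Standard Cascade-Correlation: each new hidden unit receives input
    from all network inputs and all earlier hidden units, so unit i sits
    one layer deeper than unit i-1 (inputs are at depth 0)."""
    return [i + 1 for i in range(n_hidden)]

def depth_biased_depths(n_hidden, layer_width):
    """Hypothetical depth-biased variant: allow up to `layer_width` units
    to share a layer before starting a new one, so depth grows more slowly
    (an assumption for illustration only)."""
    return [i // layer_width + 1 for i in range(n_hidden)]

# With 8 hidden units, the pure cascade is 8 layers deep, while the
# depth-biased sketch with 4 units per layer is only 2 layers deep.
print(max(cascade_depths(8)))            # → 8
print(max(depth_biased_depths(8, 4)))    # → 2
```

The total number of units (and hence, roughly, candidate weights) is the same in both cases, which mirrors the abstract's finding that the variant reduces depth but not, in general, the total weight count.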
