Open Access
Searching Most Efficient Neural Network Architecture Using Akaike's Information Criterion (AIC)
Author(s) -
Gaurang Panchal,
Amit Ganatra,
Y.P. Kosta,
Devyani Panchal
Publication year - 2010
Publication title -
international journal of computer applications
Language(s) - English
Resource type - Journals
ISSN - 0975-8887
DOI - 10.5120/126-242
Subject(s) - akaike information criterion, computer science, artificial neural network, architecture, artificial intelligence, bayesian information criterion, data mining, information retrieval, machine learning, art, visual arts
The problem of model selection is important for achieving higher levels of generalization in supervised learning. Neural networks are widely used in engineering applications because of their good generalization properties. An ensemble neural network algorithm based on the Akaike information criterion (AIC) is proposed. Ecologists have long relied on hypothesis testing to include or exclude variables in models, although the conclusions often depend on the approach used. The advent of methods based on information theory, also known as information-theoretic approaches, has changed the way model selection is viewed. The Akaike information criterion (AIC) has been used successfully in model selection. Because of a neural network's strong nonlinearity, it is not easy to decide its optimal size. We discuss problems with commonly used information criteria and propose a model selection method.
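To illustrate AIC-based architecture selection, here is a minimal sketch (not the authors' exact algorithm; the data, network, and training loop are hypothetical assumptions for demonstration). Single-hidden-layer networks of increasing width are trained on the same data, and the width with the lowest AIC is kept. For Gaussian-error regression, AIC can be computed as n·ln(RSS/n) + 2k, where k is the number of free parameters:

```python
import numpy as np

def aic(rss, n, k):
    # AIC for Gaussian-error regression: n * ln(RSS / n) + 2k
    return n * np.log(rss / n) + 2 * k

def train_mlp(X, y, hidden, epochs=2000, lr=0.05, seed=0):
    # Train a 1-hidden-layer tanh network by full-batch gradient descent.
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    n = len(X)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)          # hidden activations
        err = (H @ W2 + b2) - y           # prediction error
        gW2 = H.T @ err / n; gb2 = err.mean(0)
        dH = (err @ W2.T) * (1.0 - H**2)  # backprop through tanh
        gW1 = X.T @ dH / n;  gb1 = dH.mean(0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    H = np.tanh(X @ W1 + b1)
    rss = float((((H @ W2 + b2) - y) ** 2).sum())
    k = W1.size + b1.size + W2.size + b2.size  # free parameters
    return rss, k

# Synthetic regression task (assumed for illustration only).
rng = np.random.default_rng(1)
X = np.linspace(-3, 3, 80).reshape(-1, 1)
y = np.sin(X) + 0.1 * rng.normal(size=(80, 1))

# Score each candidate width; the lowest AIC balances fit against size.
scores = {}
for h in (1, 2, 4, 8):
    rss, k = train_mlp(X, y, h)
    scores[h] = aic(rss, len(X), k)
best = min(scores, key=scores.get)
```

The penalty term 2k is what distinguishes this from plain error minimization: a wider network will usually lower RSS, but AIC only prefers it when the improvement outweighs the added parameters.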
