Overtraining in back‐propagation neural networks: A CRT color calibration example
Author(s) - Alman, David H.; Liao, Ningfang
Publication year - 2002
Publication title - Color Research and Application
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.393
H-Index - 62
eISSN - 1520-6378
pISSN - 0361-2317
DOI - 10.1002/col.10027
Subject(s) - overtraining , calibration , artificial neural network , artificial intelligence , training set , computer science , backpropagation , experimental data , test data , machine learning , pattern recognition (psychology) , mathematics , statistics
Overtraining of a back‐propagation neural network occurs when excessive model degrees of freedom are used in network training. A CRT color calibration experiment was performed to illustrate methods for avoiding an overtrained condition during model development. Cross‐validation, in which the experimental data are split into a parameter‐training set and an independent‐test set, is advocated. © 2002 Wiley Periodicals, Inc. Col Res Appl, 27, 122–125, 2002; DOI 10.1002/col.10027
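The abstract gives no code, but the cross-validation idea it describes (holding out an independent test set to detect when added model degrees of freedom stop improving generalization) can be sketched as follows. This is a minimal illustration only, assuming synthetic RGB-to-tristimulus-like data, a scikit-learn MLPRegressor, and hidden-layer sizes chosen for demonstration; it is not the authors' experimental setup.

# Minimal sketch of cross-validation to detect overtraining.
# Data, network sizes, and library choice are assumptions for illustration,
# not the original CRT calibration experiment.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Hypothetical stand-in for measured CRT data: RGB drive values mapped
# through a gamma-like nonlinearity plus measurement noise.
X = rng.uniform(0.0, 1.0, size=(200, 3))
Y = X ** 2.2 + rng.normal(0.0, 0.01, size=(200, 3))

# Split into a parameter-training set and an independent-test set,
# as the abstract advocates.
X_train, X_test, Y_train, Y_test = train_test_split(
    X, Y, test_size=0.3, random_state=0
)

for hidden in (2, 8, 64, 256):  # increasing model degrees of freedom
    net = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=5000, random_state=0)
    net.fit(X_train, Y_train)
    train_err = mean_squared_error(Y_train, net.predict(X_train))
    test_err = mean_squared_error(Y_test, net.predict(X_test))
    # Overtraining shows up when training error keeps falling while the
    # independent-test error stops improving or begins to rise.
    print(f"hidden={hidden:4d}  train MSE={train_err:.5f}  test MSE={test_err:.5f}")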