LEARNING MONOTONIC‐CONCAVE INTERVAL CONCEPTS USING THE BACK‐PROPAGATION NEURAL NETWORKS
Author(s) - Wang Shouhong
Publication year - 1996
Publication title - Computational Intelligence
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.353
H-Index - 52
eISSN - 1467-8640
pISSN - 0824-7935
DOI - 10.1111/j.1467-8640.1996.tb00262.x
Subject(s) - monotonic function, artificial neural network, interval (graph theory), computer science, domain (mathematical analysis), artificial intelligence, backpropagation, machine learning, mathematics, combinatorics, mathematical analysis
Monotonicity and concavity play important roles in human cognition, reasoning, and decision making. This paper shows that neural networks can learn monotonic‐concave interval concepts from real‐world data. Traditionally, the training of neural networks has been based only on raw data; when the training samples carry statistical fluctuations, the resulting networks have often suffered. This paper suggests that global knowledge about the monotonicity and concavity of a problem domain can be incorporated into neural network training. It proposes a learning scheme for back‐propagation layered neural networks that learns monotonic‐concave interval concepts, and it provides an example of its application.
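The paper's exact learning scheme is not reproduced in this abstract. As an illustration only, one common way to inject such global shape knowledge into back‐propagation training is to add a differentiable penalty term that is zero when the network's outputs over a sampling grid are monotone and concave, and positive otherwise; the penalty below (function name and discretization are assumptions, not the author's method) uses first and second differences of the outputs:

```python
import numpy as np

def shape_penalty(y):
    """Illustrative shape-knowledge penalty (not the paper's scheme).

    Zero iff the sampled outputs y are nondecreasing (monotone) with
    nonincreasing increments (concave); grows smoothly with violations,
    so it could be added to the usual data loss during back-propagation.
    """
    dy = np.diff(y)    # first differences: should be >= 0 for monotonicity
    d2y = np.diff(dy)  # second differences: should be <= 0 for concavity
    return np.sum(np.minimum(dy, 0.0) ** 2) + np.sum(np.maximum(d2y, 0.0) ** 2)

x = np.linspace(0.0, 1.0, 50)
print(shape_penalty(np.sqrt(x)))     # monotone concave curve: prints 0.0
print(shape_penalty(np.sin(6 * x)))  # oscillating curve: positive penalty
```

Because the penalty is built from squared hinge terms, it contributes gradients only where the shape constraints are violated, leaving the fit to the raw data untouched elsewhere.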