Open Access
Artificial Neural Network Performance Boost using Probabilistic Recovery with Fast Cascade Training
Author(s) -
Andreas Maniatopoulos,
Alexandros Gazis,
Venetis P. Pallikaras,
Nikolaos Mitianoudis
Publication year - 2020
Publication title -
International Journal of Circuits, Systems and Signal Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.156
H-Index - 13
ISSN - 1998-4464
DOI - 10.46300/9106.2020.14.110
Subject(s) - mnist database, artificial neural network, computer science, artificial intelligence, machine learning, field (mathematics), cascade, probabilistic logic, pattern recognition (psychology), data mining, engineering, mathematics, chemical engineering, pure mathematics
Pattern Recognition and Classification is considered one of the most promising applications in the scientific field of Artificial Neural Networks (ANN). However, despite vast scientific advances in almost every aspect of the underlying technology and mathematics, neural networks still need to be fairly large and complex (i.e., deep) in order to provide robust results. In this article, we propose a novel ANN architecture that combines two fairly small neural networks by means of an introduced probability term of correct classification. A second ANN reclassifies the potentially incorrect results, using the most probable error-free results, together with their predicted labels, as additional training data. The proposed method achieves a rapid decrease in the mean square error compared to other large and complex ANN architectures with a similar execution time. Our approach demonstrates increased effectiveness on a range of classification problems: Wine, Iris, the Modified National Institute of Standards and Technology (MNIST) database, the Canadian Institute for Advanced Research (Cifar32) dataset, and Fashion-MNIST.
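The two-stage idea in the abstract, keeping the first model's confident predictions as pseudo-labeled training data and letting a second model reclassify the rest, can be sketched roughly as follows. This is a minimal illustration, not the paper's actual architecture: the logistic-regression "small networks", the synthetic two-class data, and the 0.9 confidence threshold are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    # Synthetic, linearly separable 2-D stand-in for the paper's datasets.
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    return X, y

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def train_model(X, y, steps=500, lr=0.5):
    # A logistic regression plays the role of one "fairly small network".
    Xb = np.hstack([X, np.ones((len(X), 1))])   # add bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = sigmoid(Xb @ w)
        w -= lr * Xb.T @ (p - y) / len(y)       # gradient step on log-loss
    return w

def predict_proba(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return sigmoid(Xb @ w)

X_train, y_train = make_data(200)
X_test, y_test = make_data(200)

# Stage 1: the first small model classifies everything and attaches a
# probability-of-correct-classification term to each prediction.
w1 = train_model(X_train, y_train)
p = predict_proba(w1, X_test)
conf = np.maximum(p, 1 - p)
sure = conf >= 0.9                      # "most probable error-free" results
pseudo_labels = (p >= 0.5).astype(int)

# Stage 2: a second model is trained on the original training data plus the
# confident stage-1 predictions with their predicted (pseudo) labels ...
X_aug = np.vstack([X_train, X_test[sure]])
y_aug = np.concatenate([y_train, pseudo_labels[sure]])
w2 = train_model(X_aug, y_aug)

# ... and reclassifies only the low-confidence samples.
final = pseudo_labels.copy()
final[~sure] = (predict_proba(w2, X_test[~sure]) >= 0.5).astype(int)

acc = (final == y_test).mean()
```

On this toy separable problem both stages converge easily; the point of the sketch is only the data flow of the cascade, where the confidence term decides which predictions are recycled as training data and which are deferred to the second model.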
