Open Access
Efficiency of Multilayer Perceptron Neural Networks Powered by Multi-Verse Optimizer
Author(s) - Moahaimen Talib, Faruq Mohammad
Publication year - 2019
Publication title - International Journal of Computer Applications
Language(s) - English
Resource type - Journals
ISSN - 0975-8887
DOI - 10.5120/ijca2019919340
Subject(s) - computer science , artificial neural network , multilayer perceptron , artificial intelligence , machine learning
Artificial neural network models are applied to many problems because of their computational power, and the multilayer perceptron (MLP) is a widely used paradigm. An MLP must be trained before use, and the training phase is an obstacle in forming the solution model. The back-propagation algorithm, among other approaches, has been used for training; its disadvantage is that it can become trapped in a local minimum of the training error instead of reaching the global minimum. Recently, many metaheuristic methods have been developed to overcome this problem. In this work, an approach to training an MLP with the Multi-Verse Optimizer (MVO) is proposed. Applying this approach to seven datasets and comparing the obtained results with six other metaheuristic techniques shows that MVO outperforms the other competitors in training MLPs.
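To make the idea concrete, here is a minimal sketch (not the paper's code) of training an MLP with an MVO-style metaheuristic: each candidate weight vector is treated as a "universe", the training error plays the role of the inflation rate, and universes exchange dimensions (white/black-hole tunnels) and jump toward the best universe via wormholes. The network size, dataset (XOR), parameter names such as `wep` and `tdr`, and the simplified selection scheme are all our own illustrative assumptions.

```python
# Sketch: training a tiny 2-4-1 MLP on XOR with a simplified
# Multi-Verse Optimizer (MVO)-style population loop. All names and
# hyperparameters here are illustrative assumptions, not the paper's.
import math
import random

random.seed(0)

# Toy XOR dataset: 2 inputs -> 1 target output.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

HIDDEN = 4
DIM = 4 * HIDDEN + 1  # input weights + hidden biases + output weights + output bias

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, x))))

def forward(w, x):
    """Forward pass of a 2-HIDDEN-1 MLP whose weights are flattened in w."""
    h = []
    for j in range(HIDDEN):
        s = w[2 * j] * x[0] + w[2 * j + 1] * x[1] + w[2 * HIDDEN + j]
        h.append(sigmoid(s))
    off = 3 * HIDDEN
    out = sum(w[off + j] * h[j] for j in range(HIDDEN)) + w[4 * HIDDEN]
    return sigmoid(out)

def mse(w):
    """Training error used as the 'inflation rate' of a universe."""
    return sum((forward(w, x) - y) ** 2 for x, y in DATA) / len(DATA)

def mvo_train(pop_size=30, iters=300, lb=-10.0, ub=10.0):
    universes = [[random.uniform(lb, ub) for _ in range(DIM)]
                 for _ in range(pop_size)]
    best = min(universes, key=mse)[:]  # best universe found so far (copy)
    for t in range(1, iters + 1):
        # Wormhole existence probability grows; travel distance rate shrinks.
        wep = 0.2 + t * (1.0 - 0.2) / iters
        tdr = 1.0 - t ** (1 / 6) / iters ** (1 / 6)
        fits = [mse(u) for u in universes]
        order = sorted(range(pop_size), key=lambda i: fits[i])
        if fits[order[0]] < mse(best):
            best = universes[order[0]][:]
        total = sum(fits) or 1.0
        for rank, i in enumerate(order):
            if rank == 0:
                continue  # elitism: keep the best universe untouched
            for d in range(DIM):
                # White/black-hole exchange: worse universes import more
                # dimensions from better ones (simplified roulette choice).
                if random.random() < fits[i] / total:
                    j = random.choice(order[:max(1, pop_size // 2)])
                    universes[i][d] = universes[j][d]
                # Wormhole: jump to a point near the best universe.
                if random.random() < wep:
                    step = tdr * ((ub - lb) * random.random() + lb)
                    universes[i][d] = best[d] + (step if random.random() < 0.5 else -step)
                    universes[i][d] = max(lb, min(ub, universes[i][d]))
    return best

best = mvo_train()
print("final training MSE:", round(mse(best), 4))
```

Because the best-so-far universe is never discarded, the training error is non-increasing over iterations; this is the same gradient-free property that lets MVO sidestep the local minima that trap back-propagation.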
