Open Access
Multilayer Perceptron optimization through Simulated Annealing and Fast Simulated Annealing
Author(s) -
Pedro Henrique Cardoso Camelo,
Rafael Lima de Carvalho
Publication year - 2020
Publication title -
Academic Journal on Computing, Engineering and Applied Mathematics
Language(s) - English
Resource type - Journals
ISSN - 2675-3588
DOI - 10.20873/ajceam.v1i2.9474
Subject(s) - simulated annealing , mnist database , computer science , artificial neural network , artificial intelligence , hyperparameter optimization , metaheuristic , classifier (uml) , machine learning , perceptron , multilayer perceptron , adaptive simulated annealing , pattern recognition (psychology) , support vector machine
The Multilayer Perceptron (MLP) is a classic and widely used neural network model in machine learning applications. Like most classifiers, MLPs need well-chosen parameters to produce optimized results. Machine learning engineers commonly tune hyper-parameters with grid search, which requires re-training the model for every candidate configuration. In this work, we report a computational experiment that uses the metaheuristics Simulated Annealing (SA) and Fast Simulated Annealing (FastSA) to optimize MLP hyper-parameters. In the reported experiment, the metaheuristics optimize two aspects of the model: the configuration of the neural network layers and its neuron weights. The experiment compares the best MLPs produced by SA and FastSA, using accuracy and classifier complexity as comparison measures. The MLPs are optimized to produce a classifier for the MNIST database. The experiment showed that FastSA produced a better MLP, using less computational time and fewer fitness evaluations.
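The abstract does not give the authors' implementation, but the general SA loop it refers to can be sketched as follows. This is a minimal illustration, not the paper's method: the `fitness` function below is a hypothetical stand-in for the validation error of an MLP with a given hidden-layer size, and all ranges and schedule constants are assumptions for the sake of the example.

```python
import math
import random

def fitness(hidden_neurons):
    # Hypothetical stand-in for the validation error of an MLP with this
    # many hidden neurons; in the paper this would involve training and
    # evaluating the network on MNIST.
    return (hidden_neurons - 48) ** 2 / 1000.0

def simulated_annealing(t0=10.0, cooling=0.95, steps=200, seed=0):
    rng = random.Random(seed)
    current = rng.randint(1, 128)          # initial hidden-layer size (assumed range)
    cost = fitness(current)
    best, best_cost = current, cost
    t = t0
    for _ in range(steps):
        # Propose a neighbouring configuration by a small random perturbation.
        candidate = min(128, max(1, current + rng.randint(-8, 8)))
        cand_cost = fitness(candidate)
        # Always accept improvements; accept worse moves with the
        # Boltzmann probability exp(-delta / T).
        if cand_cost < cost or rng.random() < math.exp((cost - cand_cost) / t):
            current, cost = candidate, cand_cost
        if cost < best_cost:
            best, best_cost = current, cost
        t *= cooling                        # geometric cooling schedule
    return best, best_cost
```

Fast Simulated Annealing (Szu and Hartley's variant) differs from this classic loop mainly in its faster cooling schedule, of the form T(k) = T0 / (1 + k), combined with a heavier-tailed (Cauchy-like) proposal distribution, which is what allows it to reach good solutions with fewer fitness evaluations.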
