ABC_LSTM: Optimizing Parameters of Deep LSTM using ABC Algorithm for Big Datasets
Author(s) - Shweta Mittal, Om Prakash Sangwan
Publication year - 2020
Publication title - International Journal of Engineering and Advanced Technology
Language(s) - English
Resource type - Journals
ISSN - 2249-8958
DOI - 10.35940/ijeat.d7649.069520
Subject(s) - computer science, simulated annealing, ant colony optimization algorithms, artificial intelligence, recurrent neural network, algorithm, artificial neural network, task (project management), genetic algorithm, long short term memory, machine learning, management, economics
The Long Short-Term Memory (LSTM) network is a variant of the Recurrent Neural Network (RNN) widely used across domains, particularly for sequence prediction tasks. In deep networks, the number of hidden layers is high and the time complexity of the network therefore increases. Moreover, as datasets grow in size, it becomes very difficult to tune these complex networks manually, since a single run may take several days or weeks. Thus, to minimize the time required to run the algorithm and to achieve better accuracy, the task of tuning the network's parameters needs to be automated. To tune network parameters automatically, researchers have applied numerous metaheuristic approaches in the past, such as Ant Colony Optimization, Genetic Algorithms, and Simulated Annealing, which provide near-optimal solutions. In the proposed ABC_LSTM algorithm, the traditional Artificial Bee Colony algorithm is implemented to optimize the number of hidden neurons of an LSTM network with two hidden layers. Based on the experimental results, it can be concluded that, up to a certain point, increasing the number of bees and iterations yields the solution with the lowest MAE value, thereby improving the accuracy of the model.
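Since the abstract describes the procedure without giving code, the following is a minimal, self-contained Python sketch of the traditional ABC loop applied to the two hidden-layer sizes. The colony size, abandonment limit, iteration budget, bounds on neuron counts, and the synthetic objective function (a stand-in for training the two-hidden-layer LSTM and measuring validation MAE, which is what the paper actually evaluates) are all illustrative assumptions, not values taken from the paper.

```python
import random

# Search space: number of hidden neurons in each of the two LSTM hidden
# layers. All hyperparameters here (bounds, colony size, abandonment
# limit, iteration budget) are illustrative assumptions, not the paper's.
LOW, HIGH = 4, 256
DIMS = 2          # one neuron count per hidden layer
N_SOURCES = 10    # food sources == employed bees
LIMIT = 5         # failed trials before a source is abandoned to a scout
MAX_ITERS = 30

def objective(neurons):
    """Stand-in for the real evaluation: build a 2-hidden-layer LSTM with
    `neurons` units per layer, train it, and return validation MAE.
    A synthetic bowl with its minimum at (64, 32) substitutes for that
    expensive step so the sketch runs on its own."""
    return (neurons[0] - 64) ** 2 + (neurons[1] - 32) ** 2

def fitness(mae):
    return 1.0 / (1.0 + mae)  # standard ABC fitness for minimisation

def random_source():
    return [random.randint(LOW, HIGH) for _ in range(DIMS)]

def neighbour(src, partner):
    # Perturb one randomly chosen dimension toward/away from a partner.
    cand = src[:]
    j = random.randrange(DIMS)
    phi = random.uniform(-1.0, 1.0)
    cand[j] = round(src[j] + phi * (src[j] - partner[j]))
    cand[j] = max(LOW, min(HIGH, cand[j]))  # keep neuron counts in range
    return cand

def try_improve(i, sources, maes, trials):
    # Greedy selection between a source and one neighbouring candidate.
    k = random.choice([x for x in range(N_SOURCES) if x != i])
    cand = neighbour(sources[i], sources[k])
    c_mae = objective(cand)
    if c_mae < maes[i]:
        sources[i], maes[i], trials[i] = cand, c_mae, 0
    else:
        trials[i] += 1

sources = [random_source() for _ in range(N_SOURCES)]
maes = [objective(s) for s in sources]
trials = [0] * N_SOURCES

for _ in range(MAX_ITERS):
    # Employed-bee phase: local search around every food source.
    for i in range(N_SOURCES):
        try_improve(i, sources, maes, trials)
    # Onlooker-bee phase: revisit sources in proportion to their fitness.
    weights = [fitness(m) for m in maes]
    for _ in range(N_SOURCES):
        i = random.choices(range(N_SOURCES), weights=weights)[0]
        try_improve(i, sources, maes, trials)
    # Scout-bee phase: replace sources that stopped improving.
    for i in range(N_SOURCES):
        if trials[i] > LIMIT:
            sources[i] = random_source()
            maes[i] = objective(sources[i])
            trials[i] = 0

best = min(range(N_SOURCES), key=lambda i: maes[i])
print("best hidden-layer sizes:", sources[best], "MAE:", maes[best])
```

In a real run, the synthetic bowl in objective would be replaced by training the two-hidden-layer LSTM on the dataset and returning its validation MAE; everything else in the loop is the standard employed/onlooker/scout structure of the Artificial Bee Colony algorithm that the abstract names.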
