Dynamical recollection of interconnected neural networks using meta‐heuristics
Author(s) -
Kuremoto Takashi,
Watanabe Shun,
Kobayashi Kunikazu,
Feng LiangBing,
Obayashi Masanao
Publication year - 2012
Publication title -
Electronics and Communications in Japan
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.131
H-Index - 13
eISSN - 1942-9541
pISSN - 1942-9533
DOI - 10.1002/ecj.11372
Subject(s) - bidirectional associative memory , content addressable memory , computer science , recall , artificial neural network , particle swarm optimization , chaotic , genetic algorithm , hopfield network , heuristic , artificial intelligence , association (psychology) , sequence (biology) , heuristics , associative property , memory model , algorithm , machine learning , mathematics , psychology , shared memory , biology , pure mathematics , genetics , operating system , psychotherapist , cognitive psychology
Interconnected recurrent neural networks are well known for their ability to perform associative memory of characteristic patterns. For example, the traditional Hopfield network (HN) can recall stored patterns stably, and Aihara's chaotic neural network (CNN) can realize dynamical recollection of a sequence of patterns. In this paper, we propose to use meta‐heuristic (MH) methods such as particle swarm optimization (PSO) and the genetic algorithm (GA) to improve traditional associative memory systems. For the CNN, PSO or GA finds optimal parameters that accelerate the recollection process and raise the rate of successful recollection; for the HN, an optimized bias current is calculated that extends the network to dynamical association of a series of patterns. Simulations of binary pattern association showed the effectiveness of the proposed methods. © 2012 Wiley Periodicals, Inc. Electron Comm Jpn, 95(6): 12–23, 2012; Published online in Wiley Online Library (wileyonlinelibrary.com). DOI 10.1002/ecj.11372
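To make the approach concrete, the following is a minimal Python sketch of one half of it: a free-running Aihara-style chaotic neural network whose parameters are tuned by a plain global-best PSO so that a free run revisits as many stored patterns as possible. The parameter names (k_f, k_r, alpha, eps), the Hebbian storage rule, and the recall-rate fitness are illustrative assumptions for the sketch, not details taken from the paper.

```python
# Minimal sketch (not the authors' code): PSO tunes the parameters of an
# Aihara-style chaotic neural network (CNN) used as a dynamical associative
# memory. Fitness = fraction of stored patterns visited during a free run.
import numpy as np

rng = np.random.default_rng(0)

# --- stored bipolar (+/-1) patterns, Hebbian (autocorrelation) weights ---
P, N = 3, 16
patterns = rng.choice([-1.0, 1.0], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def run_cnn(params, steps=200):
    """Free-run the chaotic network; return fraction of patterns recalled."""
    k_f, k_r, alpha, eps = params
    x = rng.choice([0.0, 1.0], size=N)   # firing outputs in [0, 1]
    eta = np.zeros(N)                    # internal state: mutual coupling
    zeta = np.zeros(N)                   # internal state: refractoriness
    recalled = set()
    for _ in range(steps):
        eta = k_f * eta + W @ (2 * x - 1)   # decayed feedback input
        zeta = k_r * zeta - alpha * x       # decayed refractory term
        u = np.clip((eta + zeta) / eps, -50, 50)
        x = 1.0 / (1.0 + np.exp(-u))        # sigmoid output
        sign = np.where(x > 0.5, 1.0, -1.0)
        for p in range(P):
            if np.array_equal(sign, patterns[p]):
                recalled.add(p)             # pattern p visited at this step
    return len(recalled) / P

# --- plain global-best PSO over the four CNN parameters ---
n_particles, dim, iters = 12, 4, 30
lo = np.array([0.0, 0.0, 0.0, 0.01])        # search-range assumptions
hi = np.array([1.0, 1.0, 2.0, 1.0])
pos = rng.uniform(lo, hi, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([run_cnn(p) for p in pos])
g = pbest[pbest_f.argmax()].copy()

w_in, c1, c2 = 0.7, 1.5, 1.5                # inertia and acceleration weights
for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w_in * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([run_cnn(p) for p in pos])
    better = f > pbest_f                    # update personal and global bests
    pbest[better], pbest_f[better] = pos[better], f[better]
    g = pbest[pbest_f.argmax()].copy()

print("best (k_f, k_r, alpha, eps):", g, " recall rate:", pbest_f.max())
```

The GA variant and the Hopfield bias-current optimization described in the abstract would slot into the same frame: replace the PSO loop with a GA over the same parameter vector, or have each candidate encode a bias vector added to a standard Hopfield update and score it by the quality of the induced pattern sequence.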
