Complex‐valued multidirectional associative memory
Author(s) - Kobayashi Masaki, Yamazaki Haruaki
Publication year - 2007
Publication title - Electrical Engineering in Japan
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.136
H-Index - 28
eISSN - 1520-6416
pISSN - 0424-7760
DOI - 10.1002/eej.20387
Subject(s) - bidirectional associative memory , content addressable memory , robustness (evolution) , associative property , hopfield network , computer science , algorithm , artificial neural network , convergence (economics) , function (biology) , content addressable storage , artificial intelligence , pattern recognition (psychology) , mathematics , pure mathematics , biochemistry , chemistry , evolutionary biology , biology , economics , gene , economic growth
Abstract The Hopfield model is a representative associative memory. It was extended to the Bidirectional Associative Memory (BAM) by Kosko and to the Multidirectional Associative Memory (MAM) by Hagiwara. BAM has two layers and MAM has multiple layers; since the connections between layers are symmetric, both networks are guaranteed to converge. MAM can deal with tuples of patterns (x_1, x_2, …), where x_m is the pattern on layer m. Noest, Hirose, and Nemoto proposed the complex-valued Hopfield model, and Lee proposed a complex-valued Bidirectional Associative Memory. Zemel proved the rotation invariance of the complex-valued Hopfield model, which means that rotated versions of a stored pattern are also stored. In this paper, a complex-valued Multidirectional Associative Memory is proposed, and its rotation invariance is proved. Moreover, it is shown by computer simulation that differences between the angles of the given patterns are automatically reduced. First, we define the complex-valued Multidirectional Associative Memory. Then we define the energy function of the network and use it to prove that the network is guaranteed to converge. Next, we define the learning law and show a characteristic of the recall process: the differences between the angles of the given patterns are automatically reduced. In particular, we prove the following theorem: when only one tuple of patterns is stored, if patterns with different angles are given to the layers, the differences are automatically reduced. Finally, we investigate whether the differences of angles influence the noise robustness. They are found to reduce the noise robustness, because the input to each layer becomes small. We show this by computer simulations. © 2007 Wiley Periodicals, Inc. Electr Eng Jpn, 159(1): 39–45, 2007; Published online in Wiley InterScience ( www.interscience.wiley.com ). DOI 10.1002/eej.20387
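
To illustrate the kind of network the abstract describes, the following Python sketch implements a generic complex-valued multidirectional associative memory with conjugate outer-product (Hebbian) learning between layers and phase-quantized neurons. This is an assumed, simplified formulation for illustration only; the paper's exact learning law, activation function, and energy function are not reproduced here, and the number of phase states K, layer sizes, and update schedule are arbitrary choices.

```python
import numpy as np

# Minimal sketch of a complex-valued multidirectional associative memory.
# Assumed formulation: neuron states are K-th roots of unity, weights between
# layers are conjugate outer products, and recall is synchronous.

K = 8  # assumed number of phase states per neuron


def quantize(h):
    """Map each complex field value to the nearest K-th root of unity."""
    k = np.round(np.angle(h) * K / (2 * np.pi)) % K
    return np.exp(2j * np.pi * k / K)


def train(tuples):
    """Hebbian learning between every pair of layers: W[m][n] = sum_p x_p^(m) (x_p^(n))^H."""
    M = len(tuples[0])
    W = [[None] * M for _ in range(M)]
    for m in range(M):
        for n in range(M):
            if m != n:
                W[m][n] = sum(np.outer(t[m], np.conj(t[n])) for t in tuples)
    return W


def recall(W, states, iterations=10):
    """Synchronous recall: each layer is driven by the summed field from all other layers."""
    states = [s.copy() for s in states]
    M = len(states)
    for _ in range(iterations):
        states = [
            quantize(sum(W[m][n] @ states[n] for n in range(M) if n != m))
            for m in range(M)
        ]
    return states


# Usage: store a single tuple of random phase patterns on three layers, then rotate
# the cue on one layer by two phase steps. Because only one tuple is stored, recall
# settles on a consistent (possibly globally rotated) version of the stored tuple,
# illustrating the angle-difference reduction discussed in the abstract.
rng = np.random.default_rng(0)
stored_tuple = [np.exp(2j * np.pi * rng.integers(0, K, size=16) / K) for _ in range(3)]
W = train([stored_tuple])
cue = [stored_tuple[0] * np.exp(2j * np.pi * 2 / K),
       stored_tuple[1].copy(),
       stored_tuple[2].copy()]
result = recall(W, cue)

# Per-layer maximum phase deviation of the recalled patterns from the stored ones.
print([float(np.max(np.abs(np.angle(r * np.conj(s))))) for r, s in zip(result, stored_tuple)])
```

A continuous activation h/|h| could be used instead of phase quantization; the quantized version is shown here only because it makes the finite set of stored (and rotated) fixed points easy to inspect.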
