Memory Capacity of Networks with Stochastic Binary Synapses
Author(s) -
Alexis Dubreuil,
Yali Amit,
Nicolas Brunel
Publication year - 2014
Publication title -
PLOS Computational Biology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.628
H-Index - 182
eISSN - 1553-7358
pISSN - 1553-734X
DOI - 10.1371/journal.pcbi.1003727
Subject(s) - attractor , binary number , computer science , coding (social sciences) , artificial neural network , neural coding , theoretical computer science , statistical physics , algorithm , artificial intelligence , mathematics , physics , statistics , mathematical analysis , arithmetic
In standard attractor neural network models, specific patterns of activity are stored in the synaptic matrix, so that they become fixed point attractors of the network dynamics. The storage capacity of such networks has been quantified in two ways: the maximal number of patterns that can be stored, and the stored information measured in bits per synapse. In this paper, we compute both quantities in fully connected networks of N binary neurons with binary synapses, storing patterns with coding level f, in the large N and sparse coding limits (N → ∞, f → 0). We also derive finite-size corrections that accurately reproduce the results of simulations in networks of tens of thousands of neurons. These methods are applied to three different scenarios: (1) the classic Willshaw model, (2) networks with stochastic learning in which patterns are shown only once (one-shot learning), and (3) networks with stochastic learning in which patterns are shown multiple times. The storage capacities are optimized over network parameters, which allows us to compare the performance of the different models. We show that finite-size effects strongly reduce the capacity, even for networks of realistic sizes. We discuss the implications of these results for memory storage in the hippocampus and cerebral cortex.
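To make the setting concrete, the sketch below illustrates the classic Willshaw model mentioned in scenario (1): sparse binary patterns are stored in a binary synaptic matrix by potentiating any synapse whose pre- and post-synaptic neurons are coactive in some pattern, and retrieval is a thresholded sum of inputs. The network size, coding level, pattern count, and threshold here are illustrative choices, not parameters from the paper, and the update rule is a generic one-step synchronous recall rather than the authors' exact dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200   # number of binary neurons (illustrative, not from the paper)
f = 0.05  # coding level: fraction of active neurons per pattern
P = 10    # number of stored patterns
n_active = int(f * N)

# Generate P sparse binary patterns, each with n_active active neurons.
patterns = np.zeros((P, N), dtype=np.uint8)
for mu in range(P):
    patterns[mu, rng.choice(N, n_active, replace=False)] = 1

# Willshaw rule: a binary synapse is set to 1 if its pre- and
# post-synaptic neurons are both active in at least one stored pattern.
W = np.zeros((N, N), dtype=np.uint8)
for xi in patterns:
    W |= np.outer(xi, xi)
np.fill_diagonal(W, 0)  # no self-connections

def recall(cue, theta):
    """One synchronous update: a neuron fires if its summed input reaches theta."""
    return (W @ cue >= theta).astype(np.uint8)

# Each active neuron of a stored pattern receives input from the other
# n_active - 1 active neurons, so with theta = n_active - 1 the stored
# pattern's active units are guaranteed to stay on; errors can only be
# spurious activations caused by overlap with other stored patterns.
out = recall(patterns[0], theta=n_active - 1)
```

At low coding level and modest pattern load, spurious activations are rare, which is why the Willshaw model's capacity is naturally analyzed in the sparse coding limit the abstract refers to.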
