Associative memory model with long-tail-distributed Hebbian synaptic connections

Naoki Hiratani, Jun Nosuke Teramae, Tomoki Fukai

Research output: Contribution to journal › Article › peer-review

19 Scopus citations

Abstract

The postsynaptic potentials of pyramidal neurons have a non-Gaussian amplitude distribution with a heavy tail in both hippocampus and neocortex. Such distributions of synaptic weights were recently shown to generate spontaneous internal noise optimal for spike propagation in recurrent cortical circuits. However, whether this internal noise generation by heavy-tailed weight distributions is possible for and beneficial to other computational functions remains unknown. To clarify this point, we construct an associative memory (AM) network model of spiking neurons that stores multiple memory patterns in a connection matrix with a lognormal weight distribution. In AM networks, non-retrieved memory patterns generate a cross-talk noise that severely disturbs memory recall. We demonstrate that neurons encoding a retrieved memory pattern and those encoding non-retrieved memory patterns have different subthreshold membrane-potential distributions in our model. Consequently, the probability of responding to inputs at strong synapses increases for the encoding neurons, whereas it decreases for the non-encoding neurons. Our results imply that heavy-tailed distributions of connection weights can generate noise useful for AM recall.
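The model described above can be illustrated with a minimal sketch: a Hebbian associative memory over sparse binary patterns whose excitatory weights are resampled from a lognormal distribution, followed by simple threshold recall from a corrupted cue. All parameter values (network size, sparseness, lognormal parameters, threshold rule) are illustrative assumptions, not the paper's spiking-neuron model.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, f = 500, 5, 0.1  # neurons, stored patterns, coding sparseness (assumed values)

# Sparse binary memory patterns
patterns = (rng.random((P, N)) < f).astype(float)

# Hebbian covariance-rule connection matrix (standard sparse-AM form)
W = np.zeros((N, N))
for xi in patterns:
    W += np.outer(xi - f, xi - f)
np.fill_diagonal(W, 0.0)

# Impose a lognormal amplitude distribution on the excitatory weights
# (illustrative shortcut; the paper constructs its weights differently)
pos = W > 0
W[pos] = rng.lognormal(mean=-0.5, sigma=1.0, size=pos.sum())

# Threshold recall dynamics from a degraded cue of pattern 0
state = patterns[0].copy()
flip = rng.random(N) < 0.2                 # corrupt 20% of the cue
state[flip] = 1.0 - state[flip]
theta = np.percentile(W @ patterns[0], 100 * (1 - f))  # heuristic threshold
for _ in range(10):
    state = (W @ state > theta).astype(float)

# Fraction of pattern-0 neurons active after recall
overlap = state @ patterns[0] / patterns[0].sum()
print(f"overlap with stored pattern: {overlap:.2f}")
```

Neurons not encoding the retrieved pattern receive only the cross-talk input from the other stored patterns, so the threshold separates the two subthreshold input distributions the abstract refers to.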

Original language: English
Article number: 102
Journal: Frontiers in Computational Neuroscience
Issue number: FEB
State: Published - Feb 7 2013

Keywords

  • Attractor
  • Hippocampus
  • Integrate-and-fire
  • Mean-field
  • Stochastic resonance
  • Storage capacity
