Associative memory model with long-tail-distributed Hebbian synaptic connections

Naoki Hiratani, Jun-nosuke Teramae, Tomoki Fukai

Research output: Contribution to journal › Article › peer-review

Abstract

The postsynaptic potentials of pyramidal neurons have a non-Gaussian amplitude distribution with a heavy tail in both hippocampus and neocortex. Such distributions of synaptic weights were recently shown to generate spontaneous internal noise optimal for spike propagation in recurrent cortical circuits. However, whether this internal noise generation by heavy-tailed weight distributions is possible for and beneficial to other computational functions remains unknown. To clarify this point, we construct an associative memory network model of spiking neurons that stores multiple memory patterns in a connection matrix with a lognormal weight distribution. In associative memory networks, non-retrieved memory patterns generate a cross-talk noise that severely disturbs memory recall. We demonstrate that neurons encoding a retrieved memory pattern and those encoding non-retrieved memory patterns have different subthreshold membrane-potential distributions in our model. Consequently, the probability of responding to inputs at strong synapses increases for the encoding neurons, whereas it decreases for the non-encoding neurons. Our results imply that heavy-tailed distributions of connection weights can generate noise useful for associative memory recall.
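
The construction described in the abstract, a Hebbian connection matrix whose excitatory weights follow a lognormal, heavy-tailed distribution, can be illustrated with a minimal toy sketch. The paper's model uses spiking integrate-and-fire neurons analysed with mean-field methods; the sketch below instead uses a simplified binary, Hopfield-style network and remaps the rank order of the Hebbian weights onto lognormal draws. All parameter values (network size, pattern count, sparseness, lognormal mean and sigma) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (illustrative only, not taken from the paper)
N = 500            # number of neurons
P = 10             # number of stored memory patterns
f = 0.1            # sparseness: fraction of active neurons per pattern

# Sparse binary memory patterns xi[mu, i] in {0, 1}
xi = (rng.random((P, N)) < f).astype(float)

# Covariance-rule Hebbian matrix (Hopfield-style), no self-connections
W = (xi - f).T @ (xi - f) / N
np.fill_diagonal(W, 0.0)

# Impose a lognormal amplitude distribution on the excitatory part of the
# matrix by rank-order remapping: keep track of which connections are
# strongest, but replace their magnitudes with heavy-tailed lognormal draws.
mask = W > 0
draws = np.sort(rng.lognormal(mean=-1.0, sigma=1.0, size=int(mask.sum())))
vals = np.empty(int(mask.sum()))
vals[np.argsort(W[mask])] = draws   # k-th smallest weight gets k-th smallest draw
W_ln = np.zeros_like(W)
W_ln[mask] = vals

# Recall dynamics: start from a corrupted cue of pattern 0 and iterate a
# simple threshold rule that keeps network activity at sparseness f.
state = xi[0].copy()
flip = rng.random(N) < 0.1          # corrupt 10% of the cue
state[flip] = 1.0 - state[flip]
for _ in range(20):
    h = W_ln @ state                       # recurrent input to each neuron
    theta = np.quantile(h, 1.0 - f)        # adaptive threshold keeps activity sparse
    state = (h > theta).astype(float)

overlap = float(state @ xi[0]) / xi[0].sum()
print(f"fraction of pattern-0 neurons active after recall: {overlap:.2f}")
```

How cleanly this toy network retrieves the cue depends on the chosen parameters; the sketch is only meant to show one way a heavy-tailed weight distribution can be imposed on a Hebbian connection matrix, not to reproduce the paper's spiking-network results.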

Original language: English
Journal: Frontiers in Computational Neuroscience
Issue number: DEC
DOIs
State: Published - Dec 30 2012

Keywords

  • Excitatory postsynaptic potential
  • Hippocampus
  • Hopfield networks
  • Integrate-and-fire neurons
  • Mean-field
  • Memory retrieval
  • Recurrent neural networks
  • Spontaneous activity
  • Stochastic resonance