Synthesis of recurrent neural dynamics for monotone inclusion with application to Bayesian inference

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

We propose a top-down approach to construct recurrent neural circuit dynamics for the mathematical problem of monotone inclusion (MoI). MoI is a general optimization framework that encompasses a wide range of contemporary problems, including Bayesian inference and Markov decision-making. We show that in a recurrent neural circuit/network with Poisson neurons, each neuron's firing curve can be understood as a proximal operator of a local objective function, while the overall circuit dynamics constitutes an operator-splitting system of ordinary differential equations whose equilibrium point corresponds to the solution of the MoI problem. Our analysis thus establishes that neural circuits are a substrate for solving a broad class of computational tasks. In this regard, we provide an explicit synthesis procedure for building neural circuits for specific MoI problems and demonstrate it for the cases of Bayesian inference and sparse neural coding.
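
As an informal reading aid (not quoted from the paper; the symbols A_i, f, W, b and r for the monotone operators, local objective, recurrent weights, inputs and firing rates are illustrative, and conditions on W are omitted), the setting can be sketched as follows. The MoI problem asks for a point x^* with

0 \in \sum_i A_i(x^*),

where each A_i is a monotone operator. A firing-rate circuit of the form

\dot{r}(t) = -r(t) + \operatorname{prox}_f\bigl(W r(t) + b\bigr), \qquad \operatorname{prox}_f(v) = \arg\min_u \; f(u) + \tfrac{1}{2}\lVert u - v\rVert^2,

has equilibria satisfying r^* = \operatorname{prox}_f(W r^* + b), i.e. 0 \in \partial f(r^*) + (I - W)r^* - b, which is the fixed-point condition of an operator-splitting (forward-backward type) scheme for an inclusion of this kind.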

Original language: English
Pages (from-to): 231-241
Number of pages: 11
Journal: Neural Networks
Volume: 131
DOIs
State: Published - Nov 2020

Keywords

  • Bayesian causal inference
  • Monotone inclusion
  • Network synthesis
  • Normative approach
  • Poisson spiking neuron
  • Recurrent neural networks
