Creating functionally favorable neural dynamics by maximizing information capacity

Elham Ghazizadeh, Shi Nung Ching

Research output: Contribution to journal › Article › peer-review


A ubiquitous problem in optimization and machine learning pertains to the design of systems that enact a desired behavior in dynamical environments; a classical example is a control system that stabilizes an inverted pendulum. In this paper, we consider a complementary and less well-studied problem: the design of the environment itself. That is, can we create a dynamical system that, in a general but mathematically rigorous way, is readily ‘usable’ by an unknown agent? We are especially interested in the synthesis of neuronal dynamics that are maximally labile with respect to afferent inputs. That is, can we create neural dynamics that propagate information well? To do so, we blend ideas from control and information theories, turning specifically to the notion of empowerment, i.e., the information capacity of a dynamical system in an input-to-state sense. We devise a strategy to optimize the dynamics of a system using empowerment over its state space as an objective function. This results in dynamics that are generically conducive to information propagation. For example, the optimized environment would be expected to perform well as an encoder (of afferent input distributions). We outline the key technical innovations needed in order to perform the optimization and, by means of example, discuss emergent dynamical characteristics of systems optimized according to this principle.
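To give a concrete sense of the empowerment objective described above, the sketch below computes a standard Gaussian-input lower bound on empowerment for a linear system x_{t+1} = A x_t + B u_t + w_t, where it reduces to a log-determinant of the input-to-state channel. This is a hypothetical illustration of the general notion, not the authors' implementation; the function name, horizon, and noise/power parameters are assumptions for the example.

```python
import numpy as np

def empowerment_lower_bound(A, B, horizon=3, sigma=0.1, input_power=1.0):
    """Gaussian-input lower bound (in nats) on T-step empowerment for
    x_{t+1} = A x_t + B u_t + w_t,  w_t ~ N(0, sigma^2 I).

    Illustrative sketch only: true empowerment maximizes mutual
    information over all input distributions (e.g., via water-filling);
    here the input is fixed to isotropic Gaussian with given power.
    """
    n, _ = B.shape
    # Map from the input sequence (u_0, ..., u_{T-1}) to the final
    # state x_T: x_T = sum_t A^{T-1-t} B u_t + (noise terms).
    blocks = [np.linalg.matrix_power(A, horizon - 1 - t) @ B
              for t in range(horizon)]
    G = np.hstack(blocks)                      # n x (m * horizon)
    # Capacity of the resulting Gaussian channel with this fixed input:
    # 0.5 * log det(I + (P / sigma^2) G G^T).
    S = np.eye(n) + (input_power / sigma**2) * (G @ G.T)
    sign, logdet = np.linalg.slogdet(S)
    return 0.5 * logdet
```

Used as an objective over the state space, a quantity of this form can be maximized with respect to the system parameters (here A and B) to favor dynamics through which inputs propagate to future states with high fidelity.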

Original language: English
Pages (from-to): 285-293
Number of pages: 9
State: Published - Aug 4 2020


  • Empowerment
  • Information capacity
  • Neural dynamics


