Dopaminergic and Prefrontal Basis of Learning from Sensory Confidence and Reward Value

Armin Lak, Michael Okun, Morgane M. Moss, Harsha Gurnani, Karolina Farrell, Miles J. Wells, Charu Bai Reddy, Adam Kepecs, Kenneth D. Harris, Matteo Carandini

Research output: Contribution to journal › Article › peer-review



Deciding between stimuli requires combining their learned value with one's sensory confidence. We trained mice in a visual task that probes this combination. Mouse choices reflected not only present confidence and past rewards but also past confidence. Their behavior conformed to a model that combines signal detection with reinforcement learning. In the model, the predicted value of the chosen option is the product of sensory confidence and learned value. We found precise correlates of this variable in the pre-outcome activity of midbrain dopamine neurons and of medial prefrontal cortical neurons. However, only the latter played a causal role: inactivating medial prefrontal cortex before outcome strengthened learning from the outcome. Dopamine neurons played a causal role only after outcome, when they encoded reward prediction errors graded by confidence, influencing subsequent choices. These results reveal neural signals that combine reward value with sensory confidence and guide subsequent learning.
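The model class described in the abstract can be sketched in code. The following is an illustrative toy simulation, not the authors' fitted implementation: the learning rate, noise level, contrast set, and update rule are assumptions. It combines a signal-detection stage (confidence as the posterior probability the choice is correct, given Gaussian sensory noise) with a reinforcement-learning stage in which the predicted value of the chosen option is the product of confidence and learned value, and the confidence-graded prediction error drives learning.

```python
import math
import random

def run_task(n_trials=5000, alpha=0.2, sigma=1.0, seed=0):
    """Toy sketch of a signal-detection + reinforcement-learning model.

    Parameter names and values (alpha, sigma, the contrast set) are
    illustrative assumptions, not the paper's fitted model.
    """
    rng = random.Random(seed)
    value = {"left": 1.0, "right": 1.0}  # learned value of each action
    n_correct = 0
    for _ in range(n_trials):
        contrast = rng.choice([-0.5, -0.25, 0.25, 0.5])  # signed stimulus
        percept = contrast + rng.gauss(0.0, sigma)        # noisy evidence
        # Confidence: probability that the sign of the percept matches the
        # stimulus, under Gaussian sensory noise (signal-detection posterior).
        conf = 0.5 * (1.0 + math.erf(abs(percept) / (sigma * math.sqrt(2.0))))
        choice = "right" if percept > 0 else "left"
        # Predicted value of the chosen option: confidence x learned value.
        q_hat = conf * value[choice]
        reward = 1.0 if (choice == "right") == (contrast > 0) else 0.0
        # Confidence-graded reward prediction error drives the value update.
        rpe = reward - q_hat
        value[choice] += alpha * conf * rpe
        n_correct += reward
    return value, n_correct / n_trials
```

Run over many trials, accuracy exceeds chance and the learned values stabilize; in this scheme, low-confidence outcomes produce smaller updates, mirroring the abstract's claim that prediction errors are graded by confidence.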

Original language: English
Pages (from-to): 700-711.e6
Issue number: 4
State: Published - Feb 19, 2020


Keywords
  • Calcium imaging
  • Decision confidence
  • Electrophysiology
  • Mice
  • Optogenetics
  • Psychophysics
  • Reinforcement learning


