Abstract

Decision confidence is a forecast about the probability that a decision will be correct. From a statistical perspective, decision confidence can be defined as the Bayesian posterior probability that the chosen option is correct, given the evidence contributing to the choice. Here, we used this formal definition as a starting point to develop a normative statistical framework for decision confidence. Our goal was to make general predictions that do not depend on the structure of the noise or a specific algorithm for estimating confidence. We analytically proved several interrelations between statistical decision confidence and observable decision measures, such as evidence discriminability, choice, and accuracy. These interrelationships specify necessary signatures of decision confidence in terms of externally quantifiable variables that can be empirically tested. Our results lay the foundations for a mathematically rigorous treatment of decision confidence that can lead to a common framework for understanding confidence across different research domains, from human and animal behavior to neural representations.
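To make the formal definition concrete, the following is a minimal illustrative sketch (not the paper's derivation) assuming a simple Gaussian-evidence model with equal priors: confidence is computed as the posterior probability that the chosen option is correct, and one testable signature is checked, namely that empirical accuracy tracks confidence. The parameters mu and sigma are assumptions introduced here for illustration.

import numpy as np

# Illustrative sketch, not the authors' algorithm: confidence as the Bayesian
# posterior probability that the chosen option is correct, under an assumed
# Gaussian-evidence model with equal priors. mu and sigma are hypothetical.

rng = np.random.default_rng(0)
mu, sigma, n = 1.0, 1.0, 100_000           # assumed evidence strength and noise

stimulus = rng.choice([-1, 1], size=n)      # true category (+1 or -1)
evidence = stimulus * mu + rng.normal(0.0, sigma, size=n)  # noisy percept

choice = np.sign(evidence)                  # choose the a posteriori more likely option
correct = choice == stimulus

# Posterior probability that the choice is correct given the observed evidence;
# with equal priors this is a logistic function of the evidence magnitude.
confidence = 1.0 / (1.0 + np.exp(-2.0 * mu * np.abs(evidence) / sigma**2))

# Signature check: accuracy should increase with confidence (calibration).
bins = np.linspace(0.5, 1.0, 6)
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (confidence >= lo) & (confidence < hi)
    if mask.any():
        print(f"confidence {lo:.1f}-{hi:.1f}: accuracy {correct[mask].mean():.2f}")

In this toy setting, binned accuracy closely matches the stated confidence, which is one of the externally quantifiable signatures the framework formalizes in a noise-model-independent way.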

Original language: English
Pages (from-to): 1840-1858
Number of pages: 19
Journal: Neural Computation
Volume: 28
Issue number: 9
DOIs
State: Published - Sep 1 2016

