A uniform human multimodal dataset for emotion perception and judgment

Sai Sun, Runnan Cao, Ueli Rutishauser, Rongjun Yu, Shuo Wang

Research output: Contribution to journal › Article › peer-review



Face perception is a fundamental aspect of human social interaction, yet most research on this topic has focused on single modalities and specific aspects of face perception. Here, we present a comprehensive multimodal dataset for examining facial emotion perception and judgment. This dataset includes EEG data from 97 unique neurotypical participants across 8 experiments, fMRI data from 19 neurotypical participants, single-neuron data from 16 neurosurgical patients (22 sessions), eye tracking data from 24 neurotypical participants, behavioral and eye tracking data from 18 participants with ASD and 15 matched controls, and behavioral data from 3 rare patients with focal bilateral amygdala lesions. Notably, participants across all modalities performed the same task. Overall, this multimodal dataset provides a comprehensive exploration of facial emotion perception, emphasizing the importance of integrating multiple modalities to gain a holistic understanding of this complex cognitive process. This dataset serves as a key missing link between the human neuroimaging and neurophysiology literatures, and facilitates the study of neuropsychiatric populations.

Original language: English
Article number: 773
Journal: Scientific Data
Issue number: 1
State: Published - Dec 2023
