Cortical responses to degraded speech are modulated by linguistic predictions

Jonathan E. Peelle

Research output: Contribution to journal › Conference article › peer-review


Abstract

Our perceptual experience is formed by combining incoming sensory information with prior knowledge and expectations. When speech is not fully intelligible, non-acoustic information may be particularly important. Predictions about a degraded acoustic signal can be provided extrinsically (for example, by presenting a written cue) or intrinsically (if the speech is still partially intelligible). Here I review two studies in which the neural response to speech was measured using magnetoencephalography (MEG), with speech clarity parametrically manipulated using noise vocoding. In a study of isolated word processing, accurate predictions provided by written text enhanced subjective clarity and changed the response in early auditory processing regions of temporal cortex. In a separate study of connected speech, the phase of ongoing cortical oscillations was matched to that of the acoustic speech envelope in the range of the syllable rate (4–8 Hz). Critically, this phase-locking was enhanced in left temporal cortex when the speech was intelligible. Both experiments thus highlight neural responses in brain regions associated with relatively low-level speech perception. Together, these findings support the ability of linguistic information to provide predictions that shape auditory processing of spoken language, particularly when acoustic clarity is compromised.
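The noise vocoding mentioned in the abstract degrades speech by replacing fine spectral detail with band-limited noise while preserving each band's amplitude envelope; intelligibility is controlled parametrically by the number of channels. A minimal sketch of that procedure (the function name, channel count, and band edges below are illustrative assumptions, not taken from the studies reviewed):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def noise_vocode(speech, fs, n_channels=6, lo=100.0, hi=5000.0):
    """Noise-vocode a speech waveform (illustrative sketch).

    Splits the signal into logarithmically spaced frequency bands,
    extracts each band's amplitude envelope, and uses it to modulate
    noise filtered into the same band. Fewer channels -> coarser
    spectral detail -> less intelligible output.
    """
    edges = np.logspace(np.log10(lo), np.log10(hi), n_channels + 1)
    rng = np.random.default_rng(0)
    noise = rng.standard_normal(len(speech))
    out = np.zeros(len(speech))
    for low_f, high_f in zip(edges[:-1], edges[1:]):
        sos = butter(4, [low_f, high_f], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, speech)       # speech restricted to this band
        env = np.abs(hilbert(band))           # band's amplitude envelope
        carrier = sosfiltfilt(sos, noise)     # noise restricted to same band
        out += env * carrier                  # envelope-modulated noise
    # match overall RMS level to the input
    out *= np.sqrt(np.mean(speech**2) / np.mean(out**2))
    return out
```

With many channels (e.g. 16) the vocoded output remains largely intelligible; with very few (e.g. 1–2) it does not, which is what allows clarity to be varied parametrically in this kind of experiment.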

Original language: English
Article number: 060108
Journal: Proceedings of Meetings on Acoustics
Volume: 19
State: Published - 2013
Event: 21st International Congress on Acoustics, ICA 2013 - 165th Meeting of the Acoustical Society of America - Montreal, QC, Canada
Duration: Jun 2, 2013 – Jun 7, 2013
