Our perceptual experience is formed by combining incoming sensory information with prior knowledge and expectation. When speech is not fully intelligible, non-acoustic information may be particularly important. Predictions about a degraded acoustic signal can be provided extrinsically (for example, by presenting a written cue) or intrinsically (if the speech is still partially intelligible). Here I review two studies in which the neural response to speech was measured using magnetoencephalography (MEG), with speech clarity parametrically manipulated using noise vocoding. In a study of isolated word processing, accurate predictions provided by written text enhanced subjective clarity and changed the response in early auditory processing regions of temporal cortex. In a separate study of connected speech, the phase of ongoing cortical oscillations was matched to that of the acoustic speech envelope in the range of the syllable rate (4-8 Hz). Critically, this phase-locking was enhanced in left temporal cortex when speech was intelligible. Both experiments thus highlight neural responses in brain regions associated with relatively low-level speech perception. Together these findings support the ability of linguistic information to provide predictions that shape auditory processing of spoken language, particularly when acoustic clarity is compromised.
Journal: Proceedings of Meetings on Acoustics
State: Published - Jun 19 2013
Event: 21st International Congress on Acoustics, ICA 2013 - 165th Meeting of the Acoustical Society of America - Montreal, QC, Canada
Duration: Jun 2 2013 → Jun 7 2013