Understanding the neural systems supporting speech perception can shed light on the representations, processes, and variability in human communication. In the case of speech and language disorders, uncovering the neurological underpinnings can sometimes lead to surgical or medical treatments. Even for healthy listeners, a better understanding of the interactions among hierarchical brain systems during speech processing can deepen our understanding of perceptual and language processes, and of how these might be affected during development, following hearing loss, or in background noise. Current neurobiological frameworks largely agree on the importance of bilateral temporal cortex for processing auditory speech, with the addition of left frontal cortex for more complex linguistic structures (such as sentences). Although visual cortex is clearly important for audiovisual speech processing, there is continued debate about where and how auditory and visual signals are integrated. Studies offer evidence supporting multisensory roles for posterior superior temporal sulcus, auditory cortex, and motor cortex. Rather than a single integration mechanism, it may be that visual and auditory inputs are combined in different ways depending on the type of information being processed. Importantly, core speech regions are not always sufficient for successfully understanding spoken language. Increased linguistic complexity or acoustic challenge forces listeners to recruit additional neural systems. In many cases, compensatory activity is seen in executive and attention systems, such as the cingulo-opercular or frontoparietal networks. These patterns of increased activity appear to depend on the auditory and cognitive abilities of individual listeners, indicating a systems-level balance among neural systems that dynamically adjusts to the acoustic properties of the speech and current task demands.
Speech perception is thus a shining example of flexible neural processing and behavioral stability.