Current Projects
Research in the N.E.R.D. lab focuses on the relationship between gesture and speech and its contribution to language processing and learning in both typical and atypical development. Our research is highly interdisciplinary, spanning the boundaries of cognitive neuroscience, educational psychology, and developmental science. We use a variety of cutting-edge behavioral and neuroimaging techniques, including eyetracking, EEG, and fNIRS, to investigate how the brain integrates gesture and speech in real time. We are the first fNIRS lab in the state of Alabama, as well as the first multimodal neuroimaging lab and the first developmental cognitive neuroscience lab at UA. Come work with us as we conduct exciting and innovative research on the neurobiology of gesture and its impact on language processing and learning across development!
Neural Response Variability and Gesture-Speech Integration
Human language is inherently multimodal, typically consisting of speech and co-speech gestures that occur simultaneously. Individuals with autism spectrum disorder (ASD) are less sensitive to discrepancies in the timing of multimodal stimuli such as speech and gesture. Given that neural responses to visual and auditory stimuli are less stable in ASD, we are using EEG and fNIRS to investigate whether the stability of neural responses to gesture and co-occurring speech features predicts language impairments in ASD, as well as individual differences in language processing and learning in typical development. (One simple way such response stability might be quantified is sketched after the questions below.)
- What are the neural signatures of abnormal temporal gesture-speech integration in ASD, and how do they differ from those observed in typical development?
- Do unstable neural responses during gesture-speech integration predict language and multimodal processing impairments in ASD?
- Do unstable neural responses during gesture-speech integration also predict temporal discrepancies between gesture and speech production in ASD?
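As a concrete illustration, below is a minimal sketch of one way inter-trial response stability could be quantified from segmented EEG data. The correlation-based stability index, array shapes, and simulated data are assumptions for illustration only, not a description of the lab's actual analysis pipeline.

```python
import numpy as np

# Assumed sketch: quantify inter-trial response stability from segmented EEG
# data of shape (n_trials, n_channels, n_samples) as the mean pairwise
# correlation between single-trial waveforms, averaged across channels.

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 80, 32, 250   # e.g., 1 s epochs at 250 Hz
t = np.linspace(0.0, 1.0, n_samples)
evoked = np.sin(2 * np.pi * 10 * t)             # simulated shared evoked response
epochs = evoked + rng.normal(0, 1.5, (n_trials, n_channels, n_samples))

def intertrial_stability(epochs: np.ndarray) -> float:
    """Mean pairwise correlation of single-trial waveforms, averaged across
    channels. Values near 1 indicate highly repeatable responses; values
    near 0 indicate strong trial-to-trial variability."""
    n_trials, n_channels, _ = epochs.shape
    per_channel = []
    for ch in range(n_channels):
        corr = np.corrcoef(epochs[:, ch, :])    # trial-by-trial correlation matrix
        per_channel.append(corr[np.triu_indices(n_trials, k=1)].mean())
    return float(np.mean(per_channel))

print(f"inter-trial stability: {intertrial_stability(epochs):.3f}")
```

Under this assumed measure, a participant whose single-trial waveforms correlate strongly with one another would be characterized as having stable neural responses, while lower values would indicate greater trial-to-trial variability.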
Brain Mechanisms of Multimodal Prosody in Gesture and Speech
Prosody is the music of human language, conveying meaning beyond the content of speech alone. In speech, prosody is carried by pitch accenting; in gesture, it is carried by beats. Although pitch accenting and beat gesture both convey emphasis and typically occur simultaneously, little is known about how these two cues are integrated and how they affect memory for discourse and language acquisition. To shed light on these questions, we are using eyetracking and EEG to investigate the real-time processing of beat gesture and pitch accenting, and its neural signatures, in both typical development and ASD. In particular, we are focusing on the N400 event-related potential (ERP), which is thought to reflect prediction in language processing. (A sketch of one common way the N400 is measured follows the questions below.)
- Is the N400 ERP a reliable neural marker of beat gesture-pitch accent integration in typical development?
- How does beat gesture influence pitch accent interpretation, and vice versa, in real-time language processing, and how do the two cues affect memory for discourse and word learning?
- In ASD, how is the integration of pitch accent and beat gesture disrupted, how does this disruption affect real-time sentence interpretation, and does the N400 reflect impairments in beat gesture-pitch accent integration?
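For illustration, the sketch below shows one common way an N400 effect is quantified: the mean amplitude in a 300-500 ms post-stimulus window at centro-parietal electrodes, compared across conditions. The channel indices, time window, sampling rate, and simulated data here are all illustrative assumptions, not the lab's protocol.

```python
import numpy as np

# Hedged sketch of a standard N400 measure: mean amplitude in a 300-500 ms
# post-stimulus window at assumed centro-parietal electrodes, compared
# between congruent and incongruent conditions.

rng = np.random.default_rng(1)
sfreq = 250                                    # sampling rate in Hz (assumed)
times = np.arange(-0.2, 0.8, 1 / sfreq)        # epoch: -200 to 800 ms
n_trials, n_channels = 60, 32
centro_parietal = [10, 11, 12]                 # assumed indices for, e.g., Cz/CPz/Pz

def mean_amplitude(epochs: np.ndarray, tmin: float, tmax: float) -> float:
    """Mean amplitude over trials, the assumed channels, and the window."""
    window = (times >= tmin) & (times <= tmax)
    return float(epochs[:, centro_parietal][:, :, window].mean())

# Simulated data: the incongruent condition is shifted more negative overall,
# standing in for the larger N400 it would be expected to elicit.
congruent = rng.normal(0.0, 1.0, (n_trials, n_channels, times.size))
incongruent = rng.normal(-1.5, 1.0, (n_trials, n_channels, times.size))

n400_effect = mean_amplitude(incongruent, 0.3, 0.5) - mean_amplitude(congruent, 0.3, 0.5)
print(f"N400 effect (incongruent - congruent): {n400_effect:.2f} uV")
```

A more negative difference for mismatching gesture-speech pairings would, on this assumed measure, be consistent with the N400's proposed role in prediction during language processing.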
Individual Differences in Brain Development, Gesture, and Language Learning
The ability to learn a novel language varies widely between individuals, as does the use of gesture. Research on first language development has shown that children's gestural repertoire is closely related to their vocabulary size, and research with adults has shown that gesture can compensate for reduced working memory capacity. Moreover, gesture can help both age groups learn new words in a novel language. At present, little is known about how individual differences in neural responses across development reflect individual differences in the efficacy of gesture interpretation and its effect on additional language learning. (A simple illustration of this kind of individual-differences analysis appears after the questions below.)
- What are the neural signatures of gesture's impact on word learning, and how do they develop?
- How does the brain adapt in real time to variation in gesture use during language acquisition, and how do differences in real-time gesture use affect neural responses?
- Can gesture compensate for individual differences in language learning aptitude, and is this compensation reflected in differential changes in neural responses during language learning?
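To make the individual-differences framing concrete, here is a hedged sketch relating a per-participant neural measure (for example, the stability index sketched earlier) to word-learning outcomes. The data and variable names are simulated assumptions, not results from the lab.

```python
import numpy as np
from scipy import stats

# Hypothetical individual-differences analysis: correlate a per-participant
# neural measure (e.g., an inter-trial stability index) with word-learning
# accuracy in a novel language. All values below are simulated.

rng = np.random.default_rng(2)
n_participants = 40
neural_stability = rng.uniform(0.1, 0.9, n_participants)  # assumed stability index per person
learning_gain = 0.5 * neural_stability + rng.normal(0, 0.1, n_participants)  # proportion of words learned

# Pearson correlation across participants; a reliable positive r would be
# consistent with more stable neural responses predicting better learning.
r, p = stats.pearsonr(neural_stability, learning_gain)
print(f"r = {r:.2f}, p = {p:.3f}")
```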