Research in the N.E.R.D. lab focuses on the relationship between gesture and speech and its contribution to language processing and learning in both typical and atypical development. Our research is highly interdisciplinary, spanning cognitive neuroscience, educational psychology, and developmental science. We use a variety of cutting-edge behavioral and neuroimaging techniques, including eyetracking, EEG, and fNIRS, to investigate how the brain integrates gesture and speech in real time. We are the first fNIRS lab in the state of Alabama, as well as the first multimodal neuroimaging lab and the first developmental cognitive neuroscience lab at UA. Come work with us as we conduct exciting and innovative research on the neurobiology of gesture and its impact on language processing and learning across development!
Cognitive Mechanisms of Multimodal Prosody
Prosody is the music of human language, conveying meaning beyond the content of speech alone. In speech, prosody is conveyed via pitch accenting (the GREEN bowl, not the red one); in gesture, prosody is conveyed via beats (simple rhythmic gestures). Although pitch accenting and beat gesture each convey emphasis and typically occur simultaneously, little is known about how these two cues are integrated and how they each affect comprehension and memory of discourse and language acquisition. To shed light on these questions, we are using behavioral methods and eyetracking to investigate real-time integration of beat gesture and pitch accenting in typical development, L2 comprehension, and ASD. In particular, we are interested in the following issues:
- Are young children sensitive to the relationship between beat gesture and pitch accenting, and can it help them learn pairs of contrasting words?
- How are beat gesture and pitch accenting integrated in ASD, and how does their integration affect memory and comprehension of discourse?
- What impact do beat gesture and pitch accenting have on memory and comprehension of discourse by English-as-a-second-language speakers?
Functional Laterality and L2 Lexical Tone Acquisition
During acquisition of tonal second languages (L2s), the locus of lexical tone processing in the brain shifts leftward. Gesture conveying the conceptual metaphor underlying pitch (up = high, down = low) facilitates L2 acquisition of lexical tone. At present, however, it is unclear whether these gestures also expedite the leftward shift in functional laterality, and whether it is the gesture itself or merely its motion that expedites acquisition of L2 lexical tone. To address these questions, we are using fNIRS and EEG to examine the impact of pitch gesture and equivalent dot motion on the neural substrates of lexical tone processing. In particular, this project addresses the following questions:
- Does pitch gesture enhance L2 lexical tone acquisition more effectively than equivalent dot motion?
- How do pitch gesture and equivalent dot motion affect the leftward shift in functional laterality accompanying successful L2 lexical tone acquisition?
- How are the neural substrates of multimodal integration of lexical tones and their static visual representations affected by pitch gesture and dot motion, as evidenced by the N400 event-related potential and activity in posterior superior temporal sulcus?
Individual Differences in Brain Development, Gesture, and Language Learning
The ability to learn a novel language varies widely between individuals, as does the use of gesture. Research on first language development has shown that children's gestural repertoire is closely related to their vocabulary size, and research with adults has shown that gesture can compensate for reduced working memory. Moreover, gesture can help both age groups learn new words in an unfamiliar second language. At present, however, little is known about how individual differences in neural responses across development reflect individual differences in the efficacy of gesture interpretation and its effect on additional language learning. In particular, we are interested in the following questions:
- What are the neural signatures of gesture's impact on word learning, and how do they develop?
- How does the brain adapt to variation in gesture use in real time during language acquisition, and how do differences in real-time gesture use during language acquisition affect neural responses?
- Can gesture compensate for individual differences in language learning aptitude, and is this compensation reflected in differential changes in neural responses during language learning?