Research in the N.E.R.D. lab focuses on the relationship between gesture and speech and its contribution to language processing and learning in both typical and atypical development. Our research is highly interdisciplinary, spanning the boundaries of cognitive neuroscience, educational psychology, and developmental science. We use a variety of cutting-edge behavioral and neuroimaging techniques, including eyetracking, EEG, and fNIRS, to investigate how the brain integrates gesture and speech in real time. We are the first fNIRS lab in the state of Alabama, as well as the first multimodal neuroimaging lab and the first developmental cognitive neuroscience lab at UA. Come work with us as we conduct exciting and innovative research on the neurobiology of gesture and its impact on language processing and learning across development!
Cognitive Mechanisms of Multimodal Prosody
Collaborators: Scott Fraundorf (Pitt), Jason Scofield (UA HDFS)
Funding: Hilibrand Fellowship
Prosody is the music of human language, conveying meaning beyond the content of speech alone. In speech, prosodic prominence is conveyed via pitch accenting (the GREEN bowl, not the red one); in gesture, it is conveyed via beats (simple rhythmic gestures). Although pitch accenting and beat gesture typically occur simultaneously, little is known about how they are integrated and how they affect discourse comprehension, memory, and language acquisition. To shed light on these questions, we are using behavioral methods, event-related potentials (ERPs), and eyetracking to investigate real-time integration of beat gesture and pitch accenting in typically developing adults and children, non-native English speakers, and individuals with autism spectrum disorder (ASD). In particular, this work examines the following issues:
- Are young children sensitive to the relationship between beat gesture and pitch accenting, and can it help them learn pairs of contrasting words?
- How are beat gesture and pitch accenting integrated in ASD, and how does their integration affect memory and comprehension of discourse?
- What impact do beat gesture and pitch accenting have on memory and comprehension of discourse by English-as-a-second-language speakers?
Functional Laterality and L2 Lexical Tone Acquisition
Collaborators: Jake Feiler (Ph.D. student), Laura Getz (University of San Diego)
Funding: Language Learning Early Career Award, UA Research Grants Council
During acquisition of tonal second languages (L2s), the locus of lexical tone processing in the brain shifts leftward. Gestures that convey lexical tone contours, based on the conceptual metaphor underlying pitch (up = high, down = low), facilitate L2 acquisition of lexical tone. At present, however, it is unclear whether these gestures expedite the leftward shift in functional laterality, whether it is the gesture itself or merely its motion that expedites acquisition of L2 lexical tone, and whether these gestures also facilitate processing of musical analogs of lexical tone. To address these questions, we are using fNIRS and ERPs to examine the impact of gestures conveying lexical tone contours and equivalent dot motion on the neural substrates of L2 lexical tone and musical analog processing. In particular, this project addresses the following questions:
- Do gestures conveying lexical tone contours enhance L2 lexical tone acquisition more effectively than equivalent dot motion?
- Do gestures conveying lexical tone contours enhance processing of musical lexical tone analogs more effectively than equivalent dot motion?
- How do gestures conveying lexical tone contours and equivalent dot motion affect the leftward shift in functional laterality accompanying successful L2 lexical tone acquisition?
Neural Substrates of Contextual Vocabulary Comprehension
Collaborators: Sarah Hughes-Berheim (Ph.D. student), Jack Shelley-Tremblay (South Alabama)
Funding: Graduate Council Fellowship, Taube Scholarship
Iconic gesture facilitates word learning and spoken discourse processing in both the native language and the L2 in children and adults. Research examining the neural substrates of these processes has shown that iconic gesture and co-occurring speech are integrated rapidly and automatically, suggesting that they arise from the same semantic representation. The extent to which iconic gesture is integrated with, and affected by, contextual discourse is less clear, however, as are any differences between gesture-speech and gesture-text integration. To investigate these issues, we are using ERPs and behavioral methods to examine the impact of iconic gesture on acquisition and integration of novel vocabulary into novel sentential contexts. Specifically, this project investigates the following questions:
- Can differences in learning based on the semantic congruence between novel words and iconic gestures be replicated when words are presented as text (as opposed to speech)?
- Is the N400 ERP affected by the semantic relationships between novel words, iconic gestures, and sentential contexts?
- Do individual differences in print exposure and/or vocabulary affect the magnitude of the N400 ERP in these tasks?
Social Processing and the N170 ERP in Autism Spectrum Disorder
Collaborators: Cailee Nelson (Ph.D. student), Caitlin Hudac (UA Psychology)
Funding: NIH R21, Graduate Council Fellowship
Impaired social functioning is a key diagnostic criterion of autism spectrum disorder (ASD). In ASD, the N170 event-related potential (ERP), which subserves face processing, is functionally abnormal, suggesting that it may play a key role in the social processing impairments characteristic of the disorder. To determine whether the N170 is a viable biomarker of ASD, we are using EEG to examine it in children with ASD and typically developing children. This project addresses the following questions:
- Does the N170 differ between children with ASD and typically developing children?
- Is the N170 associated with impaired social processing in children with ASD?
- What types of social stimuli elicit an abnormal N170 in children with ASD?