Current Projects
Research in the NERD Lab focuses on the relationship between gesture and speech and its contribution to language processing and learning in both typical and atypical development. Our research is highly interdisciplinary, spanning cognitive neuroscience, educational psychology, and developmental science. We use a variety of cutting-edge behavioral and neuroimaging techniques, including eye tracking, EEG, and fNIRS, to investigate how the brain integrates gesture and speech in real time. We are the first fNIRS lab in the state of Alabama, as well as the first multimodal neuroimaging lab and the first developmental cognitive neuroscience lab at UA. Come work with us as we conduct exciting and innovative research on the neurobiology of gesture and its impact on language processing and learning across development!
Cognitive and Neural Signatures of Lexical Tone Acquisition via Gesture Observation
Collaborator: Sarah Hughes-Berheim (Ph.D. student)
Funding: National Science Foundation CAREER Award
Although observing gestures that convey pitch contours is known to facilitate L2 lexical tone learning, it is currently unclear how the brain integrates these gestures into representations of L2 words differing in lexical tone. It is also unclear whether the brain integrates representational gestures, which convey word meanings, with such words in the same way. To investigate these questions, we are using event-related potentials (ERPs) and transcranial direct current stimulation (tDCS) to determine how L2 words differing in lexical tone are subsequently processed after being learned by observing pitch and representational gestures. In particular, this project addresses the following questions (a brief sketch of how the N400 is conventionally quantified follows the list):
- How do observed pitch gestures congruent and incongruent with the lexical tones of L2 words affect the phonological and semantic N400 ERPs?
- How do observed representational gestures congruent and incongruent with the meanings of L2 words differing in lexical tone affect the phonological and semantic N400 ERPs?
- Can anodal stimulation of left inferior frontal gyrus and posterior temporal sulcus during observation of congruent pitch and representational gestures amplify differences in the phonological and semantic N400 ERPs?
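For readers unfamiliar with how effects on "the N400 ERP" are typically measured, the sketch below illustrates the conventional approach: averaging voltage over a 300-500 ms post-stimulus window at centro-parietal electrodes and comparing conditions. The electrode labels, time window, sampling rate, and simulated data are illustrative assumptions only, not the lab's actual preprocessing or analysis pipeline.

```python
import numpy as np

# Simulated ERPs (channels x samples) for gesture-congruent vs. -incongruent words,
# epoched from -200 ms to 800 ms around word onset at 1000 Hz (illustrative data only).
rng = np.random.default_rng(42)
times = np.arange(-0.200, 0.800, 0.001)          # seconds relative to word onset
channels = ["Cz", "CPz", "Pz"]                   # typical centro-parietal N400 sites
erp_congruent = rng.normal(0.0, 1.0, (len(channels), times.size))     # microvolts
erp_incongruent = rng.normal(-1.5, 1.0, (len(channels), times.size))  # more negative-going

def n400_mean_amplitude(erp, times, window=(0.300, 0.500)):
    """Mean amplitude across channels within the conventional N400 window."""
    mask = (times >= window[0]) & (times <= window[1])
    return erp[:, mask].mean()

effect = n400_mean_amplitude(erp_incongruent, times) - n400_mean_amplitude(erp_congruent, times)
print(f"N400 effect (incongruent - congruent): {effect:.2f} microvolts")
```

A more negative mean amplitude for incongruent than congruent pairings (a larger N400 effect) is the kind of difference the questions above ask about.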
Functional Laterality and L2 Lexical Tone Acquisition
Collaborators: Jake Feiler (Ph.D. student), Laura Getz (University of San Diego)
Funding: Language Learning Early Career Award, UA Research Grants Council
During acquisition of tonal second languages (L2s), the locus of lexical tone processing in the brain shifts leftward. Gestures conveying lexical tone contours, which draw on the conceptual metaphor that maps pitch onto vertical space (high pitch = up, low pitch = down), facilitate L2 acquisition of lexical tone. At present, however, it is unclear whether these gestures expedite the leftward shift in functional laterality, whether it is the gesture itself or merely its motion that expedites acquisition of L2 lexical tone, and whether these gestures also facilitate processing of musical analogs of lexical tones. To address these questions, we are using fNIRS and ERPs to examine the impact of gestures conveying lexical tone contours, and of equivalent dot motion, on the neural substrates of L2 lexical tone and musical analog processing. In particular, this project addresses the following questions (a sketch of how functional laterality is typically quantified follows the list):
- Do gestures conveying lexical tone contours enhance L2 lexical tone acquisition more effectively than equivalent dot motion?
- Do gestures conveying lexical tone contours enhance processing of musical lexical tone analogs more effectively than equivalent dot motion?
- How do gestures conveying lexical tone contours and equivalent dot motion affect the leftward shift in functional laterality accompanying successful L2 lexical tone acquisition?
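As a rough illustration of what a leftward shift in functional laterality means operationally, the sketch below computes a standard laterality index, LI = (L - R) / (L + R), from hypothetical left- and right-hemisphere fNIRS responses before and after training. The channel values are invented for illustration and are not drawn from this project's data or analysis pipeline.

```python
import numpy as np

def laterality_index(left, right):
    """Standard laterality index: +1 = fully left-lateralized, -1 = fully right-lateralized."""
    l, r = np.mean(left), np.mean(right)
    return (l - r) / (l + r)

# Hypothetical mean oxygenated-hemoglobin (HbO) responses to lexical tone contrasts
# (arbitrary units), one value per fNIRS channel over left temporal regions and
# their right-hemisphere homologues (illustrative values only).
pre_training  = {"left": [0.30, 0.28, 0.35], "right": [0.31, 0.27, 0.33]}
post_training = {"left": [0.52, 0.47, 0.55], "right": [0.30, 0.26, 0.29]}

for label, data in (("pre-training", pre_training), ("post-training", post_training)):
    li = laterality_index(data["left"], data["right"])
    print(f"{label} laterality index: {li:+.2f}")  # near 0 = bilateral, > 0 = left-dominant
```

An index moving from near zero toward positive values across training would reflect the leftward shift described above.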
Neural Substrates of Contextual Vocabulary Comprehension
Collaborators: Sarah Hughes-Berheim (Ph.D. student), Jack Shelley-Tremblay (University of South Alabama)
Funding: Graduate Council Fellowship, Taube Scholarship
Iconic gesture facilitates word learning and spoken discourse processing in both the native language and L2 in children and adults. Research examining the neural substrates of these processes has shown that iconic gesture and co-occurring speech are integrated rapidly and automatically, suggesting that they arise from the same semantic representation. The extent to which iconic gesture is integrated with, and affected by, contextual discourse is less clear, however, as are any differences between gesture-speech and gesture-text integration. To investigate these issues, we are using ERPs and behavioral methods to examine the impact of iconic gesture on the acquisition of novel vocabulary and its integration into novel sentential contexts. Specifically, this project investigates the following questions:
- Can differences in learning based on the semantic congruence between novel words and iconic gestures be replicated when words are presented as text (as opposed to speech)?
- Is the N400 ERP affected by the semantic relationships between novel words, iconic gestures, and sentential contexts?
- Do individual differences in print exposure and/or vocabulary affect the magnitude of the N400 ERP in these tasks?
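As a simple illustration of the individual-differences question above, the sketch below correlates simulated print-exposure scores with simulated N400 effect sizes across participants. The measures, values, and the use of a simple Pearson correlation are placeholder assumptions, not this project's data or planned analyses.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated per-participant measures (placeholders only): a print-exposure score
# (e.g., from an author recognition test) and the size of that participant's
# N400 congruence effect in microvolts.
n_participants = 30
print_exposure = rng.normal(20, 5, n_participants)
n400_effect = -0.1 * print_exposure + rng.normal(0, 1, n_participants)

# Pearson correlation between print exposure and N400 effect size.
r = np.corrcoef(print_exposure, n400_effect)[0, 1]
print(f"r(print exposure, N400 effect) = {r:.2f} across {n_participants} participants")
```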