
My Research

Understanding social cognition

What we view


Visual social cues, such as body language, guide social decisions, yet little is known about how the brain supports the interpretation and exchange of sensory information during social behavior. By simultaneously recording neural activity and eye-tracking data, I discovered how cortical coding of social information improves as social interactions are learned, and I continue to examine how the primate brain processes real-world visual inputs to direct behavior.

What we say


Language is the bedrock of human communication during social interaction. My current postdoctoral research therefore investigates how a network of brain regions interacts to process the semantics, or meaning, of spoken words. In these studies, I am recording single-neuron activity with microelectrodes in human patients as they watch and listen to people telling stories that capture naturalistic language dynamics.

What we do 


Everything we communicate through words and visual cues influences how we act and learn. Previously, I identified salient visual cues that promote cooperation and found that visuo-frontal cortical regions prioritize this information during social learning. Now, by combining eye tracking, pose estimation, and neural recording during natural conversations between people, I aim to reveal the neural computations within and across multiple brain areas that underlie the integration of multisensory stimuli for social decisions.
