Perception, Language and Memory (2013-2014)
Language relies on perception for accurate identification of both spoken and written words, and on memory for binding these arbitrary sensory stimuli to semantic content. This relationship has social, cultural, and biomedical implications at scales ranging from the individual to the global. For example, early exposure to the sounds of a second language is correlated with later fluency in that language, a relationship with significant implications for interactions between individuals from different cultures. Deficits in both visual and auditory processing contribute to language impairments (e.g., dyslexia, hearing loss), with attendant consequences for communication and education worldwide. Not only does perception play a causal role in language comprehension, but causation may also run the other way. Recent theories of language function in the brain suggest that words may evoke a kind of mental simulation of a remembered perceptual experience connected to the meaning of each word. For example, words like “kick” have been shown to activate the foot region of the motor cortex. In short, language may work by coopting perceptual and action networks in the brain.
This project team examined these concepts using fMRI and psychophysical techniques. Team members collaborated on several group projects on perception and language using behavioral and fMRI methods, balancing their time between analyzing previously collected data and designing new experiments.
- Edna Andrews, Arts & Sciences-Slavic and Eurasian Studies
- Michele Diaz, N/A
- Jennifer Groh, Arts & Sciences-Psychology and Neuroscience
- Elizabeth Johnson, School of Medicine-Neurology
- J. Tobias Overath, Duke Institute for Brain Sciences
Undergraduate Team Members
- Akshita Iyer, Neuroscience (BS)
- Ana Restrepo, Neuroscience (BS)
- Muhammed Yalcinbas, Biomedical Engineering (BSE)