Sensory and motor circuits for listening and learning

David Schneider
Assistant Professor of Neural Science
Education
- Ph.D. 2012 Columbia University
Neurophysiology and Behavior
We are a discovery science laboratory focused on understanding how sensory, motor, and learning systems in the brain converge to store memories about the past and make predictions about the future. Our approach to neuroscience begins with a rigorous understanding of behavior, using the mouse as a model organism. We engineer controlled behavioral tasks in the lab that distill and retain the core computational principles of natural behaviors (i.e., a "behavior clamp"), and we then use electrical, optical, and pharmacological techniques to monitor and manipulate the activity of networks, neurons, and synapses in behaving mice. Guided by the results of these simplified behavioral paradigms, we then circle back to the natural behaviors on which our engineered tasks were modeled, to understand brain function in natural environments.
One of the specific questions we pursue in the lab is how the brain recognizes and predicts the sounds of one's own movements. The ability to predict the acoustic consequences of our actions is vital for learning and maintaining complex behaviors such as speech; more fundamentally, without it, we would be surprised by every sound we made. Predictions about self-generated sounds are thought to be mediated by copies of motor-related signals that are relayed to the auditory system during sound-generating behaviors, where they suppress neural responses to the sounds a movement is expected to produce. Although predictions are likely made at multiple sites in the brain, evidence from many vertebrate species points toward a cortical locus for predictions made during sound-generating movements. Despite the important roles that predictions play in everyday life, we understand remarkably little about how the brain learns, stores, and recalls the statistical associations about the world that are necessary for making predictions.
By studying brain activity and connectivity in mice engaged in natural behaviors and virtual reality, we have found that the auditory cortex is strongly modulated during sound-generating movements. Motor-related inputs to the auditory cortex can dominate auditory cortical synaptic and spiking activity during movements such as vocalizing and locomotion. Moreover, the motor-related signals that impinge on the auditory cortex predict the sound features that a movement is expected to produce. Together, these findings indicate that the interface between motor and auditory cortices is a likely site of synaptic plasticity for storing and recalling memories about self-generated sounds. Ongoing work aims to discover how this sensory-motor interface stores memories of past experiences at the cellular and synaptic levels. In parallel, we hope to better understand how memories are recalled during behavior and integrated with ongoing sensory experience to make predictions about the future.
Awards
2016 Burroughs Wellcome Fund Career Award at the Scientific Interface
2016 NIH Pathway to Independence Award - K99/R00 (declined)
2015 McKnight Foundation Allison Doupe Fellowship
2014 Helen Hay Whitney Foundation Postdoctoral Fellowship
2013 Titus M. Coan Prize for Excellence in Basic Research