Fluctuating responses evoked by multiple objects: a general feature of visual representations
Meredith Schmehl, Duke University, United States; Valeria Caruso, University of Michigan, United States; Shawn Willett, University of Pittsburgh, United States; Yunran Chen, Na Young Jun, Jeff Mohl, Duke University, United States; Douglas Ruff, Marlene Cohen, University of Chicago, United States; Akinori Ebihara, Winrich Freiwald, The Rockefeller University, United States; Surya Tokdar, Jennifer Groh, Duke University, United States
Session: Contributed Talks 1 Lecture
Location: South Schools / East Schools
Presentation Time: Fri, 25 Aug, 15:00 - 15:15 United Kingdom Time
Abstract:
How neural representations preserve information about multiple stimuli is mysterious because the tuning of individual neurons is coarse and the populations of neurons tasked with encoding each individual stimulus must in principle overlap. Here we show that when two perceptually distinguishable stimuli are presented, a subpopulation of neurons in MT and the IT face patch system exhibits fluctuating firing patterns, as if responding to only one individual stimulus at a time. Furthermore, consistent with our previous results in the early visual area V1, the fluctuations in MT were observed only when the two stimuli formed distinguishable objects, not when they fused into one perceived object. In the face patches, fluctuations were observed for both face-face and face-object stimulus combinations. Taken together, these findings point toward an updated model of how the brain preserves sensory information about multiple stimuli for subsequent processing and behavioral action.