The neural dynamics of auditory word recognition and integration
Jon Gauthier, Roger Levy, Massachusetts Institute of Technology, United States
Session: Contributed Talks 4 (Lecture)
Location: South Schools / East Schools
Presentation Time: Sun, 27 Aug, 13:30 - 13:45 (United Kingdom Time)
Abstract:
Listeners recognize and integrate words in everyday speech by combining expectations about upcoming content with incremental sensory evidence. We present a computational model of word recognition and its downstream neural correlates, and fit this model to explain EEG signals recorded as subjects listened to a fictional story. The model reveals distinct neural processing of words depending on whether or not they can be quickly recognized. While all words trigger a neural response characteristic of probabilistic integration — voltage modulations predicted by a word's surprisal in context — these modulations are amplified for words which require more than roughly 100 ms of input to be recognized. We observe no difference in the latency of these neural responses according to words' recognition times. Our results support a two-part model of speech comprehension, combining an eager and rapid process of word recognition with a temporally independent process of word integration.
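The surprisal measure driving the voltage modulations described above is the negative log probability of a word given its context. As a minimal illustration only, the sketch below computes surprisal from a toy add-alpha-smoothed bigram model; the names (`train_bigram`, `surprisal`) and the bigram model itself are illustrative assumptions, not the language model used in the study.

```python
import math
from collections import Counter

def train_bigram(tokens):
    """Count unigram and bigram frequencies from a token sequence."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def surprisal(word, prev, unigrams, bigrams, vocab_size, alpha=1.0):
    """Surprisal in bits: -log2 P(word | prev), with add-alpha smoothing."""
    p = (bigrams[(prev, word)] + alpha) / (unigrams[prev] + alpha * vocab_size)
    return -math.log2(p)

# Toy corpus; a frequent continuation ("the" -> "cat") is less
# surprising than a rarer one ("the" -> "mat").
corpus = "the cat sat on the mat and the cat slept".split()
uni, bi = train_bigram(corpus)
V = len(uni)
s_cat = surprisal("cat", "the", uni, bi, V)
s_mat = surprisal("mat", "the", uni, bi, V)
print(s_cat < s_mat)  # True
```

In the study itself, context-conditional word probabilities would come from a far richer language model; the point here is only the functional form of the predictor linking word probability to the EEG response.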