Keep moving: sensorimotor integration of fixational eye-movements yields human-like superresolution in recurrent neural networks
Adrien Doerig, Osnabrück University, Germany; Kirubeswaran O.R., Indian Institute of Science Education and Research Pune, India; Tim Kietzmann, Osnabrück University, Germany
Session: Posters 1B (Poster)
Presentation Time: Thu, 24 Aug, 17:00 - 19:00 United Kingdom Time
Abstract:
Artificial Neural Networks (ANNs) have yielded important insights into the mechanisms of biological vision, as they provide a powerful language for expressing hypotheses about brain computation. Yet the field is still in its infancy, and many central aspects of biology remain missing from current models. One such aspect is the role of eye movements: most current ANNs operate on static images, in contrast to biology's active sampling of the visual world. What's more, even during fixations the human eye is not at rest, but exhibits minute yet systematic eye movements. Here we focus on the computational role of such involuntary "Fixational Eye Movements" (FEMs) and study their interaction with recurrent processing, a ubiquitous feature of neural computation. We developed a model to test the theory that targeted FEMs endow recurrent systems with computational advantages, for example the ability to resolve incoming information at a super-resolution that would remain invisible with static input. Our normative model learns human-like FEMs and replicates psychophysical data. Moving towards more natural visual paradigms, we further demonstrate large gains in classification performance on a computer vision task. In summary, we demonstrate that systems allowed to use time as a computational resource benefit from targeted FEMs, in line with theories of human vision.
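To make the general idea concrete, the sketch below shows one way a recurrent classifier could exploit small, learned gaze offsets: at each timestep it sees only a low-resolution, sub-pixel-shifted view of a high-resolution image, and a gaze head proposes the next fixational offset from the hidden state, so fine detail becomes recoverable only by integrating evidence across shifts. This is a minimal PyTorch illustration, not the authors' implementation; all names (e.g. FixationalRNN, glimpse) and parameter choices are hypothetical.

```python
# Illustrative sketch (not the authors' model): a recurrent classifier that views a
# high-resolution image through a low-resolution "retina", shifted each timestep by a
# small gaze offset produced from the hidden state.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FixationalRNN(nn.Module):
    def __init__(self, n_classes=10, glimpse_size=8, hidden=128, steps=6):
        super().__init__()
        self.glimpse_size = glimpse_size
        self.steps = steps
        self.encoder = nn.Linear(glimpse_size * glimpse_size, hidden)
        self.rnn = nn.GRUCell(hidden, hidden)
        # Gaze head: proposes the next sub-pixel offset from the current hidden state.
        self.gaze = nn.Linear(hidden, 2)
        self.readout = nn.Linear(hidden, n_classes)

    def glimpse(self, image, offset):
        # Sample a low-resolution, sub-pixel-shifted view of the full image.
        # image: (B, 1, H, W); offset: (B, 2) in normalised [-1, 1] coordinates.
        B = image.size(0)
        theta = torch.zeros(B, 2, 3, device=image.device)
        theta[:, 0, 0] = 1.0
        theta[:, 1, 1] = 1.0
        theta[:, :, 2] = offset
        grid = F.affine_grid(theta, (B, 1, self.glimpse_size, self.glimpse_size),
                             align_corners=False)
        return F.grid_sample(image, grid, align_corners=False)

    def forward(self, image):
        B = image.size(0)
        h = torch.zeros(B, self.encoder.out_features, device=image.device)
        offset = torch.zeros(B, 2, device=image.device)
        for _ in range(self.steps):
            g = self.glimpse(image, offset).flatten(1)
            h = self.rnn(torch.relu(self.encoder(g)), h)
            # Keep offsets small ("fixational"): a few percent of the image extent.
            offset = 0.05 * torch.tanh(self.gaze(h))
        return self.readout(h)


# Usage: classify 32x32 images seen only through an 8x8 moving low-resolution glimpse.
model = FixationalRNN()
logits = model(torch.rand(4, 1, 32, 32))  # shape (4, n_classes)
```

In such a setup, freezing the offsets at zero gives the static-input control: every timestep then carries the same low-resolution view, whereas learned sub-pixel shifts let the recurrent state accumulate complementary samples of the underlying image.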