Behind the Eyes: Insights into cognition from naturalistic gaze behavior in VR

Keynote Abstract

A major thrust of human cognitive neuroscience today is the push toward increasing levels of naturalism in our scientific approach. Advances in virtual reality (VR) technology present a promising avenue for investigating cognition in naturalistic contexts while maintaining experimental control. This keynote discusses a series of studies combining immersive VR with in-headset eye-tracking to understand perception during active, real-world vision. We will discuss recent findings from these techniques, including some surprising bandwidth limitations on perceptual awareness and intriguing individual differences in the act of constructive perception introduced by the demands of behaving in immersive, full-field, active visual settings. In the tutorial, we will present vrGazeCore, an open-source toolbox for analyzing eye-tracking data collected in VR. As we walk through the toolbox, we will provide a live demo of volunteers exploring a VR environment, analyze eye-tracking data from different tasks using the toolbox, and compare the results to computational models of the immersive, real-world scenes. All in all, this keynote and tutorial will equip participants with the core knowledge and skills required to use VR and in-headset eye-tracking to study human cognition during naturalistic, active vision.

Tutorial Plan

Humans actively explore their real-world environments. To do so, they constantly exchange the contents of their current field of view with their memory of the surrounding environment by shifting their eyes, turning their heads, and moving their bodies. Yet little is known about visual cognition under these immersive, active viewing conditions.

Virtual reality (VR) combined with in-headset eye-tracking is a powerful method for studying human behavior under naturalistic conditions. This tutorial aims to teach participants the basics of studying active vision in immersive VR. To do this, we will introduce participants to vrGazeCore, a recently released open-source toolbox for processing eye-tracking data collected in VR, through a series of hands-on exercises. To minimize startup costs, the tutorial will not require any software to be pre-installed on participants' machines. Instead, participants will be able to follow a step-by-step tutorial directly from their web browsers using Google Colab.
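To give a concrete sense of the kind of data participants will work with, the sketch below shows how a per-sample gaze log from an in-headset eye tracker might be represented and screened in a Colab notebook. The column names, units, values, and confidence threshold are hypothetical and for illustration only; they are not the vrGazeCore input format.

```python
import pandas as pd

# A toy, hypothetical gaze log illustrating the kind of per-sample data an
# in-headset eye tracker records. Column names, units, and values are
# illustrative only; they are not the format vrGazeCore expects.
gaze = pd.DataFrame({
    "time_s":         [0.000, 0.011, 0.022, 0.033, 0.044],  # sample timestamps (s)
    "head_yaw_deg":   [10.2, 10.4, 10.9, 11.5, 12.0],       # head orientation, yaw (deg)
    "head_pitch_deg": [-2.1, -2.0, -1.8, -1.7, -1.5],       # head orientation, pitch (deg)
    "eye_yaw_deg":    [3.5, 3.9, 4.2, 0.1, -6.3],           # gaze relative to the head, yaw (deg)
    "eye_pitch_deg":  [1.2, 1.1, 1.3, 0.9, 0.4],            # gaze relative to the head, pitch (deg)
    "confidence":     [0.98, 0.97, 0.42, 0.95, 0.96],       # tracker's per-sample confidence (0-1)
})

# A common first step: drop low-confidence samples (e.g., blinks or tracker
# dropouts) before computing fixations or gaze maps. The 0.6 cutoff is arbitrary.
gaze = gaze[gaze["confidence"] >= 0.6].reset_index(drop=True)
print(f"{len(gaze)} usable samples")
print(gaze.head())
```

The key point of this toy example is that in-headset recordings pair head pose with eye-in-head gaze direction, both of which are needed to recover where in the surrounding scene a person was looking.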

Our tutorial will have three sections. First, we will host a live demo of a head-mounted VR system by “screencasting” while two volunteers explore an immersive environment in real time. This step will provide a quick and engaging way to familiarize participants with VR. Second, we will introduce participants to vrGazeCore through a step-by-step tutorial that walks them through processing eye-tracking data from 360° scenes and visualizing the results overlaid on the scene. Third, we will teach participants how to compare these eye-tracking results to the outputs of different computational models of the immersive environment, enabling them to classify the cognitive task a participant was performing. All in all, this tutorial will lower the barrier to entry for eye-tracking in VR and provide a hands-on introduction to this powerful scientific tool. Because the software currently available for processing in-headset eye-tracking data is proprietary, this tutorial promises to make the study of active vision in immersive scenes more accessible and flexible for the cognitive neuroscience community.
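As a rough illustration of the second and third sections, the sketch below (plain Python/NumPy, not the vrGazeCore API) projects gaze samples onto an equirectangular 360° image, accumulates them into a smoothed gaze-density map, and correlates that map with a model-predicted map of the same scene. The angle conventions, the direct summing of head and eye angles, and the simulated data and placeholder model map are all simplifying assumptions made for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import pearsonr

def gaze_to_equirect(head_yaw, head_pitch, eye_yaw, eye_pitch, width, height):
    """Map gaze direction to pixel coordinates of an equirectangular 360° image.

    Simplification: head and eye angles are summed directly; a full pipeline
    would compose the two rotations (e.g., with quaternions) instead.
    """
    yaw = (head_yaw + eye_yaw + 180.0) % 360.0 - 180.0      # wrap to [-180, 180)
    pitch = np.clip(head_pitch + eye_pitch, -90.0, 90.0)
    x = (yaw / 360.0 + 0.5) * (width - 1)                   # longitude -> column
    y = (0.5 - pitch / 180.0) * (height - 1)                # latitude  -> row
    return x.astype(int), y.astype(int)

def gaze_density_map(x, y, width, height, sigma_px=20):
    """Accumulate gaze samples into a smoothed density ('heat') map."""
    density = np.zeros((height, width))
    np.add.at(density, (y, x), 1.0)
    density = gaussian_filter(density, sigma=sigma_px)
    return density / density.max()

# --- Toy demonstration with simulated data (stand-ins for real recordings) ---
rng = np.random.default_rng(0)
W, H = 1024, 512
n = 2000
head_yaw, head_pitch = rng.uniform(-90, 90, n), rng.uniform(-20, 20, n)
eye_yaw, eye_pitch = rng.normal(0, 15, n), rng.normal(0, 10, n)

x, y = gaze_to_equirect(head_yaw, head_pitch, eye_yaw, eye_pitch, W, H)
gaze_map = gaze_density_map(x, y, W, H)

# Compare the empirical gaze map to a model-predicted map of the same scene.
# Here the model map is a random placeholder for illustration only.
model_map = gaussian_filter(rng.random((H, W)), sigma=20)
r, _ = pearsonr(gaze_map.ravel(), model_map.ravel())
print(f"gaze vs. model correlation: r = {r:.3f}")
```

In practice, the placeholder model map would be replaced by the output of a computational scene model (for example, an image-based saliency prediction), and comparing the empirical gaze map against several candidate models is one way to infer which cognitive task a viewer was performing.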

Caroline Robertson

Dartmouth College

AJ Haskins

Dartmouth College

Deepasri Prasad

Dartmouth College

Tommy Botch

Dartmouth College

Brenda Garcia

Dartmouth College