Synchronized multi-person eye-tracking in dynamic scenes

Name: Shreshth Saxena

School/Affiliation: Department of Psychology, Neuroscience and Behaviour, McMaster University

Co-Authors: Dr. Lauren Fink

Virtual or In-person: In-person

Abstract:

Portable eye-tracking glasses have opened up the study of eye movements in naturalistic, everyday settings. However, their application remains largely confined to single-person setups because of the challenges of synchronizing multiple devices and collating egocentric views across viewers. These challenges have traditionally been addressed with application-specific solutions or manual post-collection alignment of data. Here, we introduce a generic framework that collects synchronized data from multiple eye-tracking glasses and maps the varying perspectives onto a common frame of reference in which each viewer's gaze location can be compared directly. We use a high-throughput, fault-tolerant event streaming platform to synchronize and transmit data from multiple glasses in a shared physical space. Feature-based homography estimation is then applied to the incoming data to robustly project egocentric views onto a common central view for analysis. In addition, we provide functionality modules for live monitoring of streams and real-time analysis and visualization of gaze data. The modular structure allows flexible model switching and forward compatibility with hardware upgrades. The framework is under active development. To demonstrate its capabilities, we present preliminary results from exploratory pilot experiments involving extreme viewing positions during an hour-long concert. These experiments showcase the framework's ability to map gaze coordinates accurately with a lightweight deep-learning homography model, even in challenging real-world scenarios.
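The abstract does not name the event streaming platform; as a minimal sketch, the snippet below assumes Apache Kafka (via the kafka-python client), which fits the "high-throughput, fault-tolerant" description. It illustrates how one pair of glasses could publish timestamped gaze samples to a shared topic, keyed by device ID so per-device ordering is preserved. The topic name, device ID, and message fields are hypothetical, not the poster's actual schema.

```python
# Hypothetical sketch: one pair of glasses streaming gaze samples to a shared
# Kafka topic, so a downstream consumer can merge and time-align all devices.
import json
import time

from kafka import KafkaProducer  # kafka-python client

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # assumed broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def publish_gaze_sample(device_id: str, x: float, y: float) -> None:
    """Publish one gaze sample, keyed by device to keep per-device ordering."""
    sample = {
        "device": device_id,
        "t": time.time_ns(),  # producer-side timestamp for later alignment
        "gaze": [x, y],       # egocentric gaze coordinates (illustrative)
    }
    producer.send("gaze-samples", key=device_id.encode("utf-8"), value=sample)

publish_gaze_sample("glasses-01", 0.52, 0.47)
producer.flush()
```

Keying by device ID means Kafka routes each device's samples to one partition, preserving their order while still letting many devices stream in parallel.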
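For the mapping step, here is a minimal sketch of classical feature-based homography estimation with OpenCV (ORB features matched under RANSAC), remapping a single gaze point from an egocentric frame into the central view. Function and variable names are illustrative; the poster's actual pipeline, including its lightweight deep-learning homography model, may differ.

```python
# Hypothetical sketch: project a gaze point from an egocentric frame into a
# common central view via a feature-based homography (ORB + RANSAC).
import cv2
import numpy as np

def map_gaze_to_central(ego_frame, central_frame, gaze_xy):
    """Return gaze_xy (pixels in ego_frame) expressed in central_frame pixels."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(ego_frame, None)
    kp2, des2 = orb.detectAndCompute(central_frame, None)

    # Match binary descriptors and keep the strongest correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC makes the estimate robust to mismatched features.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    pt = np.float32([[gaze_xy]])  # shape (1, 1, 2) as perspectiveTransform expects
    return cv2.perspectiveTransform(pt, H)[0, 0]
```

Because the homography is re-estimated per frame pair, the mapping tolerates head movement; swapping this function for a learned homography model would leave the rest of such a pipeline unchanged.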

Poster PDF