Eye-Tracking Through In-Ear Sensing: Cognitive-Motor Integration for Error Anticipation in High-Stakes Tasks

PhD - Gent

Seeing Through the Ear: Anticipating Human Error by Fusing Gaze and Action
High-stakes work environments such as semiconductor cleanrooms, pharmaceutical production lines, or robotic surgery theaters demand flawless manual precision. Even minor operator errors can lead to severe financial, safety, or clinical consequences. Anticipating these errors before they occur is therefore a crucial frontier in operator support. This PhD project will tackle this challenge by integrating fine-grained motor tracking (through sensing gloves and potentially imec’s pMUT-based ultrasonic forearm sensors) with novel cognitive state measures captured by a revolutionary in-ear sensing platform.

The imec multimodal earbud, which incorporates dry-electrode electrophysiology arrays, PPG, and IMU sensors, will be developed within the imec.prospect project PADRE, starting in November 2025. Critically, its in-ear EOG electrodes will allow for the detection of eye movements, capturing gaze direction, blink frequency, fixation instability, and saccadic dynamics. These signals are currently only accessible with expensive, fragile, and cumbersome eye-tracking headsets or camera-based systems that suffer from occlusion and calibration drift. The earbud offers a compact, comfortable, and unobtrusive alternative that unlocks continuous cognitive monitoring in both controlled training setups and real-world operational contexts.
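To make the EOG-based eye metrics concrete, the sketch below shows one common way saccades are flagged in an EOG trace: thresholding the signal's velocity. This is a minimal illustration only, not the project's actual pipeline; the sampling rate, threshold value, and synthetic step-like trace are all assumptions for demonstration.

```python
import numpy as np

def detect_saccades(eog, fs, vel_thresh_uv_per_s=2000.0):
    """Flag samples whose EOG velocity exceeds a threshold (candidate saccades).

    eog: 1-D EOG trace in microvolts; fs: sampling rate in Hz.
    """
    velocity = np.gradient(eog) * fs  # approximate derivative, in µV/s
    return np.abs(velocity) > vel_thresh_uv_per_s

# Synthetic EOG trace: flat baseline with one step-like saccade around t = 1.0 s
fs = 250  # Hz (assumed)
t = np.arange(0, 2, 1 / fs)
eog = np.where(t < 1.0, 0.0, 150.0)  # a 150 µV step mimicking a horizontal saccade
mask = detect_saccades(eog, fs)
onset_s = t[mask][0] if mask.any() else None
print(onset_s)  # the step is flagged near t = 1.0 s
```

In practice the raw in-ear EOG would first be band-pass filtered and artifact-corrected, and blink detection would use a separate amplitude-and-duration criterion, but the velocity-threshold idea above is the standard starting point for saccade metrics.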

The PhD candidate will design spatio-temporal models (e.g., graph neural networks, CNN-BiLSTMs, and contrastive multimodal fusion techniques) that combine motor trajectories with electrophysiology-driven eye metrics. The aim is to predict execution errors at least 1.5 seconds in advance with ≥75% precision, distinguishing early signs of hesitation, cognitive overload, or loss of situational awareness. Experiments will be conducted across lab-based maintenance tasks, VR-based industrial simulations, and surgical training scenarios (with both novices and experts), ensuring ecological validity and cross-domain generalization.
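A key practical detail behind the 1.5-second prediction horizon is how training windows are labeled: each synchronized motor/eye window must be marked positive only if an error occurs at least the lead time after the window ends. The sketch below illustrates this label-alignment step with early (concatenation-based) fusion of the two streams; the window length, feature layout, and synthetic data are illustrative assumptions, not the project's design.

```python
import numpy as np

def make_windows(motor, eye, error_times, fs, win_s=1.0, lead_s=1.5):
    """Slice synchronized motor and eye streams into fixed windows.

    A window is labeled 1 if an error occurs between lead_s and
    lead_s + win_s seconds after the window ends, i.e. the model must
    anticipate the error at least lead_s seconds ahead.
    """
    win = int(win_s * fs)
    X, y = [], []
    n = min(len(motor), len(eye))
    for start in range(0, n - win, win):
        end = start + win
        # early fusion: concatenate flattened motor and eye features
        feats = np.concatenate([motor[start:end].ravel(), eye[start:end].ravel()])
        t_end = end / fs
        label = int(any(lead_s <= e - t_end < lead_s + win_s for e in error_times))
        X.append(feats)
        y.append(label)
    return np.array(X), np.array(y)

fs = 50  # Hz (assumed common rate after resampling)
motor = np.random.randn(fs * 10, 3)  # e.g. three glove joint-angle channels
eye = np.random.randn(fs * 10, 2)    # e.g. horizontal/vertical EOG-derived features
X, y = make_windows(motor, eye, error_times=[5.0], fs=fs)
print(X.shape, int(y.sum()))  # exactly one window precedes the error by >= 1.5 s
```

The windows produced this way would feed the actual spatio-temporal models (GNNs, CNN-BiLSTMs, contrastive fusion encoders) mentioned above; the concatenation here stands in for those learned fusion layers.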

By replacing conventional eye trackers with an unobtrusive sensing earbud, this research goes beyond state-of-the-art methods in two ways: (1) it democratizes access to eye-movement-derived cognitive markers by eliminating costly, impractical equipment, and (2) it enables true multimodal cognitive-motor fusion, where gaze dynamics are directly aligned with hand actions and physiological states. The expected outcome is a set of real-time algorithms for error anticipation, directly deployable in adaptive operator-support systems. More broadly, the project will demonstrate how next-generation wearable technologies can reshape cognitive assessment and performance monitoring in high-stakes domains.



Required background: Engineering, Data science, Cognitive neuroscience

Type of work: 50% data science, 25% literature, 25% data collection

Supervisor: Klaas Bombeke

Co-supervisor: Lieven De Marez

Daily advisor: Lien De Bie

The reference code for this position is 2026-071. Mention this reference code on your application form.
