News & Posts

Visual Attention Analysis – The Power of Shared Gaze

Tobii Glasses III used with Prophea.X for simultaneous multi-subject recording

© Tobii

Multi-subject eyetracking—simultaneously recording the gaze of two or more individuals—unlocks new dimensions in studying social interaction, joint attention, and collaborative behavior. Traditional single-subject setups limit insights to individual gaze patterns, but dual or multi-subject systems reveal how people coordinate attention in real time, a critical factor in fields like piloting, human-robot collaboration (HRC), education, sports, psychology, and healthcare.

For example, research highlights the importance of mutual gaze (when two people look at each other) and shared gaze (when both focus on the same object) in early social development and task coordination. These metrics, only measurable with multi-subject setups, enable researchers to quantify how gaze synchrony shapes communication, learning, and teamwork.
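To make the shared-gaze metric concrete, here is a minimal sketch of how episodes of shared gaze could be extracted from two synchronized gaze streams. This is an illustration only, not Prophea.X's implementation: it assumes each stream is a list of `(timestamp_seconds, fixated_object)` samples at a common rate, where the object labels would come from AOI mapping in practice.

```python
# Illustration: detecting shared-gaze episodes from two synchronized streams.
# Assumed data format: time-sorted lists of (timestamp_s, fixated_object),
# sampled on a common clock; None means no fixation was classified.

def shared_gaze_episodes(stream_a, stream_b, min_duration=0.1):
    """Return (start, end, object) intervals where both subjects fixate
    the same object for at least min_duration seconds."""
    episodes = []
    start = None      # start time of the current shared episode
    current = None    # object both subjects are currently fixating
    for (t_a, obj_a), (t_b, obj_b) in zip(stream_a, stream_b):
        if obj_a is not None and obj_a == obj_b:
            if current != obj_a:  # a new shared episode begins
                if current is not None and t_a - start >= min_duration:
                    episodes.append((start, t_a, current))
                start, current = t_a, obj_a
        else:
            if current is not None and t_a - start >= min_duration:
                episodes.append((start, t_a, current))
            start, current = None, None
    if current is not None:  # close an episode still open at the end
        t_end = stream_a[-1][0]
        if t_end - start >= min_duration:
            episodes.append((start, t_end, current))
    return episodes
```

From such intervals, researchers can derive synchrony measures like total shared-gaze duration or latency between one subject's fixation and the partner's follow.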
Achieving visual attention analysis for multiple subjects requires, on a technical level, a flexible and lossless recording framework that handles scalable data flows without interference, delay, or other network- or computing-power-related constraints. What was once an extremely complex method requiring manual synchronization and enormous hardware setups has been streamlined by Ergoneers’ Prophea.X, unlocking new dimensions in the study of social interaction, joint attention, and collaborative behavior.

Multi-Subject Eyetracking

Key Applications in Behavior Research

1. Social Interaction & Development

  • Joint Attention: Multi-subject eyetracking operationalizes variables like mutual and shared gaze, which are essential for studying early childhood development, autism spectrum disorder (ASD), and social cognition. For instance, infants and caregivers exhibit synchronized gaze patterns during play, revealing how attention coordination shapes learning.
  • Dyadic Communication: In face-to-face interactions, dual eyetracking captures how speakers and listeners align their gaze, providing objective data on turn-taking, engagement, and nonverbal cues—critical for understanding disorders like ASD or improving interpersonal training.

2. Human Factors & Ergonomics

  • Human-Robot Collaboration (HRC): In manufacturing, multi-subject eyetracking helps optimize human-machine interactions by analyzing how workers visually coordinate with robots or each other. This data informs the design of safer, more intuitive workspaces and reduces cognitive load.
  • Team Performance: In high-stakes environments (e.g., aviation, surgery, sports), tracking team gaze patterns identifies bottlenecks in visual communication, improving training and interface design. For example, shared gaze on a control panel can reveal who is attending to critical information—and who isn’t.

3. Naturalistic & Ecological Validity

  • Real-World Tasks: Wearable, multi-subject eyetrackers enable studies in unrestricted environments (e.g., driving, cooking, or navigating crowds), where head movement and natural behavior are preserved. This addresses a key limitation of lab-based setups, which often restrict movement and ecological validity.
  • Group Dynamics: Beyond dyads, multi-subject setups (e.g., triple or quadruple) can analyze group attention patterns, such as in classrooms or collaborative problem-solving, though complexity increases with each added tracker.

Overcoming Challenges

While multi-subject eyetracking offers rich insights, it introduces technical and analytical complexities:

  • Scalability: Expanding from single-subject to multi-subject setups—especially in large groups or dynamic environments—requires robust network infrastructure and decentralized computing power to handle increased data volume, participant synchronization, and real-time processing without compromising accuracy or reliability. Traditional systems struggle with scalability, limiting studies to small, controlled groups or to post-recording cloud synchronization that lacks live-monitoring capabilities. Prophea.X addresses this by supporting unlimited simultaneous subjects and modular integration of devices, enabling large-scale studies in naturalistic settings like classrooms, cockpits, and urban intersections, as well as in simulators with direct data connectivity.
  • Connectivity/Flexibility: Multi-subject and multi-modal setups demand seamless network connectivity between diverse hardware with varying input frequencies (e.g., different eyetrackers, EEGs, biometric sensors, simulator data, cameras) and software platforms. Rigid or proprietary systems create silos, complicating data fusion and cross-device communication. Ergoneers breaks these barriers with a unified Prophea data engine that integrates any sensor or device—from dedicated eye-tracking wearables and glasses to environmental inputs—into a single, flexible workflow, whether in a lab, field, or hybrid environment.
  • Synchronization: Aligning gaze data across subjects requires precise timing to avoid artifacts and misleading interpretation. Prophea.X is currently the only solution offering real-time monitoring of any connected data stream together with GPU-powered offline data handling, for analysis from in-lab to off-the-grid applications.
  • Data Analysis: Manual coding of areas of interest (AOIs) in multi-subject setups used to be the single most time-consuming bottleneck. Integrating machine vision into Prophea.X for automated Project-AOI recognition across entire experiment stacks streamlines this process and cuts turnaround times to a fraction.
  • Ethical Considerations and Comfort: In populations like children or patients, minimizing movement restrictions is crucial, favoring high-precision, lightweight wearables like Tobii Glasses III.
  • 3D Spatial Mapping: To make gaze-path plots, heatmaps, and automatically detected AOIs reliable independently of head movements, Prophea.X uses AI-powered machine vision to map the gaze behavior of multiple subjects precisely onto the environment.
Driver and passenger POV with gaze points visualizing each person’s center of attention; areas of interest can be marked and detected automatically.
Driver & Passenger Gaze Analysis for Interaction, Performance, or UX Optimization.
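The synchronization challenge described above boils down to mapping samples from independently clocked recorders onto one shared timeline. The following sketch shows one simple approach, nearest-neighbor alignment within a tolerance window; this is an illustrative assumption, not Prophea.X's actual mechanism, which works with synchronized live data streams.

```python
# Illustration: aligning samples from one recorder onto a reference timeline.
# Assumed data format: a time-sorted list of (timestamp_s, sample) pairs.
import bisect

def align_to_reference(reference_times, stream, tolerance=0.02):
    """For each reference timestamp, pick the nearest sample in `stream`
    within `tolerance` seconds, else None (treated as a dropped frame)."""
    times = [t for t, _ in stream]
    aligned = []
    for t_ref in reference_times:
        i = bisect.bisect_left(times, t_ref)
        # The nearest neighbor is either just before or just after t_ref.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        best = min(candidates, key=lambda j: abs(times[j] - t_ref), default=None)
        if best is not None and abs(times[best] - t_ref) <= tolerance:
            aligned.append(stream[best][1])
        else:
            aligned.append(None)
    return aligned
```

Offline alignment like this only works if all recorders share a trustworthy clock; drift between device clocks is exactly why hardware- or network-level synchronization matters in multi-subject recording.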

The Future: Scalability & Integration with Prophea.X

Advances in wearable, wireless eyetrackers and AI-driven analysis are making multi-subject setups more accessible. As technology improves, researchers can expect:

  • Larger group studies (e.g., classrooms, teams) with reduced setup complexity.
  • Integration with other biometrics (e.g., EEG, heart rate) for a holistic view of behavior and cognition.
  • Real-time feedback in applied settings, such as adaptive training or human-robot interfaces.

Prophea.X stands at the forefront of this evolution. As a scalable, multimodal research platform, it empowers labs to seamlessly connect multiple subjects and devices—from eye-tracking wearables like Tobii Glasses III to EEG and video signals—within a single, unified workflow. By automating gaze analysis, synchronizing data streams, and enabling real-time interaction studies, Prophea.X eliminates the bottlenecks of traditional setups. Researchers can now decode group dynamics, gaze behavior, and UX interactions with unprecedented precision, speed, and scalability—whether in automotive safety, classroom engagement, or safety-critical environments. With Prophea.X, the future of multi-subject eyetracking isn’t just about collecting data; it’s about unlocking the full complexity of human behavior in real-world, multi-human contexts.

Conclusion

Multi-subject eyetracking bridges the gap between individual cognition and social behavior, offering unprecedented insights into how humans interact with each other and their environment. By capturing the dynamic, reciprocal nature of gaze, it empowers researchers to design better interventions, tools, and spaces—ultimately advancing both theoretical understanding and practical applications in behavior research and human factors. With Prophea.X, the field is poised to scale new heights, turning raw gaze data into actionable, transformative insights.

Learn more – Request a free personal consultation!
