Unlocking Multi-Subject Behavior and Interaction Insights
Eye Tracking of Multiple Subjects in One Take – Prophea.X and Tobii Redefine Behavior Observation
Eye tracking in behavior research is evolving from isolated, single-subject studies toward more naturalistic, complex setups. In fields like automotive safety, human-machine interaction, and UX design, Prophea.X unlocks a deeper understanding of how users behave in dynamic, multi-layered environments. Mobile eye trackers offer the flexibility to capture real-world gaze behavior, and scaling their use to multiple subjects working or driving in parallel introduces a powerful shift in methodology. Meet Prophea.X.
Introduction
The rising complexity of modern user environments—especially in automotive, UX, and simulator-driven research—calls for scalable, realistic, and precise behavior analysis methods. Parallel multi-subject studies using mobile eye trackers provide an innovative solution by enabling synchronized, high-throughput gaze data collection. When extended with multimodal sensor integration, this approach reveals deep, holistic insights into human attention, cognitive states, and interaction dynamics. This white paper explores the methodology, benefits, challenges, and future outlook of such parallel and multimodal studies, emphasizing applications in road safety, human-machine interaction (HMI), and user interface (UI) research.
Prophea.X, developed by Ergoneers, is purpose-built to support such advanced study designs. With native support for multi-subject setups, flexible sensor integration, and a modular architecture, Prophea.X empowers researchers to implement complex synchronized studies at scale—backed by Ergoneers’ deep domain expertise in behavioral research. Prophea.X is compatible with any Ergoneers or Tobii eye tracker, so research labs can rely on industry-standard hardware or integrate additional devices via the LSL protocol.
Parallel multi-subject setups allow researchers to collect data from several participants simultaneously, thereby increasing ecological validity, statistical power, and research efficiency. When enriched with synchronized sensor streams—such as EEG, GSR, driving data, and simulator events—these studies become truly multimodal, capturing not only where participants look but also how they feel and why they respond.
With its integrated approach to multi-user observation, Prophea.X greatly simplifies the execution of parallel eye-tracking studies. Ergoneers’ extensive track record in automotive research, particularly in complex simulator and real-traffic setups, ensures that both hardware and software components are aligned for reliability, usability, and extensibility. Prophea.X offers plug-and-play multi-subject eye tracking for Tobii and Dikablis eye-tracking hardware.
Eye Tracker System for Parallel Multi-Subject Recording and Analysis
Method and Advantages
In a parallel multi-subject study, multiple participants wear mobile eye trackers while engaging in the same or complementary tasks. Whether it’s navigating a driving simulator, interacting with a new automotive UI, or testing collaborative interfaces, data is collected in a synchronized fashion.
Examples of Proven Use Cases
- Automotive Simulators: Drivers engage in coordinated scenarios (e.g., highway merges, sudden braking) to assess gaze behavior and interaction with HMI elements.
- Collaborative UX Evaluation: Multiple users interact with shared interfaces (e.g., infotainment systems, AR HUDs) to evaluate collective usability.
- Ship Bridge Simulators: Crews navigate complex maritime environments to assess shared situational awareness, gaze allocation during maneuvering, and team coordination under varying sea and visibility conditions.
- Flight Deck Simulators: Pilots and copilots are monitored during high-stress scenarios (e.g., emergency landings, turbulence) to analyze attention shifts, cognitive load, and communication timing.
- Control Room Stress Monitoring: Operators in electricity or traffic control rooms are observed in real time during simulated outages or load peaks to analyze physiological stress, decision timing, and task-switching behaviors.
Benefits of Multi-Subject Eye Tracking and Multimodal Observation with Prophea.X
Higher Throughput
Running multiple participants simultaneously significantly reduces per-subject session time and resource consumption. Instead of sequentially repeating identical tasks, researchers can gather data from an entire group in one coordinated take—accelerating study cycles and reducing operational costs.
Natural Group Dynamics
Multi-subject setups allow researchers to observe spontaneous interactions, peer influence, and collaborative decision-making in realistic environments. This is essential for studies on teamwork, social behavior, or shared interface use—where isolated single-user data would miss key dynamics.
Improved Comparability
Synchronized data collection ensures that all participants are exposed to identical stimuli, timing, and environmental conditions. This allows for highly controlled cross-subject comparisons, supports split-group analyses, and increases the reliability of conclusions about user behavior.
Statistical Validity
Parallel data capture supports larger sample sizes without linear increases in lab time. This enhances the statistical power of studies and enables more sophisticated experimental designs—such as factorial experiments, between-group comparisons, or longitudinal cohort studies under consistent conditions.
Prophea.X supports synchronous recording across multiple mobile eye trackers, managing their calibration, live streaming, and data alignment seamlessly. Ergoneers’ technology ensures stability in multi-user environments, providing robust tools for synchronized visualizations, live feedback, and efficient post-processing.
The tech behind
Synchronization and Data Fusion: The Key Enablers
Precision in synchronization is critical when conducting multimodal, multi-subject research. To ensure that all data streams align seamlessly in time, Prophea.X leverages robust technologies such as Lab Streaming Layer (LSL), Network Time Protocol (NTP), Robot Operating System (ROS), and UDP/IP communication. These protocols allow for millisecond-level, and in many configurations sub-millisecond, temporal alignment across heterogeneous devices and platforms. By exposing meaningful correlations between visual attention, physiological states, user inputs, and system events, Prophea.X is designed to deliver not just the data but also the analysis modules needed for 360° insight generation.
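The clock alignment that protocols like NTP and LSL rely on can be sketched with the classic four-timestamp offset calculation. The following is a minimal, illustrative Python example of that principle only; it is not Prophea.X code, and the function name is an assumption:

```python
# Illustrative sketch of NTP-style clock offset estimation (not Prophea.X code).
def estimate_offset(t1, t2, t3, t4):
    """Classic NTP offset/delay from four timestamps:
    t1: request sent (client clock), t2: request received (server clock),
    t3: reply sent (server clock),   t4: reply received (client clock)."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0  # server clock minus client clock
    delay = (t4 - t1) - (t3 - t2)           # round-trip network delay
    return offset, delay

# Example: server clock runs 0.5 s ahead, ~10 ms one-way network delay.
offset, delay = estimate_offset(t1=100.000, t2=100.510, t3=100.511, t4=100.021)
print(round(offset, 3), round(delay, 3))  # 0.5 0.02
```

Once the offset between each device clock and a reference clock is known, every sample's timestamp can be shifted onto a common timebase before fusion.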
Synchronized data sources may include:
- Biophysical Sensors – Devices such as EEG (to monitor cognitive load and engagement), ECG (for heart rate and cardiac stress), and GSR (to track emotional arousal) provide critical insights into the internal state of the user, especially under stress or during rapid decision-making.
- Simulator Data – Inputs like steering wheel angles, pedal pressure, gear changes, and precise timestamps for scenario events (e.g., a hazard appearing) allow researchers to link user behavior directly to system stimuli and performance metrics.
- Environmental Sensors – External context data from GPS (location tracking), IMU (motion/orientation), and CAN bus (vehicle telemetry) enhance situational understanding and enrich the behavioral dataset, especially in mobility or transport scenarios.
- System State Logs – Detailed records of user interface interactions, system alerts, and backend processes provide a ground truth for what the user was exposed to, and how the system responded in real time.
Together, these synchronized streams enable deep temporal correlation between what a user sees, feels, thinks, and does—forming the foundation for advanced behavior modeling, system validation, and human-centered design.
Data fusion—the alignment and integration of these sources—unlocks multidimensional analysis: what participants saw, when they saw it, how they reacted, and what physiological or cognitive responses occurred simultaneously.
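Once all streams share a common timebase, the core fusion step reduces to matching samples across streams by timestamp. A minimal sketch in plain Python, assuming two already-aligned streams of (timestamp, value) pairs; this is illustrative only, not the Prophea.X API:

```python
# Illustrative nearest-timestamp fusion of two synchronized streams.
import bisect

def fuse_nearest(gaze, gsr):
    """For each gaze sample (t, value), attach the GSR sample whose
    timestamp is closest. Both lists must be sorted by time."""
    gsr_times = [t for t, _ in gsr]
    fused = []
    for t, g in gaze:
        i = bisect.bisect_left(gsr_times, t)
        # pick the nearer of the two neighbouring GSR samples
        candidates = [j for j in (i - 1, i) if 0 <= j < len(gsr)]
        j = min(candidates, key=lambda k: abs(gsr_times[k] - t))
        fused.append((t, g, gsr[j][1]))
    return fused

gaze = [(0.00, "road"), (0.10, "mirror"), (0.20, "hazard")]
gsr = [(0.02, 1.1), (0.11, 1.3), (0.19, 2.8)]
print(fuse_nearest(gaze, gsr))
```

Real pipelines typically add interpolation and a maximum matching tolerance, but the nearest-neighbour join above is the basic operation behind aligning gaze, biosignal, and simulator events.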
Prophea.X is designed around a modular synchronization framework, supporting low-latency data streaming and integration with external devices. Ergoneers’ experience in pushing technical boundaries, from the Dikablis eye-tracking system and the original D-Lab software through to Prophea.X and its integrated APIs, allows seamless fusion of behavioral, physiological, and environmental data—ensuring every signal is both time-aligned and context-aware.
Expand the Spheres
Turning Parallel Eye Tracking into Infinitely Scalable Multimodal Insight
The true potential of parallel eye tracking unfolds when integrated into a multimodal research framework. In a representative study, multiple drivers in a driving simulator are simultaneously equipped with mobile eye trackers, EEG caps, GSR sensors, HR monitors, and motion tracking systems. As they encounter a simulated hazard—such as a pedestrian crossing unexpectedly or a vehicle suddenly braking—researchers gain synchronized, high-resolution insight into the full cognitive and physiological experience of each participant.
This biophysical sensor setup enables real-time access to:
- Visual Attention Patterns – Eye trackers reveal where each driver looks before, during, and after the hazard, helping identify visual anticipation, distractions, or blind spots.
- Stress Indicators – GSR and HRV measurements reflect emotional arousal and physiological stress response, offering insights into how drivers cope under pressure.
- Brain Activity – EEG captures cognitive workload, decision latency, and potential overload, enabling assessment of how mentally demanding the scenario is.
- Steering Behavior – Telemetry from the simulator provides precise data on reaction time, lane stability, braking force, and corrective maneuvers.
Applications of this integrated, parallel multimodal setup include:
- Road Safety Research – Understand how different drivers detect and respond to hazards, predict risk-prone behavior, and assess the real-world effectiveness of ADAS (Advanced Driver Assistance Systems) interventions.
- UX/UI Evaluation – Test new dashboard layouts, AR HUDs, or voice controls under multitasking and high-stress conditions to determine their clarity, accessibility, and impact on driver performance.
- Human-Machine Interaction – Evaluate the trust and engagement levels drivers have with partially or fully automated systems, particularly how they transition control and react when autonomy fails.
Prophea.X natively supports multimodal sensor streams and provides unified access to eye tracking, vehicle dynamics, simulator events, and biophysical data. Ergoneers’ extensive library of synchronization protocols and experience in multi-sensor studies makes it the platform of choice for extracting deep behavioral insights across user groups. Our team of research engineers has helped leading institutions and labs around the globe to pursue their research objectives.
Challenges and Considerations
While promising, parallel and multimodal setups come with challenges:
Calibration consistency across users
Prophea.X supports harmonized calibration workflows to ensure that data from multiple participants—such as gaze or biosignal inputs—align accurately across sessions. This minimizes inter-user variability and enhances the validity of comparative or group-level analyses. Still, calibration must be carefully standardized to maintain precision at scale.
Signal interference and environmental noise
Multimodal setups increase susceptibility to artifacts and interference, especially when multiple devices operate concurrently. Prophea.X includes advanced filtering and synchronization tools to detect and mitigate such disturbances. However, controlled environments and careful hardware integration remain essential for clean data.
Managing and storing large, complex datasets
With simultaneous data streams from several subjects and sensors, data volume can grow rapidly. Prophea.X offers scalable data handling to prevent latency issues and structured storage to manage this complexity. Researchers must nonetheless plan for high-throughput recording, storage, and efficient post-processing pipelines.
Coordinating group sessions
Parallel testing requires precise logistical planning to ensure all participants are equipped, briefed, and synchronized. Tools like automated start triggers and centralized control do help streamline the process. Collaborative workflows using one file standard allow project handling on a global scale. Prophea.X simplifies technical execution but doesn’t replace the need for tight session orchestration.
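As one hypothetical illustration of an automated start trigger, a controller could broadcast a single UDP datagram that every recording station waits for, so all sessions begin on a common cue. The port, address, and message format below are assumptions for the sketch, not the Prophea.X protocol:

```python
# Hypothetical UDP start trigger: one controller, many listening stations.
# Port and message format are illustrative assumptions.
import socket
import time

def send_start_trigger(port=9999, host="127.0.0.1"):
    """Controller side: send a single 'START <timestamp>' datagram."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    msg = f"START {time.time():.6f}".encode()
    sock.sendto(msg, (host, port))  # a broadcast address in a real LAN setup
    sock.close()
    return msg

def wait_for_trigger(port=9999, timeout=5.0):
    """Station side: block until the trigger arrives, then start recording."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.bind(("127.0.0.1", port))
    data, _ = sock.recvfrom(1024)
    sock.close()
    return data.decode().split()  # ["START", "<timestamp>"]
```

Embedding the controller's send time in the message lets each station log the cue against its own clock, which complements the clock-offset correction done by the synchronization layer.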
Ensuring synchronized task onset and consistent instructions
Consistency in stimuli and task timing is critical in multi-subject designs. Prophea.X supports unified task triggering and stimulus presentation across devices. Even so, researchers must standardize instruction delivery and timing protocols to ensure reliable group-level comparisons.
Informed consent across multiple data modalities
Multimodal data collection raises the bar for ethical transparency. With Prophea.X, researchers can clearly document which modalities are being recorded and how data will be used. Nonetheless, consent procedures must be comprehensive and modality-specific to respect participant autonomy.
Ensuring participant privacy in group recordings
Group-based video or physiological recordings introduce new privacy concerns, such as unintended capture of bystanders or cross-participant visibility. Prophea.X allows for modular data segmentation and anonymization, but researchers must proactively manage privacy boundaries and storage policies.
Prophea.X addresses security challenges with fully transparent on-site computing behind your firewall.
Flexible calibration workflows and real-time monitoring tools produce transparent and reliably structured data pipelines. Ergoneers brings nearly two decades of experience in secure data management, participant privacy protocols, and ergonomic system design—helping researchers navigate both the technical and ethical dimensions of large-scale studies.
For further information and individual support for your research project, do not hesitate to contact us!
Multi-Subject Eye Tracking
Future Perspectives
As AI-powered analytics evolve, parallel and multimodal eye tracking studies are becoming more automated, intelligent, and real-time.
Emerging Directions:
Real-time adaptive simulator feedback based on group behavior
Prophea.X enables real-time analysis of synchronized multimodal data from multiple users, allowing simulators to adapt dynamically to collective behavior. This facilitates realistic training scenarios or system stress tests in shared virtual environments. Behavioral patterns, stress markers, and gaze dynamics can all inform system feedback instantly.
Multisubject validation of autonomous driving systems
By capturing data from several test subjects simultaneously, Prophea.X allows parallel evaluation of perception, decision-making, and reaction consistency across different drivers. This is key for validating autonomous driving systems under varied human responses. Synchronization across biosignals, eye tracking, and vehicle data ensures robust safety and UX assessment.
Integrated AR/VR HMI testing with physiological feedback
Prophea.X supports immersive AR/VR test environments by seamlessly combining gaze, biophysiological, and interaction data in real time. Researchers can evaluate HMI concepts based on cognitive load, attention, and emotional states. This enables iterative design improvements driven by real human responses in simulated conditions.
Dynamic workload balancing in multi-user interfaces
Using Prophea.X, developers can monitor physiological and behavioral indicators across teams to assess mental load and task distribution in collaborative systems. This makes it possible to detect overload or disengagement and adapt interfaces or task flows accordingly. The result is optimized performance and reduced error rates in critical multi-user setups.
Prophea.X is actively driving these trends. Its extensible API and AI-ready data structure, combined with powerful UDP/IP and LSL protocol interfaces, allow integration of even the most complex multi-subject, multimodal data before processing with our emerging machine-learning and real-time decision-making systems. With Ergoneers’ long-term roadmap rooted in future-forward R&D, users can count on Prophea.X to help define the next generation of behavior research lab solutions.
Conclusion
Parallel multi-subject eye tracking, when integrated with multimodal sensor data, represents a transformative leap in behavior research. It provides rich, synchronized, and context-aware insights that were previously difficult to achieve. For industries focused on road safety, UX design, and human-machine interaction, this methodology enables more realistic, scalable, and insightful evaluations—empowering smarter systems and safer interfaces.
Prophea.X by Ergoneers offers the technical foundation, flexibility, and robustness needed to realize this vision. Its support for parallel, multimodal studies—combined with Ergoneers’ deep expertise in applied behavior research—makes it the optimal choice for forward-thinking research teams across automotive, simulation, and UX domains.
For more information and individual consultations, contact us directly or connect with your local Tobii reseller.
Boost your lab for the future
Individual requirements meet our flexible pricing.
Request a consultation today!
600+ and counting
Explore real studies using Ergoneers equipment