Publication Hub Archive

Dikablis Glasses

You have reached the Ergoneers Publication Hub for the Dikablis Glasses.


Total results: 510

Dynamic driving risk in highway tunnel groups based on pupillary oscillations

Year: 2024

Authors: H Zheng, Z Du, S Wang

This study aims to understand the dynamic changes in driving risks in highway tunnel groups. Real-world driving experiments were conducted, collecting pupil area data to measure pupil size oscillations using the Percentage of Pupil Area Variable (PPAV) metric. The analysis focused on investigating relative pupil size fluctuations to explore trends in driving risk fluctuations within tunnel groups. The objective was to identify accident-prone areas and key factors influencing driving risks, providing insights for safety improvements. The findings revealed an overall “whipping effect” phenomenon in driving risk changes within tunnel groups. Differences were observed between interior tunnel areas and open sections, including adjacent, approach, and departure zones. Higher driving risks were associated with locations closer to the tail end of the tunnel group and shorter exit departure sections. Targeted safety improvement designs should consider fluctuation patterns in different directions, with attention to tunnels at the tail end. In open sections, increased travel distance and longer upstream and downstream tunnels raised driving risks, while longer open zones reduced driving risks. Driving direction and sequence had minimal impact on risks. By integrating driver vision, tunnel characteristics, and the environment, this study identified high-risk areas and critical factors, providing guidance for monitoring and mitigating driving risks in tunnel groups. The findings have practical implications for the operation and safety management of tunnel groups.
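
The abstract does not spell out how PPAV is computed. Purely as an illustrative sketch, the snippet below computes a windowed, percentage-style measure of relative pupil-area fluctuation; the window length, threshold, and formula are assumptions made for illustration, not the authors' published definition.

```python
import numpy as np

def ppav_like_metric(pupil_area, fs=60, window_s=5.0, threshold=0.1):
    """Hypothetical PPAV-style metric: for each window, the percentage of
    samples whose pupil area deviates from the window mean by more than
    `threshold` (relative). NOT the published PPAV definition."""
    pupil_area = np.asarray(pupil_area, dtype=float)
    win = int(window_s * fs)
    scores = []
    for start in range(0, len(pupil_area) - win + 1, win):
        seg = pupil_area[start:start + win]
        mean = seg.mean()
        if mean > 0:
            rel_dev = np.abs(seg - mean) / mean
            scores.append(100.0 * np.mean(rel_dev > threshold))
    return np.array(scores)  # one percentage value per window

# Example with synthetic data: 60 Hz pupil-area samples over 30 s of driving
rng = np.random.default_rng(0)
area = 20 + rng.normal(0, 2, size=60 * 30)
print(ppav_like_metric(area))
```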

Eye Tracking Glasses
Simulator

6 versions available

Evaluation of driver’s situation awareness in freeway exit using backpropagation neural network

Year: 2024

Authors: Y Yang, Y Chen, SM Easa, J Lin, M Chen

Drawing on relevant studies of situation awareness (SA), this paper integrated multiple indicators, including eye movement, electroencephalogram (EEG), and driving behavior, to evaluate SA. SA is typically divided into three stages: perception, understanding, and prediction. This paper used eye movement indicators to represent perception, EEG indicators to represent understanding, and driving behavior indicators to represent prediction. After the evaluation indicators were identified, a driving simulation experiment was designed to collect data on them. Forty-one subjects were recruited to participate in the investigation, and 9 groups of data were collected from each subject. After removing 4 groups of invalid data, 365 groups of valid data were obtained. Grey correlation analysis was used to optimize the SA indicators, and 10 SA evaluation indicators were finally determined: the average fixation duration, the nearest neighbor index, pupil area, the percentage power spectral density values of the three rhythmic waves (θ, α, β), the rhythmic-wave energy combination parameter (α/θ), mean speed, and the SDs of speed and acceleration. Taking the 10 optimized indicators as input and the SA scores as output, a backpropagation neural network with a 10-8-1 topology was constructed. 75% of the data were randomly selected for model training, and the final training mean square error was 0.0025. Using the remaining 25% of the data for verification, the average absolute error and average relative error of the predicted results were 0.248 and 0.046, respectively. This showed that the model was effective and that SA can feasibly be evaluated from eye movement, EEG, and driving behavior data.
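
The network described above (10 inputs, one hidden layer of 8 units, 1 output) can be sketched in a few lines. The snippet below is a minimal stand-in using scikit-learn and synthetic data; the toolchain, activation function, and training settings are assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, mean_absolute_error

# Synthetic stand-in for the 365 valid samples: 10 SA indicators -> 1 SA score
rng = np.random.default_rng(42)
X = rng.normal(size=(365, 10))                                   # 10 optimized SA indicators
y = X @ rng.normal(size=10) + rng.normal(scale=0.1, size=365)    # SA scores

# 75% training / 25% verification split, as in the abstract
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.75, random_state=0)

# 10-8-1 topology: 10 inputs, one hidden layer of 8 units, 1 output
model = MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                     solver="lbfgs", max_iter=5000, random_state=0)
model.fit(X_tr, y_tr)

print("train MSE:", mean_squared_error(y_tr, model.predict(X_tr)))
print("test  MAE:", mean_absolute_error(y_te, model.predict(X_te)))
```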

Eye Tracking Glasses
Simulator

5 versions available

Exploring the occupational fatigue risk of short-haul truck drivers: effects of sleep pattern, driving task, and time-on-task on driving behavior and eye-motion metrics

Year: 2024

Authors: C Zhang, Y Ma, S Chen, J Zhang, G Xing

Driver fatigue is the leading cause of truck-related accidents. The most significant occupational fatigue factors among short-haul truck drivers are sleep patterns, the round-trip driving task, and the time-on-task. However, the underlying mechanisms of these influential factors remain unclear. This study aims to explore the interactive effects of sleep patterns, driving task, and time-on-task on driving behavior and eye-motion metrics among short-haul truck drivers. We obtained test data from eleven professional short-haul truck drivers, with each driver participating in a three-day test under the conditions of two driving tasks and three different sleep patterns. We applied three-way repeated-measures ANOVA and non-parametric tests to analyze the data. The results reveal that: (1) violation of sleep-related legal requirements, insufficient sleep, and unreasonable time-on-task can have negative effects on short-haul truck drivers' vigilance and driving performance; (2) both driving task and sleep pattern contribute to driver fatigue, and the interaction of time-on-task and sleep pattern exacerbates driver fatigue more than any single factor alone; and (3) short-haul truck drivers who are sleep deprived exhibit short periods of controlled compensatory behavior during the outbound task, and sleepiness is more prevalent during the inbound task than the outbound task due to the monotony and low workload of the driving process. These findings provide theoretical and practical guidance for transportation industry managers to strengthen company-wide fatigue-related regulations, ensure adequate sleep for drivers, and optimize work schedules to improve safety outcomes for short-haul truck drivers.
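
A three-way repeated-measures ANOVA of the kind mentioned above can be run with statsmodels, provided the design is balanced. The sketch below uses synthetic data with assumed factor levels (three sleep patterns, two tasks, three time-on-task blocks) and a hypothetical fatigue indicator; it is not the authors' analysis script.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Synthetic, balanced stand-in: 11 drivers x 3 sleep patterns x 2 tasks x 3 time blocks
rng = np.random.default_rng(1)
sleep_levels = ["normal", "restricted", "deprived"]
task_levels = ["outbound", "inbound"]
time_levels = ["early", "middle", "late"]

rows = [
    {"driver": d, "sleep": s, "task": t, "time": b,
     # hypothetical fatigue indicator, e.g. PERCLOS or SD of lane position
     "fatigue": rng.normal(loc=1.0, scale=0.2)}
    for d in range(11) for s in sleep_levels for t in task_levels for b in time_levels
]
df = pd.DataFrame(rows)

# Three-way repeated-measures ANOVA: all three factors vary within subjects
res = AnovaRM(df, depvar="fatigue", subject="driver",
              within=["sleep", "task", "time"]).fit()
print(res)
```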

Eye Tracking Glasses
Simulator

5 versions available

Gaze alternation predicts inclusive next-speaker selection: evidence from eyetracking

Year: 2024

Authors: C Rühlemann

Next-speaker selection refers to the practices conversationalists rely on to designate who should speak next. Speakers have various methods available to them to select a next speaker. Certain actions, however, systematically co-select more than one particular participant to respond. These actions include asking “open-floor” questions, which are addressed to more than one recipient and which more than one recipient is eligible to answer. Here, next-speaker selection is inclusive. How are these questions multimodally designed? How does their multimodal design differ from the design of “closed-floor” questions, in which just one participant is selected as next speaker and where next-speaker selection is exclusive? Based on eyetracking data collected in naturalistic conversation, this study demonstrates that, unlike closed-floor questions, open-floor questions can be predicted from the speaker’s gaze alternation during the question. The discussion highlights cases of gaze alternation in open-floor questions and exhaustively explores deviant cases in closed-floor questions. It also addresses the functional relation of gaze alternation and gaze selection, arguing that the two selection techniques may collide, creating disorderly turn-taking due to a fundamental change in the participation framework from focally dyadic to inclusive. Data are in British and American English.

Eye Tracking Glasses
Software

1 version available

GazeAway: Designing for Gaze Aversion Experiences

Year: 2024

Authors: N Overdevest, R Patibanda, A Saini

Gaze aversion is embedded in our behaviour: we look at a blank area to support remembering and creative thinking, and as a social cue that we are thinking. We hypothesise that a person's gaze aversion experience can be mediated through technology, in turn supporting embodied cognition. In this design exploration we present six ideas for interactive technologies that mediate the gaze aversion experience. We developed one of these ideas into “GazeAway”: a prototype that swings a screen into the wearer's field of vision when they perform gaze aversion. Six participants experienced the prototype and, based on their interviews, we found that GazeAway changed their gaze aversion experience in three ways: increased awareness of gaze aversion behaviour, novel cross-modal perception of gaze aversion behaviour, and changed gaze aversion behaviour to suit social interaction. We hope that, ultimately, our design exploration offers a starting point for the design of gaze aversion experiences.

Eye Tracking Glasses
Software

3 versions available

GazeTrak: Exploring Acoustic-based Eye Tracking on a Glass Frame

Year: 2024

Authors: K Li, R Zhang, B Chen, S Chen, S Yin

In this paper, we present GazeTrak, the first acoustic-based eye tracking system on glasses. Our system only needs one speaker and four microphones attached to each side of the glasses. These acoustic sensors capture the formations of the eyeballs and the surrounding areas by emitting encoded inaudible sound towards the eyeballs and receiving the reflected signals. These reflected signals are further processed to calculate the echo profiles, which are fed to a customized deep learning pipeline to continuously infer the gaze position. In a user study with 20 participants, GazeTrak achieves an accuracy of 3.6° within the same remounting session and 4.9° across different sessions, with a refresh rate of 83.3 Hz and a power signature of 287.9 mW. Furthermore, we report the performance of our gaze tracking system fully implemented on an MCU with a low-power CNN accelerator (MAX78002). In this configuration, the system runs at up to 83.3 Hz and has a total power signature of 95.4 mW at a frame rate of 30 Hz.
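
The paper's exact signal-processing chain is only summarized here. As a generic illustration of the echo-profile idea, the sketch below cross-correlates a transmitted inaudible chirp with a simulated microphone signal; the sample rate, frequency band, and delays are invented for the example and are not GazeTrak's parameters.

```python
import numpy as np
from scipy.signal import chirp, correlate

fs = 50_000                       # assumed sample rate (Hz)
t = np.arange(0, 0.005, 1 / fs)   # one 5 ms transmit frame
tx = chirp(t, f0=18_000, t1=t[-1], f1=21_000)   # assumed inaudible sweep band

# Simulate a received signal: two attenuated, delayed copies of the sweep plus noise
rx = np.zeros_like(tx)
for delay, gain in [(30, 0.6), (55, 0.3)]:       # delays in samples (invented)
    rx[delay:] += gain * tx[:-delay]
rx += 0.02 * np.random.default_rng(0).normal(size=rx.size)

# Echo profile: magnitude of the cross-correlation of receive against transmit;
# peaks correspond to reflections arriving over different path lengths
echo_profile = np.abs(correlate(rx, tx, mode="full"))[tx.size - 1:]
print("strongest echo at sample lag:", int(np.argmax(echo_profile)))
```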

Eye Tracking Glasses
Software

3 versions available

Guiding gaze gestures on smartwatches: Introducing fireworks

Year: 2024

Authors: W Delamare, D Harada, L Yang, X Ren

Smartwatches enable interaction anytime and anywhere, with both digital and augmented physical objects. However, situations with busy hands can prevent user inputs. To address this limitation, we propose Fireworks, an innovative hands-free alternative that empowers smartwatch users to trigger commands effortlessly through intuitive gaze gestures by providing post-activation guidance. Fireworks allows command activation by guiding users to follow targets moving from the screen center to the edge, mimicking real-life fireworks. We present the experimental design and evaluation of two Fireworks instances. The first design employs temporal parallelization, displaying a few dynamic targets during microinteractions (e.g., snoozing a notification while cooking). The second design sequentially displays targets to support more commands (e.g., 20 commands), ideal for scenarios other than microinteractions (e.g., turning on lights in a smart home). Results show that Fireworks’ single straight gestures enable faster and more accurate command selection compared to state-of-the-art baselines, namely Orbits and Stroke. Additionally, participants expressed a clear preference for Fireworks’ original visual guidance.

Eye Tracking Glasses
Software

4 versions available

Head-mounted eye tracker videos and raw data collected during breathing recognition attempts in simulated cardiac arrest

Year: 2024

Authors: M Pedrotti, M Stanek, L Gelin, P Terrier

This paper presents data collected by Pedrotti et al. (2022, 2024) [1][2], which includes videos captured using a Dikablis head-mounted eye tracker (Ergoneers GmbH, Germany), along with the corresponding raw data. The data collection aimed to assess participants' ability to recognize breathing in a simulated cardiac arrest scenario. Equipped with the eye tracker, participants entered a room where a manikin was positioned on the floor. Their task was to determine if the manikin was breathing and respond accordingly, such as initiating cardiopulmonary resuscitation if the victim was not breathing. Our analysis focused on examining looking time on the manikin's thorax by inspecting the videos. Potential applications of the dataset [3] include identifying fixation and saccades using custom algorithms, analyzing pupil diameter data, and conducting secondary analyses involving participant characteristics like age and gender as independent variables.
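
The dataset description mentions identifying fixations and saccades with custom algorithms. One common, generic approach (not necessarily the one used with this dataset) is a velocity-threshold (I-VT) classifier; the sampling rate and velocity threshold below are assumptions.

```python
import numpy as np

def ivt_classify(x_deg, y_deg, fs=60, velocity_threshold=30.0):
    """Label each gaze sample as fixation (False) or saccade (True) using a
    simple velocity-threshold (I-VT) rule. x_deg/y_deg are gaze angles in
    degrees; velocity_threshold is in deg/s. Rate and threshold are assumed."""
    x = np.asarray(x_deg, dtype=float)
    y = np.asarray(y_deg, dtype=float)
    vx = np.gradient(x) * fs
    vy = np.gradient(y) * fs
    speed = np.hypot(vx, vy)           # angular velocity in deg/s
    return speed > velocity_threshold  # True = saccade sample

# Example with synthetic gaze data: a fixation, a rapid shift, another fixation
x = np.concatenate([np.full(30, 1.0), np.linspace(1, 8, 5), np.full(30, 8.0)])
y = np.zeros_like(x)
is_saccade = ivt_classify(x, y)
print("saccade samples:", np.flatnonzero(is_saccade))
```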

Eye Tracking Glasses
Simulator

2 versions available

Image-Analysis-Based Method for Exploring Factors Influencing the Visual Saliency of Signage in Metro Stations

Year: 2024

Authors: M Yin, X Zhou, S Yang, H Peng, C Li

Many studies have been conducted on the effects of colour, light, and signage location on the visual saliency of underground signage. However, few studies have investigated the influence of indoor visual environments on the saliency of pedestrian signage. To explore the factors that influence the visual saliency of signage in metro stations, we developed a novel analysis method using a combination of saliency and focus maps. Then, questionnaires were utilised to unify the various formats of results from the saliency and focus maps. The factors that influence the visual saliency of signage were explored using the proposed method at selected sites and validated through virtual reality experiments. Additionally, this study proposes an image-analysis-based method that reveals the multilevel factors affecting pedestrian attention to signage in underground metro stations, including spatial interfaces, crowd flow, and ambient light. The results indicate that crowd flow has the greatest impact on pedestrian attention to signage. The findings of this study are expected to improve the wayfinding efficiency of pedestrians and assist designers in producing high-quality metro experiences.
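
The combined saliency/focus-map method is the paper's own contribution and is not reproduced here. As a generic illustration of the saliency-map half only, OpenCV's spectral-residual detector (from opencv-contrib-python) can be applied to a station photograph; the file names are placeholders.

```python
import cv2
import numpy as np

# Placeholder input image of a metro-station scene containing signage
image = cv2.imread("metro_signage_scene.jpg")
if image is None:
    raise FileNotFoundError("metro_signage_scene.jpg not found")

# Generic bottom-up saliency (spectral residual), not the authors' combined method
detector = cv2.saliency.StaticSaliencySpectralResidual_create()
ok, saliency_map = detector.computeSaliency(image)   # float map in [0, 1]

if ok:
    saliency_u8 = (saliency_map * 255).astype(np.uint8)
    # Otsu threshold to highlight the most salient regions (e.g. signage candidates)
    _, salient_regions = cv2.threshold(saliency_u8, 0, 255,
                                       cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    cv2.imwrite("saliency_map.png", saliency_u8)
    cv2.imwrite("salient_regions.png", salient_regions)
```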

Eye Tracking Glasses
Software

1 version available

Inducing visual attention through audiovisual stimuli: Can synchronous sound be a salient event?

Year: 2024

Authors: I Salselas, F Pereira, E Sousa

We present experimental research aiming to explore how spatial attention may be biased through auditory stimuli. In particular, we investigate how synchronous sound and image may affect attention and increase the saliency of the audiovisual event. We designed and implemented an experimental study in which subjects, wearing an eye-tracking system, were examined regarding their gaze toward the audiovisual stimuli being displayed. The audiovisual stimuli were specifically tailored for this experiment, consisting of videos contrasting in terms of Synch Points (i.e., moments where a visual event is associated with a visible trigger movement, synchronous with its corresponding sound). While consistency across audiovisual sensory modalities proved to be an attention-drawing feature, when combined with synchrony it clearly strengthened the bias, triggering orienting, that is, focal attention towards the particular scene that contains the Synch Point. Consequently, results revealed synchrony to be a saliency factor, contributing to the strengthening of focal attention. In today's increasingly complex multimedia landscape, the interaction between auditory and visual stimuli plays a pivotal role in shaping our perception and directing our attention. Within the context of research on multisensory attention, this study explores the intricate dynamics of attentional allocation concerning audiovisual stimuli, specifically focusing on the impact of synchronized auditory and visual cues on capturing and directing attention.

Eye Tracking Glasses
Software

7 versions available