Publication Hub Archive

Dikablis Glasses

You have reached the Ergoneers Publication Hub for:

Product Name > Dikablis Glasses

Find all Publications here:

Publication Hub

Total results: 509

Driver Demand: Eye Glance Measures

Year: 2016

Authors: S Seaman, L Hsieh, R Young

This study investigated driver glances while engaging in infotainment tasks in a stationary vehicle while surrogate driving: watching a driving video recorded from a driver’s viewpoint and projected on a large screen, performing a lane-tracking task, and performing the Tactile Detection Response Task (TDRT) to measure attentional effects of secondary tasks on event detection and response. Twenty-four participants were seated in a 2014 Toyota Corolla production vehicle with the navigation system option. They performed the lane-tracking task using the vehicle’s steering wheel, fitted with a laser pointer to indicate wheel movement on the driving video. Participants simultaneously performed the TDRT and a variety of infotainment tasks, including Manual and Mixed-Mode versions of Destination Entry and Cancel, Contact Dialing, Radio Tuning, Radio Preset selection, and other Manual tasks. Participants also completed the 0- and 1-Back pure auditory-vocal tasks. Glances were recorded using an eye-tracker and validated by manual inspection. Glances were classified as on-road (i.e., looking through the windshield) or off-road (i.e., to locations other than through the windshield). Three off-road glance metrics were tabulated and scored using the NHTSA Guidelines methods: Mean Single Glance Duration (MSGD), Total Eyes-Off-Road Time (TEORT), and Long Glance Proportion (LGP). These metric values were compared between the task conditions and a 30-s Baseline condition with no task. Mixed-Mode tasks did not have a statistically significantly longer MSGD or TEORT, or a higher LGP, than Baseline (except for Mixed-Mode Destination Entry), whereas all the Manual tasks did. Mixed-Mode tasks improved compliance with the NHTSA Guidelines.
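
The three glance metrics named in the abstract can be computed directly from a list of off-road glance durations. A minimal sketch, assuming durations in seconds and the 2.0 s long-glance threshold used by the NHTSA Guidelines; the function and variable names are illustrative, not from the paper:

```python
def glance_metrics(off_road_glances, long_glance_threshold=2.0):
    """Compute the three NHTSA off-road glance metrics from glance durations (s).

    MSGD  - Mean Single Glance Duration
    TEORT - Total Eyes-Off-Road Time
    LGP   - Long Glance Proportion (share of glances exceeding the threshold)
    """
    if not off_road_glances:
        return {"MSGD": 0.0, "TEORT": 0.0, "LGP": 0.0}
    teort = sum(off_road_glances)
    msgd = teort / len(off_road_glances)
    lgp = sum(1 for g in off_road_glances
              if g > long_glance_threshold) / len(off_road_glances)
    return {"MSGD": msgd, "TEORT": teort, "LGP": lgp}

# Example: five off-road glances recorded during one task trial
metrics = glance_metrics([0.8, 1.2, 2.5, 0.9, 1.6])
```

Each task condition's metrics would then be compared against the same metrics computed for the 30-s Baseline condition.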

Eye Tracking Glasses
Simulator

2 versions available

Driver Demand: Eye Glance Measures 2016-01-1421

Year: 2016

Authors: S Seaman, L Hsieh, R Young

Eye Tracking Glasses
Software

1 version available

EOG-based head-mounted eye tracking with 1 kHz sampling rate

Year: 2016

Authors: DJ Mack, P Schönle, T Burger, Q Huang, S Fateh

Prerequisites for an out-of-lab eye tracker: head-mounted, including scene camera, data acquisition & power supply; wireless data transmission to a mobile computer for preview & storage; lightweight. With one exception, all out-of-lab eye trackers are based on VOG, mainly operating at a low sampling rate of 60 Hz (Fig. 1B). For fast eye movements (saccades), peak velocity is an important property, indicating fatigue and relying on the proper functioning of the brainstem. Unfortunately, measured peak velocity drops significantly at sampling rates below 240 Hz (Fig. 1C).
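
The drop in measured peak velocity at low sampling rates can be illustrated with an idealized sigmoidal saccade profile and simple finite differences. The profile shape and parameters below are illustrative assumptions, not from the paper:

```python
import math

def saccade_position(t, amplitude=10.0, t0=0.05, tau=0.005):
    """Idealized sigmoidal saccade: eye position in degrees at time t (s)."""
    return amplitude / (1.0 + math.exp(-(t - t0) / tau))

def estimated_peak_velocity(fs, duration=0.1):
    """Peak velocity (deg/s) estimated by finite differences at rate fs (Hz)."""
    n = int(duration * fs)
    samples = [saccade_position(i / fs) for i in range(n + 1)]
    return max(abs(samples[i + 1] - samples[i]) * fs for i in range(n))

# A 1 kHz tracker recovers the peak almost exactly (~500 deg/s for these
# parameters), while a 60 Hz tracker underestimates it substantially,
# because each sample interval averages over the fastest part of the saccade.
v_hi = estimated_peak_velocity(1000.0)
v_lo = estimated_peak_velocity(60.0)
```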

Eye Tracking Glasses
Simulator

2 versions available

Experimental evaluation of the controllability of interacting advanced driver assistance systems

Year: 2016

Authors: O Schädler, S Müller, M Gründl

A method for the experimental evaluation of the controllability of interacting advanced driver assistance systems (ADAS) is presented at the beginning of this paper. Driving situations in which particular ADAS act within, at, or beyond their system limits, or during and after system failures, were implemented in a static driving simulator. Following the recommendation of the Code of Practice (CoP), each situation was assessed in order to select three critical driving situations. The second part of the paper describes two driving simulator studies to evaluate the controllability of four interacting ADAS (Automatic Emergency Brake Assist (AEB), Adaptive Cruise Control (ACC), Lane Keeping Assist (LKA) and Lane Change Decision Aid System (LCDAS)) in critical driving situations. Each study is based on a within-subjects design. In these studies, each participant drove each of the three scenarios ('Stationary Obstacle Avoidance', 'Braking Object Vehicle' and 'Three-Lane Motorway') without and with ADAS. The recorded physical and physiological data and the subjective perceptions of the participants were analysed. One finding was that some drivers became confused when ACC braked while LKA overlaid a steering torque during a system failure (3 Nm steering torque ramp) in the 'Three-Lane Motorway' scenario. It could also be shown that accidents occurred during evasive manoeuvres in which ACC accelerated and LKA applied an overlaid steering torque. Based on the results of the study, functional improvements that might enhance the interaction of ADAS were derived and are presented in this paper. These improvements were tested and evaluated in a second, replicating driving simulator study. The results of this study show an improvement in controllability when the vehicle in front is detected earlier. They also confirm that unexpected braking and warning by the Adaptive Cruise Control, combined with an overlaid steering torque, lead to uncontrollable behaviour.

Eye Tracking Glasses
Simulator

3 versions available

EyeRec: An open-source data acquisition software for head-mounted eye-tracking

Year: 2016

Authors: T Santini, W Fuhl, T Kübler, E Kasneci

Head-mounted eye tracking offers remarkable opportunities for human computer interaction in dynamic scenarios (e.g., driving assistance). Although a plethora of proprietary software for the acquisition of such eye-tracking data exists, all of them are plagued by a critical underlying issue: their source code is not available to the end user. Thus, a researcher is left with few options when facing a scenario in which the proprietary software does not perform as expected. In such a case, the researcher is either forced to change the experimental setup (which is undesirable) or invest a considerable amount of time and money in a different eye-tracking system (which may also underperform). In this paper, we introduce EyeRec, an open-source data acquisition software for head-mounted eye-tracking. Out of the box, EyeRec offers real-time state-of-the-art pupil detection and gaze estimation, which can be easily replaced by user implemented algorithms if desired. Moreover, this software supports multiple head-mounted eye-tracking hardware, records eye and scene videos, and stores pupil and gaze information, which are also available as a real-time stream. Thus, EyeRec can be an efficient means towards facilitating gaze-based human computer interaction research and applications. Available at: www.perception.uni-tuebingen.de

Eye Tracking Glasses
Software

4 versions available

Gaze augmentation in egocentric video improves awareness of intention

Year: 2016

Authors: D Akkil, P Isokoski

Video communication using head-mounted cameras could be useful to mediate shared activities and support collaboration. Growing popularity of wearable gaze trackers presents an opportunity to add gaze information on the egocentric video. We hypothesized three potential benefits of gaze-augmented egocentric video to support collaborative scenarios: support deictic referencing, enable grounding in communication, and enable better awareness of the collaborator's intentions. Previous research on using egocentric videos for real-world collaborative tasks has failed to show clear benefits of gaze point visualization. We designed a study, deconstructing a collaborative car navigation scenario, to specifically target the value of gaze-augmented video for intention prediction. Our results show that viewers of gaze-augmented video could predict the direction taken by a driver at a four-way intersection more accurately and more confidently than a viewer of the same video without the superimposed gaze point. Our study demonstrates that gaze augmentation can be useful and encourages further study in real-world collaborative scenarios.

Eye Tracking Glasses
Software

3 versions available

Hand and Eye Gaze Analysis for the Objective Assessment of Open Surgical Dexterity

Year: 2016

Authors: SC Byrns

Objective assessment of technical skill remains a challenging task. Paper-based evaluations completed by expert assessors have been criticized for not accurately or consistently describing a surgeon’s technical proficiency due to inter-observer variability and subjective bias. In the laparoscopic or minimally invasive surgical domain, technology-assisted evaluation has been shown to provide a reliable and objective measure of performance based on motion analysis, focusing on instrument movement and gestures. Aided by the miniaturization of motion tracking technology, this thesis focuses on the development of novel techniques for acquiring synchronized hand motion and eye tracking data in open surgical procedures. An overview of motor learning theory is provided as a basis for segmenting or decomposing surgical movements into constituent gestures. An empirical study investigating the learning effects of a visuospatially intensive video game as a substitute for traditional practice was performed, and showed that video gaming can, in some conditions, enhance or reinforce traditional simulator-based practice. Existing motion capture techniques are reviewed along with an analysis of computational models used in high-level motion analysis. A second empirical study was completed to investigate the application of one of these computer models to hand motion captured via an optical marker-less tracking device. Hidden Markov Models applied to the motion data were able to discriminate between participants emulating different levels of dexterity. Finally, the development of a technology-assisted assessment system for evaluating a surgeon’s performance based on synchronized hand motion, eye gaze and force application in open surgical techniques is presented. Several empirical studies designed to validate this system are described. 
The novel aspects of this system include the ability to capture eye gaze in a 3-dimensional environment as well as highly detailed hand motion based on a surgical glove system in which 6D electromagnetic sensors are embedded. The design and assembly of this apparatus are described, including an overview of the software required for achieving spatial and temporal coherence. The thesis concludes with a summary of findings and a brief discussion of planned experiments necessary to validate the clinical utility of a surgical motion and eye tracking system for both objective assessment and training purposes.

Eye Tracking Glasses
Software

2 versions available

Interaction dialog design for the use of mobile devices while walking

Year: 2016

Authors: J Conradi, B Nord, T Alexander

A study was carried out to determine an optimal solution for presenting multiple interaction options on the limited space of a mobile device, e.g., a smartphone, taking into account the special situation of walking. We compared three different hierarchy models and a complex interaction editor which combines all the required interaction alternatives in one screen. Slow versus fast walking on a treadmill was introduced as an additional mobility condition. The results showed menus with a hierarchy breadth of 4 or 8 to be best suited for walking. Flat hierarchies required longer time on task and led to fewer gaze changes per single interaction. The complex interaction editor triggered a high error count and a high task load level and therefore should be avoided while walking.
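
The breadth/depth tradeoff behind the hierarchy models can be sketched with a small helper that counts how many menu levels a hierarchy of a given breadth needs to expose a set of items. This is an illustrative sketch, not the study's apparatus:

```python
def menu_depth(n_items, breadth):
    """Number of menu levels a hierarchy of the given breadth needs
    so that n_items leaf entries are reachable."""
    depth, reachable = 1, breadth
    while reachable < n_items:
        depth += 1
        reachable *= breadth
    return depth

# Broader menus are shallower: 64 items need 6 levels at breadth 2,
# 3 levels at breadth 4, but only 2 levels at breadth 8.
depths = {b: menu_depth(64, b) for b in (2, 4, 8)}
```

Fewer levels mean fewer navigation steps per selection, which is one plausible reason breadths of 4 or 8 fared better while walking than deep, narrow hierarchies.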

Eye Tracking Glasses
Simulator
Software

5 versions available

Keep your scanners peeled: Gaze behavior as a measure of automation trust during highly automated driving

Year: 2016

Authors: S Hergeth, L Lorenz, R Vilimek, JF Krems

Objective: The feasibility of measuring drivers’ automation trust via gaze behavior during highly automated driving was assessed with eye tracking and validated with self-reported automation trust in a driving simulator study. Background: Earlier research from other domains indicates that drivers’ automation trust might be inferred from gaze behavior, such as monitoring frequency. Method: The gaze behavior and self-reported automation trust of 35 participants attending to a visually demanding non-driving-related task (NDRT) during highly automated driving were evaluated. The relationships of dispositional, situational, and learned automation trust with gaze behavior were compared. Results: Overall, there was a consistent relationship between drivers’ automation trust and gaze behavior. Participants reporting higher automation trust tended to monitor the automation less frequently. Further analyses revealed that higher automation trust was associated with lower monitoring frequency of the automation during NDRTs, and an increase in trust over the experimental session was connected with a decrease in monitoring frequency. Conclusion: We suggest that (a) the current results indicate a negative relationship between drivers’ self-reported automation trust and monitoring frequency, (b) gaze behavior provides a more direct measure of automation trust than other behavioral measures, and (c) with further refinement, drivers’ automation trust during highly automated driving might be inferred from gaze behavior. Application: Potential applications of this research include the estimation of drivers’ automation trust and reliance during highly automated driving.
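
Monitoring frequency of the kind reported here can be derived from a per-sample gaze-region label stream by counting entries into the automation region per minute. A minimal sketch; the region labels and sampling setup are assumptions for illustration, not from the paper:

```python
def monitoring_frequency(gaze_labels, sample_rate_hz, target="automation"):
    """Glances toward `target` per minute, from a per-sample gaze-region
    label stream. A glance is counted at each transition from any other
    label to `target`."""
    entries = sum(
        1 for prev, cur in zip(gaze_labels, gaze_labels[1:])
        if cur == target and prev != target
    )
    # Also count a glance already in progress at the start of the record
    if gaze_labels and gaze_labels[0] == target:
        entries += 1
    duration_min = len(gaze_labels) / sample_rate_hz / 60.0
    return entries / duration_min if duration_min > 0 else 0.0

# 6 s of 60 Hz data: two 1 s checks of the automation during an NDRT
labels = ["ndrt"] * 120 + ["automation"] * 60 + ["ndrt"] * 120 + ["automation"] * 60
freq = monitoring_frequency(labels, 60.0)  # glances per minute
```

Under the study's finding, lower values of such a frequency over a session would accompany higher self-reported automation trust.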

Eye Tracking Glasses
Simulator

8 versions available

Measuring safety for urban tunnel entrance and exit based on nanoscopic driving behaviors

Year: 2016

Authors: S Fei, X Qian, X Xiaoling, M Chao

The entrance and exit zones of an urban tunnel are considered the most dangerous parts along the tunnel because of the sharp changes in the driving environment. The objective of this paper is to extract a comprehensive measure from numerous original measures reflecting drivers’ nanoscopic behaviors, in order to measure the safety of urban tunnel entrances and exits. A field test was conducted at the Xi'anmen tunnel in Nanjing. The drivers’ heart rate (HR) and eye movement, together with the pupillary diameter at the entrance and exit of the tunnel, were collected synchronously with the D-Lab system. Operating speed was recorded by a video camera, and the individual vehicle acceleration was then calculated. Following the factor analysis procedure, three factors, which explain 90.18% and 89.15% of the variance in the original data for the entrance and exit respectively, are retained from the initial four nanoscopic driving behavior measures. According to the weight score of each factor, a comprehensive measure (FE) reflecting nanoscopic driving behavior was extracted as a linear combination of the three retained factors. To measure the safety level of the urban tunnel gateway, FE is classified into three levels. The criterion for safety classification is: |FE| ≤ 0.05, safe; 0.05 < |FE| < 0.10, moderately dangerous; |FE| > 0.10, dangerous. Validation by comparison with the tunnel environment shows that the measure proposed in this paper is acceptable and more accurate in evaluating the safety of tunnel gateway zones.
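
The three-level criterion can be sketched directly. A minimal sketch, assuming the safe band is |FE| ≤ 0.05 and treating the |FE| = 0.10 boundary as dangerous (the abstract leaves that boundary unspecified):

```python
def classify_fe(fe, safe_limit=0.05, danger_limit=0.10):
    """Classify a tunnel gateway zone by the comprehensive measure FE.

    |FE| <= 0.05: safe; 0.05 < |FE| < 0.10: moderately dangerous;
    larger magnitudes: dangerous.
    """
    magnitude = abs(fe)
    if magnitude <= safe_limit:
        return "safe"
    if magnitude < danger_limit:
        return "moderately dangerous"
    # |FE| >= 0.10; treating the exact boundary as dangerous is an assumption
    return "dangerous"
```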

Eye Tracking Glasses
Software

3 versions available