Publication Hub Archive

Eye Tracker

You have reached the Ergoneers Publication Hub for: Eye Tracker

Total results: 582

Full Reviewed Paper at ICSA 2019

Year: 2019

Authors: P Dodds, SVA Garí, WO Brimijoin, PW Robinson

Augmented reality has the potential to connect people anywhere, anytime, and provide them with interactive virtual objects that enhance their lives. To deliver contextually appropriate audio for these experiences, a much greater understanding of how users will interact with augmented content and each other is needed. This contribution presents a system for evaluating human behavior and augmented reality device performance in calibrated synthesized environments. The system consists of a spherical loudspeaker array capable of spatial audio reproduction in a noise-isolated and acoustically damped room. The space is equipped with motion capture systems that track listener position, orientation, and eye gaze direction in temporal synchrony with audio playback and capture, allowing interactive control over the acoustic environment. In addition to spatial audio content from the loudspeaker array, supplementary virtual objects can be presented to listeners using motion-tracked unoccluding headphones. The system facilitates a wide array of studies relating to augmented reality research, including communication ecology, spatial hearing, room acoustics, and device performance. System applications and configuration, calibration, processing, and validation routines are presented.

Simulator
Software

4 versions available

Gaze behavior in basketball free throws developed in constant and variable practice

Year: 2019

Authors: SH Czyż, M Zvonař, Z Borysiuk, J Nykodým

There are a limited number of studies focusing on the mechanisms explaining why variable practice gives an advantage in novel situations while constant practice benefits performance in trained conditions. We hypothesized that this may be due to different gaze behavior developing under the different practice conditions. Twenty participants, randomly assigned to two groups, practiced basketball free throws for three consecutive days, performing 100 throws per day. The constant group (n = 10) practiced at the free-throw distance (4.57 m) only. The variable practice group (n = 10) randomly performed 20 shots at each of five throw distances (3.35, 3.96, 4.57, 5.18, and 5.79 m) each day, also accumulating 100 shots per day. We analyzed the total gaze fixation duration, the number of fixations, and the average fixation duration on the basketball rim in a pretest and posttest at the 4.57 m distance. We computed a linear mixed model with test (pretest-posttest), group (constant-variable), and the test × group interaction to analyze the total fixation duration and the number of fixations. The average fixation duration was analyzed with a repeated-measures two-way ANOVA, with practice condition as a between-participants factor and test type as a within-participants factor. We found that the total fixation duration increased significantly in the posttest, regardless of practice condition (p < 0.001, effect size = 0.504). The number of fixations also increased significantly in the posttest (p = 0.037, effect size = 0.246). The average fixation duration increased in both groups, although not significantly. We also did not find any significant differences between the groups. Our results suggest that variable and constant practice conditions may lead to the development of similar gaze behavior.
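The three rim-fixation measures analyzed above (total duration, number of fixations, average duration) follow directly from a list of fixation events. A minimal sketch, assuming a hypothetical event format rather than the authors' actual pipeline:

```python
# Hypothetical fixation events: (start_ms, end_ms, on_rim) tuples.
# Illustrative only; not the authors' data format or analysis code.
def rim_fixation_metrics(fixations):
    """Total duration (ms), count, and average duration (ms) of rim fixations."""
    durations = [end - start for start, end, on_rim in fixations if on_rim]
    total = sum(durations)
    count = len(durations)
    average = total / count if count else 0.0
    return total, count, average

# Example: three fixations, two of them on the rim.
events = [(0, 250, True), (300, 420, False), (500, 900, True)]
print(rim_fixation_metrics(events))  # (650, 2, 325.0)
```

The mixed-model and ANOVA analyses described above would then be fit on such per-trial metrics.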

Eye Tracking Glasses
Software

11 versions available

Hazard Detection among Young and Experienced Drivers via Driving Simulator

Year: 2019

Authors: N Borhan, MKA Ibrahim, AA Ab Rashid

The hazard perception test (HPT) is a common task for assessing how drivers perceive hazards. Many countries have adopted this method to assess an individual's driving competency for licensing purposes. Computer-based assessment is the method most widely used to carry out the HPT. Previous hazard perception studies using Malaysian samples reported mixed findings on the effectiveness of reaction-time-based HPT. Unlike the common method, this study employed a full-size-cabin driving simulator to study hazard perception, focusing on hazard detection between two groups of drivers: young and experienced. Results from 28 drivers (15 young, 13 experienced) indicated that young drivers detected hazards faster than their experienced counterparts, even though both groups showed the same hazard-recognition performance. Correlational analysis revealed that driving frequency may be a factor contributing to the difference in response time between the two groups. Further analysis also indicated that different road environments contribute to different hazard perception performance.

Eye Tracking Glasses
Simulator

2 versions available

High Cognitive Load Assessment in Drivers Through Wireless Electroencephalography and the Validation of a Modified

Year: 2019

Authors: D He, B Donmez, CC Liu

This paper explores the influence of high cognitive load on drivers' electroencephalography (EEG) signals collected from two channels (Fp1, Fp2) using a wireless consumer-grade system. Although EEG has been used in driving-related research to assess cognitive load, only a few studies have focused on high load, and they used research-grade systems. Recent advancements allow for less intrusive and more affordable systems. As an exploration, we tested the feasibility of one such system to differentiate among three levels of cognitive task load in a simulator study. Thirty-seven participants completed a baseline drive with no secondary task and two drives with a modified version of the n-back task (1-back and 2-back). The modification removed the verbal response required during task presentation to prevent EEG-signal degradation, with the 2-back task expected to impose a higher load than the 1-back task. Another objective of this study was to validate that this modified task increased cognitive load in the expected manner. The modified task led to significant trends from baseline to 1-back, and from 1-back to 2-back, in participants' heart rate, galvanic skin response, respiration, horizontal gaze position variability, and pupil diameter, all in line with previous driving-related studies on cognitive load. Furthermore, the EEG system was observed to be sensitive to the modified task, with alpha-band power decreasing significantly with increasing n-back levels (baseline versus 1-back: 0.092 Bels on Fp1, 0.179 on Fp2; 1-back versus 2-back: 0.209 on Fp1, 0.147 on Fp2). Thus, a consumer-grade EEG system has the potential to capture the high levels of cognitive load experienced by drivers.
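The alpha-band power drops reported above are given in Bels (log10 power units). A simple periodogram sketch shows how such per-channel band power could be computed; the sampling rate, band edges, and synthetic signals here are assumptions for illustration, not the study's recording setup:

```python
import numpy as np

FS = 250  # assumed sampling rate, Hz

def alpha_power_bels(eeg, fs=FS, band=(8.0, 13.0)):
    """Mean alpha-band power of one EEG channel, expressed in Bels (log10)."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2 / (fs * len(eeg))  # periodogram
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.log10(psd[mask].mean())

# Synthetic check: a 10 Hz oscillation (strong alpha, as at baseline)
# versus broadband noise (alpha suppressed, as under n-back load).
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
baseline = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
task = 0.1 * rng.standard_normal(t.size)
delta_bels = alpha_power_bels(baseline) - alpha_power_bels(task)  # positive
```

A difference of band powers in Bels corresponds to the log10 of their ratio, which is how condition contrasts like "0.092 Bels on Fp1" can be read.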

Eye Tracking Glasses
Simulator

2 versions available

Influence of driving experience on distraction engagement in automated vehicles

Year: 2019

Authors: D He, B Donmez

State-of-the-art vehicle automation requires drivers to visually monitor the driving environment and the automation (through its interfaces and the vehicle's actions) and intervene when necessary. However, as evidenced by recent automated vehicle crashes and laboratory studies, drivers are not always able to step in when the automation fails. Research points to the increase in distraction or secondary-task engagement in the presence of automation as a potential reason. However, previous research on secondary-task engagement in automated vehicles has mainly focused on experienced drivers. This issue may be amplified for novice drivers with less driving skill. In this paper, we compared the secondary-task engagement behaviors of novice and experienced drivers in both manual (non-automated) and automated driving settings in a driving simulator. A self-paced visual-manual secondary task presented on an in-vehicle display was utilized. Phase 1 of the study included 32 drivers (16 novice) who drove the simulator manually. In Phase 2, another set of 32 drivers (16 novice) drove with SAE-level-2 automation. In manual driving, there were no differences between novice and experienced drivers' rates of manual interaction with the secondary task (i.e., taps on the display). However, with automation, novice drivers had a higher manual interaction rate with the task than experienced drivers. Further, experienced drivers had shorter average glance durations toward the task than novice drivers in general, but the difference was larger with automation than with manual driving. It appears that with automation, experienced drivers are more conservative in their secondary-task engagement behaviors than novice drivers.

Simulator
Software

8 versions available

Interaction options for wearables and smart-devices while walking

Year: 2019

Authors: J Conradi, M Westhoven, T Alexander

Using smartphones and wearables during parallel activities, e.g. while walking, is a widespread phenomenon. Moreover, different individuals show different interaction styles, involving one or both hands. To study their effects, we carried out a study comparing three interaction styles for touch-sensitive devices with an HMD operated by an additional controller providing passive haptic feedback. The experimental task was carried out while walking on a treadmill. In addition to the primary task of using the interaction device, a secondary task was administered that competed for the participants' visual attention. We found an impact of the interaction styles on input performance: task completion proved faster with the HMD combined with the haptic input device, but the error count also increased significantly. Using the thumb for input resulted in a longer overall time during which visual attention was focused on the interaction device.

Eye Tracking Glasses
Simulator

2 versions available

Investigating driver gaze behavior during lane changes using two visual cues: ambient light and focal icons

Year: 2019

Authors: A Löcken, F Yan, W Heuten, S Boll

Currently, lane change decision aid systems primarily address foveal vision and thus compete for drivers' attention with the interfaces of other assistance systems. Alternative modalities, such as acoustic perception (Mahapatra et al., 2008), tactile perception (Löcken et al., 2015), and peripheral vision (Löcken et al., 2015), have also been introduced for lane change support. We are especially interested in ambient light displays (ALD) addressing peripheral vision, since they can adapt to the driver's attention using changing saliency levels (Matthews et al., 2004). The primary objective of this research is to compare the effect of ambient light and focal icons on driving performance and gaze behavior. We conducted two driving simulator experiments. The first experiment evaluated an ambient light cue in a free driving scenario. The second focused on the difference in gaze behavior between the ALD and focal icons, called "abstract faces with emotional expressions" (FEE). The results show that with ambient light cues, drivers decided more often in favor of safe gaps in rightward maneuvers. Similarly, in the second experiment, drivers decided to overtake more often when the gaps were big enough with both displays. Regarding gaze behavior, drivers looked longer towards the forward area, and less often and for shorter durations into the side mirrors, when using the ALD. This effect supports the assumption that drivers perceive the ALD with peripheral vision. In contrast, the FEE did not significantly affect gaze behavior compared with driving without assistance. These results help us to understand the effect of different modalities on performance and gaze behavior, and to identify appropriate modalities for lane change support.

Eye Tracking Glasses
Simulator

3 versions available

Investigating Temporal Changes of Behavioral Adaptation and User Experience During Highly Automated Driving

Year: 2019

Authors: D Stimm, A Engeln, J Schäfer, H Schmidt

Sleepiness and micro-sleep, as consequences of the monotony of moving in queues and of the very stressful daily routine of truck drivers, pose a serious risk to traffic safety (National Transportation Safety Board 1995). The automation of heavy traffic provides an opportunity to enhance traffic safety and drivers' convenience and allows the safe use of integrated infotainment and communication systems. The research project TANGO (German abbreviation for 'Technologie für automatisiertes Fahren nutzergerecht optimiert', English equivalent 'Technology for autonomous driving, optimized to user needs') is funded by the German Federal Ministry of Economic Affairs and Energy. It takes place in cooperation with Robert Bosch GmbH, Volkswagen Aktiengesellschaft, MAN Truck & Bus, the University of Stuttgart, and Stuttgart Media University. The project aims at improving user experience and acceptance of (highly) automated driving functions for trucks. It focuses on the user-centered development of an Attention and Activity Assistance system (AAA), which provides the truck driver with a variety of non-driving-related activities based on the current traffic situation, the automation level (up to SAE level 3; SAE International 2018), and the driver's current attentional state. While drivers' behavioral adaptation to the first use of highly automated systems has already been considered in a number of studies, little is known about how these behavioral changes develop over time as familiarity with the system increases. To address these issues, a long-term static driving simulator study will be conducted in spring 2019. The central research subject is the adaptation of drivers' behavior in take-over scenarios with low time budgets, which require an immediate reaction by the driver. The study will run from March to June 2019. First research results will be presented at the HCI International Conference in July.

Eye Tracking Glasses
Simulator
Software

2 versions available

Microsaccades in applied environments: Real-world applications of fixational eye movement measurements

Year: 2019

Authors: RG Alexander, SL Macknik

Across a wide variety of research environments, the recording of microsaccades and other fixational eye movements has provided insight into, and solutions to, practical problems. Here we review the literature on fixational eye movements, especially microsaccades, in applied and ecologically valid scenarios. Recent technical advances allow noninvasive fixational eye movement recordings in real-world contexts while observers perform a variety of tasks. Thus, fixational eye movement measures have been obtained in a host of real-world scenarios, such as in connection with driver fatigue, vestibular sensory deprivation in astronauts, and elite athletic training, among others. Here we present the state of the art in the practical applications of fixational eye movement research, examine its potential future uses, and discuss the benefits of including microsaccade measures in existing eye movement detection technologies. Current evidence supports the inclusion of fixational eye movement measures in real-world contexts as part of the development of new or improved oculomotor assessment tools. The real-world applications of fixational eye movement measurements will only grow larger and wider as affordable high-speed and high-spatial-resolution eye trackers become increasingly prevalent.
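Microsaccade detection in this literature commonly relies on velocity-threshold algorithms in the spirit of Engbert and Kliegl (2003). A compact sketch follows; the sampling rate, threshold multiplier, and synthetic gaze trace are illustrative assumptions, not values from the review:

```python
import numpy as np

def detect_microsaccades(x, y, fs=500, lam=6.0):
    """Return (start, end) sample indices of candidate microsaccades."""
    vx = np.gradient(x) * fs  # horizontal velocity, deg/s
    vy = np.gradient(y) * fs  # vertical velocity, deg/s
    # Median-based (outlier-robust) velocity noise estimate per axis.
    sx = np.sqrt(np.median(vx ** 2) - np.median(vx) ** 2)
    sy = np.sqrt(np.median(vy ** 2) - np.median(vy) ** 2)
    # Elliptic criterion: a sample is "fast" if it lies outside the
    # ellipse with semi-axes lam * sx and lam * sy.
    fast = (vx / (lam * sx)) ** 2 + (vy / (lam * sy)) ** 2 > 1.0
    # Group consecutive fast samples into events.
    events, start = [], None
    for i, f in enumerate(fast):
        if f and start is None:
            start = i
        elif not f and start is not None:
            events.append((start, i - 1))
            start = None
    if start is not None:
        events.append((start, len(fast) - 1))
    return events

# Synthetic check: slow fixational drift plus one abrupt 0.3 deg shift
# at sample 505, which the detector should flag.
rng = np.random.default_rng(1)
x = np.cumsum(rng.standard_normal(1000)) * 0.001
y = np.cumsum(rng.standard_normal(1000)) * 0.001
x[505:] += 0.3
events = detect_microsaccades(x, y)
```

The median-based noise estimate is what makes the threshold adapt to each recording, which matters in the noisy, mobile settings the review discusses.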

Eye Tracking Glasses
Software

11 versions available

mobEYEle: an embedded eye tracking platform for industrial assistance

Year: 2019

Authors: F Jungwirth, M Murauer, J Selymes

The eyes are a particularly interesting modality for cognitive industrial assistance systems, as gaze analysis can reveal cognition- and task-related aspects, while gaze interaction offers a lightweight and fast method for hands-free machine control. In this paper, we present mobEYEle, a body-worn eye tracking platform that performs the entire computation directly on the user, as opposed to streaming the data to a centralized unit for online processing, which restricts pervasiveness. The applicability of the platform is demonstrated through extensive performance and battery-runtime tests. Moreover, a self-contained calibration method is outlined that enables the use of mobEYEle without a supervisor or a digital screen.

Eye Tracking Glasses
Software

2 versions available