Publication Hub Archive

Eye Tracker


Total results: 582

TobiiGlassesPySuite: An open-source suite for using the Tobii Pro Glasses 2 in eye-tracking studies

Year: 2019

Authors: D De Tommaso, A Wykowska

In this paper we present the TobiiGlassesPySuite, an open-source suite we implemented for using the Tobii Pro Glasses 2 wearable eye tracker in custom eye-tracking studies. We provide a platform-independent solution for controlling the device and for managing the recordings. The software consists of Python modules, integrated into a single package, accompanied by sample scripts and recordings. The proposed solution aims to provide methods beyond those of the manufacturer's software, allowing users to make fuller use of the device's capabilities and the existing software. Our suite is available for download from the repository indicated in the paper and is usable under the terms of the GNU GPL v3.0 license.
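As an illustration of the kind of workflow such a suite enables, the minimal sketch below streams live gaze data from the glasses over the network. The class and method names (TobiiGlassesController, start_streaming, get_data, stop_streaming), the module name, the "gp" data key, and the device address are assumptions drawn from the suite's repository and may differ between versions; consult the repository for the actual API.

```python
# Hedged sketch: stream live gaze data with the TobiiGlassesPySuite.
# Module/class/method names and the device address are assumptions and
# may differ between suite versions; see the repository for the real API.
import time

from tobiiglassesctrl import TobiiGlassesController  # assumed module layout

# The Glasses 2 recording unit exposes a network interface; the address
# below is a commonly used default IPv4 address (assumption).
glasses = TobiiGlassesController("192.168.71.50")
glasses.start_streaming()

try:
    for _ in range(100):
        data = glasses.get_data()   # latest live-data packet (dict)
        gaze = data.get("gp", {})   # 2-D gaze position, if present
        print(gaze)
        time.sleep(0.01)            # ~100 Hz polling
finally:
    glasses.stop_streaming()
```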

Eye Tracking Glasses
Software

3 versions available

Trust in highly automated driving

Year: 2019

Authors: A Stephan

The automotive industry is on the verge of a new technology: self-driving vehicles. Highly automated driving vehicles are becoming more and more technically feasible, and corporations and research institutes all over the world are investing time and money to bring the once futuristic vision onto the road. The technology is developed with the goal of releasing the driver from the manual task of controlling the vehicle. In doing so, it is meant to increase driving comfort and, above all, contribute to the enhancement of overall road safety. Beyond further technical development, psychological aspects and the creation of an optimal user experience gain importance for highly automated driving functionality. In particular, trust in this kind of functionality has yet to be built up for future societal usage. If people are not willing to entrust control to such a vehicle, it will not be used and the potential of highly automated driving cannot be fully exploited. The aim of this work is to identify factors that influence trust in highly automated driving vehicles and to examine how this trust can be supported by a specific human-machine interface (HMI). To this end, three main user studies were conducted, testing different HMI concepts in a prototype vehicle on public roads as well as in a simulated environment. The aim of the first real-driving study (N = 28) with the highly automated driving vehicle was to test factors that influence trust in such a vehicle. The personality characteristic desire for control as well as a general attitude towards technology were identified as relevant factors. Most important for trust, however, was the perceived performance of the system. In the second user study (N = 72), the influence of system boundaries on trust was examined with the help of a simulated environment. It was shown that the type of system limit experienced plays a crucial role. In particular, the non-detection of a relevant event within the driving situation diminished trust, while a false detection led to little trust reduction. In a third user study (N = 18) spanning several trial days, it was examined how trust develops beyond first contact with a highly automated driving system. In this real-driving study, first indications were found that the relevance of the HMI increases with prolonged system use. With the help of these studies, a trust model based on previous insights and theories was transferred to the new context of highly automated driving. Furthermore, guidelines for the design of an HMI concept for highly automated vehicles were collected and applied. The insights of this work thereby support developers in designing HMI concepts that promote trust in automated driving functionality. Even if the future driver no longer needs to take over driving tasks, providing an adequate HMI concept that supports trust development is recommended.

Simulator
Software

3 versions available

Using gaze-based interactions in automated vehicles for increased road safety

Year: 2019

Authors: H Schmidt, G Zimmermann, A Schmidt

The development of self-driving vehicles seems to go well with the growing demand for the daily use of mobile devices. However, autonomous vehicles will still need manual intervention in unforeseen or dangerous situations. It is therefore important for the driver to stay aware of the surrounding traffic situation so as to be able to take over quickly. We developed a prototype that presents media content on a simulated windshield display and uses gaze tracking as an additional input modality for the driver. Although we intentionally pull the driver's gaze away from the driving situation, this appears to be less of a distraction than using hand-held mobile devices or dash-integrated displays. We hypothesize that the time to regain control with our prototype is shorter compared to traditional media presentation. This work-in-progress paper provides insight into the concept of the prototype; first results will be presented at the conference.

Eye Tracking Glasses
Simulator

1 version available

Visual attention failures towards vulnerable road users at intersections: Results from on-road studies

Year: 2019

Authors: NE Kaya

This dissertation investigates the visual attention failures of drivers towards vulnerable road users (VRUs) at intersections. VRUs include pedestrians, cyclists, and motorcyclists. This research uses on-road studies to observe driver behavior in real-world settings. The findings reveal that drivers often fail to notice VRUs at intersections, leading to potential collisions. By identifying the specific circumstances and conditions under which these failures occur, this work aims to improve intersection safety and inform the design of interventions to enhance driver awareness of VRUs.

Eye Tracking Glasses
Simulator

4 versions available

Visualizing natural language interaction for conversational in-vehicle information systems to minimize driver distraction

Year: 2019

Authors: M Braun, N Broy, B Pfleging, F Alt

In this paper we investigate how natural language interfaces can be integrated with cars in a way that minimizes their influence on driving performance. In particular, we focus on how speech-based interaction can be supported through a visualization of the conversation. Our work is motivated by the fact that speech interfaces (like Alexa, Siri, Cortana, etc.) are increasingly finding their way into our everyday life. We expect such interfaces to become commonplace in vehicles in the future. Cars are a challenging environment, since speech interaction here is a secondary task that should not negatively affect the primary task, that is, driving. At the outset of our work, we identify the design space for such interfaces. We then compare different visualization concepts in a driving simulator study with 64 participants. Our results show that (1) text summaries support drivers in recalling information and enhance user experience but can also increase distraction, (2) the use of keywords minimizes cognitive load and influence on driving performance, and (3) the use of icons increases the attractiveness of the interface.

Simulator
Software

14 versions available

Work, aging, mental fatigue, and eye movement dynamics

Year: 2019

Authors: RZ Marandi

Mental load and fatigue are important multidimensional phenomena, of growing relevance given the increasing involvement of elderly individuals in computer work. Fatigue may be associated with reduced cognitive resources and increased errors. Micro-breaks are strategic solutions to impede fatigue, subject to design constraints such as a timing plan. The present work aimed to use eye tracking as a promising technology to measure mental load and fatigue in young and elderly adults (Studies I and II), and to apply micro-breaks based on fatigue-related changes in eye movements to decelerate fatigue development (Study III). The three studies involved 58 young and elderly participants. A novel task resembling computer work was developed to induce mental load (Study I). Gaze positions and pupillary responses were recorded during task execution to detect ocular events (saccades, fixations, and blinks) and to quantify their characteristics as oculometrics. In Study I, the task was performed with three load levels across two days. In addition to measuring the load effects on performance, perceived workload, and oculometrics, the test-retest reliability of 19 oculometrics was assessed. In Study II, the effect of 40 min of time-on-task on oculometrics, perceived fatigue, and performance was explored. In Study III, a predictive model of fatigue was developed based on the data collected in Study II. Oculometrics-based biofeedback was implemented in real time to detect fatigue using the developed model, triggering micro-breaks upon fatigue detection. Perceived fatigue and workload were compared between a session with the biofeedback and a control session with self-triggered micro-breaks. A set of oculometrics was found to reflect mental load (Study I) and fatigue (Study II) in both age groups. Similar trends in oculometrics were observed with increased mental load and fatigue, implying shared neural systems for both conditions (Studies I and II). Age-related differences were exhibited in a few of the oculometrics (Study II), but age as a feature did not significantly contribute to fatigue detection (Study III). The biofeedback reduced workload and fatigue development, which suggests an improved strategy for designing the timing plan of micro-breaks (Study III). All in all, the findings support the viability of detecting the effects of fatigue and mental load through oculometrics and of applying oculometrics-based biofeedback in computer work.
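The ocular-event detection step mentioned above (extracting fixations from raw gaze samples) is commonly done with a dispersion-threshold algorithm such as I-DT. The sketch below shows that generic textbook technique, not the study's own implementation; the threshold values are illustrative assumptions.

```python
# Hedged sketch of dispersion-threshold fixation detection (I-DT),
# a common way to extract fixation events from gaze samples. This is
# a generic textbook algorithm, not the study's own implementation.

def detect_fixations(samples, max_dispersion=0.02, min_duration=0.1):
    """samples: list of (timestamp_s, x, y) gaze points in normalized
    coordinates. Threshold values are illustrative assumptions."""
    def dispersion(win):
        xs = [p[1] for p in win]
        ys = [p[2] for p in win]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    fixations = []
    i = 0
    while i < len(samples):
        # Grow an initial window until it spans at least min_duration.
        j = i
        while j < len(samples) and samples[j][0] - samples[i][0] < min_duration:
            j += 1
        if j >= len(samples):
            break
        if dispersion(samples[i:j + 1]) <= max_dispersion:
            # Extend the window while dispersion stays under threshold.
            while j + 1 < len(samples) and dispersion(samples[i:j + 2]) <= max_dispersion:
                j += 1
            window = samples[i:j + 1]
            cx = sum(p[1] for p in window) / len(window)
            cy = sum(p[2] for p in window) / len(window)
            # (start_time, end_time, centroid_x, centroid_y)
            fixations.append((samples[i][0], samples[j][0], cx, cy))
            i = j + 1
        else:
            i += 1
    return fixations
```

Fixation durations and counts derived from such events are typical examples of the oculometrics the abstract refers to.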

Eye Tracking Glasses
Software

8 versions available

A novel method for estimating free space 3D point-of-regard using pupillary reflex and line-of-sight convergence points

Year: 2018

Authors: Z Wan, X Wang, K Zhou, X Chen, X Wang

In this paper, a novel 3D gaze estimation method for a wearable gaze tracking device is proposed. The method is based on the pupillary accommodation reflex of human vision. Firstly, a 3D gaze measurement model is built. By combining the line-of-sight convergence point with the size of the pupil, this model can be used to measure the 3D Point-of-Regard in free space. Secondly, a gaze tracking device is described. Using four cameras and semi-transparent mirrors, the device can accurately extract the spatial coordinates of the pupil and eye corner from images of the human eye. Thirdly, a simple calibration process for the measuring system is proposed. It can be sketched as follows: (1) each eye is imaged by a pair of binocular stereo cameras, with semi-transparent mirrors providing a better field of view; (2) the spatial coordinates of the pupil center and the inner eye corner are extracted from the stereo camera images, and the pupil size is calculated as a feature for the gaze estimation method; (3) the pupil size and the line-of-sight convergence point when watching calibration targets at different distances are computed, and the parameters of the gaze estimation model are determined. Fourthly, an algorithm for finding the line-of-sight convergence point is proposed, and the 3D Point-of-Regard is estimated using the obtained line-of-sight measurement model. Three groups of experiments were conducted to prove the effectiveness of the proposed method. This approach enables the spatial coordinates of the Point-of-Regard in free space to be obtained, which has great potential for application in wearable devices.
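A line-of-sight convergence point, as used in this method, can be computed as the point closest to both gaze rays. The sketch below uses the standard closest-point-between-two-3D-lines construction as a generic illustration of that idea; it is not the authors' exact algorithm, and the eye positions in the example are illustrative.

```python
# Hedged sketch: estimate the line-of-sight convergence point as the
# midpoint of the shortest segment between the two gaze rays. This is
# the standard two-line closest-point construction, shown as a generic
# illustration of the paper's convergence-point idea.
import numpy as np

def convergence_point(o_l, d_l, o_r, d_r):
    """o_l/o_r: 3-D eye (ray origin) positions; d_l/d_r: gaze directions."""
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)
    w0 = o_l - o_r
    b = d_l @ d_r                  # cosine of the angle between the rays
    d = d_l @ w0
    e = d_r @ w0
    denom = 1.0 - b * b            # ~0 when the rays are (near-)parallel
    if abs(denom) < 1e-9:
        return None                # no well-defined convergence point
    s = (b * e - d) / denom        # parameter along the left ray
    t = (e - b * d) / denom        # parameter along the right ray
    p_l = o_l + s * d_l
    p_r = o_r + t * d_r
    return (p_l + p_r) / 2.0       # midpoint of the common perpendicular

# Example: eyes 6.4 cm apart, both looking at a target ~1 m straight ahead.
p = convergence_point(np.array([-0.032, 0.0, 0.0]), np.array([0.032, 0.0, 1.0]),
                      np.array([0.032, 0.0, 0.0]), np.array([-0.032, 0.0, 1.0]))
print(p)  # approximately [0, 0, 1]
```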

Eye Tracking Glasses
Simulator

10 versions available

An approach to VR based head and eye movement correlation evaluation system

Year: 2018

Authors: J Dutňo, R Vargic

This contribution describes a new VR-based head and eye movement correlation evaluation system. The system allows capturing the gaze position using an eye tracker mounted in the head-mounted display for virtual reality, as well as the actual head position and rotation. These values can be evaluated in a number of prepared scenes and experiments. Using this environment, some hypotheses related to head and eye movements in virtual reality were studied and tested. The preliminary results are promising. However, there are significant systematic errors that remain to be eliminated.
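One simple way to evaluate the head-eye correlation such a system records is to correlate the two angular time series directly. The sketch below does this with a plain Pearson correlation; it is a generic illustration under that assumption, not the paper's evaluation method, and the synthetic example data are purely illustrative.

```python
# Hedged sketch: correlate head yaw with horizontal gaze angle over a
# recording, as one plausible instance of the head/eye correlation the
# system evaluates. Generic illustration, not the paper's method.
import numpy as np

def head_eye_correlation(head_yaw_deg, gaze_x_deg):
    """Pearson correlation between two equally sampled angle series."""
    head = np.asarray(head_yaw_deg, dtype=float)
    gaze = np.asarray(gaze_x_deg, dtype=float)
    assert head.shape == gaze.shape, "series must be aligned in time"
    return float(np.corrcoef(head, gaze)[0, 1])

# Example with synthetic data: head lags gaze by ~100 ms, a pattern
# often reported for large gaze shifts (illustrative values only).
t = np.linspace(0, 10, 1000)
gaze = 15 * np.sin(0.5 * t)
head = 15 * np.sin(0.5 * (t - 0.1))
print(head_eye_correlation(head, gaze))
```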

Eye Tracking Glasses
Software

1 version available

An inconspicuous and modular head-mounted eye tracker

Year: 2018

Authors: S Eivazi, TC Kübler, T Santini, E Kasneci

State-of-the-art head-mounted eye trackers employ glasses-like frames, making their usage uncomfortable or even impossible for prescription eyewear users. Nonetheless, these users represent a notable portion of the population (e.g., the Prevent Blindness America organization reports that about half of the USA population uses corrective eyewear for refractive errors alone). Thus, making eye tracking accessible to eyewear users is paramount not only to improve usability but also for the ecological validity of eye-tracking studies. In this work, we report on a novel approach to eye tracker design in the form of a modular and inconspicuous device that can be easily attached to glasses; for users without glasses, we also provide a 3D-printable frame blueprint. Our prototypes include both low-cost Commercial Off-The-Shelf (COTS) and more expensive Original Equipment Manufacturer (OEM) cameras, with sampling rates ranging between 30 and 120 fps and multiple pixel resolutions.

Eye Tracking Glasses
Software

1 version available

An investigation of driver behavior on urban general road and in tunnel areas

Year: 2018

Authors: HY Song, F Shao, Q Xu, TY Guo

The objective of this study is to examine experience-related differences in microscopic driving behavior as drivers performed six separate maneuvers, namely 1) driving on general urban roads, 2) approaching a tunnel portal, 3) driving through a tunnel's threshold zone, 4) driving in the interior tunnel zone, 5) driving in the zone ahead of the tunnel exit, and 6) driving after the tunnel exit. An on-road experiment was conducted with 20 drivers in two groups. The first group was made up of newly licensed drivers, and the second group contained more experienced drivers. The study consisted of one between-subject variable (experience) and five within-subject variables (driving environment type). Driver behavior was measured through Mean Glance Duration, AOI Attention Ratio, Horizontal Eye Activity, Vertical Eye Activity, Percentage of Eyelid Closure, and Heart Rate Variability. With respect to the relevant psychological measures, the results show that, in general, more attention is focused on the far left-hand side of the road and the near front road when driving through tunnel areas compared with driving on general roads. In addition, the psychological measurements indicate that the tunnel's dark, narrow environment induces driving anxiety, reflected in a lower heart rate variation coefficient (RRCV). Newly licensed drivers were more severely affected by the tunnel environment than the experienced drivers.
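Of the measures listed, the Percentage of Eyelid Closure (PERCLOS) has a simple operational definition: the proportion of time within a window that the eyelids are at least 80 % closed. The sketch below computes it from per-sample eye-openness values; the 80 %-closure criterion is the conventional PERCLOS definition, while the input layout and example values are assumptions for illustration.

```python
# Hedged sketch: PERCLOS as the fraction of samples in a time window
# whose eyelid openness is below 20 % (i.e. at least 80 % closed).
# The 80 %-closure criterion is the conventional PERCLOS definition;
# the input layout is an assumption for illustration.

def perclos(openness, closed_threshold=0.2):
    """openness: per-sample eyelid openness in [0, 1] (1 = fully open).
    Returns the fraction of samples counted as 'closed'."""
    if not openness:
        return 0.0
    closed = sum(1 for o in openness if o < closed_threshold)
    return closed / len(openness)

# Example: a 10-sample window with three nearly-closed samples.
window = [0.9, 0.8, 0.15, 0.1, 0.85, 0.9, 0.05, 0.9, 0.95, 0.9]
print(perclos(window))  # 0.3
```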

Eye Tracking Glasses
Simulator

3 versions available