Publication Hub Archive

GPS

You have reached the Ergoneers Publication Hub for:

Used Tool > GPS


Total results: 129

SET: a pupil detection method using sinusoidal approximation

Year: 2015

Authors: AH Javadi, Z Hakimi, M Barati, V Walsh

Mobile eye-tracking in external environments remains challenging, despite recent advances in eye-tracking software and hardware engineering. Many current methods fail to deal with the vast range of outdoor lighting conditions and the speed at which these can change. This confines experiments to artificial environments where conditions must be tightly controlled. Additionally, the emergence of low-cost eye-tracking devices calls for the development of analysis tools that enable non-technical researchers to process the images these devices produce. We have developed a fast and accurate method (known as “SET”) that is suitable for natural environments with uncontrolled, dynamic, and even extreme lighting conditions. We compared the performance of SET with that of two open-source alternatives by processing two collections of eye images: images of natural outdoor scenes with extreme lighting variations (“Natural”) and images of less challenging indoor scenes (“CASIA-Iris-Thousand”). We show that SET excelled in outdoor conditions and was faster, without significant loss of accuracy, indoors. SET offers a low-cost eye-tracking solution, delivering high performance even in challenging outdoor environments. It is offered as an open-source MATLAB toolkit as well as a dynamic-link library (“DLL”) that can be imported into many programming languages, including C# and Visual Basic, on Windows.
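The sinusoid-based fitting that the abstract's title alludes to can be illustrated with a toy sketch: an ellipse's boundary coordinates are sinusoidal functions of the polar angle, so a candidate pupil boundary can be fitted by least squares on cosine/sine terms. This is a hypothetical illustration of the general idea, not the published SET implementation; the function name and the axis-aligned-ellipse simplification are assumptions.

```python
import numpy as np

def fit_pupil_sinusoidal(xs, ys):
    """Fit an axis-aligned ellipse to candidate pupil boundary points by
    modelling x and y as sinusoids of the polar angle.
    Hypothetical sketch of sinusoid-based boundary fitting; not SET itself."""
    cx, cy = xs.mean(), ys.mean()          # rough centre from the centroid
    theta = np.arctan2(ys - cy, xs - cx)   # polar angle of each boundary point
    # Least-squares amplitudes of the cos/sin terms give the semi-axes.
    a = np.sum((xs - cx) * np.cos(theta)) / np.sum(np.cos(theta) ** 2)
    b = np.sum((ys - cy) * np.sin(theta)) / np.sum(np.sin(theta) ** 2)
    return cx, cy, abs(a), abs(b)
```

For a circular pupil this recovers centre and radius exactly; on real images the thresholding and boundary-extraction steps that SET also performs would have to come first.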

Eye Tracking Glasses
Software

13 versions available

Using sound to reduce visual distraction from in-vehicle human–machine interfaces

Year: 2015

Authors: P Larsson, M Niemand

Objective: Driver distraction and inattention are the main causes of accidents. The fact that devices such as navigation displays and media players are part of the distraction problem has led to the formulation of guidelines advocating various means of minimizing the visual distraction from such interfaces. However, even when design guidelines and recommendations are followed, certain interface interactions, such as menu browsing, still require off-road visual attention that increases crash risk. In this article, we investigate whether adding sound to an in-vehicle user interface can provide the support necessary to create a significant reduction in glances toward a visual display when browsing menus. Methods: Two sound concepts were developed and studied: spearcons (time-compressed speech sounds) and earcons (musical sounds). A simulator study was conducted in which 14 participants between the ages of 36 and 59 took part. Participants performed 6 different interface tasks while driving along a highway route. A 3 × 6 within-group factorial design was employed with sound (no sound/earcons/spearcons) and task (6 different task types) as factors. Eye glances and corresponding measures were recorded using a head-mounted eye tracker. Participants' self-assessed driving performance was also collected after each task on a 10-point scale ranging from 1 = very bad to 10 = very good. Separate analyses of variance (ANOVAs) were conducted for the different eye glance measures and for self-rated driving performance. Results: The added spearcon sounds significantly reduced total glance time as well as number of glances while retaining task time, compared to the baseline (no sound) condition (total glance time M = 4.15 for spearcons vs. M = 7.56 for baseline, p = .03). The earcon sounds did not produce such distraction-reducing effects. Furthermore, participants' ratings of their driving performance were statistically significantly higher in the spearcon condition than in the baseline and earcon conditions (M = 7.08 vs. M = 6.05 and M = 5.99, respectively; p = .035 and p = .002). Conclusions: The spearcon sounds appear to reduce visual distraction efficiently, whereas the earcon sounds neither reduced the distraction measures nor increased subjective driving performance. An aspect that must be further investigated is how well spearcons and other types of auditory displays are accepted by drivers in general and how they work in real traffic.
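Spearcons are produced by time-compressing recorded speech. As a rough illustration of time compression only, here is a naive resampling sketch; note that proper spearcon generation uses pitch-preserving time-scale modification, which plain resampling is not — it shortens the clip but also raises the pitch. The function name is made up for this sketch.

```python
import numpy as np

def naive_time_compress(samples, factor):
    """Shorten an audio signal by linear-interpolation resampling.
    Toy stand-in for spearcon generation: unlike pitch-preserving
    time-scale modification, this also shifts the pitch upward."""
    n_out = max(1, int(len(samples) / factor))          # compressed length
    old_idx = np.linspace(0, len(samples) - 1, n_out)   # fractional sample positions
    return np.interp(old_idx, np.arange(len(samples)), samples)
```

A factor of 2.0 halves the duration; spearcon studies typically compress speech far enough that it is no longer comprehensible as words while remaining recognizable as a cue.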

Eye Tracking Glasses
Simulator

11 versions available

The impact of an anticipatory eco-driver assistant system in different complex driving situations on the driver behavior

Year: 2014

Authors: CP Rommerskirchen, M Helmbrecht

The anticipatory advanced driver assistance system (ADAS) developed at the Institute of Ergonomics at the TU München helps each driver reduce individual fuel consumption through earlier anticipation. The goal is to achieve improvements in as many road situations as possible. The paper gives an overview of the different options for supporting drivers in reducing their fuel consumption. It also discusses the possibilities of extending anticipation to support the driver in eco-driving. Related work shows that anticipatory advanced driver assistance systems help to save fuel, but it focuses on the general potential of such systems. The study presented in this paper, however, deals with the impact of different road traffic situations on an anticipatory driver assistance system. Different traffic scenarios were chosen and varied in their complexity to evaluate the impact of situation complexity on an anticipatory ADAS. A driving simulator study was conducted with 27 participants. The results showed that fuel consumption is reduced with the assistance system due to earlier and better reactions, but that the complexity of a situation has no influence on this effect. The influence of the situation on the driver's use of the ADAS is, however, visible in visual behavior: the percentage of gaze time on the system's human-machine interface (HMI) is significantly reduced in the more complex situations.

Simulator
Software

7 versions available

Where am I? Investigating map matching during self-localization with mobile eye tracking in an urban environment

Year: 2014

Authors: P Kiefer, I Giannopoulos, M Raubal

Self-localization is the process of identifying one's current position on a map, and it is a crucial part of any wayfinding process. During self-localization the wayfinder matches visually perceptible features of the environment, such as landmarks, with map symbols to constrain potential locations on the map. The success of this visual matching process constitutes an important factor for the success of self-localization. In this research we aim at observing the visual matching process between environment and map during self-localization with real-world mobile eye tracking. We report on one orientation and one self-localization experiment, both in an outdoor urban environment. The gaze data collected during the experiments show that successful participants paid significantly more visual attention to those map symbols that were helpful in the given situation than unsuccessful participants did. A sequence analysis revealed that they also had significantly more switches of visual attention between map symbols and their corresponding landmarks in the environment, which suggests they were following a more effective self-localization strategy.
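The switch counts used in such a sequence analysis can be sketched as a simple transition count over an ordered stream of area-of-interest (AOI) labels. The label names and the function below are hypothetical, not taken from the paper.

```python
def count_switches(aoi_sequence):
    """Count transitions between 'map' and 'environment' fixations in an
    ordered stream of AOI labels (hypothetical labels; other labels such
    as 'other' are ignored as switch endpoints)."""
    return sum(1 for a, b in zip(aoi_sequence, aoi_sequence[1:])
               if a != b and {a, b} <= {"map", "environment"})
```

Each adjacent pair of fixations contributes one switch when the gaze moves directly between a map symbol and the environment; higher counts would correspond to the more effective matching strategy the abstract describes.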

Eye Tracking Glasses
Software

10 versions available

Free-hand pointing for identification and interaction with distant objects

Year: 2013

Authors: S Rümelin, C Marouane, A Butz

In this paper, we investigate pointing as a lightweight form of gestural interaction in cars. In a pre-study, we show the technical feasibility of reliable pointing detection with a depth camera by achieving a recognition rate of 96% in the lab. In a subsequent in-situ study, we let drivers point to objects inside and outside of the car while driving through a city. In three usage scenarios, we studied how this influenced their driving objectively, as well as subjectively. Distraction from the driving task was compensated by a regulation of driving speed and did not have a negative influence on driving behaviour. Our participants considered pointing a desirable interaction technique in comparison to current controller-based interaction and identified a number of additional promising use cases for pointing in the car.

Simulator
Software

7 versions available

HCI 2013

Year: 2013

Authors: M Hotel

Our vision of Tangible Bits is carried out through an artistic approach. Whereas today’s mainstream Human Computer Interaction (HCI) and Design research address functional concerns – the needs of users, practical applications, and usability evaluation – Tangible Bits is a vision driven by concepts. This is because today’s technologies will become obsolete in one year, and today’s applications will be replaced in 10 years, but true visions – we believe – can last longer than 100 years. Tangible Bits seeks to realize seamless interfaces between humans, digital information, and the physical environment by giving physical form to digital information, making bits directly manipulable and perceptible. Our goal is to invent new design media for artistic expression as well as for scientific analysis, taking advantage of the richness of human senses and skills – as developed through our lifetime of interaction with the physical world – as well as the computational reflection enabled by real-time sensing and digital feedback. I will present the trajectory of our vision-driven research and a variety of interaction design projects that were presented and exhibited in Media Arts, Design, and Science communities including: ICC, Ars Electronica, Centre Pompidou, Victoria and Albert Museum, Venice Biennale, ArtFutura, IDSA, ICSID, AIGA, ACM CHI, SIGGRAPH, UIST, CSCW.

Eye Tracking Glasses
Software

5 versions available

The influence of gaze history visualization on map interaction sequences and cognitive maps

Year: 2013

Authors: I Giannopoulos, P Kiefer, M Raubal

The restricted spatial context of mobile devices with small displays often makes orientation on mobile maps very difficult. In this paper we highlight and discuss the advantages of a proposed interaction concept called GeoGazemarks, which facilitates orientation by utilizing the history of a user's visual attention as a visual cue. We present novel results from analyses of interaction sequences and cognitive sketches collected during a user study with 40 participants. In addition, we correlate the efficiency gain achieved by GeoGazemarks with participants' self-reported spatial abilities, revealing significant negative correlations between spatial abilities, time efficiency gain, and interaction sequence length.

Eye Tracking Glasses
Software

3 versions available

Towards the automated recognition of assistance need for drivers with impaired visual field

Year: 2013

Authors: E Kasneci

Mobility enabled through driving is a crucial aspect of today's social lives. It concerns young and elderly people alike and is critical for those among us suffering from visual field defects. Since driving primarily involves visual input, such people are often considered unsafe drivers and banned from driving, although several recent studies, including our own, provide evidence that even severe visual field defects can be compensated for through effective visual search strategies. In this context, this work pursues the challenging vision of adaptive driving assistance systems that take the visual deficits of the driver into account to enable a safer driving experience. The main challenges towards this vision are: (1) individual analysis and detection of visual field defects, (2) online analysis of visual search behavior, and (3) integrated analysis of visual deficits, search behavior, and traffic objects to identify and draw the driver's attention towards potential hazards. Each of the above challenges is approached by customized methods. For (1), a mobile method for the assessment of the visual field and an algorithm for the recognition of the type of visual field defect are proposed. For (2), an online probabilistic method is combined with algebraic analysis of the driver's gaze. For (3), a detailed analysis of the driving scene is combined with the above methods to reliably detect hazardous traffic objects that might be overlooked by the driver. The methods were evaluated on real-world data from driving experiments with patients suffering from visual field defects. In combination, they improve over state-of-the-art techniques by being flexible, adaptive, and reliable. The feasibility of detecting objects that might be overlooked by the driver, and thus an adaptive assistance need, is demonstrated in different user studies. The methods developed in this work have broad applicability that reaches beyond the driving context. Their application to a variety of tasks involving visual perception might help better understand its underlying mechanisms. Some of these tasks are already being investigated and will also be presented in this thesis.

Eye Tracking Glasses
Software

4 versions available

Evaluation of Automotive HMI Using Eye-Tracking: Revision of the EN ISO 15007-1 & ISO TS 15007-2

Year: 2012

Authors: C Lange

This paper presents the revision of the documents EN ISO 15007-1 and ISO/TS 15007-2, carried out by the ISO TC22/SC13/WG8 working group of eye-tracking specialists from around the world. Both ISO documents were published in 1999. Since then, many research studies have been conducted, leading to a growing body of knowledge about eye-movement behavior. In parallel, eye-tracking technology has developed to enable fully automated data analysis. Both standards were therefore revised in the working group to incorporate the latest findings on eye-movement behavior and the latest developments in eye-tracking technology.

Eye Tracking Glasses
Software

1 version available

Gaze map matching: mapping eye tracking data to geographic vector features

Year: 2012

Authors: P Kiefer, I Giannopoulos

This paper introduces gaze map matching as the problem of algorithmically interpreting eye tracking data with respect to geographic vector features, such as a road network shown on a map. This differs from previous eye tracking studies which have not taken into account the underlying vector data of the cartographic map. The paper explores the challenges of gaze map matching and relates it to the (vehicle) map matching problem. We propose a gaze map matching algorithm based on a Hidden Markov Model, and compare its performance with two purely geometric algorithms. Two eye tracking data sets recorded during the visual inspection of 14 road network maps of varying realism and complexity are used for this evaluation.
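An HMM-based matcher of this kind can be sketched with standard Viterbi decoding: road segments are the hidden states, gaze samples are the observations, the emission likelihood falls off with the gaze-to-segment distance, and the transition matrix favors staying on (or moving to connected) segments. The sketch below is a generic Viterbi illustration under those assumptions, not the authors' algorithm; `dists`, `trans`, and the unit Gaussian noise scale are made up for the example.

```python
import numpy as np

def viterbi_gaze_match(dists, trans):
    """Most likely road-segment sequence for a stream of gaze samples.
    dists[t, s]: distance from gaze sample t to segment s.
    trans[s, s2]: log transition weight between segments.
    Hypothetical HMM sketch of gaze map matching, not the paper's code."""
    sigma = 1.0
    emit = -(dists ** 2) / (2 * sigma ** 2)   # Gaussian log-likelihood of distance
    T, S = emit.shape
    score = emit[0].copy()                    # best log-score per state so far
    back = np.zeros((T, S), dtype=int)        # best predecessor per state
    for t in range(1, T):
        cand = score[:, None] + trans         # score via each predecessor
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + emit[t]
    path = [int(score.argmax())]              # backtrack from the best end state
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

The transition term is what distinguishes this from the purely geometric baselines the paper compares against: a nearest-segment rule would flip between segments on every noisy sample, while the stay bonus smooths the decoded sequence.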

Eye Tracking Glasses
Software

8 versions available