Publication Hub Archive

GPS

You have reached the Ergoneers Publication Hub for:

Used Tool > GPS

Total results: 129

GeoGazemarks: Providing gaze history for the orientation on small display maps

Year: 2012

Authors: I Giannopoulos, P Kiefer, M Raubal

Orientation on small display maps is often difficult because the visible spatial context is restricted. This paper proposes to provide the history of a user's visual attention on a map as a visual cue to facilitate orientation. Visual attention on the map is recorded with eye tracking, clustered geo-spatially, and visualized when the user zooms out. This implicit gaze-interaction concept, called GeoGazemarks, has been evaluated in an experiment with 40 participants. The study demonstrates a significant increase in efficiency and an increase in effectiveness for a map search task, compared to standard panning and zooming.
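
The pipeline mentioned in the abstract (record gaze on the map, cluster it geo-spatially, show the aggregated history when the user zooms out) could be approximated as in the following minimal sketch. This is purely illustrative and not code from the paper; all names and the distance threshold are assumptions.

```python
# Hypothetical sketch of the GeoGazemarks idea: gaze samples recorded on a map
# are grouped geo-spatially so the aggregated gaze history can be rendered
# when the user zooms out. Threshold and structure are illustrative only.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class GazeSample:
    lat: float        # geographic latitude of the fixated map location
    lon: float        # geographic longitude
    duration: float   # fixation duration in seconds

def haversine_m(a: GazeSample, b: GazeSample) -> float:
    """Great-circle distance between two gaze samples in metres."""
    r = 6_371_000.0
    dlat, dlon = radians(b.lat - a.lat), radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(h))

def cluster_gaze(samples: list[GazeSample], max_dist_m: float = 50.0) -> list[list[GazeSample]]:
    """Greedy geo-spatial clustering: a sample joins the first cluster whose
    seed lies within max_dist_m, otherwise it starts a new cluster. The total
    duration per cluster could then scale the gaze marks drawn on the
    zoomed-out map."""
    clusters: list[list[GazeSample]] = []
    for s in samples:
        for c in clusters:
            if haversine_m(c[0], s) <= max_dist_m:
                c.append(s)
                break
        else:
            clusters.append([s])
    return clusters
```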

Eye Tracking Glasses
Software

5 versions available

Location-aware mobile eye tracking for the explanation of wayfinding behavior

Year: 2012

Authors: P Kiefer, F Straub, M Raubal

This paper proposes location-aware mobile eye-tracking as a new method for the explanation of wayfinding behavior. We argue that the recording and analysis of an individual’s visual attention during wayfinding can provide insights into her spatial abilities and employed wayfinding strategies. Examples from an explorative pilot study with a pedestrian audio guide in an urban context illustrate that mobile eye-tracking data provide new possibilities to analyze map usage, landmark identification, and orientation strategies.

Eye Tracking Glasses
Software

8 versions available

Towards location-aware mobile eye tracking

Year: 2012

Authors: P Kiefer, F Straub, M Raubal

This paper considers the impact of location as context in mobile eye tracking studies that extend to large-scale spaces, such as pedestrian wayfinding studies. It shows how adding a subject's location to her gaze data enhances the possibilities for data visualization and analysis. Results from an explorative pilot study on mobile map usage with a pedestrian audio guide demonstrate that the combined recording and analysis of gaze and position can help to tackle research questions on human spatial problem solving in a novel way.

Eye Tracking Glasses
Software

10 versions available

Traffic Light Assistant – Driven in a Simulator

Year: 2012

Authors: M Krause, K Bengler

In a driving simulator experiment, different interfaces for a traffic light phase assistant on a smartphone were tested. Changes and metrics of driving behavior are covered, including fuel consumption, compliance with the assistant's recommendations, speeding, reaction to display changes, gaze statistics, and gaze transition probabilities. The careful use of smartphones could be a cost-effective solution for driver information. The current simulator-based results showed no critical safety issues. Future trials will include testing the system on the road.

Simulator
Software

2 versions available

Assessment and support of error recognition in automated driving

Year: 2011

Authors: W Spießl

Technical progress in the field of automated driving research is about to alter the way of driving from manual control toward supervision of automated control. The increasing dissemination of advanced driver assistance systems brings more and more people into contact with (semi-)automated systems that not only warn against certain dangers and intervene if necessary, but are also able to take over parts of the driving task. Automated vehicles have the potential to increase traffic safety and efficiency and to reduce the driver's workload. This requires systems that work with absolute perfection, sensing and interpreting the environment correctly at all times and transforming this information into adequate actions. However, such systems are not yet available today. Therefore it is necessary that the driver supervises automated vehicle control systems in order to recognise automation errors and to intervene. Even if there is still a long way to go, it is worth taking a look at the ramifications an automated driving task implies.

Currently, there is no methodical approach for a systematic assessment of human error recognition capabilities in the context of automated driving. The Lane Change Test is a standardized and well-known method to measure driver performance under varying side conditions. In this thesis, this test has been further developed into the Automated Lane Change Test (ALCT). The ALCT allows the measurement of error recognition performance during an automated drive in a driving simulation environment using a set of objective metrics (mean response time, missed errors, false interventions). In several studies, this method has been assessed for objectivity, reliability, and validity. It proved sensitive to different secondary task conditions. Tasks requiring active engagement showed the most prominent effect on error recognition and response. Haptic feedback through the steering wheel showed a positive effect on error recognition performance.

Further measures are conceivable to improve the recognition of automation errors, in particular for the difficult situation of slowly drifting out of the lane. After discussing these measures with regard to effectiveness and acceptance, the most promising idea was implemented: a prospective driving path display that visualizes the vehicle's trajectory in the near future based on sensor data. By comparing the predicted path with the actual course of the road, a deviation caused by erroneous automation behaviour can be recognised earlier and potentially critical situations can be avoided. A user study showed that such a display should be realised in the form of a contact-analogue head-up display following the paradigm of Augmented Reality, since the error recognition results were best in this condition.

Simulator
Software

3 versions available

Eyeing the Supreme Court’s Challenge: A Proposal to Use Eye Tracking to Determine the Effects of Television Courtroom Broadcasting

Year: 2011

Authors: P Lambert

Eye Tracking Glasses
Software

1 version available

Analysis of glance movements in critical intersection scenarios

Year: 2010

Authors: M Plavsic, K Bengler, H Bubb

For designing effective and ergonomic assistance systems for road intersections, it is highly beneficial to gain an understanding of the causes of drivers' errors. At intersections, errors depend mainly on the applied visual strategies and the perceived information. This paper reports on a study conducted in a fixed-base driving simulator with the objective of comparing drivers' visual strategies across three left-turn intersection scenarios that can become critical with respect to safety.

Eye Tracking Glasses
Simulator

5 versions available

Reduction of fuel consumption by early anticipation and assistance of deceleration phases

Year: 2010

Authors: D Popiv, K Bengler, M Rakic

This work deals with the investigation of an advanced driver assistance system (ADAS) which helps the driver to perform deceleration phases in an efficient and safe manner. The concept of the assistance system is supported by early recognition of deceleration situations with the help of new sources of traffic information such as GPS-based systems, car-to-car, and car-to-infrastructure communication. The system presents visual information to the drivers in order to enhance their anticipation. Together with the representation of the emerging situation, the assistance suggests coasting the vehicle from the currently driven speed to the upcoming lower goal speed in order to reduce fuel consumption. If coasting does not suffice, the system suggests moderate braking. It is left to the driver's consideration whether to accept the system's advice. The analysis of the estimated fuel consumption and the acceptance of the assistance system is based on situational, driving, visual, and subjective data collected during the experiment drives in a fixed-base simulator. Twenty-six test subjects took part in the experiment; their average age was thirty-four years. After a simulator training, they had to complete three experiment drives in permuted order, each lasting between seventeen and twenty minutes: one drive without any assistance (baseline condition), one with the innovative visual assistance using a bird's-eye-view perspective on the emerging deceleration situation, and one with an iconic representation of it. The visual concepts are displayed in the digital instrument cluster. Each drive consists of thirteen deceleration situations which occur on rural, highway, and city roads. This work presents the results regarding the influence of the system on driving behavior. The data of the two assisted drives are compared to the baseline condition. The results show a significant reduction of the estimated fuel consumption in particular situations (up to 50%) and over the entire drive (approximately 4%). Maximum decelerations are significantly reduced in the investigated safety-critical situation. Drivers were able to avoid collisions that occurred during the baseline drives.
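
The coasting-versus-braking advice described above reduces to simple constant-deceleration kinematics. The sketch below is purely illustrative and not taken from the paper; the function name and the coasting threshold are assumptions.

```python
# Hypothetical sketch of the deceleration advice: estimate the constant
# deceleration needed to reach the lower goal speed at the given distance,
# and suggest coasting if engine drag alone would suffice, otherwise
# moderate braking. The threshold value is illustrative only.

def suggest_deceleration(current_speed_ms: float,
                         goal_speed_ms: float,
                         distance_m: float,
                         coast_decel_ms2: float = 0.5) -> str:
    """Return 'coast', 'brake moderately', or 'no action' for one deceleration situation."""
    if goal_speed_ms >= current_speed_ms or distance_m <= 0:
        return "no action"
    # Constant-deceleration kinematics: v_goal^2 = v_cur^2 - 2 * a * d
    required_decel = (current_speed_ms ** 2 - goal_speed_ms ** 2) / (2 * distance_m)
    if required_decel <= coast_decel_ms2:
        return "coast"             # engine drag alone is assumed sufficient
    return "brake moderately"      # coasting does not suffice

# Example: approaching a 50 km/h zone at 100 km/h, 600 m ahead -> 'coast'
print(suggest_deceleration(100 / 3.6, 50 / 3.6, 600.0))
```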

Simulator
Software

2 versions available

A high-level description and performance evaluation of Pupil Invisible

Year: 2020

Authors: M Tonsen, CK Baumann, K Dierkes

Head-mounted eye trackers promise convenient access to reliable gaze data in unconstrained environments. Due to several limitations, however, they often can only partially deliver on this promise. Among those are the following: (i) the necessity of performing a device setup and calibration prior to every use of the eye tracker, (ii) a lack of robustness of gaze-estimation results against perturbations, such as outdoor lighting conditions and unavoidable slippage of the eye tracker on the head of the subject, and (iii) behavioral distortion resulting from social awkwardness, due to the unnatural appearance of current head-mounted eye trackers. Recently, Pupil Labs released Pupil Invisible glasses, a head-mounted eye tracker engineered to tackle these limitations. Here, we present an extensive evaluation of its gaze-estimation capabilities. To this end, we designed a data-collection protocol and evaluation scheme geared towards providing a faithful portrayal of the real-world usage of Pupil Invisible glasses. In particular, we develop a geometric framework for gauging gaze-estimation accuracy that goes beyond reporting mean angular accuracy. We demonstrate that Pupil Invisible glasses, without the need for a calibration, provide gaze estimates which are robust to perturbations, including outdoor lighting conditions and slippage of the headset.

Eye Tracking Glasses
Software

4 versions available