Publication Hub Archive

Transportation & Mobility

You have reached the Ergoneers Publication Hub for:

Field of Application > Transportation & Mobility


Total results: 264

Driving with a partially autonomous forward collision warning system: How do drivers react?

Year: 2012

Authors: E Muhrer, K Reinprecht, M Vollrath

Objective: The effects of a forward collision warning (FCW) and braking system (FCW+) were examined in a driving simulator study analyzing driving and gaze behavior and the engagement in a secondary task.
Background: In-depth accident analyses indicate that a lack of appropriate expectations for possible critical situations and visual distraction may be the major causes of rear-end crashes. Studies with FCW systems have shown that a warning alone was not enough for a driver to be able to avoid the accident. Thus, an additional braking intervention by such systems could be necessary.
Method: In a driving simulator experiment, 30 drivers took part in a car-following scenario in an urban area. It was assumed that different lead car behaviors and environmental aspects would lead to different drivers’ expectations of the future traffic situation. Driving with and without FCW+ was introduced as a between-subjects factor.
Results: Driving with FCW+ resulted in significantly fewer accidents in critical situations. This result was achieved by the system’s earlier reaction time as compared with that of drivers. The analysis of the gaze behavior showed that driving with the system did not lead to a stronger involvement in secondary tasks.
Conclusion: The study supports the hypotheses about the importance of missing expectations for the occurrence of accidents. These accidents can be prevented by an FCW+ that brakes autonomously.
Application: The results indicate that an autonomous braking intervention should be implemented in FCW systems to increase the effectiveness of these assistance systems.

Eye Tracking Glasses
Software

8 versions available

Evaluation of Automotive HMI Using Eye Tracking – Revision of EN ISO 15007-1 & ISO/TS 15007-2

Year: 2012

Authors: C Lange

This paper presents the revision of the documents EN ISO 15007-1 and ISO/TS 15007-2, which was carried out by the ISO TC22/SC13/WG8 working group, consisting of eye-tracking specialists worldwide. Both ISO documents were published in 1999. Since then, many research studies have been conducted, which led to an increasing level of knowledge about eye-movement behavior. In parallel, eye-tracking technology developed to enable fully automated data analysis. For these reasons, both standards were revised in the ISO TC22/SC13/WG8 working group to include the latest findings on eye-movement behavior and the latest developments in eye-tracking technology.

Eye Tracking Glasses
Software

1 version available

Gaze map matching: mapping eye tracking data to geographic vector features

Year: 2012

Authors: P Kiefer, I Giannopoulos

This paper introduces gaze map matching as the problem of algorithmically interpreting eye tracking data with respect to geographic vector features, such as a road network shown on a map. This differs from previous eye tracking studies which have not taken into account the underlying vector data of the cartographic map. The paper explores the challenges of gaze map matching and relates it to the (vehicle) map matching problem. We propose a gaze map matching algorithm based on a Hidden Markov Model, and compare its performance with two purely geometric algorithms. Two eye tracking data sets recorded during the visual inspection of 14 road network maps of varying realism and complexity are used for this evaluation.
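The Hidden Markov Model approach described above can be illustrated as Viterbi decoding over road segments: segments are hidden states, fixations are observations, emission likelihood falls off with distance, and transitions favor connected segments. The toy geometry, Gaussian emission with an assumed sigma, and adjacency-based transition weights below are illustrative assumptions, not the paper's actual algorithm:

```python
import math

# Hypothetical toy data: road segments represented by midpoint coordinates,
# segment adjacency, and a sequence of gaze fixations (x, y).
segments = {"A": (0.0, 0.0), "B": (1.0, 0.0), "C": (1.0, 1.0)}
adjacent = {"A": {"A", "B"}, "B": {"A", "B", "C"}, "C": {"B", "C"}}
fixations = [(0.1, 0.0), (0.9, 0.1), (1.0, 0.9)]

def emission(fix, seg):
    """Likelihood of a fixation given the gazed segment:
    Gaussian falloff with distance (sigma is an assumed parameter)."""
    sigma = 0.5
    d = math.dist(fix, segments[seg])
    return math.exp(-(d * d) / (2 * sigma * sigma))

def transition(s1, s2):
    """Favor staying on the same segment or moving to a connected one."""
    return 1.0 if s2 in adjacent[s1] else 0.01

def viterbi(obs):
    """Most likely segment sequence for a fixation sequence."""
    states = list(segments)
    # log-probability of the best path ending in each state
    probs = {s: math.log(emission(obs[0], s)) for s in states}
    path = {s: [s] for s in states}
    for fix in obs[1:]:
        new_probs, new_path = {}, {}
        for s in states:
            prev = max(states, key=lambda p: probs[p] + math.log(transition(p, s)))
            new_probs[s] = (probs[prev] + math.log(transition(prev, s))
                            + math.log(emission(fix, s)))
            new_path[s] = path[prev] + [s]
        probs, path = new_probs, new_path
    return path[max(states, key=probs.get)]

print(viterbi(fixations))  # → ['A', 'B', 'C']
```

The transition model is where such an approach differs from the purely geometric baselines the paper compares against: a nearest-segment rule would decide each fixation independently, while the HMM smooths the sequence along the network topology.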

Eye Tracking Glasses
Software

8 versions available

Towards location-aware mobile eye tracking

Year: 2012

Authors: P Kiefer, F Straub, M Raubal

This paper considers the impact of location as context in mobile eye tracking studies that extend to large-scale spaces, such as pedestrian wayfinding studies. It shows how adding a subject's location to her gaze data enhances the possibilities for data visualization and analysis. Results from an explorative pilot study on mobile map usage with a pedestrian audio guide demonstrate that the combined recording and analysis of gaze and position can help to tackle research questions on human spatial problem solving in a novel way.

Eye Tracking Glasses
Software

10 versions available

Traffic Light Assistant – Driven in a Simulator

Year: 2012

Authors: M Krause, K Bengler

In a driving simulator experiment, different interfaces for a traffic light phase assistant on a smartphone were tested. Changes and metrics of driving behavior are covered, including fuel consumption, compliance with the assistant’s recommendations, speeding, reaction to display changes, gaze statistics, and gaze transition probabilities. The careful use of smartphones could be a cost-effective solution for driver information. The current simulator-based results showed no critical safety issues. Future trials will include testing the system on the road.

Simulator
Software

2 versions available

Traffic light assistant – evaluation of information presentation

Year: 2012

Authors: M Krause, K Bengler

To reduce stops at traffic lights and the related fuel consumption, a car driver would need information about the state of upcoming signals. Previous research has shown the potential for this (Thoma et al. 2008, Popiv et al. 2010). One aspect of the German pilot project KOLIBRI (cooperative optimization of traffic signal control) is to deliver this information to the car via already installed mobile phone networks. The information needs to be displayed to the driver while driving. So, on the one hand, special care must be taken to ensure suitability while driving; on the other hand, the interface needs a pleasant design to be accepted by the driver. In a static driving simulator, five human-machine interface designs for a smartphone were tested using objective measurements (glance duration) and subjective ratings (SUS, AttrakDiff).

Simulator
Software

6 versions available

Assessment and support of error recognition in automated driving

Year: 2011

Authors: W Spießl

Technical progress in the field of automated driving research is about to alter the way of driving from manual control toward supervision of automated control. The increasing dissemination of advanced driver assistance systems brings more and more people into contact with (semi-)automated systems that not only warn against certain dangers and intervene if necessary, but are also able to take over parts of the driving task. Automated vehicles have the potential to increase traffic safety and efficiency and to reduce the driver’s workload. This requires systems working with absolute perfection that sense and interpret the environment correctly at any time and transform this information into adequate actions. However, such systems are not yet available today. Therefore it is necessary that the driver supervises automated vehicle control systems in order to be able to recognise automation errors and to intervene. Even if there is still a long way to go, it is worth taking a look at the ramifications an automated driving task implies.
Currently, there is no methodical approach for a systematic assessment of human error recognition capabilities in the context of automated driving. The Lane Change Test is a standardized and well-known method to measure driver performance under varying side conditions. In this thesis, this test has been further developed into the Automated Lane Change Test (ALCT). The ALCT allows the measurement of error recognition performance during an automated drive in a driving simulation environment using a set of objective metrics (mean response time, missed errors, false interventions). In several studies, this method has been assessed for objectivity, reliability and validity. It proved sensitive to different secondary task conditions. Tasks requiring active engagement showed the most prominent effect on error recognition and response. Haptic feedback through the steering wheel showed a positive effect on error recognition performance.
There are more potential measures imaginable for improving the recognition of automation errors, in particular in the difficult situation of slowly drifting out of the lane. After a discussion of these measures regarding effectiveness and acceptance, the most promising idea for improving this situation has been implemented: a prospective driving path display that visualizes the vehicle’s near-future trajectory based on sensor data. By comparing the predicted path with the actual course of the road, a deviation caused by erroneous automation behaviour can be recognised earlier and potentially critical situations can be avoided. A user study showed that such a display should be realised in the form of a contact-analogue head-up display following the paradigm of Augmented Reality, since the error recognition results were best in this condition.
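The three objective ALCT metrics named above (mean response time, missed errors, false interventions) could be computed from logged event times roughly as follows. This is a hypothetical sketch: the matching rule, the response window, and all names are assumptions, not the thesis's definitions:

```python
# Assumed: a driver has this many seconds to respond to an automation error.
RESPONSE_WINDOW = 5.0

def alct_metrics(error_times, intervention_times):
    """Match each automation error (timestamp in seconds) to the first
    driver intervention within the response window. Unmatched errors count
    as misses; unmatched interventions count as false interventions."""
    response_times = []
    used = set()
    missed = 0
    for t_err in sorted(error_times):
        match = next((t for t in sorted(intervention_times)
                      if t not in used and 0 <= t - t_err <= RESPONSE_WINDOW),
                     None)
        if match is None:
            missed += 1
        else:
            used.add(match)
            response_times.append(match - t_err)
    false_interventions = len(intervention_times) - len(used)
    mean_rt = (sum(response_times) / len(response_times)
               if response_times else None)
    return {"mean_response_time": mean_rt,
            "missed_errors": missed,
            "false_interventions": false_interventions}

# Errors at 10 s, 40 s, 70 s; interventions at 11.2 s, 41.0 s, 55.0 s:
# one missed error (70 s), one false intervention (55 s), mean RT ≈ 1.1 s.
print(alct_metrics([10.0, 40.0, 70.0], [11.2, 41.0, 55.0]))
```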

Simulator
Software

3 versions available

Development of Distraction-Free Control Systems in a Driving Simulator

Year: 2011

Authors: C Lange, K Bengler, R Spies, M Wohlfarter

Ensuring driver safety by minimizing distraction is crucial in the development of in-vehicle information and communication systems. The article discusses the design and evaluation of control systems intended to reduce driver distraction, leveraging a driving simulator for the research. The work examines various standards and guidelines for in-vehicle display systems, as well as metrics for assessing driver distraction and cognitive workload. The goal is to enhance road safety through improved driver assistance systems, such as adaptive cruise control, by ensuring that they do not overly tax the driver’s attention. The findings are based on extensive behavioral and eye-tracking studies conducted in a simulated driving environment.

Eye Tracking Glasses
Simulator

2 versions available

Measurement of Driver’s Distraction for an Early Prove of Concepts in Automotive Industry at the Example of the Development of a Haptic Touchpad

Year: 2011

Authors: R Spies, A Blattner, C Lange, M Wohlfarter

This contribution shows how the user’s behavior can be integrated into the development process at a very early concept stage. To this end, methodologies for objectifying human behavior, such as eye tracking and video observation with the Dikablis/DLab environment in the Audi driving simulator, are necessary. A demonstrative example is the predevelopment of a touchpad with an adjustable haptic surface as a concept idea for infotainment interaction with the Audi MMI. First, an overview is given of the idea of capturing human behavior to evaluate concept ideas at a very early stage of the development process, and of how this is realized with the Dikablis and DLab environment. Furthermore, the paper describes the concept idea of the innovative control element of the haptic touchpad, as well as the resulting research questions and how they were addressed. Finally, some example results are given.

Eye Tracking Glasses
Simulator

5 versions available

The influence of predictability and frequency of events on the gaze behaviour while driving

Year: 2011

Authors: R Kaul, M Baumann, B Wortelen

One possible reason for rear-end crashes is driver distraction: the driver does not pay enough attention to the driving task. The allocation of attention must therefore be appropriate to the demands of the current traffic situation. According to the SEEV model, the allocation of attention is determined by the expectancy that there will be new information in a visual channel, and expectancy in turn is determined by the event rate of that channel. To investigate to what extent the allocation of attention is determined by the absolute frequency of events or by the expected event rate, an experiment was conducted in a dynamic driving simulator. The current results show that the predictability of the behaviour of the lead car has a greater influence on the allocation of visual attention than the frequency of speed changes of the lead car and the frequency of a visual secondary task.
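The SEEV model referenced above predicts attention allocation across visual channels as a weighted combination of Salience, Effort (which deters glances), Expectancy, and Value. A minimal sketch, in which the coefficients and the channel scores are purely illustrative assumptions, not values from the study:

```python
def seev_score(salience, effort, expectancy, value,
               w_s=1.0, w_ef=1.0, w_ex=1.0, w_v=1.0):
    """SEEV-style attractiveness of a visual channel: salience, expectancy
    and value raise it, effort lowers it. Weights are assumed equal here."""
    return w_s * salience - w_ef * effort + w_ex * expectancy + w_v * value

# Hypothetical two-channel driving scenario: the lead car vs. a secondary task.
channels = {
    "lead_car":       seev_score(salience=2, effort=0, expectancy=3, value=3),
    "secondary_task": seev_score(salience=1, effort=1, expectancy=2, value=1),
}

# Normalize scores into predicted shares of visual dwell time.
total = sum(channels.values())
dwell_share = {c: round(s / total, 2) for c, s in channels.items()}
print(dwell_share)  # → {'lead_car': 0.73, 'secondary_task': 0.27}
```

Under this sketch, raising the expectancy term for the lead-car channel (e.g. a less predictable lead car producing more events) shifts the predicted dwell-time share toward it, which is the kind of effect the experiment examines.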

Eye Tracking Glasses
Software

7 versions available