Publication Hub Archive

Transportation & Mobility

Total results: 264

Adjunct Proceedings of AutomotiveUI '14

Year: 2014

Authors: LN Boyle, AL Kun, S Osswald, B Pearce, D Szostak

Adjunct Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '14), Sept. 17–19, 2014, Seattle, WA, USA. Table of contents (excerpt):

- The MIT AgeLab n-back: a multi-modal android application implementation (p. 56)
- Cognitive Workload and Driver Glance Behavior (p. 62)
- Using an OpenDS Driving Simulator for Car Following: A First Attempt (p. 64)
- Cognitive load in autonomous vehicles (p. 70)
- WS3: Pointing towards future automotive HMIs: The potential for gesture interaction (p. 74). Linda Angell, Yu Zhang
  - Pointing Towards Future Automotive HMIs: The Potential for Gesture Interaction (p. 75)
  - Applying Popular Usability Heuristics to Gesture Interaction in the Vehicle (p. 81)
  - The steering wheel as a touch interface: Using thumb-based gestural interfaces as control inputs while driving (p. 88)
- WS4: EVIS 2014, 3rd Workshop on Electric Vehicle Information Systems (p. 92). Organizers: Sebastian Osswald, Technische Universität München, Germany; Sebastian Loehmann, University of Munich (LMU), Germany; Anders Lundström, Royal Institute of Technology, Sweden; Ronald Schroeter, Queensland University of Technology, Australia; Andreas Butz, University of Munich (LMU), Germany; Markus Lienkamp, Technische Universität München, Germany
- Workshop 5: Human Factors Design Principles for the Driver-Vehicle Interface (DVI) (p. 121). Organizers: John L. Campbell, Battelle, USA; Christian M. Richard, Battelle, USA; L. Paige Bacon, Battelle, USA; Zachary R. Doerzaph, Virginia Tech Transportation Institute, USA
- Workshop 6: Designing for People: Keeping the User in Mind (p. 128). Organizers: John Robert Wilson, User Experience (UX) Group, Fujitsu Ten Corp. of America; Jenny Le, User Experience (UX) Group, Fujitsu Ten Corp. of America
- Workshop 7: 2nd Workshop on User Experience of Autonomous Driving at AutomotiveUI 2014 (p. 133). Organizers: Alexander Meschtscherjakov, University of Salzburg, Austria; Manfred Tscheligi, University of Salzburg, Austria; Dalila Szostak, Google, USA; Rabindra Ratan, Michigan State University, USA; Ioannis Politis, University of Glasgow, UK; Roderick McCall, University of Luxembourg, Luxembourg; Sven Krome, RMIT University, Australia
- Workshop 8: Wearable Technologies for Automotive User Interfaces: Danger or Opportunity? (p. 152). Organizers: Maurizio Caon, University of Applied Sciences and Arts Western Switzerland, Switzerland; Leonardo Angelini, University of Applied Sciences and Arts Western Switzerland, Switzerland; Elena Mugellini, University of Applied Sciences and Arts Western Switzerland, Switzerland; Michele Tagliabue, Paris Descartes University, France; Paolo Perego, Politecnico di Milano, Italy; Giuseppe Andreoni, Politecnico di Milano, Italy
- Work in Progress (p. 158)
- Interactive Demo (p. 255)

Simulator
Software

1 version available

D5.2: Plan for Integration of Empirical Analysis Techniques and Tools into the HF-RTP and Methodology

Year: 2014

Authors: MUTO Botta, STWT Borchers, CTWT Curio

This deliverable consists of two parts. The “Integration Plan Common Part” is shared by deliverables D2.2 to D5.2 and explains how methods, tools and techniques (MTTs) are integrated into the Human Factors Reference Technology Platform (HF-RTP). The present document details the MTTs which WP5 will contribute as components to the HF-RTP. Details concerning the HoliDes RTP, its methodology and the integration of components can also be found in D1.1 and the forthcoming D1.3. Here, we describe the MTTs which the partners are developing or advancing in WP5 of HoliDes. These MTTs will eventually form the HF-RTP. They serve WP5’s vision of extending and developing empirical methods that aid the design and development of adaptive, cooperative human-machine systems and that support developers in conforming to existing norms and standards. The MTTs of WP5 consist largely of empirical methods, which are an integral part of any human-centered systems engineering process. Their precise position and use in a workflow depend on the AdCoS under development, the organization that uses them, and individual considerations; these considerations determine how the RTP is tailored to a specific use case. Empirical MTTs are essential in both early and late stages of the design process of a human-machine system, for example during requirements analysis or the verification of human-factors-related non-functional requirements. However, they can also be an integral part of the development phase, especially when principles of agile requirements engineering are applied. While in the CESAR RTP only software tools manipulate data, in HoliDes various kinds of MTTs are used. Each MTT that is part of the development and evaluation of an AdCoS manipulates data and is an integral part of the engineering environment.

Eye Tracking Glasses
Software

1 version available

Designing driver assistance systems with crossmodal signals: Multisensory integration rules for saccadic reaction times apply

Year: 2014

Authors: R Steenken, L Weber, H Colonius, A Diederich

Modern driver assistance systems make increasing use of auditory and tactile signals in order to reduce the driver's visual information load. This entails potential crossmodal interaction effects that need to be taken into account in designing an optimal system. Here we show that saccadic reaction times to visual targets (cockpit or outside mirror), presented in a driving simulator environment and accompanied by auditory or tactile accessory stimuli, follow some well-known spatiotemporal rules of multisensory integration usually found under confined laboratory conditions. Auditory nontargets speed up reaction time by about 80 ms. The effect tends to be maximal when the nontarget is presented 50 ms before the target and when target and nontarget are spatially coincident. The effect of a tactile nontarget (vibrating steering wheel) was less pronounced and not spatially specific. The average reaction times are well described by the stochastic “time window of integration” (TWIN) model for multisensory integration developed by the authors. This two-stage model postulates that crossmodal interaction occurs only if the peripheral processes from the different sensory modalities terminate within a fixed temporal interval, and that the amount of crossmodal interaction manifests itself in an increase or decrease of second-stage processing time. A qualitative test is consistent with the model prediction that the probability of interaction, but not the amount of crossmodal interaction, depends on target–nontarget onset asynchrony. A quantitative model fit yields estimates of individual participants' parameters, including the size of the time window. Some consequences for the design of driver assistance systems are discussed.
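
The two-stage TWIN model lends itself to a short simulation. The sketch below is a minimal Monte Carlo illustration of the mechanism described in the abstract; the exponential first-stage distributions, the constant second stage, and all parameter values are assumptions for illustration, not the fits reported by the authors.

```python
import numpy as np

rng = np.random.default_rng(0)

def twin_rt(soa, n=100_000, mu_v=60.0, mu_a=50.0,
            omega=200.0, delta=80.0, second_stage=180.0):
    """Monte Carlo sketch of the time-window-of-integration (TWIN) model.

    First stage: independent exponential peripheral processes race for the
    visual target (V) and the auditory nontarget (A); `soa` shifts the
    nontarget relative to the target (negative: nontarget first), all in ms.
    Integration occurs only if the nontarget's process terminates before
    the target's and the target's terminates within the window `omega`;
    integration then shortens the second stage by `delta`.
    """
    v = rng.exponential(mu_v, n)            # target peripheral duration
    a = rng.exponential(mu_a, n) + soa      # nontarget duration on target clock
    integrated = (a < v) & (v < a + omega)  # window-of-integration rule
    rt = v + second_stage - delta * integrated
    return rt.mean(), integrated.mean()

for soa in (-100, -50, 0, 50):
    mean_rt, p_int = twin_rt(soa)
    print(f"SOA {soa:+4d} ms: mean RT {mean_rt:6.1f} ms, P(integration) {p_int:.2f}")
```

Consistent with the abstract's qualitative prediction, varying the onset asynchrony changes only the probability of integration, while the amount of facilitation per integrated trial (`delta`) stays fixed.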

Eye Tracking Glasses
Simulator

12 versions available

Driving with binocular visual field loss? A study on a supervised on-road parcours with simultaneous eye and head tracking

Year: 2014

Authors: E Kasneci, K Sippel, K Aehling, M Heister

Post-chiasmal visual pathway lesions and glaucomatous optic neuropathy cause binocular visual field defects (VFDs) that may critically interfere with quality of life and driving licensure. The aims of this study were (i) to assess the on-road driving performance of patients suffering from binocular visual field loss using a dual-brake vehicle, and (ii) to investigate the related compensatory mechanisms. A driving instructor, blinded to the participants' diagnosis, rated the driving performance (passed/failed) of ten patients with homonymous visual field defects (HP), including four patients with right (HR) and six patients with left homonymous visual field defects (HL), ten glaucoma patients (GP), and twenty age- and gender-matched, ophthalmologically healthy control subjects (C) during a 40-minute driving task on a pre-specified public on-road parcours. In order to investigate the subjects' visual exploration ability, eye movements were recorded by means of a mobile eye tracker. Two additional cameras were used to monitor the driving scene and record head and shoulder movements. The study is novel in that a quantitative assessment of eye movements was combined with an evaluation of head and shoulder movements. Six out of ten HP and four out of ten GP were rated as fit to drive by the driving instructor, despite their binocular visual field loss. Three out of twenty control subjects failed the on-road assessment. The extent of the visual field defect was of minor importance with regard to driving performance. The site of the homonymous visual field defect (HVFD), however, critically interfered with driving ability: all failed HP subjects suffered from left homonymous visual field loss (HL) due to right hemispheric lesions. Patients who failed the driving assessment had difficulties mainly with lane keeping and gap judgment. Patients who passed the test displayed different exploration patterns than those who failed: they fixated the central area of the visual field for longer and performed more glances towards the area of their visual field defect. In conclusion, our findings support the hypothesis that the extent of the visual field per se cannot predict driving fitness, because some patients with HVFDs and advanced glaucoma can compensate for their deficit by effective visual scanning. Head movements appeared to be superior to eye and shoulder movements in predicting the outcome of the driving test under the present study scenario.

Eye Tracking Glasses
Simulator

19 versions available

Gaze guidance for the visually impaired

Year: 2014

Authors: TC Kübler, E Kasneci, W Rosenstiel

Visual perception is perhaps the most important sensory input. During driving, about 90% of the relevant information is related to the visual input [Taylor 1982]. However, the quality of visual perception decreases with age, mainly due to a reduction in visual acuity or as a consequence of diseases affecting the visual system. Amongst the most severe types of visual impairment are visual field defects (areas of reduced perception in the visual field), which occur as a consequence of diseases affecting the brain, e.g., stroke, brain injury, or trauma, or diseases affecting the optic nerve, e.g., glaucoma. Due to demographic aging, the number of people with such visual impairments is expected to rise [Kasneci 2013]. Since persons suffering from visual impairments may overlook hazardous objects, they are prohibited from driving. This, however, leads to a decrease in quality of life, mobility, and participation in social life. Several studies have shown that some patients achieve safe driving behavior despite their visual impairment by performing effective visual exploration, i.e., adequate eye and head movements (e.g., towards their visual field defect [Kasneci et al. 2014b]). Thus, a better understanding of the mechanisms of visual perception, i.e., of why and how we attend to certain parts of our environment while 'ignoring' others, is key to helping visually impaired persons in complex, real-life tasks such as driving a car.

Eye Tracking Glasses
Simulator

3 versions available

Masking Action Relevant Stimuli in dynamic environments: The MARS method

Year: 2014

Authors: L Rittger, A Kiesel, G Schmidt, C Maag

We present the novel MARS (Masking Action Relevant Stimuli) method for measuring drivers’ information demand for an action-relevant stimulus in the driving scene. In a driving simulator setting, the traffic light, as a dynamic action-relevant stimulus, was masked. Drivers pressed a button to unmask the traffic light for a fixed period of time, as often as they wanted. We compared the number of button presses with the number of fixations on the traffic light in a separate block using eye tracking. For the driving task, we varied the road environment by presenting different traffic light states, by adding or omitting a lead vehicle, and by manipulating the visibility of the driving environment with or without fog. Results showed that these experimental variations affected the number of button presses as the dependent measure of the MARS method. Although the number of fixations was affected in a qualitatively similar way, changes were more pronounced in the number of fixations than in the number of button presses. We argue that the number of button presses is an indicator of the action relevance of the stimulus, complementing or even substituting the recording and analysis of gaze behaviour for specific research questions. In addition, using the MARS method did not change dynamic driving behaviour, and driving with the MARS method was neither disturbing nor difficult to learn. Future research is required to show the generalisability of the method to other stimuli in the driving scene.
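
The core masking logic of the method is simple to express in code. The sketch below is a hypothetical reimplementation of the unmasking rule described in the abstract (stimulus masked by default, each button press reveals it for a fixed period and counts as one information request); the class name and the one-second duration are assumptions, not taken from the paper.

```python
import time

class MarsMask:
    """Sketch of MARS masking: an action-relevant stimulus (e.g., a traffic
    light) is masked by default; each button press unmasks it for a fixed
    period and is logged as one information request, the method's
    dependent measure."""

    def __init__(self, unmask_duration=1.0):  # seconds; assumed value
        self.unmask_duration = unmask_duration
        self.visible_until = 0.0
        self.press_count = 0

    def on_button_press(self, now=None):
        now = time.monotonic() if now is None else now
        self.press_count += 1
        self.visible_until = now + self.unmask_duration

    def is_visible(self, now=None):
        """Each render frame draws the real stimulus or the mask based on this."""
        now = time.monotonic() if now is None else now
        return now < self.visible_until

mask = MarsMask()
mask.on_button_press(now=0.0)
print(mask.is_visible(now=0.5), mask.is_visible(now=1.5))  # True False
print("information requests:", mask.press_count)           # 1
```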

Simulator
Software

10 versions available

SteerPad: Development and Evaluation of a Touchpad in the Steering Wheel from a User Experience Perspective

Year: 2014

Authors: V Swantesson, D Gunnarsson

Driver safety has been paramount since the birth of the automobile. At a time when technologies are changing the way people interact with the outside world, the vehicle industry needs to keep up with these changes in terms of both safety and user experience. Many of these technologies have been integrated into cars, leading to more distraction while driving; the thesis describes this dilemma as the gap between automobile safety and in-vehicle infotainment. Using a touchpad installed on the right-hand side of the steering wheel, the thesis develops and evaluates a prototype interface, located in the vehicle's dashboard display, with the goal of lowering driver distraction. The touchpad supports three main sources of interaction: swipes, tactile interaction, and character recognition. By merging and combining these sources, a test prototype was developed for evaluation. The prototype was tested against an existing in-vehicle information system, using a number of use cases and scenarios to compare the systems in terms of usability and user experience. Guidelines on safety regulations set by NHTSA were studied and applied to the development and user studies. Test results indicate that this technology has the potential to lower driver distraction while still maintaining a high level of usability and user experience. Finally, the thesis presents a number of suggestions and ideas for further development and studies.
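
The abstract names the three input channels but not how the prototype distinguishes them. Purely as a hypothetical sketch, a dispatcher could route raw touch strokes by duration and travel before handing longer traces to a character recognizer; every threshold and name below is invented for illustration.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Stroke:
    points: List[Tuple[float, float]]  # (x, y) samples in touchpad units
    duration: float                    # seconds of contact

def classify(stroke: Stroke) -> str:
    """Hypothetical dispatch between the three interaction sources:
    brief contact -> tap (tactile), short fast travel -> swipe,
    anything longer -> character recognition."""
    if stroke.duration < 0.15 and len(stroke.points) < 4:
        return "tap"
    (x0, y0), (x1, y1) = stroke.points[0], stroke.points[-1]
    travel = abs(x1 - x0) + abs(y1 - y0)
    if stroke.duration < 0.4 and travel > 0.3:
        return "swipe"       # e.g., scroll a dashboard menu
    return "character"       # hand the full trace to a recognizer

print(classify(Stroke([(0.1, 0.5), (0.8, 0.5)], 0.2)))  # swipe
```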

Simulator
Software

3 versions available

The impact of an anticipatory eco-driver assistant system in different complex driving situations on the driver behavior

Year: 2014

Authors: CP Rommerskirchen, M Helmbrecht

The anticipatory advanced driver assistance system (ADAS) developed at the Institute of Ergonomics at the TU München helps each driver reduce individual fuel consumption through earlier anticipation. The goal is to achieve improvements in as many road situations as possible. The paper gives an overview of the different options for supporting the driver in reducing fuel consumption and discusses the possibilities of extending anticipation to support eco-driving. Related work shows that anticipatory driver assistance systems help to save fuel, but it focuses on the general potential of such systems. The study presented in this paper, in contrast, deals with the impact of different road traffic situations on an anticipatory driver assistance system. Different traffic scenarios were chosen and varied in their complexity to evaluate the impact of situation complexity on an anticipatory ADAS. A driving simulator study was conducted with 27 participants. The results showed that fuel consumption is reduced with the assistance system due to earlier and better reactions, but that the complexity of a situation has no influence on this effect. The influence of the situation on the driver's use of the ADAS is visible in visual behavior: the percentage of gaze time on the system's human-machine interface (HMI) is significantly reduced in the more complex situations.

Simulator
Software

7 versions available

Traffic light assistant: what the users want

Year: 2014

Authors: M Krause, A Rissel, K Bengler

In a driving simulator experiment, a prototypical traffic light phase assistant is assessed. The main research issue: how would a user customize the system? As a sideline, data is gathered with a special Detection Response Task (DRT), the Tactile Detection Task (TDT), in conjunction with an auditory cognitive task as reference. Recorded gaze data, driving behavior, and subjective ratings from a System Usability Scale (SUS) and an AttrakDiff2 questionnaire are also reported. The subjects were able to customize ten parameters of the traffic light assistant. The personalized system configuration showed no great enhancement in the subjective ratings; thus, the later application implementation will include only a few configuration features for the user. However, the test persons exhibited a willingness to be informed about speeding by a speed-alerting function within the traffic light assistant. The performance (reaction time) on the TDT is interpreted as a measure of cognitive load while using the interface. The auditory cognitive task prolonged tactile detection reaction times more than the traffic light information system did. The glance times are in line with current guidelines, and the driving behavior shows a potential safety benefit. Thus, the reported experiment evaluates an interface for use while driving with objective metrics regarding distraction and subjective results related to usability and joy of use.
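
For readers unfamiliar with detection-task scoring: DRT-style data are typically reduced to a mean reaction time and a miss rate by pairing each stimulus with the first response inside an accepted window. The sketch below illustrates this; the 100 ms to 2500 ms window follows common DRT practice and is an assumption here, since the abstract does not give analysis details.

```python
def score_detection_task(stimulus_onsets, response_times,
                         min_rt=0.1, max_rt=2.5):
    """Pair each tactile stimulus with the first unconsumed response in
    [onset+min_rt, onset+max_rt]; return mean RT and miss rate (times
    in seconds; window bounds assumed, not taken from the paper)."""
    responses = sorted(response_times)
    rts, used = [], 0
    for onset in sorted(stimulus_onsets):
        while used < len(responses) and responses[used] < onset + min_rt:
            used += 1                      # too early: skip (false start)
        if used < len(responses) and responses[used] <= onset + max_rt:
            rts.append(responses[used] - onset)
            used += 1
    mean_rt = sum(rts) / len(rts) if rts else float("nan")
    miss_rate = 1 - len(rts) / len(stimulus_onsets)
    return mean_rt, miss_rate

# Three stimuli, one missed: mean RT 0.535 s, miss rate 1/3
print(score_detection_task([1.0, 5.0, 9.0], [1.45, 9.62]))
```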

Eye Tracking Glasses
Simulator

4 versions available

Wayfinding decision situations: A conceptual model and evaluation

Year: 2014

Authors: I Giannopoulos, P Kiefer, M Raubal, KF Richter

Humans engage in wayfinding many times a day. We try to find our way in urban environments when walking towards our work places or when visiting a city as tourists. In order to reach the targeted destination, we have to make a series of wayfinding decisions of varying complexity. Previous research has focused on classifying the complexity of these wayfinding decisions, primarily looking at the complexity of the decision point itself (e.g., the number of possible routes or branches). In this paper, we proceed one step further by incorporating the user, instructions, and environmental factors into a model that assesses the complexity of a wayfinding decision. We constructed and evaluated three models using data collected from an outdoor wayfinding study. Our results suggest that additional factors approximate the complexity of a wayfinding decision better than the simple model using only the number of branches as a criterion.
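
The abstract does not give the models' functional form. Purely as a hypothetical illustration of the idea (user, instruction, and environmental factors modulating branch-based complexity), a baseline and an extended model might be contrasted like this; all feature names and weights are invented:

```python
from dataclasses import dataclass

@dataclass
class DecisionSituation:
    n_branches: int               # routes leaving the decision point
    instruction_ambiguity: float  # 0 (clear) .. 1 (ambiguous)
    landmark_salience: float      # 0 (hidden) .. 1 (prominent)
    user_familiarity: float       # 0 (stranger) .. 1 (local)

def branch_only_complexity(s: DecisionSituation) -> float:
    """Baseline: complexity depends on the number of branches alone."""
    return float(s.n_branches)

def extended_complexity(s: DecisionSituation) -> float:
    """Extended: other factors modulate branch-based complexity
    (illustrative weights, not the models evaluated in the paper)."""
    return (s.n_branches
            * (1.0 + 0.8 * s.instruction_ambiguity)
            * (1.0 - 0.4 * s.landmark_salience)
            * (1.0 - 0.3 * s.user_familiarity))

s = DecisionSituation(n_branches=4, instruction_ambiguity=0.7,
                      landmark_salience=0.2, user_familiarity=0.1)
print(branch_only_complexity(s))            # 4.0
print(round(extended_complexity(s), 2))     # 5.57
```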

Eye Tracking Glasses
Simulator

11 versions available