Publication Hub Archive

Ergoneers VTK (Vehicle Testing Kit)


Total results: 145

Searching for street parking: effects on driver vehicle control, workload, physiology, and glances

Year: 2020

Authors: CT Ponnambalam, B Donmez

Urban areas that allow street parking exhibit a heightened crash risk that is often attributed to factors such as reduced road width, decreased visibility, and interruptions to traffic flow. No previous on-road studies have investigated how the demands of searching for parking affect driving performance, physiology, and visual attention allocation. We are interested in these effects on the driver and their possible influence on the safety of the environment. While simulator studies offer several benefits, the physical, mental and social pressures incurred by searching for parking in an urban streetscape cannot be emulated in a simulator. We conducted an on-road instrumented vehicle study with 28 participants driving in downtown Toronto, Canada to explore the effect of searching for street parking on drivers. During the experiment, participants drove two routes in a counterbalanced order: one route with a parking search task, and the other route as a baseline. Speed and lane position were measured via vehicle instrumentation, heart rate and galvanic skin response were measured through physiological sensors, and gaze position was collected through a head-mounted eye-tracker. Participants completed the NASA Task Load Index after each route. It was found that while searching for parking, participants drove slower and closer to the curb, and perceived higher workload. While there were no statistically significant effects in physiological measures, there was a rise in heart rate approaching statistical significance. A detailed analysis of eye-tracking data revealed a clear change in glance behavior while searching for parking, with an increase in long off-road glances (>2 s) and decrease in shorter off-road glances (<1.6 s). Some exhibited behaviors (e.g., slowing down) may be seen to compensate for the potentially negative effects of increased demands associated with parking search, while others (e.g., increase in long off-road glances) have the potential to increase crash risk. This study acts as an important first step in revealing changes in driving performance, physiology and glance behavior brought on by searching for parking in a real-world urban environment.
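
For readers working with the exported gaze data, the glance categories reported above (long off-road glances >2 s versus shorter off-road glances <1.6 s) amount to a simple duration binning. The sketch below illustrates that binning only; the CSV layout and the "duration_s" column name are assumptions for illustration, not the authors' actual analysis pipeline.

```python
# Illustrative sketch: bin off-road glance durations as described in the
# abstract above. Assumes a hypothetical CSV with one row per off-road
# glance and a "duration_s" column; NOT the authors' actual pipeline.
import csv

def bin_off_road_glances(path):
    counts = {"long (>2 s)": 0, "short (<1.6 s)": 0, "intermediate": 0}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            duration = float(row["duration_s"])  # glance duration in seconds
            if duration > 2.0:
                counts["long (>2 s)"] += 1
            elif duration < 1.6:
                counts["short (<1.6 s)"] += 1
            else:
                counts["intermediate"] += 1
    return counts

# Example usage: bin_off_road_glances("off_road_glances.csv")
```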

Eye Tracking Glasses
Software

9 versions available

Take-Over Time Modeling and Prediction for Conditional Driving Automation

Year: 2020

Authors: S Hwang

Conditional driving automation represents a pivotal milestone in the journey towards fully autonomous systems. At this intermediate level of automation, human drivers are periodically required to take over control from the automated system when specific conditions or scenarios are encountered. One of the key challenges in ensuring safety and effectiveness in such systems is understanding and predicting the human driver's take-over time (TOT), the time it takes for a driver to respond to a take-over request. This dissertation focuses on modeling and predicting TOT by examining various factors that influence human performance during take-over events. By leveraging data from driving simulators and on-road experiments, the research delves into the effects of driver awareness, driving environment complexity, and the nature of the take-over request on TOT. The findings provide crucial insights for designing better human-machine interfaces and optimizing the transition process in conditional driving automation.

Eye Tracking Glasses
Simulator

2 versions available

The effect of visual HMIs of a system assisting manual drivers in manoeuvre coordination in system limit and system failure situations

Year: 2020

Authors: AK Kraft, C Maag, MI Cruz, M Baumann

Ambiguous situations in traffic often require communication and cooperation between road users. In order to resolve these situations and increase cooperative driving behavior in situations of merging or turning left, manual drivers could be assisted by an advanced driver assistance system (ADAS) for cooperative driving. This simulator study investigated the behavior of drivers confronted with system limits and failures of such a system. The ADAS used in this study informed the driver about an upcoming cooperation situation and gave advice on how to behave (e.g. reduce speed, change lane). Two test situations were implemented: a system freeze and an unexpected event, which could not be detected by the system. In order to find the most fitting HMI solution, the place of presentation (head-up display (HUD) vs. instrument cluster) as well as the form of presentation (dynamic vs. symbolic) were varied. The results indicated that the most fitting HMI solution to support the driver in a complex coordinated driving situation is a dynamic HUD, mainly due to the positive effect on glance behavior. However, advantages of both forms of presentation were revealed, as each form of presentation increased the probability of recognition for one of the test situations. The fewest collisions took place with the dynamic form of presentation.

Simulator
Software

6 versions available

The effects of a predictive HMI and different transition frequencies on acceptance, workload, usability, and gaze behavior during urban automated driving

Year: 2020

Authors: T Hecht, S Kratzert, K Bengler

Automated driving research as a key topic in the automotive industry is currently undergoing change. Research is shifting from unexpected and time-critical take-over situations to human machine interface (HMI) design for predictable transitions. Furthermore, new applications like automated city driving are getting more attention and the ability to engage in non-driving related activities (NDRA) starting from SAE Level 3 automation poses new questions to HMI design. Moreover, future introduction scenarios and automated capabilities are still unclear. Thus, we designed, executed, and assessed a driving simulator study focusing on the effect of different transition frequencies and a predictive HMI while freely engaging in naturalistic NDRA. In the study with 33 participants, we found transition frequency to have effects on workload and acceptance, as well as a small impact on the usability evaluation of the system. Trust, however, was not affected. The predictive HMI was used and accepted, as can be seen by eye-tracking data and the post-study questionnaire, but could not mitigate the above-mentioned negative effects induced by transition frequency. Most attractive activities were window gazing, chatting, phone use, and reading magazines. Descriptively, window gazing and chatting gained attractiveness when interrupted more often, while reading magazines and playing games were negatively affected by transition rate.

Eye Tracking Glasses
Simulator

7 versions available

The impact of auditory continual feedback on take-overs in Level 3 automated vehicles

Year: 2020

Authors: G Cohen

Objective: To implement auditory continual feedback into the interface design of a Level 3 automated vehicle and to test whether gaze behavior and reaction times of drivers improved in take-over situations. Background: When required to assume manual control in take-over situations, drivers of Level 3 automated vehicles are less likely than conventional drivers to spot potential hazards, and their reaction time is longer. Therefore, it is crucial that the interface of Level 3 automated vehicles be designed to improve drivers’ performance in take-over situations. Method: In two experiments, participants drove a simulated route in a Level 3 automated vehicle for 35 min with one imminent take-over event. Participants’ gaze behavior and performance in an imminent take-over event were monitored under one of three auditory interface designs: (1) Continual feedback. A system that provides verbal driving-related feedback; (2) Persistent feedback. A system that provides verbal driving-related feedback and a persistent beep; and (3) Chatter feedback. A system that provides verbal non-driving-related feedback. Also, there was a control group without feedback. Results: Under all three auditory feedback designs, the number of drivers’ on-road glances increased compared to no feedback, but none of the designs shortened reaction time to the imminent event. Conclusion: Increasing the number of on-road glances during automated driving does not necessarily improve drivers’ attention to the road and their reaction times during take-overs. Application: Possible implications for the effectiveness of auditory continual feedback should be considered when designing interfaces for Level 3 automated vehicles.

Eye Tracking Glasses
Simulator

7 versions available

Understanding and Supporting Anticipatory Driving in Automated Vehicles

Year: 2020

Authors: D He

Dissertation, University of Toronto (Canada), 2020. Abstract: As automated vehicles (AVs) are increasingly becoming a reality on our roads, understanding the interaction between human drivers and these vehicles is critical. Anticipatory driving refers to the human driver's ability to predict and react to road events before they occur, a skill that enhances safety and efficiency. This dissertation explores methods to support anticipatory driving behaviors in AVs through improved human-vehicle interaction. The research identifies key anticipatory behaviors, develops support systems for these behaviors, and evaluates their effectiveness. Findings suggest that enhancing AV interfaces and feedback mechanisms can significantly improve human-vehicle collaboration and overall driving performance.

Simulator
Software

4 versions available

User interface for in-vehicle systems with on-wheel finger spreading gestures and head-up displays

Year: 2020

Authors: SH Lee, SO Yoon

Interacting with an in-vehicle system through a central console is known to induce visual and biomechanical distractions, thereby delaying the danger recognition and response times of the driver and significantly increasing the risk of an accident. To address this problem, various hand gestures have been developed. Although such gestures can reduce visual demand, they are limited in number, lack passive feedback, and can be vague and imprecise, difficult to understand and remember, and culture-bound. To overcome these limitations, we developed a novel on-wheel finger spreading gestural interface combined with a head-up display (HUD) allowing the user to choose a menu displayed in the HUD with a gesture. This interface displays the audio and air conditioning functions of the central console on a HUD and enables their control using a specific number of fingers while keeping both hands on the steering wheel. We compared the effectiveness of the newly proposed hybrid interface against a traditional tactile interface for a central console using objective measurements and subjective evaluations regarding both the vehicle and driver behaviour. A total of 32 subjects were recruited to conduct experiments on a driving simulator equipped with the proposed interface under various scenarios. The results showed that the proposed interface was approximately 20% faster in emergency response than the traditional interface, whereas its performance in maintaining vehicle speed and lane was not significantly different from that of the traditional one.

Eye Tracking Glasses
Simulator

7 versions available

A user experience‐based toolset for automotive human‐machine interface technology development

Year: 2019

Authors: MJ Pitts

The development of new automotive Human-Machine Interface (HMI) technologies must consider the competing and often conflicting demands of commercial value, User Experience (UX) and safety. Technology innovation offers manufacturers the opportunity to gain commercial advantage in a competitive and crowded marketplace, leading to an increase in the features and functionality available to the driver. User response to technology influences the perception of the brand as a whole, so it is important that in-vehicle systems provide a high-quality user experience. However, introducing new technologies into the car can also increase accident risk. The demands of usability and UX must therefore be balanced against the requirement for driver safety. Adopting a technology-focused business strategy carries a degree of risk, as most innovations fail before they reach the market. Obtaining clear and relevant information on the UX and safety of new technologies early in their development can help to inform and support robust product development (PD) decision making, improving product outcomes. In order to achieve this, manufacturers need processes and tools to evaluate new technologies, providing customer-focused data to drive development. This work details the development of an Evaluation Toolset for automotive HMI technologies encompassing safety-related functional metrics and UX measures. The Toolset consists of four elements: an evaluation protocol, based on methods identified from the Human Factors, UX and Sensory Science literature; a fixed-base driving simulator providing a context-rich, configurable evaluation environment, supporting both hardware- and software-based technologies; a standardised simulation scenario providing a repeatable basis for technology evaluations, allowing comparisons across multiple technologies and studies; and a technology scorecard that collates and presents evaluation data to support PD decision making processes. The Evaluation Toolset was applied in three technology evaluation case studies, conducted in conjunction with the industrial partner, Jaguar Land Rover. All three were live technology development projects, representing hardware and software concepts with different technology readiness levels. Case study 1 evaluated a software-based voice messaging system with reference to industry guidelines, confirming its performance and identifying potential UX improvements. Case study 2 compared three touchscreen technologies, identifying user preference and highlighting specific usability issues that would not have been found through analytical means. Case study 3 evaluated autostereoscopic 3D displays, assessing the effectiveness of 3D information while highlighting design considerations for long-term use and defining a design space for 3D content. Findings from the case studies, along with learning from visits to research facilities in the UK and USA, were used to validate and improve the toolset, with recommendations made for implementation into the PD workflow. The driving simulator received significant upgrades as part of a new interdisciplinary research collaboration with the Department of Psychology to support future research activity.
Findings from the case studies also directly supported their respective technology development projects; the main outcome from the research therefore is an Evaluation Toolset that has been demonstrated, through application to live technology development projects, to provide valid, relevant and detailed information on holistic evaluations of early-phase HMI technologies, thus adding value to the PD process.

Simulator
Software

2 versions available

Design and development of a state-of-the-art Universal Laboratory for Virtual Reality, Real-time Simulation and Human-Machine Interaction

Year: 2019

Authors: V Vasilev

LUT University, LUT School of Energy Systems, LUT Mechanical Engineering. Master’s thesis, 2019. 100 pages, 43 figures, 12 tables and 3 appendices. Examiners: Professor Heikki Handroos and D.Sc. (Tech.) Hamid Roozbahani. Keywords: Laboratory of Intelligent Machines, Simulation laboratory, Visualization, Projection systems, Display systems, Virtual Environment. This Master’s thesis focuses on the design and development of the LUT Laboratory of Intelligent Machines as a cutting-edge simulation laboratory covering HMI, VR and real-time research areas. Composing the concepts for the future laboratory and studying the possible options for the visualization system were the priority tasks. The developed visualization system was presented in two options: a display-based system and a front-projection system. Information about the equipment, its requirements and compatibility was gathered from open sources, project meetings and consulting. The design of the Display Visualization Platform and the Projection System was based on average human eye properties, industry guidelines and standards. As a result of the project, a technical comparison of the studied equipment was made and procurement documents were prepared. Two feasible concepts for the Visualization Platform are suggested in this thesis. The obtained results can be applied in further business and academic activities of the Laboratory and the University in general.

Eye Tracking Glasses
Simulator

1 version available

Detection Response Task Evaluation for Driver Distraction Measurement for Auditory-Vocal Tasks: Experiment 2

Year: 2019

Authors: TA Ranney, GH Baldwin, IA Skuce, L Smith

This research evaluated the Detection Response Task (DRT) as a measure of the attentional demands of auditory-vocal in-vehicle tasks. DRT is an ISO standardized method that requires participants to respond to simple targets that occur every 3-5 s during in-vehicle task performance. DRT variants use different targets: Remote DRT (RDRT) uses visual targets; Tactile DRT (TDRT) uses vibrating targets. A single experiment evaluated the sensitivity of the two DRT variants in two test venues (driving simulator and non-driving) using auditory-vocal tasks. Participant selection criteria from the Visual-Manual NHTSA Driver Distraction Guidelines were used to recruit 192 participants; 48 were assigned to each combination of DRT variant and test venue. Identical production vehicles were used in each venue. In the simulator, participants wore a head-mounted eye tracker and performed in-vehicle tasks while driving in a car-following scenario. In the non-driving venue, occlusion testing required participants to perform the four discrete tasks while wearing occlusion goggles, which restricted viewing intermittently to simulate driving task demands. In-vehicle tasks for both venues included three discrete auditory-vocal tasks (destination entry, phone dialing, radio tuning), one discrete visual-manual task (radio tuning), and two continuous auditory-vocal digit-recall tasks representing acceptable (1-back) and unacceptable (2-back) levels of attentional load. Testing in each venue included a second part: all participants’ last procedural step involved brake response time (BRT) testing in the simulator, which required participants to brake in response to both expected and unexpected lead-vehicle (LV) braking events while performing selected in-vehicle tasks. Differences observed between test venues suggest that some in-vehicle tasks are more demanding when performed intermittently in the driving simulator than when performed continuously in the non-driving venue, thus pointing to the driving simulator as the better test venue. BRT results provided some support for a connection between DRT RT and BRT; however, the experiment did not provide sufficient control of speed and headway to allow a stronger comparison. DRT results support the conclusion that the 2-back condition represents too much attentional demand and that acceptable tasks should have a lower level of attentional demand. Differences between TSOT (total shutter open time) and TEORT (total eyes-off-road time) indicated that occlusion is not suitable for assessing auditory-vocal tasks; however, TEORT and other glance-based metrics appear suitable for use with auditory-vocal tasks. BRT testing revealed a small effect of attentional load for unexpected LV braking events but not for expected LV braking events. Mean heart rate was sensitive to differences in attentional load.
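
As context for the DRT metrics discussed above, per-stimulus response time and hit rate are typically derived by matching each stimulus onset to the first valid response that follows it. The sketch below is a minimal illustration under assumed conventions (timestamp lists in seconds and an assumed 0.1-2.5 s response validity window); it is not the scoring code used in this report.

```python
# Illustrative DRT scoring sketch (assumed conventions, not this report's code).
# Each stimulus onset is matched to the first response inside a validity
# window; stimuli without a valid response count as misses.
from statistics import mean

def score_drt(stimulus_onsets, response_times, window=(0.1, 2.5)):
    """Timestamps in seconds; returns hit rate and mean response time."""
    responses = sorted(response_times)
    reaction_times, hits = [], 0
    for onset in sorted(stimulus_onsets):
        valid = [r - onset for r in responses if window[0] <= r - onset <= window[1]]
        if valid:
            hits += 1
            reaction_times.append(min(valid))  # first valid response
    return {
        "hit_rate": hits / len(stimulus_onsets) if stimulus_onsets else 0.0,
        "mean_rt_s": mean(reaction_times) if reaction_times else None,
    }

# Example usage: score_drt([10.0, 14.2, 18.9], [10.45, 19.3])
```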

Eye Tracking Glasses
Simulator

2 versions available