Publication Hub Archive

CAN Bus


Total results: 87

Utilization of viewing aids for safe operations with excavators

Year: 2016

Authors: M Koppenborg, M Huelke, P Nickel, A Lungfiel

Camera monitor systems (CMS) and mirrors are intended to support excavator operators’ understanding of their surroundings and help prevent accidents. However, little is known about the visual information acquisition of operators of large construction machinery, especially during machine movements. In this field study, the utilization of viewing aids and other information sources during rotating movements of excavators was investigated by means of eye-tracking and task observation. Results show that, while CMS monitors and left mirrors were used for many rotating movements, other information sources around the machine were also attended to, such as the right frontolateral area and the area around the attachment. The article discusses implications for safety and machinery design, such as the positioning of viewing aids.

Eye Tracking Glasses
Simulator

6 versions available

A surrogate test for cognitive demand: tactile detection response task (TDRT)

Year: 2015

Authors: L Hsieh, S Seaman, R Young

As advanced electronic technology continues to be integrated into in-vehicle and portable devices, it is important to understand how drivers handle multitasking in order to maintain safe driving while reducing driver distraction. NHTSA has made driver distraction mitigation a major initiative. Currently, several types of Detection Response Tasks (DRTs) for assessing selective attention by detecting and responding to visual or tactile events while driving are under development by an ISO WG8 DRT group. Among these DRTs, the tactile version (TDRT) is considered a sensitive surrogate measure for driver attention without visual-manual interference in driving, according to the ISO DRT Draft Standard. In our previous study of cognitive demand, our results showed that the TDRT is the only surrogate DRT task with an acute sensitivity to a cognitive demand increase in an auditory-vocal task (i.e., the n-Back verbal working memory task), while also showing specificity: it responded only to increased cognitive demand, not to the increased physical demand of a visual-manual task (i.e., the Surrogate Reference Task, or SuRT). Similar findings in both simulated and on-road driving confirmed that the TDRT is a sensitive, specific and reliable surrogate test for measuring the effects of secondary tasks on driver attention. The current paper further investigated eye glance patterns and subjective ratings, and their relationship with DRT metrics, allowing a more comprehensive understanding of the attentional effect of secondary tasks on driver performance.

Eye Tracking Glasses
Simulator

6 versions available

Adding depth: establishing 3D display fundamentals for automotive applications

Year: 2015

Authors: MJ Pitts, E Hasedžić, L Skrypchuk, A Attridge

The advent of 3D displays offers Human-Machine Interface (HMI) designers and engineers new opportunities to shape the user's experience of information within the vehicle. However, the application of 3D displays to the in-vehicle environment introduces a number of new parameters that must be carefully considered in order to optimise the user experience. In addition, there is potential for 3D displays to increase driver inattention, either through diverting the driver's attention away from the road or by increasing the time taken to assimilate information. Manufacturers must therefore take great care in establishing the ‘dos’ and ‘don’ts’ of 3D interface design for the automotive context, providing a sound basis upon which HMI designers can innovate. This paper describes the approach and findings of a three-part investigation into the use of 3D displays in the instrument cluster of a road car, the overall aim of which was to define the boundaries of the 3D HMI design space. A total of 73 participants were engaged over three studies. Findings indicate that users can identify depth more quickly and accurately when information is rendered in 3D, indicating potential for future applications using the depth dimension to relay information. Image quality was found to degrade with increasing parallax, and indications of a fatigue effect with continued exposure were found. Finally, a relationship between minimum 3D offset, parallax position and object type was identified.

Eye Tracking Glasses
Software

3 versions available

App analytics: predicting the distraction potential of in-vehicle device applications

Year: 2015

Authors: M Krause, AS Conti, M Henning, C Seubert

Three experiments were conducted to check the feasibility of predicting the experimental outcomes of driver distraction studies. The predictions are based on subtask analysis and synthesis. In the first experiment, data (e.g., Total Glance Time, Single Glance Durations and Total Shutter Open Times) were gathered while subjects interacted with touch screen applications. In a second experiment, additional data were gathered on rotary knob interactions. These data were used to synthesize and predict the outcomes of a third (evaluation) experiment, which involved rotary knob and touch screen tasks. The results are promising and can help provide a better understanding of problematic subtasks and reduce testing of clearly unsuitable applications. The transfer of the procedure to other laboratories is challenging, as the modeling and mapping process includes many subjective decisions.

Eye Tracking Glasses
Simulator

3 versions available

Driver-activity recognition in the context of conditionally autonomous driving

Year: 2015

Authors: C Braunagel, E Kasneci, W Stolzmann

This paper presents a novel approach to automated recognition of the driver's activity, which is a crucial factor for determining take-over readiness in conditionally autonomous driving scenarios. To this end, an architecture based on head- and eye-tracking data is introduced in this study and several features are analyzed. The proposed approach is evaluated on data recorded during a driving simulator study with 73 subjects performing different secondary tasks while driving in an autonomous setting. The proposed architecture shows promising results towards in-vehicle driver-activity recognition. Furthermore, a significant improvement in classification performance is demonstrated through the consideration of novel features derived especially for the autonomous driving context.

Eye Tracking Glasses
Simulator

6 versions available

Ergonomic design of the gauge cluster display for commercial trucks

Year: 2015

Authors: T Kim, J Park, J Choe, ES Jung

Objective: The purpose of this study is to determine the priority of information presentation and the most effective menu type to be placed in the center of a gauge cluster display for commercial trucks, and to present a set of ergonomic designs for the gauge cluster display. Background: An effective ergonomic design is specifically needed for the development of the gauge cluster display for commercial trucks, because commercial truck drivers receive more diverse and denser information than passenger car drivers. Method: First, all the information that must be shown on the commercial truck display was collected. Then, the severity, frequency of use, and display design parameters for this information were evaluated by commercial truck drivers. Next, an analysis of the information attributes and a heuristic evaluation based on display design principles were carried out. According to the results, a design alternative for the main screen was constructed by priority. A comparative analysis between the alternative and existing main screens was also conducted to assess the efficacy of the designs. Lastly, we conducted an experiment for the selection of the menu type. The experiment was conducted using a driving simulator with an eye-tracking device. The independent variable was the menu type, with four types reflecting commercial truck characteristics: grid, icon, list, and flow. We measured preference, total execution time, total duration of fixations on the gauge cluster area, and total number of fixations on the gauge cluster area as dependent variables. Results: Four types of driver convenience information and six types of driver assistance information were selected as the information to be placed primarily on the main screen of the gauge cluster. The grid type was the most effective among the menu types.
Conclusion: In this study, the information that appears on the main screen of the display, the division of the display, and the design of the menu type for commercial truck drivers were suggested. Application: This study is expected to serve as a guideline for the ergonomic design of gauge cluster displays for commercial trucks.

Eye Tracking Glasses
Simulator

4 versions available

Exploiting the potential of eye movements analysis in the driving context

Year: 2015

Authors: C Braunagel, W Stolzmann, E Kasneci

Driving is a complex and highly visual task. With the development of high-end eye-tracking devices, numerous studies over the last two decades have investigated the eye movements of drivers to identify deficits in visual search patterns and to derive assistive, informative, and entertainment systems. However, little is known about visual behavior during autonomous driving, where the driver can be involved in other tasks but still has to remain attentive in order to be able to resume control of the vehicle. This work aims at exploiting the potential of eye movement analysis in the autonomous driving context. In a pilot study, we investigated whether the type of secondary task in which the driver is involved can be recognized solely from the driver's eye movement parameters. Furthermore, we discuss several applications of eye movement analysis to future autonomous driving approaches, e.g., to automatically detect whether the driver is being attentive and, when required, to guide her visual attention towards the driving task.

Eye Tracking Glasses
Software

3 versions available

Eye glance analysis of the surrogate tests for driver distraction

Year: 2015

Authors: L Hsieh, S Seaman, RA Young

The purpose of this study was to examine the eye glance patterns of Detection Response Tasks (DRTs) for the assessment of driver distraction during simulated driving. Several types of DRTs across visual, tactile and haptic modalities were used to investigate driver distraction by the ISO Driving Distraction working group. As part of the working group, we conducted a simulated driving study examining driver performance while engaging in the primary driving task with visual-manual or auditory-verbal secondary tasks. Results of the eye glance analysis showed that the visual DRTs increased visual load in driving more than the tactile DRT. Relative to the tactile DRT, the visual DRTs marginally increased total glance time to the forward view by 6.27 s and significantly increased detection response time by 135.79 ms. As for the secondary tasks, the visual-manual secondary task yielded significantly longer total eyes-off-the-road time (effect size = 50.75 ms), as well as longer DRT response times (effect size = 55.85 ms), than the auditory-verbal tasks. This study allowed us to examine the relationships between rated situational awareness, DRT performance, and glance patterns, yielding insights into the relationship between objective task performance measures and subjective ratings.

Eye Tracking Glasses
Simulator

9 versions available

Eye-tracking technology in vehicles: application and design

Year: 2015

Authors: V Selimis

This work analyses eye-tracking technology and, as an outcome, presents an idea for implementing it, along with other kinds of technology, in vehicles. The main advantage of such an implementation would be to augment safety while driving. The setup and the methodology used for detecting human activity and interaction by means of eye-tracking technology are investigated. Research in this area is growing rapidly and its results are used in a variety of cases. The main reasons for that growth are the steadily falling prices of the necessary special equipment, the portability available in some cases, and the ease of use that makes the technology more user-friendly than it was a few years ago. The whole idea of eye-tracking is to track the movements of the eyes in an effort to determine the direction of the gaze, using sophisticated software and purpose-built hardware. This manuscript makes a brief introduction to the history of eye monitoring, presenting the very early scientific approaches used to better understand the movements of the human eye while tracking an object or during an activity. Next, there is an overview of the theory and methodology used to track a specific object, including a short presentation of the image processing and machine learning procedures used to accomplish such tasks. Thereafter, we further analyze the specific eye-tracking technologies and techniques in use today and the characteristics that affect the choice of eye-tracking equipment; an appropriate choice must take into account the area of research interest in which the equipment will be used. In addition, the main categories of eye-tracking applications are presented and we shortlist the latest state-of-the-art commercial eye-tracking systems.
Following this, we present our first approach, describing an eye-tracking device that could be used in vehicles to offer much better safety standards by controlling various parameters, continuously checking the readiness of the driver, and alerting him to potential imminent collision incidents. Finally, we describe how such a device, in our case an eye-tracker, can be connected to an automobile’s system.

Eye Tracking Glasses
Software

3 versions available

Prediction of take-over time in highly automated driving by two psychometric tests

Year: 2015

Authors: M Körber, T Weißgerber, L Kalb, C Blaschke, M Farid

In this study, we investigated whether the driver's ability to take over vehicle control while engaged in a secondary task (Surrogate Reference Task) can be predicted by a subject's multitasking ability and reaction time. 23 participants performed a multitasking test and a simple response task, then drove highly automated on a highway for about 38 min and encountered five take-over situations. Data analysis revealed significant correlations between multitasking performance and take-over time, as well as gaze distributions, for Situations 1 and 2, even when reaction time was controlled for. This correlation diminished beginning with Situation 3, but a stable difference between the worst and the best multitaskers persisted. Reaction time was not a significant predictor in any situation. The results can be seen as evidence for stable individual differences in dual-task situations regarding automated driving, but they also highlight effects associated with the experience of a take-over situation.

Eye Tracking Glasses
Simulator

16 versions available