Publication Hub Archive

UX Analysis

You have reached the Ergoneers Publication Hub for:

Field of Application > UX Analysis


Total results: 588

The benefits and user experience of hearing aids controlled by eye gaze

Year: 2020

Authors: LJ Nellemann

A problem with hearing aids is that they do not always reduce noise correctly, which can make it hard to follow a conversation in noisy surroundings. This master's thesis investigates whether eye gaze steering of a hearing aid can benefit the user by testing two forms of eye gaze steering at a concept level: hard and soft. An experiment was conducted with 13 hearing aid users who answered questions while following a conversation under four conditions: a familiarization round and a no, hard, and soft eye gaze steering condition. After every condition, participants answered a NASA Task Load Index (TLX) questionnaire to measure workload, and after the experiment an exit interview was conducted. The percentage of correct answers was highest in the familiarization round, followed by hard eye gaze steering, then soft eye gaze steering, with no steering yielding the fewest correct answers. A significant difference was found between all conditions except hard and soft eye gaze steering. For workload, the only difference was between the familiarization round and the three other conditions, with the familiarization round scoring lowest. Eye gaze steering of a hearing aid appears to work by reducing noise that participants do not want to listen to, and when asked about their preferences, participants preferred hard eye gaze steering.
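The thesis does not publish its scoring code; a minimal sketch of the standard weighted NASA-TLX computation (assuming the usual six subscales rated 0–100 and pairwise-comparison weights that sum to 15 — the names below are illustrative, not taken from the thesis) could look like:

```python
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def nasa_tlx_score(ratings, weights):
    """Weighted NASA-TLX workload score.

    ratings: dict mapping each subscale to its 0-100 rating.
    weights: dict mapping each subscale to its pairwise-comparison
             tally (0-5 each; the six tallies sum to 15).
    Returns the weighted mean on a 0-100 scale.
    """
    assert set(ratings) == set(SUBSCALES) and set(weights) == set(SUBSCALES)
    assert sum(weights.values()) == 15, "pairwise tallies must sum to 15"
    # Weighted average: each rating counts in proportion to how often
    # its subscale "won" the 15 pairwise comparisons.
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0
```

With equal ratings the score reduces to that common rating regardless of the weights, which is a quick sanity check on an implementation.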

1 version available

The effect of visual gaze location on block-start biomechanics

Year: 2020

Authors: M Lee, M Chan, O Mitsuo, D Boey

This study investigated the effects of varying visual gaze location (VGL), by means of externally focused instruction, during the block-start “set” phase, with the intention of optimizing block-start biomechanics for faster starts in an athlete-specific manner. Nine collegiate sprinters performed a series of block starts while directing their VGL to their personal baselines and to points 0.5 m, 1 m, 2 m and 3 m from the start line. Twelve infrared opto-reflective cameras and one force plate were used to assess trunk, hip, knee and centre-of-mass kinematics, and block push-phase kinetics. An eye tracker was used to determine participants’ VGL. Postural changes included a significant decrease in pelvic height in the “set” position and more upright trunk postures at toe-off from the blocks when participants gazed farther away, at 2 m and 3 m. Gazing at 1 m was effective in eliciting changes to pelvic horizontal velocity. These results suggest that manipulating VGL could help certain athletes optimize their block-start biomechanics for faster starts. Coaches can consider redirecting VGL, in addition to usual instructional methods, to improve athletes’ block-start performances.

4 versions available

The effect of visual HMIs of a system assisting manual drivers in manoeuvre coordination in system limit and system failure situations

Year: 2020

Authors: AK Kraft, C Maag, MI Cruz, M Baumann

Ambiguous situations in traffic often require communication and cooperation between road users. In order to resolve these situations and increase cooperative driving behavior in situations of merging or turning left, manual drivers could be assisted by an advanced driver assistance system (ADAS) for cooperative driving. This simulator study investigated the behavior of drivers confronted with system limits and failures of such a system. The ADAS used in this study informed the driver about an upcoming cooperation situation and gave advice on how to behave (e.g. reduce speed, change lane). Two test situations were implemented: a system freeze and an unexpected event, which could not be detected by the system. In order to find the most fitting HMI solution, the place of presentation (head-up display (HUD) vs. instrument cluster) as well as the form of presentation (dynamic vs. symbolic) were varied. The results indicated that the most fitting HMI solution to support the driver in a complex coordinated driving situation is a dynamic HUD, mainly due to the positive effect on glance behavior. However, advantages of both forms of presentation were revealed, as each form of presentation increased the probability of recognition for one of the test situations. The fewest collisions took place with the dynamic form of presentation.

6 versions available

The effects of a predictive HMI and different transition frequencies on acceptance, workload, usability, and gaze behavior during urban automated driving

Year: 2020

Authors: T Hecht, S Kratzert, K Bengler

Automated driving research as a key topic in the automotive industry is currently undergoing change. Research is shifting from unexpected and time-critical take-over situations to human machine interface (HMI) design for predictable transitions. Furthermore, new applications like automated city driving are getting more attention and the ability to engage in non-driving related activities (NDRA) starting from SAE Level 3 automation poses new questions to HMI design. Moreover, future introduction scenarios and automated capabilities are still unclear. Thus, we designed, executed, and assessed a driving simulator study focusing on the effect of different transition frequencies and a predictive HMI while freely engaging in naturalistic NDRA. In the study with 33 participants, we found transition frequency to have effects on workload and acceptance, as well as a small impact on the usability evaluation of the system. Trust, however, was not affected. The predictive HMI was used and accepted, as can be seen by eye-tracking data and the post-study questionnaire, but could not mitigate the above-mentioned negative effects induced by transition frequency. Most attractive activities were window gazing, chatting, phone use, and reading magazines. Descriptively, window gazing and chatting gained attractiveness when interrupted more often, while reading magazines and playing games were negatively affected by transition rate.

7 versions available

The impact of auditory continual feedback on take-overs in Level 3 automated vehicles

Year: 2020

Authors: G Cohen

Objective: To implement auditory continual feedback into the interface design of a Level 3 automated vehicle and to test whether gaze behavior and reaction times of drivers improved in take-over situations. Background: When required to assume manual control in take-over situations, drivers of Level 3 automated vehicles are less likely than conventional drivers to spot potential hazards, and their reaction time is longer. Therefore, it is crucial that the interface of Level 3 automated vehicles will be designed to improve drivers’ performance in take-over situations. Method: In two experiments, participants drove a simulated route in a Level 3 automated vehicle for 35 min with one imminent take-over event. Participants’ gaze behavior and performance in an imminent take-over event were monitored under one of three auditory interface designs: (1) Continual feedback. A system that provides verbal driving-related feedback; (2) Persistent feedback. A system that provides verbal driving-related feedback and a persistent beep; and (3) Chatter feedback. A system that provides verbal non-driving-related feedback. Also, there was a control group without feedback. Results: Under all three auditory feedback designs, the number of drivers' on-road glances increased compared to no feedback, but none of the designs shortened reaction time to the imminent event. Conclusion: Increasing the number of on-road glances during automated driving does not necessarily improve drivers’ attention to the road and their reaction times during take-overs. Application: Possible implications for the effectiveness of auditory continual feedback should be considered when designing interfaces for Level 3 automated vehicles.

7 versions available

The instructor assistant system (iASSYST) – utilizing eye tracking for commercial aviation training purposes

Year: 2020

Authors: D Rudi, P Kiefer, M Raubal

This work investigates the potential of providing commercial aviation flight instructors with an eye tracking enhanced observation system to support the training process. During training, instructors must deal with many parallel tasks, such as operating the flight simulator, acting as air traffic controllers, observing the pilots and taking notes. This can cause instructors to miss relevant information that is crucial for debriefing the pilots. To support instructors, the instructor ASsistant SYSTem (iASSYST) was developed. It includes video, audio, simulator and eye tracking recordings. iASSYST was evaluated in a study involving 7 instructors. The results show that with iASSYST, instructors were able to support their observations of errors, find new errors, determine that some previously identified errors were not errors, and reclassify the types of errors they had originally identified. Instructors agreed that eye tracking can help identify causes of pilot error. Practitioner summary: This paper introduces an instructor assistant system, which is evaluated in a user study involving 7 airline flight instructors. The system can be used by airline flight instructors to complement their observations, as a basis for discussions with pilots during debriefing, and by airline pilots to improve their flight performance.

10 versions available

Understanding and Supporting Anticipatory Driving in Automated Vehicles

Year: 2020

Authors: D He

Dissertation, University of Toronto (Canada), 2020. Abstract: As automated vehicles (AVs) are increasingly becoming a reality on our roads, understanding the interaction between human drivers and these vehicles is critical. Anticipatory driving refers to the human driver's ability to predict and react to road events before they occur, a skill that enhances safety and efficiency. This dissertation explores methods to support anticipatory driving behaviors in AVs through improved human-vehicle interaction. The research identifies key anticipatory behaviors, develops support systems for these behaviors, and evaluates their effectiveness. Findings suggest that enhancing AV interfaces and feedback mechanisms can significantly improve human-vehicle collaboration and overall driving performance.

4 versions available

Understanding the Cognitive and Psychological Impacts of Emerging Technologies on Driver Decision-Making Using Physiological Data

Year: 2020

Authors: S Agrawal

Emerging technologies, such as advanced driver-assistance systems (ADAS) and autonomous vehicles (AVs), are transforming the driving experience. These technologies can influence driver cognition and decision-making processes in various ways. This study aims to understand the cognitive and psychological impacts of these emerging technologies on driver decision-making by utilizing physiological data. Through the analysis of data such as heart rate variability, skin conductance, and eye-tracking metrics, the research investigates how drivers' mental and physical states are affected during interaction with ADAS and AVs. The findings aim to provide insights into improving the design and safety of these technologies, ultimately enhancing driver comfort and performance.

4 versions available

Understanding the role of visual attention on wines’ purchase intention: An eye-tracking study

Year: 2020

Authors: P Monteiro, J Guerreiro, SMC Loureiro

Purpose: Wine bottles compete for consumers’ attention on the shelf during the decisive moment of choice. This study aims to explore the role that visual attention to wine labels has in the purchase decision, and the mediating role of quality perceptions and desire in such purchase behaviors. Wine awards and consumption situation are used as moderators. Design/methodology/approach: The study was conducted in Portugal and 36 individuals participated in a 2 × 2 within-subjects design (awarded/not awarded × self-consumption/social consumption). For each scenario, individuals’ attention, perceptions of quality, desire and purchase intentions were recorded. Findings: Eye-tracking data show that, during the purchase process, the amount of attention given to a bottle is a determinant of individuals’ purchase intentions, a relationship that grows stronger for award-winning bottles and when consumers are buying wine for a social consumption situation. In addition, both quality perceptions and desire are confirmed to positively influence wines’ purchase intentions. Originality/value: By using an eye monitoring method, this paper brings new insights to the wine industry by highlighting the impact that wine labels and different consumption situations have on individuals’ attention and purchase intention. Wine producers and retailers may benefit from the insights provided by the current study to refine their communication strategies, either by highlighting product characteristics and pictorial elements, as is the case with awards, or by communicating about their products for different consumption situations.

6 versions available

User interface for in-vehicle systems with on-wheel finger spreading gestures and head-up displays

Year: 2020

Authors: SH Lee, SO Yoon

Interacting with an in-vehicle system through a central console is known to induce visual and biomechanical distractions, thereby delaying the danger recognition and response times of the driver and significantly increasing the risk of an accident. To address this problem, various hand gestures have been developed. Although such gestures can reduce visual demand, they are limited in number, lack passive feedback, and can be vague and imprecise, difficult to understand and remember, and culture-bound. To overcome these limitations, we developed a novel on-wheel finger spreading gestural interface combined with a head-up display (HUD), allowing the user to choose a menu item displayed in the HUD with a gesture. This interface presents the audio and air-conditioning functions of the central console on a HUD and enables their control using a specific number of fingers while keeping both hands on the steering wheel. We compared the effectiveness of the newly proposed hybrid interface against a traditional tactile interface for a central console using objective measurements and subjective evaluations of both vehicle and driver behaviour. A total of 32 subjects were recruited to conduct experiments on a driving simulator equipped with the proposed interface under various scenarios. The results showed that the proposed interface was approximately 20% faster in emergency response than the traditional interface, whereas its performance in maintaining vehicle speed and lane was not significantly different from that of the traditional one.

7 versions available