Publication Hub Archive

UX Analysis

Ergoneers Publication Hub, Field of Application: UX Analysis

Total results: 534

Driven to distraction? A review of speech technologies in the automobile

Year: 2017

Authors: R Young, J Zhang

Speech technologies hold the promise of improving driver performance on many visual-manual secondary tasks by enabling eyes-free and hands-free interactions. Unfortunately, speech interfaces have seen only incremental growth in the automotive industry since the early 2000s. Instead, mixed-mode interfaces (speech combined with visual displays) have become increasingly common, and visual-manual interfaces are still dominant. This paper provides a historical overview of speech driver interface studies, including formal testing on a 2014 Toyota Corolla production vehicle, and a new analytical evaluation of the Apple CarPlay interface in the 2016 Cadillac ATS. Results indicate that eyes-free and hands-free speech (i.e., “pure” speech) improves driver performance compared with mixed-mode interfaces, and that mixed-mode interfaces in turn improve driver performance compared with “pure” visual-manual interfaces, for the tasks tested. The visual component of the mixed-mode and visual-manual interfaces increases off-road glances, a safety decrement. We recommend that future in-vehicle speech interfaces limit visual displays so that they do not show information redundant with what the speech interface already provides. In general, we recommend pure speech driver-vehicle interfaces for secondary tasks wherever possible.

Simulator
Software

2 versions available

ERTMS pilot in the Netherlands: impact on the train driver

Year: 2017

Authors: R Van der Weide, D De Bruijn, M Zeilstra

In 2014, the Ministry of Transport decided to implement ERTMS (European Railway Traffic Management System) on the main corridors in the Netherlands. A pilot with ERTMS was performed between the cities of Amsterdam and Utrecht. This paper describes the effects of ERTMS on train drivers’ workload and human error by comparing driving under conventional (ATB) train protection, ERTMS with Dual Signalling, and ERTMS Level 2-only train protection. This was done using driving performance data, a simulator experiment, workshops and surveys.

Simulator
Software

1 version available

Evaluating distraction of in-vehicle information systems while driving by predicting total eyes-off-road times with keystroke level modeling

Year: 2017

Authors: C Purucker, F Naujoks, A Prill, A Neukum

Increasingly complex in-vehicle information systems (IVIS) have become available in the automotive vehicle interior. To ensure usability and safety of use while driving, the distraction potential of system-associated tasks is most often analyzed during the development process, either by employing empirical or analytical methods, with both families of methods offering certain advantages and disadvantages. The present paper introduces a method that combines the predictive precision of empirical methods with the economic advantages of analytical methods. Keystroke level modeling (KLM) was extended to a task-dependent modeling procedure for the total eyes-off-road time (TEORT) resulting from system use while driving, and was demonstrated in two consecutive simulator studies. The first study involved the operation of an IVIS by N = 18 participants. The results suggest a good model fit (adjusted R² = 0.67) for predicting TEORT, relying on regressors from KLM and participant age. Using the parameter estimates from study 1, the predictive validity of the model was successfully tested in a second study, in which N = 14 participants used a version of the IVIS prototype with a revised design and task structure (correlation between predicted and observed TEORT, r = 0.58). Possible applications and shortcomings of the approach are discussed.

Eye Tracking Glasses
Software

7 versions available
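
As a rough, hypothetical sketch of the keystroke-level-modeling idea described in this abstract: predict TEORT for an IVIS task from its KLM operator counts plus driver age via ordinary least squares. All operator counts, TEORT values and coefficients below are invented for illustration; they are not the values fitted in the paper.

```python
# Hypothetical KLM-style TEORT regression; data and coefficients are
# invented for illustration, not the paper's fitted values.
import numpy as np

# Columns: keystrokes (K), mental acts (M), homing moves (H), driver age.
X = np.array([
    [4, 2, 1, 25],
    [8, 3, 2, 31],
    [12, 5, 2, 47],
    [6, 2, 1, 62],
    [10, 4, 2, 29],
    [5, 3, 1, 55],
], dtype=float)
teort = np.array([3.1, 5.4, 8.9, 6.2, 6.8, 4.9])  # observed TEORT (s)

# Fit TEORT ~ b0 + b1*K + b2*M + b3*H + b4*age by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, teort, rcond=None)

def predict_teort(k, m, h, age):
    """Predict TEORT (s) for a task from its KLM counts and driver age."""
    return float(coef @ np.array([1.0, k, m, h, age]))

print(f"Predicted TEORT: {predict_teort(10, 4, 2, 40):.1f} s")
```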

Evaluating the interactive effects of responsible drinking messages and attentional bias on actual drinking behaviours

Year: 2017

Authors: D Frings, G Eskian, I Albery, T Moss

Responsible drinking messages (RDMs) are often used as a key tool to reduce alcohol-related harms. Posters are a common form of RDM, displayed in places such as bars, bus stops and toilet cubicles. However, some recent research suggests RDMs may not have the desired effect of reducing levels of consumption. It is not known how environmental factors (e.g. the number of alcohol-related cues in a given environment) or individual difference measures (such as prior drinking behaviour and beliefs, or attentional bias towards alcohol-related stimuli) influence interactions with RDMs, nor how these factors affect their efficacy. This research explored these issues by having participants view RDMs either in a bar laboratory (i.e. a 'fake bar' inside our research facility) or in a traditional psychology laboratory cubicle. The key findings are: (1) posters in general, and RDMs in particular, are poorly attended to in bar environments; (2) attentional biases towards alcohol influence the allocation of visual attention that is consciously controlled and effortful, but not visual attention that is automatic. Variations at the level of individual drinkers (such as prior drinking history or alcohol expectancies) were also associated with the direction of visual attention towards actual alcoholic drinks. This research has implications for the optimal placement of RDMs. It also highlights the sensitivity of such messages to changes in content. Theoretical implications include new questions around the relationship between attentional bias and other forms of attention, and the importance of cue saturation in understanding when attentional bias affects other cognitive and behavioural processes.

Eye Tracking Glasses
Software

3 versions available

Exploring Normative Eye Movement Patterns in Functional Tasks

Year: 2017

Authors: EB Lavoie

When interacting with an object, humans are quite effective at navigating their hand to the object, grasping it, and acting on it. The ease with which we do this masks the complex interplay of sensory modalities involved. This study utilizes a head-mounted eye-tracker and upper-limb motion-capture markers to reveal how one of these sensory modalities, vision, enables efficient object interaction. Participants completed several trials of two tasks mimicking real-world demands. The first task involved turning and grasping a pasta box from an original position outside the participant’s field of view and placing it onto two shelves before returning it to its starting location. The second task had participants move cups filled with beads four times over a partition. In both tasks, participants spend nearly the full duration of the trial fixating on objects relevant to the task, well in advance of their hand arriving at an object. Participants also spend little time fixating on their own hand when reaching towards an object, and only slightly more time, although still very little, fixating on the object in their hand when transporting it. Instead, during a grasp, participants make a saccade from the object to its drop-off location, and hold this fixation until the hand releases the object. Other sensory systems, likely proprioception and haptic feedback, allow participants to behave this way. When interacting with an object outside the field of view, slight changes in this behaviour occur. Specifically, participants are unable to fixate on the object as far in advance of their hand, move slightly more slowly, and increase their maximum grip aperture. A possible explanation for these behaviours is a predictable interplay between covert and overt attention, the dorsal and ventral streams of visual processing, and proprioceptive and haptic feedback that allows individuals to carry out object interactions in a smooth, cyclical manner, with the eyes leading the hand.

Eye Tracking Glasses
Software

2 versions available
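
A minimal, hypothetical illustration of the "eyes lead the hand" pattern reported above: given a fixation-onset time and a hand-arrival time per target, compute how long gaze preceded the hand. The event times and target names below are invented, not the study's data.

```python
# Synthetic event times (seconds); names and values are invented for
# illustration of the eye-leads-hand metric, not the study's data.
fixation_onsets = {"pasta_box": 1.20, "shelf_1": 3.05, "shelf_2": 5.40}
hand_arrivals = {"pasta_box": 1.95, "shelf_1": 3.80, "shelf_2": 6.10}

for target, t_fix in fixation_onsets.items():
    lead = hand_arrivals[target] - t_fix
    print(f"{target}: gaze led the hand by {lead:.2f} s")
```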

EyeRecToo: Open-source Software for Real-time Pervasive Head-mounted Eye Tracking

Year: 2017

Authors: T Santini, W Fuhl, D Geisler, E Kasneci

Head-mounted eye tracking offers remarkable opportunities for research and applications regarding pervasive health monitoring, mental-state inference, and human-computer interaction in dynamic scenarios. Although a plethora of software packages for the acquisition of eye-tracking data exist, they often exhibit critical issues when pervasive eye tracking is considered, e.g., closed source code, dependencies on costly eye-tracker hardware, and the need for a human supervisor during calibration. In this paper, we introduce EyeRecToo, an open-source software for real-time pervasive head-mounted eye tracking. Out of the box, EyeRecToo offers multiple real-time state-of-the-art pupil-detection and gaze-estimation methods, which can easily be replaced by user-implemented algorithms if desired. A novel calibration method that allows users to calibrate the system without the assistance of a human supervisor is also integrated. Moreover, the software supports multiple head-mounted eye-tracking devices, records eye and scene videos, and stores pupil and gaze information, which are also available as a real-time stream. Thus, EyeRecToo serves as a framework to quickly enable pervasive eye-tracking research and applications. Available at: www.ti.uni-tuebingen.de/perception.

Eye Tracking Glasses
Software

5 versions available
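
The abstract notes that pupil and gaze data are available as a real-time stream. As a sketch only, the snippet below consumes a hypothetical UDP stream of tab-separated gaze samples; the port number and the wire format are assumptions for illustration, not EyeRecToo's documented protocol.

```python
# Hypothetical gaze-stream consumer; the port and the tab-separated
# "timestamp\tx\ty" format are assumptions, not EyeRecToo's documented
# protocol.
import socket

HOST, PORT = "127.0.0.1", 2002  # assumed streaming endpoint

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
    sock.bind((HOST, PORT))
    while True:
        datagram, _ = sock.recvfrom(1024)
        timestamp, x, y = datagram.decode("ascii").split("\t")[:3]
        print(f"t={timestamp} gaze=({float(x):.1f}, {float(y):.1f})")
```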

Fast camera focus estimation for gaze-based focus control

Year: 2017

Authors: W Fuhl, T Santini, E Kasneci

Many cameras implement auto-focus functionality; however, they typically require the user to manually identify the location to be focused on. While such an approach works for temporally sparse auto-focusing (e.g., photo shooting), it presents severe usability problems when the focus must be quickly switched between multiple areas (and depths) of interest, e.g., in a gaze-based auto-focus approach. This work introduces a novel, real-time auto-focus approach based on eye tracking, which enables the user to shift the camera focus plane swiftly based solely on gaze information. Moreover, the proposed approach builds a graph representation of the image to estimate depth-plane surfaces and runs in real time (requiring ≈ 20 ms on a single i5 core), thus allowing the depth-map estimation to be performed dynamically. We evaluated our algorithm for gaze-based depth estimation against state-of-the-art approaches on eight new datasets with flat, skewed, and round surfaces, as well as on publicly available datasets.

Eye Tracking Glasses
Software

4 versions available
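
A minimal sketch of the gaze-to-focus step, assuming a depth map is already available: take a robust (median) depth in a small patch under the gaze point and hand it to the focus drive. The paper's graph-based depth-plane estimation itself is not reproduced here, and the placeholder data below is random.

```python
# Look up a robust depth under the gaze point from a precomputed depth
# map; the depth map here is random placeholder data.
import numpy as np

def focus_depth_at_gaze(depth_map, gaze_xy, window=5):
    """Median depth over a small patch centred on the gaze point."""
    x, y = gaze_xy
    h, w = depth_map.shape
    x0, x1 = max(0, x - window), min(w, x + window + 1)
    y0, y1 = max(0, y - window), min(h, y + window + 1)
    return float(np.median(depth_map[y0:y1, x0:x1]))

depth_map = np.random.uniform(0.5, 5.0, size=(480, 640))  # metres, fake
print(f"focus at {focus_depth_at_gaze(depth_map, (320, 240)):.2f} m")
```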

Gaze direction when driving after dark on main and residential roads: Where is the dominant location?

Year: 2017

Authors: J Winter, S Fotios, S Völker

CIE Joint Technical Committee JTC-1 has requested data regarding the size and shape of the distribution of drivers’ eye movements in order to characterize visual adaptation. This paper reports the eye movements of drivers along two routes in Berlin after dark, a main road and a residential street, captured using eye tracking. Viewing behaviour differed between the two types of road. On the main road, eye movements were clustered within a circle of approximately 10° diameter, centred at the horizon of the lane. On the residential street, eye movements were clustered slightly (3.8°) towards the near side and were best captured by either an ellipse with approximate axes of 10° vertical and 20° horizontal, centred on the lane ahead, or a 10° circle centred 3.8° towards the near side. These distributions reflect a driver’s tendency to look towards locations of anticipated hazards.

Simulator
Software

6 versions available
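
As a worked illustration of the reported cluster shapes: the check below tests which gaze samples fall inside an ellipse with full axes of 20° horizontal and 10° vertical (semi-axes 10° and 5°) centred on the lane ahead. Only the ellipse dimensions come from the abstract; the gaze samples are synthetic.

```python
# Fraction of (synthetic) gaze samples inside the reported ellipse.
import numpy as np

def inside_ellipse(h_deg, v_deg, centre=(0.0, 0.0), semi_axes=(10.0, 5.0)):
    """True where (h_deg, v_deg) lies inside the ellipse."""
    ch, cv = centre
    a, b = semi_axes  # 10 deg horizontal, 5 deg vertical semi-axes
    return ((h_deg - ch) / a) ** 2 + ((v_deg - cv) / b) ** 2 <= 1.0

rng = np.random.default_rng(0)
h = rng.normal(0.0, 5.0, 10_000)  # synthetic horizontal angles (deg)
v = rng.normal(0.0, 2.5, 10_000)  # synthetic vertical angles (deg)
print(f"{inside_ellipse(h, v).mean():.1%} of samples inside the ellipse")
```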

Inter-rater reliability of mobile eye-tracking when walking in Parkinson’s disease: contextual analysis

Year: 2017

Authors: S Stuart, D Hunt, J Nell, A Godfrey

• Tracking eye movements when walking allows inferences to be made about the underlying cognitive and visual processes that may influence gait, particularly in ageing and Parkinson’s disease (PD), where such processes are commonly impaired [1].
• Very few studies have investigated the context of eye movements, such as the task-relevance of fixation locations.
• This is largely because current analysis requires time-consuming manual frame-by-frame inspection of eye-tracker videos [2], which can be subjective.
• There is therefore potential for a lack of consistency between raters.

Aims:
1) Modify a previously developed objective eye-movement measurement algorithm [3] to provide still images of fixation locations.
2) Develop a classification method for manual fixation-location analysis of mobile eye-tracking data obtained when walking.
3) Assess the inter-rater reliability of the proposed classification method.

Eye Tracking Glasses
Software

2 versions available
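
A minimal sketch of the inter-rater reliability assessment named in aim 3, using Cohen's kappa for two raters assigning fixation locations to categories. The category labels and ratings below are invented, and the poster does not state which reliability statistic was actually used.

```python
# Cohen's kappa: chance-corrected agreement between two raters.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical labels."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Invented fixation-location labels from two hypothetical raters.
a = ["path", "object", "path", "person", "path", "object"]
b = ["path", "object", "person", "person", "path", "path"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```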

Menu styles of mobile devices and their influence on gaze behavior while walking

Year: 2017

Authors: J Conradi, B Nord, T Alexander

Mobile IT devices (smartphones, tablet PCs, etc.) are often used while performing other tasks in parallel, e.g. while walking. However, the mobile device and the environment often compete for the user’s attention. Binding too much attention to the mobile device reduces attention to the environment. Especially in risky environments such as road traffic, this can create substantial danger for the user and for third parties. Graphical user interfaces (GUIs) therefore have to be adapted to this situation. Yet lightweight mobile devices have small displays, and only a limited number of objects can be displayed at once. Content with multiple subunits has to be arranged, e.g. by forming subcategories, and hierarchically structured menus facilitate this. In our study, we compared the effect of different menu concepts on gaze behavior while walking. Menus containing 4–8 icons per level required the lowest number of gazes. In single interactions, the shortest visual distraction was found for the smallest number of objects on the screen.

Eye Tracking Glasses
Software

4 versions available