Publication Hub Archive

Eye Tracker

You have reached the Ergoneers Publication Hub for:

Used Tool > Eye Tracker

Total results: 582

The social construction of embodied experiences: two types of discoveries in the science centre

Year: 2021

Authors: W Kesselheim, C Brandenberger

Based on a large corpus of video and eye-tracking data and inspired by multimodal conversation analysis, this paper analyses how visitors discover natural phenomena during their use of hands-on exhibits in a science and technology centre (STC). In these discoveries, individual multisensorial experiences of natural phenomena are communicatively transferred from one visitor to another. This paper describes two contrasting sequential formats of joint discoveries in the STC. In the first and more frequent case, experiences are socially shared by focussing the co-visitors’ visual attention on one point in their interactional space, while in the second case perceptions are socially shared via reproduction sequences, i.e. by repeating the actions that have led to the discovery with exchanged roles. We will argue that in these reproduction sequences, sharing experiences can be understood via the concept of “intercorporeality” (Merleau-Ponty, Maurice. 2014 [1945]. Phenomenology of perception. London, New York: Routledge). Our paper contributes to the current debate on intercorporeality, as it empirically shows that it is analytically fruitful to extend the concept to situations without simultaneous perception.

Eye Tracking Glasses
Software

3 versions available

Urgent Cues While Driving: S3D Take-over Requests

Year: 2021

Authors: F Weidner

In SAE level 3 automated vehicles, the human driver still needs to be ready to take over control in case the vehicle encounters a situation outside its operational design domain. This chapter outlines a case study where smart stereoscopic 3D icons act as a take-over notification.

Simulator
Software

1 version available

Use of Pupil Area and Fixation Maps to Evaluate Visual Behavior of Drivers inside Tunnels at Different Luminance Levels—A Pilot Study

Year: 2021

Authors: L Qin, QL Cao, AS Leon, YN Weng, XH Shi

This study reports the results of a pilot study on the spatiotemporal characteristics of drivers’ visual behavior while driving at three different luminance levels in a tunnel. The study was carried out in a relatively long tunnel during the daytime. Six experienced drivers were recruited to participate in the driving experiment. Experimental data on pupil area and fixation point position (in the tunnel’s interior zone, 1566 m long) were collected by non-intrusive eye-tracking equipment at three luminance levels (2 cd/m², 2.5 cd/m², and 3 cd/m²). Fixation maps (color-coded maps presenting distributed data) were created from the fixation point position data to quantify changes in visual behavior. The results demonstrated that luminance level had a significant effect on pupil areas and fixation zones. Fixation area and average pupil area had a significant negative correlation with luminance level during the daytime. In addition, drivers concentrated more on the front road pavement, the top wall surface, and the cars’ control wheels. The results revealed that the pupil area had a linear relationship with the luminance level. The limitations of this research are pointed out, and future research directions are outlined.
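As a rough illustration of the analysis this abstract describes, the following Python sketch builds a fixation map as a 2D histogram of fixation positions and fits a line relating mean pupil area to luminance level. All variable names and numbers are invented placeholders, not the study’s data or code.

```python
import numpy as np
from scipy import stats

# Hypothetical fixation positions (pixels) and illustrative pupil-area values;
# none of this is the study's data.
rng = np.random.default_rng(0)
fix_x = rng.normal(960, 150, size=500)   # horizontal fixation positions
fix_y = rng.normal(540, 100, size=500)   # vertical fixation positions

# Fixation map: a 2D histogram of fixation positions over the scene,
# which can then be smoothed and colour-coded for visualisation.
heatmap, _, _ = np.histogram2d(fix_x, fix_y,
                               bins=(64, 36),
                               range=[[0, 1920], [0, 1080]])

# Relationship between luminance level and mean pupil area
# (made-up numbers, only to show the fitting step).
luminance = np.array([2.0, 2.5, 3.0])            # cd/m^2
mean_pupil_area = np.array([18.4, 16.9, 15.2])   # arbitrary units

slope, intercept, r, p, se = stats.linregress(luminance, mean_pupil_area)
print(f"pupil area ~ {slope:.2f} * luminance + {intercept:.2f} (r = {r:.2f})")
```

In practice the histogram would typically be smoothed, for example with a Gaussian kernel, before being colour-coded as a fixation map.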

Eye Tracking Glasses
Simulator

4 versions available

Utilization of Inertial Measurement Units for Determining the Sequential Chain of Baseball Strike Posture

Year: 2021

Authors: YJ Lee, PC Lin, LY Chen, YJ Chen, JN Liang

The purpose of this study was to employ inertial measurement units (IMUs) together with an eye-tracking device to investigate the different swing strategies of two levels of batters. The participants were 20 healthy males aged 20 to 30 years, ten professional and ten amateur batters. Eye gaze position; head, shoulder, trunk, and pelvis angular velocity; and ground reaction forces were recorded. The results showed that the professional batters rotated their segments more rhythmically and efficiently than the amateur group. Firstly, the professional group spent less time in the preparation stages. Secondly, the timing of the maximum angular velocity of each segment was more centralized in the swing cycle for the professional group. Thirdly, the amateur group showed significantly earlier gaze timing of the maximum angular velocity than the professional group. Moreover, the maximum angular velocity timing of the gaze was the earliest among the five segments, and significantly earlier (by at least 16.32% of cycle time) than the maximum angular velocity of the head, shoulder, trunk, and pelvis within the amateur group. The visual-motor coordination strategies differed between the two groups and could be successfully determined with wearable IMU instruments.
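To make the timing measure concrete, here is a minimal sketch of how the moment of peak angular velocity can be expressed as a percentage of the swing cycle. The sampling rate, cycle length, and velocity profiles are invented for illustration and do not come from the study.

```python
import numpy as np

def peak_timing_percent(angular_velocity, t, t_start, t_end):
    """Time of peak angular velocity expressed as a percentage of the
    swing cycle running from t_start to t_end (illustrative helper)."""
    mask = (t >= t_start) & (t <= t_end)
    seg_t = t[mask]
    seg_w = np.abs(angular_velocity[mask])
    t_peak = seg_t[np.argmax(seg_w)]
    return 100.0 * (t_peak - t_start) / (t_end - t_start)

# Hypothetical 200 Hz IMU traces for a single 0.6 s swing cycle.
fs = 200
t = np.arange(0.0, 0.6, 1.0 / fs)
pelvis = 400 * np.sin(np.pi * t / 0.6)            # deg/s, invented profile
trunk = 600 * np.sin(np.pi * (t - 0.05) / 0.6)    # peaks slightly later

for name, w in [("pelvis", pelvis), ("trunk", trunk)]:
    print(f"{name}: peak at {peak_timing_percent(w, t, 0.0, 0.6):.1f}% of cycle")
```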

Eye Tracking Glasses
Simulator

10 versions available

Validity of primary driving tasks in head-mounted display-based driving simulators

Year: 2021

Authors: B Hartfiel, R Stark

The development of new car interior concepts requires tools, particularly in the development phases before concept milestones, that enable subjective experiences and evaluations in static and driving situations. On the one hand, variant comparisons are necessary; on the other hand, the level of immersion should be high enough that participants can behave as they would in real cars. Virtual reality technologies, and head-mounted displays in particular, are generally well suited for such evaluations, the exception being state-of-the-art driving simulators. Therefore, a validation study was undertaken in which primary driving tasks in two HMD-based simulators were compared with test runs in a real car. The only difference between the simulators was the state of the motion base (enabled vs. disabled). In both simulators and in the test runs in the real car, four identical scenarios (straight, curves, overtaking and junction) were carried out. Since the focus when evaluating new car interior concepts is primarily on subjective ratings and gaze behaviour, gaze behaviour was also a priority in this study. In addition, driving dynamics parameters were measured. The results reveal that the participants show more valid behaviour in the dynamic condition than in the static simulator condition.

Simulator
Software

4 versions available

Visual enhancements for the driver’s information search on automotive head-up display

Year: 2021

Authors: J Park, Y Im

In the past, in-vehicle head-up displays (HUDs) were used to display simple information such as driving speed and the distance between cars. However, recent HUDs display more complex information such as advanced driver assistance information. This study aims to identify the effects of visual enhancements for HUDs on the driver’s performance and workload. Twenty participants conducted tracking tasks for information search while driving in an automotive simulator environment. The participants experienced three levels of visual enhancement (none, shaded reference bar, translucent reference bar) at each task difficulty (low, medium, high). The results showed that visual enhancements and task difficulty had a significant effect on tracking errors and subjective workload. These findings verify that the translucent reference bar significantly improved tracking performance. Furthermore, the visual enhancement cues on HUDs play an important role in visual search. This research provides practical guidelines for ensuring road safety by minimizing drivers’ cognitive workload. The results should therefore encourage interface designers to consider visual enhancements for HUDs from a user-centered perspective.
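The 3 × 3 within-subject design described above lends itself to a repeated-measures analysis; the sketch below shows one way such data could be examined with statsmodels’ AnovaRM. The factor levels follow the abstract, but the data, effect sizes, and column names are made up, and the paper may have used a different statistical procedure.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Long-format table: every participant sees each combination of visual
# enhancement and task difficulty once; tracking_error values are invented.
rng = np.random.default_rng(1)
enhancements = ["none", "shaded_bar", "translucent_bar"]
difficulties = ["low", "medium", "high"]
rows = []
for subject in range(1, 21):                       # 20 participants
    for e_i, enh in enumerate(enhancements):
        for d_i, diff in enumerate(difficulties):
            error = 5.0 - 1.0 * e_i + 1.5 * d_i + rng.normal(0, 0.5)
            rows.append((subject, enh, diff, error))
df = pd.DataFrame(rows, columns=["subject", "enhancement",
                                 "difficulty", "tracking_error"])

# Two-way repeated-measures ANOVA over the two within-subject factors.
result = AnovaRM(df, depvar="tracking_error", subject="subject",
                 within=["enhancement", "difficulty"]).fit()
print(result)
```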

Simulator
Software

1 version available

Visual-attribute-based emotion regulation of angry driving behaviors

Year: 2021

Authors: W Li, B Zhang, P Wang, C Sun, G Zeng

Driver anger has become a severe transportation problem resulting in significant injuries and fatalities. The rapid development of intelligent transportation systems has provided new opportunities for the dynamic regulation of angry driving. However, the regulation qualities of different visual attributes under different parameters are not yet clear. In this article, we investigate the anger regulation quality of different parameters for different visual attributes during driving (color: cold/warm; symbol: flat/simulated; expression: positive/negative). Twenty-one drivers each drove nine times (N = 189) on a simulated highway scene while data were recorded. The regulation quality of the drivers’ anger was analyzed from their subjective experience, behavior, and physiology. Results indicate that regulation driving with different visual attributes yielded better driver anger regulation quality than baseline driving. Additionally, for the color attribute, the regulation quality of the cold hue was better than that of the warm hue, and for the expression attribute, the positive expression was better than the negative expression. Our results provide preliminary insights into three specific visual attributes for regulating a driver’s anger and compare their corresponding regulation qualities under different parameter changes. Our research provides empirical evidence for choosing different parameters among the three specific visual attributes in the early stages of designing a visual human–machine interface for driver anger.

Eye Tracking Glasses
Software

2 versions available

VR reality of the relationship between augmented reality and virtual reality in the context of virtual reality

Year: 2021

Authors: Y Pan

With the development of virtual reality technology and its application in various fields, how to realize natural and efficient interaction between humans and virtual environments has long been an active research question. This paper analyses the relationship between augmented reality and virtual reality in the context of VR. Starting from research on virtual reality, it combines the features of virtual reality with the elements of visual presentation to study visual presentation in the sensory experience of virtual reality, and to identify the types of visual presentation and methods for quantifying them. The approach is supported by prior research and is more in line with people’s perceptual needs for visual presentation. This study expands the research on visual presentation in virtual reality, provides guidance for virtual reality design practitioners, and has practical application prospects.

Eye Tracking Glasses
Software

2 versions available

A comparison of workload demands imposed by different types of distracted walking tasks and its effect on gait

Year: 2020

Authors: H Zheng, Y Luo, B Hu

A growing body of research has found that distracted walking is a safety concern due to reduced situation awareness and the possibility of compromised gait performance. Distraction tasks, such as texting, browsing social media, and playing games, differ in terms of their physical and cognitive demands. However, few studies have examined whether there are differences in how physical and cognitive demands impact gait. The goal of this paper is to evaluate workload differences between four distraction tasks that represent common smartphone functions and may differ in terms of physical and cognitive demands: 1) No distraction, 2) Reading, 3) Tapping, 4) N-Back. We characterized the workload differences using three methods: a subjective workload assessment (NASA-TLX) and two physiological workload measures, pupil width and blink rate. Our results suggest that the chosen distraction tasks differ in their workload demands. While a preliminary analysis of descriptive gait parameters failed to find significant differences between the distraction conditions, further analysis of more complex gait measures may be required to understand the differences between physical and cognitive demands.
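As a hedged illustration of the two physiological measures mentioned above, the following sketch estimates blink rate and mean pupil width from a raw pupil trace, treating long runs of missing samples as blinks. The sampling rate, the dropout-based blink heuristic, and all values are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

def blink_rate_and_pupil_width(pupil_trace, fs, min_blink_samples=3):
    """Estimate blinks per minute and mean pupil width from a raw pupil
    trace, treating sufficiently long runs of missing samples (NaN) as
    blinks. Illustrative only; real pipelines detect blinks more carefully."""
    missing = np.isnan(pupil_trace)
    edges = np.diff(missing.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if missing[0]:
        starts = np.r_[0, starts]
    if missing[-1]:
        ends = np.r_[ends, missing.size]
    blink_count = int(np.sum((ends - starts) >= min_blink_samples))
    minutes = pupil_trace.size / fs / 60.0
    return blink_count / minutes, float(np.nanmean(pupil_trace))

# Hypothetical 60 Hz, one-minute trace with three dropouts standing in for blinks.
fs = 60
trace = np.full(60 * fs, 4.2)            # constant 4.2 mm pupil width
for start in (300, 1500, 2700):
    trace[start:start + 10] = np.nan
rate, mean_width = blink_rate_and_pupil_width(trace, fs)
print(f"{rate:.1f} blinks/min, mean pupil width {mean_width:.2f} mm")
```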

Eye Tracking Glasses
Software

1 version available

Analysis of Visual Search Characteristics Based on Drivers’ Hazard Perception

Year: 2020

Authors: T Wu, JS Yang, J Sun, CH Dai, XH Li

In order to study drivers’ visual search characteristics, eye movement analysis is used to measure drivers’ hazard perception in different scenarios. The driver’s field of vision is divided into five areas using the mechanical division method, and the potential-danger miss rate is taken as the indicator for evaluating the driver’s hazard perception, in order to analyse drivers’ visual search characteristics and the saccades and fixations occurring during hazard perception. Results show that drivers mainly obtain traffic information from the near area in front of the road, the distant area in front of the road, and potentially dangerous source areas. Drivers with high hazard perception have a wider visual search range and can identify potential dangers more quickly and accurately. Moreover, drivers with high hazard perception tend to pay more visual attention to the near road and the danger area, their visual search scope is more comprehensive, and their visual search strategy is more effective.
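To illustrate the kind of area-based analysis summarised above, here is a minimal sketch that assigns fixation points to rectangular areas of interest and computes a simple potential-danger miss rate. The five area names, their pixel boundaries, the two-second response window, and the data are all hypothetical; the paper’s actual area division and miss-rate definition may differ.

```python
import numpy as np

# Hypothetical areas of interest (AOIs) standing in for the five divisions of
# the driver's field of vision, as (x_min, y_min, x_max, y_max) in pixels.
AOIS = {
    "near_road":    (600, 600, 1320, 1080),
    "distant_road": (600, 300, 1320, 600),
    "left":         (0, 0, 600, 1080),
    "right":        (1320, 0, 1920, 1080),
    "upper":        (600, 0, 1320, 300),
}

def assign_aoi(x, y):
    """Name of the AOI containing a fixation point, or None."""
    for name, (x0, y0, x1, y1) in AOIS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def miss_rate(hazard_onsets, fixation_times, fixation_aois,
              hazard_aoi="distant_road", window=2.0):
    """Fraction of hazards with no fixation in the hazard AOI within
    `window` seconds of onset (one possible miss-rate definition)."""
    fixation_aois = np.asarray(fixation_aois, dtype=object)
    missed = 0
    for onset in hazard_onsets:
        in_window = (fixation_times >= onset) & (fixation_times <= onset + window)
        looked = np.any(fixation_aois[in_window] == hazard_aoi)
        missed += not looked
    return missed / len(hazard_onsets)

# Tiny usage example with four fixations and two hazard onsets (all invented).
times = np.array([0.5, 1.0, 3.2, 3.8])
aois = [assign_aoi(x, y) for x, y in [(900, 700), (200, 500), (900, 450), (1500, 400)]]
print(miss_rate(np.array([0.4, 3.0]), times, aois))   # 0.5
```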

Eye Tracking Glasses
Software

2 versions available