Publication Hub Archive

Eye Tracker

You have reached the Ergoneers Publication Hub for:

Used Tool > Eye Tracker


Total results: 582

Understanding and Supporting Anticipatory Driving in Automated Vehicles

Year: 2020

Authors: D He

He, Dengbo. University of Toronto (Canada), 2020. Abstract: As automated vehicles (AVs) are increasingly becoming a reality on our roads, understanding the interaction between human drivers and these vehicles is critical. Anticipatory driving refers to the human driver's ability to predict and react to road events before they occur, a skill that enhances safety and efficiency. This dissertation explores methods to support anticipatory driving behaviors in AVs through improved human-vehicle interaction. The research identifies key anticipatory behaviors, develops support systems for these behaviors, and evaluates their effectiveness. Findings suggest that enhancing AV interfaces and feedback mechanisms can significantly improve human-vehicle collaboration and overall driving performance.

Simulator
Software

4 versions available

Understanding the Cognitive and Psychological Impacts of Emerging Technologies on Driver Decision-Making Using Physiological Data

Year: 2020

Authors: S Agrawal

Emerging technologies, such as advanced driver-assistance systems (ADAS) and autonomous vehicles (AVs), are transforming the driving experience. These technologies can influence driver cognition and decision-making processes in various ways. This study aims to understand the cognitive and psychological impacts of these emerging technologies on driver decision-making by utilizing physiological data. Through the analysis of data such as heart rate variability, skin conductance, and eye-tracking metrics, the research investigates how drivers' mental and physical states are affected during interaction with ADAS and AVs. The findings aim to provide insights into improving the design and safety of these technologies, ultimately enhancing driver comfort and performance.

Eye Tracking Glasses
Software

4 versions available

Understanding the role of visual attention on wines’ purchase intention: An eye-tracking study

Year: 2020

Authors: P Monteiro,J Guerreiro,SMC Loureiro

Purpose: Wine bottles compete for consumers’ attention on the shelf during the decisive moment of choice. This study aims to explore the role that visual attention to wine labels has on the purchase decision and the mediating role of quality perceptions and desire on such purchase behaviors. Wine awards and consumption situation are used as moderators.

Design/methodology/approach: The study was conducted in Portugal and 36 individuals participated in a 2 × 2 within-subjects design (awarded/not awarded × self-consumption/social consumption). For each scenario, individuals’ attention, perceptions of quality, desire and purchase intentions were recorded.

Findings: Eye-tracking data show that, during the purchase process, the amount of attention given to a bottle is a determinant of individuals’ purchase intentions, a relationship that becomes more significant for bottles with awards and when consumers are buying wine for a consumption situation involving a social environment. In addition, both quality perceptions and desire are confirmed to positively influence wines’ purchase intentions.

Originality/value: By using an eye monitoring method, this paper brings new insights into the wine industry by highlighting the impact that wines’ labels and different consumption situations have on individuals’ attention and purchase intention. Wine producers and retailers may benefit from the insights provided by the current study to refine their communication strategies, either by highlighting product characteristics and pictorial elements, as is the case with the awards, or by communicating about their products for different consumption situations.

Eye Tracking Glasses
Software

6 versions available

User interface for in-vehicle systems with on-wheel finger spreading gestures and head-up displays

Year: 2020

Authors: SH Lee, SO Yoon

Interacting with an in-vehicle system through a central console is known to induce visual and biomechanical distractions, thereby delaying the danger recognition and response times of the driver and significantly increasing the risk of an accident. To address this problem, various hand gestures have been developed. Although such gestures can reduce visual demand, they are limited in number, lack passive feedback, and can be vague and imprecise, difficult to understand and remember, and culture-bound. To overcome these limitations, we developed a novel on-wheel finger spreading gestural interface combined with a head-up display (HUD), allowing the user to choose a menu displayed in the HUD with a gesture. This interface displays the audio and air conditioning functions of the central console on a HUD and enables their control using a specific number of fingers while keeping both hands on the steering wheel. We compared the effectiveness of the newly proposed hybrid interface against a traditional tactile interface for a central console, using objective measurements and subjective evaluations of both vehicle and driver behaviour. A total of 32 subjects were recruited to conduct experiments on a driving simulator equipped with the proposed interface under various scenarios. The results showed that the proposed interface was approximately 20% faster in emergency response than the traditional interface, whereas its performance in maintaining vehicle speed and lane was not significantly different from that of the traditional one.

Eye Tracking Glasses
Simulator

7 versions available

User-centred design and evaluation of a tele-operated echocardiography robot

Year: 2020

Authors: M Giuliani, D Szczęśniak

We present the collected findings of a user-centred approach to developing a tele-operated robot for remote echocardiography examinations. During the three-year development of the robot, we involved users in all development stages to increase the usability of the system for the doctors. For requirement compilation, we conducted a literature review, observed two traditional examinations, arranged focus groups with doctors and patients, and conducted two online surveys. During development, we regularly involved doctors in usability tests to receive feedback on the robot's user interface and hardware. For evaluation, we conducted two eye tracking studies. In the first study, doctors executed a traditional echocardiography examination. In the second study, the doctors conducted a remote examination with our robot. The results of the studies show that all doctors were able to successfully complete a correct ultrasonography examination with the tele-operated robot. In comparison to a traditional examination, the doctors on average needed only a small amount of additional time to successfully examine a patient when using our remote echocardiography robot. The results also show that the doctors fixate considerably more often, but with shorter fixation times, on the USG screen in the traditional examination compared to the remote examination. We further found that some of the user-centred design methods we applied had to be adjusted to the clinical context and the hectic schedule of the doctors. Overall, our experience and results suggest that user-centred design methodology is well suited for developing medical robots and leads to a usable product that meets the end users' needs.

Eye Tracking Glasses
Software

6 versions available

What comes first: combining motion capture and eye tracking data to study the order of articulators in constructed action in sign language narratives

Year: 2020

Authors: T Jantunen,A Puupponen,B Burger

We use synchronized 120 fps motion capture and 50 fps eye tracking data from two native signers to investigate the temporal order in which the dominant hand, the head, the chest and the eyes start producing overt constructed action from regular narration in seven short Finnish Sign Language stories. From the material, we derive a sample of ten instances of regular narration to overt constructed action transfers in ELAN which we then further process and analyze in Matlab. The results indicate that the temporal order of articulators shows both contextual and individual variation but that there are also repeated patterns which are similar across all the analyzed sequences and signers. Most notably, when the discourse strategy changes from regular narration to overt constructed action, the head and the eyes tend to take the leading role, and the chest and the dominant hand tend to start acting last. Consequences of the findings are discussed.
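The onset-order analysis described above, combining streams recorded at different frame rates (120 fps motion capture, 50 fps eye tracking), can be sketched as follows. This is an illustrative reconstruction, not the authors' Matlab pipeline; the articulator names, frame numbers, and the `onset_order` helper are hypothetical.

```python
def onset_order(onset_frames, fps):
    """Convert per-articulator onset frames (each stream possibly
    sampled at a different rate) into seconds on a shared timeline,
    then return the articulators ordered by onset time."""
    onsets_s = {art: frame / fps[art] for art, frame in onset_frames.items()}
    return sorted(onsets_s, key=onsets_s.get)

# Hypothetical example: eyes annotated on the 50 fps eye tracking
# stream, head and hand on the 120 fps motion capture stream.
order = onset_order(
    {"eyes": 40, "head": 100, "hand": 130},
    {"eyes": 50, "head": 120, "hand": 120},
)
```

Here the eyes start at 0.80 s and the head at about 0.83 s, matching the paper's observed tendency for the eyes and head to lead the dominant hand.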

Eye Tracking Glasses
Software

5 versions available

Where do pedestrians look when crossing? A state of the art of the eye-tracking studies

Year: 2020

Authors: L Lévêque,M Ranchet,J Deniel, JC Bornard

It has been widely shown in the literature that analysing eye movements and positions can provide useful information for a better understanding of human perception and cognition. Eye-tracking technology, as a process of measuring where people look, has established itself as a widespread means of studying visual information processing in several domains, including the study of human walking. Street-crossing can be defined as a particular form of walking. Indeed, several elements have to be considered in the decision-making process, such as the distance headway, traffic density, vehicle speed, etc. It is also a very risky aspect of walking, as pedestrians are considered among the most vulnerable road users. In this article, we present an up-to-date comprehensive review of existing eye-tracking experiments in the literature, from the pedestrian's point of view, with a view to studying the effects of both internal (e.g., age) and external (e.g., road environment) factors on pedestrians' road-crossing gaze behaviour. Furthermore, the current gaps in the literature are discussed in order to open up some future perspectives in the field, such as the forthcoming introduction of automated vehicles on the roads.

Eye Tracking Glasses
Software

7 versions available

500,000 images closer to eyelid and pupil segmentation

Year: 2019

Authors: W Fuhl,W Rosenstiel,E Kasneci

Human gaze behavior is not the only important aspect of eye tracking. The eyelids reveal additional important information, such as fatigue, and the pupil size holds indications of the workload. The current state-of-the-art datasets focus on challenges in pupil center detection, whereas other aspects, such as lid closure and pupil size, are neglected. Therefore, we propose a fully convolutional neural network for pupil and eyelid segmentation as well as eyelid landmark and pupil ellipse regression. The network is jointly trained using the log loss for segmentation and the L1 loss for landmark and ellipse regression. The proposed network is intended for offline processing and dataset creation; the resulting datasets can be used to train resource-saving, real-time machine learning algorithms such as random forests. In addition, we will provide the world's largest eye image dataset, with more than 500,000 images.
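The joint training objective (log loss for the segmentation branch, L1 loss for the landmark and ellipse regression branch) can be sketched as a weighted sum. This is a minimal NumPy illustration of that loss structure, not the paper's implementation; the weights and function names are assumptions.

```python
import numpy as np

def log_loss(probs, targets, eps=1e-9):
    # Pixel-wise negative log-likelihood for the segmentation branch;
    # `probs` are predicted class probabilities, `targets` one-hot masks.
    return float(-np.mean(targets * np.log(probs + eps)))

def l1_loss(pred, target):
    # Mean absolute error for landmark / ellipse parameter regression.
    return float(np.mean(np.abs(pred - target)))

def joint_loss(seg_probs, seg_targets, reg_pred, reg_target,
               w_seg=1.0, w_reg=1.0):
    # Weighted sum used to train both branches jointly; the weights
    # w_seg and w_reg are hypothetical hyperparameters.
    return (w_seg * log_loss(seg_probs, seg_targets)
            + w_reg * l1_loss(reg_pred, reg_target))
```

In a deep learning framework the same structure would appear as `loss = ce(seg_out, mask) + l1(reg_out, params)` backpropagated through the shared encoder.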

Eye Tracking Glasses
Software

4 versions available

A comparison of pilot upset recovery performance in simulation and flight

Year: 2019

Authors: CM Reuter

This research examined the differences in pilot upset recovery performance in simulation and actual flight. The study compared pilots' ability to recover from unanticipated aircraft upsets in both environments to determine the effectiveness of simulation training versus real-world training. Data were gathered from multiple pilot groups with varying levels of experience and proficiency. The findings provide insights into the advantages and limitations of using simulation for upset recovery training and highlight the importance of incorporating both simulation and flight training for comprehensive pilot preparation.

Eye Tracking Glasses
Simulator

3 versions available

A Comparison of Quiet Eye in Children with High and Low Motor Proficiency

Year: 2019

Authors: H Fahimi, E Arabameri

The aim of this study was to compare quiet eye (QE) in children with high and low motor proficiency. In this causal-comparative study, 40 children (7 to 14 years old) in Isfahan city were selected by a multiple cluster sampling method. Participants performed throwing and catching tasks in 10 trials. Quiet eye data were recorded with an Ergoneers eye tracking device and analyzed in Dikablis 3.1 software while participants performed the task. After data normality was evaluated, the data were analyzed by independent t test and Pearson correlation coefficient at a significance level of 0.05. Independent t test results showed that children with high motor proficiency were better in the onset of QE (P = 0.0001), offset of QE (P = 0.023), QE duration (P = 0.0001) and catching performance (P = 0.0001) than children with lower motor proficiency. The results also showed a significant negative relationship between catching performance and the onset of QE, and significant positive relationships between catching performance and both the offset of QE and QE duration, in both groups. Overall, the results revealed that QE is an effective variable associated with motor performance and sensitive to expertise level.
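The analysis pipeline above (independent t test between groups, Pearson correlation between QE measures and catching performance) can be sketched with plain NumPy. This is an illustrative reconstruction of the statistics, not the authors' code; the pooled-variance t statistic shown is one common form of the independent-samples test.

```python
import numpy as np

def independent_t(a, b):
    # Pooled-variance independent-samples t statistic, as used to
    # compare QE measures between the two proficiency groups.
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return float((a.mean() - b.mean()) / np.sqrt(sp2 * (1 / na + 1 / nb)))

def pearson_r(x, y):
    # Pearson correlation, e.g. catching performance vs. QE duration.
    return float(np.corrcoef(x, y)[0, 1])
```

In practice the t statistic would be compared against the t distribution with n_a + n_b − 2 degrees of freedom to obtain the reported P values.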

Eye Tracking Glasses
Software

2 versions available