Publication Hub Archive

UX Analysis

You have reached the Ergoneers Publication Hub for:

Field of Application > UX Analysis

Find all Publications here:

Publication Hub

Total results: 534

Effects of searching for street parking on driver behaviour, physiology, and visual attention allocation: An on-road study

Year: 2018

Authors: CT Ponnambalam

On-street parking is a major aspect of the urban street system, and entails important costs to drivers, in terms of time, inconvenience and energy to find a parking space. The efforts needed to park on-street can create frustrations and stress among drivers, which can contribute to unsafe driving behavior and ultimately affect road safety. Additionally, the search for available spaces creates disturbances to traffic, delays to other vehicles, and increased pollution due to extra fuel consumption. In this study, the effects of searching for on-street parking on driver behavior, physiology, and visual attention allocation were investigated using an on-road experimental approach. A total of 32 drivers participated in the study, during which their driving behavior, physiological responses, and visual attention were monitored while they searched for parking on urban streets in Toronto, Canada. Results indicated that searching for on-street parking led to significant changes in drivers' behavior, including reduced speeds and abrupt stops, as well as increases in physiological stress markers and visual attention diversion. Understanding these effects is crucial for urban traffic management and designing policies to mitigate the negative impacts associated with on-street parking search.

Eye Tracking Glasses
Software

3 versions available

Effects of written peer-feedback content and sender’s competence on perceptions, performance, and mindful cognitive processing

Year: 2018

Authors: M Berndt, JW Strijbos, F Fischer

Peer-feedback efficiency might be influenced by the oftentimes voiced concern of students that they perceive their peers’ competence to provide feedback as inadequate. Feedback literature also identifies mindful processing of (peer)feedback and (peer)feedback content as important for its efficiency, but lacks systematic investigation. In a 2 × 2 factorial design, peer-feedback content (concise general feedback [CGF] vs. elaborated specific feedback [ESF]) and competence of the sender (high vs. low) were varied. Students received a scenario containing an essay by a fictional student and fictional peer feedback, a perception questionnaire, and a text revision, distraction, and peer-feedback recall task. Eye tracking was applied to measure how written peer feedback was (re-)read, e.g., glance duration on exact words and sentences. Mindful cognitive processing was inferred from the relation between glance duration and (a) text-revision performance and (b) peer-feedback recall performance. Feedback by a highly competent peer was perceived as more adequate. Compared to CGF, participants who received ESF scored higher on positive affect towards the peer feedback. No effects of peer-feedback content and/or sender’s competence level on performance were found. Glance durations were negatively correlated with text-revision performance regardless of condition, although peer-feedback recall showed that a basic amount of mindful cognitive processing occurred in all conditions. Descriptive findings also hint that this processing might depend on an interaction between peer-feedback content and sender’s competence, signifying a clear direction for future research.

Eye Tracking Glasses
Software

10 versions available

Environmental context influences visual attention to responsible drinking messages

Year: 2018

Authors: D Frings, AC Moss, IP Albery, G Eskisan

Aims: Responsible drinking messages (RDMs) are used as a key tool to reduce alcohol-related harms. A common form of RDM is a poster displayed in places such as bars, bus stops and toilet cubicles. However, evidence for the effectiveness of RDMs remains limited. Moreover, it is not known how environmental contexts (e.g. the number of alcohol-related cues in the environment) affect how such RDMs are interacted with, nor how this in turn affects their efficacy. Methods: One hundred participants completed a pseudo taste preference task in either a bar laboratory (an alcohol-cue-rich environmental context) or a traditional laboratory. The walls of the laboratory displayed either RDM or control posters during this task, and eye tracking was used to assess participant attention to the posters. Results: Participants looked at the RDM posters less in the bar laboratory, where the environmental context is rich in alcohol cues, than in the traditional laboratory, where alcohol cues are sparse. Neither poster type nor environmental context affected the amount of ‘alcohol’ consumed, and the amount of visual attention given to RDMs was unrelated to the amount of ‘alcohol’ consumed. Conclusions: These findings provide experimental evidence that RDMs do not influence drinking behaviour in the intended direction (reduced consumption in situ). In addition, locating RDMs in alcohol-cue-rich environments may result in sub-optimal behavioural responses to the RDM materials (e.g. visual attention to content). To maximize the potential impact of RDMs, the optimal location is in environments where pre-existing alcohol cues are sparse to non-existent.

Eye Tracking Glasses
Simulator

16 versions available

Evaluation of the fine motor skills of children with DCD using the digitalised visual‐motor tracking system

Year: 2018

Authors: R Li, B Li, S Zhang, H Fu, WL Lo, J Yu

The study of coordination between vision and motion in children with developmental coordination disorder (DCD) can help in understanding the mechanism of DCD for timely and appropriate intervention. However, existing visual-motor integrated systems rely on markers attached to the subject to track eye gaze and body movements, which is expensive and unsuitable for DCD assessment. In this study, a markerless visual-motor tracking system, consisting of an eye tracker to track eye gaze and a Kinect to capture body movements, is designed to monitor the behaviour of children during fine motor tasks. The subject's eye gaze position is then mapped into the motion image captured by the Kinect. Data from children placing pegs, captured by the proposed system, are analyzed quantitatively. We find that children with DCD, or at risk of DCD, move their gaze to the target more slowly than typically developing children, while their hand movement speed is almost the same. Beyond DCD analysis, the proposed system is also useful for monitoring other disorders related to visual-motor coordination.
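The two core computations described above, mapping the gaze point into the Kinect's motion image and comparing movement speeds, can be sketched as follows. The planar homography `H` and pixel units are illustrative assumptions; the paper does not publish its calibration details.

```python
import numpy as np

def map_gaze_to_kinect(gaze_px, H):
    """Map a 2D gaze point from the eye tracker's scene image into the
    Kinect colour image via a planar homography H (3x3), an assumed
    calibration model."""
    p = np.array([gaze_px[0], gaze_px[1], 1.0])
    q = H @ p
    return q[:2] / q[2]  # de-homogenize

def mean_speed(points, dt):
    """Mean movement speed (pixels per second) of a tracked point
    sequence sampled every `dt` seconds."""
    diffs = np.diff(np.asarray(points, dtype=float), axis=0)
    return np.linalg.norm(diffs, axis=1).sum() / (dt * len(diffs))
```

With gaze and hand positions expressed in the same image, the same `mean_speed` can be applied to both streams, mirroring the gaze-speed vs. hand-speed comparison in the study.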

Eye Tracking Glasses
Software

4 versions available

Eye blink detection for different driver states in conditionally automated driving and manual driving using EOG and a driver camera

Year: 2018

Authors: J Schmidt, R Laarousi, W Stolzmann

In this article, we examine the performance of different eye blink detection algorithms under various constraints. The goal of the present study was to evaluate the performance of an electrooculogram- and camera-based blink detection process in both manually and conditionally automated driving phases. A further comparison between alert and drowsy drivers was performed in order to evaluate the impact of drowsiness on the performance of blink detection algorithms in both driving modes. Data snippets from 14 monotonous manually driven sessions (mean 2 h 46 min) and 16 monotonous conditionally automated driven sessions (mean 2 h 45 min) were used. In addition to comparing two data-sampling frequencies for the electrooculogram measures (50 vs. 25 Hz) and four different signal-processing algorithms for the camera videos, we compared the blink detection performance of 24 reference groups. The analysis of the videos was based on very detailed definitions of eyelid closure events. The correct detection rates for the alert and manual driving phases (maximum 94%) decreased significantly in the drowsy (minus 2% or more) and conditionally automated (minus 9% or more) phases. Blinking behavior is therefore significantly impacted by drowsiness as well as by automated driving, resulting in less accurate blink detection.
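As a toy illustration of the EOG side of such a pipeline (not the authors' algorithms, whose eyelid-closure event definitions are far more detailed), a blink can be approximated as a contiguous run of samples whose amplitude exceeds a threshold:

```python
import numpy as np

def detect_blinks(eog, thresh):
    """Very simplified blink detector: return (start, end) sample index
    pairs for each contiguous run of samples above `thresh`.
    Illustrates only the thresholding idea, not the paper's method."""
    above = eog > thresh
    edges = np.diff(above.astype(int))
    starts = np.where(edges == 1)[0] + 1   # rising edges
    ends = np.where(edges == -1)[0] + 1    # falling edges
    if above[0]:                            # run already open at t=0
        starts = np.r_[0, starts]
    if above[-1]:                           # run still open at the end
        ends = np.r_[ends, len(eog)]
    return list(zip(starts, ends))
```

A real detector would additionally constrain blink duration and amplitude, which is one reason drowsy blinks (slower, smaller) degrade detection rates as reported above.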

Eye Tracking Glasses
Simulator

6 versions available

Eye-tracking Technology and its Application in Chinese Teaching and Learning Research.

Year: 2018

Authors: L Shi

Eye Tracking Glasses
Software

1 version available

Fast and robust ellipse detection algorithm for head-mounted eye tracking systems

Year: 2018

Authors: I Martinikorena, R Cabeza, A Villanueva

In head-mounted eye tracking systems, the correct detection of pupil position is a key factor in estimating gaze direction. However, this is a challenging issue when videos are recorded in real-world conditions, due to the many sources of noise and artifacts in these scenarios, such as rapid changes in illumination, reflections, occlusions and the elliptical appearance of the pupil. A pupil detection algorithm must therefore be robust under these challenging conditions. In this work, we present a pupil center detection method based on searching for the point of maximum contribution to the radial symmetry of the image. Additionally, two different center refinement steps were incorporated with the aim of adapting the algorithm to images with highly elliptical pupil appearances. The performance of the proposed algorithm is evaluated using a dataset of 225,569 annotated head-mounted eye images from publicly available sources. The results are compared with the best-performing algorithm reported in the literature, and our algorithm is shown to be superior.
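The radial-symmetry idea can be sketched roughly as follows: each strong-gradient pixel casts votes along its gradient direction toward the dark interior, and the accumulator maximum is taken as the pupil centre. This is a minimal illustration of the general principle, not the paper's algorithm or its ellipse-oriented refinement steps.

```python
import numpy as np

def radial_symmetry_center(img, radii):
    """Locate a dark, roughly circular blob by radial-symmetry voting:
    every strong-gradient pixel votes at the points r steps against its
    gradient (towards the darker interior) for each candidate radius r;
    the accumulator maximum is returned as (cx, cy)."""
    img = img.astype(float)
    gy, gx = np.gradient(img)          # gradients along rows, columns
    mag = np.hypot(gx, gy)
    acc = np.zeros_like(img)
    ys, xs = np.nonzero(mag > mag.max() * 0.3)  # strong-gradient pixels
    for y, x in zip(ys, xs):
        # unit gradient points from dark to bright; step against it
        ux, uy = gx[y, x] / mag[y, x], gy[y, x] / mag[y, x]
        for r in radii:
            vy = int(round(y - r * uy))
            vx = int(round(x - r * ux))
            if 0 <= vy < img.shape[0] and 0 <= vx < img.shape[1]:
                acc[vy, vx] += 1
    cy, cx = np.unravel_index(np.argmax(acc), acc.shape)
    return cx, cy
```

For a pupil whose true radius lies inside the tested range, votes from all around the contour coincide near the centre, which is what makes the approach robust to partial occlusions such as eyelids.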

Eye Tracking Glasses
Software

4 versions available

From reading to driving: priming mobile users for take-over situations in highly automated driving

Year: 2018

Authors: SS Borojeni, L Weber, W Heuten, S Boll

Highly automated vehicles occasionally require users to resume vehicle control from non-driving related tasks (NDRTs) by issuing cues called take-over requests (TORs). Because they are engaged in NDRTs, users have a decreased level of situational awareness of the driving context. Therefore, user interface designs for TORs should ensure smooth transitions from NDRTs to vehicle control. In this paper, we investigated the role of decision priming cues as TORs across different levels of NDRT engagement. In a driving simulator, users performed a reading span task while driving in automated mode. They received audio-visual TORs which primed them with an appropriate maneuver (steering vs. braking), depending on the traffic situation. Our results showed that priming users with upcoming maneuvers results in faster responses and longer times to collision with obstacles. However, the level of engagement in the NDRT does not affect user responses to TORs.

Eye Tracking Glasses
Simulator

2 versions available

Gaze and body capture system under VR experiences

Year: 2018

Authors: J Murakami, T Morimoto, I Mitsugami

This paper proposes a novel system to simultaneously capture the gaze behavior and whole-body motion of a person experiencing 6-DOF VR content. The system consists of a VR goggle, eye trackers attached to the goggle, and multiple Kinects. Measurements from these devices are all described in a consistent global coordinate system. Since the Kinects are robustly calibrated, the user's whole-body pose and gaze directions are measured correctly regardless of their position and orientation. Using this system, we can easily capture the gaze behavior and body motion of people in any VR scene, which is helpful for physiological research.
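Expressing every device's measurements in one global frame amounts to applying a rigid transform per device. A minimal sketch for the gaze ray, assuming a rotation `R` and translation `t` for the headset pose obtained from an external calibration (the paper does not publish its parametrisation):

```python
import numpy as np

def gaze_to_global(origin_local, dir_local, R, t):
    """Re-express a gaze ray measured in the headset/eye-tracker frame
    in the shared global coordinate system: points transform by R and t,
    directions by R only (then re-normalised)."""
    origin_g = R @ origin_local + t
    dir_g = R @ dir_local
    return origin_g, dir_g / np.linalg.norm(dir_g)
```

The Kinect skeleton joints transform the same way with each Kinect's own (R, t), which is what lets gaze and body data be compared directly, e.g. testing whether the gaze ray passes near a tracked hand.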

Eye Tracking Glasses
Simulator

3 versions available

Gaze tracking using common webcams

Year: 2018

Authors: S Höffner

Eye and gaze tracking have long been used to study visual attention. Many gaze tracking devices are expensive and require specific setup and calibration procedures; some setups even require multiple computers, one for showing stimuli and one for tracking gaze. Today, modern laptops have enough processing power to process a video stream live, and many come with a built-in webcam for teleconferencing and video chats. This thesis explores the possibility of performing gaze tracking on laptops with built-in webcams using a calibration-free, feature-based approach. To test the approach, the free and open-source software library Gaze is implemented and evaluated. Gaze is shown to achieve very good eye tracking performance while being easy to use and extend. Its gaze tracking abilities still need improvement, but thanks to its modular structure, existing solutions such as pre-trained neural networks can be integrated to leverage their strengths.
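A feature-based pipeline typically starts from simple image measurements. As a deliberately minimal stand-in for the feature extraction such a thesis describes (not the actual Gaze implementation), the pupil can be located as the centroid of the darkest pixels in a grayscale eye patch:

```python
import numpy as np

def pupil_center(eye_patch, frac=0.1):
    """Estimate the pupil centre of a grayscale eye patch as the
    centroid of the darkest `frac` of its pixels, relying on the pupil
    being the darkest region. Returns (cx, cy) in patch coordinates."""
    flat = eye_patch.ravel()
    k = max(1, int(len(flat) * frac))
    thresh = np.partition(flat, k - 1)[k - 1]  # k-th smallest intensity
    ys, xs = np.nonzero(eye_patch <= thresh)
    return xs.mean(), ys.mean()
```

In a full feature-based tracker, per-frame pupil positions like this would be combined with eye-corner features and a head-pose estimate to produce a gaze direction without per-user calibration.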

Eye Tracking Glasses
Software

1 version available