Publication Hub Archive

Dikablis Glasses

You have reached the Ergoneers Publication Hub for:

Dikablis Glasses


Total results: 509

Probabilistic approach to robust wearable gaze tracking

Year: 2017

Authors: M Toivanen, K Lukander

This paper presents a method for computing the gaze point from camera data captured with a wearable gaze tracking device. The method combines a physical model of the human eye, advanced Bayesian computer vision algorithms, and Kalman filtering, resulting in high accuracy and low noise. Our C++ implementation processes camera streams in real time at 30 frames per second. The performance of the system is validated in an exhaustive experimental setup with 19 participants, using a self-made device. Owing to the eye model and binocular cameras, the system is accurate at all distances and invariant to device movement. We also test our system against a best-in-class commercial device, which it outperforms in spatial accuracy and precision. The software and hardware instructions as well as the experimental data are published as open source.
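As a rough illustration of the filtering step mentioned in the abstract, the sketch below applies a minimal constant-position Kalman filter to a stream of 2D gaze points. This is a hypothetical simplification, not the authors' C++ implementation: the function name, the shared scalar variance, and the noise parameters are all assumptions.

```python
def kalman_smooth_gaze(points, process_var=1e-3, meas_var=1e-2):
    """Smooth a sequence of 2D gaze points with a constant-position Kalman filter.

    points: list of (x, y) gaze measurements in screen coordinates.
    Returns a list of smoothed (x, y) estimates, one per input point.
    """
    x, y = points[0]
    p = 1.0  # estimate variance, shared by both axes for simplicity
    smoothed = [(x, y)]
    for mx, my in points[1:]:
        p += process_var          # predict: uncertainty grows between frames
        k = p / (p + meas_var)    # Kalman gain: trust in the new measurement
        x += k * (mx - x)         # update the estimate toward the measurement
        y += k * (my - y)
        p *= (1.0 - k)            # update: uncertainty shrinks after measuring
        smoothed.append((x, y))
    return smoothed
```

Raising `process_var` makes the filter track fast gaze shifts more closely, at the cost of passing through more measurement noise.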

Eye Tracking Glasses
Software

15 versions available

Proceedings of European Conference on Human Centred Design for Intelligent Transport Systems

Year: 2017

Authors: A Morris, L Mendoza

Instrument clusters in trucks are becoming screens, and this brings new challenges for speedometer design. Both traditional speedometer types (i.e., analogue and digital) offer design advantages. However, the existing human-factors literature does not allow a conclusion as to whether one type is more usable and less distracting than the other. Digital speedometers are thought to be more appropriate for absolute and relative reading, while analogue speedometers are thought to be more efficient and less distracting for detecting dynamic speed changes. This study compared both speedometer types on a screen instrument cluster in simulated truck driving. The task-dependent results were replicated. This study updates previous literature and provides a basis for investigating other speedometer types that could be efficient on all three tasks.

Eye Tracking Glasses
Simulator

1 version available

Proceedings of the Human Factors and Ergonomics Society Europe Chapter 2016 Annual Conference: Human Factors and User Needs in Transport, Control …

Year: 2017

Authors: D de Waard, A Toffetti, R Wiczorek, A Sonderegger

This experiment studies the impact of sonifying a hand-gesture-controlled system on driver behavior. The principle is to provide auditory feedback to the driver, in addition to a visual screen, to assist in manipulating in-car device interfaces. A touchless interface was tested with a panel of 24 subjects on a driving simulator. Different tasks involving the screen and the interface (pick up the phone, select an item in a list) had to be performed by the user while driving. To study the contribution of the sound feedback to drivers' behavior, two audio conditions were tested: with and without auditory feedback. The tasks were performed in low- and high-demand traffic conditions. Driving behavior, gaze behavior, and other eye-tracking measures were analyzed. Moreover, a questionnaire was used to obtain subjective measures, such as ease of use and feeling of safety. The results show that the sonification helped drivers feel safer and more focused on the road. This result was confirmed by gaze analysis, which shows that drivers look at the visual interface significantly less when a sound is present, leading to safer use of the interface.

Eye Tracking Glasses
Simulator

3 versions available

Processing of congruent and incongruent facial expressions during listening to music: an eye-tracking study

Year: 2017

Authors: K Kallinen

Studies have shown that (a) multimodal emotional experience may be intensified in a combined music-picture condition and (b) music influences ratings of visual stimuli. However, there is a scarcity of studies examining the potential moderating effects of music on looking at images. In the present paper we report the results of an eye-tracking study on congruent and incongruent emotional music (joyful, sad, and angry) and facial expressions (happy and sad). We expected that facial expressions congruent with the music would attract more attention than incongruent faces. In addition, we expected that angry music (which had no corresponding face images) would elicit the highest eye-movement activity between the facial expressions (as subjects search for a corresponding facial expression). Five men and five women aged 33-64 years (M = 46.9) took part in the experiment. Their task was to listen to three pieces of music (a priori sad, joyful, and angry) while looking at facial expressions (sad and happy) presented on the screen. Eye movements were tracked with an Ergoneers Dikablis eye tracker while participants listened to the music and watched the facial expressions. As expected, with joyful and sad music the congruent faces (i.e., happy faces for joyful music and sad faces for sad music) elicited more attention, in terms of AOI attention ratio and total glance time, than incongruent faces (for AOI attention ratio, Ms = 53.2% and 36.7%, p = .002; for total glance time, Ms = 12.9 and 8.88 seconds, p = .002). For music expressing anger, the preliminary analysis showed no effects. The results provide new information about the interactive effects of emotional music and facial expressions. Knowledge about the effects of music on image processing and the interaction between music and images is important and useful, among other things, in the context of (multi)media design and presentation.
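The two reported measures can be computed directly from fixation data. The sketch below is a minimal illustration of how an AOI attention ratio and total glance time might be derived; the function name, the rectangular AOI, and the (x, y, duration) fixation format are assumptions, not the study's actual analysis pipeline.

```python
def aoi_metrics(fixations, aoi):
    """Compute AOI attention ratio and total glance time from fixation data.

    fixations: list of (x, y, duration_s) tuples in screen coordinates.
    aoi: rectangle (x_min, y_min, x_max, y_max) marking the area of interest.
    Returns (attention_ratio, total_glance_time_s).
    """
    x0, y0, x1, y1 = aoi
    total = sum(d for _, _, d in fixations)
    # Sum the durations of fixations landing inside the AOI rectangle.
    on_aoi = sum(d for x, y, d in fixations if x0 <= x <= x1 and y0 <= y <= y1)
    ratio = on_aoi / total if total else 0.0
    return ratio, on_aoi
```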

Eye Tracking Glasses
Software

1 version available

Protocols for the investigation of information processing in human assessment of fundamental movement skills

Year: 2017

Authors: BJ Ward, A Thornton, B Lay

Fundamental movement skill (FMS) assessment remains an important tool in classifying individuals’ level of FMS proficiency. The collection of FMS performances for assessment and monitoring has remained unchanged over the last few decades, but new motion capture technologies offer opportunities to automate this process. To achieve this, a greater understanding of the human process of movement skill assessment is required. The authors present the rationale and protocols of a project in which they aim to investigate the visual search patterns and information extraction employed by human assessors during FMS assessment, as well as the implementation of the Kinect system for FMS capture.

Eye Tracking Glasses
Software

6 versions available

Quantitative Usability Testing in User-Centered Product Development with Mobile Eye Tracking

Year: 2017

Authors: M Mussgnug

The usability assessment of tangible products holds manifold challenges in identifying the sources of usability problems and quantifying the evaluation of user-product interactions. In this field, the use of modern mobile eye tracking systems shows great potential, as they are capable of non-invasively capturing the user's field of vision, including the gaze point, in almost any real-world setting. With the eye movements captured, the eye tracking data provides information that offers insight into the user's intentions and struggles. However, the prospects of mobile eye tracking for usability assessments of tangible products are not yet well studied, and there is a lack of methods supporting the analysis of the resulting eye tracking data. Therefore, the goal of this thesis is to evaluate mobile eye tracking in usability assessments of tangible products against the conventional third-person view, and to develop methodological support for the data analysis. A comparison study shows that the mobile eye tracking perspective leads to a more detailed description of the scene and a better explanation of the causes of usability problems than the third-person perspective. To facilitate the analysis of eye tracking data, three analysis methods have been developed. The Target-Based Analysis is a coding scheme for manual analysis, whereas the Scrutinizing algorithm and the Hand-Gaze Distance approach are semi-automated supports that detect interruptions of the usage flow, considering fixation durations, saccade amplitudes, and hand movements. The evaluation of the three methods shows that the Target-Based Analysis is applicable to a broad variety of applications but is time-consuming. Both the Scrutinizing algorithm and the Hand-Gaze Distance approach are able to reduce the manual effort and to identify usability problems, but they are less accurate.
The evidence-based descriptions of the detected usability problems, derived from the mobile eye tracking data, are quickly understood and accepted, and foster a solution-oriented discussion when presented to others. Overall, the application of mobile eye tracking in usability testing is valuable, as it allows a fine-granular and quantifiable evaluation of user-product interactions. The three developed methods benefit the analyst and enable the interpretation of eye tracking data in a more structured and partly automated way. With further development of the analysis methods, mobile eye tracking is well suited to become an important element of usability assessments of tangible products.
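The Hand-Gaze Distance idea can be illustrated with a minimal sketch: flag frames in which the gaze point and the hand position diverge beyond a threshold, as a crude proxy for a break in the usage flow. The function name, the pixel threshold, and the frame format are assumptions; the thesis's actual method also considers fixation durations and saccade amplitudes, which this sketch omits.

```python
from math import hypot

def hand_gaze_interruptions(frames, threshold=120.0):
    """Flag frames where hand and gaze diverge, a rough proxy for usage-flow breaks.

    frames: list of ((hand_x, hand_y), (gaze_x, gaze_y)) positions in pixels.
    Returns the indices of frames whose hand-gaze distance exceeds the threshold.
    """
    return [i for i, ((hx, hy), (gx, gy)) in enumerate(frames)
            if hypot(gx - hx, gy - hy) > threshold]
```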

Eye Tracking Glasses
Simulator

1 version available

Randompad: Usability of randomized mobile keypads for defeating inference attacks

Year: 2017

Authors: A Maiti, K Crager

The feasibility of malicious keystroke inference attacks on mobile device keypads has been demonstrated by multiple recent research efforts, but very little has been accomplished in the direction of protection against such attacks. One common assumption in these attacks is that the adversary has knowledge of the size and layout of the keypad employed by the target user, which is reasonable as keypad layouts and sizes are generally standard. Thus, an effective protection strategy against such keystroke inference attacks would be to randomly change the layout of the target keypad. However, before proposing unconventional changes to the widely used and highly familiar default keypads, a comprehensive usability evaluation is required. This paper accomplishes this goal by comprehensively studying the usability of randomized keypads that employ varying degrees of randomization in terms of key size, sequence and position. The privacy-usability trade-off of different randomized keypad strategies is then analyzed by empirically comparing their ease-of-usage and security assurance.
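Layout randomization of the kind studied here can be sketched as a simple key-sequence shuffle. This is a hypothetical illustration only; the paper also varies key size and position, which this sketch does not model.

```python
import random

def randomized_keypad(seed=None):
    """Return a phone-style numeric keypad with a randomly shuffled digit sequence.

    A fresh shuffle per entry denies an adversary the standard layout
    that keystroke inference attacks assume.
    """
    keys = list("0123456789")
    random.Random(seed).shuffle(keys)
    # Arrange as three rows of three digits plus a final row with the last digit.
    return [keys[0:3], keys[3:6], keys[6:9], keys[9:]]
```

The usability cost is exactly the trade-off the study measures: the user must visually search the shuffled layout on every entry instead of relying on motor memory.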

Eye Tracking Glasses
Software

7 versions available

Reducing Peak Workload in the Cockpit

Year: 2017

Authors: TJJ Bos, GDR Zon, W Rouwhorst

In an effort to increase safety and reduce peak-workload situations in the cockpit, a tool with a different interaction style was developed for use when Air Traffic Control instructs a runway change during approach. In an experiment, workload was compared between the new tool and the conventional cockpit. Workload was measured by means of a self-rating after each experiment run, as well as eye blink frequency during each run. Results show that the self-rated workload decreases with the new tool for one of the two crew members, and the blink frequency suggests a workload decrease for the other crew member. Considering that the crews used the tool after only a few training runs, and given the positive feedback they provided, it is concluded that the tool has a positive effect on peak workload.

Simulator
Software

2 versions available

Sensorimotor integration of vision and proprioception for obstacle crossing in ambulatory individuals with spinal cord injury

Year: 2017

Authors: RN Malik, R Cote, T Lam

Skilled walking, such as obstacle crossing, is an essential component of functional mobility. Sensorimotor integration of visual and proprioceptive inputs is important for successful obstacle crossing. The objective of this study was to understand how proprioceptive deficits affect obstacle-crossing strategies when controlling for variations in motor deficits in ambulatory individuals with spinal cord injury (SCI). Fifteen ambulatory individuals with SCI and 15 able-bodied controls were asked to step over an obstacle scaled to their motor abilities under full and obstructed vision conditions. An eye tracker was used to determine gaze behaviour, and motion capture analysis was used to determine toe kinematics relative to the obstacle. Combined, bilateral hip and knee proprioceptive sense (joint position sense and movement detection sense) was assessed using the Lokomat and customized software controls. This proprioceptive sense varied among subjects with SCI and was significantly different from that of able-bodied subjects. Subjects with greater proprioceptive deficits stepped higher over the obstacle with their lead and trail limbs in the obstructed vision condition compared with full vision. Subjects with SCI also glanced at the obstacle more frequently and with longer fixation times compared with controls, but this was not related to proprioceptive sense. This study indicates that ambulatory individuals with SCI rely more heavily on vision to cross obstacles and show impairments in key gait parameters required for successful obstacle crossing. Our data suggest that proprioceptive deficits need to be considered in rehabilitation programs aimed at improving functional mobility in ambulatory individuals with SCI.

Eye Tracking Glasses
Software

7 versions available

SubsMatch 2.0: Scanpath comparison and classification based on subsequence frequencies

Year: 2017

Authors: TC Kübler, C Rothe, U Schiefer, W Rosenstiel

Our eye movements are driven by a continuous trade-off between the need for detailed examination of objects of interest and the necessity of keeping an overview of our surroundings. Consequently, behavioral patterns that are characteristic of our actions and their planning are typically manifested in the way we move our eyes to interact with our environment. Identifying such patterns from individual eye movement measurements is, however, highly challenging. In this work, we tackle the challenge of quantifying the influence of experimental factors on eye movement sequences. We introduce an algorithm for extracting sequence-sensitive features from eye movements and for classifying eye movements based on the frequencies of small subsequences. Our approach is evaluated against the state of the art on a novel and very rich collection of eye movement data derived from four experimental settings, from static viewing tasks to highly dynamic outdoor settings. Our results show that the proposed method is able to classify eye movement sequences across a variety of experimental designs. The choice of parameters is discussed in detail, with special focus on highlighting different aspects of general scanpath shape. Algorithms and evaluation data are available at: http://www.ti.uni-tuebingen.de/scanpathcomparison.html.
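Classification by subsequence frequencies can be sketched as n-gram counting over a scanpath encoded as a string of AOI labels. This is a simplified illustration of the general idea, not the SubsMatch 2.0 algorithm itself; the function name and the string encoding are assumptions.

```python
from collections import Counter

def subsequence_frequencies(scanpath, n=2):
    """Relative frequencies of contiguous length-n subsequences of a scanpath.

    scanpath: string of AOI labels, e.g. "ABAC".
    Returns a dict mapping each subsequence to its relative frequency; such
    frequency vectors can then be fed to any standard classifier.
    """
    subs = [scanpath[i:i + n] for i in range(len(scanpath) - n + 1)]
    counts = Counter(subs)
    total = len(subs)
    return {s: c / total for s, c in counts.items()}
```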

Eye Tracking Glasses
Software

7 versions available