Publication Hub Archive

Dikablis Glasses

You have reached the Ergoneers Publication Hub for the Dikablis Glasses.


Total results: 509

Pupil segmentation approach on low resolution images

Year: 2015

Authors: LV Romaguera, FP Romero, CRV Seisdedos

The characteristics of the iris and the pupil are useful in a wide range of applications. There are several studies on pupil detection; however, most of them are evaluated on infrared, ideal, or high-quality images. In this paper we propose a method based on a combination of pre-processing (histogram equalization, thresholding, and morphological operations), edge detection, and the Hough transform. The evaluation was performed on 1214 low-resolution images from the UBIRIS database. The experimental results show that the proposed method is feasible and has acceptable accuracy. Its major advantage is that it can be used with images captured by a cheap webcam.
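
As a rough sketch of this kind of pipeline, using OpenCV (the paper's exact parameter values are not given in the abstract, so the threshold, kernel, and Hough settings below are illustrative):

```python
import cv2
import numpy as np

def detect_pupil(gray):
    """Illustrative pupil-detection pipeline: pre-processing,
    edge detection, and a circular Hough transform."""
    # Pre-processing: histogram equalization, inverse thresholding
    # (the pupil is dark), and morphological opening to remove noise.
    eq = cv2.equalizeHist(gray)
    _, thresh = cv2.threshold(eq, 50, 255, cv2.THRESH_BINARY_INV)
    kernel = np.ones((5, 5), np.uint8)
    clean = cv2.morphologyEx(thresh, cv2.MORPH_OPEN, kernel)

    # Edge detection on the cleaned binary mask.
    edges = cv2.Canny(clean, 50, 150)

    # Circular Hough transform; candidates come back as (x, y, radius).
    circles = cv2.HoughCircles(edges, cv2.HOUGH_GRADIENT, dp=1,
                               minDist=gray.shape[0] // 2,
                               param1=100, param2=15,
                               minRadius=5, maxRadius=60)
    if circles is None:
        return None
    x, y, r = np.round(circles[0, 0]).astype(int)
    return x, y, r

# Usage: gray = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)
```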

Eye Tracking Glasses
Software

3 versions available

Reviewers: Ian Giblet (CAS-UK), Jan-Patrick Osterloh (OFF)

Year: 2015

Authors: STWT Borchers, MUTO Botta, SSNV Collina

This deliverable reports the progress of the HoliDes consortium in developing methods, techniques, and tools (MTTs) for the Human Factors Reference Technology Platform (HF-RTP), version 1.0. For work package 5 (WP5), it concludes project cycle I. During this cycle, we first received the requirements from the application work packages WP6-9. After analysing these requirements (cf. D5.1), we selected the metrics and methods to be developed in WP5 that could best meet the AdCoS owners' needs. Having documented those MTTs as HF-RTP 0.5 in D5.2, we then produced the first instantiations of these techniques and tools; the result of this work is described in this document. For each method, technique, or tool, a detailed description is provided of the data the MTTs receive, the data they provide, their current functionality, and their definitive and further potential use cases. These use cases (see Table 1) originate from the four HoliDes domains: Health, Aerospace, Control Room, and Automotive.

In our definition, a method is a general way to solve a problem, for example the use of task analysis to answer a general design question. A technique is a concrete instantiation of such a method, such as the application of a specific form of task analysis to the development and evaluation of an adaptive system. Finally, a tool is a technique that has been realized as either hardware or software; such a tool could be a program that aids the collection and organisation of observations during the task analysis.

The MTTs created in this work package follow the objective to “Develop techniques and tools for empirical analysis of Adaptive Cooperative Human-Machine Systems (AdCoS) against human factors and safety regulations.” To achieve this objective and provide the application work packages 6-9 (WP6-9) with methods that best suit their needs, the starting point of our work has been the AdCoS requirements from WP6-9. Some of these requirements describe genuine AdCoS functionality, while others relate to MTTs necessary to develop AdCoS functionality or to aspects of the design process itself. Consequently, the purpose of WP5's MTTs is to enable, aid, and assist the empirical analysis of adaptive, cooperative systems, or to act as part of these systems' functionality. The actual outcome of the work presented here will be software tools, empirical results as the basis for system functionality and design decisions, and also procedures and algorithms and their implementation. All of these efforts help realize the human-centred design process as described, e.g., in ISO 9241-210, both during design and evaluation. For a quick overview of WP5's MTT landscape, Table 1 shows the names and short descriptions of all methods, techniques, and tools, as well as the use cases to which they are being applied.

Simulator
Software

1 version available:

SET: a pupil detection method using sinusoidal approximation

Year: 2015

Authors: AH Javadi, Z Hakimi, M Barati, V Walsh

Mobile eye-tracking in external environments remains challenging, despite recent advances in eye-tracking software and hardware engineering. Many current methods fail to deal with the vast range of outdoor lighting conditions and the speed at which these can change. This confines experiments to artificial environments where conditions must be tightly controlled. Additionally, the emergence of low-cost eye-tracking devices calls for the development of analysis tools that enable non-technical researchers to process the images these devices produce. We have developed a fast and accurate method (known as “SET”) that is suitable even for natural environments with uncontrolled, dynamic and even extreme lighting conditions. We compared the performance of SET with that of two open-source alternatives by processing two collections of eye images: images of natural outdoor scenes with extreme lighting variations (“Natural”) and images of less challenging indoor scenes (“CASIA-Iris-Thousand”). We show that SET excelled in outdoor conditions and was faster, without significant loss of accuracy, indoors. SET offers a low-cost eye-tracking solution, delivering high performance even in challenging outdoor environments. It is offered as an open-source MATLAB toolkit as well as a dynamic-link library (“DLL”) that can be imported into many programming languages, including C# and Visual Basic, on Windows.
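
The authors distribute SET as a MATLAB toolkit; the sketch below only illustrates the core idea of sinusoidal boundary fitting in Python, under the assumption that candidate segments come from thresholding and that an ellipse-like border is scored by how well its polar radius fits a sinusoid. This scoring formulation is our illustration, not the authors' exact algorithm:

```python
import numpy as np

def sinusoidal_fit_quality(border_xy):
    """Score how well a candidate segment's border matches a
    sinusoidal (ellipse-like) shape in polar coordinates.
    border_xy: (N, 2) array of boundary pixel coordinates."""
    center = border_xy.mean(axis=0)
    dx, dy = (border_xy - center).T
    theta = np.arctan2(dy, dx)
    r = np.hypot(dx, dy)

    # For a near-circular ellipse, the polar radius is well
    # approximated by r(theta) ~ a0 + a1*cos(2*theta) + b1*sin(2*theta).
    design = np.column_stack([np.ones_like(theta),
                              np.cos(2 * theta),
                              np.sin(2 * theta)])
    coeffs, *_ = np.linalg.lstsq(design, r, rcond=None)
    residual = r - design @ coeffs

    # Normalized RMS residual: lower means more pupil-like, so the
    # segment with the smallest score would be picked as the pupil.
    return np.sqrt(np.mean(residual ** 2)) / r.mean()
```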

Eye Tracking Glasses
Software

13 versions available

Speaking and listening with the eyes: Gaze signaling during dyadic interactions

Year: 2015

Authors: S Ho, T Foulsham, A Kingstone

Cognitive scientists have long been interested in the role that eye gaze plays in social interactions. Previous research suggests that gaze acts as a signaling mechanism and can be used to control turn-taking behaviour. However, early research on this topic employed methods of analysis that aggregated gaze information across an entire trial (or trials), which masks any temporal dynamics that may exist in social interactions. More recently, attempts have been made to understand the temporal characteristics of social gaze, but little research has been conducted in a natural setting with two interacting participants. The present study combines a temporally sensitive analysis technique with modern eye-tracking technology to 1) validate the overall results from earlier aggregated analyses and 2) provide insight into the specific moment-to-moment temporal characteristics of turn-taking behaviour in a natural setting. Dyads played two social guessing games (20 Questions and Heads Up) while their eyes were tracked. Our general results are in line with past aggregated data, and, using cross-correlational analysis on the specific gaze and speech signals of both participants, we found that 1) speakers end their turn with direct gaze at the listener and 2) the listener in turn begins to speak with averted gaze. Convergent with theoretical models of social interaction, our data suggest that eye gaze can be used to signal both the end and the beginning of a speaking turn during a social interaction. The present study offers insight into the temporal dynamics of live dyadic interactions and also provides a new method of analysis for eye-gaze data when temporal relationships are of interest.
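
The abstract does not give the analysis parameters; a minimal sketch of a lagged cross-correlation between a binary gaze signal (1 = looking at the partner) and a binary speech signal (1 = talking), as one might compute per dyad, could look like this:

```python
import numpy as np

def gaze_speech_xcorr(gaze, speech, max_lag):
    """Cross-correlate two equally sampled binary time series.
    Returns lags (in samples) and the normalized correlation at
    each lag; a peak at a positive lag means gaze changes tend
    to precede speech changes by that many samples."""
    gaze = (gaze - gaze.mean()) / gaze.std()
    speech = (speech - speech.mean()) / speech.std()
    n = len(gaze)
    lags = np.arange(-max_lag, max_lag + 1)
    # For lag k, correlate gaze[t] with speech[t + k] over the
    # overlapping portion of the two series.
    corr = [np.mean(gaze[max(0, -k):n - max(0, k)] *
                    speech[max(0, k):n - max(0, -k)])
            for k in lags]
    return lags, np.array(corr)
```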

Eye Tracking Glasses
Software

13 versions available

Towards virtually transparent vehicles: first results of a simulator study and a field trial

Year: 2015

Authors: MCA Baltzer, A Krasni, P Boehmsdorff

Current versions of heavy trucks, tanks and excavators suffer from limited visibility due to small windshields. One option for overcoming such limitations is to create a virtually transparent vehicle by using a camera-monitor / head-mounted display (HMD) system to provide the driver with a seamless view. The aim of the study is to compare two vision systems for 'virtually transparent' vehicles, an HMD and a camera-monitor system, in a simulation environment with regard to ergonomic aspects and future prospects. The simulator includes a generic mock-up of the vehicle to emulate the visual masking effects of a real armoured vehicle; thus, the driver can experience the obstruction of the visual space caused by the A-pillars. In addition, the degree of immersion of the driver is increased by windows on the left and right sides. The camera-monitor vision system is built in a semicircular arrangement in front of the driver with five 13-inch monitors. In this arrangement, the interior angle between adjacent displays is 40°, hence a total view of 160° can be displayed. The display panels have a maximum resolution of 1280 x 960 and an aspect ratio of 16:10. The alternative vision system with HMD uses an Oculus Rift DK2. In order to create a three-dimensional view around the driver, the images are projected onto a curved surface, which gives the driver the freedom to look around in all directions. The Oculus Rift provides a nominal field of view (FoV) of approximately 100°. A simulated distance of about 16 km was repeatedly driven for 2 hours in different test conditions, such as federal highways, short stretches of off-road track, and crossings with simulated intersection traffic, under consideration of the rules of the road. In order to minimise sequence effects, the order in which these test conditions were presented was varied. After driving in each condition, acceptance, performance, subjective stress (NASA-TLX), workload, usability and driving performance were determined. As a secondary task, the driver had to identify and announce possible threats out loud.
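
The 160° figure follows from the arrangement: with five display centers spaced at 40° to their neighbours, the four gaps between them span 4 × 40° = 160°. As a trivial check (an illustration, not the authors' code):

```python
# Back-of-the-envelope check of the semicircular monitor array.
n_displays = 5
angle_between = 40  # degrees between adjacent display centers

# Five display centers leave (n - 1) gaps of 40 degrees each.
span = (n_displays - 1) * angle_between
print(span)  # 160
```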

Eye Tracking Glasses
Simulator

1 version available:

Using sound to reduce visual distraction from in-vehicle human–machine interfaces

Year: 2015

Authors: P Larsson, M Niemand

Objective: Driver distraction and inattention are the main causes of accidents. The fact that devices such as navigation displays and media players are part of the distraction problem has led to the formulation of guidelines advocating various means for minimizing the visual distraction from such interfaces. However, even when design guidelines and recommendations are followed, certain interface interactions, such as menu browsing, still require off-road visual attention that increases crash risk. In this article, we investigate whether adding sound to an in-vehicle user interface can provide the support necessary to create a significant reduction in glances toward a visual display when browsing menus. Methods: Two sound concepts were developed and studied: spearcons (time-compressed speech sounds) and earcons (musical sounds). A simulator study was conducted in which 14 participants between the ages of 36 and 59 took part. Participants performed 6 different interface tasks while driving along a highway route. A 3 × 6 within-group factorial design was employed with sound (no sound/earcons/spearcons) and task (6 different task types) as factors. Eye glances and corresponding measures were recorded using a head-mounted eye tracker. Participants' self-assessed driving performance was also collected after each task on a 10-point scale ranging from 1 = very bad to 10 = very good. Separate analyses of variance (ANOVAs) were conducted for the different eye-glance measures and self-rated driving performance. Results: It was found that the added spearcon sounds significantly reduced total glance time as well as the number of glances while retaining task time, as compared to the baseline (no sound) condition (total glance time M = 4.15 for spearcons vs. M = 7.56 for baseline, p = .03). The earcon sounds did not result in such distraction-reducing effects. Furthermore, participants' ratings of their driving performance were statistically significantly higher in the spearcon conditions compared to the baseline and earcon conditions (M = 7.08 vs. M = 6.05 and M = 5.99 respectively, p = .035 and p = .002). Conclusions: The spearcon sounds seem to efficiently reduce visual distraction, whereas the earcon sounds did not reduce distraction measures or increase subjective driving performance. An aspect that must be further investigated is how well spearcons and other types of auditory displays are accepted by drivers in general and how they work in real traffic.
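
Spearcons are typically produced by time-compressing a recorded or synthesized utterance without shifting its pitch. A minimal sketch using librosa (the compression rate and file paths are illustrative; the paper does not state how its spearcons were generated):

```python
import librosa
import soundfile as sf

def make_spearcon(speech_wav_path, out_path, rate=2.5):
    """Create a spearcon by time-compressing a spoken menu-item
    recording while preserving its pitch. rate=2.5 plays the
    speech 2.5x faster; this factor is an illustrative choice,
    not the value used in the study."""
    y, sr = librosa.load(speech_wav_path, sr=None)
    compressed = librosa.effects.time_stretch(y, rate=rate)
    sf.write(out_path, compressed, sr)

# Usage: make_spearcon("navigate_home.wav", "navigate_home_spearcon.wav")
```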

Eye Tracking Glasses
Simulator

11 versions available

An experimental eye-tracking study for the design of a context-dependent social robot blinking model

Year: 2014

Authors: A Zaraki, MB Dehkordi, D Mazzei

Human gaze and blinking behaviours have recently been considered as a way to empower humanlike robots to convey realistic behaviour in social human-robot interaction. This paper reports the findings of our investigation of human eye-blinking behaviour in relation to gaze behaviour in a human-human interaction. These findings can then be used to design a humanlike eye-blinking model for a social humanlike robot. In an experimental eye-tracking study, we showed 11 participants a 7-minute video of social interactions between two people and collected their eye-blinking and gaze behaviours with an eye tracker. Analysing the collected data, we measured the participants' blinking rate, maximum and minimum blinking duration, number of frequent (multiple) blinks, and gaze directions in the environment. The results revealed that participants' blinking rate in a social interaction is qualitatively correlated with gaze behaviour, as a higher number of gaze shifts increased the blinking rate. Based on the findings of this study, we propose a context-dependent blinking model as an important component of the robot's gaze control system that can empower our robot to mimic human blinking behaviour in a multiparty social interaction.
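
A minimal sketch of how such a context-dependent blinking model could be structured, assuming a simple Poisson-style blink process whose rate is temporarily raised after each gaze shift (all rates and durations below are placeholders, not the values measured in the study):

```python
import random

class ContextDependentBlinker:
    """Toy blink scheduler for a social robot: a baseline blink
    rate that is temporarily boosted after each gaze shift."""

    def __init__(self, base_rate=0.3, shift_boost=0.6, boost_s=1.0):
        self.base_rate = base_rate      # blinks per second at rest
        self.shift_boost = shift_boost  # extra rate after a gaze shift
        self.boost_s = boost_s          # duration of the boost (s)
        self._boost_left = 0.0

    def on_gaze_shift(self):
        """Call whenever the gaze control system shifts targets."""
        self._boost_left = self.boost_s

    def step(self, dt):
        """Advance the model by dt seconds; return True to blink now."""
        rate = self.base_rate + (self.shift_boost if self._boost_left > 0 else 0.0)
        self._boost_left = max(0.0, self._boost_left - dt)
        # Poisson process: blink probability over a short interval dt.
        return random.random() < rate * dt
```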

Eye Tracking Glasses
Software

7 versions available

Animating Eyes

Year: 2014

Authors: GB Guerrero, A Duchowski

Accurately and efficiently depicting the liveliness of eyes is not an easy task in animation. In this paper, we propose a method that uses eye tracking and motion capture technologies to animate 3D eyes. We evaluate the method in terms of accuracy, efficiency, and user satisfaction.

Eye Tracking Glasses
Software

1 version available:

Binocular Glaucomatous Visual Field Loss and Its Impact on Visual Exploration - A Supermarket Study

Year: 2014

Authors: E Papageorgiou, K Sippel, E Kasneci, W Rosenstiel

Advanced glaucomatous visual field loss may critically interfere with quality of life. The purpose of this study was (i) to assess the impact of binocular glaucomatous visual field loss on a supermarket search task as an example of an everyday living activity, (ii) to identify factors influencing performance, and (iii) to investigate the related compensatory mechanisms. Ten patients with binocular glaucoma (GP) and ten healthy-sighted control subjects (GC) were asked to collect twenty different products, chosen randomly from two supermarket racks, as quickly as possible. Task performance was rated as “passed” or “failed” with regard to the time per correctly collected item. Based on the performance of the control subjects, the threshold value for failing the task was defined as m + 3s (in seconds per correctly collected item). Eye movements were recorded by means of a mobile eye tracker. Eight out of ten patients with glaucoma and all control subjects passed the task. Patients who failed the task needed significantly longer (111.47 s ± 12.12 s) to complete the task than patients who passed (64.45 s ± 13.36 s, t-test, p < 0.001). Furthermore, patients who passed the task showed a significantly higher number of glances towards the visual field defect (VFD) area than patients who failed (t-test, p < 0.05). According to these results, glaucoma patients with defects in the binocular visual field display, on average, longer search times in a naturalistic supermarket task. However, a considerable number of patients, who compensated by frequent glancing towards the VFD, showed successful task performance. Therefore, systematic exploration of the VFD area seems to be a “time-effective” compensatory mechanism in the present supermarket task.
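
The pass/fail criterion is a plain mean-plus-three-standard-deviations cutoff over the control group's times. As a small sketch (whether the authors used the sample or population standard deviation is not stated, so ddof=1 is an assumption):

```python
import numpy as np

def pass_fail_threshold(control_times_per_item):
    """Threshold (seconds per correctly collected item) above which
    a participant fails the task, defined as m + 3s of the control
    group's times. ddof=1 assumes the sample standard deviation."""
    m = np.mean(control_times_per_item)
    s = np.std(control_times_per_item, ddof=1)
    return m + 3 * s
```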

Eye Tracking Glasses
Software

2 versions available

Boundary conditions for information visualization with respect to the user’s gaze

Year: 2014

Authors: M Tönnis, G Klinker

Gaze tracking in Augmented Reality is mainly used to trigger buttons and access information. Such selectable objects are usually placed in the world or in the screen coordinates of a head- or hand-mounted display. Yet no work has investigated options for placing information with respect to the line of sight. This work presents our first steps towards gaze-mounted information visualization and interaction, determining boundary conditions for such an approach. We propose a general concept for information presentation at an angular offset to the line of sight. A user can look around freely while keeping information attached near the line of sight. Whenever the user wants to look at the information and does so, the information is placed directly on the axis of sight for a short time. Based on this concept, we investigate how users understand frames of reference, specifically whether users relate directions and alignments to head or world coordinates. We further investigate whether the information may have a preferred motion behavior. Prototypical implementations of three variants were presented to users in guided interviews. The three variants resemble a rigid offset and two different floating motion behaviors of the information. The floating algorithms implement an inertia-based model and either allow the user's gaze to surpass the information or to push the information with the gaze. Testing our prototypes yielded the findings that users strongly prefer information that maintains a relation to the world, and that less extra motion is preferred.
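
As a rough illustration of the inertia-based floating behaviour described above, one step of a spring-damper update in a single angular dimension might look like this (the offset, stiffness, and damping values are illustrative assumptions, not the paper's parameters):

```python
import math

def update_info_angle(info_angle, gaze_angle, velocity, dt,
                      offset=math.radians(15),
                      stiffness=4.0, damping=3.0):
    """One step of an inertia-based floating behaviour: the
    information label is pulled toward a point at a fixed angular
    offset from the current line of sight by a spring-damper
    model, so it lags behind fast gaze shifts. Angles are 1-D
    (e.g. azimuth) for simplicity."""
    target = gaze_angle + offset
    accel = stiffness * (target - info_angle) - damping * velocity
    velocity += accel * dt
    info_angle += velocity * dt
    return info_angle, velocity
```

Whether the gaze surpasses the information or pushes it can then be modelled by how the target reacts when the gaze direction crosses the information's current angle.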

Eye Tracking Glasses
Software

9 versions available