Publication Hub Archive

Dikablis Glasses

You have reached the Ergoneers Publication Hub for the Dikablis Glasses. All publications can be found in the Publication Hub.

Total results: 509

Analysing People’s Movement in the Built Environment via Space Syntax, Objective Tracking and Gaze Data

Year: 2015

Authors: S Eloy, TP Lázaro Ourique, R Resende, MS Dias

In this paper we use analysis tools from Space Syntax and objective observation of human behaviour to understand the impact of landmarks on the walking patterns of users of spaces. Our case study was a large exterior public open space (a university campus), in which participants could walk freely while simultaneously being tracked by several sensors. We carried out a Space Syntax analysis for this space, then collected Global Positioning System (GPS) tracking information and used a mobile eye-tracking device to acquire eye gaze information. The collected data allowed us to map and analyse each subject’s behaviour in the public space. A more specific analysis was done for four selected landmarks that, according to the Space Syntax analysis, had the highest integration values. Results indicate that landmarks with such higher integration values also attract a larger count of gaze fixations and saccades.
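
As an illustration of the kind of landmark-level gaze analysis this study describes, the Python sketch below counts fixations per landmark by testing whether each fixation falls inside a circular area of interest (AOI). The landmark names, coordinates, radii, and fixation format are hypothetical assumptions for illustration, not data or code from the study.

```python
# Illustrative sketch: counting fixations per landmark AOI.
# Landmark definitions and the fixation format are assumptions.

from math import hypot

# Hypothetical landmark AOIs: name -> (x, y, radius) in map coordinates
landmarks = {
    "main_entrance": (120.0, 45.0, 10.0),
    "library_tower": (300.0, 80.0, 15.0),
}

def fixations_per_landmark(fixations, landmarks):
    """Count fixations whose (x, y) position falls inside a landmark AOI.

    fixations: iterable of (x, y) positions, already projected into
    the same map coordinate frame as the landmarks.
    """
    counts = {name: 0 for name in landmarks}
    for x, y in fixations:
        for name, (cx, cy, r) in landmarks.items():
            if hypot(x - cx, y - cy) <= r:
                counts[name] += 1
    return counts

# Example: two fixations near the entrance, one near the tower
print(fixations_per_landmark([(118, 44), (121, 47), (298, 82)], landmarks))
```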

Eye Tracking Glasses
Software

2 versions available

App analytics: predicting the distraction potential of in-vehicle device applications

Year: 2015

Authors: M Krause, AS Conti, M Henning, C Seubert

Three experiments were conducted to check the feasibility of predicting experimental outcomes of driver distraction studies. The predictions are based on subtask analysis and synthesis. In the first experiment, data (e.g., Total Glance Time, Single Glance Durations and Total Shutter Open Times) were gathered while subjects interacted with touch screen applications. In a second experiment, additional data were gathered about rotary knob interactions. These data were used to synthesize and predict the outcomes of a third (evaluation) experiment, which involved rotary knob and touch screen tasks. The results are promising and can help provide a better understanding of problematic subtasks and reduce the testing of clearly unsuitable applications. The transfer of the procedure to other laboratories is challenging, as the modeling and mapping process includes many subjective decisions.
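
The glance measures named in this abstract have straightforward definitions once gaze samples are labelled as on or off an area of interest (AOI). The sketch below is a minimal, assumed implementation rather than the authors’ code: a single glance is taken as a maximal run of consecutive on-AOI samples, and Total Glance Time as the sum of those durations; the (timestamp, on_aoi) sample format is an assumption about preprocessing.

```python
# Minimal sketch (assumed, not the authors' code): deriving Single Glance
# Durations and Total Glance Time for an AOI from timestamped gaze samples.

def glance_metrics(samples):
    """samples: list of (timestamp_s, on_aoi) tuples, sorted by time.

    A glance is a maximal run of consecutive on-AOI samples; returns the
    list of single glance durations and their sum (Total Glance Time).
    """
    durations = []
    start = None
    for t, on_aoi in samples:
        if on_aoi and start is None:
            start = t                    # a glance begins
        elif not on_aoi and start is not None:
            durations.append(t - start)  # the glance ends
            start = None
    if start is not None:                # glance still open at end of recording
        durations.append(samples[-1][0] - start)
    return durations, sum(durations)

# Example: gaze is on the AOI from 0.0-0.4 s and again from 1.0-1.2 s
samples = [(0.0, True), (0.2, True), (0.4, False), (0.6, False),
           (1.0, True), (1.2, False)]
print(glance_metrics(samples))  # approximately ([0.4, 0.2], 0.6)
```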

Eye Tracking Glasses
Simulator

3 versions available

Black or white? Influence of robot arm contrast on distraction in human-robot interaction

Year: 2015

Authors: J Schmidtler, A Sezgin, T Illa, K Bengler

The conducted study is concerned with the visual appearance of a common industrial robot and its influence on the human worker acting in the same workplace at the very same time. Sixteen volunteers, eight novices and eight experts, participated in the study. Using an eye-tracking system, glance chains while viewing the robot and the number of glances under different contrast conditions of the robot arm were measured while participants worked on a primary and an interactive secondary task. The results of the first part show that human operators perceive a common six-axis industrial robot in a comparable way, from the bottom up to the tool-center-point and back over the arm kinematics. The second part revealed that higher robot-arm contrasts lead to higher distraction, caused by a higher number of glances at the moving robot.

Eye Tracking Glasses
Software

5 versions available

Gaze-based human-computer interaction with geographic information systems (Blickbasierte Mensch-Computer-Interaktion mit Geoinformationssystemen)

Year: 2015

Authors: P Kiefer

The interface between humans and computers is increasingly a focus of geoinformatics research. In practice, too, usability is becoming ever more important for the efficient deployment and broad commercialization of a geographic information system (GIS). This contribution provides a literature overview of gaze-based interaction with (mobile or desktop-based) geographic information. In this form of interaction, eye-tracking systems measure the user’s gaze behaviour, process it in real time, and use it to adapt the user interface. The chapter offers a general introduction to eye-tracking technology and points to current research approaches in the field of gaze-based interaction with geographic information.

Eye Tracking Glasses
Software

2 versions available

Combining direct and indirect touch input for interactive workspaces using gaze input

Year: 2015

Authors: S Voelker, A Matviienko, J Schöning

Interactive workspaces combine horizontal and vertical touch surfaces into a single digital workspace. Explorations of these systems have shown that direct interaction on the vertical surface is cumbersome and less accurate than on the horizontal one. To overcome these problems, indirect touch systems turn the horizontal touch surface into an input device that allows manipulation of objects on the vertical display. If the horizontal touch surface also acts as a display, however, it becomes necessary to tell the system which screen is currently in use, which requires a mode switch. We investigate the use of gaze tracking to perform these mode switches. In three user studies, we compare absolute and relative gaze-augmented selection techniques with the traditional direct-touch approach. Our results show that our relative gaze-augmented selection technique outperforms the other techniques for simple tapping tasks alternating between horizontal and vertical surfaces, and for dragging on the vertical surface. However, when tasks involve dragging across surfaces, the findings are more complex. We provide a detailed description of the proposed interaction techniques, a statistical analysis of them, and a discussion of how they can be applied to systems that combine multiple horizontal and vertical touch surfaces.
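
The core idea of gaze-based mode switching can be stated compactly: a touch on the horizontal surface is routed to whichever display the user is currently looking at. The Python sketch below is a minimal, assumed rendering of that routing logic, not the paper’s implementation; the type names and coordinate convention are invented for illustration.

```python
# Minimal sketch of gaze-based mode switching between two touch surfaces.
# Names and coordinate conventions are assumptions, not the paper's API.

from dataclasses import dataclass

@dataclass
class Touch:
    x: float  # normalized tabletop coordinates in [0, 1]
    y: float

def route_touch(touch: Touch, gaze_target: str):
    """gaze_target: 'vertical' or 'horizontal', as reported by a gaze tracker."""
    if gaze_target == "vertical":
        # Indirect mode: the tabletop acts as a touchpad for the vertical display.
        return ("vertical_display", touch.x, touch.y)
    # Direct mode: the touch manipulates objects on the tabletop itself.
    return ("horizontal_display", touch.x, touch.y)

print(route_touch(Touch(0.3, 0.7), "vertical"))    # routed to the vertical screen
print(route_touch(Touch(0.3, 0.7), "horizontal"))  # handled on the tabletop
```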

Eye Tracking Glasses
Software

6 versions available

Driver-activity recognition in the context of conditionally autonomous driving

Year: 2015

Authors: C Braunagel, E Kasneci, W Stolzmann

This paper presents a novel approach to automated recognition of the driver's activity, which is a crucial factor for determining take-over readiness in conditionally autonomous driving scenarios. To this end, an architecture based on head- and eye-tracking data is introduced in this study and several features are analyzed. The proposed approach is evaluated on data recorded during a driving simulator study with 73 subjects performing different secondary tasks while driving in an autonomous setting. The proposed architecture shows promising results towards in-vehicle driver-activity recognition. Furthermore, a significant improvement in classification performance is demonstrated through the consideration of novel features derived specifically for the autonomous driving context.
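
The general pattern behind such an approach is to summarize head- and eye-tracking signals over time windows into feature vectors and feed them to a classifier. The sketch below illustrates that pattern under stated assumptions: the features, window contents, synthetic data, and choice of a random-forest classifier are illustrative, not the paper’s architecture.

```python
# Hedged sketch: windowed eye/head features fed to a classifier.
# Features, labels, and synthetic data are assumptions for illustration.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(gaze_yaw, head_yaw):
    """Summary statistics over one time window of tracking signals (degrees)."""
    return [np.mean(gaze_yaw), np.std(gaze_yaw),
            np.mean(head_yaw), np.std(head_yaw)]

# Synthetic example: 'monitoring' windows have wide gaze dispersion,
# 'reading' windows have a narrow, downward/sideways-biased gaze.
rng = np.random.default_rng(0)
X = [window_features(rng.normal(0, 15, 100), rng.normal(0, 10, 100)) for _ in range(50)]
X += [window_features(rng.normal(-20, 3, 100), rng.normal(-15, 2, 100)) for _ in range(50)]
y = ["monitoring"] * 50 + ["reading"] * 50

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict([window_features(rng.normal(-20, 3, 100), rng.normal(-15, 2, 100))]))
```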

Eye Tracking Glasses
Simulator

6 versions available

Driving behaviour and driver assistance at traffic light intersections

Year: 2015

Authors: L Rittger

The increasing importance of environmentally friendly and efficient transportation directs the interest of researchers and car manufacturers towards technologies that support an efficient driving style. This thesis presents the development of a traffic light assistance system with a focus on human factors. The system aims at supporting drivers in approaching traffic light intersections efficiently. In three driving simulator studies, the content-related research covered the investigation of the unassisted driving task, the influence of the system on the driver’s perception of the interaction with other road users, and the information strategy of the human-machine interface (HMI).

When the traffic light phase changes or when visibility is limited, drivers prepare driving behaviour that is not appropriate for the traffic light phase at arrival at the intersection; these situations offer the greatest potential for the assistance system. The traffic light assistant is able to change driving behaviour. However, the expectation of other road users’ emotional reactions influences driver compliance: in situations in which drivers expected to bother others with their driving behaviour, compliance with the traffic light assistant was low. Further, the deviations of driver behaviour from the target strategy of the traffic light assistant are lowest when the HMI includes the two information units target speed and action recommendation. Traffic light phase information in the HMI is subjectively important information for drivers. The results point towards presenting all three information units.

The method-related research covered the development of a method for measuring drivers’ information demand for dynamic stimuli. While driving, specific stimuli are action-relevant for drivers, i.e., they need to be processed in order to decide on the appropriate driving behaviour. Eye tracking has been the standard method for measuring information demand while driving. The novel MARS (Masking Action Relevant Stimuli) method measures information demand by masking the dynamic action-relevant stimulus in the driving environment or in the vehicle; to unmask the stimulus for a fixed interval, drivers press a button on the steering wheel. In the present thesis, two driving simulator studies evaluated the MARS method, measuring information demand for the traffic light phasing and for the in-vehicle display of the traffic light assistant. The analyses demonstrate that variations in the experimental conditions influence the information demand measured with the MARS method in a way qualitatively similar to their influence on fixations measured by eye tracking. Due to its simple application, the MARS method represents a promising tool for transportation research.
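
The MARS measurement itself reduces to logging unmask-button presses. The sketch below shows, under stated assumptions, how such a press log could be turned into simple information-demand measures and how stimulus visibility at a given time would be determined; the fixed 1-second unmask interval and the summary measures are invented for illustration, not taken from the thesis.

```python
# Illustrative sketch of the MARS measurement idea: the stimulus stays
# masked, a button press unmasks it for a fixed interval, and the press
# log quantifies information demand. Interval and log format are assumed.

UNMASK_DURATION_S = 1.0  # assumed fixed unmask interval

def information_demand(press_times, drive_duration_s):
    """Summarize demand from unmask-button press timestamps (seconds)."""
    n = len(press_times)
    return {
        "unmask_requests": n,
        "requests_per_minute": 60.0 * n / drive_duration_s,
        "total_unmasked_s": n * UNMASK_DURATION_S,
    }

def stimulus_visible(t, press_times):
    """True if the stimulus is unmasked at time t."""
    return any(p <= t < p + UNMASK_DURATION_S for p in press_times)

presses = [12.4, 15.1, 33.0, 58.7]      # hypothetical press log
print(information_demand(presses, 120.0))
print(stimulus_visible(15.5, presses))   # True: within 1 s of the 15.1 s press
```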

Simulator
Software

1 version available

Driving with glaucoma: task performance and gaze movements

Year: 2015

Authors: TC Kübler, E Kasneci, W Rosenstiel

Purpose: The aim of this pilot study was to assess the driving performance and the visual search behavior, that is, eye and head movements, of patients with glaucoma in comparison to healthy-sighted subjects during a simulated driving test. Methods: Driving performance and gaze behavior of six glaucoma patients and eight healthy-sighted age- and sex-matched control subjects were compared in an advanced driving simulator. All subjects underwent a 40-minute driving test including nine hazardous situations on city and rural roads. Fitness to drive was assessed by a masked driving instructor according to the requirements of the official German driving test. Several driving performance measures were investigated: lane position, time to line crossing, and speed. Additionally, eye and head movements were tracked and analyzed. Results: Three out of six glaucoma patients passed the driving test, and their driving performance was indistinguishable from that of the control group. Patients who passed the test showed increased visual exploration in comparison to patients who failed; that is, they showed an increased number of head and gaze movements toward eccentric regions. Furthermore, patients who failed the test showed a rightward bias in average lane position, probably in an attempt to maximize the safety margin to oncoming traffic. Conclusions: Our study suggests that a considerable subgroup of subjects with binocular glaucomatous visual field loss shows safe driving behavior in a virtual-reality environment, because they adapt their viewing behavior by increasing their visual scanning. Hence, binocular visual field loss does not necessarily impair driving safety. We therefore recommend more individualized driving assessments that take into account the patient’s ability to compensate.
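
Of the driving measures listed in this abstract, time to line crossing (TLC) is the least self-explanatory. A common first-order approximation divides the remaining lateral distance to the lane boundary being approached by the current lateral speed; the sketch below implements that approximation as a hedged illustration (the lane geometry and sign convention are assumptions, and the study’s exact computation may differ).

```python
# Hedged sketch: first-order time-to-line-crossing (TLC) approximation.
# Sign convention (positive = rightward) and lane geometry are assumed.

def time_to_line_crossing(lateral_offset_m, lane_half_width_m, lateral_speed_ms):
    """Distance to the boundary being approached, divided by lateral speed.

    lateral_offset_m: vehicle position relative to lane centre (right positive).
    Returns float('inf') when there is no lateral drift.
    """
    if lateral_speed_ms == 0:
        return float("inf")
    if lateral_speed_ms > 0:   # drifting right
        distance = lane_half_width_m - lateral_offset_m
    else:                      # drifting left
        distance = lane_half_width_m + lateral_offset_m
    return distance / abs(lateral_speed_ms)

# Example: 0.4 m right of lane centre, 1.75 m half-width, drifting right at 0.3 m/s
print(time_to_line_crossing(0.4, 1.75, 0.3))  # 4.5 s until the right line
```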

Eye Tracking Glasses
Simulator

7 versions available

Driving with homonymous visual field defects: Driving performance and compensatory gaze movements

Year: 2015

Authors: TC Kübler, E Kasneci, W Rosenstiel

The aim of this pilot study was to assess the driving performance and its relationship to the visual search behavior, i.e., eye and head movements, of patients with homonymous visual field defects (HVFDs) in comparison to healthy-sighted subjects during a simulated driving test. Eight HVFD patients and six healthy-sighted age- and gender-matched control subjects underwent a 40-minute driving test with nine hazardous situations. Eye and head movements were recorded during the drive. Four out of eight patients passed the driving test and showed a driving performance similar to that of the control group. One control group subject failed the test. Patients who passed the test showed an increased number of head and eye movements. Patients who failed the test showed a rightward bias in average lane position, probably in an attempt to maximize the safety margin to oncoming traffic. Our study supports the hypothesis that a considerable subgroup of subjects with HVFDs shows safe driving behavior, because they adapt their viewing behavior through increased visual scanning.

Eye Tracking Glasses
Simulator

8 versions available

Ergonomic design of the gauge cluster display for commercial trucks

Year: 2015

Authors: T Kim, J Park, J Choe, ES Jung

Objective: The purpose of this study is to determine the priority of information presentation and the most effective menu type for the center of a gauge cluster display for commercial trucks, and to present a set of ergonomic designs for the gauge cluster display. Background: An effective ergonomic design is specifically needed for the development of the gauge cluster display for commercial trucks, because more diverse and heavier information is delivered to truck drivers than to passenger car drivers. Method: First, all the information that must be shown on the commercial truck display was collected. Then, the severity, frequency of use, and display design parameters were evaluated for that information by commercial truck drivers. Next, an analysis of the information attributes and a heuristic evaluation based on display design principles were carried out. Based on the results, a design alternative for the main screen was constructed by priority, and a comparative analysis between the alternative and the existing main screen was conducted to assess the efficacy of the designs. Lastly, we conducted an experiment for the selection of the menu type, using a driving simulator with an eye-tracking device. The independent variable was the menu type, with four levels reflecting commercial truck characteristics: grid, icon, list, and flow. We measured preference, total execution time, total duration of fixations on the gauge cluster area, and total number of fixations on the gauge cluster area as dependent variables. Results: Four types of driver convenience information and six types of driver assistance information were selected as the information to be placed primarily on the main screen of the gauge cluster. The grid type was the most effective among the menu types. Conclusion: In this study, the information that appears on the main screen of the display, the division of the display, and the design of the menu type for commercial truck drivers were suggested. Application: This study is expected to serve as a guideline for the ergonomic design of a gauge cluster display for commercial trucks.

Eye Tracking Glasses
Simulator

4 versions available