Publication Hub Archive

GPS

You have reached the Ergoneers Publication Hub for: Used Tool > GPS

Total results: 129

SubsMatch 2.0: Scanpath comparison and classification based on subsequence frequencies

Year: 2017

Authors: TC Kübler, C Rothe, U Schiefer, W Rosenstiel

Our eye movements are driven by a continuous trade-off between the need for detailed examination of objects of interest and the necessity to keep an overview of our surroundings. As a consequence, behavioral patterns that are characteristic of our actions and their planning are typically manifested in the way we move our eyes to interact with our environment. Identifying such patterns from individual eye movement measurements is, however, highly challenging. In this work, we tackle the challenge of quantifying the influence of experimental factors on eye movement sequences. We introduce an algorithm for extracting sequence-sensitive features from eye movements and for classifying eye movements based on the frequencies of small subsequences. Our approach is evaluated against the state of the art on a novel and very rich collection of eye movement data derived from four experimental settings, ranging from static viewing tasks to highly dynamic outdoor scenarios. Our results show that the proposed method is able to classify eye movement sequences over a variety of experimental designs. The choice of parameters is discussed in detail, with special focus on highlighting different aspects of general scanpath shape. Algorithms and evaluation data are available at: http://www.ti.uni-tuebingen.de/scanpathcomparison.html.
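
The core idea, classifying scanpaths by the relative frequencies of their short subsequences, can be sketched in a few lines. The snippet below is a minimal illustration, not the published SubsMatch implementation: it assumes the scanpath has already been discretized into a letter string (e.g., binned fixation positions, as SubsMatch does), and the function names and the L1 histogram distance are illustrative choices.

```python
from collections import Counter

def subsequence_frequencies(scanpath, n=2):
    """Relative frequencies of all length-n subsequences of a scanpath.

    `scanpath` is assumed to be pre-processed into a string whose letters
    encode binned fixation properties (position, duration, ...).
    """
    grams = [scanpath[i:i + n] for i in range(len(scanpath) - n + 1)]
    if not grams:
        return {}
    total = len(grams)
    return {g: c / total for g, c in Counter(grams).items()}

def histogram_distance(p, q):
    """L1 distance between two subsequence-frequency histograms."""
    keys = set(p) | set(q)
    return sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

# Two letter-coded scanpaths; a larger distance means more dissimilar
# viewing behavior. A classifier can use the frequency vectors as features.
a = subsequence_frequencies("ABABABCCAB", n=2)
b = subsequence_frequencies("CCCABCCCBA", n=2)
print(histogram_distance(a, b))
```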

Eye Tracking Glasses
Software

7 versions available

Virtual eye height and display height influence visual distraction measures in simulated driving conditions

Year: 2017

Authors: P Larsson, J Engström, C Wege

Glance behaviour towards in-vehicle visual displays is likely not only a result of the design of the display itself but is also influenced by other factors, such as the position of the display and characteristics of the surrounding road scene. In the current study, it was hypothesized that both display position and simulator view would affect a driver’s glance behaviour. A simulator study was conducted in which 25 participants drove in a highway scenario while performing three different tasks on a smartphone positioned at two different heights. Two different simulator views were used: one corresponding to the view from the driver’s seat of a truck and the other to the view from the driver’s seat of a car. A within-group design was used with simulator view, smartphone position, and task as factors. Results showed that type of view and display position to some extent influenced glance behaviour as well as subjective ratings of driving performance. These results may have implications for eye glance measurement procedures as well as for guidelines relating to driver distraction, e.g. that simulated road scenes must correspond to the vehicle class that the device under test is intended for.

Eye Tracking Glasses
Simulator

2 versions available

Visual dominance in pilots during recovery from upset

Year: 2017

Authors: T Schnell, C Reuter, M Cover

We conducted an unusual attitude recovery flight test in an instrumented L-29 fighter jet trainer owned by the Operator Performance Laboratory (OPL), using commercial airline first-officer participants who had not yet achieved the rank of captain on any aircraft, had no military flight training background, and had no acrobatic training in their flight background. Two test spirals were conducted, with 15 participants serving in Spiral 1 and 12 participants serving in Spiral 2. Spiral 1 was a screening study and is not discussed in this paper. We investigated whether Synthetic Vision Systems (SVS) could enhance the pilot's ability to recognize and recover from unusual attitude (UA) conditions compared to present-day Electronic Flight Information Systems (EFIS). Additionally, we investigated the effect of display field of view (FOV, 12 degrees and 30 degrees) and whether recoveries with SVS over open water caused any problems in the recognition of the aircraft attitude. The evaluation pilot (EP) participants were seated in the rear crew station of the L-29, which had electronic displays that showed the test symbology. The canopy had a view-limiting device which eliminated all view of the outside world. Carefully designed unusual attitude entry conditions were developed for this flight test and administered by the safety pilot (SP) while the EP had their eyes closed and their hands on their laps. On the command of the SP, the EPs opened their eyes and recovered from the unusual attitude (90 degrees angle of bank, 40 degrees nose low). The results indicate that response times (time from opening the eyes to making the first input) were statistically significantly (F(1,104) = 4.14, p = 0.044) longer in the SVS display condition when the wide FOV was used. We determined that some of the lake features on the SVS caused confusion with the sky, thus resulting in longer response times. However, while the response times were longer with the wide FOV SVS, the recovery times were statistically significantly shorter (F(1,105) = 4.06, p = 0.046), and the SVS-Wide display condition overall produced less altitude loss (2,531 ft) when compared to all other conditions on average (2,722 ft). This flight test investigated many aspects of recovery with standard EFIS and SVS in real flight conditions, using an acrobatic-capable aircraft and significant unusual attitude entry conditions. Recommendations are made with regard to managing the depiction of water features on SVS. Flight technically, recoveries were better with wide FOV SVS than with narrow FOV SVS or standard EFIS. Subjectively, EPs clearly preferred the wide FOV SVS.

Simulator
Software

2 versions available

Digital Technologies in Architecture and Engineering: Exploring an engaged interaction within curricula

Year: 2016

Authors: S Eloy, MS Dias, PF Lopes, E Vilar

This chapter focuses on the development and adoption of new Multimedia, Computer Aided Design, and other ICT technologies for both Architecture and Computer Science curricula and highlights the multidisciplinary work that can be accomplished when these two areas work together. The authors describe in detail the educational skills addressed and the related research, and highlight the contributions towards improving teaching and learning in those areas. This chapter discusses the role of digital technologies, such as Virtual Reality, Augmented Reality, Multimedia, and 3D Modelling software systems, as well as design processes and their evaluation tools, such as Shape Grammar and Space Syntax, within the Architecture curricula.

Simulator
Software

9 versions available

Distracted driving: scientific basis for risk assessments of driver’s workplaces

Year: 2016

Authors: B Gross, S Birska, M Bretschneider

At professional drivers' workplaces, mobile devices are used as telematics applications for information exchange between dispatchers and drivers. In addition to their wide-ranging benefits, they nevertheless create potential for new risks, such as driver distraction. The present study is based on conditions encountered in an existing company in the passenger transport sector and is part of a consultation by the Institute for Occupational Safety and Health, Germany, to support the implementation of a risk assessment regarding the telematics software in use. In order to analyze the impact of the telematics application on driving performance and visual processing, the study employed two driving simulation sessions (LCT, rFactor 1) and one eye-tracking session. Results indicated that the examined application may be considered tolerable in terms of the AAM criteria for In-Vehicle Information and Communication Systems.

Simulator
Software

1 version available

Driver Demand: Eye Glance Measures

Year: 2016

Authors: S Seaman, L Hsieh, R Young

This study investigated driver glances while engaging in infotainment tasks in a stationary vehicle during surrogate driving: watching a driving video recorded from a driver’s viewpoint and projected on a large screen, performing a lane-tracking task, and performing the Tactile Detection Response Task (TDRT) to measure attentional effects of secondary tasks on event detection and response. Twenty-four participants were seated in a 2014 Toyota Corolla production vehicle with the navigation system option. They performed the lane-tracking task using the vehicle’s steering wheel, fitted with a laser pointer to indicate wheel movement on the driving video. Participants simultaneously performed the TDRT and a variety of infotainment tasks, including Manual and Mixed-Mode versions of Destination Entry and Cancel, Contact Dialing, Radio Tuning, Radio Preset selection, and other Manual tasks. Participants also completed the 0- and 1-Back pure auditory-vocal tasks. Glances were recorded using an eye tracker and validated by manual inspection. Glances were classified as on-road (i.e., looking through the windshield) or off-road (i.e., to locations other than through the windshield). Three off-road glance metrics were tabulated and scored using the NHTSA Guidelines methods: Mean Single Glance Duration (MSGD), Total Eyes-Off-Road Time (TEORT), and Long Glance Proportion (LGP). These metric values were compared between the task conditions and a 30-s Baseline condition with no task. Mixed-Mode tasks did not have a statistically significantly longer MSGD or TEORT, or a higher LGP, than Baseline (except for Mixed-Mode Destination Entry), whereas all the Manual tasks did. Mixed-Mode tasks improved compliance with the NHTSA Guidelines.
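
The three glance metrics named here are simple functions of the individual off-road glance durations. The sketch below is a minimal illustration, not the study's scoring code; it assumes the 2 s long-glance threshold from the NHTSA visual-manual guidelines, and the function name and data layout are illustrative.

```python
def glance_metrics(off_road_glances_s, long_glance_threshold_s=2.0):
    """NHTSA-style off-road glance metrics for one task.

    off_road_glances_s: durations (in seconds) of the individual off-road
    glances recorded during the task. The 2 s long-glance threshold is
    assumed from the NHTSA visual-manual guidelines.
    """
    n = len(off_road_glances_s)
    if n == 0:
        return {"MSGD": 0.0, "TEORT": 0.0, "LGP": 0.0}
    msgd = sum(off_road_glances_s) / n            # Mean Single Glance Duration
    teort = sum(off_road_glances_s)               # Total Eyes-Off-Road Time
    lgp = sum(d > long_glance_threshold_s
              for d in off_road_glances_s) / n    # Long Glance Proportion
    return {"MSGD": msgd, "TEORT": teort, "LGP": lgp}

# Example: five off-road glances during a hypothetical radio-tuning task.
print(glance_metrics([0.8, 1.2, 2.4, 0.9, 1.1]))
# -> MSGD ≈ 1.28 s, TEORT ≈ 6.4 s, LGP = 0.2
```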

Eye Tracking Glasses
Simulator

2 versions available

Gaze augmentation in egocentric video improves awareness of intention

Year: 2016

Authors: D Akkil, P Isokoski

Video communication using head-mounted cameras could be useful to mediate shared activities and support collaboration. The growing popularity of wearable gaze trackers presents an opportunity to add gaze information to the egocentric video. We hypothesized three potential benefits of gaze-augmented egocentric video for collaborative scenarios: supporting deictic referencing, enabling grounding in communication, and enabling better awareness of the collaborator's intentions. Previous research on using egocentric videos for real-world collaborative tasks has failed to show clear benefits of gaze point visualization. We designed a study, deconstructing a collaborative car navigation scenario, to specifically target the value of gaze-augmented video for intention prediction. Our results show that viewers of gaze-augmented video could predict the direction taken by a driver at a four-way intersection more accurately and more confidently than viewers of the same video without the superimposed gaze point. Our study demonstrates that gaze augmentation can be useful and encourages further study in real-world collaborative scenarios.

Eye Tracking Glasses
Software

3 versions available

NaviLight: investigating ambient light displays for turn-by-turn navigation in cars

Year: 2016

Authors: A Matviienko, A Löcken, A El Ali, W Heuten

Car navigation systems typically combine multiple output modalities; for example, GPS-based navigation aids show a real-time map or feature spoken prompts indicating upcoming maneuvers. However, the drawback of graphical navigation displays is that drivers have to explicitly glance at them, which can distract from the situation on the road. To decrease driver distraction while driving with a navigation system, we explore the use of ambient light as a navigation aid in the car, in order to shift navigation aids to the periphery of human attention. We investigated this by conducting studies in a driving simulator, where we found that drivers spent significantly less time glancing at the ambient light navigation aid than at a GUI navigation display. Moreover, ambient light-based navigation was perceived to be easy to use and understand, and was preferred over traditional GUI navigation displays. We discuss the implications of these outcomes for automotive personal navigation devices.

Simulator
Software

9 versions available

On the visual distraction effects of audio-visual route guidance

Year: 2016

Authors: T Kujala, H Grahn, J Mäkelä, A Lasch

This is the first controlled quantitative analysis of the visual distraction effects of audio-visual route guidance in simulated but ecologically realistic driving scenarios with dynamic maneuvers and self-controlled speed (N = 24). The audio-visual route guidance system under testing passed the set verification criteria, which were based on drivers' preferred occlusion distances on the test routes. There were no significant effects of the location of an upcoming maneuver instruction (up or down) on the in-car display on any metric or on the experienced workload. The drivers' median occlusion distances correlated significantly with their median in-car glance distances. There was no correlation between drivers' median occlusion distance and intolerance of uncertainty, but significant inverse correlations were found between occlusion distances and both age and driving experience. The findings suggest that the visual distraction effects of audio-visual route guidance are low, and they provide general support for the proposed testing method.

Simulator
Software

2 versions available

Supporting Wayfinding Through Mobile Gaze-Based Interaction

Year: 2016

Authors: I Giannopoulos

Wayfinding in unfamiliar environments often requires the use of assistance aids. Humans utilize navigation aids to make the correct spatial decisions in order to reach their destination. The main purpose of these aids is to minimize the complexity (e.g., cognitive load) of a decision, which varies with the number of possible options at a decision point, the abilities of the wayfinder, and the environmental information that can be incorporated in a wayfinding instruction. Several wayfinding assistance aids require the user's visual attention in order to convey the provided information. The interaction with such assistance aids might increase the complexity of decision making, with an impact on wayfinding performance. Furthermore, wayfinding aids that require the user's visual attention distract the user from the surrounding environment, with an impact on safety as well as on spatial knowledge acquisition. The focus of this dissertation lies in the investigation of self-localization and navigation in urban environments utilizing eye tracking technology, as well as in the investigation, implementation, and evaluation of gaze-based wayfinding assistance systems. The main aim was to identify problems that occur during aided wayfinding, focusing on the visual interaction with mobile devices and the environment. Through novel gaze-based interaction approaches with mobile devices and the environment, it was possible to address problems concerning visual attention switches away from the surrounding environment and to provide solutions and directions for novel assistance systems that reduce interaction with the device to a minimum, redirecting visual attention to the surrounding environment and improving spatial knowledge acquisition, performance, and usability.

Eye Tracking Glasses
Software

1 version available