Publication Hub Archive

GSR

You have reached the Ergoneers Publication Hub for:

Used Tool > GSR

Find all Publications here:

Publication Hub

Total results: 49

Evaluating the interactive effects of responsible drinking messages and attentional bias on actual drinking behaviours

Year: 2017

Authors: D Frings, G Eskian, I Albery, T Moss

Responsible drinking messages (RDMs) are often used as a key tool to reduce alcohol-related harms. Posters are a common form of RDM, displayed in places such as bars, bus stops and toilet cubicles. However, some recent research suggests RDMs may not have the desired effect of reducing levels of consumption. It is not known how environmental factors (e.g. the number of alcohol-related cues in a given environment) or individual difference measures (such as prior drinking behaviour and beliefs, or attentional bias towards alcohol-related stimuli) influence interactions with RDMs, nor how these factors affect their efficacy. This research explored these issues by having participants view RDMs either in a bar-laboratory (i.e. a 'fake bar' inside our research facility) or a traditional psychology laboratory cubicle. The key findings of the research are: (1) posters in general, and RDMs in particular, are poorly attended to in bar environments; (2) attentional biases towards alcohol influence the allocation of visual attention that is consciously controlled and effortful, but not visual attention that is automatic. Variations at the level of individual drinkers (such as prior drinking history or alcohol expectancies) were also associated with the direction of visual attention towards actual alcoholic drinks. This research has implications for the optimal placement of RDMs. It also highlights the sensitivity of such messages to changes in content. Theoretical implications include new questions around the relationship between attentional bias and other forms of attention, and the importance of cue saturation in understanding when attentional bias affects other cognitive and behavioural processes.

Eye Tracking Glasses
Software

3 versions available

Pupil response as an indicator of hazard perception during simulator driving

Year: 2017

Authors: F Vintila, TC Kübler, E Kasneci

We investigate the pupil response to hazard perception during driving simulation. Complementary to gaze movement and physiological stress indicators, pupil size changes can provide valuable information on traffic hazard perception with a relatively low temporal delay. We tackle the challenge of identifying those pupil dilation events associated with hazardous events in a noisy signal by a combination of wavelet transformation and machine learning: features of the wavelet components serve as training data for a support vector machine. We further demonstrate how to utilize the method for the analysis of actual hazard perception and how it may differ from the behavioral driving response.
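The pipeline described above (wavelet features feeding a classifier) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window length, the Haar wavelet, the per-level energy features and the nearest-centroid classifier standing in for the support vector machine are all simplifying assumptions.

```python
import numpy as np

def haar_detail_energy(window, levels=3):
    """Feature vector: energy of Haar wavelet detail coefficients per level."""
    x = np.asarray(window, dtype=float)
    feats = []
    for _ in range(levels):
        n = (len(x) // 2) * 2                         # even length for pairing
        approx = (x[0:n:2] + x[1:n:2]) / np.sqrt(2.0)
        detail = (x[0:n:2] - x[1:n:2]) / np.sqrt(2.0)
        feats.append(float(np.sum(detail ** 2)))
        x = approx
    return np.array(feats)

def fit_centroids(X, y):
    """Train: one mean feature vector (centroid) per class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, feats):
    """Label of the nearest centroid (Euclidean distance)."""
    return min(centroids, key=lambda c: np.linalg.norm(feats - centroids[c]))

# Toy pupil traces: flat (label 0) vs. an abrupt dilation step (label 1).
rng = np.random.default_rng(0)
flat = [rng.normal(4.0, 0.05, 64) for _ in range(20)]
step = [np.concatenate([rng.normal(4.0, 0.05, 27),
                        rng.normal(4.6, 0.05, 37)]) for _ in range(20)]
X = np.array([haar_detail_energy(w) for w in flat + step])
y = np.array([0] * 20 + [1] * 20)

model = fit_centroids(X, y)
test_window = np.concatenate([rng.normal(4.0, 0.05, 27),
                              rng.normal(4.6, 0.05, 37)])
print(predict(model, haar_detail_energy(test_window)))   # → 1
```

The dilation step produces a large detail coefficient at the pair straddling the step, so the step class has clearly higher detail energy than the flat class at every level.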

Eye Tracking Glasses
Simulator

13 versions available

Number of Display Units in Power Plant Control Rooms–Design Recommendations as a Derivation of Eye Tracking Studies

Year: 2016

Authors: R Kockrow, A Hoppe

In control rooms of coal-fired power plants, the visual behavior of operators during controlling and supervisory tasks was analyzed in a broad-based study using eye tracking techniques. This study focused on the usage of installed display units. The objects of investigation were control stations under real conditions, equipped with varying numbers of display units. During real system operation and under simulated conditions, the visual behavior of 104 operators was captured objectively. In line with the formulated hypotheses, a Visual Comfort Zone for operator tasks was confirmed by statistical evaluation. This zone is individually distinctive and independent of the number of installed display units. Certain requirements allow such preferences in organizing visual behavior during supervisory tasks. Practical relevance: the findings show that exceeding a specific number of displays leads to decreasing usage of any additionally installed displays. From an ergonomic point of view, the justification and limits of visualization concepts with a large number of displays can be deduced. As a result of this study, a demand-actuated strategy for information visualization could be developed.

Eye Tracking Glasses
Software

1 version available

Investigating the mechanisms underlying fixation durations during the first year of life: a computational account

Year: 2015

Authors: IR Saez de Urabain

Infants’ eye-movements provide a window onto the development of cognitive functions over the first years of life. Despite considerable advances in the past decade, studying the mechanisms underlying infant fixation duration and saccadic control remains a challenge due to practical and technical constraints in infant testing. This thesis addresses these issues and investigates infant oculomotor control by presenting novel software and methods for dealing with low-quality infant data (GraFIX), a series of behavioural studies involving novel gaze-contingent and scene-viewing paradigms, and computational modelling of fixation timing throughout development. In a cross-sectional study and two longitudinal studies, participants were eye-tracked while viewing dynamic and static complex scenes, and performed gap-overlap and double-step paradigms. Fixation data from these studies were modelled in a number of simulation studies with the CRISP model of fixation durations in adults in scene viewing. Empirical results showed how fixation durations decreased with age for all viewing conditions but at different rates. Individual differences between long- and short-lookers were found across visits and viewing conditions, with static images being the most stable viewing condition. Modelling results confirmed the CRISP theoretical framework’s applicability to infant data and highlighted the influence of both cognitive processing and the developmental state of the visuo-motor system on fixation durations during the first few months of life. More specifically, while the present work suggests that infant fixation durations reflect on-line perceptual and cognitive activity similarly to adults, the individual developmental state of the visuo-motor system still affects this relationship until 10 months of age. 
Furthermore, results suggested that infants are already able to program saccades in two stages at 3.5 months: (1) an initial labile stage subject to cancellation and (2) a subsequent non-labile stage that cannot be cancelled. The length of the non-labile stage decreased relative to the labile stage especially from 3.5 to 5 months, indicating a greater ability to cancel saccade programs as infants grew older. In summary, the present work provides unprecedented insights into the development of fixation durations and saccadic control during the first year of life and demonstrates the benefits of mixing behavioural and computational approaches to investigate methodologically challenging research topics such as oculomotor control in infancy.
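The two-stage saccade-programming account above can be illustrated with a toy simulation in the spirit of the CRISP framework. This is a heavily simplified sketch, not the thesis's actual model: the gamma-distributed stage durations and all timing parameters are hypothetical.

```python
import numpy as np

def simulate_fixation(rng, timer_mean, labile_mean, nonlabile_mean):
    """One fixation duration under a simplified CRISP-style model: a random
    timer repeatedly starts saccade programs; a program still in its labile
    stage is cancelled by the next timer event, a non-labile one is not."""
    gamma = lambda mean: rng.gamma(4.0, mean / 4.0)   # mildly skewed durations
    t = gamma(timer_mean)                  # first program starts
    while True:
        labile = gamma(labile_mean)
        next_event = gamma(timer_mean)     # time until the next timer firing
        if next_event >= labile:           # survives labile stage -> executes
            return t + labile + gamma(nonlabile_mean)
        t += next_event                    # cancelled; a new program starts

rng = np.random.default_rng(1)
# Hypothetical parameter sets: slower timer and longer stages for younger
# infants, shorter stages for older infants (all values invented).
young = [simulate_fixation(rng, 0.30, 0.25, 0.12) for _ in range(2000)]
older = [simulate_fixation(rng, 0.20, 0.15, 0.05) for _ in range(2000)]
print(round(float(np.mean(young)), 3), round(float(np.mean(older)), 3))
```

Shortening the stage durations shortens mean fixation duration, mirroring the empirical decrease with age; the labile/non-labile split determines which programs can still be cancelled.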

Eye Tracking Glasses
Software

4 versions available

Prediction of take-over time in highly automated driving by two psychometric tests

Year: 2015

Authors: M Körber, T Weißgerber, L Kalb, C Blaschke, M Farid

In this study, we investigated whether the driver's ability to take over vehicle control while engaged in a secondary task (Surrogate Reference Task) can be predicted by the driver's multitasking ability and reaction time. 23 participants performed a multitasking test and a simple response task, then drove highly automated on a highway for about 38 min and encountered five take-over situations. Data analysis revealed significant correlations between multitasking performance and take-over time as well as gaze distributions for Situations 1 and 2, even when reaction time was controlled for. This correlation diminished beginning with Situation 3, but a stable difference between the worst multitaskers and the best multitaskers persisted. Reaction time was not a significant predictor in any situation. The results can be seen as evidence for stable individual differences in dual-task situations in automated driving, but they also highlight effects associated with the experience of a take-over situation.
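Correlating a predictor with take-over time while controlling for reaction time corresponds to a partial correlation, which can be sketched as follows. The data here are synthetic and the effect sizes invented; only the partial-correlation procedure itself is illustrated.

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y with z partialled out: correlate the residuals
    of regressing x on z and y on z (ordinary least squares)."""
    Z = np.column_stack([np.ones(len(z)), z])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

# Synthetic data: multitasking score, simple reaction time, take-over time.
rng = np.random.default_rng(2)
n = 23
rt = rng.normal(0.45, 0.05, n)              # reaction time (s)
multi = rng.normal(50.0, 10.0, n)           # multitasking test score
takeover = 4.0 - 0.03 * multi + 0.5 * rt + rng.normal(0.0, 0.2, n)

print(round(partial_corr(multi, takeover, rt), 2))   # strongly negative
```

Here better multitaskers take over faster even after removing any linear influence of simple reaction time, which is the pattern the study reports for the early take-over situations.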

Eye Tracking Glasses
Simulator

16 versions available

An experimental eye-tracking study for the design of a context-dependent social robot blinking model

Year: 2014

Authors: A Zaraki, MB Dehkordi, D Mazzei

Human gaze and blinking behaviours have recently been considered as a way to empower humanlike robots to convey realistic behaviour in social human-robot interaction. This paper reports the findings of our investigation of human eye-blinking behaviour in relation to gaze behaviour in human-human interaction. These findings can then be used to design a humanlike eye-blinking model for a social humanlike robot. In an experimental eye-tracking study, we showed 11 participants a 7-minute video of a social interaction between two people and collected their eye-blinking and gaze behaviours with an eye-tracker. Analysing the collected data, we measured participants' blinking rate, maximum and minimum blinking duration, the number of frequent (multiple) blinks, and the participants' gaze directions in the environment. The results revealed that participants' blinking rate in a social interaction is qualitatively correlated with gaze behaviour: a higher number of gaze shifts increased the blinking rate. Based on the findings of this study, we propose a context-dependent blinking model as an important component of the robot's gaze control system, which can empower our robot to mimic human blinking behaviour in multiparty social interaction.
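A context-dependent blinking model of the kind proposed can be sketched as a blink rate that rises briefly after each gaze shift. All rates and window lengths below are hypothetical, and the sampling uses standard Poisson-process thinning rather than the authors' model.

```python
import numpy as np

def blink_times(gaze_shift_times, duration, base_rate, boost, window, rng):
    """Sample blink times from an inhomogeneous Poisson process whose rate
    rises by `boost` for `window` seconds after each gaze shift (thinning)."""
    peak = base_rate + boost
    t, blinks = 0.0, []
    while True:
        t += rng.exponential(1.0 / peak)         # candidate event at peak rate
        if t >= duration:
            return blinks
        near_shift = any(0.0 <= t - s < window for s in gaze_shift_times)
        rate = base_rate + (boost if near_shift else 0.0)
        if rng.random() < rate / peak:           # thin: keep with prob rate/peak
            blinks.append(t)

rng = np.random.default_rng(3)
shifts = [float(s) for s in np.arange(2.0, 58.0, 4.0)]   # frequent gaze shifts
few = sum(len(blink_times([], 60.0, 0.2, 1.0, 2.0, rng)) for _ in range(20))
many = sum(len(blink_times(shifts, 60.0, 0.2, 1.0, 2.0, rng)) for _ in range(20))
print(few, many)    # more gaze shifts -> more blinks
```

This reproduces the paper's qualitative finding, that more gaze shifts yield a higher blinking rate, as a generative model a robot controller could sample from.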

Eye Tracking Glasses
Software

7 versions available

Temporal multimodal data synchronisation for the analysis of a game driving task using EEG

Year: 2014

Authors: A Sivanathan, T Lim, S Louchart, J Ritchie

Multimodal data channels such as bio-physiological signals are increasingly used in game-play studies to better understand players’ behaviours and their motivations. It is, however, difficult to perform any conclusive analysis based solely on bio-physiological signals, due to the complex nature of the epistemic, semiotic and ergotic activities surrounding in-game activities and the artefacts facilitating player immersion. A combined analysis of multiple data streams, including in-game data and bio-physiological signals, is therefore indispensable for producing contextualised information from which a deep analysis of game mechanics and their effects can be performed. Precise synchronisation in capturing multiple streams is required to generate valid inter-stream correlations and meaningful information. Typically there are no automatic mechanisms built into the game architecture or into commercial data logging systems for multimodal data synchronisation and data fusion. This paper presents a novel and generic technique for accurately synchronising multiple temporal data streams, based on inducing identifiable signature pulses in the data channels. The technique is applied, and its capabilities exhibited, using a driving game simulation as an exemplar, in which drivers’ in-game behavioural data are synchronised and correlated with their temporal brain activity. In this initial study, the simplex method borrowed from linear programming is used to correlate driving patterns with brain activity, allowing user behaviour to be studied in relation to learning of the driving track.
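The core idea of recovering the offset between two streams from an injected signature pulse can be sketched with cross-correlation. The pulse shape, noise level and offset below are invented for illustration; the paper's actual pulse design and alignment procedure may differ.

```python
import numpy as np

def find_offset(sig_a, sig_b):
    """Lag (in samples) that best aligns sig_b with sig_a, via cross-correlation."""
    a = sig_a - np.mean(sig_a)
    b = sig_b - np.mean(sig_b)
    corr = np.correlate(a, b, mode="full")
    return int(np.argmax(corr)) - (len(b) - 1)

# Two loggers record the same injected signature pulse; stream B started
# 37 samples later, so its copy of the pulse appears 37 samples earlier.
rng = np.random.default_rng(4)
pulse = np.concatenate([np.zeros(5), np.ones(3), np.zeros(5)])   # signature

def stream(pulse_start, n, rng):
    s = rng.normal(0.0, 0.1, n)                  # sensor noise
    s[pulse_start:pulse_start + len(pulse)] += pulse
    return s

a = stream(100, 500, rng)
b = stream(63, 500, rng)
print(find_offset(a, b))   # → 37
```

Once the lag is known, every sample in stream B can be re-timestamped relative to stream A, which is what makes valid inter-stream correlations possible.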

Eye Tracking Glasses
Software

8 versions available

Vishnoo—An open-source software for vision research

Year: 2011

Authors: E Tafaj, TC Kübler, J Peter, W Rosenstiel

Visual input is perhaps the most important sensory information. Understanding its mechanisms, as well as the way visual attention arises, could be highly beneficial for many tasks involving the analysis of users' interaction with their environment. We present Vishnoo (Visual Search Examination Tool), an integrated framework that combines configurable search tasks with gaze tracking capabilities, thus enabling the analysis of both the visual field and visual attention. Our user studies underpin the viability of such a platform. Vishnoo is open-source software and is available for download at http://www.vishnoo.de/

Eye Tracking Glasses
Software

6 versions available

Integration of a Component Based Driving Simulator and Design of Experiments on Multimodal Driver Assistance

Year: 2006

Authors: D Popiv

Fully automated driving is a long-term goal of current research in the automotive industry, and this thesis deals with driving support systems at different levels of automation. Fully integrated multimodal approaches for such systems aim to provide intuitive, minimally distracting assistance for car drivers. This thesis describes the design and implementation of three driving support concepts in a driving simulator, each representing assistance at a different level of driving automation. The non-automated concept is based on driving without any automation support provided to the driver. Semi-automated support is represented by an Active Cruise Control concept, in which the driver takes the role of a supervisor and delegates part of the driving tasks to the system. The third concept is the Active Gas Pedal, in which the driver is offered support by the driving system but is still required to perform the driving tasks personally. Lateral and longitudinal visual assistance is also incorporated into the implementation of all three concepts. To test these concepts, a fixed-base driving simulator was set up, whose architecture is explained in this thesis as well: its hardware components, the corresponding interfacing software applications, and their networking are described. The software system architecture of the driving simulator is explained, the development process of the driving support systems is introduced, and the implementation information needed for further extensions of the software system is provided. Finally, the experimental design for the user study is described, so that the experiment only needs to be executed.
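As an illustration of the Active Gas Pedal idea, a haptic counterforce might grow as the time headway to a lead vehicle shrinks. This is a hypothetical control law for illustration only, not the thesis's implementation; the headway threshold and maximum force are invented parameters.

```python
def pedal_counterforce(gap_m, speed_mps, min_headway_s=1.5, max_force_n=40.0):
    """Counterforce (N) on the gas pedal that grows linearly as the time
    headway to the lead vehicle drops below a comfort threshold."""
    if speed_mps <= 0.0:
        return 0.0
    headway = gap_m / speed_mps              # time headway in seconds
    if headway >= min_headway_s:
        return 0.0                           # comfortable gap: no resistance
    return max_force_n * (1.0 - headway / min_headway_s)

print(pedal_counterforce(60.0, 30.0))    # 2.0 s headway -> 0.0 N
print(pedal_counterforce(22.5, 30.0))    # 0.75 s headway -> 20.0 N
```

The driver still controls acceleration directly, but the rising resistance nudges them to ease off the pedal, which is the kind of support-without-automation the concept describes.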

Simulator
Software

2 versions available