A ground-truth data set and a classification algorithm for eye movements in 360-degree videos
The segmentation of a gaze trace into its constituent eye movements has been actively researched since the early days of eye tracking. As we move towards more naturalistic viewing conditions, the segmentation becomes even more challenging and convoluted as more complex patterns emerge. The definitions and the well-established methods that were developed for monitor-based eye tracking experiments are often not directly applicable to unrestrained set-ups such as eye tracking in wearable contexts or with head-mounted displays. The main contributions of this work to the eye movement research for 360° content are threefold: First, we collect, partially annotate, and make publicly available a new eye tracking data set, which consists of 13 participants viewing 15 video clips that are recorded in 360°. Second, we propose a new two-stage pipeline for ground truth annotation of the traditional fixations, saccades, smooth pursuits, as well as (optokinetic) nystagmus, vestibulo-ocular reflex, and pursuit of moving objects performed exclusively via the movement of the head. A flexible user interface for this pipeline is implemented and made freely accessible for use or modification. Lastly, we develop and test a simple proof-of-concept algorithm for automatic classification of all the eye movement types in our data set based on their operational definitions that were used for manual annotation. The data set and the source code for both the annotation tool and the algorithm are publicly available at https://web.gin.g-node.org/ioannis.agtzidis/360_em_dataset.
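The abstract names the eye movement classes but does not spell out the classification algorithm; as a rough illustration of the kind of operational-definition-based labelling it refers to, the sketch below assigns individual gaze samples from eye-in-head, gaze-in-world and head speeds. The thresholds, signal names and decision order are assumptions for illustration, not the authors' published method (nystagmus detection, for instance, is omitted).

```python
# Naive sample-wise classifier in the spirit of the operational definitions listed in the
# abstract. All thresholds (deg/s) are illustrative assumptions.
SACCADE_SPEED = 100.0   # eye-in-head speed above which a sample is called a saccade
EYE_MOVING = 7.0        # eye-in-head speed above which the eye counts as moving
GAZE_MOVING = 7.0       # gaze-in-world speed above which gaze follows something in the scene
HEAD_MOVING = 7.0       # head speed above which the head counts as moving

def classify_sample(eye_speed, gaze_speed, head_speed):
    """Label one gaze sample from eye-in-head, gaze-in-world and head speeds (deg/s)."""
    if eye_speed > SACCADE_SPEED:
        return "saccade"
    if eye_speed > EYE_MOVING and gaze_speed > GAZE_MOVING:
        return "smooth_pursuit"      # eye and gaze both track a moving target
    if eye_speed > EYE_MOVING and head_speed > HEAD_MOVING:
        return "VOR"                 # eye counter-rotates against the head, gaze stays put
    if eye_speed <= EYE_MOVING and head_speed > HEAD_MOVING and gaze_speed > GAZE_MOVING:
        return "head_pursuit"        # target followed by head movement alone
    return "fixation"                # nystagmus detection would need a dedicated pass

def classify_trace(eye_speeds, gaze_speeds, head_speeds):
    """Classify every sample of a recording (three equal-length speed sequences)."""
    return [classify_sample(e, g, h)
            for e, g, h in zip(eye_speeds, gaze_speeds, head_speeds)]

# Toy usage with three synthetic samples (deg/s):
print(classify_trace([2.0, 150.0, 20.0], [1.0, 140.0, 25.0], [0.5, 5.0, 3.0]))
# ['fixation', 'saccade', 'smooth_pursuit']
```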
Eye Tracking Glasses
Software
A user experience‐based toolset for automotive human‐machine interface technology development
The development of new automotive Human-Machine Interface (HMI) technologies must consider the competing and often conflicting demands of commercial value, User Experience (UX) and safety. Technology innovation offers manufacturers the opportunity to gain commercial advantage in a competitive and crowded marketplace, leading to an increase in the features and functionality available to the driver. User response to technology influences the perception of the brand as a whole, so it is important that in-vehicle systems provide a high-quality user experience. However, introducing new technologies into the car can also increase accident risk. The demands of usability and UX must therefore be balanced against the requirement for driver safety. Adopting a technology-focused business strategy carries a degree of risk, as most innovations fail before they reach the market. Obtaining clear and relevant information on the UX and safety of new technologies early in their development can help to inform and support robust product development (PD) decision making, improving product outcomes. In order to achieve this, manufacturers need processes and tools to evaluate new technologies, providing customer-focused data to drive development. This work details the development of an Evaluation Toolset for automotive HMI technologies encompassing safety-related functional metrics and UX measures. The Toolset consists of four elements: an evaluation protocol, based on methods identified from the Human Factors, UX and Sensory Science literature; a fixed-base driving simulator providing a context-rich, configurable evaluation environment, supporting both hardware and software-based technologies; a standardised simulation scenario providing a repeatable basis for technology evaluations, allowing comparisons across multiple technologies and studies; and a technology scorecard that collates and presents evaluation data to support PD decision making processes. The Evaluation Toolset was applied in three technology evaluation case studies, conducted in conjunction with the industrial partner, Jaguar Land Rover. All three were live technology development projects, representing hardware and software concepts with different technology readiness levels. Case study 1 evaluated a software-based voice messaging system with reference to industry guidelines, confirming its performance and identifying potential UX improvements. Case study 2 compared three touchscreen technologies, identifying user preferences and highlighting specific usability issues that would not have been found through analytical means. Case study 3 evaluated autostereoscopic 3D displays, assessing the effectiveness of 3D information while highlighting design considerations for long-term use and defining a design space for 3D content. Findings from the case studies, along with learning from visits to research facilities in the UK and USA, were used to validate and improve the toolset, with recommendations made for implementation into the PD workflow. The driving simulator received significant upgrades as part of a new interdisciplinary research collaboration with the Department of Psychology to support future research activity.
Findings from the case studies also directly supported their respective technology development projects; the main outcome from the research therefore is an Evaluation Toolset that has been demonstrated, through application to live technology development projects, to provide valid, relevant and detailed information on holistic evaluations of early-phase HMI technologies, thus adding value to the PD process.
Abstracts of the 20th European Conference on Eye Movements, 18-22 August 2019, in Alicante (Spain)
Previous research has shown that when individuals are asked questions referring to visual stimuli seen before, their eye movements spontaneously return to the visual area where the stimuli were first seen. This recurring eye movement phenomenon has been shown to assist the memory retrieval of visual images. It is thus possible that oculomotor dysfunction can account for visual memory deficits, due to a diminished tendency to retrieve the same eye movement pattern. Considering that patients with traumatic brain injury (TBI) characteristically suffer from visual memory deficits, oculomotor dysfunction, which is also typical among this population, may partially account for this memory impairment. In this study, 27 healthy individuals and 27 patients with TBI from the Lowenstein Rehabilitation Hospital participated in a memory task. Participants were first exposed to stimuli and were then asked questions about the displayed stimuli. The testing session was conducted for each participant under two conditions: (1) while the eyes were free to move over the screen; (2) while the eyes were fixated. Study findings show that the control group significantly benefitted from the free-viewing condition in comparison to the fixed-viewing condition, while this effect was absent in the TBI group. This was corroborated by the eye tracking data showing that participation in ...
Eye Tracking Glasses
Software
Automatic Prediction of Health Literacy through an eye tracker
The Web is one of the main sources of information, and health-related topics are among the most searched: a 2013 study found that 72% of web users in the USA search online for health information, and European statistics show that in 2017 almost every country had more than 50% of individuals searching online for health information. Given this high percentage and the great diversity of people searching for health topics online, it is difficult to return the most appropriate results to everyone. Health literacy is the degree to which an individual has the capacity to obtain, communicate, process and understand basic health information and services in order to make appropriate health decisions. It is important because, among other things, it determines the capacity individuals have to manage their own health. Regarding health information retrieval, it influences how people understand the documents retrieved from the Web, and by understanding an individual's health literacy it may be possible to customize the results returned on the search engine results page (SERP). Health literacy can be assessed with instruments that have been validated in previous studies, all of them based on questionnaires administered to respondents. This study is therefore relevant because it attempts to predict health literacy non-intrusively, using an eye tracker. Eye tracking is the process of measuring either the point of gaze (where one is looking) or the motion of an eye relative to the head. Because eye tracking technology allows us to assess how long users spend looking at specific components of an object, we can determine which aspects of a web page are important to the user. Eye tracking is used in many different areas of research and plays an important role in each of them. Furthermore, one of the largest eye-tracker vendors (SensoMotoric Instruments) was acquired by Apple, which suggests that this technology will be integrated into Apple products; this will give even more visibility to eye tracking technology and increase the short-term impact of this study. This work aims to understand, through the use of an eye tracker, whether eye movements vary according to a person's health literacy during the search for health-related topics on the Web. The study consisted of an experiment with users in a controlled environment. Simulated work-task situations were created to motivate searching for health-related topics, and the eye movements of two groups of users, with high (adequate) and low (inadequate) health literacy, were recorded and compared. By analyzing the points where a person looks on a specific web page, or a predefined set of pages, we examined whether there is a pattern relating participants' health literacy to their eye movements on those pages, and the data were analyzed to understand whether there is a relation between health literacy and how individuals view health-related information on the Web. Finally, this study led to the conclusion that participants with higher health literacy were more careful when trying to retrieve information about health conditions, spending more time on SERPs and giving more importance to the source of the content presented to them.
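As a purely hypothetical illustration of the gaze analysis described above, the snippet below sums fixation durations inside assumed areas of interest (AOIs) of a SERP, the kind of dwell-time measure one could compare between the high- and low-literacy groups; the AOI layout, coordinates and data format are invented for the example.

```python
# Minimal sketch (not the study's actual analysis pipeline): dwell time per assumed SERP AOI.
from collections import defaultdict

AOIS = {                         # (x_min, y_min, x_max, y_max) in screen pixels, assumed layout
    "result_title":   (100, 150, 900, 200),
    "result_source":  (100, 200, 900, 225),   # URL / source line under each title
    "result_snippet": (100, 225, 900, 320),
}

def aoi_of(x, y):
    """Return the AOI containing point (x, y), or None."""
    for name, (x0, y0, x1, y1) in AOIS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def dwell_times(fixations):
    """Sum fixation durations (ms) per AOI for one participant; fixations are (x, y, ms)."""
    totals = defaultdict(float)
    for x, y, duration in fixations:
        name = aoi_of(x, y)
        if name is not None:
            totals[name] += duration
    return dict(totals)

# Toy usage: one participant with three fixations.
fixations = [(400, 180, 220.0), (400, 210, 450.0), (400, 300, 600.0)]
print(dwell_times(fixations))
# {'result_title': 220.0, 'result_source': 450.0, 'result_snippet': 600.0}
```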
Eye Tracking Glasses
Software
Characterizing the visuomotor behaviour of upper limb body-powered prosthesis users
In recent years, significant research attention in the field of upper limb myoelectric prostheses has focused on improvements in control and integration of sensory feedback, which is hoped to reduce the visual attention and cognitive demand of operating these devices. However, there is currently no standard protocol for assessing the efficacy of these innovations by quantifying their impact on a user's visuomotor behaviour. Furthermore, the visuomotor behaviour of individuals using prevailing upper limb prosthetic technologies (namely, body-powered prostheses) is not well understood. The primary objective of this thesis work was to characterize the visuomotor behaviour of a sample of body-powered prosthesis users to better understand current demands of traditional prostheses, as a future comparator to emerging prosthetic technologies. Five transradial body-powered prosthesis users and three transhumeral body-powered prosthesis users completed two functional upper limb tasks while their eye gaze behaviour and movement patterns were tracked using motion capture and eye-tracking technologies. Combined data from these systems were analyzed using a custom software tool that allowed for automatic and precise quantification of a number of outcome metrics relating to task performance, eye gaze behaviour, eye-hand coordination and quality of movement. Results for each body-powered prosthesis user were compared to a set of normative outcomes previously established under the same experimental protocol for twenty able-bodied individuals. Relative to the normative data set, trends in behaviour emerged across the body-powered prosthesis users. The body-powered prosthesis users consistently took longer to complete the tasks and exhibited decreased end effector movement quality, as evidenced by increased numbers of movement units. The prosthesis users also tended to dedicate more visual attention to their terminal device, especially after picking up an object, and occasionally while reaching for an object. However, while transporting an object, they would eventually transition their gaze to the object drop-off location before their terminal device arrived there, rather than glancing back and forth between this target and their terminal device in flight. Despite similarity in behavioural trends across the body-powered prosthesis users, there was variability between them, reflecting differences in skill level, strategy, and level of amputation. Differences between the two upper limb tasks also appeared to elicit different visuomotor behaviours and pose unique challenges for individuals with different levels of amputation. Further data collection is required to increase the sample size and improve understanding of how the behaviour described in this thesis compares with other prosthesis user populations, such as myoelectric prosthesis users. However, these findings on the visuomotor behaviour of body-powered prosthesis users, and the technical development undertaken to accomplish this analysis, represent an important contribution. This work will be useful in assessing the efficacy of current and future innovations in upper limb prosthesis technology, which should in turn help to improve the state of technology available to individuals with upper limb loss.
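The custom analysis tool itself is not described here; as a hedged sketch of the kind of outcome metrics it might compute from synchronized eye-tracking and motion-capture data, the snippet below derives the share of gaze samples on the terminal device and an eye-arrival latency at the drop-off target. The record fields, AOI labels and arrival threshold are assumptions for illustration, not the thesis' implementation.

```python
# Illustrative visuomotor metrics from synchronized gaze + motion-capture samples.
# Each sample is assumed to carry a timestamp (s), a gaze AOI label, and the distance (m)
# of the terminal device from the drop-off target.

def percent_gaze_on(samples, aoi="terminal_device"):
    """Share of samples (in %) whose gaze label matches the given AOI."""
    if not samples:
        return 0.0
    hits = sum(1 for s in samples if s["gaze_aoi"] == aoi)
    return 100.0 * hits / len(samples)

def eye_arrival_latency(samples, target_aoi="drop_off", arrive_dist=0.05):
    """Seconds between gaze first landing on the drop-off target and the terminal device
    arriving within `arrive_dist` metres of it (positive = eyes lead the hand)."""
    gaze_t = next((s["t"] for s in samples if s["gaze_aoi"] == target_aoi), None)
    hand_t = next((s["t"] for s in samples if s["dist_to_target"] <= arrive_dist), None)
    if gaze_t is None or hand_t is None:
        return None
    return hand_t - gaze_t

# Toy usage: gaze reaches the drop-off target 0.4 s before the terminal device does.
samples = [
    {"t": 0.0, "gaze_aoi": "terminal_device", "dist_to_target": 0.60},
    {"t": 0.4, "gaze_aoi": "drop_off",        "dist_to_target": 0.30},
    {"t": 0.8, "gaze_aoi": "drop_off",        "dist_to_target": 0.04},
]
print(percent_gaze_on(samples), eye_arrival_latency(samples))   # prints 33.33... and 0.4
```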
Eye Tracking Glasses
Software
Derivation of a model of safety critical transitions between driver and vehicle in automated driving
In automated driving, there is the risk that users must take over the vehicle guidance despite a potential lack of involvement in the driving task. This publication presents an initial model of control distribution between users and the automated system. In this model, the elements of the control distribution in automated driving are addressed together with possible and safe transitions between different driving modes. Furthermore, the approach is initially validated empirically. In a driving study in which participants performed both the driving task and a non-driving-related task, objective driving data as well as eye-tracking parameters are used to estimate the model's accuracy. Such an explanatory model can serve as a first approach to describing potential concepts of cooperation between users and automated vehicles. In this way, prospective road traffic concepts could be improved by preventing safety-critical transitions between the driver and the vehicle.
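The publication's model itself is not reproduced here; the following sketch merely illustrates how such a control-distribution model could be encoded as a set of permissible mode transitions gated by an eye-tracking-derived attention check. The mode names and transition rules are assumptions for illustration, not the authors' model.

```python
# Assumed driving modes and which handovers between driver and automation are permitted.
ALLOWED_TRANSITIONS = {
    # current mode           -> modes that may follow it
    "manual":                  {"assisted", "automated"},
    "assisted":                {"manual", "automated"},
    "automated":               {"assisted", "takeover_request"},   # system asks driver to resume
    "takeover_request":        {"manual", "minimal_risk_manoeuvre"},
    "minimal_risk_manoeuvre":  {"manual"},
}

def is_safe_transition(current, target, driver_eyes_on_road):
    """A transition is permitted if the model lists it and, whenever control returns to the
    driver, gaze monitoring confirms the driver is attending to the road."""
    if target not in ALLOWED_TRANSITIONS.get(current, set()):
        return False
    if target == "manual" and not driver_eyes_on_road:
        return False   # unsafe: driver not visually engaged with the driving scene
    return True

# Toy usage:
print(is_safe_transition("automated", "takeover_request", driver_eyes_on_road=False))  # True
print(is_safe_transition("takeover_request", "manual", driver_eyes_on_road=False))     # False
```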
Eye Tracking Glasses
Software
Design and development of a state-of-the-art Universal Laboratory for Virtual Reality, Real-time Simulation and Human-Machine Interaction
LUT University, LUT School of Energy Systems, LUT Mechanical Engineering. Vladislav Vasilev. Master's thesis, 2019. 100 pages, 43 figures, 12 tables and 3 appendices. Examiners: Professor Heikki Handroos and D.Sc. (Tech.) Hamid Roozbahani. Keywords: Laboratory of Intelligent Machines, Simulation laboratory, Visualization, Projection systems, Display systems, Virtual Environment. This Master's thesis focuses on the design and development of the LUT Laboratory of Intelligent Machines as a cutting-edge simulation laboratory covering HMI, VR and real-time research areas. Composing the future laboratory concepts and studying the possible options for the visualization system were the priority tasks. The developed visualization system was presented in two options: a display-based system and a front-projection system. Information about the equipment, its requirements and compatibility was gathered from open sources, project meetings and consulting. The Display Visualization Platform and Projection System were designed based on average human eye properties, industry guidelines and standards. As a result of the project, a technical comparison of the studied equipment was made and procurement documents were prepared. Two feasible concepts for the Visualization Platform are suggested in this paper. The obtained results can be applied in further business and academic activities of the Laboratory and the University in general.
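As a back-of-the-envelope illustration of designing a visualization platform around average human eye properties, the snippet below estimates the pixel pitch and horizontal resolution needed so that individual pixels stay below a roughly one-arcminute acuity limit at a chosen viewing distance; the screen dimensions, viewing distance and acuity figure are generic assumptions, not values from the thesis.

```python
import math

def max_pixel_pitch_mm(viewing_distance_m, acuity_arcmin=1.0):
    """Largest pixel pitch (mm) that stays below the eye's assumed angular resolution."""
    angle_rad = math.radians(acuity_arcmin / 60.0)
    return viewing_distance_m * math.tan(angle_rad) * 1000.0

def min_horizontal_pixels(screen_width_m, viewing_distance_m, acuity_arcmin=1.0):
    """Minimum horizontal pixel count for a screen of the given width at that distance."""
    pitch_m = max_pixel_pitch_mm(viewing_distance_m, acuity_arcmin) / 1000.0
    return math.ceil(screen_width_m / pitch_m)

# Toy usage: a 2 m wide projection screen viewed from 2.5 m.
print(round(max_pixel_pitch_mm(2.5), 3), min_horizontal_pixels(2.0, 2.5))
# 0.727 (mm pitch) and 2751 (horizontal pixels)
```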
Eye Tracking Glasses
Simulator
Detecting and Identifying Real and Decoy Tanks in a Computer Screen: Evidence from Stimuli Sensitivity and Eye-Tracking
In modern warfare, as well as in reconnaissance operations, it is highly important on the one hand to hide and protect one's own troops and, on the other hand, to find and target the enemy. Target identification is often based on visual examination of video or still images produced, for example, by Unmanned Aerial Vehicles (UAVs) or other means of Intelligence, Surveillance, and Reconnaissance (ISR). In the present study we examined the perception, detection and identification of real and decoy tanks among a total of 28 participants, using reaction time tests and eye-tracking recordings during categorization tasks with images of tanks (real vs. fake; without vs. with camouflage). We found, among other things, that fake and camouflaged images of tanks, as compared to real and non-camouflaged images, decreased identification speed. We also found that camouflaged images elicited more attention shifting between the image and the background than non-camouflaged images. We argued that this is probably because camouflage blurs the image contour and sharpness, so people seek cues for categorization by switching between the image and the background. The results are important for understanding the perception and identification of military visual objects on displays and can be used, for example, to optimize decoys as well as, in connection with detection, to configure display settings.
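As a minimal sketch of the attention-shifting measure reported above (assumed fixation format and bounding box, not the study's analysis code), the snippet counts how often consecutive fixations switch between the tank image region and the surrounding background.

```python
def aoi_sequence(fixations, image_box):
    """Map each fixation (x, y) to 'image' or 'background' given the image bounding box."""
    x0, y0, x1, y1 = image_box
    return ["image" if x0 <= x <= x1 and y0 <= y <= y1 else "background"
            for x, y in fixations]

def count_switches(labels):
    """Number of transitions between consecutive, differing AOI labels."""
    return sum(1 for a, b in zip(labels, labels[1:]) if a != b)

# Toy usage: gaze goes image -> background -> image -> image, i.e. two switches.
fixations = [(420, 300), (60, 80), (450, 310), (470, 305)]
print(count_switches(aoi_sequence(fixations, image_box=(300, 200, 600, 400))))  # 2
```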
Eye Tracking Glasses
Software