ExCuSe: Robust pupil detection in real-world scenarios
The reliable estimation of the pupil position is one of the most important prerequisites in gaze-based HMI applications. Despite the rich landscape of image-based methods for pupil extraction, tracking the pupil in real-world images is highly challenging due to variations in the environment (e.g., changing illumination conditions or reflections), in eye physiology, and in further sources of noise (e.g., contact lenses or mascara). We present a novel algorithm for robust pupil detection in real-world scenarios, based on edge filtering and oriented histograms calculated via the Angular Integral Projection Function. An evaluation on over 38,000 new, hand-labeled eye images from real-world tasks and 600 images from related work showed the outstanding robustness of our algorithm in comparison to the state of the art.
Eye Tracking Glasses
Software
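The Angular Integral Projection Function mentioned above integrates image intensities along parallel lines at a given orientation, producing a 1D profile in which the dark pupil shows up as a dip. As a rough, hypothetical illustration (a minimal sketch, not the authors' implementation), a plain-Python integral projection at angle theta might look like this:

```python
import math

def angular_integral_projection(img, theta_deg):
    # img: 2D list of grayscale values. Each pixel is projected onto the
    # axis at angle theta through the image centre; pixels falling into
    # the same projection bin are averaged, i.e. intensities are
    # integrated along lines perpendicular to theta.
    h, w = len(img), len(img[0])
    theta = math.radians(theta_deg)
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    half = int(math.ceil(math.hypot(cx, cy)))
    bins = [0.0] * (2 * half + 1)
    counts = [0] * (2 * half + 1)
    for y in range(h):
        for x in range(w):
            # signed coordinate of this pixel along the theta axis
            p = (x - cx) * math.cos(theta) + (y - cy) * math.sin(theta)
            b = int(round(p)) + half
            bins[b] += img[y][x]
            counts[b] += 1
    # mean intensity per bin; bins hit by no pixel stay at 0.0
    return [s / c if c else 0.0 for s, c in zip(bins, counts)]
```

On a toy bright image containing a dark square, the minimum of the 0° projection lands on the bins covering the dark region, which is the kind of cue an oriented-histogram pupil detector can exploit.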
Exploiting the potential of eye movements analysis in the driving context
Driving is a complex and highly visual task. With the development of high-end eye-tracking devices, numerous studies over the last two decades have investigated the eye movements of drivers to identify deficits in visual search patterns and to derive assistive, informative, and entertainment systems. However, little is known about visual behavior during autonomous driving, where the driver can be involved in other tasks but still has to remain attentive enough to resume control of the vehicle when needed. This work aims at exploiting the potential of eye movement analysis in the autonomous driving context. In a pilot study, we investigated whether the type of secondary task in which the driver is engaged can be recognized solely from the driver's eye movement parameters. Furthermore, we discuss several applications of eye movement analysis to future autonomous driving approaches, e.g., automatically detecting whether the driver is attentive and, when required, guiding her visual attention towards the driving task.
Eye Tracking Glasses
Software
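Recognizing the secondary task from eye movement parameters, as in the pilot study above, is at its core a classification problem over per-window gaze features. The abstract does not describe the classifier used; as a purely hypothetical sketch, a nearest-centroid rule over invented features (mean fixation duration in ms, saccade rate in Hz) could look like this:

```python
import math

def nearest_centroid_classify(train, query):
    # train: {task_label: list of feature vectors}; query: one vector.
    # The feature choice here (mean fixation duration, saccade rate) is
    # an assumption for illustration, not taken from the study.
    centroids = {}
    for label, vecs in train.items():
        n = len(vecs)
        # component-wise mean of this task's training vectors
        centroids[label] = [sum(col) / n for col in zip(*vecs)]

    def dist(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

    # assign the query to the task whose centroid is closest
    return min(centroids, key=lambda lbl: dist(centroids[lbl], query))
```

Any off-the-shelf classifier (SVM, random forest) would slot into the same feature-vector framing; nearest-centroid is only the simplest stand-in.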
Eye glance analysis of the surrogate tests for driver distraction
The purpose of this study was to examine the eye glance patterns associated with Detection Response Tasks (DRTs) for the assessment of driver distraction during simulated driving. Several types of DRTs across visual, tactile, and haptic modalities were used by the ISO Driving Distraction working group to investigate driver distraction. As part of the working group, we conducted a simulated driving study examining driver performance while engaging in the primary driving task together with visual-manual or auditory-verbal secondary tasks. The eye glance analysis showed that the visual DRTs increased the visual load of driving more than the tactile DRT: the visual DRTs marginally increased the total glance time to the forward view by 6.27 s and significantly increased the detection response time by 135.79 ms compared to the tactile DRT. As for the secondary tasks, the visual-manual secondary task yielded significantly longer total eyes-off-the-road time (effect size = 50.75 ms), as well as longer DRT response times (effect size = 55.85 ms), than the auditory-verbal ones. This study allowed us to examine the relationships between rated situational awareness, DRT performance, and glance patterns, yielding insights into the relationship between objective task performance measures and subjective ratings.
Eye Tracking Glasses
Simulator
Eye movement synthesis with 1/f pink noise
Eye movements are an essential part of non-verbal behavior. Non-player characters (NPCs), as they occur in many games, communicate with the player through dialogues and non-verbal behavior and can have a strong influence on the player experience or even on gameplay. In this paper we propose a procedural model to synthesize the subtleties of eye motions. More specifically, our model adds microsaccadic jitter and pupil unrest, both modeled by 1/f (pink) noise, to the standard main sequence. In a perceptual two-alternative forced-choice (2AFC) experiment we explore the perceived naturalness of different parameters of pink noise by comparing synthesized motions to rendered motion of recorded eye movements at extreme close shot and close shot distances. Our results show that, on average, data-driven motion is perceived as most natural, followed by parameterized pink noise, with motion lacking microsaccadic jitter being consistently selected as the least natural in appearance.
Eye Tracking Glasses
Software
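A common, cheap way to obtain the 1/f noise that drives such microsaccadic jitter and pupil unrest is the Voss-McCartney algorithm, which sums several white-noise rows updated at octave-spaced rates. The sketch below shows only that generic noise generator (an assumption for illustration, not the paper's implementation); the paper's model would feed samples like these into its jitter and pupil-unrest channels:

```python
import random

def pink_noise(n_samples, n_rows=8, seed=42):
    # Voss-McCartney: row k is refreshed every 2**k samples, so slow
    # rows add low-frequency energy and the summed spectrum falls off
    # at roughly 3 dB/octave, approximating 1/f.
    rng = random.Random(seed)
    rows = [rng.uniform(-1.0, 1.0) for _ in range(n_rows)]
    samples = []
    for i in range(n_samples):
        for k in range(n_rows):
            if i % (2 ** k) == 0:
                rows[k] = rng.uniform(-1.0, 1.0)
        # average keeps the output bounded in [-1, 1]
        samples.append(sum(rows) / n_rows)
    return samples
```

Scaling such a sequence to a fraction of a degree and adding it to the gaze direction gives a plausible stand-in for microsaccadic jitter; an FFT-based 1/f shaping filter is the usual higher-fidelity alternative.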
Eye tracking and gaze interface design for pervasive displays
The integration of pervasive displays in public, semi-public, and private spaces has created opportunities for ambient information dissemination, public engagement, and interactive experiences. Eye tracking technology offers a novel approach to interacting with these displays by enabling gaze-based input. This dissertation investigates the design and implementation of gaze interfaces for pervasive displays, with the aim of enhancing user experience and interaction efficiency. Through a series of studies, the research explores the technical challenges, usability aspects, and potential applications of eye tracking in various display contexts. The findings provide valuable insights into the feasibility and effectiveness of gaze-based interaction, paving the way for innovative applications in digital signage, smart environments, and beyond.
Eye Tracking Glasses
Software
Eye Tracking Devices to Combat Distracted Driving
Distracted driving has become a national-scale problem, and it is especially likely to occur in a state such as California, which is home to a vast number of motorists. Regardless of location, distracted driving is a growing topic of discussion in public policy because of the dangers associated with it and its potentially deadly effects on many lives; for these reasons it is a noteworthy issue to address. Many individuals may be familiar with the concept and its hazards to some degree, yet the epidemic continues to grow. Although efforts have been made to deter distracted driving through legal interventions, these efforts are not producing satisfactory results: statistical findings show that both the number of citations issued by law enforcement and the number of traffic collisions have increased year by year since the distracted driving laws were passed. Additionally, the current laws apply to only some forms of distracted driving, while other forms remain legal. This study informs the reader about these other forms of distracted driving, the frequency at which they occur, and the devastation they incur. It also asks whether California's current method of combating the issue is effective and proposes an alternative that addresses it more effectively. While some motorists recognize and respect the dangers of distracted driving and do not engage in it, too many others do not hold such regard and continue the practice. Regardless, a new approach is needed: eye-tracking technology integrated into automobiles as a tool to ensure that all motorists, regardless of outlying factors, avoid the activity and adopt responsible driving habits, in order to better preserve the lives of all individuals.
Eye-tracking technology in vehicles: application and design
This work analyses eye-tracking technology and, as an outcome, presents an idea for implementing it, along with other kinds of technology, in vehicles. The main advantage of such an implementation would be to increase safety while driving. The setup and methodology used for detecting human activity and interaction by means of eye-tracking technology are investigated. Research in this area is growing rapidly and its results are used in a variety of cases. The main reasons for this growth are the steadily falling prices of the necessary special equipment, the portability available in some cases, and the ease of use that makes the technology far more user-friendly than it was a few years ago. The whole idea of eye tracking is to track the movements of the eyes in an effort to determine the direction of the gaze, using sophisticated software and purpose-built hardware. This manuscript gives a brief introduction to the history of eye monitoring, presenting the very early scientific approaches used to better understand human eye movements while tracking an object or during an activity. Next comes an overview of the theory and methodology used to track a specific object, including a short presentation of the image-processing and machine-learning procedures used to accomplish such tasks. Thereafter, we further analyse the specific eye-tracking technologies and techniques used nowadays and the characteristics that affect the choice of eye-tracking equipment; for an appropriate choice, the area of research interest in which the equipment will be used must be taken into account. In addition, the main categories of eye-tracking applications are presented, and we shortlist the latest state-of-the-art commercial eye-tracking systems.
We then present our first approach, describing an eye-tracking device that could be used in vehicles to offer much better safety standards by controlling various parameters, continuously checking the readiness of the driver, and alerting the driver to potential imminent collisions. Finally, we describe how a device, in our case an eye tracker, can be connected to an automobile's system.
Eye Tracking Glasses
Software
Gaze estimation on glasses-based stereoscopic displays
Glasses-based 3D displays, such as those used in stereoscopic cinema or 3D monitors, are currently the most common form of 3D display. However, they are often reported to cause headaches and discomfort. One of the reasons for this is the vergence–accommodation conflict, where the binocular stimulus of one’s eyes rotating to look at a point is decoupled from the monocular stimulus of the eye lenses focusing on a point. This discomfort could be decreased by estimating the depth of a person’s gaze, and simulating a depth-of-field effect contingent on where they are looking in 3D space. In this dissertation, I investigate gaze estimation on such glasses-based 3D displays. Furthermore, I explore the feasibility of this gaze estimation with realistic constraints, such as low cost, low complexity hardware, free head motion, and real-time gaze estimates. I propose several algorithms for eye tracking and gaze estimation which are designed to work robustly and accurately despite these constraints. Firstly, I present a pupil detection approach which can accurately detect the pupil contour in difficult, off-axis images such as those captured by my eye cameras, which are attached underneath the frame of a pair of glasses. My algorithm is robust to occlusions such as eyelashes and eyelids, and operates in real-time. I evaluate it using a manually labelled dataset, and show that it has a higher detection rate than existing approaches, and sub-pixel accuracy. Secondly, I investigate the issue of evaluating gaze estimation, especially the question of how to collect ground truth data. As a result of this investigation, I present a new evaluation framework, which renders photorealistic synthetic eye images that can be used for evaluating the computer vision aspects of eye tracking. Thirdly, I present a novel eye model fitting algorithm, which initialises and refines an eye model based solely on pupil data, with no need for calibration or controlled lighting. 
I describe the geometry of initialising this eye model, and two methods of refining it using two different optimisation metrics. I evaluate it using synthetic images, and show that my refinements give a significant improvement in detection rate and gaze accuracy. Lastly, I present a binocular gaze estimation system which combines the above methods. My system performs geometric gaze estimation by combining the monocular eye models fitted to the left and right eye images. I describe two methods for combining these into a single binocular gaze point estimate, and methods for calibrating and refining this estimate. I then evaluate this system by performing a user study, showing that my system works for gaze estimation on glasses-based displays and is sufficiently accurate for simulating depth-of-field.
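Combining two monocular gaze rays into a single binocular gaze point, as described above, is commonly done by taking the midpoint of the shortest segment between the two rays, since noisy rays rarely intersect exactly. The sketch below shows that standard triangulation geometry as a minimal illustration; the dissertation's exact combination and refinement methods may differ:

```python
def _dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def binocular_gaze_point(o_l, d_l, o_r, d_r):
    # Gaze rays p(t) = o_l + t*d_l and q(s) = o_r + s*d_r (3-tuples:
    # eyeball centre o, gaze direction d). Minimising |p(t) - q(s)|^2
    # gives the closed-form closest points below; their midpoint is the
    # binocular gaze estimate.
    w0 = tuple(a - b for a, b in zip(o_l, o_r))
    a, b, c = _dot(d_l, d_l), _dot(d_l, d_r), _dot(d_r, d_r)
    d, e = _dot(d_l, w0), _dot(d_r, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        t = s = 0.0  # near-parallel rays: no stable triangulation
    else:
        t = (b * e - c * d) / denom
        s = (a * e - b * d) / denom
    p = tuple(o + t * dd for o, dd in zip(o_l, d_l))
    q = tuple(o + s * dd for o, dd in zip(o_r, d_r))
    return tuple((pi + qi) / 2.0 for pi, qi in zip(p, q))
```

For example, eyes at (-3, 0, 0) and (3, 0, 0) verging on a target at (0, 0, 10) recover that target exactly; with noisy real rays, the residual distance between the two closest points also serves as a confidence cue.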