Company

Publication Hub

Welcome to the Ergoneers Publication Hub

Find studies in your field of interest, connect with authors, and pursue common goals.

If your own relevant publication is missing, or if you would like to contribute to our international research community, please contact us.

Groundbreaking publications are recognized by our jury on a yearly basis.

Submit a Paper



Total results: 604

Evaluation of the optimal quantity of in-vehicle information icons using a fuzzy synthetic evaluation model in a driving simulator

Year: 2022

Authors: J Chen, X Wang, Z Cheng, Y Gao

In-Vehicle Information (IVI) features such as navigation assistance play an important role in the travel of drivers around the world. Frequent use of IVI, however, can easily increase the cognitive load of drivers. The interface design, especially the quantity of icons presented to the driver, such as those for navigation, music, and phone calls, has not been fully researched. To determine the optimal number of icons, a systematic evaluation of the IVI Human Machine Interface (HMI) was carried out using single-factor and multivariate analytical methods in a driving simulator. When one-way ANOVA was performed, the results showed that the 3-icon design scored best in subjective driver assessment, while the 4-icon design performed best on steering wheel angle. However, when a new method of analyzing the data was applied, one that accounts simultaneously for the changes observed across the dependent measures, the 3-icon design had the highest score, that is, the overall best performance. This method, the fuzzy synthetic evaluation model (FSE), is used here for the first time to assess the HMI design of IVI. The findings also suggest that FSE will be applicable to various other HMI design problems.

Simulator
Software

4 versions available
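The core of a fuzzy synthetic evaluation is a weighted composition of fuzzy membership grades followed by defuzzification. The sketch below illustrates that mechanism with made-up weights, criteria, and membership values; it is not the model or data from the paper above.

```python
import numpy as np

# Hypothetical criteria weights (e.g. subjective rating, steering wheel
# angle, reaction time); values are illustrative, not from the paper.
weights = np.array([0.5, 0.3, 0.2])

# Membership matrix R: each row gives one criterion's degree of membership
# in the rating levels (poor, fair, good), one matrix per icon design.
designs = {
    "3 icons": np.array([[0.1, 0.2, 0.7],
                         [0.2, 0.3, 0.5],
                         [0.1, 0.4, 0.5]]),
    "4 icons": np.array([[0.3, 0.4, 0.3],
                         [0.1, 0.3, 0.6],
                         [0.2, 0.4, 0.4]]),
}

# Scores per rating level turn the fuzzy evaluation vector into one number.
level_scores = np.array([1.0, 2.0, 3.0])

def fse_score(weights, membership, level_scores):
    """Weighted-average fuzzy composition: b = w . R, then defuzzify."""
    b = weights @ membership          # fuzzy evaluation vector over levels
    b = b / b.sum()                   # normalise
    return float(b @ level_scores)    # crisp overall score

for name, R in designs.items():
    print(name, round(fse_score(weights, R, level_scores), 3))
```

The design with the highest crisp score is the one the composite evaluation prefers, which is how a single ranking emerges from several dependent measures at once.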

Experimental study on panic during simulated fire evacuation using psycho-and physiological metrics

Year: 2022

Authors: K Deng, M Li, G Wang, X Hu, Y Zhang, H Zheng

Under circumstances of fire, panic usually brings uncertainty and unpredictability to evacuation. Therefore, a deep understanding of panic is desired. This study aims to dig into the underlying mechanism of fire evacuation panic by measuring and analysing psycho- and physiological indicators. In the experiment, participants watched a simulated train station within which three sets of stimuli were triggered separately. Eye movement and brain haemodynamic responses were collected while they watched, and questionnaires and interviews about emotions were conducted afterwards. The analysed physiological indicators include the amplitude of pupil dilation, the time ratios of fixation and saccade, the binned entropy of gaze location, and the brain activation coefficients. The results of this research indicate that fire evacuation panic can be broken down into two elements: (1) unawareness of the situation: less knowledge of the situation leads to a higher level of panic; (2) intensity of visual stimulation: the panic level escalates with the perceived severity of the fire.

Eye Tracking Glasses
Simulator

10 versions available
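One of the indicators named above, the binned entropy of gaze location, can be computed by histogramming gaze points on a spatial grid and taking the Shannon entropy. The sketch below uses synthetic data and an assumed 8x8 grid; the bin count and normalised coordinates are illustrative choices, not the study's parameters.

```python
import numpy as np

def binned_gaze_entropy(x, y, bins=8):
    """Shannon entropy of gaze positions binned on a spatial grid.

    x, y: gaze coordinates normalised to [0, 1]. Higher entropy means
    gaze is spread over more of the scene; lower entropy means gaze
    is concentrated on a few regions.
    """
    hist, _, _ = np.histogram2d(x, y, bins=bins, range=[[0, 1], [0, 1]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]                       # ignore empty bins
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
focused = rng.normal(0.5, 0.02, (2, 500)).clip(0, 1)   # tight cluster
scattered = rng.uniform(0, 1, (2, 500))                # spread over scene
print(binned_gaze_entropy(*focused), binned_gaze_entropy(*scattered))
```

Scattered gaze yields a clearly higher entropy than fixated gaze, which is what makes the measure usable as a proxy for visual search versus focused attention.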

Eye tracking system measurement of saccadic eye movement with different illuminance transmission exposures during driving simulation

Year: 2022

Authors: A Ahmad, SA Rosli, AH Chen

Numerous eye gaze changes across different fixation targets are involved in driving. In addition, driving takes place under various surrounding illuminance conditions. However, the effect of different illuminance transmissions on eye gaze movement during driving has not been explored. This study investigated saccadic eye movement using an eye tracking system under different illuminance transmissions during driving simulation. The investigation was conducted on twenty-eight participants aged 21 to 26 with valid driving licences and driving experience. All participants had good vision status, with a good history of systemic, ocular, and binocular vision health. Using driving simulation, the participants were instructed to drive as they usually did, and their saccadic eye movement was recorded via the Dikablis eye tracker. The surrounding illuminance within the experimental room provided 100% transmission of 500 lux, and the illuminance transmission was varied to 50%, 30%, and 15% using neutral density filters. Under the different illuminance transmissions, saccadic eye movement showed no significant differences (p>0.05), even with the 15% transmission, in both the number and duration of saccadic eye movements. This showed similar eye gaze changes, specifically saccadic movement, during driving simulation with different light transmissions. It could be concluded that eye gaze movement was not influenced by reduced illuminance when driving.

Eye Tracking Glasses
Simulator

1 version available
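A common way to count saccades of the kind measured above is a velocity-threshold (I-VT) scheme: samples whose angular velocity exceeds a threshold are classed as saccadic, and each contiguous run counts as one saccade. The sketch below is a toy version with an assumed 60 Hz sampling rate and the conventional 30 deg/s threshold, not the study's actual detection pipeline.

```python
import numpy as np

def count_saccades(x, y, fs=60.0, threshold_deg_s=30.0):
    """Toy I-VT saccade counter.

    x, y are gaze angles in degrees; fs is the sampling rate in Hz.
    Samples faster than threshold_deg_s are saccadic; each contiguous
    run of saccadic samples is counted as a single saccade.
    """
    vel = np.hypot(np.diff(x), np.diff(y)) * fs   # deg/s between samples
    is_sacc = vel > threshold_deg_s
    # Count rising edges: transitions from fixation to saccade.
    starts = np.flatnonzero(is_sacc & ~np.r_[False, is_sacc[:-1]])
    return len(starts), vel

# Synthetic trace: fixation, one fast 10-degree jump, fixation again.
x = np.r_[np.full(30, 0.0), np.linspace(0, 10, 5), np.full(30, 10.0)]
y = np.zeros_like(x)
n, _ = count_saccades(x, y)
print(n)   # one detected saccade
```

Comparing counts and durations of such runs across illuminance conditions is the kind of dependent measure the study's statistics (p>0.05) were computed on.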

Eye-tracking assistive technologies for individuals with amyotrophic lateral sclerosis

Year: 2022

Authors: HO Edughele, Y Zhang, F Muhammad

Amyotrophic lateral sclerosis, also known as ALS, is a progressive nervous system disorder that affects nerve cells in the brain and spinal cord, resulting in the loss of muscle control. For individuals with ALS whose mobility is limited to the movement of the eyes, eye-tracking-based applications can be used to accomplish basic tasks through certain digital interfaces. This paper presents a review of existing eye-tracking software and hardware and sketches their application as an assistive technology for coping with ALS. Eye-tracking also provides a suitable alternative for controlling game elements. Furthermore, artificial intelligence has been utilized to improve eye-tracking technology, with significant improvements in calibration and accuracy. Gaps in the literature are highlighted in the study to offer a direction for future research.

Eye Tracking Glasses
Software

12 versions available

Gaze Behavior of E-Scooter Riders in an Urban Environment

Year: 2022

Authors: B Hristov, D Peukert, K Reinprecht

Increasing urbanization, which leads to higher traffic density, new road user groups such as e-scooter riders, the rising number of accidents among vulnerable road users, and the demand for environmentally friendly mobility all point to the need to rethink the current transport infrastructure and the associated road safety concepts. This study is the result of a joint research project conducted by the University of Applied Sciences Berlin and the Inspectio Research Institute in Munich. The aim of this work was to survey the gaze behavior of e-scooter riders and to generate initial insights into their perception. Gaze behavior was analyzed for road sections and for junctions. Gaze behavior on protected bike lanes differs significantly from that on bike lanes on the roadway, where e-scooter riders focused on areas to the right and left, losing visual and attentional capacity. The analysis of gaze behavior at junctions shows that the eyes are directed more toward the road and the area directly in front of the e-scooter (near field), which results in reduced attention to the actual traffic situation. Based on the results, recommendations have been derived for the planning of new construction or conversion of cycling facilities. From the perspective of road safety, protected cycle lanes are the optimal solution compared to other infrastructure solutions, as they require less widespread gaze behavior and e-scooter riders can concentrate on the actual traffic situation.

Eye Tracking Glasses
Software

2 versions available

How does navigating with Augmented Reality information affect drivers’ glance behaviour in terms of attention allocation?

Year: 2022

Authors: K Bauerfeind, J Drüke, L Bendewald

Drivers can benefit from Augmented Reality (AR) information, especially in ambiguous navigation situations, compared to conventional head-up displays (HUD). AR information is correctly superimposed on the relevant objects in the environment and is therefore directly related to the driving situation. Hence, it is assumed that drivers no longer have to switch glances between the AR information and the environment (Kim & Dey, 2009). It has to be investigated whether switching glances between the presented navigation information and the environment can be reduced with AR information compared to HUD information. Furthermore, the question arises whether AR information might capture drivers' attention and therefore distract from the traffic situation compared to a HUD, as AR information is presented on the driver's primary visual axis. The aim of the driving simulator study was to examine glance behaviour in terms of attention allocation while participants navigated an ambiguous left-turn situation with an oncoming car in an urban area (N = 58). Hence, drivers were faced with the decision to turn in front of it or let it pass. A conventional HUD and an AR display presented the navigation information to the driver. The drives differed in traffic complexity (low vs. high) to provide indications of whether drivers adapt glance behaviour to altered environmental conditions. Besides the navigation task, drivers performed a non-driving-related task to raise drivers' mental load while navigating. Results showed that with the AR display participants paid more attention to the oncoming car in the ambiguous left-turn situation than with the HUD, which indicates that AR information was not distracting. Furthermore, participants switched glances significantly less between the AR navigation information and the environment, which indicates that with the AR display the driver did not have to map the virtual information onto the real driving situation.
Independently of the display type, 88% of the participants let the oncoming car pass the first time in this situation. Moreover, subjective data showed that drivers benefitted from AR information. The results of this study contribute to the investigation and development of AR user interfaces.

Eye Tracking Glasses
Simulator

2 versions available

How is emotional resonance achieved in storytellings of sadness/distress?

Year: 2022

Authors: C Rühlemann

Storytelling pivots around stance seen as a window unto emotion: storytellers project a stance expressing their emotion toward the events, and recipients preferably mirror that stance by affiliating with the storyteller's stance. Whether the recipient's affiliative stance is at the same time expressive of his/her emotional resonance with the storyteller and of emotional contagion is a question that has recently attracted intriguing research in Physiological Interaction Research. Connecting to this line of inquiry, this paper concerns itself with storytellings of sadness/distress. Its aim is to identify factors that facilitate emotion contagion in storytellings of sadness/distress and factors that impede it. Given the complexity and novelty of this question, this study is designed as a pilot study to scour the terrain and sketch out an interim roadmap before a larger study is undertaken. The database is small, comprising two storytellings of sadness/distress. The methodology used to address the above research question is expansive: it includes CA methods to transcribe and analyze interactionally relevant aspects of the storytelling interaction, and it draws on psychophysiological measures to establish whether and to what degree emotional resonance between co-participants is achieved. In discussing possible reasons why resonance is (not or not fully) achieved, the paper embarks on an extended analysis of the storytellers' multimodal storytelling performance (reenactments, prosody, gaze, gesture) and considers factors lying beyond the storyteller's control, including relevance, participation framework, personality, and susceptibility to emotion contagion.

Eye Tracking Glasses
Software

6 versions available

How users of automated vehicles benefit from predictive ambient light displays

Year: 2022

Authors: T Hecht, S Weng, LF Kick, K Bengler

With the introduction of Level 3 and 4 automated driving, engagement in a variety of non-driving related activities (NDRAs) will become legal. Previous research has shown that users desire information about the remaining time in automated driving mode and about system status in order to plan and terminate their activity engagement. In past studies, however, the positive effect of this additional information was realized when it was integrated in or displayed close to the NDRA. As future activities and corresponding items will be diverse, a device-independent and non-interruptive way of communication is required to continuously keep the user informed, thus avoiding negative effects on driver comfort and safety. With a set of two driving simulator studies, we have investigated the effectiveness of ambient light display (ALD) concepts communicating remaining time and system status while users are engaged in visually distracting NDRAs. In the first study, with 21 participants, a traffic-light color-coded ALD concept (an LED stripe positioned at the bottom of the windshield) was compared to a baseline concept in two subsequent drives. Subjects were asked to rate usability, workload, trust, and their use of travel time after each drive. Furthermore, gaze data and NDRA disengagement timing were analyzed. The ALD with three discrete time steps led to improved usability ratings and lower workload levels compared to the baseline interface without any ALD. No significant effects on trust, attention ratio, travel time evaluation, and NDRA continuation were found, but a vast majority favored the ALD. Due to this positive evaluation, the traffic-light ALD concept was subsequently improved and compared to an elapsing concept in a subsequent study with 32 participants. In addition to the first study, the focus was on the intuitiveness of the developed concepts.
In a similar setting, results revealed no significant differences between the ALD concepts in subjective ratings (workload, usability, trust, travel time ratings), but advantages of the traffic light concept can be found in terms of its intuitiveness and the level of support experienced.

Eye Tracking Glasses
Simulator

7 versions available

Human hand motion prediction based on feature grouping and deep learning: Pipe skid maintenance example

Year: 2022

Authors: T Zhou, Y Wang, Q Zhu, J Du

Human-robot collaboration has gained popularity in various construction applications. The key to a successful human-robot collaboration at the same construction workplace is a reliable algorithm for predicting human motions to strengthen the robot's situational awareness, i.e., robot-human awareness. Most existing approaches have focused on predicting human motions based on repetitive patterns of human behaviors in well-defined task contexts, such as specific object picking tasks, for a relatively short period of time. These methods can hardly capture the 'pattern inconsistency' of human actions, i.e., the differences across people in terms of motion features, and even for the same person at different time points of the task. This paper proposes an analytical pipeline that segments and clusters the inconsistent human behaviors into different pattern groups and builds separate human motion prediction models correspondingly. The proposed method, Human Motion Feature Grouping and Prediction (HMFGP), quantifies the spatiotemporal relationship between gaze focus and hand movement trajectories, segments the raw data based on the detected gaze-hand relationship pattern changes, and clusters the matched gaze-hand data segments into several pattern groups based on the pattern similarity of the gaze-hand relationships. Then a time series Deep Learning method is used to predict hand motions based on gaze focus trajectories for each of the pattern groups. The gaze and hand motion data of a human subject experiment (n = 120) for pipe skid maintenance was used to test the prediction performance of HMFGP. The result shows that HMFGP can significantly improve the accuracy of human hand motion prediction and help quantify different patterns of human motions for specific analyses.

Eye Tracking Glasses
Software

1 version available
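The grouping idea behind HMFGP, cluster segments by their gaze-hand relationship and fit a separate predictor per cluster, can be illustrated on synthetic data. Everything below is an assumption for illustration: the ratio feature, the two behaviour patterns, and the per-cluster linear fit standing in for the paper's time-series deep learning model.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Two synthetic behaviour patterns: hand closely follows gaze (slope 1.0)
# vs. hand undershoots gaze (slope 0.3), with small noise.
gaze = rng.uniform(0.2, 1.0, 200)
slope = np.r_[np.full(100, 1.0), np.full(100, 0.3)]
hand = slope * gaze + rng.normal(0, 0.01, 200)

# Segment feature: local hand/gaze ratio, a stand-in for the paper's
# gaze-hand relationship features.
features = (hand / gaze).reshape(-1, 1)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

# One simple model per cluster (linear here, not the paper's deep net):
# predict hand position from gaze within each pattern group.
models = {k: np.polyfit(gaze[labels == k], hand[labels == k], 1)
          for k in np.unique(labels)}
for k, (a, b) in models.items():
    print(f"cluster {k}: hand ~ {a:.2f} * gaze + {b:.2f}")
```

A single global fit would average the two slopes together; clustering first lets each group's model recover its own motion pattern, which is the point of building separate predictors per pattern group.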

Human motion prediction for intelligent construction: A review

Year: 2022

Authors: X Xia, T Zhou, J Du, N Li

Intelligent construction is an important construction trend. With the growing number of intelligent autonomous systems implemented in the construction area, understanding and predicting human motion becomes increasingly important. Based on such predictions, the autonomous systems can optimize their actions to improve the efficiency of human-robot interactions, and supervisors can make informed decisions about when and where to intervene in human motion to avoid collisions. This paper presents a comprehensive review of existing literature on human motion prediction (HMP). Relevant studies from a wide range of fields are reviewed, analyzed and synthesized, in terms of prediction indicators, methods and applications, based on a three-level taxonomy. The taxonomy is structured based on the levels of human information required by different prediction methods, and reflects different understandings of the underlying causality and mediators of human motions and intent. The paper also discusses the evolutions of the theoretical understanding and methodological development of HMP, its application scenarios in and beyond the construction domain, and possible directions for future research. This review is expected to increase the visibility of this rapidly expanding research area, and inspire future studies and advancements for human-robot interactions in construction.

Eye Tracking Glasses
Software

2 versions available

Explore Cutting-Edge Research in Human Factors and Ergonomics

Welcome to our comprehensive publication library, where we bring together the best research on human factors, ergonomics, psychology, usability, and consumer behavior. Our extensive collection includes white papers, PhD theses, and scholarly articles that delve into applications across various fields such as aerospace, defence, automotive, transportation, sport science, and education.

For researchers and engineers, our library serves as a vital resource, offering the latest insights to inspire innovation and drive projects forward. With a focus on sensor-based studies—utilizing technologies like EEG, ECG, eye tracking, and motion tracking—we provide a platform to explore how these tools enhance understanding of human performance and interaction.

Our unique offerings include advanced simulators for flight and driving, enabling users to study complex human behaviors in controlled environments. By fusing and synchronizing diverse data sources, our platform delivers in-depth analyses across correlated factors, streamlining research processes and saving valuable time.

Ergoneers has been at the forefront of innovation in physiological and environmental data-based research tools for over two decades. Our publication library invites the community to engage in exchange and growth, fostering collaboration around humanitarian goals.

Whether you’re a researcher, an engineer, or an educator, our library is designed to support your work, providing you with the resources necessary to advance your understanding and application of human factors in real-world scenarios. Discover how you can leverage the latest findings to enhance user experience and performance in your field. Join us in shaping the future of human-centered design and research—explore our publication library today!