1.
J Eye Mov Res ; 16(4)2023.
Article in English | MEDLINE | ID: mdl-38544928

ABSTRACT

During calibration, an eye-tracker fits a mapping function from image features to a target gaze point. While there is research on which mapping function to use, little is known about how best to estimate the function's parameters. We investigate how different fitting methods affect accuracy under different noise factors, such as mobile eye-tracker imprecision or detection errors during feature extraction in calibration. For this purpose, a simulation of binocular gaze was developed for a) different calibration patterns and b) different noise characteristics. We found that the commonly used polynomial regression with a least-squares-error fit often fails to find good mapping functions compared to ridge regression. Especially as the data becomes noisier, outlier-tolerant fitting methods grow in importance. In a mobile eye-tracking experiment, we demonstrate a 20% reduction in mean MSE simply by using a ridge fit instead of a plain least-squares polynomial fit.
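The contrast between a plain least-squares fit and a ridge fit of such a calibration mapping can be sketched as follows; the second-order polynomial basis, the noise levels, the single outlier, and the regularization weight lam = 1.0 are illustrative assumptions, not the paper's actual settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def poly_features(x, y):
    # Second-order bivariate polynomial basis, a common eye-tracker mapping.
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

# Hypothetical calibration: a known "true" mapping from pupil features
# to one screen coordinate, sampled at 25 calibration points.
n = 25
fx = rng.uniform(-1, 1, n)
fy = rng.uniform(-1, 1, n)
true_w = np.array([0.1, 8.0, 0.5, 0.3, 1.2, -0.7])
X = poly_features(fx, fy)
target = X @ true_w

# Noisy observations: device imprecision plus one gross detection error.
noisy = target + rng.normal(0, 0.3, n)
noisy[0] += 5.0  # simulated feature-detection outlier

# Plain least-squares fit vs. closed-form ridge fit.
w_ls = np.linalg.lstsq(X, noisy, rcond=None)[0]
lam = 1.0
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ noisy)

# Evaluate both fits against the noise-free ground truth.
mse_ls = np.mean((X @ w_ls - target) ** 2)
mse_ridge = np.mean((X @ w_ridge - target) ** 2)
```

Ridge shrinks the coefficient vector toward zero, which limits how far a single corrupted calibration point can drag the fitted mapping.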

2.
J Eye Mov Res ; 10(3)2017 May 25.
Article in English | MEDLINE | ID: mdl-33828657

ABSTRACT

Eye-tracking technology has to date been employed primarily in research. With recent advances in affordable video-based devices, the implementation of gaze-aware smartphones, and marketable driver monitoring systems, a considerable step towards pervasive eye-tracking has been made. However, several new challenges arise with the use of eye-tracking in the wild and will need to be tackled to increase the acceptance of this technology. The main challenge remains the use of eye-tracking together with eyeglasses, which, in combination with reflections under changing illumination conditions, can make a subject "untrackable". If we really want to bring the technology to the consumer, we cannot simply exclude 30% of the population as potential users only because they wear eyeglasses, nor can we make them clean their glasses and the device regularly. Instead, pupil detection algorithms need to be made robust to such sources of noise. We hypothesize that the amount of dust and dirt on the eyeglasses and on the eye-tracker camera has a significant influence on the performance of currently available pupil detection algorithms. Therefore, in this work, we present a systematic study of the effect of dust and dirt on pupil detection by simulating various quantities of dirt and dust on eyeglasses. Our results show 1) an overall high robustness to dust in an off-focus layer; 2) the vulnerability of edge-based methods to even small in-focus dust particles; and 3) a trade-off between tolerated particle size and particle amount, where a small number of rather large particles had only a minor performance impact.
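A minimal sketch of how dirt might be composited onto a synthetic eye image; the circular particle model, the sizes, and the darkening factor are assumptions for illustration and omit the off-focus blur the study models:

```python
import numpy as np

rng = np.random.default_rng(42)

def add_dust(image, n_particles, radius):
    """Composite opaque circular 'dust' particles at random positions.

    A simplified stand-in for rendering an in-focus dirt layer; the
    actual study varies particle quantity, size, and focus layer.
    """
    out = image.copy()
    h, w = out.shape
    yy, xx = np.mgrid[0:h, 0:w]
    for _ in range(n_particles):
        cy, cx = rng.integers(0, h), rng.integers(0, w)
        mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
        out[mask] *= 0.2  # darken occluded pixels
    return out

# Synthetic "eye image": bright background with a dark pupil disc.
img = np.full((64, 64), 0.8)
yy, xx = np.mgrid[0:64, 0:64]
img[(yy - 32) ** 2 + (xx - 32) ** 2 <= 8 ** 2] = 0.1

dusty = add_dust(img, n_particles=30, radius=2)
```

Running a pupil detector on `img` versus `dusty` across particle counts and radii would reproduce, in miniature, the size/amount trade-off the abstract describes.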

3.
J Eye Mov Res ; 10(4)2017 Nov 06.
Article in English | MEDLINE | ID: mdl-33828665

ABSTRACT

We investigate the pupil response to hazard perception during driving simulation. Complementary to gaze movement and physiological stress indicators, pupil size changes can provide valuable information on traffic hazard perception with a relatively low temporal delay. We tackle the challenge of identifying those pupil dilation events associated with hazardous events from a noisy signal by a combination of wavelet transformation and machine learning. To this end, we use features of the wavelet components as training data for a support vector machine. We further demonstrate how to utilize the method for the analysis of actual hazard perception and how it may differ from the behavioral driving response.
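The wavelet-feature idea can be sketched with a hand-rolled one-level Haar transform; the wavelet family, decomposition depth, feature set, and the SVM training step in the paper may all differ, so treat this as an assumed minimal variant:

```python
import numpy as np

def haar_dwt(signal):
    """One level of the discrete Haar wavelet transform.

    Returns (approximation, detail) coefficients. Detail coefficients
    respond to abrupt local changes such as a dilation onset.
    """
    s = np.asarray(signal, dtype=float)
    if len(s) % 2:                      # pad odd-length signals
        s = np.append(s, s[-1])
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)
    return approx, detail

def dilation_features(pupil):
    """Summary statistics of detail coefficients across two levels.

    In the paper's setup, features like these would serve as training
    data for a support vector machine (classifier omitted here).
    """
    a, d1 = haar_dwt(pupil)
    _, d2 = haar_dwt(a)
    return np.array([np.abs(d1).max(), np.abs(d1).mean(),
                     np.abs(d2).max(), np.abs(d2).mean()])

# A flat baseline trace vs. one with a smooth dilation event at t = 0.5.
t = np.linspace(0, 1, 128)
baseline = np.full_like(t, 4.0)                      # pupil diameter, mm
event = baseline + 0.5 * np.exp(-((t - 0.5) / 0.05) ** 2)

fb = dilation_features(baseline)
fe = dilation_features(event)
```

The flat trace yields zero detail energy while the event trace does not, which is exactly the separation a downstream classifier would exploit.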

4.
Behav Res Methods ; 49(3): 1048-1064, 2017 06.
Article in English | MEDLINE | ID: mdl-27443354

ABSTRACT

Our eye movements are driven by a continuous trade-off between the need for detailed examination of objects of interest and the necessity to keep an overview of our surroundings. In consequence, behavioral patterns that are characteristic of our actions and their planning are typically manifested in the way we move our eyes to interact with our environment. Identifying such patterns from individual eye movement measurements is, however, highly challenging. In this work, we tackle the challenge of quantifying the influence of experimental factors on eye movement sequences. We introduce an algorithm for extracting sequence-sensitive features from eye movements and for classifying eye movements based on the frequencies of small subsequences. Our approach is evaluated against the state of the art on a novel and very rich collection of eye movement data derived from four experimental settings, from static viewing tasks to highly dynamic outdoor settings. Our results show that the proposed method is able to classify eye movement sequences over a variety of experimental designs. The choice of parameters is discussed in detail, with special focus on highlighting different aspects of general scanpath shape. Algorithms and evaluation data are available at: http://www.ti.uni-tuebingen.de/scanpathcomparison.html.
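The subsequence-frequency idea amounts to counting n-grams over a symbol-encoded scanpath. In this sketch the encoding (one letter per hypothetical region of interest) and the bigram length are illustrative assumptions; the paper's actual encoding and classifier differ in detail:

```python
from collections import Counter

def subsequence_counts(scanpath, n=2):
    """Relative frequencies of length-n subsequences of a scanpath.

    `scanpath` is assumed to be pre-encoded as a string of symbols,
    e.g. one letter per fixated region of interest.
    """
    grams = [scanpath[i:i + n] for i in range(len(scanpath) - n + 1)]
    total = len(grams)
    return {g: c / total for g, c in Counter(grams).items()}

# Two hypothetical viewers over regions A and B: one alternates,
# the other dwells on A before moving on. Same fixation counts,
# clearly different subsequence profiles.
freq_alternating = subsequence_counts("ABABABAB")
freq_dwelling = subsequence_counts("AAAABBBB")
```

Feature vectors built from such frequencies can then be fed to any standard classifier to separate the two viewing behaviors, even though both sequences contain the same number of A and B fixations.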


Subject(s)
Algorithms, Eye Movement Measurements/classification, Eye Movements/physiology, Female, Humans, Male, Photic Stimulation
5.
Optom Vis Sci ; 92(11): 1037-46, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26501733

ABSTRACT

PURPOSE: The aim of this pilot study was to assess the driving performance and the visual search behavior, that is, eye and head movements, of patients with glaucoma in comparison to healthy-sighted subjects during a simulated driving test. METHODS: Driving performance and gaze behavior of six glaucoma patients and eight healthy-sighted age- and sex-matched control subjects were compared in an advanced driving simulator. All subjects underwent a 40-minute driving test including nine hazardous situations on city and rural roads. Fitness to drive was assessed by a masked driving instructor according to the requirements of the official German driving test. Several driving performance measures were investigated: lane position, time to line crossing, and speed. Additionally, eye and head movements were tracked and analyzed. RESULTS: Three out of six glaucoma patients passed the driving test, and their driving performance was indistinguishable from that of the control group. Patients who passed the test showed increased visual exploration in comparison to patients who failed; that is, they showed an increased number of head and gaze movements toward eccentric regions. Furthermore, patients who failed the test showed a rightward bias in average lane position, probably in an attempt to maximize the safety margin to oncoming traffic. CONCLUSIONS: Our study suggests that a considerable subgroup of subjects with binocular glaucomatous visual field loss shows safe driving behavior in a virtual reality environment because they adapt their viewing behavior by increasing their visual scanning. Hence, binocular visual field loss does not necessarily compromise driving safety. We recommend more individualized driving assessments that take the patient's ability to compensate into account.
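Of the listed measures, time to line crossing (TLC) is the least self-explanatory. A first-order sketch under simplifying assumptions (straight road, constant lateral velocity, a hypothetical 1.75 m lane half-width) looks like this; the study's simulator would compute TLC from full vehicle kinematics:

```python
def time_to_line_crossing(lateral_offset, lateral_velocity, lane_half_width=1.75):
    """First-order time-to-line-crossing (TLC) estimate in seconds.

    lateral_offset: metres from lane centre (positive = rightward);
    lateral_velocity: metres/second (positive = rightward drift).
    Linear extrapolation of the current drift; curvature and steering
    dynamics are ignored in this sketch.
    """
    if lateral_velocity == 0:
        return float("inf")  # no drift: the lane line is never reached
    if lateral_velocity > 0:
        distance = lane_half_width - lateral_offset       # to right line
    else:
        distance = -(lane_half_width + lateral_offset)    # to left line
    return distance / lateral_velocity

# Drifting right at 0.25 m/s from 0.5 m right of centre:
# (1.75 - 0.5) / 0.25 = 5.0 seconds until the right lane line.
tlc = time_to_line_crossing(0.5, 0.25)
```

Short TLC values flag imminent lane departures, which is why the measure complements raw lane position when scoring driving performance.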


Subject(s)
Automobile Driving, Fixation, Ocular/physiology, Glaucoma/physiopathology, Task Performance and Analysis, Vision Disorders/physiopathology, Vision, Binocular/physiology, Visual Fields/physiology, Aged, Automobile Driver Examination, Computer Simulation, Eye Movements/physiology, Female, Head Movements/physiology, Humans, Male, Middle Aged, Pilot Projects, Safety, Visual Perception/physiology