Results 1 - 20 of 30,464
1.
J. optom. (Internet) ; 17(2): [100500], Apr-Jun, 2024. tab
Article in English | IBECS | ID: ibc-231624

ABSTRACT

Purpose: Visual snow syndrome (VSS) is a complex neurological condition presenting with an array of sensory, motor, and perceptual dysfunctions and related visual and non-visual symptoms. Recent laboratory studies have found subtle, basic, saccadic-based abnormalities in this population. The objective of the present investigation was to determine whether saccadic-related problems could be confirmed and extended using three common clinical reading-related eye movement tests with well-developed protocols and normative databases. Methods: This was a retrospective analysis of 32 patients (ages 16–56 years) diagnosed with VSS in the first author's optometric practice. Patients completed a battery of three reading-related tests: the Visagraph Reading Eye Movement Test, the Developmental Eye Movement (DEM) Test, and the RightEye Dynamic Vision Assessment Test, all performed using their standard documented protocols and large normative databases. Results: A high frequency of oculomotor deficits was found with all three tests. The greatest percentage was revealed with the Visagraph (56%) and the least with the RightEye (23%). A total of 77% of patients failed at least one of the three tests. Conclusion: The present findings confirm and extend earlier investigations revealing a high frequency of saccadic-based oculomotor problems in the VSS population, now including reading-related tasks. This is consistent with the more general oculomotor/motor problems found in these individuals.


Subject(s)
Humans , Male , Female , Central Nervous System Diseases/complications , Vision, Ocular , Ophthalmoplegia , Optometry , Eye Movements
2.
Cereb Cortex ; 34(5)2024 May 02.
Article in English | MEDLINE | ID: mdl-38725291

ABSTRACT

A widely used psychotherapeutic treatment for post-traumatic stress disorder (PTSD) involves performing bilateral eye movement (EM) during trauma memory retrieval. However, how this treatment, known as eye movement desensitization and reprocessing (EMDR), alleviates trauma-related symptoms is unclear. While conventional theories suggest that bilateral EM interferes with concurrently retrieved trauma memories by taxing the limited working memory resources, here we propose that bilateral EM actually facilitates information processing. In two EEG experiments, we replicated the bilateral EM procedure of EMDR, having participants engage in continuous bilateral EM or receive bilateral sensory stimulation (BS) as a control while retrieving short- or long-term memory. During EM or BS, we presented bystander images or memory cues to probe neural representations of perceptual and memory information. Multivariate pattern analysis of the EEG signals revealed that bilateral EM enhanced neural representations of simultaneously processed perceptual and memory information. This enhancement was accompanied by heightened visual responses and increased neural excitability in the occipital region. Furthermore, bilateral EM increased information transmission from the occipital to the frontoparietal region, indicating facilitated information transition from low-level perceptual representation to high-level memory representation. These findings argue for theories that emphasize information facilitation rather than disruption in the EMDR treatment.
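The abstract does not spell out the multivariate pattern analysis, so the following sketch only illustrates a common time-resolved decoding approach (cross-validated linear classification at each time point) on simulated data; the array shapes, classifier choice, and labels are assumptions, not details taken from the study.

```python
# Hedged illustration: time-resolved MVPA decoding of EEG epochs (simulated data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 200, 64, 50           # assumed epoch dimensions
X = rng.normal(size=(n_trials, n_channels, n_times))  # EEG epochs: trials x channels x time
y = rng.integers(0, 2, size=n_trials)                 # condition labels (e.g., EM vs. BS)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Decode the condition separately at every time point; above-chance accuracy at a
# given time indicates that the neural pattern then distinguishes the conditions.
accuracy = np.array([
    cross_val_score(clf, X[:, :, t], y, cv=5).mean()
    for t in range(n_times)
])
print(f"peak decoding accuracy: {accuracy.max():.2f}")
```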


Subject(s)
Electroencephalography , Eye Movement Desensitization Reprocessing , Humans , Female , Male , Young Adult , Adult , Eye Movement Desensitization Reprocessing/methods , Eye Movements/physiology , Stress Disorders, Post-Traumatic/physiopathology , Stress Disorders, Post-Traumatic/therapy , Stress Disorders, Post-Traumatic/psychology , Visual Perception/physiology , Memory/physiology , Brain/physiology , Photic Stimulation/methods , Memory, Short-Term/physiology
3.
Invest Ophthalmol Vis Sci ; 65(5): 7, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38700875

ABSTRACT

Purpose: This study aimed to explore the underlying mechanisms of the observed visuomotor deficit in amblyopia. Methods: Twenty-four amblyopic participants (25.8 ± 3.8 years; 15 males) and 22 normal participants (25.8 ± 2.1 years; 8 males) took part in the study. The participants were instructed to continuously track a randomly moving Gaussian target on a computer screen using a mouse. In experiment 1, the participants performed the tracking task at six different target sizes. In experiments 2 and 3, they were asked to track a target with the contrast adjusted to each individual's threshold. The tracking performance was represented by the kernel function calculated as the cross-correlation between the target and mouse displacements. The peak, latency, and width of the kernel were extracted and compared between the two groups. Results: In experiment 1, target size had a significant effect on the kernel peak (F(1.649, 46.170) = 200.958, P = 4.420 × 10⁻²²). At the smallest target size, the peak in the amblyopic group was significantly lower than that in the normal group (0.089 ± 0.023 vs. 0.107 ± 0.020, t(28) = -2.390, P = 0.024) and correlated with the contrast sensitivity function (r = 0.739, P = 0.002) in the amblyopic eyes. In experiments 2 and 3, with equally visible stimuli, there were still differences in the kernel between the two groups (all Ps < 0.05). Conclusions: When stimulus visibility was compensated, amblyopic participants still showed significantly poorer tracking performance.
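The reported analysis hinges on a kernel computed as the cross-correlation between target and cursor displacements, from which a peak, latency, and width are read out. The sketch below shows one generic way to compute such a kernel on simulated traces; the sampling rate, lag window, and width definition are assumptions rather than the authors' exact parameters.

```python
# Hedged sketch: cross-correlation kernel between target and cursor displacements.
# Sampling rate, lag window, and width criterion are illustrative assumptions.
import numpy as np

fs = 60.0                      # assumed sampling rate (Hz)
rng = np.random.default_rng(1)
target_disp = rng.normal(size=3600)                                    # frame-to-frame target displacement
mouse_disp = np.convolve(target_disp, np.ones(10) / 10, mode="same")   # sluggish copy of the target
mouse_disp = np.roll(mouse_disp, 12) + 0.5 * rng.normal(size=3600)     # plus a delay and noise

max_lag = int(1.5 * fs)        # consider lags up to 1.5 s
lags = np.arange(0, max_lag)
kernel = np.array([
    np.corrcoef(target_disp[:target_disp.size - lag], mouse_disp[lag:])[0, 1]
    for lag in lags
])

peak = kernel.max()                          # kernel peak
latency = lags[kernel.argmax()] / fs         # latency of the peak (s)
width = np.sum(kernel > peak / 2) / fs       # full width at half maximum (s), one simple definition
print(f"peak={peak:.3f}, latency={latency:.3f}s, width={width:.3f}s")
```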


Subject(s)
Amblyopia , Visual Acuity , Humans , Amblyopia/physiopathology , Male , Female , Adult , Young Adult , Visual Acuity/physiology , Psychophysics/methods , Motion Perception/physiology , Contrast Sensitivity/physiology , Eye Movements/physiology
4.
J Vis ; 24(5): 3, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38709511

ABSTRACT

In everyday life we frequently make simple visual judgments about object properties, for example, how big or wide is a certain object? Our goal is to test whether there are also task-specific oculomotor routines that support perceptual judgments, similar to the well-established exploratory routines for haptic perception. In a first study, observers saw different scenes with two objects presented in a photorealistic virtual reality environment. Observers were asked to judge which of two objects was taller or wider while gaze was tracked. All tasks were performed with the same set of virtual objects in the same scenes, so that we can compare spatial characteristics of exploratory gaze behavior to quantify oculomotor routines for each task. Width judgments showed fixations around the center of the objects with larger horizontal spread. In contrast, for height judgments, gaze was shifted toward the top of the objects with larger vertical spread. These results suggest specific strategies in gaze behavior that presumably are used for perceptual judgments. To test the causal link between oculomotor behavior and perception, in a second study, observers could freely gaze at the object or we introduced a gaze-contingent setup forcing observers to fixate specific positions on the object. Discrimination performance was similar between free-gaze and the gaze-contingent conditions for width and height judgments. These results suggest that although gaze is adapted for different tasks, performance seems to be based on a perceptual strategy, independent of potential cues that can be provided by the oculomotor system.
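The spatial characteristics compared across tasks (a wider horizontal spread of fixations for width judgments, a vertical shift and spread for height judgments) reduce to simple summary statistics over fixation positions. A minimal sketch with invented fixation coordinates follows; the metric (standard deviation in object coordinates) is an assumption.

```python
# Minimal sketch: horizontal vs. vertical spread of fixations, per judgment task.
# Fixation coordinates are invented; the original metric may differ.
import numpy as np

def fixation_spread(fix_x, fix_y):
    """Return (horizontal SD, vertical SD) of fixation positions in object coordinates."""
    return float(np.std(fix_x)), float(np.std(fix_y))

rng = np.random.default_rng(2)
# Width-judgment trials: fixations near the object's centre, spread horizontally.
width_x, width_y = rng.normal(0.0, 2.0, 50), rng.normal(0.0, 0.5, 50)
# Height-judgment trials: fixations shifted toward the top, spread vertically.
height_x, height_y = rng.normal(0.0, 0.5, 50), rng.normal(1.5, 2.0, 50)

print("width task  (h, v):", fixation_spread(width_x, width_y))
print("height task (h, v):", fixation_spread(height_x, height_y))
```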


Subject(s)
Eye Movements , Fixation, Ocular , Judgment , Humans , Judgment/physiology , Male , Female , Adult , Eye Movements/physiology , Young Adult , Fixation, Ocular/physiology , Photic Stimulation/methods , Virtual Reality , Visual Perception/physiology
5.
Sci Rep ; 14(1): 10261, 2024 05 04.
Article in English | MEDLINE | ID: mdl-38704441

ABSTRACT

Previous studies have suggested that behavioral patterns, such as visual attention and eye movements, relate to individual personality traits. However, these studies mainly focused on free visual tasks, and the impact of visual field restriction remains inadequately understood. The primary objective of this study is to elucidate the patterns of conscious eye movements induced by visual field restriction and to examine how these patterns relate to individual personality traits. Building on previous research, we aim to gain new insights through two behavioral experiments, unraveling the intricate relationship between visual behaviors and individual personality traits. Both Experiment 1 and Experiment 2 revealed differences in eye movements between free observation and visual field restriction. In particular, simulation results based on the analyzed data showed clear distinctions in eye movements between the free observation and visual field restriction conditions. This suggests that eye movements during free observation involve a mixture of conscious and unconscious eye movements. Furthermore, we observed significant correlations between conscious eye movements and personality traits, with more pronounced effects in the visual field restriction condition used in Experiment 2 compared to Experiment 1. These analytical findings provide a novel perspective on human cognitive processes through visual perception.


Subject(s)
Eye Movements , Personality , Visual Fields , Humans , Visual Fields/physiology , Eye Movements/physiology , Male , Personality/physiology , Female , Adult , Young Adult , Attention/physiology , Visual Perception/physiology
6.
PLoS One ; 19(5): e0304150, 2024.
Article in English | MEDLINE | ID: mdl-38805447

ABSTRACT

When comprehending speech, listeners can use information encoded in visual cues from a face to enhance auditory speech comprehension. For example, prior work has shown that mouth movements reflect articulatory features of speech segments and durational information, while pitch and speech amplitude are primarily cued by eyebrow and head movements. Little is known about how the visual perception of segmental and prosodic speech information is influenced by linguistic experience. Using eye-tracking, we studied how perceivers' visual scanning of different regions on a talking face predicts accuracy in a task targeting segmental versus prosodic information, and also asked how this was influenced by language familiarity. Twenty-four native English perceivers heard two audio sentences in either English or Mandarin (an unfamiliar, non-native language), which sometimes differed in segmental or prosodic information (or both). Perceivers then saw a silent video of a talking face, and judged whether that video matched either the first or second audio sentence (or whether both sentences were the same). First, increased looking to the mouth predicted correct responses only for non-native language trials. Second, the start of a successful search for speech information in the mouth area was significantly delayed in non-native versus native trials, but only when the auditory sentences differed solely in prosodic information, not when there were segmental differences. Third, in correct trials, the saccade amplitude in native language trials was significantly greater than in non-native trials, indicating more intensely focused fixations in the latter. Taken together, these results suggest that mouth-looking was generally more evident when processing a non-native versus native language in all analyses, but fascinatingly, when measuring perceivers' latency to fixate the mouth, this language effect was largest in trials where only prosodic information was useful for the task.


Subject(s)
Language , Phonetics , Speech Perception , Humans , Female , Male , Adult , Speech Perception/physiology , Young Adult , Face/physiology , Visual Perception/physiology , Eye Movements/physiology , Speech/physiology , Eye-Tracking Technology
7.
Sci Rep ; 14(1): 11661, 2024 05 22.
Article in English | MEDLINE | ID: mdl-38778122

ABSTRACT

Gaze estimation has long been recognised as having potential as the basis for human-computer interaction (HCI) systems, but usability and robustness of performance remain challenging. This work focuses on systems in which there is a live video stream showing enough of the subject's face to track eye movements and some means to infer gaze location from detected eye features. Currently, systems generally require some form of calibration or set-up procedure at the start of each user session. Here we explore some simple strategies for enabling gaze-based HCI to operate immediately and robustly without any explicit set-up tasks. We explore different choices of coordinate origin for combining extracted features from multiple subjects and the replacement of subject-specific calibration by system initiation based on prior models. Results show that referencing all extracted features to local coordinate origins determined by subject start position enables robust immediate operation. Combining this approach with an adaptive gaze estimation model using an interactive user interface enables continuous operation with 75th-percentile gaze errors of 0.7° and maximum gaze errors of 1.7° during prospective testing. These constitute state-of-the-art results and have the potential to enable a new generation of reliable gaze-based HCI systems.
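The central manipulation reported here, referencing extracted eye features to a local origin defined by each subject's start position rather than to a global screen origin, is easy to express in code. The sketch below is a minimal illustration under that assumption, with invented feature vectors and an invented baseline length; it is not the authors' implementation.

```python
# Minimal sketch: re-reference gaze features to a per-subject local origin.
# Feature layout, baseline length, and values are invented for illustration.
import numpy as np

def to_local_coordinates(features, n_baseline=30):
    """Subtract each subject's starting position (mean of the first n_baseline
    samples) so that features from different subjects share a common origin."""
    origin = features[:n_baseline].mean(axis=0)
    return features - origin

rng = np.random.default_rng(3)
# Two subjects with different head positions -> different global offsets.
subj_a = rng.normal(loc=[120.0, 80.0], scale=5.0, size=(300, 2))
subj_b = rng.normal(loc=[-40.0, 15.0], scale=5.0, size=(300, 2))

pooled = np.vstack([to_local_coordinates(subj_a), to_local_coordinates(subj_b)])
print(pooled.mean(axis=0))   # pooled features are now roughly centred at the local origin
```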


Subject(s)
Eye Movements , Fixation, Ocular , User-Computer Interface , Humans , Fixation, Ocular/physiology , Eye Movements/physiology , Male , Eye-Tracking Technology , Female , Adult
8.
Exp Brain Res ; 242(6): 1469-1479, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38695940

ABSTRACT

Ocular torsion and vertical divergence reflect the brain's sensorimotor integration of motion through the vestibulo-ocular reflex (VOR) and the optokinetic reflex (OKR) in response to roll rotations. Torsion and vergence, however, express different response patterns depending on several motion variables, and research on their temporal dynamics remains limited. This study investigated the onset times of ocular torsion (OT) and vertical vergence (VV) during visual, vestibular, and visuovestibular motion, as well as their relative decay rates following prolonged optokinetic stimulation. Temporal characteristics were retrieved from three separate investigations in which the level of visual clutter and acceleration were controlled. Video eye-tracking was used to retrieve the eye-movement parameters from a total of 41 healthy participants across all trials. Ocular torsion consistently initiated earlier than vertical vergence, an effect particularly evident under intensified visual information density, and higher clutter levels were associated with more balanced decay rates. Additionally, stimulation modality and acceleration affected the onsets of both eye movements, with visuovestibular motion triggering earlier responses than vestibular motion, and increased accelerations leading to earlier onsets for both movements. The present study showed that joint visuovestibular responses produced more rapid onsets, indicating a synergetic sensorimotor process. It also showed that visual content acted as a fusional force during the decay period and imposed greater influence over the torsional onset than over vergence. Acceleration, by contrast, did not affect the temporal relationship between the two eye movements. Altogether, these findings provide insights into the sensorimotor integration of the vestibulo-ocular and optokinetic reflex arcs.


Subject(s)
Reflex, Vestibulo-Ocular , Humans , Adult , Male , Female , Reflex, Vestibulo-Ocular/physiology , Young Adult , Rotation , Eye Movements/physiology , Vestibule, Labyrinth/physiology , Motion Perception/physiology , Convergence, Ocular/physiology
9.
PLoS One ; 19(5): e0303755, 2024.
Article in English | MEDLINE | ID: mdl-38758747

ABSTRACT

Recent eye tracking studies have linked gaze reinstatement-when eye movements from encoding are reinstated during retrieval-with memory performance. In this study, we investigated whether gaze reinstatement is influenced by the affective salience of information stored in memory, using an adaptation of the emotion-induced memory trade-off paradigm. Participants learned word-scene pairs, where scenes were composed of negative or neutral objects located on the left or right side of neutral backgrounds. This allowed us to measure gaze reinstatement during scene memory tests based on whether people looked at the side of the screen where the object had been located. Across two experiments, we behaviorally replicated the emotion-induced memory trade-off effect, in that negative object memory was better than neutral object memory at the expense of background memory. Furthermore, we found evidence that gaze reinstatement was related to recognition memory for the object and background scene components. This effect was generally comparable for negative and neutral memories, although the effects of valence varied somewhat between the two experiments. Together, these findings suggest that gaze reinstatement occurs independently of the processes contributing to the emotion-induced memory trade-off effect.


Subject(s)
Emotions , Eye Movements , Eye-Tracking Technology , Memory , Humans , Emotions/physiology , Female , Male , Young Adult , Adult , Memory/physiology , Eye Movements/physiology , Fixation, Ocular/physiology , Adolescent , Recognition, Psychology/physiology , Photic Stimulation
10.
PLoS Biol ; 22(5): e3002614, 2024 May.
Article in English | MEDLINE | ID: mdl-38743775

ABSTRACT

The processing of sensory information, even at early stages, is influenced by the internal state of the animal. Internal states, such as arousal, are often characterized by relating neural activity to a single "level" of arousal, defined by a behavioral indicator such as pupil size. In this study, we expand the understanding of arousal-related modulations in sensory systems by uncovering multiple timescales of pupil dynamics and their relationship to neural activity. Specifically, we observed a robust coupling between spiking activity in the mouse dorsolateral geniculate nucleus (dLGN) of the thalamus and pupil dynamics across timescales spanning a few seconds to several minutes. Throughout all these timescales, two distinct spiking modes, individual tonic spikes and tightly clustered bursts of spikes, preferred opposite phases of pupil dynamics. This multi-scale coupling reveals modulations distinct from those captured by pupil size per se, locomotion, and eye movements. Furthermore, coupling persisted even during viewing of a naturalistic movie, where it contributed to differences in the encoding of visual information. We conclude that dLGN spiking activity is under the simultaneous influence of multiple arousal-related processes associated with pupil dynamics occurring over a broad range of timescales.
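One generic way to probe coupling between spiking and slow pupil dynamics at a chosen timescale is to band-pass the pupil trace, extract its instantaneous phase, and compute the circular mean phase at which spikes occur. The sketch below illustrates that idea on simulated data; the sampling rate, band edges, and spike model are assumptions, not the study's parameters.

```python
# Hedged sketch: phase preference of spikes relative to slow pupil dynamics.
# Sampling rate, band edges, and simulated data are illustrative assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 10.0                                   # assumed pupil sampling rate (Hz)
t = np.arange(0, 600, 1 / fs)               # ten minutes of recording
rng = np.random.default_rng(4)
pupil = np.sin(2 * np.pi * t / 30) + 0.3 * rng.normal(size=t.size)  # ~30 s oscillation + noise

# Band-pass the pupil trace around one timescale (here ~20-60 s periods).
sos = butter(2, [1 / 60, 1 / 20], btype="bandpass", fs=fs, output="sos")
pupil_band = sosfiltfilt(sos, pupil)
phase = np.angle(hilbert(pupil_band))       # instantaneous pupil phase

# Simulated spikes that prefer the dilating (rising) phase of the pupil cycle.
spiking = (phase > -np.pi / 2) & (phase < 0) & (rng.random(t.size) < 0.05)
spike_phases = phase[spiking]

preferred_phase = np.angle(np.mean(np.exp(1j * spike_phases)))  # circular mean phase
print(f"preferred pupil phase of spikes: {preferred_phase:.2f} rad")
```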


Subject(s)
Action Potentials , Arousal , Geniculate Bodies , Pupil , Animals , Pupil/physiology , Geniculate Bodies/physiology , Mice , Action Potentials/physiology , Arousal/physiology , Male , Mice, Inbred C57BL , Photic Stimulation/methods , Neurons/physiology , Thalamus/physiology , Eye Movements/physiology , Time Factors , Visual Pathways/physiology
11.
Sensors (Basel) ; 24(9)2024 Apr 23.
Article in English | MEDLINE | ID: mdl-38732772

ABSTRACT

In mobile eye-tracking research, the automatic annotation of fixation points is an important yet difficult task, especially in varied and dynamic environments such as outdoor urban landscapes. This complexity is increased by the constant movement and dynamic nature of both the observer and their environment in urban spaces. This paper presents a novel approach that integrates the capabilities of two foundation models, YOLOv8 and Mask2Former, as a pipeline to automatically annotate fixation points without requiring additional training or fine-tuning. Our pipeline leverages YOLO's extensive training on the MS COCO dataset for object detection and Mask2Former's training on the Cityscapes dataset for semantic segmentation. This integration not only streamlines the annotation process but also improves accuracy and consistency, ensuring reliable annotations, even in complex scenes with multiple objects side by side or at different depths. Validation through two experiments showcases its efficiency, achieving 89.05% accuracy in a controlled data collection and 81.50% accuracy in a real-world outdoor wayfinding scenario. With an average runtime per frame of 1.61 ± 0.35 s, our approach stands as a robust solution for automatic fixation annotation.
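As a heavily simplified illustration of the detection half of such a pipeline, the sketch below labels a fixation point with the YOLOv8 detection it lands in, assuming the ultralytics package; the Mask2Former segmentation stage is omitted, and the blank frame and fixation coordinates are placeholders. This is not the authors' implementation.

```python
# Hedged sketch: label a fixation with the YOLOv8 detection it lands in.
# Assumes the `ultralytics` package; the segmentation stage (Mask2Former) is omitted,
# and the frame here is a blank placeholder for a real scene-camera frame.
import numpy as np
from ultralytics import YOLO

def annotate_fixation(frame, fx, fy, model):
    """Return the class name of a detected object containing fixation (fx, fy), if any."""
    result = model(frame)[0]                      # run detection on one scene-camera frame
    for box, cls in zip(result.boxes.xyxy.tolist(), result.boxes.cls.tolist()):
        x1, y1, x2, y2 = box
        if x1 <= fx <= x2 and y1 <= fy <= y2:
            return model.names[int(cls)]          # e.g. "person", "car", ...
    return "background"                           # a full pipeline would fall back to segmentation

model = YOLO("yolov8n.pt")                        # detector pretrained on MS COCO
frame = np.zeros((720, 1280, 3), dtype=np.uint8)  # placeholder scene-camera frame
print(annotate_fixation(frame, 640.0, 360.0, model))
```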


Subject(s)
Eye-Tracking Technology , Fixation, Ocular , Humans , Fixation, Ocular/physiology , Video Recording/methods , Algorithms , Eye Movements/physiology
12.
Sensors (Basel) ; 24(9)2024 Apr 24.
Article in English | MEDLINE | ID: mdl-38732794

ABSTRACT

High-quality eye-tracking data are crucial in behavioral sciences and medicine. Even with a solid understanding of the literature, selecting the most suitable algorithm for a specific research project poses a challenge. Empowering applied researchers to choose the best-fitting detector for their research needs is the primary contribution of this paper. We developed a framework to systematically assess and compare the effectiveness of 13 state-of-the-art algorithms through a unified application interface. Hence, we more than double the number of algorithms that are currently usable within a single software package and allow researchers to identify the best-suited algorithm for a given scientific setup. Our framework validation on retrospective data underscores its suitability for algorithm selection. Through a detailed and reproducible step-by-step workflow, we hope to contribute towards significantly improved data quality in scientific experiments.
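The abstract does not describe the unified application interface itself, so the sketch below only illustrates the general idea: every detector exposes the same calling convention, and a simple velocity-threshold (I-VT) detector is wrapped behind it as one interchangeable algorithm. All class and function names here are invented.

```python
# Hedged illustration of a unified event-detector interface (names are invented).
# Each algorithm exposes the same detect() signature, so detectors are interchangeable.
from typing import Protocol
import numpy as np

class FixationDetector(Protocol):
    def detect(self, x: np.ndarray, y: np.ndarray, fs: float) -> np.ndarray:
        """Return one label per sample: 1 = fixation, 0 = saccade/other."""
        ...

class IVTDetector:
    """Simple velocity-threshold (I-VT) detector as one plug-in algorithm."""
    def __init__(self, velocity_threshold: float = 30.0):   # deg/s, assumed default
        self.velocity_threshold = velocity_threshold

    def detect(self, x, y, fs):
        vx, vy = np.gradient(x) * fs, np.gradient(y) * fs
        speed = np.hypot(vx, vy)
        return (speed < self.velocity_threshold).astype(int)

def run_detector(detector: FixationDetector, x, y, fs):
    return detector.detect(x, y, fs)

rng = np.random.default_rng(5)
x = np.cumsum(rng.normal(0, 0.05, 1000))
y = np.cumsum(rng.normal(0, 0.05, 1000))
labels = run_detector(IVTDetector(), x, y, fs=500.0)
print(f"{labels.mean():.0%} of samples labelled as fixation")
```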


Subject(s)
Algorithms , Eye-Tracking Technology , Humans , Software , Data Accuracy , Eye Movements/physiology , Reproducibility of Results
13.
PLoS One ; 19(5): e0293436, 2024.
Article in English | MEDLINE | ID: mdl-38723019

ABSTRACT

BACKGROUND: The free throw is an important means of scoring in basketball. As the level of competition and the intensity of play increase, the number of free throws in a game also increases, so free throw scoring has an important impact on the outcome of the game. The purpose of this study was to explore the relationship between visual attention characteristics and the free throw hit rate of basketball players during free throw psychological procedure training, in order to provide a scientific basis for basketball teaching and training. METHODS: Forty players with similar free throw abilities were randomly assigned to the experimental group (10 males, 10 females) and the control group (10 males, 10 females). The experimental group received free throw psychological procedure training, while the control group received routine training. Eye movement indices (number of fixations, fixation duration, and pupil diameter) and the free throw hit rate were recorded and analyzed before and after the experiment. Group differences were examined using t-tests, while paired sample t-tests were conducted to compare pre- and post-test results within each group. The training duration and number of training sessions were the same for the two groups. RESULTS: There were significant differences in fixation duration, number of fixations, pupil diameter, and free throw hit rate between pre-test and post-test in the experimental group (P < 0.05). Post-test, there were significant differences in number of fixations, fixation duration, pupil diameter, and free throw hit rate between the two groups (P < 0.05). There was a significant positive correlation between the number of fixations in the top region and the free throw hit rate (P < 0.01), and a significant positive correlation between fixation duration in the front region and the hit rate (P < 0.01). CONCLUSIONS: Psychological procedure training can improve the visual information search strategy and information processing ability for free throws and significantly improves the free throw hit rate. Fixation duration in the front region and the number of fixations in the top region were both positively correlated with the free throw hit rate.
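The statistics named in the Methods (between-group t-tests, paired pre/post t-tests, and correlations between fixation measures and hit rate) map directly onto standard SciPy calls. The sketch below uses invented numbers purely to show those calls; it does not reproduce the study's data.

```python
# Hedged sketch of the statistics described in the Methods (data are invented).
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
pre_hit_rate = rng.normal(0.60, 0.05, 20)                    # experimental group, pre-test
post_hit_rate = pre_hit_rate + rng.normal(0.08, 0.03, 20)    # experimental group, post-test
control_post = rng.normal(0.62, 0.05, 20)                    # control group, post-test
fixation_duration = rng.normal(300, 40, 20)                  # ms, per player

# Paired t-test: within-group pre- vs. post-test comparison.
t_paired, p_paired = stats.ttest_rel(pre_hit_rate, post_hit_rate)
# Independent t-test: between-group post-test comparison.
t_ind, p_ind = stats.ttest_ind(post_hit_rate, control_post)
# Pearson correlation: fixation measure vs. free throw hit rate.
r, p_r = stats.pearsonr(fixation_duration, post_hit_rate)

print(f"paired t={t_paired:.2f} (p={p_paired:.3f}); "
      f"independent t={t_ind:.2f} (p={p_ind:.3f}); r={r:.2f} (p={p_r:.3f})")
```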


Subject(s)
Basketball , Fixation, Ocular , Humans , Male , Female , Basketball/psychology , Young Adult , Fixation, Ocular/physiology , Athletic Performance/physiology , Athletic Performance/psychology , Attention/physiology , Eye Movements/physiology , Adult
14.
PLoS One ; 19(5): e0302872, 2024.
Article in English | MEDLINE | ID: mdl-38768134

ABSTRACT

Whether a saccade is accurate and has reached the target cannot be evaluated during its execution, but relies on post-saccadic feedback. If the eye has missed the target object, a secondary corrective saccade has to be made to align the fovea with the target. If a systematic post-saccadic error occurs, adaptive changes to the oculomotor behavior are made, such as shortening or lengthening the saccade amplitude. Systematic post-saccadic errors are typically attributed internally to erroneous motor commands. The corresponding adaptive changes to the motor command reduce the error and the need for secondary corrective saccades, and, in doing so, restore accuracy and efficiency. However, adaptive changes to the oculomotor behavior also occur if a change in saccade amplitude is beneficial for task performance, or if it is rewarded. Oculomotor learning thus is more complex than reducing a post-saccadic position error. In the current study, we used a novel oculomotor learning paradigm and investigated whether human participants are able to adapt their oculomotor behavior to improve task performance even when they attribute the error externally. The task was to indicate the intended target object among several objects to a simulated human-machine interface by making eye movements. The participants were informed that the system itself could make errors. The decoding process depended on a distorted landing point of the saccade, resulting in decoding errors. Two different types of visual feedback were added to the post-saccadic scene and we compared how participants used the different feedback types to adjust their oculomotor behavior to avoid errors. We found that task performance improved over time, regardless of the type of feedback. Thus, error feedback from the simulated human-machine interface was used for post-saccadic error evaluation. This indicates that 1) artificial visual feedback signals and 2) externally caused errors might drive adaptive changes to oculomotor behavior.


Subject(s)
Saccades , Humans , Saccades/physiology , Adult , Male , Female , Eye Movements/physiology , Young Adult , Psychomotor Performance/physiology , Learning/physiology
15.
Sci Rep ; 14(1): 9996, 2024 05 01.
Article in English | MEDLINE | ID: mdl-38693184

ABSTRACT

Tracking a moving object with the eyes seems like a simple task but involves areas of prefrontal cortex (PFC) associated with attention, working memory and prediction. Increasing the demand on these processes with secondary tasks can affect eye movements and/or perceptual judgments. This is particularly evident in chronic or acute neurological conditions such as Alzheimer's disease or mild traumatic brain injury. Here, we combined near infrared spectroscopy and video-oculography to examine the effects of concurrent upper limb movement, which provides additional afference and efference that facilitates tracking of a moving object, in a novel dual-task pursuit protocol. We confirmed the expected effects on judgement accuracy in the primary and secondary tasks, as well as a reduction in eye velocity when the moving object was occluded. Although there was limited evidence of oculo-manual facilitation on behavioural measures, performing concurrent upper limb movement did result in lower activity in left medial PFC, as well as a change in PFC network organisation, which was shown by Graph analysis to be locally and globally more efficient. These findings extend upon previous work by showing how PFC is functionally organised to support eye-hand coordination when task demands more closely replicate daily activities.
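The graph measures mentioned (local and global efficiency of the prefrontal network) have standard definitions and are available in NetworkX. The sketch below builds a small graph from a thresholded channel-correlation matrix on simulated signals; the channel count, correlation threshold, and data are assumptions.

```python
# Hedged sketch: local and global efficiency of a channel network (simulated data).
import numpy as np
import networkx as nx

rng = np.random.default_rng(7)
shared = rng.normal(size=2000)                         # common component -> correlated channels
signals = 0.5 * shared + rng.normal(size=(16, 2000))   # 16 fNIRS channels x time samples
corr = np.corrcoef(signals)                            # channel-by-channel correlation matrix

threshold = 0.1                                        # assumed edge threshold
adjacency = (np.abs(corr) > threshold).astype(int)
np.fill_diagonal(adjacency, 0)
G = nx.from_numpy_array(adjacency)                     # unweighted network of co-fluctuating channels

print("global efficiency:", nx.global_efficiency(G))
print("local efficiency:", nx.local_efficiency(G))
```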


Subject(s)
Prefrontal Cortex , Upper Extremity , Humans , Prefrontal Cortex/physiology , Male , Female , Upper Extremity/physiology , Adult , Young Adult , Movement/physiology , Psychomotor Performance/physiology , Eye Movements/physiology , Spectroscopy, Near-Infrared , Attention/physiology
16.
Opt Lett ; 49(9): 2489-2492, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38691751

ABSTRACT

Point scanning retinal imaging modalities, including confocal scanning light ophthalmoscopy (cSLO) and optical coherence tomography, suffer from fixational motion artifacts. Fixation targets, though effective at reducing eye motion, are infeasible in some applications (e.g., handheld devices) due to their bulk and complexity. Here, we report on a cSLO device that scans the retina in a spiral pattern under pseudo-visible illumination, thus collecting image data while simultaneously projecting, into the subject's vision, the image of a bullseye, which acts as a virtual fixation target. An imaging study of 14 young adult volunteers was conducted to compare the fixational performance of this technique to that of raster scanning, with and without a discrete inline fixation target. Image registration was used to quantify subject eye motion; a strip-wise registration method was used for raster scans, and a novel, to the best of our knowledge, ring-based method was used for spiral scans. Results indicate a statistically significant reduction in eye motion by the use of spiral scanning as compared to raster scanning without a fixation target.
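The contrast between raster and spiral scan trajectories can be made concrete with a few lines of code. The sketch below generates an Archimedean spiral and a raster covering the same field; the field size, number of turns, and sample counts are made-up parameters, not the device's.

```python
# Hedged sketch: an Archimedean spiral scan trajectory vs. a raster scan.
# Field size, turns, and sample counts are made-up parameters.
import numpy as np

def spiral_scan(n_turns=40, samples_per_turn=512, radius=1.0):
    """Archimedean spiral from centre to edge; radius grows linearly with angle."""
    theta = np.linspace(0, 2 * np.pi * n_turns, n_turns * samples_per_turn)
    r = radius * theta / theta[-1]
    return r * np.cos(theta), r * np.sin(theta)

def raster_scan(n_lines=40, samples_per_line=512, extent=1.0):
    """Simple left-to-right raster over the same field."""
    x = np.tile(np.linspace(-extent, extent, samples_per_line), n_lines)
    y = np.repeat(np.linspace(-extent, extent, n_lines), samples_per_line)
    return x, y

sx, sy = spiral_scan()
rx, ry = raster_scan()
print(sx.shape, rx.shape)   # both trajectories sample the field with the same number of points
```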


Subject(s)
Fixation, Ocular , Ophthalmoscopy , Retina , Humans , Retina/diagnostic imaging , Fixation, Ocular/physiology , Ophthalmoscopy/methods , Adult , Young Adult , Eye Movements
17.
Nat Commun ; 15(1): 3692, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38693186

ABSTRACT

Over the last decades, cognitive neuroscience has identified a distributed set of brain regions that are critical for attention. Strong anatomical overlap with brain regions critical for oculomotor processes suggests a joint network for attention and eye movements. However, the role of this shared network in complex, naturalistic environments remains understudied. Here, we investigated eye movements in relation to (un)attended sentences of natural speech. Combining simultaneously recorded eye tracking and magnetoencephalographic data with temporal response functions, we show that gaze tracks attended speech, a phenomenon we termed ocular speech tracking. Ocular speech tracking even differentiates a target from a distractor in a multi-speaker context and is further related to intelligibility. Moreover, we provide evidence for its contribution to neural differences in speech processing, emphasizing the necessity to consider oculomotor activity in future research and in the interpretation of neural differences in auditory cognition.
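Temporal response functions of the kind used to relate a continuous neural or ocular signal to the attended speech envelope are commonly estimated as a lagged ridge regression. The sketch below shows that generic estimator on simulated signals; the lag range, regularization, and data are assumptions, not the study's parameters.

```python
# Hedged sketch: a temporal response function (TRF) via lagged ridge regression,
# relating a speech envelope to a continuous gaze signal. Parameters are assumptions.
import numpy as np

def lagged_design(stimulus, lags):
    """Stack time-shifted copies of the stimulus as regression features."""
    X = np.zeros((stimulus.size, lags.size))
    for j, lag in enumerate(lags):
        X[lag:, j] = stimulus[:stimulus.size - lag]
    return X

fs = 100.0
lags = np.arange(0, int(0.5 * fs))                            # 0-500 ms of lags
rng = np.random.default_rng(8)
envelope = np.abs(rng.normal(size=6000))                      # simulated speech envelope
true_trf = np.exp(-((np.arange(lags.size) - 15) ** 2) / 50)   # simulated response, peak ~150 ms
gaze = np.convolve(envelope, true_trf)[:envelope.size] + rng.normal(size=6000)

X = lagged_design(envelope, lags)
lam = 1.0                                                     # assumed ridge regularization
trf = np.linalg.solve(X.T @ X + lam * np.eye(lags.size), X.T @ gaze)
print("estimated TRF peak lag:", lags[np.argmax(trf)] / fs * 1000, "ms")
```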


Subject(s)
Attention , Eye Movements , Magnetoencephalography , Speech Perception , Speech , Humans , Attention/physiology , Eye Movements/physiology , Male , Female , Adult , Young Adult , Speech Perception/physiology , Speech/physiology , Acoustic Stimulation , Brain/physiology , Eye-Tracking Technology
18.
Sci Rep ; 14(1): 10040, 2024 05 02.
Article in English | MEDLINE | ID: mdl-38693189

ABSTRACT

Investigation of visual illusions helps us understand how we process visual information. For example, face pareidolia, the misperception of illusory faces in objects, could be used to understand how we process real faces. However, it remains unclear whether this illusion emerges from errors in face detection or from slower, cognitive processes. Here, our logic is straightforward: if examples of face pareidolia activate the mechanisms that rapidly detect faces in visual environments, then participants will look at objects more quickly when the objects also contain illusory faces. To test this hypothesis, we sampled continuous eye movements during a fast saccadic choice task in which participants were required to select either faces or food items. During this task, pairs of stimuli were positioned close to the initial fixation point or further away, in the periphery. As expected, the participants were faster to look at face targets than food targets. Importantly, we also discovered an advantage for food items with illusory faces, but this advantage was limited to the peripheral condition. These findings are among the first to demonstrate that the face pareidolia illusion persists in the periphery and, thus, it is likely to be a consequence of erroneous face detection.


Subject(s)
Illusions , Humans , Female , Male , Adult , Illusions/physiology , Young Adult , Visual Perception/physiology , Photic Stimulation , Face/physiology , Facial Recognition/physiology , Eye Movements/physiology , Pattern Recognition, Visual/physiology
19.
Sensors (Basel) ; 24(10)2024 May 08.
Article in English | MEDLINE | ID: mdl-38793839

ABSTRACT

Understanding human actions often requires in-depth detection and interpretation of bio-signals. Early eye disengagement from the target (EEDT) is a significant eye behavior that involves the proactive disengagement of gaze from the target to gather information on the anticipated pathway, thereby enabling rapid reactions to the environment. It remains unknown how task difficulty and task repetition affect EEDT, and we aim to provide direct evidence of how these factors influence it. We developed a visual tracking task in which participants viewed arrow movement videos while their eye movements were tracked. Task complexity was increased by increasing the number of movement steps. Every movement pattern was performed twice to assess the effect of repetition on eye movement. Participants were required to recall the movement patterns for recall accuracy evaluation and to complete a cognitive load assessment. EEDT was quantified by the fixation duration and frequency within areas ahead of the arrow. When task difficulty increased, we found that the recall accuracy score decreased, the cognitive load increased, and EEDT decreased significantly. EEDT was higher in the second trial, but the difference was only significant in tasks with lower complexity. EEDT was positively correlated with recall accuracy and negatively correlated with cognitive load. Thus, EEDT was reduced by task complexity and increased by task repetition. EEDT may be a promising sensory measure for assessing task performance and cognitive load and can be used for the future development of eye-tracking-based sensors.
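The EEDT measure described here (fixation duration and frequency within areas ahead of the arrow) can be computed from a fixation list once the arrow position at each fixation is known. The sketch below uses invented column names, coordinates, and a lead margin purely for illustration.

```python
# Hedged sketch: quantify early eye disengagement (EEDT) as fixations landing
# ahead of the arrow's current position. Column names and geometry are invented.
import pandas as pd

fixations = pd.DataFrame({
    "x": [310, 420, 515, 600, 640],          # fixation x (px)
    "duration": [220, 180, 260, 200, 240],   # fixation duration (ms)
    "arrow_x": [300, 300, 430, 430, 620],    # arrow x at fixation onset (px)
})

lead_margin = 50                             # how far ahead of the arrow counts as "ahead" (px)
ahead = fixations["x"] > fixations["arrow_x"] + lead_margin

eedt_frequency = int(ahead.sum())                              # number of ahead-of-target fixations
eedt_duration = float(fixations.loc[ahead, "duration"].sum())  # total time spent ahead (ms)
print(eedt_frequency, eedt_duration)
```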


Subject(s)
Eye Movements , Eye-Tracking Technology , Humans , Male , Eye Movements/physiology , Female , Adult , Young Adult , Task Performance and Analysis , Cognition/physiology , Fixation, Ocular/physiology
20.
Sci Rep ; 14(1): 11188, 2024 05 16.
Article in English | MEDLINE | ID: mdl-38755251

ABSTRACT

In primates, foveal and peripheral vision have distinct neural architectures and functions. However, it has been debated if selective attention operates via the same or different neural mechanisms across eccentricities. We tested these alternative accounts by examining the effects of selective attention on the steady-state visually evoked potential (SSVEP) and the fronto-parietal signal measured via EEG from human subjects performing a sustained visuospatial attention task. With a negligible level of eye movements, both SSVEP and SND exhibited the heterogeneous patterns of attentional modulations across eccentricities. Specifically, the attentional modulations of these signals peaked at the parafoveal locations and such modulations wore off as visual stimuli appeared closer to the fovea or further away towards the periphery. However, with a relatively higher level of eye movements, the heterogeneous patterns of attentional modulations of these neural signals were less robust. These data demonstrate that the top-down influence of covert visuospatial attention on early sensory processing in human cortex depends on eccentricity and the level of saccadic responses. Taken together, the results suggest that sustained visuospatial attention operates differently across different eccentric locations, providing new understanding of how attention augments sensory representations regardless of where the attended stimulus appears.
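An SSVEP is typically quantified as the EEG amplitude at the stimulus flicker frequency, so a minimal illustration is a single FFT read-out; attentional modulation would then compare this amplitude between attended and unattended conditions. The flicker frequency, sampling rate, and data below are assumptions.

```python
# Hedged sketch: SSVEP amplitude at an assumed flicker frequency (simulated data).
import numpy as np

fs = 500.0                                  # assumed EEG sampling rate (Hz)
flicker = 12.0                              # assumed stimulus flicker frequency (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(9)
eeg = 2.0 * np.sin(2 * np.pi * flicker * t) + rng.normal(0, 5, t.size)

spectrum = np.abs(np.fft.rfft(eeg)) / t.size * 2    # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)
ssvep_amplitude = spectrum[np.argmin(np.abs(freqs - flicker))]
print(f"SSVEP amplitude at {flicker} Hz: {ssvep_amplitude:.2f} (arbitrary units)")

# An attentional modulation index could then compare attended vs. unattended trials,
# e.g. (A_attended - A_unattended) / (A_attended + A_unattended).
```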


Subject(s)
Attention , Electroencephalography , Evoked Potentials, Visual , Humans , Attention/physiology , Male , Female , Evoked Potentials, Visual/physiology , Adult , Young Adult , Photic Stimulation , Visual Perception/physiology , Eye Movements/physiology