Results 1 - 20 of 2,314
1.
J Neural Eng ; 2024 Jul 03.
Article in English | MEDLINE | ID: mdl-38959876

ABSTRACT

OBJECTIVE: Patients with severe paralysis or locked-in syndrome can regain communication using a brain-computer interface (BCI). Visual event-related potential (ERP)-based BCI paradigms exploit visuospatial attention (VSA) to targets laid out on a screen. However, performance drops if the user does not direct their eye gaze at the intended target, limiting the utility of this class of BCIs for patients with eye motor deficits. We aim to create an ERP decoder that is less dependent on eye gaze. METHODS: ERP component latency jitter plays a role in covert VSA decoding. We introduce a novel decoder that compensates for these latency effects, termed Woody classifier-based latency estimation (WCBLE). We carried out a BCI experiment recording ERP data under overt and covert VSA, and introduce a novel special case of covert VSA, termed split VSA, simulating the experience of patients with severely impaired eye motor control. We evaluated WCBLE on this dataset and on the BNCI2014-009 dataset, within and across VSA conditions, to study the dependency on eye gaze and its variation during the experiment. RESULTS & DISCUSSION: WCBLE outperforms state-of-the-art methods in gaze-independent decoding in the VSA conditions of interest, without reducing overt VSA performance. Across-condition evaluation shows that WCBLE is more robust to varying VSA conditions throughout a BCI operation session. Together, these results point towards a pathway to gaze independence through suitable ERP decoding. Our gaze-independent solution enhances decoding performance in cases where performing overt VSA is not possible.
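The Woody procedure on which WCBLE builds estimates per-trial ERP latencies by iteratively cross-correlating each trial with a running template. A minimal NumPy sketch of that classic step (our own illustration; function and variable names are ours, not the authors' WCBLE code):

```python
import numpy as np

def woody_latencies(trials, max_iter=10):
    """Estimate per-trial latency shifts (in samples) by iteratively
    cross-correlating each trial with the shift-corrected average."""
    n_trials, n_samples = trials.shape
    shifts = np.zeros(n_trials, dtype=int)
    template = trials.mean(axis=0)
    for _ in range(max_iter):
        new_shifts = np.empty(n_trials, dtype=int)
        for i, trial in enumerate(trials):
            # full cross-correlation; the peak index gives the best lag
            xc = np.correlate(trial, template, mode="full")
            new_shifts[i] = np.argmax(xc) - (n_samples - 1)
        if np.array_equal(new_shifts, shifts):
            break  # latency estimates have converged
        shifts = new_shifts
        # rebuild the template from shift-corrected trials
        template = np.mean(
            [np.roll(trial, -s) for trial, s in zip(trials, shifts)], axis=0)
    return shifts, template
```

Subtracting the estimated shift from each trial before averaging or classification is what compensates for the latency jitter that otherwise smears covert-VSA ERP components.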

2.
Am J Primatol ; : e23660, 2024 Jul 04.
Article in English | MEDLINE | ID: mdl-38961748

ABSTRACT

Characterizing individual differences in cognition is crucial for understanding the evolution of cognition and for testing the biological consequences of different cognitive traits. Here, we harnessed the strengths of a uniquely large, naturally living primate population at the Cayo Santiago Biological Field Station to characterize individual differences in rhesus monkey performance across two social cognitive tasks. A total of n = 204 semi-free-ranging adult rhesus monkeys participated in a data collection procedure in which we aimed to test individuals on both tasks at two time points one year apart. In the socioemotional responses task, we assessed monkeys' attention to conspecific photographs with neutral versus negative emotional expressions. We found that monkeys showed overall declines in interest in conspecific photographs with age but relative increases in attention to threat stimuli specifically, and that these responses exhibited long-term stability across repeated testing. In the gaze following task, we assessed monkeys' propensity to co-orient with an experimenter. Here, we found no evidence for age-related change in responses, and responses showed only limited repeatability over time. Finally, we found some evidence of common individual variation in performance across the tasks: monkeys that showed greater interest in conspecific photographs were more likely to follow a human's gaze. These results show how studies of comparative cognitive development and aging can provide insights into the evolution of cognition, and they identify core primate social cognitive traits that may be related across and within individuals.

3.
J Eye Mov Res ; 17(1)2024.
Article in English | MEDLINE | ID: mdl-38966235

ABSTRACT

Gaze behaviour has been used as a proxy for the information processing capabilities that underlie complex skill performance in real-world domains such as aviation. These processes are highly influenced by task requirements and expertise, and can provide insight into situation awareness (SA). Little research has examined the extent to which gaze behaviour, task performance, and SA are impacted by various task manipulations during early-stage skill development. Accordingly, the current study aimed to understand the impact of task difficulty on landing performance, gaze behaviour, and SA across different phases of flight. Twenty-four low-time (<300 hours) pilots completed simulated landing scenarios under visual flight rules conditions. Traditional gaze metrics, entropy-based metrics, and blink rate provided meaningful insight into the extent to which information processing is modulated by flight phase and task difficulty. The results also suggested that changes in gaze behaviour compensated for increased task demands and minimized the impact on task performance. Dynamic gaze analyses were shown to be a robust measure of task difficulty and pilot flight hours. Recommendations for the effective implementation of gaze behaviour metrics and their utility in examining information processing changes are discussed.

4.
Front Psychol ; 15: 1377379, 2024.
Article in English | MEDLINE | ID: mdl-38947900

ABSTRACT

Individuals diagnosed with attention deficit/hyperactivity disorder (ADHD) have been found to have impairments in multiple aspects of social cognition, including the attentional processing of socially relevant stimuli such as eye-gaze. However, to date, it remains unclear whether the social-specific, but not the domain-general directional, components elicited by eye-gaze are affected by ADHD symptomatology. To address this issue, the present study investigated the impact of ADHD-like traits on the social-specific attentional processing of eye-gaze. To this end, we conducted an online experiment with a sample of 140 healthy undergraduate participants, who completed two self-report questionnaires designed to assess ADHD-like traits and a social variant of a spatial interference task known to effectively isolate the social-specific component of eye-gaze. To make our research plan transparent, our hypotheses and analysis plans were registered before data exploration. Results showed that while the social-specific component of eye-gaze was evident in the sample, no significant correlation was found between this component and the measured ADHD-like traits. These results appear to contradict the intuition that the attentional processing of the social-specific component of eye-gaze is impaired by ADHD symptomatology. However, further research involving children and clinical populations is needed to clarify this matter.

5.
World Neurosurg ; 2024 Jul 04.
Article in English | MEDLINE | ID: mdl-38971495

ABSTRACT

Vertical 'half-and-half' syndrome, characterized by contralateral upward and ipsilateral downward gaze palsy, is a rare variant of vertical eye movement disturbance. Similarly, pseudo-abducens palsy, manifesting as an abduction deficit in the absence of a pontine lesion, constitutes another rare type of eye movement disturbance. Both conditions have been associated with lesions in the thalamo-mesencephalic junction. We present a rare case of a patient exhibiting vertical 'half-and-half' syndrome with ipsilateral pseudo-abducens palsy following a left lacunar infarction of the thalamo-mesencephalic junction, and we discuss the potential mechanisms underlying this rare combination of eye movement disorders.

6.
Dev Cogn Neurosci ; 68: 101401, 2024 Jun 10.
Article in English | MEDLINE | ID: mdl-38870603

ABSTRACT

Infants' motivation to engage with the social world depends on the interplay between an individual brain's characteristics and previous exposure to social cues such as a parent's smile or eye contact. Different hypotheses about why specific combinations of emotional expressions and gaze direction engage children have been tested with group-level approaches rather than by focusing on individual differences in social brain development. Here, a novel artificial intelligence-enhanced brain-imaging approach, Neuroadaptive Bayesian Optimisation (NBO), was applied to infant electroencephalography (EEG) to understand how selected neural signals encode social cues in individual infants. EEG data from 42 6- to 9-month-old infants looking at images of their parent's face were analysed in real time and used by a Bayesian Optimisation algorithm to identify which combination of the parent's gaze/head direction and emotional expression produced the strongest brain activation in each child. This individualised approach supported the theory that the infant brain is maximally engaged by communicative cues with a negative valence (angry faces with direct gaze). Infants attending preferentially to faces with direct gaze had increased positive affectivity and decreased negative affectivity. This work confirms that infants' attentional preferences for social cues are heterogeneous and shows the potential of NBO for studying diversity in neurodevelopmental trajectories.
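The core of such a neuroadaptive loop is a Bayesian optimiser that proposes the next stimulus from the responses observed so far. A minimal sketch with scikit-learn (illustrative only: the discrete stimulus grid, the upper-confidence-bound rule, and all names here are our assumptions, not the study's NBO implementation):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def nbo_loop(stimuli, measure_response, n_iter=10, seed=0):
    """Pick stimuli one at a time: fit a GP to the responses seen so
    far, then present the untested stimulus with the highest upper
    confidence bound. `measure_response` stands in for presenting a
    stimulus and reading out the evoked brain response."""
    rng = np.random.default_rng(seed)
    X = np.asarray(stimuli, float)
    tried = [int(rng.integers(len(X)))]          # random first stimulus
    ys = [measure_response(X[tried[0]])]
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), optimizer=None)
    for _ in range(n_iter - 1):
        gp.fit(X[tried], np.asarray(ys))
        mu, sd = gp.predict(X, return_std=True)
        acq = mu + 1.96 * sd                     # upper confidence bound
        acq[tried] = -np.inf                     # never re-present a stimulus
        nxt = int(np.argmax(acq))
        tried.append(nxt)
        ys.append(measure_response(X[nxt]))
    return X[tried[int(np.argmax(ys))]]          # stimulus with strongest response
```

In the real experiment the "response" would be the infant's EEG feature of interest, and the stimulus grid would span gaze/head direction and emotional expression.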

7.
Ophthalmol Glaucoma ; 2024 May 27.
Article in English | MEDLINE | ID: mdl-38823680

ABSTRACT

PURPOSE: To evaluate the agreement between 24-2 visual field (VF) test results obtained using the gaze analyzing perimeter (GAP; FINDEX, Tokyo, Japan) and the Humphrey field analyzer (HFA; Carl Zeiss Meditec, Dublin, CA, USA). DESIGN: Cross-sectional study. PARTICIPANTS: Patients who underwent HFA 24-2 testing for suspected or confirmed VF loss and were treated at Kyoto University Hospital between December 2022 and July 2023. METHODS: Patients underwent consecutive VF tests on the same eye using the HFA and GAP 24-2 tests. Bland-Altman analysis was used to compare GAP and HFA results. Examination points where the sensitivity measured using GAP was ≥10 dB higher than that measured using HFA were reevaluated by referring back to the original gaze data; two ophthalmologists assessed whether the gaze moved linearly toward the new test target. MAIN OUTCOME MEASURES: Mean deviation (MD) and elapsed time on an individual basis, and sensitivity on an examination point basis. RESULTS: Forty-seven eyes of 47 patients were analyzed. The correlation coefficient of the MD between HFA and GAP was 0.811 (95% confidence interval [CI]: 0.683-0.891). Bland-Altman analysis showed good agreement between the HFA and GAP tests: the mean difference (95% limits of agreement [LOA]) in MD was -0.63 dB (-5.81 to 4.54 dB). Although no statistically significant difference was observed in the elapsed time (P = 0.99), measurements completed within 200 s were observed only in the GAP group (11 cases, 23.4%), who had significantly better HFA MD values than the others (P = 0.001). On an examination point basis, the correlation coefficient of sensitivity between HFA and GAP was 0.691 (95% LOA, 0.670-0.711). Assessment of the original gaze data revealed that the gaze moved linearly toward the new test target for 70.2% of the examination points with a sensitivity discrepancy. CONCLUSIONS: The results indicate that the GAP provides VF assessment outcomes comparable to those of the HFA. The GAP exhibited advantages in testing time, particularly in patients with minimal VF impairment. Furthermore, the GAP records all eye movements, enabling objective determination of VF abnormalities based on gaze patterns and facilitating easy post hoc verification.
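The Bland-Altman agreement statistics reported above (a mean difference with 95% limits of agreement) are straightforward to compute. A minimal NumPy sketch (our own illustration, not the study's code):

```python
import numpy as np

def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement between two
    paired measurement methods (here, HFA vs. GAP mean deviation)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)          # sample SD of the paired differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

Applied to the 47 paired MD values, this yields numbers of exactly the form quoted in the abstract: -0.63 dB (-5.81 to 4.54 dB).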

8.
Ergonomics ; : 1-21, 2024 Jun 04.
Article in English | MEDLINE | ID: mdl-38832783

ABSTRACT

The affective experience generated when users play computer games can influence their attitude towards, and preference for, the game. Existing evaluation methods depend mainly on subjective scales and physiological signals, but both have limitations (e.g., subjective scales lack objectivity, and physiological signals are complicated to acquire and interpret). In this paper, we (1) propose a novel method to assess user affective experience when playing single-player games based on pleasure-arousal-dominance (PAD) emotions, facial expressions, and gaze directions, and (2) build an artificial intelligence model to identify user preference. Fifty-four subjects participated in a basketball experiment with three difficulty levels. Their expressions, gaze directions, and subjective PAD emotions were collected and analysed. Experimental results showed that the expression intensities of angry, sad, and neutral, the yaw angle of gaze direction, and PAD emotions varied significantly across difficulties. In addition, the proposed model achieved better performance than other machine-learning algorithms on the collected dataset.


This paper considers the limitations of existing methods for assessing user affective experience when playing computer games. It demonstrates a novel approach using subjective emotion and objective facial cues to identify user affective experience and user preference for the game.

9.
J Clin Sleep Med ; 2024 Jun 17.
Article in English | MEDLINE | ID: mdl-38881507

ABSTRACT

Kleine-Levin syndrome (KLS) is a rare, recurring sleep disorder that is easily overlooked. Episodic upward-gaze palsy is an uncommon manifestation observed in patients with KLS, which further complicates this disorder. Although peripheral microbial infections have been recognized as the most common triggers for KLS, the underlying pathophysiology of the disorder remains unclear. We report a unique case of KLS elicited by acute encephalitis, confirmed by pleocytosis of the cerebrospinal fluid (CSF) at an early stage. The CSF returned to normal over time while the attacks continued to recur frequently. Episodic upward-gaze palsy was observed during attacks, and clinical symptoms were exacerbated following a subsequent COVID-19 infection. This report presents a classic KLS case with distinctive characteristics, which should facilitate more accurate and earlier diagnosis for clinicians. Furthermore, it provides a new perspective for understanding the pathogenesis of this rare disease.

10.
Article in English | MEDLINE | ID: mdl-38888820

ABSTRACT

PURPOSE: To facilitate the integration of point of gaze (POG) as an input modality for robot-assisted surgery, we introduce a gaze tracking system with robust head movement compensation for the da Vinci Surgical System. Previous surgical eye gaze trackers require multiple recalibrations and suffer from accuracy loss when users move from the calibrated position. We investigate whether eye corner detection can reduce gaze estimation error in a robotic surgery context. METHODS: A polynomial regressor is first used to estimate POG after an 8-point calibration; then, using another regressor, the POG error from head movement is estimated from the shift in 2D eye corner location. Eye corners are computed by first detecting regions of interest using the You Only Look Once (YOLO) object detector trained on 1600 annotated eye images (open dataset included). Contours are then extracted from the bounding boxes, and a derivative-based curvature detector refines the eye corner. RESULTS: In a user study (n = 24), our corner-contingent head compensation algorithm reduced error in degrees of visual angle by 1.20° (p = 0.037) for the left eye and 1.26° (p = 0.079) for the right compared with the previous gold-standard POG error correction method. In addition, the eye corner pipeline showed a root-mean-square error of 3.57 (SD = 1.92) pixels in detecting eye corners over 201 annotated frames. CONCLUSION: We introduce an effective method of using eye corners to correct eye gaze estimation, enabling the practical acquisition of POG in robotic surgery.
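A polynomial calibration regressor of the kind described can be sketched with ordinary least squares. This is an illustrative second-order mapping from a 2-D eye feature to screen coordinates; the paper's exact feature set and basis are not specified here, so the expansion below is our assumption:

```python
import numpy as np

def poly_features(e):
    # second-order polynomial expansion of the 2-D eye feature (x, y)
    x, y = e[:, 0], e[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_gaze_mapping(eye_pts, screen_pts):
    """Least-squares polynomial regressor from eye features to screen
    coordinates, as in a standard n-point calibration."""
    A = poly_features(eye_pts)
    coef, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)
    return coef

def predict_pog(coef, eye_pts):
    return poly_features(eye_pts) @ coef
```

The corner-based compensation would then be a second regressor of the same shape, mapping the observed 2-D eye-corner shift to a POG correction added to this prediction.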

11.
Front Psychol ; 15: 1324667, 2024.
Article in English | MEDLINE | ID: mdl-38882511

ABSTRACT

Research on the adaptations talkers make to different communication conditions during interactive conversations has primarily focused on speech signals. We extended this type of investigation to two other important communicative signals, partner-directed gaze and iconic co-speech hand gestures, with the aim of determining whether the adaptations made by older adults differ from those of younger adults across communication conditions. We recruited 57 pairs of participants, comprising 57 primary talkers and 57 secondary talkers. Primary talkers consisted of three groups: 19 older adults with mild hearing loss (older adult-HL), 17 older adults with normal hearing (older adult-NH), and 21 younger adults. The DiapixUK "spot the difference" conversation-based task was used to elicit conversations in participant pairs. One easy (No Barrier: NB) and three difficult communication conditions were tested. The three difficult conditions consisted of two in which the primary talker could hear clearly but the secondary talker could not, due to multi-talker babble noise (BAB1) or a less familiar hearing loss simulation (HLS), and one in which both the primary and secondary talkers heard each other in babble noise (BAB2). For primary talkers, we measured the mean number of partner-directed gazes, the mean total gaze duration, and the mean number of co-speech hand gestures. We found robust effects of communication condition that interacted with participant group. Effects of age were found for both gaze and gesture in BAB1: older adult-NH participants looked and gestured less than younger adults did when the secondary talker experienced babble noise. For hearing status, a difference in gaze between older adult-NH and older adult-HL was found in the BAB1 condition; for gesture, this difference was significant in all three difficult communication conditions (older adult-HL participants gazed and gestured more). We propose that the age effect may be due to a decline in older adults' attention to cues signaling how well a conversation is progressing. To explain the hearing status effect, we suggest that this attentional decline is offset by hearing loss because these participants have learned to pay greater attention to visual cues for understanding speech.

12.
Neurohospitalist ; 14(3): 264-272, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38895013

ABSTRACT

Background and Purpose: Radiographic horizontal gaze deviation (RHGD) has been identified as a useful finding on computed tomography (CT) that indicates the affected side in supratentorial ischemic stroke; however, it remains unclear whether RHGD is essentially the same phenomenon as physical horizontal gaze deviation (PHGD). This study was conducted to resolve that question. Methods: Retrospective analyses were performed for 671 patients with ischemic stroke and 142 controls who were hospitalized and underwent head CT. First, clinical findings were examined for differences between RHGD-positive and RHGD-negative patients. Second, patients were classified by their stroke mechanisms and/or affected vascular territories, and for each subgroup the frequency of RHGD was compared with that of PHGD. Third, the proportions of patients, divided by positivity for PHGD and RHGD, were calculated in the subgroups. Results: Patients with RHGD had PHGD more often than those without. In all stroke subgroups, RHGD was more frequent than PHGD. The frequency difference was prominent in small-artery occlusion (SAO) and posterior inferior cerebellar artery (PICA) stroke. In SAO of the basilar artery pontine perforator, RHGD was positive in 25% and largely directed contralesionally. In PICA stroke, lesions in the vestibulocerebellum were associated with contralesional RHGD; lesions in the lateral medulla also caused RHGD, mainly directed to the ipsilesional side. PHGD-positive stroke without RHGD was infrequent, whereas RHGD-positive stroke without PHGD was commonly observed (PICA stroke, 45.9%; other subgroups, 21.1%-27.5%). Conclusions: RHGD had different characteristics from PHGD; therefore, assessing both PHGD and RHGD may lead to more accurate diagnoses.

13.
Cogn Emot ; : 1-14, 2024 Jun 11.
Article in English | MEDLINE | ID: mdl-38863208

ABSTRACT

The auditory gaze cueing effect (auditory-GCE) is a faster response to auditory targets at an eye-gaze cue location than at a non-cued location. Previous research has found that the auditory-GCE can be influenced by the integration of gaze direction and emotion conveyed through facial expressions. However, it is unclear whether the emotional information of auditory targets can be cross-modally integrated with gaze direction to affect the auditory-GCE. Here, we set neutral faces with different gaze directions as cues and three emotional sounds (fearful, happy, and neutral) as targets to investigate how the emotion of the sound target modulates the auditory-GCE. In addition, we conducted a control experiment using arrow cues. The results show that the emotional content of sound targets influences the auditory-GCE, but only for effects induced by facial cues. Specifically, fearful sounds elicited a significantly larger auditory-GCE than happy and neutral sounds, indicating that the emotional content of auditory targets plays a modulating role in the auditory-GCE. Furthermore, this modulation appears to occur only at a higher level of social meaning, involving the integration of emotional information from a sound with social gaze direction, rather than at a lower level involving the integration of direction and auditory emotion.
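The cueing effect itself is simply the difference in mean reaction time between non-cued and cued locations, computed per target emotion. A minimal sketch (our own illustration; "cued" here means the target appeared at the gazed-at location):

```python
import numpy as np

def gce_by_emotion(trials):
    """trials: iterable of (emotion, cued: bool, rt_ms) tuples.
    Returns {emotion: mean non-cued RT - mean cued RT}; positive
    values indicate a cueing benefit for that target emotion."""
    out = {}
    for emo in {e for e, _, _ in trials}:
        cued = [rt for e, c, rt in trials if e == emo and c]
        uncued = [rt for e, c, rt in trials if e == emo and not c]
        out[emo] = float(np.mean(uncued) - np.mean(cued))
    return out
```

Comparing these per-emotion GCE values across the face-cue and arrow-cue experiments is what isolates the modulation the study reports.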

14.
J Neurophysiol ; 2024 Jun 05.
Article in English | MEDLINE | ID: mdl-38836297

ABSTRACT

People usually reach for objects to place them in some position and orientation, but the placement component of this sequence is often ignored. For example, reaches are influenced by gaze position, visual feedback, and memory delays, but their influence on object placement is unclear. Here, we tested these factors in a task where participants placed and oriented a trapezoidal block against 2D visual templates displayed on a frontally located computer screen. In Experiment 1, participants matched the block to three possible orientations: 0° (horizontal), +45°, and -45°, with gaze fixated 10° to the left/right. The hand and template either remained illuminated (closed-loop), or visual feedback was removed (open-loop). Here, hand location consistently overshot the template relative to gaze, especially in the open-loop task; likewise, orientation was influenced by gaze position (depending on template orientation and visual feedback). In Experiment 2, a memory delay was added, and participants sometimes performed saccades (towards, away from, or across the template). In this task, the influence of gaze on orientation vanished, but location errors were influenced by both template orientation and final gaze position. Contrary to our expectations, the previous saccade metrics also impacted placement overshoot. Overall, hand orientation was influenced by template orientation in a nonlinear fashion. These results demonstrate interactions between gaze and orientation signals in the planning and execution of hand placement and suggest different neural mechanisms for closed-loop, open-loop, and memory delay placement.

15.
Neuron ; 2024 May 23.
Article in English | MEDLINE | ID: mdl-38823391

ABSTRACT

Neurons from multiple prefrontal areas encode several key variables of social gaze interaction. To explore the causal roles of the primate prefrontal cortex in real-life gaze interaction, we applied weak closed-loop microstimulations that were precisely triggered by specific social gaze events. Microstimulations of the orbitofrontal cortex, but not the dorsomedial prefrontal cortex or the anterior cingulate cortex, enhanced momentary dynamic social attention in the spatial dimension by decreasing the distance of fixations relative to a partner's eyes and in the temporal dimension by reducing the inter-looking interval and the latency to reciprocate the other's directed gaze. By contrast, on a longer timescale, microstimulations of the dorsomedial prefrontal cortex modulated inter-individual gaze dynamics relative to one's own gaze positions. These findings demonstrate that multiple regions in the primate prefrontal cortex may serve as functionally accessible nodes in controlling different aspects of dynamic social attention and suggest their potential for a therapeutic brain interface.

16.
Front Artif Intell ; 7: 1391745, 2024.
Article in English | MEDLINE | ID: mdl-38903158

ABSTRACT

The scanpath is an important concept in eye tracking. It refers to a person's eye movements over a period of time, commonly represented as a series of alternating fixations and saccades. Machine learning has been increasingly used for the automatic interpretation of scanpaths over the past few years, particularly in research on passive gaze-based interaction, i.e., interfaces that implicitly observe and interpret human eye movements, with the goal of improving the interaction. This literature review investigates research on machine learning applications in scanpath analysis for passive gaze-based interaction between 2012 and 2022, starting from 2,425 publications and focussing on 77 publications. We provide insights on research domains and common learning tasks in passive gaze-based interaction and present common machine learning practices from data collection and preparation to model selection and evaluation. We discuss commonly followed practices and identify gaps and challenges, especially concerning emerging machine learning topics, to guide future research in the field.
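Before any of the machine learning surveyed here, a scanpath is typically reduced to features over its fixation/saccade sequence. A minimal sketch of such a featurizer (the specific features below are our own illustrative choice, not a canonical set from the review):

```python
import numpy as np

def scanpath_features(fixations):
    """fixations: rows of (x, y, duration_ms) in temporal order.
    Returns simple summary features often fed to scanpath
    classifiers: fixation count, mean fixation duration, and
    saccade amplitudes between consecutive fixations."""
    f = np.asarray(fixations, float)
    # saccade amplitude = Euclidean distance between successive fixations
    amps = np.linalg.norm(np.diff(f[:, :2], axis=0), axis=1)
    return {
        "n_fixations": len(f),
        "mean_fix_dur": float(f[:, 2].mean()),
        "mean_saccade_amp": float(amps.mean()),
        "total_path_length": float(amps.sum()),
    }
```

Sequence models discussed in the review would instead consume the ordered fixation list directly, but feature vectors like this remain a common baseline input.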

17.
Eur J Sport Sci ; 24(6): 750-757, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38874996

ABSTRACT

The purpose of this study was to clarify the temporal coordination between gaze, head, and arm movements during forehand rallies in table tennis. Collegiate male table tennis players (n = 7) performed forehand rallies at a constant tempo (100, 120, and 150 bpm) set by a metronome. In each tempo condition, participants performed 30 strokes (90 strokes in total). Gaze, head, and dominant arm (shoulder, elbow, and wrist) movements were recorded with an eye-tracking device equipped with a gyro sensor and a 3D motion capture system. The results showed that the contribution of head movements to gaze movements was significantly greater than that of eye movements in all three tempo conditions, indicating that head movements are closely associated with gaze movements during rallies. Furthermore, cross-correlation coefficients (CCs) between head and arm movements exceeded 0.96 (maximum coefficient: 0.99), showing that head and arm movements were synchronized during rallies. Finally, CCs between gaze and arm movements exceeded 0.74 (maximum coefficient: 0.99), indicating that gaze movements are temporally coordinated with arm movements. Taken together, head movements may play important roles not only in gaze tracking but also in temporal coordination with arm movements during table tennis forehand rallies.
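A peak cross-correlation of the kind reported between head and arm time series can be computed with a lag search over standardized signals. A minimal NumPy sketch (our own illustration; the study's exact preprocessing and lag window are not specified):

```python
import numpy as np

def max_crosscorr(a, b, max_lag=50):
    """Peak normalized cross-correlation between two movement time
    series (e.g., head and arm angle), searched within ±max_lag
    samples. A negative best lag indicates that `b` trails `a`."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    best_cc, best_lag = -np.inf, 0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            x, y = a[lag:], b[:len(b) - lag]
        else:
            x, y = a[:len(a) + lag], b[-lag:]
        cc = float(np.mean(x * y))      # normalized overlap correlation
        if cc > best_cc:
            best_cc, best_lag = cc, lag
    return best_cc, best_lag
```

A peak coefficient near 1 at a small lag, as in the head-arm comparison here, indicates tightly synchronized signals.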


Subjects
Arm, Eye Movements, Head Movements, Movement, Psychomotor Performance, Tennis, Humans, Male, Arm/physiology, Young Adult, Head Movements/physiology, Tennis/physiology, Psychomotor Performance/physiology, Eye Movements/physiology, Movement/physiology, Head/physiology
18.
Cureus ; 16(5): e60887, 2024 May.
Article in English | MEDLINE | ID: mdl-38910704

ABSTRACT

Moebius syndrome is a rare disease characterized by unilateral or bilateral facial nerve palsy, with or without other cranial nerve palsies. It manifests clinically as facial muscle weakness and/or ophthalmoplegia and can be associated with other physical anomalies, such as various limb deformities and orofacial malformations. Herein, we describe the clinical and radiological features of Moebius syndrome in a 9-year-old girl who presented with left-sided facial palsy and bilateral complete horizontal gaze palsy.

19.
J Psychosom Res ; 183: 111694, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38734533

ABSTRACT

OBJECTIVE: Recent neuroscientific models suggest that functional bodily symptoms can be attributed to perceptual dysregulation in the central nervous system. Evidence for this hypothesis comes from patients with functional dizziness, who exhibit marked sensorimotor processing deficits during eye-head movement planning and execution. Similar findings in eye-head movement planning in patients with irritable bowel syndrome confirmed that these sensorimotor processing deficits represent a shared, transdiagnostic mechanism. We now examine whether erroneous sensorimotor processing is also at play in functional movement disorder. METHODS: We measured head movements of 10 patients with functional movement disorder (F44.4, ICD-10), 10 patients with functional dizziness (F45.8, ICD-10), and (respectively) 10 healthy controls during an eye-head experiment in which participants performed large gaze shifts under normal, increased, and again normal head moment of inertia. Head oscillations at the end of the gaze shift served as a well-established marker of sensorimotor processing problems. We used Bayesian statistics for the group comparisons. RESULTS: Patients with functional movement disorder (Bayes factor BF10 = 5.36, BFincl = 11.16; substantial to strong evidence) as well as patients with functional dizziness (BF10 = 2.27, BFincl = 3.56; anecdotal to substantial evidence) showed increased head oscillations compared with healthy controls, indicating marked deficits in planning and executing movement. CONCLUSION: We replicate earlier experimental findings on erroneous sensorimotor processing in patients with functional dizziness and show that patients with functional movement disorder exhibit a similar impairment of sensorimotor processing during large gaze shifts. This provides an objectively measurable, transdiagnostic marker for functional disorders, with important implications for diagnosis, treatment, and de-stigmatization.
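Bayes factors like the BF10 values quoted above can be approximated from model fits via the BIC approximation BF10 ≈ exp((BIC_null − BIC_alt) / 2) (Wagenmakers, 2007). A minimal sketch for a two-group comparison (our own illustration under that approximation, not the study's analysis pipeline):

```python
import numpy as np

def bf10_from_bic(y, groups):
    """Approximate BF10 for a group effect. Null model: one common
    mean; alternative: one mean per group. Uses the regression BIC
    n*ln(RSS/n) + k*ln(n) for each model."""
    y = np.asarray(y, float)
    groups = np.asarray(groups)
    n = len(y)
    rss_null = np.sum((y - y.mean()) ** 2)
    rss_alt = sum(
        np.sum((y[groups == g] - y[groups == g].mean()) ** 2)
        for g in np.unique(groups))
    bic_null = n * np.log(rss_null / n) + 1 * np.log(n)
    bic_alt = n * np.log(rss_alt / n) + len(np.unique(groups)) * np.log(n)
    return float(np.exp((bic_null - bic_alt) / 2))
```

BF10 > 1 favours a group difference (values above 3 are conventionally read as substantial evidence); BF10 < 1 favours the null.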


Subjects
Dizziness, Movement Disorders, Humans, Dizziness/physiopathology, Female, Male, Adult, Middle Aged, Movement Disorders/physiopathology, Movement Disorders/diagnosis, Head Movements/physiology, Eye Movements/physiology, Bayes Theorem
20.
Neuroimage ; 295: 120659, 2024 Jul 15.
Article in English | MEDLINE | ID: mdl-38815675

ABSTRACT

Distinguishing the direction of another person's eye gaze is extremely important in everyday social interaction, as it provides critical information about people's attention and, therefore, intentions. The temporal dynamics of gaze processing have been investigated using event-related potentials (ERPs) recorded with electroencephalography (EEG). However, the moment at which our brain distinguishes gaze direction (GD), irrespective of other facial cues, remains unclear. To address this question, the present study investigated the time course of gaze direction processing using an ERP decoding approach based on the combination of a support vector machine and error-correcting output codes. We recorded EEG in young healthy subjects, 32 of whom performed a GD detection task and 34 a face orientation task. Both tasks presented 3D realistic faces with five head and gaze orientations each: 30° and 15° to the left or right, and 0°. While the classical ERP analyses did not show clear GD effects, ERP decoding analyses revealed that discrimination of GD, irrespective of head orientation, started at 140 ms in the GD task and at 120 ms in the face orientation task. GD decoding accuracy was higher in the GD task than in the face orientation task and was highest for direct gaze in both tasks. These findings suggest that the decoding of brain patterns is modified by task relevance, which changes the latency and accuracy of GD decoding.
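The decoding scheme described (a support vector machine wrapped in error-correcting output codes) is available off the shelf in scikit-learn. A small sketch on synthetic stand-in data (the trial count, feature dimensionality, and injected signal below are our assumptions, not the study's data):

```python
import numpy as np
from sklearn.multiclass import OutputCodeClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in for single-trial ERP features: 200 trials x
# 64 features, five gaze-direction classes (chance level = 20%).
# One feature carries a class-dependent signal so decoding succeeds.
rng = np.random.default_rng(0)
y = np.repeat(np.arange(5), 40)
X = rng.normal(size=(200, 64))
X[:, 0] += 2.0 * y

# SVM + error-correcting output codes, evaluated by cross-validation
clf = OutputCodeClassifier(SVC(kernel="linear"), code_size=2, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
mean_acc = scores.mean()
```

In the time-resolved analysis of the study, a classifier like this would be trained and scored separately at each post-stimulus time point, and the earliest above-chance time point gives the decoding onset latency.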


Subjects
Electroencephalography, Evoked Potentials, Facial Recognition, Ocular Fixation, Humans, Male, Female, Young Adult, Adult, Evoked Potentials/physiology, Ocular Fixation/physiology, Facial Recognition/physiology, Support Vector Machine, Attention/physiology, Brain/physiology, Social Perception