Results 1 - 4 of 4
1.
Autism ; : 13623613231211967, 2023 Nov 24.
Article in English | MEDLINE | ID: mdl-38006222

ABSTRACT

LAY ABSTRACT: Autistic people have been said to have 'problems' with joint attention, that is, looking where someone else is looking. Past studies of joint attention have used tasks that require autistic people to continuously look at and respond to eye-gaze cues. But joint attention can also be done using other social cues, like pointing. This study looked at whether autistic and non-autistic young people use another person's eye gaze during joint attention in a task that did not require them to look at their partner's face. In the task, each participant worked with their partner to find a computer-generated object in virtual reality. Sometimes the participant had to help guide their partner to the object, and other times they followed their partner's lead. Participants were told to point to guide one another but were not told to use eye gaze. Both autistic and non-autistic participants often looked at their partner's face during joint attention interactions and were faster to respond to their partner's hand-pointing when the partner also looked at the object before pointing. This shows that autistic people can and do use information from another person's eyes, even when they don't have to. It is possible that, because they were not forced to look at their partner's face and eyes, the autistic young people were better able to gather information from their partner's face when needed, without being overwhelmed. This highlights how important it is to design tasks that provide autistic people with opportunities to show what they can do.

2.
Sci Rep ; 11(1): 21037, 2021 Oct 26.
Article in English | MEDLINE | ID: mdl-34702900

ABSTRACT

The coordination of attention between individuals is a fundamental part of everyday human social interaction. Previous work has focused on the role of gaze information for guiding responses during joint attention episodes. However, in many contexts, hand gestures such as pointing provide another valuable source of information about the locus of attention. The current study developed a novel virtual reality paradigm to investigate the extent to which initiator gaze information is used by responders to guide joint attention responses in the presence of more visually salient and spatially precise pointing gestures. Dyads were instructed to use pointing gestures to complete a cooperative joint attention task in a virtual environment. Eye and hand tracking enabled real-time interaction and provided objective measures of gaze and pointing behaviours. Initiators displayed gaze behaviours that were spatially congruent with the subsequent pointing gestures. Responders overtly attended to the initiator's gaze during the joint attention episode. However, both these initiator and responder behaviours were highly variable across individuals. Critically, when responders did overtly attend to their partner's face, their saccadic reaction times were faster when the initiator's gaze was also congruent with the pointing gesture, and thus predictive of the joint attention location. These results indicate that humans attend to and process gaze information to facilitate joint attention responsivity, even in contexts where gaze information is implicit to the task and joint attention is explicitly cued by more spatially precise and visually salient pointing gestures.


Subject(s)
Attention/physiology , Fixation, Ocular/physiology , Gestures , Hand Joints/physiology , Kinesthesis/physiology , Reaction Time/physiology , Adolescent , Adult , Female , Humans , Male
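
The saccadic reaction time result described in this abstract lends itself to a short illustration. The sketch below is a minimal Python example with hypothetical placeholder data and variable names; it is not the authors' analysis pipeline, only an indication of how responder reaction times might be compared between trials where the initiator's gaze was congruent versus incongruent with the subsequent pointing gesture.

# Illustrative sketch (hypothetical data, not the study's analysis code):
# comparing responder saccadic reaction times on trials where the initiator's
# gaze was congruent vs. incongruent with the subsequent pointing gesture.
import numpy as np
from scipy import stats

# Hypothetical per-trial saccadic reaction times in milliseconds.
rt_gaze_congruent = np.array([212, 198, 230, 205, 221, 189, 240, 210])
rt_gaze_incongruent = np.array([245, 260, 238, 271, 255, 249, 266, 243])

# In the actual study such comparisons would be made within participants
# across dyadic trials; an independent-samples t-test is used here purely
# to illustrate the congruency effect.
t_stat, p_value = stats.ttest_ind(rt_gaze_congruent, rt_gaze_incongruent)

print(f"Mean RT (gaze congruent):   {rt_gaze_congruent.mean():.1f} ms")
print(f"Mean RT (gaze incongruent): {rt_gaze_incongruent.mean():.1f} ms")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")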
3.
Schizophr Res Cogn ; 21: 100181, 2020 Sep.
Article in English | MEDLINE | ID: mdl-32477892

ABSTRACT

The successful integration of eye gaze direction and emotion cues from faces is important not only for co-ordinated interactions, but also for the detection of social signals alerting us to threat posed by a conspecific, or elsewhere in our immediate environment. It is now well-established that people with schizophrenia experience aberrant eye gaze and facial emotion processing. These social-cognitive differences might contribute to the maintenance of socially-themed delusions which are characterised by the hyper-attribution of threatening intentions to others. However, no study has directly examined whether the mechanisms which govern the integration of eye gaze and emotion information diverge in schizophrenia, and more importantly, whether this reflects a fundamental 'bottom-up' perceptual deficit or a 'top-down' cognitive bias. Fifteen outpatients diagnosed with schizophrenia and 21 healthy age- and IQ-matched controls performed an emotion categorisation task (anger/fear) on morphed facial expressions of anger or fear, displaying either direct or averted gaze. Results in both controls and patients replicated the previous finding that combinations of anger with direct gaze, and fear with averted gaze, which signal a relevant threat to the observer, were recognised more accurately than the alternate gaze-emotion combinations. Bayesian model selection revealed that for patients this effect was mediated by a shift in decision bias towards emotions which signal self-relevant threat, rather than a change in sensitivity as observed in controls. These results highlight a different cognitive mechanism governing gaze and face-cued emotion integration in schizophrenia, one that exerts a top-down influence on the evaluation of perceptual input.
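
The distinction this abstract draws between a change in sensitivity and a shift in decision bias can be made concrete with standard equal-variance signal detection theory. The Python sketch below uses hypothetical hit and false-alarm rates and is illustrative only; the study itself used Bayesian model selection rather than these formulas.

# Illustrative sketch of the sensitivity-versus-bias distinction using
# standard equal-variance signal detection theory (hypothetical rates;
# not the authors' Bayesian model selection analysis).
from scipy.stats import norm

def sdt_measures(hit_rate: float, false_alarm_rate: float):
    """Return (d_prime, criterion_c) under an equal-variance SDT model."""
    z_hit = norm.ppf(hit_rate)
    z_fa = norm.ppf(false_alarm_rate)
    d_prime = z_hit - z_fa              # perceptual sensitivity
    criterion = -0.5 * (z_hit + z_fa)   # decision bias (negative = liberal)
    return d_prime, criterion

# Three hypothetical response patterns for 'anger' categorisations:
baseline       = sdt_measures(hit_rate=0.80, false_alarm_rate=0.20)
more_sensitive = sdt_measures(hit_rate=0.90, false_alarm_rate=0.10)  # d' rises, criterion unchanged
more_liberal   = sdt_measures(hit_rate=0.90, false_alarm_rate=0.35)  # criterion shifts, d' roughly unchanged

for label, (d_prime, c) in [("baseline", baseline),
                            ("higher sensitivity", more_sensitive),
                            ("liberal bias shift", more_liberal)]:
    print(f"{label:>20}: d' = {d_prime:.2f}, c = {c:.2f}")

The point of the sketch is that the same hit and false-alarm data decompose into two independent parameters: the controls' congruence advantage corresponds to a change in d', whereas the patients' advantage corresponds to a shift in the criterion.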

4.
Sci Rep ; 9(1): 16198, 2019 Nov 07.
Article in English | MEDLINE | ID: mdl-31700080

ABSTRACT

The human brain has evolved specialised mechanisms to enable the rapid detection of threat cues, including emotional face expressions (e.g., fear and anger). However, contextual cues, such as gaze direction, influence the ability to recognise emotional expressions. For instance, anger paired with direct gaze, and fear paired with averted gaze, are more accurately recognised than alternate conjunctions of these features. It is argued that this is because gaze direction conveys the relevance and locus of the threat to the observer. Here, we used continuous flash suppression (CFS) to assess whether the modulatory effect of gaze direction on emotional face processing occurs outside of conscious awareness. Previous research using CFS has demonstrated that fearful facial expressions are prioritised by the visual system and gain privileged access to awareness over other expressed emotions. We hypothesised that if the modulatory effects of gaze on emotional face processing also operate at this level, then the gaze-emotion conjunctions signalling self-relevant threat would reach awareness faster than those that do not. We report that fearful faces gain privileged access to awareness over angry faces, but that gaze direction does not modulate this effect. Thus, our findings suggest that previously reported effects of gaze direction on emotional face processing are likely to occur once the face is detected, where the self-relevance and locus of the threat can be consciously appraised.


Subject(s)
Emotions , Facial Expression , Fixation, Ocular , Consciousness , Cues , Female , Humans , Male , Middle Aged
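
In a breaking-CFS design such as the one above, the key measure is the time a suppressed face takes to reach awareness. The Python sketch below uses entirely hypothetical breakthrough times to illustrate how the fear-over-anger advantage would be compared across direct- and averted-gaze conditions to ask whether gaze modulates prioritised access; it is not the study's analysis.

# Illustrative sketch (hypothetical data): faster breakthrough for fearful
# than angry faces indicates prioritised access to awareness; a gaze effect
# would appear as a change in that advantage between gaze conditions.
import numpy as np

# Hypothetical breakthrough times (seconds) per condition for one observer.
breakthrough = {
    ("fear", "direct"):   np.array([2.1, 2.4, 1.9, 2.3]),
    ("fear", "averted"):  np.array([2.2, 2.0, 2.3, 2.1]),
    ("anger", "direct"):  np.array([2.9, 3.1, 2.8, 3.0]),
    ("anger", "averted"): np.array([3.0, 2.8, 3.1, 2.9]),
}

for gaze in ("direct", "averted"):
    fear_advantage = (breakthrough[("anger", gaze)].mean()
                      - breakthrough[("fear", gaze)].mean())
    print(f"Fear advantage with {gaze} gaze: {fear_advantage:.2f} s")

# Comparable advantages under both gaze directions would suggest that gaze
# does not modulate the fear effect, mirroring the pattern reported above.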