Results 1 - 7 of 7
2.
Compr Psychiatry; 124: 152386, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37075621

ABSTRACT

BACKGROUND: Inhibitory control (IC) deficits have been proposed as a potential risk factor for depression. However, little is known about intra-individual daily fluctuations in IC and their relationship to mood and depressive symptoms. Here, we examined the everyday association between IC and mood in typical adults with varying levels of depressive symptoms.

METHODS: Participants (N = 106) reported their depressive symptoms and completed a Go/No-Go (GNG) task measuring IC at baseline. They then completed a 5-day ecological momentary assessment (EMA) protocol, in which they reported their current mood and performed a shortened GNG task twice per day using a mobile app. Depressive symptoms were measured again following the EMA. Hierarchical linear modeling (HLM) was applied to examine the association between momentary IC and mood, with post-EMA depressive symptoms as a moderator.

RESULTS: Individuals with elevated depressive symptoms demonstrated worse and more variable IC performance over the EMA. In addition, post-EMA depressive symptoms moderated the association between momentary IC and daily mood, such that reduced IC was associated with more negative mood only for those with lower, but not higher, symptoms.

LIMITATIONS: Future investigations should examine the validity of these outcomes in clinical samples, including patients with Major Depressive Disorder.

CONCLUSIONS: Variable, rather than merely reduced, IC is related to depressive symptoms. Moreover, the role of IC in modulating mood may differ between non-depressed individuals and individuals with sub-clinical depression. These findings contribute to our understanding of IC and mood in real life and help account for some of the discrepant findings related to cognitive control models of depression.


Subject(s)
Depression , Depressive Disorder, Major , Adult , Humans , Depression/diagnosis , Depression/psychology , Depressive Disorder, Major/diagnosis , Affect , Risk Factors , Ecological Momentary Assessment , Cognition
3.
Emotion; 22(5): 844-860, 2022 Aug.
Article in English | MEDLINE | ID: mdl-32658507

ABSTRACT

Facial expression recognition relies on the processing of diagnostic information from different facial regions. For example, successful recognition of anger versus disgust requires one to process information located in the eye/brow region or the mouth/nose region, respectively. Yet, how this information is extracted from the face is less clear. One widespread view, supported by cross-cultural experiments as well as neuropsychological case studies, is that the distribution of gaze fixations on specific diagnostic regions plays a critical role in the extraction of affective information. According to this view, emotion recognition is strongly related to the distribution of fixations to diagnostic regions. Alternatively, facial expression recognition may not rely merely on the exact patterns of fixations, but rather on other factors such as the processing of extrafoveal information. In the present study, we examined this matter by characterizing and using individual differences in fixation distributions during facial expression recognition. We revealed 4 groups of observers that differed in their distribution of fixations toward face regions in a robust and consistent manner. In line with previous studies, we found that different facial emotion categories evoked distinct distributions of fixations according to their diagnostic facial regions. However, individual distinctive patterns of fixations were not correlated with emotion recognition: individuals who strongly focused on the eyes, or on the mouth, achieved comparable emotion recognition accuracy. These findings suggest that extrafoveal processing may play a larger role in emotion recognition from faces than previously assumed. Consequently, successful emotion recognition can arise from diverse patterns of fixations.


Subject(s)
Emotions , Facial Expression , Facial Recognition , Emotions/physiology , Facial Recognition/physiology , Fixation, Ocular , Humans , Recognition, Psychology/physiology
4.
Cortex; 126: 343-354, 2020 May.
Article in English | MEDLINE | ID: mdl-32234565

ABSTRACT

Emotion recognition deficits in Huntington's disease (HD) are well established. However, most previous studies have measured emotion recognition using stereotypical and intense facial expressions, which are easily recognized and artificial in their appearance. By contrast, everyday expressions are often more challenging to recognize, as they are subtle and non-stereotypical. Therefore, previous studies may have inflated the performance of HD patients, and it is difficult to generalize their results to facial expressions encountered in everyday social interactions. In the present study, we tested 21 symptomatic HD patients and 28 healthy controls with a traditional facial expression set, as well as a novel stimulus set exhibiting subtle and non-stereotypical facial expressions. While HD patients demonstrated poor emotion recognition in both sets, when tested with the novel, ecologically valid facial expressions, patients' performance declined to chance level. Intriguingly, patients' emotion recognition deficit was predicted only by the severity of their motor symptoms, not by their cognitive status. This suggests a possible mechanism for emotion recognition impairments in HD, in line with embodiment theories. From this point of view, poor motor control may affect patients' ability to subtly produce and simulate a perceived facial expression, which in turn may contribute to their impaired recognition.


Subject(s)
Facial Recognition , Huntington Disease , Emotions , Facial Expression , Humans , Recognition, Psychology , Stereotyped Behavior
5.
Neuropsychologia; 117: 26-35, 2018 Aug.
Article in English | MEDLINE | ID: mdl-29723598

ABSTRACT

Facial expressions are inherently dynamic cues that develop and change over time, unfolding their affective signal. Although facial dynamics are assumed to be important for emotion recognition, testing often involves intense and stereotypical expressions, and little is known about the role of temporal information in the recognition of subtle, non-stereotypical expressions. In Experiment 1, we demonstrate that facial dynamics are critical for recognizing subtle and non-stereotypical facial expressions, but not for recognizing intense and stereotypical displays of emotion. In Experiment 2, we further examined whether the facilitative effect of motion can lead to improved emotion recognition in LG, an individual with developmental visual agnosia and prosopagnosia, who shows poor emotion recognition when tested with static facial expressions. LG's emotion recognition improved when subtle, non-stereotypical faces were dynamic rather than static. However, compared to controls, his relative gain from temporal information was diminished. Furthermore, LG's eye-tracking data demonstrated atypical visual scanning of the dynamic faces, consisting of longer fixations and lower fixation rates for the dynamic-subtle facial expressions compared to the dynamic-intense facial expressions. We suggest that deciphering subtle dynamic expressions relies strongly on integrating broad facial regions across time, rather than focusing on local emotional cues, skills that are impaired in developmental visual agnosia.


Subject(s)
Agnosia/physiopathology , Emotions/physiology , Facial Expression , Pattern Recognition, Visual/physiology , Adult , Analysis of Variance , Eye Movements , Female , Humans , Male , Photic Stimulation , Young Adult
6.
Emotion; 17(8): 1187-1198, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28406679

ABSTRACT

According to dominant theories of affect, humans innately and universally express a set of emotions using specific configurations of prototypical facial activity. Accordingly, thousands of studies have tested emotion recognition using sets of highly intense and stereotypical facial expressions, yet their incidence in real life is virtually unknown. In fact, a commonplace experience is that emotions are expressed in subtle and nonprototypical forms. Such facial expressions are the focus of the current study. In Experiment 1, we present the development and validation of a novel stimulus set consisting of dynamic and subtle emotional facial displays conveyed without constraining expressers to use prototypical configurations. Although these subtle expressions were more challenging to recognize than prototypical dynamic expressions, they were still well recognized by human raters, and perhaps most importantly, they were rated as more ecological and naturalistic than the prototypical expressions. In Experiment 2, we examined the characteristics of subtle versus prototypical expressions by subjecting them to a software classifier, which used prototypical basic emotion criteria. Although the software was highly successful at classifying prototypical expressions, it performed very poorly at classifying the subtle expressions. Further validation was obtained from human expert face coders: subtle stimuli did not contain many of the key facial movements present in prototypical expressions. Together, these findings suggest that emotions may be successfully conveyed to human viewers using subtle nonprototypical expressions. Although classic prototypical facial expressions are well recognized, they appear less naturalistic and may not capture the richness of everyday emotional communication.


Subject(s)
Emotions , Facial Expression , Facial Recognition , Software , Adult , Communication , Face/anatomy & histology , Face/physiology , Facial Recognition/physiology , Fear , Female , Humans , Male , Reproducibility of Results , Stereotyping , Young Adult
7.
Autism; 20(7): 856-867, 2016 Oct.
Article in English | MEDLINE | ID: mdl-26802114

ABSTRACT

Stability and change in early autism spectrum disorder risk were examined in a cohort of 99 preterm infants (⩽34 weeks of gestation) using the Autism Observation Scale for Infants at 8 and 12 months and the Autism Diagnostic Observation Schedule-Toddler Module at 18 months. A total of 21 infants were identified at risk by the Autism Observation Scale for Infants at 8 months, and 9 were identified at risk at 12 months, including 4 children who had not previously been identified. At 18 months, 8 children were identified at risk for autism spectrum disorder using the Autism Diagnostic Observation Schedule-Toddler Module, only half of whom had been identified using the original Autism Observation Scale for Infants cutoffs. Results are discussed in relation to early trajectories of autism spectrum disorder risk among preterm infants, as well as identifying social-communication deficiencies associated with the early preterm behavioral phenotype.


Subject(s)
Autism Spectrum Disorder/diagnosis , Infant, Premature , Cohort Studies , Female , Humans , Infant , Male , Risk Assessment