Results 1 - 20 of 26
1.
Emotion ; 2024 Jun 13.
Article in English | MEDLINE | ID: mdl-38869853

ABSTRACT

Emotion recognition is influenced by contextual information such as social category cues or background scenes. However, past studies yielded mixed findings regarding whether broad valence or specific emotion matches drive context effects, and how multiple sources of contextual information may influence emotion recognition. To address these questions, participants were asked to categorize happy and angry, and happy and fearful, expressions on male and female faces shown against pleasant and fearful backgrounds (Experiment 1, conducted in 2019); fearful and disgusted expressions against fear- and disgust-eliciting backgrounds (Experiment 2, conducted in 2022); and fearful and sad expressions against fear- and sadness-eliciting backgrounds (Experiment 3, conducted in 2022). In Experiment 1 (where stimuli varied in valence), a broad valence match effect was observed: faster recognition of happiness than of fear and anger was more pronounced in pleasant than in fearful scenes. In Experiments 2 and 3 (where all stimuli were negative in valence), specific emotion match effects were observed: recognition was faster when expression and background were emotionally congruent. In Experiments 1 and 3, poser sex independently moderated emotional expression recognition speed. These results suggest that the effect of emotional scenes on facial emotion recognition is mediated by a match in valence when broad valence is task-relevant, whereas specific emotion matches drive context effects when participants categorize expressions of a single valence. Considering background contexts and poser sex together suggests that these two sources of contextual information have an independent rather than an interactive influence on emotional expression recognition speed. (PsycInfo Database Record (c) 2024 APA, all rights reserved).

2.
Front Med (Lausanne) ; 10: 1204274, 2023.
Article in English | MEDLINE | ID: mdl-37396888

ABSTRACT

Background: A growing body of literature has revealed that many medical students and doctors do not seek professional help for their mental health due to fear of stigma (both public stigma and self-stigma) and questioning of their clinical competency. The aim of this systematic review was to identify and evaluate direct and indirect interventions that address mental health stigma in medical students and/or doctors. We focused explicitly on studies that measured the impact on self-stigma outcomes. Method: A systematic search of the following electronic databases was undertaken from inception to 13 July 2022: PubMed, Embase, PsycINFO, and CINAHL, together with manual searching of reference lists. Screening of titles, abstracts, and full texts of eligible studies, and quality appraisal using the Mixed Methods Appraisal Tool, were independently conducted by multiple reviewers, with disagreements resolved via discussion. Results: From 4,018 citations, five publications met the inclusion criteria. None of the studies explicitly aimed to reduce self-stigmatisation, and the majority focused on medical students. Most of the identified interventions targeted professional stigma (i.e., stigma toward patients with mental illness), and self-stigma was measured only incidentally, via a subscale of the general stigma measure selected. Three studies found significant reductions in self-stigma following the delivered intervention. These studies were of moderate quality, had medical student samples, employed combined education and contact interventions, and used the same outcome measure. Discussion: Intentional development and evaluation of interventions specifically designed to decrease self-stigma among doctors and medical students are needed, with further research required on the optimal components, format, length, and delivery of such interventions. Researchers delivering public/professional stigma reduction interventions should strongly consider measuring the impact of such interventions on self-stigma outcomes, using fit-for-purpose, psychometrically sound instruments.

3.
Emotion ; 23(8): 2385-2398, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37126043

ABSTRACT

Although much research has investigated how multiple sources of social information derived from faces are processed and integrated, few studies have extended this investigation to bodies. The current study addressed this gap by investigating the nature of the interaction between bodily cues of sex and emotion. Using the Garner paradigm, participants recruited from a university student participant pool categorized the sex or the emotional expression (happy and angry in Experiment 1 [n = 194], angry and sad in Experiment 2 [n = 129]) of bodies across two block types. In orthogonal blocks, participants categorized bodies along one dimension while the second dimension was varied (e.g., categorizing the sex of happy and angry bodies), whereas in control blocks the second dimension was held constant (e.g., categorizing the sex of only happy bodies). Responses were analyzed in two ways. Comparing response times across blocks revealed Garner interference (overall faster categorization in the control than in the orthogonal block) of sex on emotion perception and emotion on sex perception in Experiment 1 but not Experiment 2. Comparing condition level responses in orthogonal blocks indicated that sex cues moderated emotion categorization and emotion cues moderated sex categorization in both experiments. A symmetrical interaction between bodily sex and emotion cues can be observed in simple categorization as well as in the Garner paradigm, though the presence of Garner interference depended on the valence of the expressions used. Given similar results observed in face processing, finding interactivity in body perception is likely to generalize beyond the study sample. (PsycInfo Database Record (c) 2023 APA, all rights reserved).


Subject(s)
Cues , Facial Expression , Humans , Emotions/physiology , Happiness , Anger
4.
Ecohealth ; 20(1): 3-8, 2023 Mar.
Article in English | MEDLINE | ID: mdl-37115466

ABSTRACT

Climate change and its effects present notable challenges for mental health, particularly for vulnerable populations, including young people. Immediately following the unprecedented Black Summer bushfire season of 2019/2020, 746 Australians (aged 16-25 years) completed measures of mental health and perceptions of climate change. Participants with direct exposure to the bushfires reported higher levels of depression, anxiety, stress, adjustment disorder symptoms, substance abuse, and climate change distress and concern, as well as lower psychological resilience and lower perceived distance to climate change. Findings highlight significant vulnerabilities of concern for youth mental health as climate change advances.


Subject(s)
Climate Change , Mental Health , Adolescent , Humans , Australia/epidemiology , Seasons , Young Adult , Adult
5.
Front Public Health ; 10: 1049932, 2022.
Article in English | MEDLINE | ID: mdl-36408043

ABSTRACT

A Code Red has been declared for the planet and human health. Climate change (e.g., increasing temperatures, adverse weather events, rising sea levels) threatens the planet's already declining ecosystems. Without urgent action, all of Earth's inhabitants face an existential threat. Health professions education should therefore prepare learners to practice in a changing world, and authentic educational activities should develop competencies for global and planetary citizenship. Planetary health has been integrated across the five-year Bond University (Australia) medical curriculum. It begins in the second week of Year 1 and ends with a session on Environmentally Sustainable Healthcare in the General Practice rotation in the final year. The purpose of this article is to describe the outcomes of the first 5 years (2018-2022) of a learner-centered planetary health assignment, underpinned by the 2030 United Nations (UN) Sustainable Development Goals (SDGs), in the second year of the five-year medical program. Using systems and/or design thinking with a focus on SDG13 (Climate Action) plus a second SDG of choice, self-selected teams of 4-6 students submit a protocol (with feedback) to develop a deliverable "product" for an intended audience. Data analysis of the first 5 years of implementation found that the SDGs most frequently selected in addition to SDG13 were: SDG12 Responsible Consumption and Production (41% of teams), mostly relating to healthcare emissions and waste; SDG3 Good Health and Well-being (22%), generally involving the impact of air pollution; and SDG6 Clean Water and Sanitation (15%). A survey at the concluding conference garnered student feedback across various criteria. The planetary health assignment is authentic in that teams provide solutions to address climate change. Where appropriate, final "products" are sent to local or federal ministers for consideration (e.g., policy proposals) or integrated into the curriculum (e.g., learning modules). We believe that the competencies, attitudes, and values fostered through engagement with planetary health throughout the medical program, as evidenced by student evaluations, stand students in good stead to be change agents, not only in clinical practice but in society. An awareness has been created of the need for planetary citizenship in addition to global citizenship.


Subject(s)
Planets , Sustainable Development , Humans , Ecosystem , United Nations , Students
6.
Sci Rep ; 12(1): 5911, 2022 04 08.
Article in English | MEDLINE | ID: mdl-35396450

ABSTRACT

Human visual systems have evolved to extract ecologically relevant information from complex scenery. In some cases, the face-in-the-crowd visual search task demonstrates an anger superiority effect, in which anger is allocated preferential attention. Across three studies (N = 419), we tested whether facial hair guides attention in visual search and influences the speed of detecting angry and happy facial expressions in large arrays of faces. In Study 1, participants were faster to search through clean-shaven crowds and detect bearded targets than to search through bearded crowds and detect clean-shaven targets. In Study 2, targets were angry and happy faces presented in neutral backgrounds, and facial hair of the target faces was also manipulated. An anger superiority effect emerged that was augmented by the presence of facial hair, an effect due to the slower detection of happiness on bearded faces. In Study 3, targets were happy and angry faces presented in either bearded or clean-shaven backgrounds, and facial hair of the background faces was systematically manipulated. A significant anger superiority effect was revealed, although this was not moderated by the target's facial hair; rather, the anger superiority effect was larger in clean-shaven than in bearded face backgrounds. Together, results suggest that facial hair does influence detection of emotional expressions in visual search; however, rather than facilitating an anger superiority effect as part of a potential threat-detection system, facial hair may reduce detection of happy faces within the face-in-the-crowd paradigm.


Subject(s)
Facial Expression , Happiness , Anger , Emotions , Face , Hair , Humans , Reaction Time
7.
Cogn Emot ; 36(5): 855-875, 2022 08.
Article in English | MEDLINE | ID: mdl-35353033

ABSTRACT

Past research demonstrates that emotion recognition is influenced by social category cues present on faces. However, little research has investigated whether holistic processing is required to observe these influences of social category information on emotion perception, and no studies have investigated whether different visual sampling strategies (i.e., differences in the allocation of attention to different regions of the face) contribute to the interaction between social cues and emotional expressions. The current study aimed to address this. Participants categorised happy and angry expressions on own- and other-race faces, and on male and female faces. In Experiments 1 and 2, holistic processing was disrupted by presenting inverted faces (Experiment 1) or part faces (Experiment 2). In Experiments 3 and 4, participants' eye-gaze to eye and mouth regions was also tracked. Disrupting holistic processing did not alter the moderating influence of sex and race cues on emotion recognition (Experiments 1, 2, and 4). Gaze patterns differed as a function of emotional expression and social category cues; however, eye-gaze patterns did not reflect response time patterns (Experiments 3 and 4). Results indicate that the interaction between social category cues and emotion does not require holistic processing and is not driven by differences in visual sampling.


Subject(s)
Cues , Facial Expression , Anger/physiology , Emotions/physiology , Female , Happiness , Humans , Male
8.
Br J Psychol ; 112(3): 645-661, 2021 Aug.
Article in English | MEDLINE | ID: mdl-33211325

ABSTRACT

The own-age bias (OAB) has been proposed to be caused by perceptual expertise and/or social-cognitive mechanisms. Investigations into the role of social cognition have, however, yielded mixed results. One reason for this might be the tendency for research to focus on the OAB in young adults, comparing young and older adult faces, where other-age individuation experience is low. To explore whether social-cognitive manipulations may be successful when observers have sufficient other-age individuation experience, we examined biases involving middle-aged other-age faces and the influence of a context manipulation. Across four experiments, young adult participants were presented with middle-aged faces alongside young or older adult faces to remember. We predicted that in contexts where middle-aged faces were positioned as other-age faces (alongside young adult faces), recognition performance would be worse than when they were positioned as relative own-age faces (alongside older adult faces). However, the context manipulations did not moderate middle-aged face recognition. This suggests that past findings that context does not change other-age face recognition hold for other-age faces for which observers have higher individuation experience. These findings are consistent with a perceptual expertise account of the OAB, but more investigation of the generality of these results is required.


Subject(s)
Facial Recognition , Aged , Bias , Face , Humans , Individuation , Middle Aged , Pattern Recognition, Visual , Recognition, Psychology , Young Adult
9.
Br J Psychol ; 111(4): 702-722, 2020 Nov.
Article in English | MEDLINE | ID: mdl-31777954

ABSTRACT

The own-age bias (OAB) is suggested to be caused by perceptual-expertise and/or social-cognitive mechanisms. Bryce and Dodson (2013, Psychology and Aging, 28, 87, Exp 2) provided support for the social-cognitive account, demonstrating an OAB for participants who encountered a mixed-list of own- and other-age faces, but not for participants who encountered a pure-list of only own- or other-age faces. They proposed that own-age/other-age categorization, and the resulting OAB, only emerge when age is made salient in the mixed-list condition. Our study aimed to replicate this finding using methods typically used to investigate the OAB to examine their robustness and contribution to our understanding of how the OAB forms. Across three experiments that removed theoretically unimportant components of the original paradigm, varied face sex, and included background scenes, the OAB emerged under both mixed-list and pure-list conditions. These results are more consistent with a perceptual-expertise than social-cognitive account of the OAB, but may suggest that manipulating age salience using mixed-list and pure-list presentations is not sufficient to alter categorization processes.


Subject(s)
Aging/psychology , Bias , Cognition , Face , Pattern Recognition, Visual , Adolescent , Adult , Age Factors , Female , Humans , Male , Young Adult
10.
Psychol Sci ; 30(5): 728-738, 2019 05.
Article in English | MEDLINE | ID: mdl-30908116

ABSTRACT

The beard is arguably one of the most obvious signals of masculinity in humans. Almost 150 years ago, Darwin suggested that beards evolved to communicate formidability to other males, but no studies have investigated whether beards enhance recognition of threatening expressions, such as anger. We found that the presence of a beard increased the speed and accuracy with which participants recognized displays of anger but not happiness (Experiment 1, N = 219). This effect was not due to negative evaluations shared by beardedness and anger or to negative stereotypes associated with beardedness, as beards did not facilitate recognition of another negative expression, sadness (Experiment 2, N = 90), and beards increased the rated prosociality of happy faces in addition to the rated masculinity and aggressiveness of angry faces (Experiment 3, N = 445). A computer-based emotion classifier reproduced the influence of beards on emotion recognition (Experiment 4). The results suggest that beards may alter perceived facial structure, facilitating rapid judgments of anger in ways that conform to evolutionary theory.


Subject(s)
Aggression/psychology , Agonistic Behavior/physiology , Emotions/physiology , Facial Recognition/physiology , Adult , Anger/physiology , Facial Expression , Female , Humans , Male , Masculinity , Middle Aged , Reaction Time/physiology , Sex Characteristics , Socialization
11.
Br J Psychol ; 110(4): 635-651, 2019 Nov.
Article in English | MEDLINE | ID: mdl-30676648

ABSTRACT

Young adults recognize other young adult faces more accurately than older adult faces, an effect termed the own-age bias (OAB). The categorization-individuation model (CIM) proposes that recognition memory biases like the OAB occur because unfamiliar faces are initially quickly categorized. In-group faces are seen as socially relevant, which motivates the processing of individuating facial features; out-group faces are processed more superficially, with attention to category-specific information, which hinders subsequent recognition. To examine the roles of categorization and individuation in the context of the OAB, participants completed a face recognition task and a speeded age categorization task including young and older adult faces. In the recognition task, half of the participants were given instructions aimed to encourage individuation of other-age faces. An OAB emerged that was not influenced by individuation instructions, but the magnitude of the OAB was correlated with performance in the categorization task: the larger the categorization advantage for older adult over young adult faces, the larger the OAB. These results support the premise that social categorization processes can affect the subsequent recognition of own- and other-age faces, but do not provide evidence for the effectiveness of individuation instructions in reducing the OAB.


Subject(s)
Facial Recognition/physiology , Individuation , Adolescent , Adult , Aged , Aged, 80 and over , Attention/physiology , Face , Female , Humans , Male , Motivation , Social Identification , Young Adult
12.
Emotion ; 19(7): 1206-1213, 2019 Oct.
Article in English | MEDLINE | ID: mdl-30265077

ABSTRACT

We are better at recognizing faces of our own age group than faces of other age groups. It has been suggested that this own-age bias (OAB) might occur because of perceptual-expertise and/or social-cognitive mechanisms. Although there is evidence to suggest effects of perceptual expertise, little research has explored the role of social-cognitive factors. To do so, we looked at how the presence of an emotional expression on the face changes the magnitude of the OAB. Across 3 experiments, young adult participants were presented with young and older adult faces to remember. Neutral faces were first presented alone (Experiment 1) to validate the proposed paradigm and then presented along with angry (Experiment 2) and sad or happy faces (Experiment 3). The presence of an emotional expression improved the recognition of older adult faces, reducing the OAB that was evident for neutral faces. These results support the involvement of social-cognitive factors in the OAB, suggesting that a perceptual-expertise account cannot fully explain this face recognition bias. (PsycINFO Database Record (c) 2019 APA, all rights reserved).


Subject(s)
Expressed Emotion/physiology , Reaction Time/physiology , Adult , Bias , Female , Humans , Male
13.
Emotion ; 19(8): 1495-1499, 2019 Dec.
Article in English | MEDLINE | ID: mdl-30475034

ABSTRACT

Previous research has demonstrated that facial social category cues influence emotion perception such that happy expressions are categorized faster than negative expressions on faces belonging to positively evaluated social groups. We examined whether experimentally manipulated character information can also influence emotion perception. Across two experiments, participants learned to associate individuals posing neutral expressions with positive or negative acts. In a subsequent task, participants categorized happy and angry expressions of these same individuals as quickly and accurately as possible. As predicted, a larger happy face advantage emerged for individuals associated with positive character information than for individuals associated with negative character information. These results demonstrate that experimentally manipulated evaluations of an individual's character are available quickly and affect early stages of face processing. Emotion perception is influenced not only by preexisting attitudes based on facial attributes, but also by recently acquired information about a person. (PsycINFO Database Record (c) 2019 APA, all rights reserved).


Subject(s)
Emotions/physiology , Adult , Character , Female , Humans , Male , Perception , Young Adult
14.
Emotion ; 19(6): 1070-1080, 2019 Sep.
Article in English | MEDLINE | ID: mdl-30234330

ABSTRACT

A happy face advantage has consistently been shown in emotion categorization tasks; happy faces are categorized as happy faster than angry faces as angry. Furthermore, social category cues, such as facial sex and race, moderate the happy face advantage in evaluatively congruent ways with a larger happy face advantage for more positively evaluated faces. We investigated whether attractiveness, a facial attribute unrelated to more defined social categories, would moderate the happy face advantage consistent with the evaluative congruence account. A larger happy face advantage for the more positively evaluated attractive faces than for unattractive faces was predicted. Across 4 experiments participants categorized attractive and unattractive faces as happy or angry as quickly and accurately as possible. As predicted, when female faces were categorized separately, a happy face advantage emerged for the attractive females but not for the unattractive females. Corresponding results were only found in the error rates for male faces. This pattern was confirmed when female and male faces were categorized together, indicating that attractiveness may have a stronger influence on emotion perception for female faces. Attractiveness is shown to moderate emotion perception in line with the evaluative congruence account and is suggested to have a stronger influence on emotion perception than facial sex cues in contexts where attractiveness is a salient evaluative dimension. (PsycINFO Database Record (c) 2019 APA, all rights reserved).


Subject(s)
Emotions/physiology , Facial Expression , Happiness , Adult , Female , Humans , Male
15.
Behav Res Ther ; 108: 10-17, 2018 09.
Article in English | MEDLINE | ID: mdl-29966993

ABSTRACT

In associative learning, if stimulus A is presented in the same temporal context as the conditional stimulus (CS)-outcome association (but not in a way that allows an A-CS association to form), it becomes a temporal context cue, acquiring the ability to activate this context and retrieve the CS-outcome association. We examined whether a CS- presented during acquisition or extinction that predicted the absence of the unconditional stimulus (US) could act as a temporal context cue, reducing or enhancing responding, in differential fear conditioning. Two groups received acquisition (CSx-US, CSa-noUS) in phase 1 and extinction (CSx-noUS, CSe-noUS) in phase 2 (AE groups), and two groups received extinction in phase 1 and acquisition in phase 2 (EA groups). After a delay, participants were presented with either CSa (AEa and EAa groups) or CSe (AEe and EAe groups). Responding to CSx was enhanced after presentation of CSa but reduced after presentation of CSe, suggesting that training was segmented into two learning episodes and that the unreinforced CS present during an episode retrieved the CSx-US or CSx-noUS association. These findings suggest that temporal context cues may enhance or reduce fear responding, providing an exciting new avenue for relapse prevention research.


Subject(s)
Conditioning, Classical , Cues , Fear/psychology , Adolescent , Adult , Extinction, Psychological , Female , Humans , Male , Reinforcement, Psychology , Time Factors , Young Adult
16.
Br J Psychol ; 109(4): 736-757, 2018 Nov.
Article in English | MEDLINE | ID: mdl-29536506

ABSTRACT

Young adult participants are faster to detect young adult faces in crowds of infant and child faces than vice versa. These findings have been interpreted as evidence for more efficient attentional capture by own-age than other-age faces, but could alternatively reflect faster rejection of other-age than own-age distractors, consistent with the previously reported other-age categorization advantage: faster categorization of other-age than own-age faces. Participants searched for own-age faces in other-age backgrounds or vice versa. Extending the finding to different other-age groups, young adult participants were faster to detect young adult faces in both early adolescent (Experiment 1) and older adult backgrounds (Experiment 2). To investigate whether the own-age detection advantage could be explained by faster categorization and rejection of other-age background faces, participants in Experiments 3 and 4 also completed an age categorization task. Relatively faster categorization of other-age faces was related to relatively faster search through other-age backgrounds on target-absent trials but not target-present trials. These results confirm that other-age faces are more quickly categorized and searched through and that categorization and search processes are related; however, this correlational approach could not confirm or reject the contribution of background face processing to the own-age detection advantage.


Subject(s)
Attention/physiology , Judgment/physiology , Visual Perception/physiology , Adolescent , Adult , Age Factors , Face , Female , Humans , Male , Pattern Recognition, Visual/physiology , Photic Stimulation , Reaction Time/physiology , Young Adult
17.
Cogn Emot ; 32(2): 350-362, 2018 03.
Article in English | MEDLINE | ID: mdl-28366112

ABSTRACT

Facial attributes such as race, sex, and age can interact with emotional expressions; however, only a couple of studies have investigated the nature of the interaction between facial age cues and emotional expressions, and these have produced inconsistent results. Additionally, these studies have not addressed the mechanism(s) driving the influence of facial age cues on emotional expression, or vice versa. In the current study, participants categorised young and older adult faces expressing happiness and anger (Experiment 1) or sadness (Experiment 2) by their age and their emotional expression. Age cues moderated categorisation of happiness vs. anger and sadness in the absence of an influence of emotional expression on age categorisation times. This asymmetrical interaction suggests that facial age cues are obligatorily processed prior to emotional expressions. Finding a categorisation advantage for happiness expressed on young faces relative to both anger and sadness, which are negative in valence but differ in their congruence with old-age stereotypes or structural overlap with age cues, suggests that the observed influence of facial age cues on emotion perception is due to the congruence between relatively positive evaluations of young faces and happy expressions.


Subject(s)
Aging , Cues , Emotions/physiology , Facial Expression , Photic Stimulation/methods , Adult , Anger , Face , Female , Happiness , Humans , Male , Sadness , Young Adult
18.
Atten Percept Psychophys ; 79(7): 2212-2223, 2017 Oct.
Article in English | MEDLINE | ID: mdl-28681183

ABSTRACT

The magnitude of the happy categorisation advantage, the faster recognition of happiness than negative expressions, is influenced by facial race and sex cues. Previous studies have investigated these relationships using racial outgroups stereotypically associated with physical threat in predominantly Caucasian samples. To determine whether these influences generalise to stimuli representing other ethnic groups and to participants of different ethnicities, Caucasian Australian (Experiments 1 and 2) and Chinese participants (Experiment 2) categorised happy and angry expressions displayed on own-race male faces presented with emotional other-race male, own-race female, and other-race female faces in separate tasks. The influence of social category cues on the happy categorisation advantage was similar in the Australian and Chinese samples. In both samples, the happy categorisation advantage was present for own-race male faces when they were encountered with other-race male faces but reduced when own-race male faces were categorised along with female faces. The happy categorisation advantage was present for own-race and other-race female faces when they were encountered with own-race male faces in both samples. Results suggest similarity in the influence of social category cues on emotion categorisation.


Subject(s)
Asian People/psychology , Cues , Emotions/physiology , Facial Expression , Photic Stimulation/methods , White People/psychology , Adolescent , Anger/physiology , Australia/epidemiology , Facial Recognition/physiology , Female , Happiness , Humans , Male , Racial Groups/psychology , Reaction Time/physiology , Sex Factors , Young Adult
19.
Emotion ; 17(1): 28-39, 2017 02.
Article in English | MEDLINE | ID: mdl-27379894

ABSTRACT

The speed of recognizing facial expressions of emotion is influenced by a range of factors, including other concurrently present facial attributes, such as a person's sex. Typically, when participants categorize happy and angry expressions on male and female faces, they are faster to categorize happy than angry expressions displayed by females, but not those displayed by males. Using the same emotional faces across tasks, we demonstrate that this influence of sex cues on emotion categorization depends on the other faces recently encountered in an experiment. Altering the salience of gender by presenting male and female faces in separate emotion categorization tasks rather than together in a single task changed the influence of sex cues on emotion categorization, whereas changing the evaluative dimension by presenting happy and angry expressions in separate tasks alongside neutral faces rather than together within one task did not. These results suggest that the way facial attributes influence emotion categorization depends on the situation in which the faces are encountered, and specifically on what information is made salient within or across tasks by other recently encountered faces. (PsycINFO Database Record (c) 2017 APA, all rights reserved).


Subject(s)
Emotions/physiology , Facial Expression , Sexual Behavior/psychology , Adult , Cues , Female , Humans , Male
20.
Cogn Emot ; 31(7): 1493-1501, 2017 11.
Article in English | MEDLINE | ID: mdl-27499098

ABSTRACT

Facial race and sex cues can influence the magnitude of the happy categorisation advantage. It has been proposed that implicit race- or sex-based evaluations drive this influence. Within this account, a uniform influence of social category cues on the happy categorisation advantage should be observed for all negative expressions. Support has been shown with angry and sad expressions, but evidence to the contrary has been found for fearful expressions. To determine the generality of the evaluative congruence account, participants categorised happiness with either sadness, fear, or surprise displayed on White male as well as White female, Black male, or Black female faces across three experiments. Faster categorisation of happy than negative expressions was observed for female faces when presented among White male faces, and for White male faces when presented among Black male faces. These results support the evaluative congruence account when both positive and negative expressions are presented.


Subject(s)
Black People/psychology , Cues , Facial Expression , Happiness , Social Identification , White People/psychology , Adult , Female , Humans , Male , Sex Factors , Young Adult