Results 1 - 12 of 12
1.
Biol Psychol ; 146: 107723, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31255686

ABSTRACT

Faces showing happy, angry or fearful expressions were presented in emotionally congruent or incongruent situational contexts (short sentences describing events that would usually provoke happiness, anger or fear). The participants were assigned the task of judging whether the expression was appropriate to the context (congruency judgment task). Effects of emotional congruency were observed at both the behavioral and electrophysiological levels. Behavioral results showed evidence of congruency effects based on specific emotion content (e.g., less accurate and slower responses to fear faces in angry contexts). Event-related potential (ERP) results also showed emotional congruency effects at different post-stimulus onset latencies, beginning with the face-sensitive N170 component. An effect of emotional congruency was also shown on the N400 component, which is typically sensitive to semantic congruency. Finally, a late positive potential (LPP), appearing at 450-650 ms post-stimulus onset, showed a complex pattern of effects, with modulations driven by the different combinations of contexts and target expressions. These results are interpreted in terms of a double process of valence and emotion checking that is thought to underlie affective processing and contextual integration of facial expressions of emotion.


Subject(s)
Emotions/physiology , Facial Expression , Adolescent , Adult , Anger , Electroencephalography , Evoked Potentials/physiology , Fear/psychology , Female , Happiness , Humans , Judgment , Male , Psychomotor Performance/physiology , Young Adult
2.
Brain Sci ; 9(5)2019 May 17.
Article in English | MEDLINE | ID: mdl-31109022

ABSTRACT

Faces showing expressions of happiness or anger were presented together with sentences that described happiness-inducing or anger-inducing situations. Two main variables were manipulated: (i) congruency between contexts and expressions (congruent/incongruent) and (ii) the task assigned to the participant: discriminating the emotion shown by the target face (emotion task) or judging whether the expression shown by the face was congruent with the context (congruency task). Behavioral and electrophysiological (event-related potential, ERP) results showed that processing of facial expressions was jointly influenced by congruency and task demands. ERP results revealed task effects at frontal sites, with larger positive amplitudes between 250 and 450 ms in the congruency task, reflecting the higher cognitive effort required by this task. Effects of congruency appeared at latencies and locations corresponding to the early posterior negativity (EPN) and late positive potential (LPP) components, which have previously been found to be sensitive to emotion and affective congruency. The magnitude and spatial distribution of the congruency effects varied depending on the task and the target expression. These results are discussed in terms of the modulatory role of context in facial expression processing and the different mechanisms underlying the processing of expressions of positive and negative emotions.

3.
Acta Psychol (Amst) ; 187: 66-76, 2018 Jun.
Article in English | MEDLINE | ID: mdl-29751931

ABSTRACT

Contextual influences on responses to facial expressions of emotion were studied using a context-target paradigm that distinguished the effects of affective congruency (context and target of the same/different valence: positive or negative) from those of emotional congruency (context and target representing the same/different emotion: anger, fear, happiness). Sentences describing anger-, fear- or happiness-inducing events and faces expressing each of these emotions were used as contexts and targets, respectively. While between-valence comparisons (context and target of similar/different valence) revealed affective congruency effects, within-valence comparisons (context and target of similar valence and same/different emotion) revealed emotional congruency effects. In Experiment 1, no evidence of emotional congruency and limited evidence of affective congruency were found with an evaluative task. In Experiment 2, effects of both affective and emotional congruency were observed with an emotion recognition task. In this case, angry and fearful faces were recognized faster in emotionally congruent contexts. In Experiment 3, the participants were asked explicitly to judge the emotional congruency of the target faces. Emotional congruency effects were again found, with faster judgments of angry and fearful faces in the corresponding emotional contexts. Moreover, judgments of angry expressions were faster and more accurate in happy than in anger contexts. Thus, participants found it easier to decide that angry faces did not match a happy context than to judge that they did match an anger context. These results suggest that there are differences in the way that facial expressions of positive and negative emotions are discriminated and integrated with their contexts. Specifically, compared to positive expressions, contextual integration of negative expressions seems to require a double check of the valence and the specific emotion category of the expression and the context.


Subject(s)
Emotions/physiology , Facial Expression , Photic Stimulation/methods , Psychomotor Performance/physiology , Adolescent , Affect/physiology , Anger/physiology , Decision Making/physiology , Fear/physiology , Female , Happiness , Humans , Judgment/physiology , Male , Young Adult
4.
Biol Psychol ; 112: 27-38, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26450006

ABSTRACT

The influence of explicit evaluative processes on the contextual integration of facial expressions of emotion was studied in a procedure that required the participants to judge the congruency of happy and angry faces with preceding sentences describing emotion-inducing situations. Judgments were faster on congruent trials in the case of happy faces and on incongruent trials in the case of angry faces. At the electrophysiological level, a congruency effect was observed in the face-sensitive N170 component that showed larger amplitudes on incongruent trials. An interactive effect of congruency and emotion appeared on the LPP (late positive potential), with larger amplitudes in response to happy faces that followed anger-inducing situations. These results show that the deliberate intention to judge the contextual congruency of facial expressions influences not only processes involved in affective evaluation such as those indexed by the LPP but also earlier processing stages that are involved in face perception.


Subject(s)
Attention/physiology , Brain/physiology , Emotions/physiology , Facial Expression , Adolescent , Adult , Anger/physiology , Cues , Electroencephalography , Face , Female , Happiness , Humans , Male , Photic Stimulation/methods , Young Adult
5.
Front Psychol ; 5: 1431, 2014.
Article in English | MEDLINE | ID: mdl-25540631

ABSTRACT

Visual perception in schizophrenia is attracting broad interest given the deep knowledge we have about the visual system in healthy populations. One example is the class of effects known collectively as visual surround suppression: the visibility of a grating located in the visual periphery is impaired by the presence of a surrounding grating of the same spatial frequency and orientation. Previous studies have suggested abnormal visual surround suppression in patients with schizophrenia. Given that schizophrenia patients have cortical alterations, including hypofunction of NMDA receptors and reduced GABA concentration, which affect lateral inhibitory connections, they should be relatively better than controls at detecting visual stimuli that are usually suppressed. We tested this hypothesis by measuring contrast detection thresholds using a new stimulus configuration in two groups: 21 schizophrenia patients and 24 healthy subjects. Thresholds were obtained using Bayesian staircases in a four-alternative forced-choice detection task where the target was a grating within a 3° Butterworth window that appeared in one of four possible positions at 5° eccentricity. We compared three conditions: (a) target with no surround, (b) target embedded within a surrounding grating of 20° diameter and 25% contrast with the same spatial frequency and orthogonal orientation, and (c) target embedded within a surrounding grating with parallel (same) orientation. Previous results with healthy populations have shown that contrast thresholds are lower in the orthogonal-surround and no-surround (NS) conditions than in the parallel-surround (PS) condition. The log-ratio between PS and NS thresholds is used as an index quantifying visual surround suppression. Patients performed poorly compared to controls in the NS and orthogonal-surround conditions.
However, they performed as well as controls when the surround was parallel, resulting in significantly lower suppression indices in patients. To examine whether the difference in suppression was driven by the lower NS thresholds for controls, we examined a matched subgroup of controls and patients, selected to have similar thresholds in the NS condition. Patients performed significantly better in the PS condition than controls. This analysis therefore indicates that a PS raised contrast thresholds less in patients than in controls. Our results support the hypothesis that inhibitory connections in early visual cortex are impaired in schizophrenia patients.
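The suppression index described above is simply the log-ratio of the parallel-surround (PS) threshold to the no-surround (NS) threshold. A minimal sketch of that computation follows; the function name and the threshold values are illustrative assumptions, not data from the study:

```python
import math

def suppression_index(ps_threshold: float, ns_threshold: float) -> float:
    """Log-ratio suppression index: positive values mean the parallel
    surround raised the contrast detection threshold (suppression)."""
    return math.log10(ps_threshold / ns_threshold)

# Hypothetical thresholds (% contrast), chosen only to illustrate the pattern
# reported above: similar PS thresholds in both groups, but a lower NS
# threshold in controls, yielding a larger suppression index for controls.
control_index = suppression_index(ps_threshold=8.0, ns_threshold=2.0)
patient_index = suppression_index(ps_threshold=8.0, ns_threshold=4.0)
print(control_index > patient_index)  # weaker suppression in patients
```

With these numbers the control index is log10(4) ≈ 0.60 and the patient index is log10(2) ≈ 0.30, matching the reported direction of the group difference.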

6.
Int J Psychophysiol ; 92(2): 59-66, 2014 May.
Article in English | MEDLINE | ID: mdl-24594443

ABSTRACT

The effects of task demands and the interaction between gender and expression in face perception were studied using event-related potentials (ERPs). Participants performed three different tasks with male and female faces that were emotionally inexpressive or that showed happy or angry expressions. In two of the tasks (gender and expression categorization) facial properties were task-relevant, while in a third task (symbol discrimination) facial information was irrelevant. Effects of expression were observed on the visual P100 component under all task conditions, suggesting the operation of an automatic process that is not influenced by task demands. The earliest interaction between expression and gender was observed later, in the face-sensitive N170 component. This component showed differential modulations by specific combinations of gender and expression (e.g., angry male vs. angry female faces). Main effects of expression and task were observed in a later occipito-temporal component peaking around 230 ms post-stimulus onset (EPN, or early posterior negativity), with less positive amplitudes in the presence of angry faces and during performance of the gender and expression tasks. Finally, task demands also modulated a positive component peaking around 400 ms (LPC, or late positive complex) that showed enhanced amplitude for the gender task. The pattern of results obtained here adds new evidence about the sequence of operations involved in face processing and the interaction of facial properties (gender and expression) in response to different task demands.


Subject(s)
Cerebral Cortex/physiology , Evoked Potentials/physiology , Face , Facial Expression , Visual Perception/physiology , Adolescent , Adult , Attention/physiology , Electroencephalography/instrumentation , Electroencephalography/methods , Female , Humans , Sex Factors , Social Perception , Time Factors , Young Adult
7.
Soc Neurosci ; 8(6): 601-20, 2013.
Article in English | MEDLINE | ID: mdl-24053118

ABSTRACT

Numerous studies using the event-related potential (ERP) technique have found that emotional expressions modulate ERP components that appear at different post-stimulus latencies and index different stages of face processing. With the aim of studying the time course of integration of context and facial expression information, we investigated whether these modulations are sensitive to the situational context in which emotional expressions are perceived. Participants were asked to identify the expression of target faces that were presented immediately after reading short sentences that described happiness- or anger-inducing situations. The main manipulation was the congruency between the emotional content of the sentences and the target expression. Context-independent amplitude modulation of the N170 and N400 components by emotional expression was observed. On the other hand, context effects appeared on a later component (late positive potential, or LPP), with enhanced amplitudes on incongruent trials. These results show that the early stages of face processing, where emotional expressions are coded, are not sensitive to verbal information about the situation in which they appear. The timing of context congruency effects suggests that integration of facial expression with situational information occurs at a later stage, probably related to the detection of affective congruency.


Subject(s)
Brain/physiology , Evoked Potentials, Visual/physiology , Facial Expression , Pattern Recognition, Visual/physiology , Adolescent , Electroencephalography , Face , Female , Humans , Male , Photic Stimulation , Signal Processing, Computer-Assisted , Young Adult
8.
Span J Psychol ; 16: E24, 2013.
Article in English | MEDLINE | ID: mdl-23866218

ABSTRACT

The possibility that facial expressions of emotion change the affective valence of faces through associative learning was explored using facial electromyography (EMG). In Experiment 1, EMG activity was recorded while the participants (N = 57) viewed sequences of neutral faces (Stimulus 1, or S1) changing to either a happy or an angry expression (Stimulus 2, or S2). As a consequence of learning, participants who showed patterning of facial responses in the presence of angry and happy faces, that is, higher Corrugator Supercilii (CS) activity in the presence of angry faces and higher Zygomaticus Major (ZM) activity in the presence of happy faces, also showed a similar pattern when viewing the corresponding S1 faces. Explicit evaluations made by an independent sample of participants (Experiment 2) showed that the evaluation of S1 faces changed according to the emotional expression with which they had been associated. These results are consistent with an interpretation of rapid facial reactions to faces as affective responses that reflect the valence of the stimulus and are sensitive to learned changes in the affective meaning of faces.
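The patterning criterion above (corrugator dominance for angry faces, zygomaticus dominance for happy faces) can be sketched as a simple per-participant check. The function name and the amplitude values are hypothetical illustrations, not the study's actual scoring procedure:

```python
def shows_emg_patterning(cs_angry: float, zm_angry: float,
                         cs_happy: float, zm_happy: float) -> bool:
    """True if mean EMG amplitudes show the expected patterning:
    Corrugator Supercilii (CS) higher than Zygomaticus Major (ZM)
    for angry faces, and ZM higher than CS for happy faces."""
    return cs_angry > zm_angry and zm_happy > cs_happy

# Hypothetical mean amplitudes (arbitrary units) for one participant:
print(shows_emg_patterning(cs_angry=1.4, zm_angry=0.6,
                           cs_happy=0.5, zm_happy=1.8))  # True
```

A participant failing either half of the criterion (e.g., no CS dominance for angry faces) would be classified as not showing patterning.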


Subject(s)
Association Learning , Emotions , Facial Expression , Facial Muscles , Recognition, Psychology , Adolescent , Adult , Electromyography , Female , Humans , Male , Pattern Recognition, Visual , Photic Stimulation , Young Adult
9.
Cogn Affect Behav Neurosci ; 13(2): 284-96, 2013 Jun.
Article in English | MEDLINE | ID: mdl-23263839

ABSTRACT

We studied the effect of facial expression primes on the evaluation of target words through a variant of the affective priming paradigm. In order to make the affective valence of the faces irrelevant to the task, the participants were assigned a double prime-target task in which they were unpredictably asked either to identify the gender of the face or to evaluate whether the word was pleasant or unpleasant. Behavioral and electrophysiological (event-related potential, or ERP) indices of affective priming were analyzed. Temporal and spatial versions of principal components analyses were used to detect and quantify those ERP components associated with affective priming. Although no significant behavioral priming was observed, electrophysiological indices showed a reverse priming effect, in the sense that the amplitude of the N400 was higher in response to congruent than to incongruent negative words. Moreover, a late positive potential (LPP), peaking around 700 ms, was sensitive to affective valence but not to prime-target congruency. This pattern of results is consistent with previous accounts of ERP effects in the affective priming paradigm that have linked the LPP with evaluative priming and the N400 with semantic priming. Our proposed explanation of the N400 priming effects obtained in the present study is based on two assumptions: a double check of affective stimuli in terms of valence and specific emotion content, and the differential specificities of facial expressions of positive and negative emotions.
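Temporal PCA, as used above to detect and quantify ERP components, treats time points as variables and the recorded waveforms (subject × condition × channel) as observations, so that each principal component is a temporal loading pattern spanning the epoch. A minimal sketch on simulated data follows, assuming numpy; the waveform shapes, sizes and noise level are illustrative, since the abstract does not specify the study's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated ERP data: 200 waveforms x 300 time points, built from two
# latent temporal components plus noise.
t = np.linspace(0.0, 0.9, 300)                    # seconds post-stimulus
comp_a = np.exp(-((t - 0.4) ** 2) / 0.005)        # earlier, N400-like peak
comp_b = np.exp(-((t - 0.7) ** 2) / 0.010)        # later, LPP-like peak
scores = rng.normal(size=(200, 2))                # per-waveform amplitudes
data = scores @ np.vstack([comp_a, comp_b]) + 0.05 * rng.normal(size=(200, 300))

# Temporal PCA via SVD of the column-centered data matrix:
# rows = observations, columns = time points.
centered = data - data.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = (s ** 2) / (s ** 2).sum()

# Two temporal components should capture nearly all systematic variance.
print(explained[:2].sum() > 0.9)  # True
```

A spatial PCA works the same way with electrodes as variables; applying one after the other (as in temporospatial PCA approaches) yields factors localized in both time and scalp topography.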


Subject(s)
Affect , Brain Mapping , Emotions/physiology , Evoked Potentials/physiology , Facial Expression , Adolescent , Adult , Analysis of Variance , Electroencephalography , Female , Humans , Male , Photic Stimulation , Reaction Time/physiology , Young Adult
10.
Span J Psychol ; 16: e24.1-e24.10, 2013.
Article in English | IBECS | ID: ibc-116252

ABSTRACT

The possibility that facial expressions of emotion change the affective valence of faces through associative learning was explored using facial electromyography (EMG). In Experiment 1, EMG activity was recorded while the participants (N = 57) viewed sequences of neutral faces (Stimulus 1, or S1) changing to either a happy or an angry expression (Stimulus 2, or S2). As a consequence of learning, participants who showed patterning of facial responses in the presence of angry and happy faces, that is, higher Corrugator Supercilii (CS) activity in the presence of angry faces and higher Zygomaticus Major (ZM) activity in the presence of happy faces, also showed a similar pattern when viewing the corresponding S1 faces. Explicit evaluations made by an independent sample of participants (Experiment 2) showed that the evaluation of S1 faces changed according to the emotional expression with which they had been associated. These results are consistent with an interpretation of rapid facial reactions to faces as affective responses that reflect the valence of the stimulus and are sensitive to learned changes in the affective meaning of faces.




Subject(s)
Humans , Male , Female , Stress, Psychological/psychology , Expressed Emotion/physiology , Affective Symptoms/psychology , Emotions/physiology , Facial Expression , Biofeedback, Psychology/methods , Electromyography
11.
Span J Psychol ; 14(2): 523-34, 2011 Nov.
Article in English | MEDLINE | ID: mdl-22059299

ABSTRACT

The results of two studies on the relationship between evaluations of trustworthiness, valence and arousal of faces are reported. In Experiment 1, valence and trustworthiness judgments of faces were positively correlated, while arousal was negatively correlated with both trustworthiness and valence. Experiment 2 investigated learning about faces based on their emotional expression and the extent to which this learning is influenced by perceived trustworthiness. Neutral faces of models differing in trustworthiness were repeatedly associated with happy or with angry expressions, and the participants were asked to categorize each neutral face as belonging to a "friend" or an "enemy" based on these associations. Four pairing conditions were defined in terms of the congruency between trustworthiness level and expression: trustworthy-congruent, trustworthy-incongruent, untrustworthy-congruent and untrustworthy-incongruent. Categorization accuracy during the learning phase and face evaluation after learning were measured. During learning, participants learned to categorize trustworthy and untrustworthy faces as friends or enemies with similar efficiency, and thus no effects of congruency were found. In the evaluation phase, faces of enemies were rated as more negative and arousing than those of friends, showing that learning was effective in changing the affective value of the faces. However, faces of untrustworthy models were still judged, on average, as more negative and arousing than those of trustworthy models. In conclusion, although face trustworthiness did not influence learning of associations between faces and positive or negative social information, it did have a significant influence on face evaluation that was manifest even after learning.


Subject(s)
Affect , Discrimination Learning , Facial Expression , Pattern Recognition, Visual , Trust , Adolescent , Arousal , Association Learning , Female , Friends/psychology , Humans , Interpersonal Relations , Judgment , Male , Young Adult
12.
Span J Psychol ; 14(2): 523-534, Nov 2011.
Article in English | IBECS | ID: ibc-91195

ABSTRACT

The results of two studies on the relationship between evaluations of trustworthiness, valence and arousal of faces are reported. In Experiment 1, valence and trustworthiness judgments of faces were positively correlated, while arousal was negatively correlated with both trustworthiness and valence. Experiment 2 investigated learning about faces based on their emotional expression and the extent to which this learning is influenced by perceived trustworthiness. Neutral faces of models differing in trustworthiness were repeatedly associated with happy or with angry expressions, and the participants were asked to categorize each neutral face as belonging to a "friend" or an "enemy" based on these associations. Four pairing conditions were defined in terms of the congruency between trustworthiness level and expression: trustworthy-congruent, trustworthy-incongruent, untrustworthy-congruent and untrustworthy-incongruent. Categorization accuracy during the learning phase and face evaluation after learning were measured. During learning, participants learned to categorize trustworthy and untrustworthy faces as friends or enemies with similar efficiency, and thus no effects of congruency were found. In the evaluation phase, faces of enemies were rated as more negative and arousing than those of friends, showing that learning was effective in changing the affective value of the faces. However, faces of untrustworthy models were still judged, on average, as more negative and arousing than those of trustworthy models. In conclusion, although face trustworthiness did not influence learning of associations between faces and positive or negative social information, it did have a significant influence on face evaluation that was manifest even after learning.




Subject(s)
Humans , Male , Female , Young Adult , Adult , Evaluation Studies as Topic , Students/psychology , Students, Health Occupations/psychology , Students, Health Occupations/statistics & numerical data , Learning/physiology , Affect , Facial Expression , Motivation/physiology , Reproducibility of Results , Analysis of Variance