Results 1 - 2 of 2
1.
Psychol Res; 85(8): 2954-2969, 2021 Nov.
Article in English | MEDLINE | ID: mdl-33236175

ABSTRACT

Although it is taken for granted in the development of many automatic facial expression recognition tools, the coherence between subjective feelings and facial expressions is still a matter of debate. On the one hand, the "Basic Emotion View" holds that emotions are genetically hardwired and therefore genuinely displayed through facial expressions; emotion recognition is thus perceiver independent. On the other hand, the constructivist approach holds that emotions are socially constructed, the emotional meaning of a facial expression being inferred by the perceiver; emotion recognition is thus perceiver dependent. In order (1) to evaluate the coherence between the subjective feeling of emotions and their spontaneous facial displays, and (2) to compare the recognition of such displays by human perceivers and by an automatic facial expression classifier, 232 videos of expressers recruited for an emotion elicitation task were annotated by 1383 human perceivers as well as by Affdex, an automatic classifier. Results show weak consistency between the emotional states self-reported by expressers and their facial emotional displays. They also show that both human perceivers and the automatic classifier had low accuracy when inferring the subjective feeling from the spontaneous facial expressions displayed by expressers. Overall, the results lean toward a perceiver-dependent view. On this basis, the hypothesis of genetically hardwired emotions genuinely displayed in the face is difficult to support, whereas the idea that emotion and facial expression are socially constructed appears more likely. Accordingly, automatic emotion recognition tools based on facial expressions should be questioned.
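
The agreement analysis summarized above can be illustrated with a minimal, hypothetical sketch (not taken from the article): given one self-reported emotion label per video and one label inferred by a perceiver or by a classifier such as Affdex, raw accuracy and a chance-corrected index such as Cohen's kappa quantify the (weak) consistency reported. The labels below are invented toy data.

# Hypothetical sketch: measuring agreement between expressers' self-reported
# emotion labels and the labels inferred by perceivers or by a classifier.
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Invented toy labels, one per video (the study annotated 232 videos).
self_reported = ["happiness", "disgust", "surprise", "neutral", "happiness"]
inferred = ["happiness", "neutral", "surprise", "neutral", "surprise"]

accuracy = accuracy_score(self_reported, inferred)   # raw hit rate
kappa = cohen_kappa_score(self_reported, inferred)   # chance-corrected agreement
print(f"accuracy = {accuracy:.2f}, Cohen's kappa = {kappa:.2f}")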


Subject(s)
Facial Expression; Facial Recognition; Emotions; Face; Humans; Recognition, Psychology
2.
Front Psychol; 9: 1190, 2018.
Article in English | MEDLINE | ID: mdl-30050487

ABSTRACT

This study examines the precise temporal dynamics of emotional facial decoding as it unfolds in the brain, according to the emotion displayed. To characterize this processing as it occurs in ecological settings, we focused on unconstrained visual exploration of natural emotional faces (i.e., free eye movements). The General Linear Model (GLM; Smith and Kutas, 2015a,b; Kristensen et al., 2017a) enables such a depiction: it deconvolves the adjacent, overlapping responses formed by the eye fixation-related potentials (EFRPs) elicited by successive fixations and the event-related potentials (ERPs) elicited at stimulus onset. Nineteen participants were shown spontaneous static facial expressions of emotion (Neutral, Disgust, Surprise, and Happiness) from the DynEmo database (Tcherkassof et al., 2013). Behavioral results on participants' eye movements show reliance on the usual diagnostic features in emotional decoding (the eyes for negative facial displays and the mouth for positive ones), consistent with the literature. The impact of emotional category on both the ERPs and the EFRPs elicited by free exploration of the emotional faces is observed in the temporal dynamics of emotional facial expression processing. Regarding the ERP at stimulus onset, the ERPs computed by averaging show a significant emotion-dependent modulation of the amplitude of the P2-P3 complex and the LPP component at the left frontal site. The GLM, however, reveals the impact of subsequent fixations on the ERPs time-locked to stimulus onset. Results are also in line with the valence hypothesis. The differences observed between the two estimation methods (averaging vs. GLM) suggest a predominance of the right hemisphere at stimulus onset and an involvement of the left hemisphere in processing the information encoded by subsequent fixations. Concerning the first EFRP, the Lambda response and the P2 component are modulated at parieto-occipital sites by surprise relative to the neutral condition, suggesting an impact of high-level factors. No difference is observed on the second and subsequent EFRPs. Taken together, the results highlight the significant gain obtained by analyzing EFRPs with the GLM method and pave the way toward efficient analyses of ecological, dynamic emotional stimuli.
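
As a rough illustration of the deconvolution idea behind such a GLM approach (a minimal sketch under stated assumptions, not the authors' pipeline), overlapping stimulus-locked and fixation-locked responses can be estimated jointly by building a design matrix with one lagged regressor per time point for each event type and solving it by least squares. The function name, onsets, and simulated signal below are all hypothetical.

# Hypothetical sketch: GLM deconvolution of overlapping ERP and EFRP responses.
import numpy as np

def deconvolve(eeg, stim_onsets, fix_onsets, n_lags):
    """Jointly estimate a stimulus-locked and a fixation-locked response
    (each n_lags samples long) from a single-channel EEG signal."""
    n = len(eeg)
    X = np.zeros((n, 2 * n_lags))
    for lag in range(n_lags):
        for t in stim_onsets:                 # stimulus-onset regressors
            if t + lag < n:
                X[t + lag, lag] = 1.0
        for t in fix_onsets:                  # fixation-onset regressors
            if t + lag < n:
                X[t + lag, n_lags + lag] = 1.0
    beta, *_ = np.linalg.lstsq(X, eeg, rcond=None)   # least-squares GLM fit
    return beta[:n_lags], beta[n_lags:]

# Toy usage with simulated data (onsets and signal are invented).
rng = np.random.default_rng(0)
eeg = rng.normal(size=2000)
erp, efrp = deconvolve(eeg, stim_onsets=[100, 900],
                       fix_onsets=[250, 400, 1100], n_lags=150)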
