1.
Cortex; 149: 148-164, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35231722

ABSTRACT

When we hear an emotional voice, does this alter how the brain perceives and evaluates a subsequent face? Here, we tested this question by comparing event-related potentials evoked by angry, sad, and happy faces following vocal expressions which varied in form (speech-embedded emotions, non-linguistic vocalizations) and emotional relationship (congruent, incongruent). Participants judged whether face targets were true exemplars of emotion (facial affect decision). Prototypicality decisions were more accurate and faster for congruent vs. incongruent faces and for targets that displayed happiness. Principal component analysis identified vocal context effects on faces in three distinct temporal factors: a posterior P200 (150-250 ms), associated with evaluating face typicality; a slow frontal negativity (200-750 ms) evoked by angry faces, reflecting enhanced attention to threatening targets; and the Late Positive Potential (LPP, 450-1000 ms), reflecting sustained contextual evaluation of intrinsic face meaning (with independent LPP responses in posterior and prefrontal cortex). Incongruent faces and faces primed by speech (compared to vocalizations) tended to increase demands on face perception at stages of structure-building (P200) and meaning integration (posterior LPP). The frontal LPP spatially overlapped with the earlier frontal negativity response; these components were functionally linked to expectancy-based processes directed towards the incoming face, governed by the form of a preceding vocal expression (especially for anger). Our results showcase differences in how vocalizations and speech-embedded emotion expressions modulate cortical operations for predicting (prefrontal) versus integrating (posterior) face meaning in light of contextual details.


Subjects
Facial Expression, Facial Recognition, Electroencephalography/methods, Emotions/physiology, Evoked Potentials/physiology, Facial Recognition/physiology, Humans
2.
Brain Cogn; 48(2-3): 499-504, 2002.
Article in English | MEDLINE | ID: mdl-12030496

ABSTRACT

This report describes some preliminary attributes of stimuli developed for future evaluation of nonverbal emotion in neurological populations with acquired communication impairments. Facial and vocal exemplars of six target emotions were elicited from four male and four female encoders and then prejudged by 10 young decoders to establish the category membership of each item at an acceptable consensus level. Representative stimuli were then presented to 16 additional decoders to gather indices of how category membership and encoder gender influenced recognition accuracy of emotional meanings in each nonverbal channel. Initial findings pointed to greater facility in recognizing target emotions from facial than vocal stimuli overall and revealed significant accuracy differences among the six emotions in both the vocal and facial channels. The gender of the encoder portraying emotional expressions was also a significant factor in how well decoders recognized specific emotions (disgust, neutral), but only in the facial condition.


Subjects
Affect, Facial Expression, Neuropsychological Tests, Nonverbal Communication, Voice, Adult, Cues (Psychology), Female, Humans, Male, Middle Aged, Recognition (Psychology), Speech Perception, Visual Perception