1. Cogn Affect Behav Neurosci; 22(1): 57-74, 2022 Feb.
Article in English | MEDLINE | ID: mdl-34498230

ABSTRACT

Whilst research has largely focused on the recognition of emotional items, emotion may be a more subtle part of our surroundings, conveyed by context rather than by items. Using ERPs, we investigated what effects an arousing context during encoding may have on item-context binding and on subsequent familiarity-based and recollection-based item-memory. It has been suggested that arousal could facilitate item-context binding and thereby enhance the contribution of recollection to subsequent memory judgements. Alternatively, arousal could shift attention onto the central features of a scene and thereby foster unitisation during encoding, which could boost the contribution of familiarity to remembering. Participants learnt neutral objects paired with highly ecologically valid emotional faces whose names later served as neutral cues during an immediate and a delayed test phase. Participants identified objects faster when they had originally been studied together with emotional context faces. Items with both neutral and emotional contexts elicited an early frontal ERP old/new difference (200-400 ms). Neither the neurophysiological correlate of familiarity nor that of recollection was specific to emotionality. For the ERP correlate of recollection, we found an interaction between stimulus type and day, suggesting that this measure decreased to a larger extent on Day 2 than on Day 1. However, we found no direct evidence for delayed forgetting of items encoded in emotional contexts on Day 2. Emotion at encoding might make retrieval of items with emotional context more readily accessible, but we found no significant evidence that emotional context facilitated either familiarity-based or recollection-based item-memory after a delay of 24 h.
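The early frontal old/new effect mentioned here is conventionally quantified as the difference in mean ERP amplitude between correctly recognised old items and new items within a time window at frontal electrodes. Below is a minimal sketch of that computation; the sampling rate, baseline length, array names, and placeholder data are assumptions for illustration, not the authors' pipeline.

```python
# Hedged sketch: mean-amplitude old/new difference in a 200-400 ms window.
# All parameters and data below are assumed, not taken from the study.
import numpy as np

FS = 500                # sampling rate in Hz (assumed)
BASELINE_SAMPLES = 100  # 200 ms pre-stimulus baseline (assumed)

def mean_amplitude(epochs, t_start=0.2, t_end=0.4):
    """Mean amplitude per trial in a post-stimulus time window.

    epochs: (n_trials, n_samples) baseline-corrected data at one frontal
    channel, with stimulus onset at sample BASELINE_SAMPLES.
    """
    i0 = BASELINE_SAMPLES + int(t_start * FS)
    i1 = BASELINE_SAMPLES + int(t_end * FS)
    return epochs[:, i0:i1].mean(axis=1)

# Placeholder epochs standing in for recorded data; a more positive mean
# for old than new items in this window indexes the FN400-like effect.
rng = np.random.default_rng(0)
old_epochs = rng.normal(1.0, 2.0, size=(120, 400))
new_epochs = rng.normal(0.0, 2.0, size=(120, 400))

effect = mean_amplitude(old_epochs).mean() - mean_amplitude(new_epochs).mean()
print(f"old/new difference (200-400 ms): {effect:.2f} microvolts")
```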


Subject(s)
Electroencephalography, Evoked Potentials, Emotions/physiology, Evoked Potentials/physiology, Humans, Mental Recall/physiology, Recognition, Psychology/physiology
2. Emotion; 17(6): 912-937, 2017 Sep.
Article in English | MEDLINE | ID: mdl-28252978

ABSTRACT

[Correction Notice: An Erratum for this article was reported in Vol 17(6) of Emotion (see record 2017-18585-001). In the article, the copyright attribution was incorrectly listed and the Creative Commons CC-BY license disclaimer was incorrectly omitted from the author note. The correct copyright is "© 2017 The Author(s)" and the omitted disclaimer is below. All versions of this article have been corrected. "This article has been published under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Copyright for this article is retained by the author(s). Author(s) grant(s) the American Psychological Association the exclusive right to publish the article and identify itself as the original publisher."]

Emotions are a vital component of social communication, carried across a range of modalities and via different perceptual signals such as specific muscle contractions in the face and in the upper respiratory system. Previous studies have found that emotion recognition impairments after brain damage depend on the modality of presentation: recognition from faces may be impaired whereas recognition from voices remains preserved, and vice versa. On the other hand, there is also evidence for shared neural activation during emotion processing in both modalities. In a behavioral study, we investigated whether there are shared representations in the recognition of emotions from faces and voices. We used a within-subjects design in which participants rated the intensity of facial expressions and nonverbal vocalizations for each of the 6 basic emotion labels. For each participant and each modality, we then computed a representation matrix with the intensity ratings of each emotion. These matrices allowed us to examine the patterns of confusion between emotions and to characterize the representations of emotions within each modality. We then compared the representations across modalities by computing the correlations of the representation matrices across faces and voices. We found highly correlated matrices across modalities, which suggests similar representations of emotions across faces and voices. We also showed that these results could not be explained by commonalities between the low-level visual and acoustic properties of the stimuli. We thus propose that there are similar or shared coding mechanisms for emotions which may act independently of modality, despite their distinct perceptual inputs. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
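The representation-matrix comparison described above lends itself to a compact illustration: average the intensity ratings into a presented-emotion by rated-label matrix per modality, then correlate the flattened matrices across faces and voices. The sketch below uses simulated ratings for a single participant; the shapes, names, and data are illustrative assumptions, not the study's materials.

```python
# Hedged sketch: per-modality representation matrices of mean intensity
# ratings (6 presented emotions x 6 rated labels), compared across
# modalities by correlating the flattened matrices. Data are simulated.
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def representation_matrix(ratings):
    """ratings: (n_trials_per_cell, 6 presented, 6 rated) -> 6x6 mean matrix."""
    return ratings.mean(axis=0)

# Diagonal-dominant placeholder ratings: each emotion is rated most
# intensely on its own label, with noise, for one hypothetical participant.
rng = np.random.default_rng(1)
base = np.eye(6) * 5 + 2
face_ratings = base + rng.normal(0, 0.5, size=(20, 6, 6))
voice_ratings = base + rng.normal(0, 0.5, size=(20, 6, 6))

face_rm = representation_matrix(face_ratings)
voice_rm = representation_matrix(voice_ratings)

# Similarity of emotion representations across modalities: Pearson
# correlation of the flattened 6x6 matrices.
r = np.corrcoef(face_rm.ravel(), voice_rm.ravel())[0, 1]
print(f"face-voice representation correlation: r = {r:.2f}")
```

A high correlation under this scheme indicates that the pattern of confusions between emotions is similar in the two modalities, which is the sense in which the abstract argues for shared representations.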


Subject(s)
Comprehension/physiology, Emotions, Facial Expression, Voice, Acoustic Stimulation, Adolescent, Adult, Face, Female, Humans, Male, Middle Aged, Photic Stimulation, Young Adult