1.
Cogn Process ; 2024 Aug 24.
Article in English | MEDLINE | ID: mdl-39180634

ABSTRACT

Emoticons have been considered pragmatic cues that enhance emotional expressivity during computer-mediated communication. Yet it is unclear how emoticons are processed in ambiguous text-based communication, where the emoticon's emotional valence may be incongruent with its context. In this study, we investigated the electrophysiological correlates of contextual influence on the early emotional processing of emoticons during an emotional congruence judgment task. Participants were instructed to judge the congruence between a text message expressing an emotional situation (positive or negative) and a subsequent emoticon expressing a positive or negative emotion. We analyzed early event-related potentials elicited by emoticons related to face processing (N170) and to emotional salience in visual perception (Early Posterior Negativity, EPN). Our results show that accuracy and reaction times depend on the interaction between the emotional valence of the context and that of the emoticon. Negative emoticons elicited a larger N170, suggesting that the emotional information of the emoticon is integrated at the early stages of the perceptual process. During emoticon processing, a valence effect was observed, with enhanced EPN amplitudes in occipital areas for emoticons of negative valence. Moreover, we observed a congruence effect at parieto-temporal sites within the same time window, with larger amplitudes for the congruent condition. We conclude that, similar to faces, emoticons are processed differently according to their emotional content and the context in which they are embedded. A congruent context might enhance the emotional salience of the emoticon (and therefore its emotional expression) during the early stages of processing.

2.
Psychol. av. discip ; 13(2): 95-106, Jul.-Dec. 2019. tab, graf
Article in Spanish | LILACS | ID: biblio-1250600

ABSTRACT

The aim of this study was to confirm the face inversion effect using calm, fearful, and angry faces taken from the NSTIM database of calm and emotional faces, presented upright and inverted for 2000 ms. The results showed no modulation of the N170 component, in either latency or amplitude, for inverted faces compared with upright faces. There was also no difference in response accuracy or reaction times between inverted and upright faces. These results can probably be explained by the prolonged presentation of the inverted faces, which gave participants enough time to reorganize the elements that compose the face. The contribution of this study is to show that the presentation time of inverted faces is crucial for the face inversion effect to appear, and should be between 200 and 500 ms.


Subject(s)
Reaction Time , Facial Expression , Fear , Anger , Orientation , Adaptation, Psychological , Face
3.
Autism Res ; 12(5): 744-758, 2019 05.
Article in English | MEDLINE | ID: mdl-30973210

ABSTRACT

Individuals with autism spectrum disorder (ASD) exhibit impaired processing of adult faces, as indexed by the N170 event-related potential. However, few studies explore such processing in mothers of children with ASD, and none has assessed the early processing of infant faces in these women. Moreover, whether the processing of infant facial expressions in mothers of children with ASD is related to their responsiveness to their child's needs (maternal sensitivity [MS]) remains unknown. This study explored the N170 elicited by infant faces in a group of mothers of children with ASD (MA) and a reference group of mothers of children without ASD. For both emotional (crying, smiling) and neutral expressions, the MA group exhibited larger N170 amplitudes in the right hemisphere, while the reference group showed similar interhemispheric amplitudes. This lateralization effect within the MA group was not present for nonfaces and was stronger in mothers with higher MS. We propose that mothers of children with ASD use specialized perceptual resources to process infant faces, and that this specialization is mediated by MS. Our findings suggest that having a child with ASD modulates mothers' early neurophysiological responsiveness to infant cues. Whether this modulation represents a biological marker or a response shaped by experience remains to be explored. Autism Research 2019, 12: 744-758. © 2019 International Society for Autism Research, Wiley Periodicals, Inc. LAY SUMMARY: When mothers of children with autism spectrum disorder (ASD) see baby faces expressing emotions, they show a right-sided electrical response in the brain. This lateralization was stronger in mothers who were more sensitive to their children's needs. We conclude that having a child with ASD and being more attuned to the child's behavior generates a specialized pattern of brain activity when processing infant faces. Whether this pattern is biological or shaped by experience remains to be explored.


Subject(s)
Autism Spectrum Disorder/psychology , Brain/physiology , Evoked Potentials/physiology , Facial Expression , Maternal Behavior/psychology , Mothers/psychology , Adult , Child , Child, Preschool , Cues , Female , Humans , Male , Middle Aged , Photic Stimulation/methods
4.
Psychol. av. discip ; 11(1): 39-48, Jan.-Jun. 2017. tab, graf
Article in Spanish | LILACS | ID: biblio-895984

ABSTRACT

The discrimination of emotions expressed by facial expressions is important for social relationships, empathy, and social interaction. The aim of this study was to determine whether there are differences in the cortical processing of two basic emotions, anger and fear, and whether the perception of intense anger generates a greater modulation of the N170 component, in amplitude and latency, than images of faces expressing intense fear. Event-related potentials were recorded with a 32-channel montage. Significant latency differences were found for images of faces expressing intense anger compared with images of faces expressing intense fear. Differences in both amplitude and latency were also found for images of intense anger and fear faces compared with images of neutral faces.


Subject(s)
Perception , Reaction Time , Facial Expression , Fear , Anger , Emotions , Empathy , Face , Social Discrimination , Interpersonal Relations , Object Attachment
5.
Front Hum Neurosci ; 4: 188, 2010.
Article in English | MEDLINE | ID: mdl-21079750

ABSTRACT

The Implicit Association Test (IAT) is the most popular measure for evaluating implicit attitudes. Nevertheless, its neural correlates are not yet fully understood. We examined event-related potentials (ERPs) in response to face and word processing while indigenous and non-indigenous participants performed an IAT displaying faces (ingroup and outgroup members) and words (positive and negative valence) as targets of category judgments. The N170 component was modulated by the valence of words and by ingroup/outgroup face categorization. Contextual effects (faces and words implicitly associated in the task) influenced N170 amplitude modulation. On the one hand, in face categorization, the right N170 showed differences according to the association between the social categories of faces and the affective valence of words. On the other hand, in word categorization, the left N170 presented a similar modulation when the task implied a negative valence associated with ingroup faces. Only indigenous participants showed a significant IAT effect and N170 differences. Our results demonstrate an early ERP blending of stimulus processing with both intergroup and evaluative contexts, suggesting an integration of contextual information related to intergroup attitudes during the early stages of word and face processing. To our knowledge, this is the first report of early ERPs during an ethnicity IAT, opening a new branch of exchange between social neuroscience and the social psychology of attitudes.
