Results 1 - 14 of 14
1.
Cortex ; 177: 37-52, 2024 May 28.
Article in English | MEDLINE | ID: mdl-38833819

ABSTRACT

Fearful, angry, and disgusted facial expressions are evolutionarily salient and convey different types of threat signals. However, it remains unclear whether these three expressions impact sensory perception and attention in the same way. The present ERP study investigated the temporal dynamics underlying the processing of different types of threatening faces and the impact of the attentional resources engaged by a perceptual load task. Participants were asked to judge the length of bars superimposed over faces presented in the center of the screen. A mass univariate statistical approach was used to analyze the EEG data. Behaviorally, task accuracy was significantly reduced following exposure to fearful faces relative to neutral distractors, independent of perceptual load. The ERP results revealed an enhanced P1 amplitude over the right hemisphere for fearful relative to disgusted faces, reflecting the rapid and coarse detection of fearful cues. The N170 responses elicited by fearful, angry, and disgusted faces were larger than those elicited by neutral faces, suggesting largely automatic and preferential processing of threats. Furthermore, the early posterior negativity (EPN) component yielded increased responses to fearful and angry faces, indicating prioritized attention to stimuli representing acute threats. Additionally, perceptual load exerted a pronounced influence on the EPN and late positive potential (LPP), with larger responses observed in the low perceptual load condition, indicating goal-directed cognitive processing. Overall, although all three facial signals are important for survival, the early sensory processing of fearful, angry, and disgusted faces differs in how automatically each expression captures attention. Fearful faces produce a strong interference effect and are processed with higher priority than angry and disgusted ones.
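The mass univariate approach mentioned above tests every channel-timepoint sample while controlling for multiple comparisons. A minimal sketch, assuming paired conditions and Benjamini-Hochberg FDR correction (the study may instead use cluster-based permutation tests; all names and data here are illustrative):

```python
import numpy as np
from scipy import stats

def mass_univariate_ttest(cond_a, cond_b, alpha=0.05):
    """Paired t-test at every (channel, timepoint) sample, with
    Benjamini-Hochberg FDR correction across all tests.

    cond_a, cond_b: arrays of shape (n_subjects, n_channels, n_times).
    Returns t-values and a boolean significance mask of shape
    (n_channels, n_times).
    """
    t, p = stats.ttest_rel(cond_a, cond_b, axis=0)
    p_flat = p.ravel()
    order = np.argsort(p_flat)
    m = p_flat.size
    # BH step-up: find the largest k with p_(k) <= (k / m) * alpha
    passed = p_flat[order] <= alpha * np.arange(1, m + 1) / m
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return t, mask.reshape(p.shape)

# Toy data: a genuine effect at timepoints 40-59 on channel 0 only
rng = np.random.default_rng(0)
a = rng.normal(size=(20, 4, 100))
b = a + rng.normal(scale=0.5, size=a.shape)
b[:, 0, 40:60] += 1.0
t_vals, sig = mass_univariate_ttest(a, b)
# Most of the true-effect window should survive correction
```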

2.
Neuroimage ; 290: 120578, 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38499051

ABSTRACT

Face perception is a complex process that involves highly specialized procedures and mechanisms. Investigating face perception can help us better understand how the brain processes fine-grained, multidimensional information. This research examined how different dimensions of facial information are represented in specific brain regions, or through inter-regional connections, using an implicit face recognition task. To capture the representation of various facial information in the brain, we employed support vector machine decoding, functional connectivity, and model-based representational similarity analysis on fMRI data, yielding three key findings. First, despite the implicit nature of the task, emotion was still represented in the brain, in contrast to all other facial information. Second, the connection between the medial amygdala and the parahippocampal gyrus was found to be essential for the representation of facial emotion in implicit tasks. Third, in implicit tasks, arousal was represented in the parahippocampal gyrus, whereas valence depended on the connection between the primary visual cortex and the parahippocampal gyrus. In conclusion, these findings dissociate the neural mechanisms of emotional valence and arousal, revealing the precise spatial patterns of multidimensional information processing in faces.
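Model-based representational similarity analysis, one of the methods listed above, compares a neural representational dissimilarity matrix (RDM) with a model RDM. A minimal sketch with synthetic data; Euclidean distance is used for both RDMs so the toy geometry carries over directly (correlation distance is a more common choice for fMRI patterns), and all names are illustrative:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rsa_correlation(patterns, model_rdm):
    """Spearman correlation between a neural RDM (computed from ROI
    activity patterns) and a condensed model RDM -- the standard RSA
    statistic.

    patterns: (n_conditions, n_voxels) activity patterns.
    model_rdm: condensed (upper-triangle) model dissimilarities.
    """
    neural_rdm = pdist(patterns, metric='euclidean')
    rho, _ = spearmanr(neural_rdm, model_rdm)
    return rho

# Toy example: 6 conditions living in a 2-D latent space; the "model"
# is the latent geometry, the "voxels" see a noisy linear projection.
rng = np.random.default_rng(1)
coords = rng.normal(size=(6, 2))                 # latent condition layout
model = pdist(coords)                            # model RDM
voxels = coords @ rng.normal(size=(2, 50))       # 50 hypothetical voxels
patterns = voxels + 0.1 * rng.normal(size=voxels.shape)
rho = rsa_correlation(patterns, model)
# rho should be strongly positive: the neural geometry mirrors the model
```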


Subject(s)
Emotions , Magnetic Resonance Imaging , Humans , Brain/diagnostic imaging , Brain Mapping/methods , Parahippocampal Gyrus/diagnostic imaging , Facial Expression
3.
Brain Sci ; 13(12)2023 Dec 09.
Article in English | MEDLINE | ID: mdl-38137147

ABSTRACT

Recognizing the emotions of faces in a crowd is crucial for understanding overall behavior and intention as well as for smooth and friendly social interactions. However, it is unclear whether the spatial frequency of faces affects the discrimination of crowd emotion. Although high- and low-spatial-frequency information in individual faces is processed by distinct neural channels, there is a lack of evidence on how this applies to crowds of faces. Here, we used functional magnetic resonance imaging (fMRI) to investigate neural representations of crowd faces at different spatial frequencies. Thirty-three participants were asked to judge whether a test face was happier or more fearful than a crowd of faces presented at high, low, or broad spatial frequencies. Our findings revealed that fearful faces with low spatial frequencies were easier to recognize, in terms of both accuracy (78.9%) and response time (927 ms). Brain regions in the ventral visual stream, such as the fusiform gyrus, were preferentially activated by high-spatial-frequency crowds, even though these were the most difficult to recognize behaviorally (68.9%). Finally, the right inferior frontal gyrus was more strongly activated by broad-spatial-frequency crowds. Our study suggests that people are more sensitive to fearful crowd faces at low spatial frequencies and that high spatial frequency does not facilitate crowd face recognition.
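High-, low-, and broad-spatial-frequency stimuli of the kind described above are typically constructed by low-pass filtering an image and taking the residual as the high-pass version. A minimal sketch assuming a Gaussian low-pass filter (the study's actual cutoff frequencies are not given here, and the "face" is just random pixels):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def spatial_frequency_versions(image, sigma=3.0):
    """Split an image into low-spatial-frequency (Gaussian-blurred) and
    high-spatial-frequency (residual fine detail) versions; their sum
    reproduces the original broad-spatial-frequency image.
    """
    low = gaussian_filter(image.astype(float), sigma)
    high = image - low
    return low, high

rng = np.random.default_rng(3)
face = rng.random((64, 64))          # stand-in for a face image
lsf, hsf = spatial_frequency_versions(face)
assert np.allclose(lsf + hsf, face)  # bands recombine to the original
```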

4.
Brain Sci ; 13(10)2023 Sep 29.
Article in English | MEDLINE | ID: mdl-37891761

ABSTRACT

The eye region conveys considerable information regarding an individual's emotions, motivations, and intentions during interpersonal communication. Evidence suggests that the eye regions of an individual expressing emotions can capture attention more rapidly than the eye regions of an individual in a neutral affective state. However, how attentional resources affect the processing of emotions conveyed by the eye regions remains unclear. Accordingly, the present study employed a dual-target rapid serial visual presentation task: happy, neutral, or fearful eye regions were presented as the second target, with a temporal lag between the two targets of 232 or 696 ms. Participants completed two tasks successively: Task 1 was to identify which species the upright eye region they had seen belonged to, and Task 2 was to identify what emotion was conveyed by the upright eye region. The behavioral results showed that accuracy for fearful eye regions was lower than that for neutral eye regions under the condition of limited attentional resources; however, accuracy differences across the three types of eye regions did not reach significance under the condition of adequate attentional resources. These findings indicate that preferential processing of fearful expressions is not automatic but is modulated by available attentional resources.

5.
Int J Psychophysiol ; 190: 8-19, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37271224

ABSTRACT

Although the eye region has been found to convey sufficient information for emotional recognition and interpersonal communication, little is known regarding the extent to which the prioritized processing of emotional eye regions relies on available attentional resources. To address this issue, the present study used a dual-target rapid serial visual presentation task in which the perceptual load levels of the first target (T1), as well as the valence of the second target (T2), were manipulated. In addition to the traditional event-related potential (ERP) analysis method, the mass univariate statistics approach was employed. Behaviorally, both happy and fearful eye regions were recognized more accurately than neutral eye regions, regardless of the T1 perceptual load. ERP findings revealed an enhanced N170 amplitude for fearful eye regions compared to neutral eye regions, confirming the preferential and automatic processing of fearful signals at the early sensory stage. The late positive potential component exhibited enhanced responses to fearful and happy eye regions, suggesting amplified consolidation of their representations in working memory. Collectively, these findings indicate that isolated eye regions are processed with a high degree of automaticity owing to their perceptual and motivational significance.


Subject(s)
Emotions , Facial Expression , Humans , Emotions/physiology , Fear , Evoked Potentials/physiology , Attention/physiology , Electroencephalography
6.
Brain Sci ; 13(2)2023 Jan 29.
Article in English | MEDLINE | ID: mdl-36831770

ABSTRACT

Decision-making under time pressure may better reflect an individual's response preferences, but few studies have examined whether individuals choose to be more selfish or altruistic in scenarios where third-party punishment is essential for maintaining social norms. This study used a third-party punishment paradigm to investigate how time pressure affects individuals' maintenance of behavior that follows social norms. Thirty-one participants observed a Dictator Game and had to decide whether to punish someone who made a highly unfair offer by spending their own monetary units to reduce that person's payoff. The experiment was conducted across different offer conditions. The results demonstrated that reaction times were faster under time pressure than without it. Time pressure was also associated with less severe punishment. Specifically, participants were less likely to punish the dictator under time pressure than without it when the offer was categorized as highly unfair. The findings suggest that, under these game conditions, individuals under time pressure do not overcome their pro-self orientation, and that time pressure weakens an individual's willingness to punish highly unfair offers.

7.
Brain Sci ; 12(12)2022 Dec 03.
Article in English | MEDLINE | ID: mdl-36552125

ABSTRACT

Although emotional expressions conveyed by the eye regions are processed efficiently, little is known regarding the relationship between emotional processing of isolated eye regions and temporal attention. In this study, we conducted three rapid serial visual presentation (RSVP) experiments with varying task demands (emotion discrimination, eye detection, eyes ignored) related to the first target (T1) to investigate how the perception of emotional valence in the eye region (T1: happy, neutral, fearful) impacts the identification of a second target (T2: neutral houses). Event-related potential (ERP) findings indicated that fearful stimuli reliably increased N170 amplitude regardless of the emotional relevance of task demands. The P3 component exhibited enhanced responses to happy and fearful stimuli in the emotion discrimination task and to happy eye regions in the eye detection task. Analysis of T2-related ERPs within the attentional blink period revealed that T2 houses preceded by fearful and happy stimuli elicited larger N2 and P3 amplitudes than those preceded by neutral stimuli only in the emotion discrimination task. Together, these findings indicate that attention to affective content conveyed by the eyes can not only amplify the perceptual analysis of emotional eye regions but also facilitate the processing of a subsequent target.

8.
Chem Commun (Camb) ; 58(82): 11539-11542, 2022 Oct 13.
Article in English | MEDLINE | ID: mdl-36155688

ABSTRACT

A pyridyl-functionalized mesoporous graphene is developed to accommodate sulfur for Al-S batteries, which can significantly reduce the voltage hysteresis to ∼0.43 V. The reaction kinetics of the Al-S battery are accelerated by the catalyst-free carbon host, ascribed to both the mesoporous graphene structure and the covalently functionalized pyridyl groups.

9.
Int J Psychophysiol ; 182: 1-11, 2022 12.
Article in English | MEDLINE | ID: mdl-35917954

ABSTRACT

Despite the crucial role of the eye region in nonverbal communication, how and when the brain responds to affective signals projected from this region remains unclear. This study explored the temporal dynamics of the processing of emotionally valenced eye regions (happy, neutral, and fearful) and the influence of attentional resources, using a dual-target rapid serial visual presentation task and the event-related potential (ERP) technique. Behaviorally, recognition accuracy for happy and fearful eye regions was higher than for neutral eye regions in the deficient attentional resources condition (lag 2), indicating a reduced attentional blink for emotional eye regions. The ERP findings indicate that fearful and happy eye regions modulated the N170. The early posterior negativity (EPN) was influenced by the interaction between lag and emotional valence, reflected in larger amplitudes for fearful than for neutral eye regions in the lag 2 condition. The amplitude of the late positive potential (LPP) also increased for happy and fearful eye regions. These outcomes suggest that the human brain is highly sensitive to isolated eye regions. Moreover, fearful signals emitted from the eye region are processed automatically and are unaffected by attentional resource availability.


Subject(s)
Attentional Blink , Humans , Facial Expression , Electroencephalography/methods , Emotions/physiology , Evoked Potentials/physiology
10.
Brain Sci ; 12(4)2022 Mar 31.
Article in English | MEDLINE | ID: mdl-35447997

ABSTRACT

Social species perceive emotion by extracting diagnostic features of body movements. Although extensive studies have contributed to knowledge of how the entire body is used as context for decoding bodily expression, we know little about whether specific body parts (e.g., arms and legs) transmit enough information for body understanding. In this study, we performed behavioral experiments using the Bubbles paradigm on static body images to directly explore the diagnostic body parts for categorizing angry, fearful, and neutral expressions. Results showed that subjects recognized emotional bodies through diagnostic features from the torso with arms. We then conducted a follow-up functional magnetic resonance imaging (fMRI) experiment on body part images to examine whether diagnostic parts modulated body-related brain activity and the corresponding neural representations. We found greater activation of the extrastriate body area (EBA) in response to both anger and fear than to neutral expressions for the torso and arms. Representational similarity analysis showed that neural patterns in the EBA distinguished different bodily expressions. Furthermore, the torso with arms and the whole body showed higher similarity in EBA representations than did the legs and whole body, or the head and whole body. Taken together, these results indicate that diagnostic body parts (i.e., the torso with arms) can communicate bodily expression in a detectable manner.
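The Bubbles paradigm mentioned above reveals a stimulus only through randomly placed Gaussian apertures; correlating aperture locations with behavioral accuracy then identifies the diagnostic regions. A minimal sketch of the mask-generation step, with illustrative parameters (not those of the study):

```python
import numpy as np

def bubbles_mask(height, width, n_bubbles, sigma, rng):
    """Build a 0-1 revealing mask from randomly placed Gaussian
    apertures ('bubbles'); multiplying a stimulus by the mask shows
    only the sampled regions.
    """
    yy, xx = np.mgrid[0:height, 0:width]
    mask = np.zeros((height, width))
    centers = zip(rng.integers(0, height, n_bubbles),
                  rng.integers(0, width, n_bubbles))
    for cy, cx in centers:
        bubble = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
        mask = np.maximum(mask, bubble)   # overlapping bubbles simply merge
    return mask

rng = np.random.default_rng(2)
mask = bubbles_mask(128, 96, n_bubbles=10, sigma=8.0, rng=rng)
# revealed = mask * body_image would expose only the bubbled regions
```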

11.
Neuropsychologia ; 137: 107286, 2020 02 03.
Article in English | MEDLINE | ID: mdl-31786222

ABSTRACT

Empathy is essential for social interactions and individual development. Within the field of empathy research, the domain can be divided into three subgroups according to the stimulus materials used in tasks: empathy towards physical pain (PhyE), empathy towards emotional situations (ESuE), and empathy towards emotional faces (ExpE). To date, no empirical studies have directly compared the neural correlates underlying these three sub-domains. The current study therefore used ALE meta-analysis to identify the general and distinct neural correlates underlying the three sub-domains of empathy. The results revealed a conjunction in the bilateral supplementary motor areas, which were activated across all three sub-domains. Preferential correlates for PhyE were found in the bilateral IPL, left middle cingulate cortex, and left anterior insula, regions associated with pain, action, and somatosensory functions. The left middle temporal gyrus was preferentially engaged for ESuE. Preferential activations for ExpE were identified in the right amygdala and right dorsolateral prefrontal cortex, regions statistically associated with the Neurosynth functional terms "facial expression", "face", "emotion", and "social". Through these meta-analyses, we provide the first indication that domain-general and domain-preferential neural correlates underlie the processing of empathy evoked by different types of stimuli.
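ALE meta-analysis, used above, models each reported activation focus as a Gaussian probability blob and scores each voxel by the probability that at least one study activates it. A minimal 1-D sketch (real ALE works in 3-D MNI space and derives each kernel's width and peak from the study's sample size; the fixed 0.5 peak and the coordinates here are illustrative):

```python
import numpy as np

def ale_map(experiments, grid, fwhm=10.0, peak=0.5):
    """Activation likelihood estimation on a 1-D coordinate grid.

    Each study's foci become a modeled activation (MA) map -- the
    pointwise maximum of Gaussian blobs -- and the ALE value is the
    probability that at least one study activates each point:
    ALE = 1 - prod_i(1 - MA_i).
    """
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    one_minus = np.ones_like(grid, dtype=float)
    for foci in experiments:
        ma = np.zeros_like(grid, dtype=float)
        for focus in foci:
            ma = np.maximum(ma, peak * np.exp(-(grid - focus) ** 2 / (2 * sigma ** 2)))
        one_minus *= 1.0 - ma
    return 1.0 - one_minus

grid = np.linspace(-50, 50, 201)
studies = [np.array([0.0, 20.0]), np.array([2.0]), np.array([-30.0])]
ale = ale_map(studies, grid)
# Convergence of two studies near x = 0-2 outscores the lone focus at -30
```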


Subject(s)
Brain Mapping , Cerebral Cortex/physiology , Emotions/physiology , Empathy/physiology , Facial Expression , Facial Recognition/physiology , Pain/physiopathology , Social Perception , Cerebral Cortex/diagnostic imaging , Humans
12.
Biol Psychol ; 146: 107728, 2019 09.
Article in English | MEDLINE | ID: mdl-31306692

ABSTRACT

The perception of surprised faces is demonstrably modulated by emotional context. However, the influence of self-relevance and its interaction with emotional context have not been explored. The present study investigated the effects of contextual valence and self-reference on the perception of surprised faces. Our results revealed that faces in a negative context elicited a larger N170 than those in a neutral context. The EPN was affected by the interaction between contextual valence and self-reference, with larger amplitudes for faces in self-related positive contexts and sender-related negative contexts. Additionally, LPP amplitudes were enhanced for faces in negative contexts relative to neutral and positive contexts, as well as for self-related contexts in comparison to sender-related contexts. Together, these findings help to elucidate the psychophysiological mechanisms underlying the effects of emotional and self-referential contexts on the perception of surprised faces, which are characterized by distinctive ERPs.


Subject(s)
Emotions/physiology , Evoked Potentials/physiology , Facial Expression , Self Concept , Adolescent , Adult , Electroencephalography , Female , Humans , Male , Photic Stimulation , Young Adult
13.
Psychophysiology ; 55(5): e13039, 2018 05.
Article in English | MEDLINE | ID: mdl-29239478

ABSTRACT

In this study, we presented pictorial representations of happy, neutral, and fearful expressions projected in the eye regions to determine whether the eye region alone is sufficient to produce a context effect. Participants were asked to judge the valence of surprised faces that had been preceded by a picture of an eye region. Behavioral results showed that affective ratings of surprised faces were context dependent. Prime-related ERPs showed that happy eyes elicited a larger P1 than neutral and fearful eyes, likely due to the recognition advantage of a happy expression. Target-related ERPs showed that surprised faces in the context of fearful and happy eyes elicited markedly larger C1 amplitudes than those in the neutral context, reflecting modulation by predictions during the earliest stages of face processing. N170 amplitudes were larger for neutral and fearful eye contexts than for the happy context, suggesting that faces were being integrated with contextual threat information. The P3 component exhibited enhanced brain activity in response to faces preceded by happy and fearful eyes compared with neutral eyes, indicating that motivated attention processing may be involved at this stage. Altogether, these results indicate for the first time that the influence of isolated eye regions on the perception of surprised faces involves preferential processing at early stages and elaborate processing at late stages. Moreover, higher cognitive processes such as prediction and attention can modulate face processing from the earliest stages in a top-down manner.


Subject(s)
Brain/physiology , Emotions/physiology , Evoked Potentials, Visual/physiology , Facial Expression , Adolescent , Attention/physiology , Electroencephalography , Female , Humans , Male , Photic Stimulation , Young Adult
14.
Sci Rep ; 7(1): 13062, 2017 10 12.
Article in English | MEDLINE | ID: mdl-29026111

ABSTRACT

The current study compared the effectiveness of distraction, an antecedent-focused strategy that involves diverting an individual's attention away from affective stimuli, and expressive suppression, a response-focused strategy that involves inhibiting conscious emotion-expressive behavior during an emotionally aroused state, in the regulation of responses to high- and low-intensity unpleasant stimuli, using event-related potentials (ERPs). Sixteen participants completed an emotion regulation experiment in which they passively viewed high- or low-intensity unpleasant images (view), solved a mathematical equation presented over high- or low-intensity negative images (distraction), or suppressed their emotional expression in response to high- or low-intensity unpleasant images (suppression). Participants rated their negative experience on a 1-9 scale after applying each strategy. We found that, compared with expressive suppression, distraction yielded greater attenuation of the early phase of the centro-parietal LPP when participants responded to high-intensity stimuli. In the low-intensity condition, distraction, but not expressive suppression, effectively decreased the early phase of the LPP. These findings suggest that expressive suppression acts as early as distraction in the high-intensity condition; more importantly, distraction is superior to expressive suppression in regulating negative emotion, an advantage that is influenced by emotional intensity.
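Components such as the LPP phases compared above are typically quantified as the mean amplitude within a time window over a set of electrodes before condition means are tested. A minimal sketch with a synthetic waveform; the window and channel choices are illustrative, not those of the study:

```python
import numpy as np

def mean_amplitude(erp, times, t_window, chan_idx):
    """Mean amplitude of an ERP waveform within a time window over a
    set of channels -- the usual way a component such as the LPP is
    quantified before condition means are compared.

    erp: (n_channels, n_times) average waveform in microvolts.
    times: (n_times,) sample times in seconds.
    """
    t0, t1 = t_window
    t_idx = np.nonzero((times >= t0) & (times < t1))[0]
    return erp[np.ix_(chan_idx, t_idx)].mean()

# Toy waveform: a 5 uV plateau from 400-700 ms on channel 1 only
times = np.arange(-0.2, 1.0, 0.004)              # 4 ms sampling
erp = np.zeros((3, times.size))
erp[1, (times >= 0.4) & (times < 0.7)] = 5.0
lpp = mean_amplitude(erp, times, (0.4, 0.7), chan_idx=[1])
# lpp equals the plateau value, 5.0 uV
```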


Subject(s)
Attention/physiology , Emotions/physiology , Evoked Potentials/physiology , Adolescent , Adult , Cognition/physiology , Electroencephalography , Female , Humans , Male , Young Adult