Results 1 - 20 of 59
1.
Neuroimage ; 269: 119907, 2023 Apr 01.
Article in English | MEDLINE | ID: mdl-36717042

ABSTRACT

Previous functional imaging studies demonstrated body-selective patches in the primate visual temporal cortex, comparing activations to static bodies and static images of other categories. However, the use of static instead of dynamic displays of moving bodies may have underestimated the extent of the body patch network. Indeed, body dynamics provide information about action and emotion and may be processed in patches not activated by static images. Thus, to map with fMRI the full extent of the macaque body patch system in the visual temporal cortex, we employed dynamic displays of natural-acting monkey bodies, dynamic monkey faces, objects, and scrambled versions of these videos, all presented during fixation. We found nine body patches in the visual temporal cortex, starting posteriorly in the superior temporal sulcus (STS) and ending anteriorly in the temporal pole. Unlike for static images, body patches were present consistently in both the lower and upper banks of the STS. Overall, body patches showed higher activation for dynamic displays than for matched static images; for identical stimulus displays, this effect was weaker in the neighboring face patches. These data provide the groundwork for future single-unit recording studies to reveal the spatiotemporal features that the neurons of these body patches encode. These fMRI findings suggest that dynamics contribute more strongly to population responses in body patches than in face patches.


Subject(s)
Pattern Recognition, Visual , Temporal Lobe , Animals , Macaca mulatta , Pattern Recognition, Visual/physiology , Photic Stimulation/methods , Temporal Lobe/physiology , Magnetic Resonance Imaging/methods , Brain Mapping
2.
Cereb Cortex ; 29(7): 2859-2875, 2019 Jul 05.
Article in English | MEDLINE | ID: mdl-30060011

ABSTRACT

Cortical plasticity in congenitally blind individuals leads to cross-modal activation of the visual cortex and may lead to superior perceptual processing in the intact sensory domains. Although mental imagery is often defined as a quasi-perceptual experience, it is unknown whether it undergoes cortical reorganization similar to that of perception in blind individuals. In this study, we show that auditory versus tactile perception evokes similar intra-modal discriminative patterns in congenitally blind and sighted participants. These results indicate that cortical plasticity following visual deprivation does not influence the broad intra-modal organization of auditory and tactile perception as measured by our task. Furthermore, not only the blind but also the sighted participants showed cross-modal discriminative patterns for perception modality in the visual cortex. During mental imagery, both groups showed similar decoding accuracies for imagery modality in the intra-modal primary sensory cortices. However, no cross-modal discriminative information for imagery modality was found in the early visual cortex of blind participants, in contrast to the sighted participants. We did find evidence of cross-modal activation of higher visual areas in blind participants, including the representation of specific imagined auditory features in visual area V4.
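As background for the decoding results reported above, the sketch below illustrates the general multivoxel pattern analysis approach (cross-validated classification of stimulus modality from region-of-interest voxel patterns) on which such "decoding accuracy" figures are typically based. It is a minimal, hypothetical example using scikit-learn and synthetic data, not the authors' actual pipeline; all variable names and parameters are assumptions.

```python
# Minimal sketch of cross-validated MVPA decoding of stimulus modality
# (auditory vs. tactile) from region-of-interest voxel patterns.
# Synthetic data and hypothetical names; not the authors' pipeline.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_trials, n_voxels = 80, 200
X = rng.normal(size=(n_trials, n_voxels))   # trial-by-voxel activity patterns for one ROI
y = rng.integers(0, 2, size=n_trials)       # 0 = auditory trial, 1 = tactile trial

# Linear classifier, 5-fold cross-validation; chance level is 0.5 for two classes.
clf = SVC(kernel="linear")
accuracy = cross_val_score(clf, X, y, cv=5).mean()
print(f"decoding accuracy: {accuracy:.2f}")
```

With random data as here, accuracy hovers around chance; in a real analysis, above-chance cross-validated accuracy is taken as evidence of discriminative information in the region.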


Subject(s)
Blindness , Cerebral Cortex/physiology , Imagination/physiology , Neuronal Plasticity/physiology , Acoustic Stimulation , Adult , Female , Humans , Magnetic Resonance Imaging , Male , Middle Aged , Physical Stimulation
3.
Sci Rep ; 8(1): 2692, 2018 Feb 09.
Article in English | MEDLINE | ID: mdl-29426819

ABSTRACT

The role of empathy and perspective-taking in preventing aggressive behaviors has been highlighted in several theoretical models. In this study, we used immersive virtual reality to induce a full body ownership illusion that allowed offenders to experience being in the body of a victim of domestic abuse. A group of male domestic violence offenders and a control group without a history of violence experienced a virtual scene of abuse from a first-person perspective. During the virtual encounter, the participants' real bodies were replaced with a life-sized virtual female body that moved synchronously with their own real movements. Participants' emotion recognition skills were assessed before and after the virtual experience. Our results revealed that offenders have a significantly lower ability to recognize fear in female faces compared to controls, with a bias towards classifying fearful faces as happy. After being embodied in a female victim, offenders improved their ability to recognize fearful female faces and reduced their bias towards recognizing fearful faces as happy. For the first time, we demonstrate that changing the perspective of an aggressive population through immersive virtual reality can modify socio-perceptual processes such as emotion recognition, which is thought to underlie this specific form of aggressive behavior.


Subject(s)
Domestic Violence/psychology , Emotional Intelligence/physiology , Visual Perception/physiology , Adult , Aggression/psychology , Anger/physiology , Emotions/physiology , Empathy/physiology , Facial Expression , Fear/psychology , Humans , Illusions/physiology , Male , Surveys and Questionnaires , Virtual Reality
4.
Soc Cogn Affect Neurosci ; 11(8): 1299-309, 2016 08.
Article in English | MEDLINE | ID: mdl-27025242

ABSTRACT

The neural basis of emotion perception has mostly been investigated with single face or body stimuli. However, in daily life one may also encounter affective expressions by groups, e.g. an angry mob or an exhilarated concert crowd. In what way is brain activity modulated when several individuals express similar rather than different emotions? We investigated this question using an experimental design in which we presented two stimuli simultaneously, with the same or different emotional expressions. We hypothesized that, in the case of two same-emotion stimuli, brain activity would be enhanced, while in the case of two different emotions, one emotion would interfere with the effect of the other. The results showed that the simultaneous perception of different affective body expressions leads to a deactivation of the amygdala and a reduction of cortical activity. The processing of fearful bodies, compared with different-emotion bodies, relied more strongly on saliency and action-triggering regions in the inferior parietal lobe and insula, while happy bodies drove the occipito-temporal cortex more strongly. We showed that this design could be used to uncover important differences between the brain networks underlying fearful and happy emotions. The enhancement of brain activity for unambiguous affective signals expressed by several people simultaneously supports adaptive behaviour in critical situations.


Subject(s)
Amygdala/physiology , Brain Mapping/methods , Cerebral Cortex/physiology , Emotions/physiology , Kinesics , Social Perception , Adolescent , Adult , Aged , Anger/physiology , Female , Happiness , Humans , Magnetic Resonance Imaging , Male , Middle Aged , Visual Perception/physiology , Young Adult
5.
Wiley Interdiscip Rev Cogn Sci ; 6(2): 149-158, 2015.
Article in English | MEDLINE | ID: mdl-26263069

ABSTRACT

During communication, we perceive and express emotional information through many different channels, including facial expressions, prosody, body motion, and posture. Although historically the human body has been perceived primarily as a tool for action, there is now increased understanding that the body is also an important medium for emotional expression. Indeed, research on emotional body language is rapidly emerging as a new field in cognitive and affective neuroscience. This article reviews how whole-body signals are processed and understood, at the behavioral and neural levels, with specific reference to their role in emotional communication. The first part of this review outlines the brain regions and spectrotemporal dynamics underlying the perception of isolated neutral and affective bodies, the second part details contextual effects on body emotion recognition, and the final part discusses body processing at a subconscious level. More specifically, research has shown that body expressions, as compared with neutral bodies, draw upon a larger network of regions responsible for action observation and preparation, emotion processing, body processing, and integrative processes. Results from neurotypical populations and masking paradigms suggest that subconscious processing of affective bodies relies on a specific subset of these regions. Moreover, recent evidence has shown that emotional signals from the face, voice, and body all interact, with body motion and posture often highlighting and intensifying the emotion expressed in the face and voice.


Subject(s)
Emotions/physiology , Facial Expression , Kinesics , Visual Perception/physiology , Brain Mapping , Cognitive Science , Humans , Voice/physiology
6.
Neurophysiol Clin ; 44(5): 457-69, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25438978

ABSTRACT

BACKGROUND: Auditory stimulation is often used to evoke responses in unresponsive patients who have suffered severe brain injury. In order to investigate visual responses, we examined visual evoked potentials (VEPs) and behavioral responses to visual stimuli in vegetative patients during recovery to consciousness. METHODS: Behavioral responses to visual stimuli (visual localization, comprehension of written commands, and object manipulation) and flash VEPs were repeatedly examined in eleven vegetative patients every two weeks for an average period of 2.6 months, and the patients' VEPs were compared with those of a healthy control group. Long-term outcome of the patients was assessed 2-3 years later. RESULTS: Visual response scores increased during recovery to consciousness on all scales: visual localization, comprehension of written commands, and object manipulation. VEP amplitudes were smaller, and latencies longer, in the patient group relative to the controls. VEP characteristics at the first measurement were related to long-term outcome up to three years after injury. CONCLUSIONS: Our findings show that visual responding improves with recovery from the vegetative state to consciousness. Elementary visual processing is present but, according to the VEP responses, poorer in the vegetative and minimally conscious states than in healthy controls, and it remains poorer after patients have recovered to consciousness. However, initial VEPs are related to long-term outcome.


Subject(s)
Behavior/physiology , Brain Injuries/physiopathology , Consciousness/physiology , Convalescence , Evoked Potentials, Visual/physiology , Persistent Vegetative State/physiopathology , Visual Perception/physiology , Adolescent , Adult , Apraxias/physiopathology , Apraxias/psychology , Brain Injuries/psychology , Child , Convalescence/psychology , Electroencephalography , Female , Follow-Up Studies , Humans , Male , Motion Perception/physiology , Motor Skills/physiology , Persistent Vegetative State/psychology , Photic Stimulation , Prognosis , Treatment Outcome , Young Adult
7.
Exp Brain Res ; 228(4): 399-410, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23828232

ABSTRACT

Previous reports have suggested an enhancement of facial expression recognition in women as compared to men. It has also been suggested that men, compared with women, have a greater attentional bias towards angry cues. Research has shown that facial expression recognition impairments and attentional biases towards anger are enhanced in violent criminal male offenders. Bodily expressions of anger form a more direct physical threat than facial expressions. In four experiments, we tested how 29 imprisoned aggressive male offenders perceive body expressions by other males. The performance of all participants in a matching-to-sample task dropped significantly when the distracting image showed an angry posture. Violent offenders misjudged fearful body movements as expressing anger significantly more often than the control group. When violent offenders were asked to categorize facial expressions and ignore the simultaneously presented congruent or incongruent posture, they performed worse than the control group, specifically when a smile was combined with an aggressive posture. Finally, violent offenders showed a greater congruency effect than controls when viewing postures as part of an emotionally congruent social scene and did not perform above chance when categorizing a happy posture presented in a fight scene. The results suggest that violent offenders have difficulty processing emotional incongruence when aggressive stimuli are involved, and that they may have a bias towards aggressive body language.


Subject(s)
Criminals/psychology , Emotions , Facial Expression , Kinesics , Smiling/psychology , Violence/psychology , Adult , Emotions/physiology , Female , Humans , Male , Middle Aged , Perception/physiology , Photic Stimulation/methods , Reaction Time/physiology , Smiling/physiology , Young Adult
8.
Schizophr Res ; 146(1-3): 209-16, 2013 May.
Article in English | MEDLINE | ID: mdl-23522906

ABSTRACT

Schizophrenia research has identified deficits in neurocognition, social cognition, and sensory processing. Because a cohesive model of "disturbed cognitive machinery" is currently lacking, we built a conceptual model to integrate neurocognition, social cognition, and sensory processing. In a cross-sectional study, the cognitive performance of participants was measured. In accordance with the Schedules for Clinical Assessment in Neuropsychiatry, the participants were assigned to either the schizophrenia group or the non-schizophrenic psychosis group. Exclusion criteria included substance abuse, serious somatic/neurological illness, and perceptual handicap. The male/female ratio, educational level, and handedness did not differ significantly between the groups. The data were analyzed using structural equation modeling. Based upon the results of all possible pairwise models correlating neurocognition, social cognition, and sensory processing, three omnibus models were analyzed. Statistical analysis of pairwise model fit (χ², CFI, and RMSEA statistics) revealed poor interrelatedness between sensory processing and neurocognition in schizophrenia patients compared with healthy control participants. The omnibus model that predicted disintegration between sensory processing and neurocognition was statistically confirmed as superior for the schizophrenia group (χ²(53)=56.62, p=0.341, RMSEA=0.04, CFI=0.95). In healthy participants, the model predicting maximal interrelatedness between sensory processing/neurocognition and neurocognition/social cognition gave the best fit (χ²(52)=53.74, p=0.408, RMSEA=0.03, CFI=0.97). The performance of the patients with non-schizophrenic psychosis fell between that of the schizophrenia patients and the control participants. These findings suggest an increasing separation between sensory processing and neurocognition along the continuum from mental health to schizophrenia. Our results support a conceptual model that posits disintegration between the sensory processing of social stimuli and neurocognitive processing.
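For readers unfamiliar with the fit indices quoted above, the sketch below shows how RMSEA and CFI are conventionally derived from model and baseline chi-square values. The baseline chi-square and sample size are not reported in the abstract, so the numbers below are placeholders chosen only to make the example run; the printed values will not reproduce the reported fit.

```python
# Rough illustration of the fit indices reported in the abstract (χ², RMSEA, CFI).
# Baseline chi-square and sample size are NOT given in the abstract; the values
# below are placeholders, so the output is illustrative only.
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation for a single-group model."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2_model, df_model, chi2_baseline, df_baseline):
    """Comparative fit index relative to the independence (baseline) model."""
    return 1.0 - max(chi2_model - df_model, 0.0) / max(chi2_baseline - df_baseline, 0.0)

chi2_model, df_model = 56.62, 53          # model chi-square from the abstract
chi2_baseline, df_baseline = 400.0, 66    # placeholder baseline model
n = 100                                   # placeholder sample size

print(f"RMSEA = {rmsea(chi2_model, df_model, n):.3f}")
print(f"CFI   = {cfi(chi2_model, df_model, chi2_baseline, df_baseline):.3f}")
```

The conventional reading is that RMSEA values near or below 0.05 and CFI values near or above 0.95, together with a non-significant model χ², indicate acceptable fit, which is how the values quoted in the abstract support the preferred omnibus models.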


Subject(s)
Auditory Perceptual Disorders/etiology , Cognition Disorders/etiology , Models, Psychological , Schizophrenia/complications , Schizophrenic Psychology , Social Behavior , Acoustic Stimulation , Adult , Analysis of Variance , Chi-Square Distribution , Cross-Sectional Studies , Female , Humans , Male , Middle Aged , Neuropsychological Tests , Psychiatric Status Rating Scales
10.
Neuropsychologia ; 50(7): 1211-21, 2012 Jun.
Article in English | MEDLINE | ID: mdl-22245006

ABSTRACT

Interest in sex-related differences in psychological functioning has again come to the foreground with new findings about their possible functional basis in the brain. Sex differences may be one way in which evolution has capitalized on the capacity of homologous brain regions to process social information differently in men and women. This paper focuses specifically on the effects of emotional valence, sex of the observed, and sex of the observer on regional brain activations. We also discuss the effects of, and interactions between, environment, hormones, genes, and structural differences of the brain in the context of differential brain activity patterns between men and women following exposure to seen expressions of emotion, and we outline a number of methodological considerations for future research. Importantly, results show that although women are better at recognizing emotions and express themselves more easily, men show greater responses to threatening cues (dominant, violent, or aggressive); this may reflect different behavioral response tendencies between men and women as well as evolutionary effects. We conclude that sex differences must not be ignored in affective research and more specifically in affective neuroscience.


Subject(s)
Emotions/physiology , Sex Differentiation , Brain/anatomy & histology , Brain/physiology , Female , Functional Laterality , Hormones/metabolism , Humans , Male , Sex Differentiation/genetics
11.
Neuroimage ; 54(2): 1755-62, 2011 Jan 15.
Article in English | MEDLINE | ID: mdl-20723605

ABSTRACT

Neuroscientific research on the perception of emotional signals has mainly focused on how the brain processes threat signals from photographs of facial expressions. Much less is known about body postures or about the processing of dynamic images. We undertook a systematic comparison of the neurofunctional network dedicated to processing facial and bodily expressions. Two functional magnetic resonance imaging (fMRI) experiments investigated whether areas involved in processing social signals are activated differently by threatening signals (fear and anger) from facial or bodily expressions. The amygdala (AMG) was more active for facial than for bodily expressions. Body stimuli triggered higher activation than face stimuli in a number of areas. These were the cuneus, fusiform gyrus (FG), extrastriate body area (EBA), temporoparietal junction (TPJ), superior parietal lobule (SPL), primary somatosensory cortex (SI), as well as the thalamus. Emotion-specific effects were found in TPJ and FG for bodies and faces alike. EBA and superior temporal sulcus (STS) were more activated by threatening bodies.


Subject(s)
Brain Mapping , Brain/physiology , Facial Expression , Posture , Visual Perception/physiology , Adolescent , Adult , Emotions/physiology , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Young Adult
12.
Schizophr Res ; 122(1-3): 136-43, 2010 Sep.
Article in English | MEDLINE | ID: mdl-20554159

ABSTRACT

BACKGROUND: Deficits in emotion perception are a well-established phenomenon in schizophrenic patients, and studies have typically used unimodal emotion tasks, presenting either emotional faces or emotional vocalizations. We introduced bimodal emotion conditions in two previous studies in order to study the process of multisensory integration of visible and audible emotion cues. We now build on our earlier work and address the regulatory effects of selective attention mechanisms on the ability to integrate emotion cues stemming from multisensory channels. METHODS: We added a neutral secondary distractor condition to the original bimodal paradigm in order to investigate modality-specific selective attention mechanisms. We compared schizophrenic patients (n=50) to non-schizophrenic psychotic patients (n=46), as well as to healthy controls (n=50). A trained psychiatrist used the Schedules for Clinical Assessment in Neuropsychiatry (SCAN 2.1) to diagnose the patients. RESULTS: As expected, in healthy controls, and to a lesser extent in non-schizophrenic psychotic patients, modality-specific attention attenuated multisensory integration of emotional faces and vocalizations. Conversely, in schizophrenic patients, auditory and visual distractor conditions yielded unaffected and even exaggerated multisensory integration. CONCLUSIONS: These results suggest that schizophrenic patients, compared with healthy controls and non-schizophrenic psychotic patients, have modality-specific attention deficits when attempting to integrate information about emotions stemming from multichannel sources. Various explanations for our findings, as well as their possible consequences, are discussed.


Subject(s)
Attention Deficit Disorder with Hyperactivity/etiology , Emotions , Perceptual Disorders/etiology , Schizophrenia/complications , Schizophrenic Psychology , Acoustic Stimulation/methods , Adult , Analysis of Variance , Attention Deficit Disorder with Hyperactivity/diagnosis , Auditory Perception/physiology , Discrimination, Psychological , Face , Female , Humans , Male , Middle Aged , Pattern Recognition, Visual/physiology , Perceptual Disorders/diagnosis , Photic Stimulation/methods , Reaction Time/physiology
13.
Neuroimage ; 49(2): 1717-27, 2010 Jan 15.
Article in English | MEDLINE | ID: mdl-19804836

ABSTRACT

We casually observe many interactions that do not really concern us. Yet sometimes we need to be able to rapidly appraise whether an interaction between two people represents a real threat for one of them rather than an innocent tease. Using functional magnetic resonance imaging, we investigated whether small differences in the body language of two interacting people are picked up by the brain even if observers are performing an unrelated task. Fourteen participants were scanned while watching 3-s movies (192 trials and 96 scrambles) showing a male person either threatening or teasing a female one. In one task condition, observers categorized the interaction as threatening or teasing, and in the other, they monitored randomly appearing dots and categorized the color. Our results clearly show that right amygdala responds more to threatening than to teasing situations irrespective of the observers' task. When observers' attention is not explicitly directed to the situation, this heightened amygdala activation goes together with increased activity in body sensitive regions in fusiform gyrus, extrastriate body area-human motion complex and superior temporal sulcus and is associated with a better behavioral performance of the participants during threatening situations. In addition, regions involved in action observation (inferior frontal gyrus, temporoparietal junction, and inferior parietal lobe) and preparation (premotor, putamen) show increased activation for threat videos. Also regions involved in processing moral violations (temporoparietal junction, hypothalamus) reacted selectively to the threatening interactions. Taken together, our results show which brain regions react selectively to witnessing a threatening interaction even if the situation is not attended because the observers perform an unrelated task.


Subject(s)
Brain/physiology , Emotions , Interpersonal Relations , Judgment/physiology , Kinesics , Social Behavior , Aggression , Brain Mapping , Color , Female , Humans , Magnetic Resonance Imaging , Male , Neuropsychological Tests , Photic Stimulation , Play and Playthings , Video Recording , Visual Perception/physiology , Young Adult
14.
Neuropsychologia ; 47(8-9): 1816-25, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19428413

ABSTRACT

The ability to grasp emotional messages in everyday gestures and respond to them is at the core of successful social communication. We explored the hypothesis that abnormalities in socio-emotional behavior in people with autism are linked to a failure to grasp the emotional significance conveyed by gestures. We measured brain activity using fMRI during the perception of fearful or neutral actions and showed that, whereas both autistic and control participants showed similar activation of brain regions known to play a role in action perception, the autistic participants failed to activate the amygdala, inferior frontal gyrus, and premotor cortex when viewing gestures expressing fear. Our results support the notion that dysfunctions in this network may contribute significantly to the characteristic communicative impairments documented in autism.


Subject(s)
Autistic Disorder , Brain Mapping , Brain/physiopathology , Emotions/physiology , Hand Strength/physiology , Adolescent , Adult , Autistic Disorder/complications , Autistic Disorder/pathology , Autistic Disorder/psychology , Brain/blood supply , Female , Humans , Image Processing, Computer-Assisted/methods , Magnetic Resonance Imaging/methods , Male , Middle Aged , Neuropsychological Tests , Oxygen/blood , Photic Stimulation , Young Adult
15.
Schizophr Res ; 107(2-3): 286-93, 2009 Feb.
Article in English | MEDLINE | ID: mdl-18986799

ABSTRACT

Ever since Kraepelin used the term dementia praecox for what we nowadays call schizophrenia, cognitive dysfunction has been regarded as central to its psychopathological profile. Disturbed experience and integration of emotions are, both intuitively and experimentally, likely to be intermediates between basic, non-social cognitive disturbances and functional outcome in schizophrenia. While a number of studies have consistently shown that, as part of social cognition, recognition of emotional faces and voices is disturbed in schizophrenia, studies on multisensory integration of facial and vocal affect are rare. We investigated audiovisual integration of emotional faces and voices in three groups: schizophrenic patients, non-schizophrenic psychosis patients, and mentally healthy controls, all assessed by means of the Schedules for Clinical Assessment in Neuropsychiatry (SCAN 2.1). We found a diminished crossmodal influence of emotional faces on emotional voice categorization in the schizophrenic patients, but not in the non-schizophrenic psychosis patients. Results are discussed in light of recent theories on multisensory integration.


Subject(s)
Cognition Disorders/diagnosis , Emotions , Facial Expression , Pattern Recognition, Visual , Psychotic Disorders/diagnosis , Schizophrenia/diagnosis , Schizophrenic Psychology , Speech Perception , Adult , Cognition Disorders/psychology , Discrimination, Psychological , Female , Humans , Male , Middle Aged , Psychotic Disorders/psychology , Young Adult
16.
Neurophysiol Clin ; 38(3): 163-9, 2008 Jun.
Article in English | MEDLINE | ID: mdl-18539249

ABSTRACT

Recent advances in functional brain imaging offer unique opportunities to explore the neurofunctional basis of clinically useful tools for assessing personality differences. In this functional magnetic resonance imaging (fMRI) study, we focused on amygdala activation and investigated whether individual differences in amygdala activity following the presentation of emotional expressions in the face and the whole body are systematically related to the presence of Type D (distressed) personality or to its constituent factors, Negative Affectivity (NA) and Social Inhibition (SI). Our results show that the difference in amygdala activity between fearful and neutral expressions was present in participants who did not meet the criteria for Type D personality, whereas this effect was absent in participants who could be classified as Type D. Our correlation analyses further showed that the activation in the left amygdala elicited by fearful versus neutral bodily expressions correlated negatively with the Negative Affectivity score. The same pattern was observed for the right amygdala for fearful facial and bodily expressions when contrasted with neutral facial and bodily expressions.


Subject(s)
Amygdala/physiopathology , Facial Expression , Fear/psychology , Personality Disorders/physiopathology , Personality Disorders/psychology , Adult , Affect/physiology , Amygdala/metabolism , Depression/metabolism , Depression/physiopathology , Depression/psychology , Humans , Inhibition, Psychological , Magnetic Resonance Imaging , Male , Personality Disorders/metabolism , Social Behavior , Surveys and Questionnaires
17.
Brain Res ; 1186: 233-41, 2007 Dec.
Article in English | MEDLINE | ID: mdl-17996856

ABSTRACT

Recent findings indicate that the perceptual processing of fearful expressions in the face can already be initiated around 100-120 ms after stimulus presentation, demonstrating that the emotional information of a face can be encoded before its identity is fully recognized. At present it is not clear whether fear signals from body expressions are encoded equally rapidly. To answer this question, we investigated the early temporal dynamics of perceiving fearful body expressions by measuring EEG. Participants viewed images of whole-body actions presented either in a neutral or a fearful version. We observed an early emotion effect on the P1 peak latency around 112 ms post-stimulus onset, hitherto found only for facial expressions. Also consistent with the majority of facial expression studies, the N170 component elicited by perceiving bodies proved not to be sensitive to the expressed fear. In line with previous work, its vertex positive counterpart, the VPP, did show a condition-specific influence of fearful body expression. Our results indicate that the information provided by fearful body expressions is already encoded in the early stages of visual processing, and suggest that similar early processing mechanisms are involved in the perception of fear in faces and bodies.


Subject(s)
Comprehension/physiology , Evoked Potentials, Visual/physiology , Fear/psychology , Kinesics , Recognition, Psychology/physiology , Social Perception , Adult , Female , Humans , Male , Reaction Time/physiology , Reference Values , Time Factors
18.
Neuroimage ; 35(2): 959-67, 2007 Apr 01.
Article in English | MEDLINE | ID: mdl-17270466

ABSTRACT

Characteristic fear behaviour, like putting the hands in front of the face and running for cover, provides strong fear signals to observers who may not themselves be aware of any danger. Using event-related functional magnetic resonance imaging (fMRI) in humans, we investigated how such dynamic fear signals from the whole body are perceived. A factorial design allowed us to investigate brain activity induced by viewing bodies, the effect of bodily expressions of fear, and the role of dynamic information in viewing them. Our critical findings are threefold. First, viewing both neutral and fearful body expressions enhances amygdala activity; second, actions expressing fear activate the temporal pole and lateral orbital cortex more than neutral actions; and third, differences in activation between static and dynamic bodily expressions are larger in the STS and premotor cortex for fearful than for neutral actions.


Subject(s)
Brain/physiology , Fear , Gestures , Magnetic Resonance Imaging , Perception/physiology , Adult , Female , Humans , Male
19.
Clin Neurophysiol ; 118(3): 597-605, 2007 Mar.
Article in English | MEDLINE | ID: mdl-17239656

ABSTRACT

OBJECTIVE: Mismatch negativity (MMN) is an automatic event-related brain response that has been well investigated in the acute phase after severe brain injury: the presence of an MMN is often found to predict the emergence from coma and to exclude a shift into a vegetative state (VS). In the present study, MMN was examined during recovery from VS. METHODS: Ten vegetative patients were repeatedly examined every 2 weeks for an average period of 3.5 months. Amplitudes and latencies were related to the patients' recovery from VS to consciousness and compared to a healthy norm group. In addition, MMN was examined for its prognostic value in VS patients, in predicting recovery to consciousness and long-term functional outcome. RESULTS: With recovery to consciousness, MMN amplitudes increased. A sudden increase in MMN amplitude was seen when patients started to show inconsistent behavioural responses to simple commands. At this level, the MMN resembled the response seen in the norm group. In addition, the MMN amplitude and latency at the first measurement predicted the patients' recovery to consciousness. CONCLUSIONS: With recovery from VS to consciousness, the ability to process auditory stimulus deviance increases. A sudden enhancement in MMN amplitude preceded overt communication with the environment, which might be indicative of the consolidation of the neural networks underlying overt communication. Moreover, MMN can be helpful in identifying the ability to recover from VS. SIGNIFICANCE: MMN can be used to track recovery from the vegetative state in the post-acute phase after severe brain injury. In addition, MMN can be used to predict the ability to recover from the vegetative state.
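To make the amplitude and latency measures concrete, here is a minimal sketch of how an MMN difference wave can be computed and its peak extracted from epoched EEG, assuming MNE-Python and hypothetical file and event names. This is an illustrative outline of the standard deviant-minus-standard approach, not the authors' analysis pipeline.

```python
# Minimal sketch: MMN as the deviant-minus-standard difference wave, with peak
# amplitude and latency extracted from the typical MMN window.
# File name and event labels are hypothetical; not the authors' pipeline.
import mne

epochs = mne.read_epochs("subject01-epo.fif")   # hypothetical epochs file
standard = epochs["standard"].average()          # ERP to frequent (standard) tones
deviant = epochs["deviant"].average()            # ERP to rare (deviant) tones

# MMN = deviant minus standard.
mmn = mne.combine_evoked([deviant, standard], weights=[1, -1])

# Most negative deflection in a typical MMN window (about 100-250 ms).
channel, latency, amplitude = mmn.get_peak(
    ch_type="eeg", tmin=0.1, tmax=0.25, mode="neg", return_amplitude=True
)
print(f"MMN at {channel}: {amplitude * 1e6:.1f} µV, latency {latency * 1e3:.0f} ms")
```

Tracking these amplitude and latency values across repeated sessions is the kind of measure the study relates to behavioural recovery and long-term outcome.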


Subject(s)
Contingent Negative Variation/physiology , Persistent Vegetative State/physiopathology , Recovery of Function/physiology , Adolescent , Adult , Brain Injuries/physiopathology , Child , Cognition/physiology , Consciousness/physiology , Electroencephalography , Female , Humans , Longitudinal Studies , Male , Persistent Vegetative State/diagnosis , Predictive Value of Tests , Prognosis , Reaction Time/physiology
20.
Proc Natl Acad Sci U S A ; 102(51): 18682-7, 2005 Dec 20.
Article in English | MEDLINE | ID: mdl-16352717

ABSTRACT

Nonconscious recognition of facial expressions opens the intriguing possibility that two emotions can be present together in one brain, with unconsciously and consciously perceived inputs interacting. We investigated this interaction in three experiments with a hemianopic patient with residual nonconscious vision. During simultaneous presentation of facial expressions to the intact and the blind field, we measured interactions between consciously and nonconsciously recognized images. Fear-specific congruence effects were expressed as enhanced neuronal activity in the fusiform gyrus, amygdala, and pulvinar. Nonconscious facial expressions also influenced the processing of consciously recognized emotional voices. Emotional congruency between the visual and auditory inputs enhanced activity in the amygdala and superior colliculus for blind-field, relative to intact-field, presentation of faces. Our findings indicate that recognition of fear is mandatory and independent of awareness. Most importantly, unconscious fear recognition remains robust even in the presence of a concurrent incongruent happy facial expression or an emotional voice of which the observer is aware.


Subject(s)
Awareness/physiology , Emotions/physiology , Facial Expression , Fear/physiology , Fear/psychology , Unconscious, Psychology , Voice , Auditory Perception/physiology , Brain/cytology , Face , Happiness , Humans , Male , Middle Aged , Visual Perception/physiology , Voice/physiology