1.
PLoS One ; 18(6): e0285767, 2023.
Article in English | MEDLINE | ID: mdl-37379260

ABSTRACT

Friendships are central to our social lives, yet little is known about individual differences associated with the number of friends people enjoy spending time with. Here we present the Friendship Habits Questionnaire (FHQ), a new scale of group- versus dyadic-oriented friendship styles. Three studies investigated the psychometric properties of group-oriented friendships and the relevant individual differences. The initially developed questionnaire measured individual differences in extraversion as well as desire for intimacy, competitiveness, and group identification, traits that previous research links with socializing in groups versus one-to-one friendships. In three validation studies involving more than 800 participants (353 men, age M = 25.76) and using principal component and confirmatory factor analyses, we found that the structure of the FHQ is best described with four dimensions: extraversion, intimacy, positive group identification, and negative group identification. Competitiveness was therefore dropped from the final version of the FHQ. Moreover, FHQ scores reliably predicted the size of friendship groups in which people enjoy socializing, suggesting good construct validity. Together, our results document individual differences in pursuing group- versus dyadic-oriented friendships and provide a new tool for measuring such differences.


Subject(s)
Friends , Interpersonal Relations , Male , Humans , Peer Group , Social Identification , Surveys and Questionnaires
2.
PLoS One ; 16(12): e0260814, 2021.
Article in English | MEDLINE | ID: mdl-34855898

ABSTRACT

Certain facial features provide useful information for the recognition of facial expressions. In two experiments, we investigated whether foveating informative features of briefly presented expressions improves recognition accuracy and whether these features are targeted reflexively when not foveated. Angry, fearful, surprised, and sad or disgusted expressions were presented briefly at locations which would ensure foveation of specific features. Foveating the mouth of fearful, surprised and disgusted expressions improved emotion recognition compared to foveating an eye, a cheek, or the central brow. Foveating the brow led to equivocal results in anger recognition across the two experiments, which might be due to the different combinations of emotions used. There was no consistent evidence suggesting that reflexive first saccades targeted emotion-relevant features; instead, they targeted the feature closest to the initial fixation. In a third experiment, angry, fearful, surprised and disgusted expressions were presented for 5 seconds. The duration of task-related fixations in the eyes, brow, nose and mouth regions was modulated by the presented expression. Moreover, longer fixation at the mouth positively correlated with anger and disgust accuracy both when these expressions were freely viewed (Experiment 2b) and when briefly presented at the mouth (Experiment 2a). Finally, an overall preference to fixate the mouth across all expressions correlated positively with anger and disgust accuracy. These findings suggest that foveal processing of informative features contributes to emotion recognition, that such features are not automatically sought out when not foveated, and that facial emotion recognition performance is related to idiosyncratic gaze behaviour.


Subject(s)
Emotions/physiology , Eye Movements/physiology , Facial Expression , Facial Recognition/physiology , Fovea Centralis/physiology , Recognition, Psychology/physiology , Adolescent , Adult , Anger , Fear , Female , Happiness , Humans , Male , Young Adult
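
The fixation analysis described in entry 2 amounts to summing fixation durations inside feature regions and correlating per-participant dwell time with recognition accuracy. Below is a minimal sketch of that logic in Python; the AOI coordinates, fixation format, and data values are illustrative assumptions, not the authors' pipeline.

```python
# Sketch: aggregate fixation time in facial-feature areas of interest (AOIs).
# AOI coordinates, fixation format, and all values are illustrative
# assumptions, not the published analysis.
from scipy.stats import pearsonr

# Hypothetical AOIs in screen pixels: (x_min, y_min, x_max, y_max)
AOIS = {
    "eyes":  (300, 200, 500, 260),
    "brow":  (300, 150, 500, 200),
    "nose":  (360, 260, 440, 330),
    "mouth": (340, 330, 460, 400),
}

def dwell_times(fixations):
    """Sum fixation durations (ms) falling inside each AOI.

    `fixations` is an iterable of (x, y, duration_ms) tuples.
    """
    totals = {name: 0.0 for name in AOIS}
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dur
                break  # AOIs assumed non-overlapping
    return totals

# Relate per-participant mouth dwell time to anger recognition accuracy.
mouth_dwell = [412.0, 380.5, 520.0, 298.0, 455.5]   # ms, made-up values
anger_acc   = [0.72, 0.68, 0.81, 0.60, 0.77]        # proportion correct
r, p = pearsonr(mouth_dwell, anger_acc)
print(f"r = {r:.2f}, p = {p:.3f}")
```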
3.
Front Psychol ; 11: 309, 2020.
Article in English | MEDLINE | ID: mdl-32194476

ABSTRACT

Recent models of emotion recognition suggest that when people perceive an emotional expression, they partially activate the respective emotion in themselves, providing a basis for the recognition of that emotion. Much of the focus of these models and of their evidential basis has been on sensorimotor simulation as a basis for facial expression recognition - the idea, in short, that coming to know what another feels involves simulating in your brain the motor plans and associated sensory representations engaged by the other person's brain in producing the facial expression that you see. In this review article, we argue that simulation accounts of emotion recognition would benefit from three key extensions. First, that fuller consideration be given to simulation of bodily and vocal expressions, given that the body and voice are also important expressive channels for providing cues to another's emotional state. Second, that simulation of other aspects of the perceived emotional state, such as changes in the autonomic nervous system and viscera, might have a more prominent role in underpinning emotion recognition than is typically proposed. Sensorimotor simulation models tend to relegate such body-state simulation to a subsidiary role, despite the plausibility of body-state simulation being able to underpin emotion recognition in the absence of typical sensorimotor simulation. Third, that simulation models of emotion recognition be extended to address how embodied processes and emotion recognition abilities develop through the lifespan. It is not currently clear how this system of sensorimotor and body-state simulation develops and in particular how this affects the development of emotion recognition ability. We review recent findings from the emotional body recognition literature and integrate recent evidence regarding the development of mimicry and interoception to significantly expand simulation models of emotion recognition.

4.
J Exp Psychol Hum Percept Perform ; 46(3): 292-312, 2020 Mar.
Article in English | MEDLINE | ID: mdl-32077743

ABSTRACT

At normal interpersonal distances, not all features of a face can fall within the fovea simultaneously. Given that certain facial features are differentially informative of different emotions, does the ability to identify facially expressed emotions vary according to the feature fixated, and do saccades preferentially seek diagnostic features? Previous findings are equivocal. We presented faces for a brief time, insufficient for a saccade, at a spatial position that guaranteed that a given feature (an eye, a cheek, the central brow, or the mouth) fell at the fovea. Across two experiments, observers were more accurate and faster at discriminating angry expressions when the high spatial-frequency information of the brow was projected to their fovea than when one of the cheeks or eyes was. Performance in classifying fear and happiness (Experiment 1) was not influenced by whether the most informative features (eyes and mouth, respectively) were projected foveally or extrafoveally. Observers more accurately distinguished between fearful and surprised expressions (Experiment 2) when the mouth was projected to the fovea. Reflexive first saccades tended toward the left and center of the face rather than preferentially targeting emotion-distinguishing features. These results reflect the integration of task-relevant information across the face, constrained by the differences between foveal and extrafoveal processing (Peterson & Eckstein, 2012).


Subject(s)
Emotions , Eye Movements/physiology , Facial Expression , Facial Recognition/physiology , Social Perception , Adult , Female , Fixation, Ocular/physiology , Fovea Centralis/physiology , Humans , Male , Young Adult
5.
Exp Brain Res ; 237(11): 2875-2883, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31471678

ABSTRACT

Adults use vision during stepping and walking to fine-tune foot placement. However, the developmental profile of visually guided stepping is unclear. We asked (1) whether children use online vision to fine-tune precise steps and (2) whether precision stepping develops as part of broader visuomotor development, alongside other fundamental motor skills like reaching. With 6-year-olds (N = 11), 7-year-olds (N = 11), 8-year-olds (N = 11) and adults (N = 15), we manipulated visual input during steps and reaches. Using motion capture, we measured step and reach error as well as postural stability. We expected that (1) both steps and reaches would be visually guided, (2) with similar developmental profiles, (3) foot placement would be biased in ways that promote stability, and (4) step error would correlate with postural stability. Children used vision to fine-tune both steps and reaches. At all ages, foot placement was biased (albeit not in the predicted directions). Contrary to our predictions, step error was not correlated with postural stability. By 8 years, children's step and reach error were adult-like. Despite similar visual control mechanisms, stepping and reaching had different developmental profiles: step error reduced with age, whilst reach error was lower overall and stable across ages. We argue that the development of both visually guided and non-visually guided action is limb-specific.


Subject(s)
Child Development/physiology , Motor Activity/physiology , Postural Balance/physiology , Psychomotor Performance/physiology , Visual Perception/physiology , Adult , Child , Female , Foot , Humans , Male , Young Adult
6.
Soc Cogn Affect Neurosci ; 13(12): 1269-1279, 2018 12 04.
Article in English | MEDLINE | ID: mdl-30351422

ABSTRACT

Electroencephalography (EEG) shows reduced alpha-band (8-12 Hz) oscillations over sensorimotor cortex both when actions are executed and when they are observed. This 'µ-alpha' suppression is thought to reflect mental simulation of action, which has been argued to support internal representation of others' emotional states. Despite the proposed role of simulation in emotion perception, little is known about the effect of emotional content on µ-suppression. We recorded high-density EEG while participants viewed point-light displays of emotional vs neutral body movements in 'coherent' biologically plausible and 'scrambled' configurations. Although coherent relative to scrambled stimuli elicited µ-alpha suppression, the comparison of emotional and neutral movement, controlling for basic visual input, revealed suppression effects in both the alpha and beta bands. Whereas alpha-band activity reflected reduced power for emotional stimuli in central and occipital sensors, beta power at frontocentral sites was driven by enhancement for neutral relative to emotional actions. A median split by autism-spectrum quotient score revealed weaker µ-alpha suppression and beta enhancement in participants with autistic tendencies, suggesting that sensorimotor simulation may be differentially engaged depending on social capabilities. Consistent with theories of embodied emotion, these data support a link between simulation and social perception while more firmly connecting emotional processing to the activity of sensorimotor systems.


Subject(s)
Emotions/physiology , Movement/physiology , Adult , Autistic Disorder , Beta Rhythm , Electroencephalography , Female , Humans , Male , Young Adult
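
The µ-alpha suppression measure in entry 6 is conventionally quantified as a drop in 8-12 Hz power over sensorimotor electrodes during observation relative to a baseline period. A minimal sketch of one common formulation follows, using Welch power spectra and a log-ratio index; the sampling rate, epoch shapes, and channel selection are assumptions rather than the published analysis.

```python
# Sketch: alpha-band (8-12 Hz) power and a mu-suppression index from epoched
# EEG. Sampling rate, epoch layout, and channel selection are illustrative
# assumptions; the published pipeline will differ in detail.
import numpy as np
from scipy.signal import welch

FS = 500          # sampling rate in Hz (assumed)
ALPHA = (8, 12)   # mu-alpha band

def band_power(epoch, fs=FS, band=ALPHA):
    """Mean power in `band` for one epoch of shape (n_channels, n_samples)."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)  # 1-s windows
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[:, mask].mean(axis=1)              # one value per channel

rng = np.random.default_rng(0)
baseline_epoch = rng.standard_normal((2, 2 * FS))  # 2 central channels, 2 s
stimulus_epoch = rng.standard_normal((2, 2 * FS))

# Log-ratio below zero indicates suppression relative to baseline.
suppression = np.log(band_power(stimulus_epoch) / band_power(baseline_epoch))
print(suppression)
```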
7.
Dev Sci ; 18(2): 243-53, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25041388

ABSTRACT

Reading others' emotional body expressions is an essential social skill. Adults readily recognize emotions from body movements. However, it is unclear when in development infants become sensitive to bodily expressed emotions. We examined event-related brain potentials (ERPs) in 4- and 8-month-old infants in response to point-light displays (PLDs) of happy and fearful body expressions presented in two orientations (upright and inverted). The ERP results revealed that 8-month-olds but not 4-month-olds respond sensitively to the orientation and the emotion of the dynamic expressions. Specifically, 8-month-olds showed (i) an early (200-400 ms) orientation-sensitive positivity over frontal and central electrodes, and (ii) a late (700-1100 ms) emotion-sensitive positivity over temporal and parietal electrodes in the right hemisphere. These findings suggest that orientation-sensitive and emotion-sensitive brain processes, distinct in timing and topography, develop between 4 and 8 months of age.


Subject(s)
Brain/physiology , Child Development , Emotions/physiology , Evoked Potentials/physiology , Expressed Emotion/physiology , Brain Mapping , Child, Preschool , Electroencephalography , Female , Functional Laterality , Humans , Male , Photic Stimulation , Time Factors , Visual Perception/physiology
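
The infant ERP effects in entry 7 are defined by mean amplitudes in fixed post-stimulus windows (200-400 ms over frontal/central sites, 700-1100 ms over right temporal/parietal sites). The sketch below shows how such window amplitudes might be extracted from epoched data; the sampling rate, epoch layout, and electrode indices are placeholder assumptions.

```python
# Sketch: mean ERP amplitude in a post-stimulus time window, averaged over an
# electrode group. Sampling rate, baseline length, and channel indices are
# assumptions, not the authors' parameters.
import numpy as np

FS = 250           # Hz (assumed)
BASELINE = 0.2     # seconds of pre-stimulus baseline in each epoch (assumed)

def window_amplitude(epochs, channels, t_start, t_end, fs=FS, onset=BASELINE):
    """Mean amplitude in [t_start, t_end] s post-stimulus.

    `epochs` has shape (n_trials, n_channels, n_samples).
    """
    i0 = int((onset + t_start) * fs)
    i1 = int((onset + t_end) * fs)
    return epochs[:, channels, i0:i1].mean()

rng = np.random.default_rng(1)
epochs_happy = rng.standard_normal((40, 32, int(1.5 * FS)))  # fake data
epochs_fear = rng.standard_normal((40, 32, int(1.5 * FS)))

frontal_central = [0, 1, 2, 3]  # placeholder electrode indices
early = [window_amplitude(e, frontal_central, 0.2, 0.4)
         for e in (epochs_happy, epochs_fear)]
print("early positivity (happy, fear):", early)
```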
8.
Front Hum Neurosci ; 8: 531, 2014.
Article in English | MEDLINE | ID: mdl-25104929

ABSTRACT

Responding to others' emotional body expressions is an essential social skill in humans. Adults readily detect emotions from body postures, but it is unclear whether infants are sensitive to emotional body postures. We examined 8-month-old infants' brain responses to emotional body postures by measuring event-related potentials (ERPs) to happy and fearful bodies. Our results revealed two emotion-sensitive ERP components: body postures evoked an early N290 at occipital electrodes and a later Nc at fronto-central electrodes that were enhanced in response to fearful (relative to happy) expressions. These findings demonstrate that: (a) 8-month-old infants discriminate between static emotional body postures; and (b) similar to infant emotional face perception, the sensitivity to emotional body postures is reflected in early perceptual (N290) and later attentional (Nc) neural processes. This provides evidence for an early developmental emergence of the neural processes involved in the discrimination of emotional body postures.

9.
Brain Cogn ; 82(2): 219-27, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23561915

ABSTRACT

According to Damasio's somatic marker hypothesis, emotions are generated by conveying the current state of the body to the brain through interoceptive and proprioceptive afferent input. The resulting brain activation patterns represent unconscious emotions and correlate with subjective feelings. This proposition implies a corollary that the deliberate control of motor behavior could regulate feelings. We tested this possibility, hypothesizing that engaging in movements associated with a certain emotion would enhance that emotion and/or the corresponding valence. Furthermore, because motor imagery and observation are thought to activate the same mirror-neuron network engaged during motor execution, they might also activate the same emotional processing circuits, leading to similar emotional effects. Therefore, we measured the effects of motor execution, motor imagery and observation of whole-body dynamic expressions of emotions (happiness, sadness, fear) on affective state. All three tasks enhanced the corresponding affective state, indicating their potential to regulate emotions.


Subject(s)
Affect/physiology , Emotions/physiology , Imagination/physiology , Movement/physiology , Adult , Brain/physiology , Humans , Surveys and Questionnaires
10.
Cognition ; 124(3): 261-71, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22717166

ABSTRACT

Personality trait attribution can underpin important social decisions and yet requires little effort; even a brief exposure to a photograph can generate lasting impressions. Body movement is a channel readily available to observers and allows judgements to be made when facial and body appearances are less visible; e.g., from great distances. Across three studies, we assessed the reliability of trait judgements of point-light walkers and identified motion-related visual cues driving observers' judgements. The findings confirm that observers make reliable, albeit inaccurate, trait judgements, and these were linked to a small number of motion components derived from a Principal Component Analysis of the motion data. Parametric manipulation of the motion components linearly affected trait ratings, providing strong evidence that the visual cues captured by these components drive observers' trait judgements. Subsequent analyses suggest that reliability of trait ratings was driven by impressions of emotion, attractiveness and masculinity.


Subject(s)
Cues , Gait/physiology , Judgment/physiology , Personality/physiology , Behavior , Female , Gender Identity , Humans , Male , Movement , Photic Stimulation , Principal Component Analysis , Reproducibility of Results , Social Perception , Young Adult
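
Entry 10 derives motion components by running Principal Component Analysis over point-light walker trajectories and then manipulating individual components parametrically. A minimal sketch of that approach follows, on synthetic data; the marker count, frame count, and gain factor are illustrative assumptions.

```python
# Sketch: PCA over point-light walker trajectories and parametric manipulation
# of one motion component. Data dimensions are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
# 50 walkers x flattened trajectory (13 markers x 2 coords x 100 frames)
X = rng.standard_normal((50, 13 * 2 * 100))

pca = PCA(n_components=5)
scores = pca.fit_transform(X)          # per-walker motion-component scores

# Exaggerate or attenuate component 0 for the first walker by a gain factor,
# then reconstruct the trajectory; observers could then rate such
# parametrically manipulated walkers.
gain = 2.0
s = scores[0].copy()
s[0] *= gain
manipulated = pca.inverse_transform(s.reshape(1, -1))[0]
print(manipulated.shape)               # (2600,) flattened trajectory
```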
11.
Neuroimage ; 59(2): 1700-12, 2012 Jan 16.
Article in English | MEDLINE | ID: mdl-21924368

ABSTRACT

Neural regions selective for facial or bodily form also respond to facial or bodily motion in highly form-degraded point-light displays. Yet it is unknown whether these face-selective and body-selective regions are sensitive to human motion regardless of stimulus type (faces and bodies) or to the specific motion-related cues characteristic of their proprietary stimulus categories. Using fMRI, we show that facial and bodily motions activate selectively those populations of neurons that code for the static structure of faces and bodies. Bodily (vs. facial) motion activated body-selective EBA bilaterally and right but not left FBA, irrespective of whether observers judged the emotion or color-change in point-light angry, happy and neutral stimuli. Facial (vs. bodily) motion activated face-selective right and left FFA, but only during emotion judgments for right FFA. Moreover, the strength of responses to point-light bodies vs. faces positively correlated with voxelwise selectivity for static bodies but not faces, whereas the strength of responses to point-light faces positively correlated with voxelwise selectivity for static faces but not bodies. Emotional content carried by point-light form-from-motion cues was sufficient to enhance the activity of several regions, including bilateral EBA and right FFA and FBA. However, although the strength of emotional modulation in right and left EBA by point-light body movements was related to the degree of voxelwise selectivity to static bodies but not static faces, there was no evidence that emotional modulation in fusiform cortex occurred in a similarly stimulus category-selective manner. This latter finding strongly constrains the claim that emotionally expressive movements modulate precisely those neuronal populations that code for the viewed stimulus category.


Subject(s)
Affect/physiology , Body Image , Brain/physiology , Expressed Emotion/physiology , Motion Perception/physiology , Pattern Recognition, Visual/physiology , Photic Stimulation/methods , Adaptation, Physiological/physiology , Adult , Face , Female , Humans , Male , Young Adult
12.
Philos Trans R Soc Lond B Biol Sci ; 366(1571): 1726-38, 2011 Jun 12.
Article in English | MEDLINE | ID: mdl-21536556

ABSTRACT

Face processing relies on a distributed, patchy network of cortical regions in the temporal and frontal lobes that respond disproportionately to face stimuli, other cortical regions that are not even primarily visual (such as somatosensory cortex), and subcortical structures such as the amygdala. Higher-level face perception abilities, such as judging identity, emotion and trustworthiness, appear to rely on an intact face-processing network that includes the occipital face area (OFA), whereas lower-level face categorization abilities, such as discriminating faces from objects, can be achieved without OFA, perhaps via the direct connections to the fusiform face area (FFA) from several extrastriate cortical areas. Some lesion, transcranial magnetic stimulation (TMS) and functional magnetic resonance imaging (fMRI) findings argue against a strict feed-forward hierarchical model of face perception, in which the OFA is the principal and common source of input for other visual and non-visual cortical regions involved in face perception, including the FFA, face-selective superior temporal sulcus and somatosensory cortex. Instead, these findings point to a more interactive model in which higher-level face perception abilities depend on the interplay between several functionally and anatomically distinct neural regions. Furthermore, the nature of these interactions may depend on the particular demands of the task. We review the lesion and TMS literature on this topic and highlight the dynamic and distributed nature of face processing.


Subject(s)
Cerebral Cortex/physiology , Face , Neuropsychology , Visual Perception , Animals , Brain Mapping , Emotions/physiology , Humans , Magnetic Resonance Imaging , Models, Psychological , Prosopagnosia/psychology , Psychomotor Performance , Transcranial Magnetic Stimulation
13.
J Cogn Neurosci ; 23(10): 2782-96, 2011 Oct.
Article in English | MEDLINE | ID: mdl-21254798

ABSTRACT

Judging the sex of faces relies on cues related to facial morphology and spatial relations between features, whereas judging the trustworthiness of faces relies on both structural and expressive cues that signal affective valence. The right occipital face area (OFA) processes structural cues and has been associated with sex judgments, whereas the posterior STS processes changeable facial cues related to muscle movements and is activated when observers judge trustworthiness. It is commonly supposed that the STS receives inputs from the OFA, yet it is unknown whether these regions have functionally dissociable, critical roles in sex and trustworthiness judgments. We addressed this issue using event-related, fMRI-guided repetitive transcranial magnetic stimulation (rTMS). Twelve healthy volunteers judged the sex of individually presented faces and, in a separate session, whether those same faces were trustworthy or not. Relative to sham stimulation, reaction times (RTs) were significantly longer for sex judgments when rTMS was delivered over the right OFA but not the right or left STS, and for trustworthiness judgments on male but not female faces when rTMS was delivered over the right or left STS but not the right OFA. Nonetheless, an analysis of the RT distributions revealed a possible critical role also for the right OFA in trustworthiness judgments, limited to faces with longer RTs, perhaps reflecting the later, ancillary use of structural cues related to the sex of the face. On the whole, our findings provide evidence that evaluations of the trustworthiness and sex of faces rely on functionally dissociable cortical regions.


Subject(s)
Cerebral Cortex/physiology , Evoked Potentials/physiology , Functional Laterality/physiology , Gender Identity , Interpersonal Relations , Judgment , Adult , Analysis of Variance , Brain Mapping , Cues , Electroencephalography/methods , Female , Humans , Male , Middle Aged , Pattern Recognition, Visual/physiology , Photic Stimulation , Reaction Time , Transcranial Magnetic Stimulation
14.
J Neurosci ; 30(30): 10127-34, 2010 Jul 28.
Article in English | MEDLINE | ID: mdl-20668196

ABSTRACT

Basic emotional states (such as anger, fear, and joy) can be similarly conveyed by the face, the body, and the voice. Are there human brain regions that represent these emotional mental states regardless of the sensory cues from which they are perceived? To address this question, in the present study participants evaluated the intensity of emotions perceived from face movements, body movements, or vocal intonations, while their brain activity was measured with functional magnetic resonance imaging (fMRI). Using multivoxel pattern analysis, we compared the similarity of response patterns across modalities to test for brain regions in which emotion-specific patterns in one modality (e.g., faces) could predict emotion-specific patterns in another modality (e.g., bodies). A whole-brain searchlight analysis revealed modality-independent but emotion category-specific activity patterns in medial prefrontal cortex (MPFC) and left superior temporal sulcus (STS). Multivoxel patterns in these regions contained information about the category of the perceived emotions (anger, disgust, fear, happiness, sadness) across all modality comparisons (face-body, face-voice, body-voice), and independently of the perceived intensity of the emotions. No systematic emotion-related differences were observed in the overall amplitude of activation in MPFC or STS. These results reveal supramodal representations of emotions in high-level brain areas previously implicated in affective processing, mental state attribution, and theory-of-mind. We suggest that MPFC and STS represent perceived emotions at an abstract, modality-independent level, and thus play a key role in the understanding and categorization of others' emotional mental states.


Subject(s)
Brain Mapping , Brain/physiology , Emotions/physiology , Models, Neurological , Adult , Analysis of Variance , Brain/blood supply , Face , Facial Expression , Female , Humans , Image Processing, Computer-Assisted/methods , Magnetic Resonance Imaging/methods , Male , Oxygen/blood , Pattern Recognition, Visual/physiology , Photic Stimulation/methods , Reaction Time/physiology , Young Adult
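
The cross-modal test in entry 14 hinges on training an emotion classifier on multivoxel patterns from one modality and testing it on patterns from another. The sketch below illustrates that train/test logic with a linear classifier on synthetic data; in the actual study this comparison ran within searchlight spheres across the whole brain, and all shapes, counts, and labels here are assumptions.

```python
# Sketch: cross-modal decoding of emotion category from multivoxel patterns,
# training on one modality (faces) and testing on another (bodies). Voxel
# counts, trial counts, and data are illustrative assumptions; a real
# analysis would repeat this per searchlight sphere with cross-validation.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(3)
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness"]
n_per_emotion, n_voxels = 20, 150

def fake_patterns():
    X = rng.standard_normal((n_per_emotion * len(EMOTIONS), n_voxels))
    y = np.repeat(EMOTIONS, n_per_emotion)
    return X, y

X_face, y_face = fake_patterns()   # stand-in for face-evoked patterns
X_body, y_body = fake_patterns()   # stand-in for body-evoked patterns

clf = LinearSVC(dual=False).fit(X_face, y_face)
acc = (clf.predict(X_body) == y_body).mean()
print(f"cross-modal accuracy: {acc:.2f} (chance = {1 / len(EMOTIONS):.2f})")
```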
15.
Cortex ; 46(5): 650-7, 2010 May.
Article in English | MEDLINE | ID: mdl-19505685

ABSTRACT

Paraneoplastic limbic encephalitis (PNLE) affects limbic portions of the brain associated with recognition of social signals of emotions. Yet it is not known whether this perceptual ability is impaired in individuals with PNLE. We therefore conducted a single case study to explore possible impairments in recognising facially, vocally and bodily expressed emotions, using standardised emotion recognition tests. Facial expression recognition was tested with two forced-choice emotion-labelling tasks using static faces with either prototypical or morphed blends of basic emotions. Recognition of vocally and bodily expressed emotions was also tested with forced-choice labelling tasks, one based on prosodic cues, the other on whole-body movement cues. We found a deficit in fear and disgust recognition from both face and voice, while recognition of bodily expressed emotions was unaffected. These findings are consistent with data from previous studies demonstrating critical roles for certain brain regions - particularly the amygdala and insular cortex - in processing facially and vocally displayed basic emotions, and furthermore, suggest that recognition of bodily expressed emotions may not depend on neural structures involved in facial and vocal emotion recognition. Impaired facial and vocal emotion recognition may form a further neuropsychological marker of limbic encephalitis, in addition to the already well-described mnestic deficits.


Subject(s)
Emotions , Limbic Encephalitis , Recognition, Psychology , Social Perception , Speech Perception , Visual Perception , Acoustic Stimulation , Adult , Brain/pathology , Face , Facial Expression , Humans , Limbic Encephalitis/pathology , Magnetic Resonance Imaging , Male , Neuropsychological Tests , Pattern Recognition, Physiological , Pattern Recognition, Visual , Photic Stimulation , Speech
16.
Neuropsychologia ; 47(13): 3023-9, 2009 Nov.
Article in English | MEDLINE | ID: mdl-19500604

ABSTRACT

Recent research has confirmed that individuals with autism spectrum disorder (ASD) have difficulties in recognizing emotions from body movements. Difficulties in perceiving coherent motion are also common in ASD. Yet it is unknown whether these two impairments are related. Thirteen adults with ASD and 16 age- and IQ-matched typically developing (TD) adults classified basic emotions from point-light and full-light displays of body movements and discriminated the direction of coherent motion in random-dot kinematograms. The ASD group was reliably less accurate in classifying emotions regardless of stimulus display type, and in perceiving coherent motion. As predicted, ASD individuals with higher motion coherence thresholds were less accurate in classifying emotions from body movements, especially in the point-light displays; this relationship was not evident for the TD group. The results are discussed in relation to recent models of biological motion processing and known abnormalities in the neural substrates of motion and social perception in ASD.


Subject(s)
Autistic Disorder/psychology , Child Development Disorders, Pervasive/psychology , Emotions , Motion Perception , Movement , Recognition, Psychology , Adolescent , Adult , Asperger Syndrome/psychology , Case-Control Studies , Child , Choice Behavior , Female , Humans , Male , Middle Aged
17.
Neuropsychologia ; 45(12): 2772-82, 2007 Sep 20.
Article in English | MEDLINE | ID: mdl-17561172

ABSTRACT

Bilateral amygdala lesions impair the ability to identify certain emotions, especially fear, from facial expressions, and neuroimaging studies have demonstrated differential amygdala activation as a function of the emotional expression of faces, even under conditions of subliminal presentation, and again especially for fear. Yet the amygdala's role in processing emotion from other classes of stimuli remains poorly understood. On the basis of its known connectivity as well as prior studies in humans and animals, we hypothesised that the amygdala would be important also for the recognition of fear from body expressions. To test this hypothesis, we assessed a patient (S.M.) with complete bilateral amygdala lesions who is known to be severely impaired at recognising fear from faces. S.M. completed a battery of tasks involving forced-choice labelling and rating of the emotions in two sets of dynamic body movement stimuli, as well as in a set of static body postures. Unexpectedly, S.M.'s performance was completely normal. We replicated the finding in a second rare subject with bilateral lesions entirely confined to the amygdala. Compared to healthy comparison subjects, neither of the amygdala lesion subjects was impaired in identifying fear from any of these displays. Thus, whatever the role of the amygdala in processing whole-body fear cues, it is apparently not necessary for the normal recognition of fear from either static or dynamic body expressions.


Subject(s)
Amygdala/injuries , Cues , Facial Expression , Fear/psychology , Recognition, Psychology/physiology , Adolescent , Adult , Emotions/physiology , Female , Gestures , Humans , Kinesics , Middle Aged , Neuropsychological Tests , Photic Stimulation , Posture/physiology , Reaction Time/physiology
18.
Soc Cogn Affect Neurosci ; 2(4): 274-83, 2007 Dec.
Article in English | MEDLINE | ID: mdl-18985133

ABSTRACT

Emotionally expressive faces have been shown to modulate activation in visual cortex, including face-selective regions in the ventral temporal lobe. Here, we tested whether emotionally expressive bodies similarly modulate activation in body-selective regions. We show that dynamic displays of bodies with various emotional expressions vs neutral bodies produce significant activation in two distinct body-selective visual areas, the extrastriate body area and the fusiform body area. Multi-voxel pattern analysis showed that the strength of this emotional modulation was related, on a voxel-by-voxel basis, to the degree of body selectivity, while there was no relation with the degree of selectivity for faces. Across subjects, amygdala responses to emotional bodies positively correlated with the modulation of body-selective areas. Together, these results suggest that emotional cues from body movements produce topographically selective influences on category-specific populations of neurons in visual cortex, and these increases may implicate discrete modulatory projections from the amygdala.


Subject(s)
Affect , Amygdala/physiology , Body Image , Choice Behavior , Expressed Emotion , Facial Expression , Visual Perception , Female , Humans , Magnetic Resonance Imaging , Male , Temporal Lobe/physiology , Visual Cortex/physiology , Young Adult
19.
Cognition ; 104(1): 59-72, 2007 Jul.
Article in English | MEDLINE | ID: mdl-16831411

ABSTRACT

The importance of kinematics in emotion perception from body movement has been widely demonstrated. Evidence also suggests that the perception of biological motion relies to some extent on information about spatial and spatiotemporal form, yet the contribution of such form-related cues to emotion perception remains unclear. This study reports, for the first time, the relative effects on emotion recognition of inverting and motion-reversing patch-light compared to fully illuminated displays of whole-body emotion gestures. Inverting the gesture movies or playing them backwards significantly impaired emotion classification accuracy, but did so more for patch-light displays than for identical but fully illuminated movement sequences. This result suggests that inversion impairs the processing of form information related to the configuration of body parts, and reversal impairs the sequencing of form changes, more than these manipulations impair the processing of kinematic cues. This effect was strongest for inversion, suggesting an important role for configural information in emotion recognition. Nevertheless, even in combination these stimulus manipulations did not abolish above chance recognition of any of the emotions, suggesting that kinematics help distinguish emotions expressed by body gestures. Disproportionate impairments in recognition accuracy were observed for fear and disgust under inversion, and for fear under motion reversal, suggesting a greater role for form-related cues in the perception of these emotions.


Subject(s)
Affect , Form Perception , Gestures , Motion Perception , Recognition, Psychology , Adolescent , Adult , Cues , Female , Humans , Male , Space Perception
20.
Percept Psychophys ; 67(7): 1199-213, 2005 Oct.
Article in English | MEDLINE | ID: mdl-16502842

ABSTRACT

Previous research with speeded-response interference tasks modeled on the Garner paradigm has demonstrated that task-irrelevant variations in either emotional expression or facial speech do not interfere with identity judgments, but irrelevant variations in identity do interfere with expression and facial speech judgments. Sex, like identity, is a relatively invariant aspect of faces. Drawing on a recent model of face processing according to which invariant and changeable aspects of faces are represented in separate neurological systems, we predicted asymmetric interference between sex and emotion classification. The results of Experiment 1, in which the Garner paradigm was employed, confirmed this prediction: Emotion classifications were influenced by the sex of the faces, but sex classifications remained relatively unaffected by facial expression. A second experiment, in which the difficulty of the tasks was equated, corroborated these findings, indicating that differences in processing speed cannot account for the asymmetric relationship between facial emotion and sex processing. A third experiment revealed the same pattern of asymmetric interference through the use of a variant of the Simon paradigm. To the extent that Garner interference and Simon interference indicate interactions at perceptual and response-selection stages of processing, respectively, a challenge for face processing models is to show how the same asymmetric pattern of interference could occur at these different stages. The implications of these findings for the functional independence of the different components of face processing are discussed.


Subject(s)
Affect , Face , Facial Asymmetry , Visual Perception , Adolescent , Adult , Facial Expression , Female , Humans , Male , Reaction Time , Recognition, Psychology