Results 1 - 6 of 6
1.
J Affect Disord ; 291: 46-56, 2021 08 01.
Article in English | MEDLINE | ID: mdl-34023747

ABSTRACT

Cognitive bias in depression may increase sensitivity to the judgmental appraisal of communicative cues. Nonverbal communication, including co-speech gestures, is crucial for social functioning and is perceived differently by men and women; however, little is known about how depression affects the perception of appraisal. We investigated whether a cognitive bias influences the perception of appraisal and the judgement of nonverbal communication in major depressive disorder (MDD). While watching videos of speakers retelling a story and gesticulating, 22 patients with MDD and 22 matched healthy controls pressed a button whenever they perceived the speaker as appraising in a positive or negative way. The speakers were presented in four conditions (with and without speech, and as a natural speaker or as a stick figure) to evaluate context effects. Inter-subject covariance (ISC) of the button-press time series measured the consistency of response patterns across groups as a function of diagnosis and gender. Significant effects emerged for diagnosis (p = .002), gender (p = .007), and their interaction (p < .001). Female healthy controls perceived the gestures as appraising more consistently than male controls, female patients, and male patients, whereas the latter three groups did not differ. Furthermore, the ISC measure of consistency correlated negatively with depression severity. The natural-speaker video without audio speech yielded the highest response consistency. Co-speech gestures themselves may drive these ISC effects, because the number of gestures, but not of facial shrugs, correlated with ISC amplitude. During co-speech gestures, a cognitive bias thus led to a disturbed perception of appraisal in female patients with MDD. Social communication is critical for functional outcomes in mental disorders; the perception of gestural communication is therefore an important target in rehabilitation.


Subject(s)
Depressive Disorder, Major , Speech Perception , Female , Gestures , Humans , Male , Nonverbal Communication , Perception , Speech
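The inter-subject covariance (ISC) measure used in the study above can be illustrated with a short sketch. This is not the authors' analysis code; it is a minimal leave-one-out inter-subject correlation over per-subject button-press time series, with the function name, array shape, and binning all assumed for illustration.

```python
import numpy as np

def isc(timeseries):
    """Leave-one-out inter-subject correlation.

    timeseries: array of shape (n_subjects, n_timepoints), e.g. binned
    button-press counts per subject over the course of the video.
    Returns the mean Pearson correlation between each subject's series
    and the average series of the remaining subjects; higher values
    mean more consistent response patterns across the group.
    """
    ts = np.asarray(timeseries, dtype=float)
    n_subjects = ts.shape[0]
    corrs = []
    for i in range(n_subjects):
        # Average everyone except subject i, then correlate.
        others_mean = np.delete(ts, i, axis=0).mean(axis=0)
        r = np.corrcoef(ts[i], others_mean)[0, 1]
        corrs.append(r)
    return float(np.mean(corrs))
```

On this sketch, a group whose members respond in lockstep yields an ISC near 1, while unrelated response patterns yield values near 0, which is the sense in which ISC quantifies response consistency per group.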
2.
Brain Lang ; 205: 104772, 2020 06.
Article in English | MEDLINE | ID: mdl-32126372

ABSTRACT

This paper presents an fMRI study of how healthy adults understand metaphors in multimodal communication. We investigated metaphors expressed either only in coverbal gestures ("monomodal metaphors") or in speech with accompanying gestures ("multimodal metaphors"). Monomodal metaphoric gestures convey metaphoric information not expressed in the accompanying speech (e.g. saying the non-metaphoric utterance "She felt bad" while dropping the hand with palm facing up; here, the gesture alone indicates metaphoricity), whereas coverbal gestures in multimodal metaphors indicate metaphoricity redundantly with the speech (e.g. saying the metaphoric utterance "Her spirits fell" while dropping the hand with palm facing up). In other words, in monomodal metaphors the gestures add information that is not spoken, whereas in multimodal metaphors the gestures can be redundant with the spoken content. Understanding and integrating the information in each modality, here spoken and visual, is important in multimodal communication, but most prior studies have considered only multimodal metaphors, where the gesture is redundant with what is spoken. Our participants watched audiovisual clips of an actor speaking while gesturing. We found that abstract metaphor comprehension recruited the lateral superior/middle temporal cortices, regardless of the modality in which the conceptual metaphor was expressed. These results suggest that abstract metaphors, regardless of modality, involve resources implicated in general semantic processing; they are consistent with the role of these areas in supramodal semantic processing as well as with the theory of embodied cognition.


Subject(s)
Communication , Gestures , Metaphor , Reaction Time/physiology , Semantics , Temporal Lobe/diagnostic imaging , Adolescent , Adult , Cognition/physiology , Comprehension/physiology , Female , Humans , Magnetic Resonance Imaging/methods , Male , Photic Stimulation/methods , Speech/physiology , Temporal Lobe/physiology , Young Adult
3.
Front Psychol ; 10: 254, 2019.
Article in English | MEDLINE | ID: mdl-30873059

ABSTRACT

This paper aims to evidence the inherently metonymic nature of co-speech gestures. Arguing that motivation in gesture involves iconicity (similarity), indexicality (contiguity), and habit (conventionality) to varying degrees, it demonstrates how a set of metonymic principles may lend a certain systematicity to experientially grounded processes of gestural abstraction and enaction. Introducing visuo-kinetic signs as an umbrella term for co-speech gestures and signed languages, the paper shows how a frame-based approach to gesture may integrate different cognitive/functional linguistic and semiotic accounts of metonymy (e.g., experiential domains, frame metonymy, contiguity, and pragmatic inferencing). The guiding assumption is that gestures metonymically profile deeply embodied, routinized aspects of familiar scenes, that is, the motivating context of frames. The discussion shows how gestures may evoke frame structures exhibiting varying degrees of groundedness, complexity, and schematicity: basic physical action and object frames; more complex frames; and highly abstract, complex frame structures. It thereby provides gestural evidence for the idea that metonymy is more basic and more directly experientially grounded than metaphor and thus often feeds into correlated metaphoric processes. Furthermore, the paper offers some initial insights into how metonymy also seems to induce the emergence of schematic patterns in gesture, which may result from action-based and discourse-driven processes of habituation and conventionalization. It exemplifies how these forces may engender grammaticalization of a basic physical action into a gestural marker that shows strong metonymic form reduction, decreased transitivity, and interacting pragmatic functions. Finally, addressing basic metonymic operations in signed lexemes elucidates certain similarities regarding sign constitution in gesture and sign. English and German multimodal discourse data as well as German Sign Language (DGS) are drawn upon to illustrate the theoretical points of the paper. Overall, this paper presents a unified account of metonymy's role in underpinning forms, functions, and patterns in visuo-kinetic signs.

4.
Front Hum Neurosci ; 12: 296, 2018.
Article in English | MEDLINE | ID: mdl-30154703

ABSTRACT

Social interactions arise from patterns of communicative signs, whose perception and interpretation require a multitude of cognitive functions. The semiotic framework of Peirce's Universal Categories (UCs) laid the groundwork for a novel cognitive-semiotic typology of social interactions. During functional magnetic resonance imaging (fMRI), 16 volunteers watched a movie narrative encompassing verbal and non-verbal social interactions. Three types of non-verbal interactions were coded ("unresolved," "non-habitual," and "habitual") based on a typology reflecting Peirce's UCs. As expected, the auditory cortex responded to verbal interactions, but non-verbal interactions modulated temporal areas as well. Conceivably, when speech was lacking, ambiguous visual information (unresolved interactions) primed auditory processing, in contrast to learned behavioral patterns (habitual interactions). The latter recruited a parahippocampal-occipital network supporting conceptual processing and associative memory retrieval. Non-habitual interactions, which demand semiotic contextualization, activated visuo-spatial and contextual rule-learning areas such as the temporo-parietal junction and the right lateral prefrontal cortex. In summary, the cognitive-semiotic typology reflected distinct sensory and association networks underlying the interpretation of observed non-verbal social interactions.

5.
Neuropsychologia ; 109: 232-244, 2018 01 31.
Article in English | MEDLINE | ID: mdl-29275004

ABSTRACT

In "Two heads are better than one," "heads" stands for people and focuses the message on people's intelligence. This is an example of figurative language through metonymy, where substituting one of its parts for a whole entity focuses attention on a specific aspect of that entity. Whereas metaphors, another figurative-language device, are substitutions based on similarity, metonymy involves substitutions based on association. Both are figures of speech, but both are also expressed in coverbal gestures during multimodal communication. The closest neuropsychological studies of metonymy in gestures have examined nonlinguistic tool use, illustrated by the classic apraxia contrast of body-part-as-object (BPO, equivalent to an internal metonymy representation of the tool) vs. pantomimed action (an external metonymy representation of the absent object/tool). Combining these research domains with concepts from cognitive linguistic research on gestures, we conducted an fMRI study to investigate metonymy resolution in coverbal gestures. Given the greater difficulty of external metonymy observed in developmental and apraxia studies, perhaps explained by the more complex semantic inferencing it involves compared with internal metonymy representations, we hypothesized that external metonymy resolution requires greater processing demands and that the neural resources supporting metonymy resolution would modulate regions involved in semantic processing. We found that there are indeed greater activations for external than for internal metonymy resolution in the temporoparietal junction (TPJ). This area is posterior to the lateral temporal regions recruited by metaphor processing. Effective connectivity analysis confirmed our hypothesis that metonymy resolution modulates areas implicated in semantic processing. We interpret our results in an interdisciplinary view of what metonymy in action can reveal about abstract cognition.


Subject(s)
Brain/physiology , Gestures , Language , Metaphor , Perception/physiology , Adolescent , Adult , Attention/physiology , Brain/diagnostic imaging , Brain Mapping , Cognition/physiology , Comprehension/physiology , Female , Humans , Magnetic Resonance Imaging , Male , Young Adult
6.
Front Hum Neurosci ; 11: 573, 2017.
Article in English | MEDLINE | ID: mdl-29249945

ABSTRACT

Face-to-face communication is multimodal; it encompasses spoken words, facial expressions, gaze, and co-speech gestures. In contrast to linguistic symbols (e.g., spoken words or signs in sign language), which rely mostly on explicit conventions, gestures vary in their degree of conventionality. Bodily signs may have a generally accepted or conventionalized meaning (e.g., a head shake) or less so (e.g., self-grooming). We hypothesized that the subjective perception of conventionality in co-speech gestures relies on the classical language network, i.e., the left-hemispheric inferior frontal gyrus (IFG, Broca's area) and the posterior superior temporal gyrus (pSTG, Wernicke's area), and studied 36 subjects watching video-recorded story retellings in a behavioral and a functional magnetic resonance imaging (fMRI) experiment. It is well documented that neural correlates of such naturalistic videos emerge as inter-subject covariance (ISC) in fMRI even without a stimulus model (model-free analysis). The subjects attended either to perceived conventionality or to a control condition (any hand movements or gesture-speech relations). Such tasks modulate ISC in the contributing neural structures; we therefore studied how ISC in language networks changes with task demands. Indeed, the conventionality task significantly increased the covariance of the button-press time series and neuronal synchronization in the left IFG relative to the other tasks. In the left IFG, synchronous activity was observed during the conventionality task only. In contrast, the left pSTG exhibited correlated activation patterns during all conditions, with an increase in the conventionality task at the trend level only. Conceivably, the left IFG can be considered a core region for the processing of perceived conventionality in co-speech gestures, similar to spoken language. In general, the interpretation of conventionalized signs may rely on neural mechanisms that engage during language comprehension.
