Results 1 - 13 of 13
1.
Behav Brain Res ; 471: 115126, 2024 Jun 29.
Article in English | MEDLINE | ID: mdl-38950784

ABSTRACT

In face-to-face social interactions, emotional expressions provide insights into the mental state of an interaction partner. This information can be crucial for inferring action intentions and reacting to another person's actions. Here we investigate how facial emotional expressions impact subjective experience and physiological and behavioral responses to social actions during real-time interactions. Thirty-two participants interacted with virtual agents while fully immersed in Virtual Reality. Agents displayed an angry or happy facial expression before they directed an appetitive (fist bump) or aversive (punch) social action towards the participant. Participants responded to these actions, either by reciprocating the fist bump or by defending against the punch. For all interactions, subjective experience was measured using ratings. In addition, physiological responses (electrodermal activity, electrocardiogram) and participants' response times were recorded. Aversive actions were judged to be more arousing and less pleasant than appetitive actions. In addition, angry expressions increased heart rate relative to happy expressions. Crucially, interaction effects between facial emotional expression and action were observed. Angry expressions reduced pleasantness more strongly for appetitive than for aversive actions. Furthermore, skin conductance responses to aversive actions were larger for happy than for angry expressions, and reaction times to aversive actions were faster than to appetitive actions when agents showed an angry expression. These results indicate that observers used facial emotional expressions to generate expectations for particular actions. Consequently, the present study demonstrates that observers integrate information from facial emotional expressions with actions during social interactions.

2.
Sci Rep ; 14(1): 7538, 2024 Mar 30.
Article in English | MEDLINE | ID: mdl-38553517

ABSTRACT

Cue exposure therapy (CET) in substance-use disorders aims to reduce craving and, ultimately, relapse rates. Applying CET in virtual reality (VR) has been proposed to increase its efficacy, as VR enables the presentation of social and environmental cues alongside substance-related stimuli. However, limited success has been reported so far when applying VR-CET for smoking cessation. Understanding whether the effects of VR-CET differ between future abstainers and relapsing smokers may help to improve VR-CET. Data from 102 participants allocated to the intervention arm (VR-CET) of a recent randomized controlled trial comparing VR-CET to relaxation in the context of smoking cessation were analyzed with respect to tolerability, presence, and craving during VR-CET. Cue exposure was conducted in four VR contexts (Loneliness/Rumination, Party, Stress, Café), each presented twice. Relapsed smokers, compared to abstainers, experienced higher craving during VR-CET and stronger craving responses, especially during the Stress scenario. Furthermore, lower mean craving during VR-CET positively predicted abstinence at the 6-month follow-up. Attempts to improve smoking cessation outcomes of VR-CET should aim to identify smokers who are more at risk of relapse based on high craving levels during VR-CET. Measuring craving responses specifically during social stress appears well suited to marking relapse risk. We propose investigating individualized treatment approaches accordingly.


Subject(s)
Tobacco Products, Virtual Reality, Humans, Craving, Smoking/therapy, Cues, Smokers, Recurrence
3.
Proc Natl Acad Sci U S A ; 120(47): e2306279120, 2023 Nov 21.
Article in English | MEDLINE | ID: mdl-37963247

ABSTRACT

Recent neurobiological models of language suggest that auditory sentence comprehension is supported by a coordinated temporal interplay within a left-dominant brain network, including the posterior inferior frontal gyrus (pIFG), posterior superior temporal gyrus and sulcus (pSTG/STS), and angular gyrus (AG). Here, we probed the timing and causal relevance of the interplay between these regions by means of concurrent transcranial magnetic stimulation and electroencephalography (TMS-EEG). Our TMS-EEG experiments reveal region- and time-specific causal evidence for a bidirectional information flow from left pSTG/STS to left pIFG and back during auditory sentence processing. Findings from an additional condition-and-perturb approach further suggest that the left pSTG/STS can be supported by the left AG in a state-dependent manner.


Subject(s)
Language, Transcranial Magnetic Stimulation, Cerebral Cortex, Parietal Lobe, Comprehension/physiology, Magnetic Resonance Imaging, Brain Mapping
4.
Sci Rep ; 13(1): 13968, 2023 Aug 26.
Article in English | MEDLINE | ID: mdl-37633990

ABSTRACT

Public speaking is a challenging task that requires practice. Virtual Reality makes it possible to present realistic public speaking scenarios for such practice; however, the role of the virtual audience during practice remains unknown. In the present study, 73 participants completed a Virtual Reality practice session in which the audience was manipulated to be supportive or unsupportive, or the presentation was practiced without an audience. Importantly, following the virtual practice, participants gave the presentation during a real university course via Zoom. We measured emotional experience, self-efficacy, and the subjective evaluation of performance at baseline, after VR practice, and after the real presentation. Additionally, participants' performance in the real presentation was evaluated by instructors blinded to condition. Supportive audiences, in contrast to unsupportive ones, led to more positive beliefs about one's own performance, while there were no changes in beliefs in the group without an audience. Importantly, practice in front of a supportive compared to an unsupportive audience resulted in a more positive evaluation of speaker confidence in real-life public speaking as rated by the instructors. These results demonstrate an impact of virtual social feedback during public speaking on subsequent subjective performance evaluation. This may increase self-confidence, resulting in improved public speaking performance in real life.


Subject(s)
Speech, Virtual Reality, Humans, Emotions, Mental Processes, Self Efficacy
5.
Biol Psychol ; 175: 108453, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36347358

ABSTRACT

Face-to-face social interactions are characterized by the reciprocal exchange of facial emotions between interaction partners. Typically, facial emotional expressions have been studied in passive observation paradigms, while interactive mechanisms remain unknown. In the current study we investigated how sending a facial emotional expression influenced the evaluation of an emotional expression received in return. Sixty-eight participants were cued to direct a facial emotional expression (happy, angry, neutral) towards a virtual agent in front of them. The virtual agent then responded with either the same or another emotional expression (happy, angry). Evaluation of the response expressions was measured via ratings of valence and arousal as well as EMG recordings of the M. corrugator supercilii and the M. zygomaticus major. Results revealed a significant interaction between the emotion of the initial facial expression and the response expression. Valence ratings of happy response expressions were higher when participants had initially displayed a smile compared to a neutral expression or a frown. This was also reflected in the EMG responses. Initiating an interaction with a smile increased zygomaticus activation for happy relative to angry response expressions compared to when the interaction was initiated with a frown. In contrast, no interplay of the initial and the response expression was observed in the corrugator. These findings demonstrate that smiling or frowning at another person can modulate socio-emotional processing of subsequent social cues. Therefore, the present study highlights the interactive nature of facial emotional expressions.


Subject(s)
Facial Expression, Smiling, Humans, Social Interaction, Electromyography, Emotions/physiology, Facial Muscles/physiology
6.
Sci Rep ; 12(1): 2213, 2022 Feb 9.
Article in English | MEDLINE | ID: mdl-35140279

ABSTRACT

During the COVID-19 pandemic, several behavioral measures have been implemented to reduce viral transmission. While these measures reduce the risk of infection, they may also increase risk behavior. Here, we experimentally investigate the influence of face masks on physical distancing. Eighty-four participants with or without face masks passed virtual agents in a supermarket environment to reach a target while interpersonal distance was recorded. Agents differed in whether they wore face masks and in age (young vs. elderly). In addition, situational constraints varied in whether keeping a distance of 1.5 m required an effortful detour or not. Wearing face masks (both self and other) reduced physical distancing. This reduction was most prominent when keeping the recommended distance was effortful, suggesting an influence of situational constraints. Similarly, increased distances to elderly agents were only observed when keeping the recommended distance was effortless. These findings highlight contextual constraints in compensation behavior and have important implications for safety policies.


Subject(s)
Masks, Virtual Reality, Adult, COVID-19/prevention & control, COVID-19/virology, Female, Humans, Male, Middle Aged, Physical Distancing, SARS-CoV-2/isolation & purification, Young Adult
7.
PLoS One ; 16(9): e0256912, 2021.
Article in English | MEDLINE | ID: mdl-34469494

ABSTRACT

Social interaction requires fast and efficient processing of another person's intentions. In face-to-face interactions, aversive or appetitive actions typically co-occur with emotional expressions, allowing an observer to anticipate action intentions. In the present study, we investigated the influence of facial emotions on the processing of action intentions. Thirty-two participants were presented with video clips showing virtual agents displaying a facial emotion (angry vs. happy) while performing an action (punch vs. fist bump) directed towards the observer. During each trial, video clips stopped at varying durations of the unfolding action, and participants had to recognize the presented action. Naturally, participants' recognition accuracy improved with increasing duration of the unfolding actions. Interestingly, while facial emotions did not influence accuracy, there was a significant influence on participants' action judgements. Participants were more likely to judge a presented action as a punch when agents showed an angry compared to a happy facial emotion. This effect was more pronounced in short video clips, showing only the beginning of an unfolding action, than in long video clips, showing near-complete actions. These results suggest that facial emotions influence anticipatory processing of action intentions, allowing for fast and adaptive responses in social interactions.


Subject(s)
Anger, Facial Expression, Facial Recognition, Prejudice/psychology, Social Interaction, Adult, Female, Happiness, Healthy Volunteers, Humans, Male, Photic Stimulation/methods, Young Adult
8.
Cortex ; 141: 311-321, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34118750

ABSTRACT

Listeners are sensitive to a speaker's individual language use and generate expectations for particular speakers. It is unclear, however, how such expectations affect online language processing. In the present EEG study, we presented thirty-two participants with auditory sentence stimuli from two speakers. The speakers differed in their use of two syntactic structures: easy subject-initial SOV structures and more difficult object-initial OSV structures. One speaker, the SOV-Speaker, had a high proportion of SOV sentences (75%) and a low proportion of OSV sentences (25%), and vice versa for the OSV-Speaker. Participants were exposed to the speakers' individual language use in a training session followed by a test session on the following day. The ERP results show that early stages of sentence processing are driven by syntactic processing only and are unaffected by speaker-specific expectations. At a late stage, however, an interaction between speaker and syntax information was observed. For the SOV-Speaker condition, the classical P600 effect reflected the effort of processing difficult and unexpected sentence structures. For the OSV-Speaker condition, both structures elicited different responses at frontal electrodes, possibly indexing the effort of switching from a local speaker model to a global model of language use. Overall, the study identifies distinct neural mechanisms related to speaker-specific expectations.


Subject(s)
Language, Speech Perception, Electroencephalography, Evoked Potentials, Humans
9.
Clin Psychol Eur ; 3(1): e3061, 2021 Mar.
Article in English | MEDLINE | ID: mdl-36397781

ABSTRACT

Background: Habits and behaviors in everyday life currently need to be modified as quickly as possible due to the COVID-19 pandemic. Two of the most effective tools to prevent infection appear to be regular and thorough hand-washing and physical distancing during interpersonal interactions. Method: Two hundred and eighty-four participants completed a short survey investigating how previous habits regarding hand-washing and physical distancing have changed in the general population as a function of the current pandemic and the accompanying increase in information and constant recommendations regarding these behaviors. Results: Participants aged 51 and older reported a greater change in everyday hand-washing behavior than younger participants. In addition, participants aged 31 and older selected significantly greater distances for having a conversation than younger participants. However, this was not the case when participants had to actively stop their conversational partner from approaching. Conclusion: Participants aged 51 years and older seem to be well aware of their at-risk status during the current pandemic and might therefore be willing to change their behavior more strongly than younger survey participants. Nevertheless, they seem to struggle with enforcing the current rules towards others. The group aged between 31 and 50 years, however, reported a comparable level of fear but no corresponding change in hand-washing behavior. Future surveys should try to provide more insight into why this might be the case.

10.
Front Psychiatry ; 11: 561, 2020.
Article in English | MEDLINE | ID: mdl-32595544

ABSTRACT

Physical distance is a prominent feature of face-to-face social interactions and allows social encounters to be regulated. Close interpersonal distance (IPD) increases emotional responses during interaction and has been related to avoidance behavior in social anxiety. However, a systematic investigation of the effects of IPD on subjective experience, combined with measures of physiological arousal and behavioral responses during real-time social interaction, has been lacking. Virtual Reality allows for a controlled manipulation of IPD while maintaining naturalistic social encounters. The present study investigates IPD in social interaction using a novel paradigm in Virtual Reality. Thirty-six participants approached virtual agents and engaged in short interactions. IPD was varied between 1 and 3.5 m by manipulating the distance at which agents reacted to the participant's approach. Closer distances were rated as more arousing, less pleasant, and less natural than longer distances, and this effect was significantly modulated by social anxiety scores. Skin conductance responses were also increased at short compared to longer distances. Finally, an interaction of IPD and social anxiety was observed for avoidance behavior, measured as participants' backward motion during interaction, with stronger avoidance related to close distances and high levels of social anxiety. These results highlight the influence of IPD on experience, physiological response, and behavior during social interaction. The interaction of social anxiety and IPD suggests that manipulating IPD in Virtual Reality behavioral tests is a promising tool for the treatment of social anxiety disorder.

11.
Cereb Cortex Commun ; 1(1): tgaa021, 2020.
Article in English | MEDLINE | ID: mdl-34296098

ABSTRACT

Effective natural communication requires listeners to incorporate not only very general linguistic principles acquired over a lifetime but also other information, such as the specific individual language use of a particular interlocutor. Traditionally, research has focused on general linguistic rules, and brain science has shown a left-hemispheric fronto-temporal brain network related to this processing. The present fMRI study explores speaker-specific individual language use, because it is unknown whether this processing is supported by similar or distinct neural structures. Twenty-eight participants listened to sentences from speakers who used either easier or more difficult language. This was achieved by manipulating the proportion of easy SOV vs. complex OSV sentences for each speaker. Furthermore, ambiguous probe sentences were included to test top-down influences of speaker information in the absence of syntactic structure information. We observed distinct neural processing for syntactic complexity and speaker-specific language use. Syntactic complexity correlated with activity in left frontal and posterior temporal regions. Speaker-specific processing correlated with activity in bilateral (right-dominant) fronto-parietal brain regions. Finally, the top-down influence of speaker information was found in frontal and striatal brain regions, suggesting a mechanism for controlled syntactic processing. These findings reveal distinct neural networks related to general language principles as well as speaker-specific individual language use.

12.
Cortex ; 115: 86-98, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30776735

ABSTRACT

Sentence comprehension requires the rapid analysis of semantic and syntactic information. These processes are supported by a left-hemisphere-dominant fronto-temporal network, including the left posterior inferior frontal gyrus (pIFG) and posterior superior temporal gyrus/sulcus (pSTG/STS). Previous electroencephalography (EEG) studies have associated semantic expectancy within a sentence with a modulation of the N400 and syntactic gender violations with increases in the LAN and P600. Here, we combined focal perturbations of neural activity by means of short bursts of transcranial magnetic stimulation (TMS) with simultaneous EEG recordings to probe the functional relevance of pIFG and pSTG/STS for sentence comprehension. We applied 10 Hz TMS bursts of three pulses at verb onset during auditory presentation of short sentences. Verb-based semantic expectancy and article-based syntactic gender requirements were manipulated for the sentence-final noun. We did not find any TMS effect at the noun. However, TMS had a short-lasting impact at the mid-sentence verb that differed between the two stimulation sites. Specifically, TMS over pIFG elicited a frontal positivity in the first 200 msec post verb onset, whereas the effect of TMS over pSTG/STS was limited to a parietal negativity at 200-400 msec post verb onset. This indicates that during verb processing in sentential context, frontal brain areas play an earlier role than temporal areas in predicting the upcoming noun. The short-lived perturbation effects at the mid-sentence verb suggest a high degree of online compensation within the language system, since processing of the sentence-final noun was unaffected.


Subject(s)
Comprehension/physiology, Evoked Potentials, Auditory/physiology, Frontal Lobe/physiology, Functional Laterality/physiology, Language, Temporal Lobe/physiology, Adult, Electroencephalography, Female, Humans, Male, Transcranial Magnetic Stimulation
13.
Sci Rep ; 7(1): 17581, 2017 Dec 14.
Article in English | MEDLINE | ID: mdl-29242511

ABSTRACT

Predictions allow for efficient human communication. To be efficient, listeners' predictions need to be adapted to the communicative context. Here we show that during speech processing this adaptation is a highly flexible and selective process that can fine-tune itself to the individual language styles of specific interlocutors. In a newly developed paradigm, speakers differed in the probabilities with which they used particular sentence structures. Probe trials were applied to infer participants' syntactic expectations for a given speaker and to track changes in these expectations over time. The results show that listeners fine-tune their linguistic expectations according to the individual language style of a speaker. Strikingly, nine months after the initial experiment, these highly specific expectations could be rapidly reactivated when listeners were confronted with the particular language style of a speaker, but not merely on the basis of an association with speaker identity per se. These findings highlight that communicative interaction fine-tunes and consolidates interlocutor-specific communicative predictions, which can overrule strong linguistic priors.


Subject(s)
Communication, Linguistics, Adult, Female, Humans, Male, Young Adult