Results 1 - 9 of 9
1.
J Exp Psychol Hum Percept Perform ; 50(6): 626-635, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38635224

ABSTRACT

Intentional binding refers to the subjective temporal compression between a voluntary action and its subsequent sensory outcome. Despite some studies challenging the link between temporal compression and intentional action, intentional binding is still widely used as an implicit measure for the sense of agency. The debate remains unsettled primarily because the experimental conditions used in previous studies were confounded with various alternative causes for temporal compression, and action intention has not yet been tested comprehensively against all potential alternative causes in a single study. Here, we solve this puzzle by jointly comparing participants' estimates of the interval between three types of triggering events with comparable predictability (voluntary movement, passive movement, or an external sensory event) and an external sensory outcome (auditory or visual across experiments). The results failed to show intentional binding, that is, no shorter interval estimation for the voluntary than the passive movement conditions. Instead, we observed temporal (but not intentional) binding when comparing both movement conditions with the external sensory condition. Thus, temporal binding appears to originate from sensory integration and temporal prediction, not from action intention. As such, these findings underscore the need to reconsider the use of "intentional binding" as a reliable proxy of the sense of agency. (PsycInfo Database Record (c) 2024 APA, all rights reserved).


Subject(s)
Intention , Psychomotor Performance , Time Perception , Humans , Adult , Young Adult , Male , Female , Time Perception/physiology , Psychomotor Performance/physiology , Auditory Perception/physiology , Visual Perception/physiology , Motor Activity/physiology
2.
Memory ; 31(9): 1113-1133, 2023 10.
Article in English | MEDLINE | ID: mdl-37649134

ABSTRACT

Most everyday experiences are multisensory, and all senses can trigger the conscious re-experience of unique personal events embedded in their specific spatio-temporal context. Yet, little is known about how a cue's sensory modality influences episodic memory, and which step of this process is impacted. This study investigated recognition and episodic memory across the olfactory, auditory and visual sensory modalities in a laboratory-ecological task using a non-immersive virtual reality device. At encoding, participants freely and actively explored unique and rich episodes in a three-room house where boxes delivered odours, musical pieces and pictures of faces. At retrieval, participants were presented with modality-specific memory cues and were asked to (1) recognise encoded cues among distractors and (2) go to the room and select the box in which they had encountered them at encoding. Memory performance and response times revealed that music and faces outperformed odours in recognition memory, but that odours and faces outperformed music in evoking the encoding context. Interestingly, correct recognition of music and faces was accompanied by deeper inhalations than correct rejections. By directly comparing memory performance across sensory modalities, our study demonstrates that despite limited recognition, odours are powerful cues for evoking specific episodic memories.


Subject(s)
Memory, Episodic , Music , Virtual Reality , Humans , Cues , Odorants
3.
Front Neurol ; 14: 1151515, 2023.
Article in English | MEDLINE | ID: mdl-37064179

ABSTRACT

Objectives: Virtual reality (VR) offers an ecological setting and the possibility of altering visual feedback during head movements, which is useful for vestibular research and for the treatment of vestibular disorders. There are, however, no data quantifying the vestibulo-ocular reflex (VOR) during the head impulse test (HIT) in VR. The main objective of this study was to assess the feasibility and performance of eye and head movement measurements of healthy subjects in a VR environment during high-velocity horizontal head rotation (VR-HIT) under a normal visual feedback condition. The secondary objective was to establish the feasibility of VR-HIT recordings in the same group of normal subjects under altered visual feedback conditions. Design: Twelve healthy subjects underwent video HIT using both a standard setup (vHIT) and VR-HIT. In VR, eye and head positions were recorded using, respectively, an embedded eye tracker and an infrared motion tracker. Subjects were tested under four conditions, one reproducing normal visual feedback and three simulating an altered gain or direction of visual feedback. During these three altered conditions, the movement of the visual scene relative to the head movement was decreased in amplitude by 50% (half), was nullified (freeze) or was inverted in direction (inverse). Results: Eye and head motion recording was successful during normal visual feedback as well as during all three altered conditions. There was no significant difference in VOR gain in VR-HIT between the normal, half, freeze and inverse conditions. In the normal condition, VOR gain was significantly but slightly (by 3%) different between VR-HIT and vHIT. Duration and amplitude of head impulses were significantly greater in VR-HIT than in vHIT. In all three altered VR-HIT conditions, covert saccades were present in approximately one out of four trials.
Conclusion: Our VR setup allowed high quality recording of eye and head data during head impulse test under normal and altered visual feedback conditions. This setup could be used to investigate compensation mechanisms in vestibular hypofunction, to elicit adaptation of VOR in ecological settings or to allow objective evaluation of VR-based vestibular rehabilitation.
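As a rough illustration of the gain measure compared across setups above, VOR gain is often computed as the ratio of the areas under the eye- and head-velocity traces during the impulse; the minimal sketch below uses synthetic data (all numbers are hypothetical, not taken from the study, and de-saccading of the eye trace is omitted for brevity):

```python
import numpy as np

def vor_gain(head_vel, eye_vel, dt):
    """VOR gain as the ratio of the areas under the eye- and
    head-velocity traces during the impulse (one common definition)."""
    head_area = np.sum(np.abs(head_vel)) * dt
    eye_area = np.sum(np.abs(eye_vel)) * dt
    return eye_area / head_area

# Synthetic 150 ms head impulse sampled at 250 Hz (hypothetical data):
# the eye compensates ~97% of the head rotation.
dt = 1 / 250
t = np.arange(0, 0.15, dt)
head = 150 * np.sin(np.pi * t / 0.15)   # bell-shaped head velocity (deg/s)
eye = -0.97 * head                      # compensatory eye velocity (deg/s)
gain = vor_gain(head, eye, dt)          # close to 0.97
```

A gain near 1 means the eye fully compensates the head rotation; deficits or the altered-feedback conditions described above would shift this ratio.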

4.
Physiol Behav ; 264: 114147, 2023 05 15.
Article in English | MEDLINE | ID: mdl-36893999

ABSTRACT

Humans can communicate their emotions to others via volatile emissions from their bodies. Although there is now solid evidence for human chemical communication of fear, stress and anxiety, investigations of positive emotions remain scarce. In a recent study, we found that women's heart rate and performance in creativity tasks were modulated by the body odors of men sampled while the men were in a positive versus a neutral mood. However, inducing positive emotions in laboratory settings remains challenging. Therefore, an important step in further investigating the human chemical communication of positive emotions is to develop new methods for inducing positive moods. Here, we present a new mood induction procedure (MIP) based on virtual reality (VR), which we assumed to be more powerful than the videos used in our previous study for inducing positive emotions. We hypothesized that, given the more intense emotions created, this VR-based MIP would consequently induce larger differences between receivers' responses to the positive body odor versus a neutral control body odor than the video-based MIP. The results confirmed the higher efficacy of VR in inducing positive emotions compared with videos. More specifically, VR had more repeatable effects across individuals. Although positive body odors had effects similar to those found in the previous video study, especially faster problem solving, these effects did not reach statistical significance. These outcomes are discussed as a function of the specificities of VR and of other methodological parameters that may have prevented the observation of such subtle effects and that should be better understood in future studies on human chemical communication.


Subject(s)
Body Odor , Emotions , Nonverbal Communication , Female , Humans , Male , Affect/physiology , Emotions/physiology , Virtual Reality , Nonverbal Communication/psychology , Young Adult , Adult
5.
Front Hum Neurosci ; 16: 981330, 2022.
Article in English | MEDLINE | ID: mdl-36248682

ABSTRACT

When describing motion along both the horizontal and vertical axes, languages from different families express the elements encoding verticality before those coding for horizontality (e.g., going up right instead of right up). In light of the motor grounding of language, the present study investigated whether the prevalence of verticality in Path expression also governs the trajectory of arm biological movements. Using a 3D virtual-reality setting, we tracked the kinematics of hand pointing movements in five spatial directions, two of which implied the vertical and horizontal vectors equally (i.e., up right +45° and bottom right -45°). Movement onset could be prompted by visual or auditory verbal cues, the latter being canonical in French ("en haut à droite"/up right) or not ("à droite en haut"/right up). In two experiments, analyses of the index finger kinematics revealed a significant effect of gravity, with earlier acceleration, velocity, and deceleration peaks for upward (+45°) than downward (-45°) movements, irrespective of the instructions. Remarkably, confirming the linguistic observations, we found that vertical kinematic parameters occurred earlier than horizontal ones for upward movements, both for visual and congruent verbal cues. Non-canonical verbal instructions significantly affected this temporal dynamic: for upward movements, the horizontal and vertical components temporally aligned, while they reversed for downward movements where the kinematics of the vertical axis was delayed with respect to that of the horizontal one. This temporal dynamic is so deeply anchored that non-canonical verbal instructions allowed for horizontality to precede verticality only for movements that do not fight against gravity. Altogether, our findings provide new insights into the embodiment of language by revealing that linguistic path may reflect the organization of biological movements, giving priority to the vertical axis.
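The kinematic landmarks analysed above (times of the acceleration, velocity and deceleration peaks) can be extracted from a position trace by numerical differentiation; the sketch below runs on a synthetic trajectory (all values hypothetical, not the study's data):

```python
import numpy as np

def kinematic_landmarks(position, dt):
    """Times (s) of the acceleration peak, velocity peak and
    deceleration peak of a 1-D trajectory sampled every dt seconds."""
    vel = np.gradient(position, dt)
    acc = np.gradient(vel, dt)
    t_acc = np.argmax(acc) * dt           # acceleration peak
    t_vel = np.argmax(np.abs(vel)) * dt   # velocity peak
    t_dec = np.argmin(acc) * dt           # deceleration peak
    return t_acc, t_vel, t_dec

# Synthetic 0.5 s pointing movement covering 0.3 m along one axis
dt = 0.001
t = np.arange(0, 0.5 + dt, dt)
pos = 0.15 * (1 - np.cos(np.pi * t / 0.5))
t_acc, t_vel, t_dec = kinematic_landmarks(pos, dt)
# landmarks occur in order: acceleration, then velocity, then deceleration
```

Comparing these timestamps between upward and downward movements, or between the vertical and horizontal components of an oblique movement, is the kind of analysis reported in the abstract.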

6.
Cereb Cortex Commun ; 3(3): tgac031, 2022.
Article in English | MEDLINE | ID: mdl-36072709

ABSTRACT

We constantly face situations involving interactions with others that require us to automatically adjust our physical distances to avoid discomfort or anxiety. A previous case study demonstrated that the integrity of both amygdalae is essential to regulate interpersonal distances. Although unilateral lesions to the amygdala, as well as to other sectors of the medial temporal cortex, are known to also affect social behavior, their role in the regulation of interpersonal distances has never been investigated. Here, we sought to fill this gap by testing three patients with unilateral temporal lesions following surgical resections, including one patient with a lesion mainly centered on the amygdala and two with lesions to the adjacent medial temporal cortex, on two versions of the stop distance paradigm (i.e., in a virtual reality environment and in a real setting). Our results showed that all three patients set shorter interpersonal distances compared with neurotypical controls. In addition, compared with controls, none of the patients adjusted these physical distances depending on facial emotional expressions, despite their preserved ability to categorize them. Finally, patients' heart rate responses differed from those of controls when viewing approaching faces. Our findings bring compelling evidence that unilateral lesions within the medial temporal cortex, not necessarily restricted to the amygdala, are sufficient to alter interpersonal distance, thus shedding new light on the neural circuitry regulating distance in social interactions.

7.
Cortex ; 138: 40-58, 2021 05.
Article in English | MEDLINE | ID: mdl-33677327

ABSTRACT

Accumulating evidence indicates that the peripersonal space (PPS) constitutes a privileged area for the efficient processing of proximal stimuli, allowing us to flexibly adapt our behavior to both the physical and the social environment. Whether and how behavioral and physiological signatures of PPS relate to each other in emotional contexts remains, however, elusive. Here, we addressed this question by having participants discriminate male from female faces depicting different emotions (happiness, anger or neutral) presented at different distances (50 cm-300 cm) while we measured the reaction time and accuracy of their responses, as well as pupillary diameter, heart rate and heart rate variability. Results showed facilitation of participants' performance (i.e., faster response times) when faces were presented close to, compared with far from, the participants, even when controlling for retinal size across distances. These behavioral effects were accompanied by significant modulation of participants' physiological indexes when faces were presented in PPS. Interestingly, both PPS representation and physiological signals were affected by features of the seen faces, such as their emotional valence and sex, as well as by the participants' sex, revealing the profound impact of the social context on the autonomic state and behavior within PPS. Together, these findings suggest that both external and internal signals contribute to shaping PPS representation.


Subject(s)
Emotions , Personal Space , Anger , Facial Expression , Female , Happiness , Humans , Male , Reaction Time
8.
Front Psychol ; 11: 510787, 2020.
Article in English | MEDLINE | ID: mdl-33192759

ABSTRACT

Previous research using immersive virtual reality (VR) has shown that after a short period of embodiment by White people in a Black virtual body, their implicit racial bias against Black people diminishes. Here we tested the effects of some socio-cognitive variables that could contribute to enhancing or reducing implicit racial bias. The first aim of the study was to assess the beneficial effects of cooperation within a VR scenario, the second was to provide preliminary testing of the hypothesis that empathy and political attitudes contribute to implicit bias about race, and the third was to explore the relationship between political attitudes and empathy. (Caucasian) participants were embodied in a Black virtual body and engaged either in a cooperative (Coop group) or in a non-cooperative (Neutral group) activity with a confederate experimenter embodying another Black avatar. Before and after VR, we measured participants' implicit racial bias by means of the Implicit Association Test (IAT) and their perceived closeness toward the confederate experimenter. Before VR, we also assessed participants' political attitudes and empathy traits. Results revealed that, compared with the Neutral group, the Coop group showed lower IAT scores after the social interaction. Interestingly, in the Neutral but not the Coop group, the perceived closeness toward the confederate experimenter was associated with the initial racial bias: the more the participants reduced their distance, the more they reduced their IAT score. Moreover, reported traits of empathy and political attitudes significantly explained the variance observed in the initial implicit bias, with perspective-taking, empathic concern and personal distress being significant predictors of the IAT scores. Finally, there was a relationship between political attitudes and empathy: the more participants considered themselves left-wing voters, the higher their perspective-taking and empathic concern scores.
We discuss these findings within the neuroscientific and social cognition field and encourage scholars from different domains to further explore whether and under which conditions a given manipulation for reducing racial bias could be efficiently transposed in VR.
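For readers unfamiliar with IAT scoring, the scores compared above are conventionally expressed as a D score; the sketch below is a deliberately simplified version of that idea (the full Greenwald et al. scoring algorithm also trims extreme latencies and penalises errors; all latencies are hypothetical):

```python
import statistics

def iat_d_score(compatible_rts, incompatible_rts):
    """Simplified IAT D score: the latency difference between the
    incompatible and compatible blocks, divided by the pooled standard
    deviation of all trials. Positive values indicate faster responses
    for the compatible pairing, i.e. an implicit bias."""
    pooled_sd = statistics.stdev(compatible_rts + incompatible_rts)
    return (statistics.mean(incompatible_rts)
            - statistics.mean(compatible_rts)) / pooled_sd

# Hypothetical latencies (ms) before a VR session...
before = iat_d_score([620, 650, 600, 640], [780, 820, 760, 800])
# ...and after: a smaller block difference yields a smaller D score
after = iat_d_score([620, 650, 600, 640], [660, 700, 640, 680])
```

A reduction from `before` to `after` is the direction of effect the Coop group showed in the study.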

9.
Psychol Sci ; 29(11): 1868-1877, 2018 11.
Article in English | MEDLINE | ID: mdl-30285541

ABSTRACT

Closer objects are invariably perceived as bigger than farther ones and are therefore easier to detect and discriminate. This is so deeply grounded in our daily experience that no question has been raised as to whether the advantage for near objects depends on other features (e.g., depth itself). In a series of five experiments (N = 114), we exploited immersive virtual environments and visual illusions (i.e., Ponzo) to probe humans' perceptual abilities in depth and, specifically, in the space closely surrounding our body, termed peripersonal space. We reversed the natural distance scaling of size in favor of the farther object, which thus appeared bigger, to demonstrate a persistent shape-discrimination advantage for close objects. Psychophysical modeling further suggested a sigmoidal trend for this benefit, mirroring that found for multisensory estimates of peripersonal space. We argue that depth is a fundamental, yet overlooked, dimension of human perception and that future studies in vision and perception should be depth aware.
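The sigmoidal trend mentioned above is typically captured by fitting a logistic function of distance to the near-space benefit; a minimal sketch on made-up, noise-free data (the function names and parameter values below are illustrative assumptions, not the study's):

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(d, top, x0, k):
    """Logistic profile of the near-space benefit vs. distance d:
    maximal benefit `top` up close, midpoint `x0` (cm), steepness `k`."""
    return top / (1.0 + np.exp(k * (d - x0)))

# Hypothetical discrimination benefit (ms) at six viewing distances (cm);
# noise is omitted so the fit simply recovers the generating parameters.
distances = np.array([50.0, 100.0, 150.0, 200.0, 250.0, 300.0])
benefit = sigmoid(distances, 40.0, 150.0, 0.03)

params, _ = curve_fit(sigmoid, distances, benefit, p0=[30.0, 120.0, 0.02])
top, x0, k = params  # x0 estimates the transition distance of peripersonal space
```

With real, noisy data the fitted midpoint `x0` is the quantity usually taken as the behavioral extent of peripersonal space.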


Subject(s)
Distance Perception , Illusions , Personal Space , Space Perception , Adult , Female , Humans , Male , Models, Psychological , Reaction Time , Young Adult