1.
Q J Exp Psychol (Hove) ; : 17470218241256870, 2024 May 24.
Article in English | MEDLINE | ID: mdl-38785308

ABSTRACT

Visual objects in the peripersonal space (PPS) are perceived faster than farther objects appearing in the extrapersonal space (EPS), demonstrating preferential processing for visual stimuli near the body. Such an advantage should favor visual perceptual learning near, as compared to far from, the observer, but recent evidence from online testing protocols points in the opposite direction, showing larger perceptual learning in the far space. Here, we ran two laboratory-based experiments investigating whether visual training in the PPS and EPS has different effects. We used the horizontal Ponzo illusion to create a lateralized depth perspective while participants completed a visual search task in which they reported whether or not a specific target object orientation (e.g., a triangle pointing upward) was present among distractors. This task was completed before and after a one-hour training phase in either the (illusory) near or far space. In Experiment 1, the near space was in the left hemispace, whereas in Experiment 2 it was in the right. In both experiments, participants were more accurate after training in the far space, whereas training in the near space led to either improvement in the far space (Experiment 1) or no change (Experiment 2). Moreover, we found larger visual perceptual learning when stimuli were presented in the left compared with the right hemispace. Unlike visual processing, visual perceptual learning is thus more effective in the far space. We propose that depth is a key dimension that can be used to improve human visual learning.

2.
J Exp Psychol Hum Percept Perform ; 50(6): 626-635, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38635224

ABSTRACT

Intentional binding refers to the subjective temporal compression between a voluntary action and its subsequent sensory outcome. Despite some studies challenging the link between temporal compression and intentional action, intentional binding is still widely used as an implicit measure of the sense of agency. The debate remains unsettled primarily because the experimental conditions used in previous studies were confounded with various alternative causes of temporal compression, and action intention has not yet been tested comprehensively against all potential alternative causes in a single study. Here, we address this puzzle by jointly comparing participants' estimates of the interval between three types of triggering events with comparable predictability (voluntary movement, passive movement, and external sensory event) and an external sensory outcome (auditory or visual, across experiments). The results failed to show intentional binding; that is, interval estimates were no shorter in the voluntary than in the passive movement condition. Instead, we observed temporal (but not intentional) binding when comparing both movement conditions with the external sensory condition. Temporal binding thus appears to originate from sensory integration and temporal prediction, not from action intention. These findings underscore the need to reconsider the use of "intentional binding" as a reliable proxy for the sense of agency.


Subject(s)
Intention , Psychomotor Performance , Time Perception , Humans , Adult , Young Adult , Male , Female , Time Perception/physiology , Psychomotor Performance/physiology , Auditory Perception/physiology , Visual Perception/physiology , Motor Activity/physiology
3.
iScience ; 27(3): 109092, 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38405611

ABSTRACT

It has been suggested that our brain re-uses body-based computations to localize touch on tools, but the neural implementation of this process remains unclear. Neural oscillations in the alpha and beta frequency bands are known to map touch on the body in external and skin-centered coordinates, respectively. Here, we pinpointed the role of these oscillations during tool-extended sensing by delivering tactile stimuli to either participants' hands or the tips of hand-held rods. To disentangle brain responses related to each coordinate system, participants held their hands or tool tips either crossed or uncrossed at the body midline. We found that midline crossing modulated alpha-band (but not beta-band) activity similarly for hands and tools, involving a similar network of cortical regions in both cases. Our findings strongly suggest that the brain uses similar oscillatory mechanisms for mapping touch on the body and on tools, supporting the idea that body-based neural processes are repurposed for tool use.

4.
bioRxiv ; 2024 Feb 14.
Article in English | MEDLINE | ID: mdl-36711691

ABSTRACT

Implicit sensorimotor adaptation keeps our movements well-calibrated amid changes in the body and environment. We have recently postulated that implicit adaptation is driven by a perceptual error: the difference between the desired and perceived movement outcome. According to this perceptual re-alignment model, implicit adaptation ceases when the perceived movement outcome (a multimodal percept determined by a prior belief conveying the intended action, the motor command, and feedback from proprioception and vision) is aligned with the desired movement outcome. Here, we examined the role of proprioception in implicit motor adaptation and perceived movement outcome by testing individuals who lack proprioception. We used a modified visuomotor rotation task designed to isolate implicit adaptation and to probe perceived outcome throughout the experiment. Surprisingly, implicit adaptation and perceived outcome were minimally impacted by deafferentation, posing a challenge to the perceptual re-alignment model of implicit adaptation.
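
A minimal formal sketch of the perceptual re-alignment idea described above (our simplified formalization for illustration only; the weighting scheme, learning rate, and rotation size are made up, not values from the study): adaptation updates in proportion to the gap between the desired and perceived outcome, and the perceived outcome combines the intended goal with proprioceptive and (rotated) visual feedback.

    import numpy as np

    eta = 0.2        # assumed learning rate
    rotation = 15.0  # assumed visuomotor rotation, degrees
    # Assumed reliability weights of the multimodal percept:
    w_goal, w_prop, w_vis = 0.4, 0.3, 0.3

    x = 0.0  # hand angle relative to the target (adaptation state)
    for trial in range(50):
        proprioception = x            # felt hand direction
        vision = x + rotation         # cursor rotated away from the hand
        perceived = w_goal * 0.0 + w_prop * proprioception + w_vis * vision
        x -= eta * (perceived - 0.0)  # re-align percept with desired (0 deg)

    print(f"adaptation asymptote: {x:.1f} deg")

In this toy model, adaptation stops once the perceived outcome equals the desired one (here at -7.5 deg); removing the proprioceptive term changes the predicted asymptote, which is why the minimal effect of deafferentation reported here poses a challenge to the model.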

5.
Psychon Bull Rev ; 2023 Nov 06.
Article in English | MEDLINE | ID: mdl-37932577

ABSTRACT

Visual shape discrimination is faster for objects close to the body, in the peripersonal space (PPS), than for objects far from the body. Visual processing enhancement in the PPS occurs even when perceived depth is based on 2D pictorial cues, and this advantage has been observed from relatively low-level (detection, size, orientation) to high-level (face processing) visual features. While multisensory association also displays proximal advantages, whether the PPS influences visual perceptual learning remains unclear. Here, we investigated whether perceptual learning effects vary according to the distance of visual stimuli (near or far) from the observer, illusorily induced by leveraging the Ponzo illusion. Participants performed a visual search task in which they reported whether a specific target object orientation (e.g., a triangle pointing downward) was present among distractors. Performance was assessed before and after practicing the visual search task (30 minutes per day for 5 days) at either the close (near group) or far (far group) distance. Participants who trained in the near space did not improve. By contrast, participants who trained in the far space improved in the visual search task in both the far and near spaces. We suggest that this improvement following far training reflects a greater deployment of attention in the far space, which could make learning more effective and allow it to generalize across spaces.

6.
eNeuro ; 10(11)2023 Nov.
Article in English | MEDLINE | ID: mdl-37848289

ABSTRACT

It is often claimed that tools are embodied by their user, but whether the brain actually repurposes its body-based computations to perform similar tasks with tools is not known. A fundamental computation for localizing touch on the body is trilateration, in which the location of touch on a limb is computed by integrating estimates of the distance between the sensory input and the limb's boundaries (e.g., the elbow and wrist of the forearm). As evidence of this computational mechanism, tactile localization on a limb is most precise near its boundaries and least precise in the middle. Here, we show that the brain repurposes trilateration to localize touch on a tool, despite large differences in the initial sensory input compared with touch on the body. In a large sample of participants, we found that localizing touch on a tool produced the signature of trilateration, with highest precision close to the base and tip of the tool. A computational model of trilateration provided a good fit to the observed localization behavior. To further demonstrate the computational plausibility of repurposing trilateration, we implemented it in a three-layer neural network based on principles of probabilistic population coding. This network determined hit location in tool-centered coordinates by using a tool's unique pattern of vibrations when contacting an object. Simulations demonstrated the expected signature of trilateration, in line with the behavioral patterns. Our results have important implications for how trilateration may be implemented by somatosensory neural populations. We conclude that trilateration is likely a fundamental spatial computation that unifies limbs and tools.
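
To see why trilateration predicts highest precision at the boundaries, consider a minimal sketch (not the authors' model; the rod length, noise scaling, and all other values are our illustrative assumptions): two noisy distance estimates, one from each end of the rod, whose noise grows with the distance being estimated, are fused by inverse-variance weighting.

    import numpy as np

    rng = np.random.default_rng(0)
    rod_len = 60.0   # cm, hypothetical rod length
    k = 0.1          # assumed noise scaling (SD grows with estimated distance)

    for x in np.linspace(5, 55, 6):
        # Two noisy estimates of hit location, one from each boundary:
        from_base = rng.normal(x, k * x, 10_000)
        from_tip = rod_len - rng.normal(rod_len - x, k * (rod_len - x), 10_000)
        # Fuse with inverse-variance weights (optimal for Gaussian noise):
        w1, w2 = 1 / (k * x) ** 2, 1 / (k * (rod_len - x)) ** 2
        fused = (w1 * from_base + w2 * from_tip) / (w1 + w2)
        print(f"hit at {x:4.1f} cm -> fused SD = {fused.std():.2f} cm")

The printed SDs are smallest near the base and tip and largest mid-rod, i.e., the inverted-U trilateration signature reported in the behavioral data.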


Subject(s)
Touch Perception , Touch , Humans , Hand , Brain , Wrist
7.
Trends Hear ; 27: 23312165231182289, 2023.
Article in English | MEDLINE | ID: mdl-37611181

ABSTRACT

Lateralized sounds can orient visual attention, with benefits for audio-visual processing. Here, we asked to what extent perturbed auditory spatial cues, resulting from cochlear implants (CI) or unilateral hearing loss (uHL), still allow this automatic mechanism of information selection from the audio-visual environment. We used a classic paradigm from experimental psychology (capture of visual attention by sounds) to probe the integrity of audio-visual attentional orienting in 60 adults with hearing loss: bilateral CI users (N = 20), unilateral CI users (N = 20), and individuals with uHL (N = 20). For comparison, we also included a group of normal-hearing (NH, N = 20) participants, tested in binaural and monaural listening conditions (i.e., with one ear plugged). All participants also completed a sound localization task to assess spatial hearing skills. Comparable audio-visual orienting was observed in bilateral CI, uHL, and binaural NH participants. By contrast, audio-visual orienting was, on average, absent in unilateral CI users and reduced in NH participants listening with one ear plugged. Spatial hearing skills were better in bilateral CI, uHL, and binaural NH participants than in unilateral CI users and monaurally plugged NH listeners. In unilateral CI users, spatial hearing skills correlated with audio-visual orienting abilities. These novel results show that audio-visual attentional orienting can be preserved in bilateral CI users and uHL patients to a greater extent than in unilateral CI users. This highlights the importance of assessing the impact of hearing loss beyond auditory difficulties alone, to capture the extent to which it enables or impedes typical interactions with the multisensory environment.


Subject(s)
Cochlear Implantation , Cochlear Implants , Deafness , Hearing Loss, Unilateral , Hearing Loss , Sound Localization , Speech Perception , Adult , Humans , Cues , Hearing , Cochlear Implantation/methods
8.
Exp Brain Res ; 241(7): 1785-1796, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37222776

ABSTRACT

To protect the body against physical threats, it is important to integrate the somatic and extra-somatic inputs generated by these stimuli. Temporal synchrony is an important parameter determining multisensory interaction, and the time taken by a given sensory input to reach the brain depends on the length and conduction velocity of the specific pathways through which it is transmitted. Nociceptive inputs are transmitted through slowly conducting unmyelinated C fibers and thinly myelinated Aδ fibers. It was previously shown that, to perceive a visual stimulus and a thermo-nociceptive stimulus applied to the hand as coinciding in time, the nociceptive stimulus must precede the visual one by 76 ms for inputs conveyed by Aδ fibers and by 577 ms for inputs conveyed by C fibers. Since spatial proximity is also hypothesized to contribute to multisensory interaction, the present study investigated the effect of spatial congruence between visual and nociceptive stimuli. Participants judged the temporal order of visual and nociceptive stimuli, with the visual stimuli flashed either next to the stimulated hand or next to the opposite, unstimulated hand, and with nociceptive stimuli evoking responses mediated by either Aδ or C fibers. The amount of time by which the nociceptive stimulus had to precede the visual stimulus for the two to be perceived as concomitant was smaller when the visual stimulus occurred near the hand receiving the nociceptive stimulus than when it occurred near the contralateral hand. This illustrates the challenge the brain faces in processing the synchrony between nociceptive and non-nociceptive stimuli so that they can interact efficiently and optimize defensive reactions against physical danger.
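
These lead times are broadly consistent with simple back-of-envelope peripheral conduction delays. In the sketch below, the conduction path length and fiber velocities are our assumptions, chosen within commonly reported physiological ranges, not values taken from the study:

    # Back-of-envelope peripheral conduction delays (all values assumed).
    path_m = 1.0                # approximate hand-to-brain conduction distance
    v_c, v_adelta = 2.0, 13.0   # m/s, within reported ranges for C and A-delta
    print(f"C-fiber delay: {path_m / v_c * 1000:.0f} ms")       # ~500 ms
    print(f"A-delta delay: {path_m / v_adelta * 1000:.0f} ms")  # ~77 ms

With these assumed values, the difference between the two delays is on the order of the 577 ms versus 76 ms lead times reported above.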


Subject(s)
Nociception , Visual Perception , Humans , Visual Perception/physiology , Nociception/physiology , Hand , Brain
9.
Front Neurol ; 14: 1151515, 2023.
Article in English | MEDLINE | ID: mdl-37064179

ABSTRACT

Objectives: Virtual reality (VR) offers an ecological setting and the possibility of altering visual feedback during head movements, both useful for vestibular research and for the treatment of vestibular disorders. There are, however, no data quantifying the vestibulo-ocular reflex (VOR) during the head impulse test (HIT) in VR. The main objective of this study was to assess the feasibility and performance of eye- and head-movement measurements in healthy subjects in a VR environment during high-velocity horizontal head rotation (VR-HIT) under a normal visual feedback condition. The secondary objective was to establish the feasibility of VR-HIT recordings in the same group of normal subjects under altered visual feedback conditions. Design: Twelve healthy subjects underwent video HIT using both a standard setup (vHIT) and VR-HIT. In VR, eye and head positions were recorded using an embedded eye tracker and an infrared motion tracker, respectively. Subjects were tested under four conditions, one reproducing normal visual feedback and three simulating an altered gain or direction of visual feedback. During the three altered conditions, the movement of the visual scene relative to the head movement was decreased in amplitude by 50% (half), nullified (freeze), or inverted in direction (inverse). Results: Eye- and head-motion recording was successful during normal visual feedback as well as during all three altered conditions. There was no significant difference in VOR gain in VR-HIT between the normal, half, freeze, and inverse conditions. In the normal condition, VOR gain differed slightly but significantly (by 3%) between VR-HIT and vHIT. Duration and amplitude of head impulses were significantly greater in VR-HIT than in vHIT. In all three altered VR-HIT conditions, covert saccades were present in approximately one out of four trials. Conclusion: Our VR setup allowed high-quality recording of eye and head data during the head impulse test under normal and altered visual feedback conditions. This setup could be used to investigate compensation mechanisms in vestibular hypofunction, to elicit VOR adaptation in ecological settings, or to allow objective evaluation of VR-based vestibular rehabilitation.
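
For context, VOR gain during HIT is conventionally quantified as the ratio of eye velocity to head velocity over the impulse. A minimal sketch of one common variant (position traces differentiated, then the ratio of integrated velocities; the traces, sampling rate, and omission of de-saccading are our simplifications, not the authors' pipeline):

    import numpy as np

    def vor_gain(eye_pos_deg, head_pos_deg, fs_hz):
        """VOR gain as the ratio of integrated eye to head velocity during
        the impulse (one common convention; de-saccading omitted here)."""
        eye_vel = np.gradient(eye_pos_deg) * fs_hz   # deg/s
        head_vel = np.gradient(head_pos_deg) * fs_hz
        # The minus sign reflects that the VOR drives the eyes opposite to
        # the head; the sample spacing cancels in the ratio of the sums.
        return -np.sum(eye_vel) / np.sum(head_vel)

    # Hypothetical 150 ms, 20-degree impulse sampled at 250 Hz, with an eye
    # response at 97% of head velocity:
    fs = 250
    t = np.arange(0, 0.15, 1 / fs)
    head = 10 * (1 - np.cos(np.pi * t / 0.15))   # smooth 0 -> 20 deg rotation
    eye = -0.97 * head
    print(f"estimated VOR gain: {vor_gain(eye, head, fs):.2f}")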

10.
Eur Arch Otorhinolaryngol ; 280(8): 3661-3672, 2023 Aug.
Article in English | MEDLINE | ID: mdl-36905419

ABSTRACT

BACKGROUND AND PURPOSE: Use of a unilateral cochlear implant (UCI) is associated with limited spatial hearing skills, and evidence that these abilities can be trained in UCI users remains limited. In this study, we assessed whether spatial training based on hand-reaching to sounds performed in virtual reality improves spatial hearing abilities in UCI users. METHODS: Using a crossover randomized clinical trial, we compared the effects of a spatial training protocol with those of a non-spatial control training. We tested 17 UCI users in a head-pointing to sound task and in an audio-visual attention orienting task, before and after each training. The study is registered at clinicaltrials.gov (NCT04183348). RESULTS: During the spatial VR training, sound localization errors in azimuth decreased. Moreover, when comparing head-pointing to sounds before vs. after training, localization errors decreased more after the spatial than after the control training. No training effects emerged in the audio-visual attention orienting task. CONCLUSIONS: Sound localization in UCI users improves during spatial training, with benefits that extend to a non-trained sound localization task (generalization). These findings have potential for novel rehabilitation procedures in clinical contexts.


Subject(s)
Cochlear Implantation , Cochlear Implants , Sound Localization , Speech Perception , Humans , Hearing , Cochlear Implantation/methods , Hearing Tests/methods
11.
J Clin Med ; 12(6)2023 Mar 17.
Article in English | MEDLINE | ID: mdl-36983357

ABSTRACT

Unilateral hearing loss (UHL) alters binaural cues, resulting in a significant increase in spatial errors in the horizontal plane. In this study, nineteen patients with UHL were recruited and randomized in a crossover design into two groups: a first group (n = 9) received spatial audiovisual training in the first session and non-spatial audiovisual training in the second session (2 to 4 weeks after the first), and a second group (n = 10) received the same trainings in the opposite order (non-spatial and then spatial). A sound localization test using head-pointing (LOCATEST) was completed before and after each training session. The results showed a significant decrease in head-pointing localization errors after spatial training for group 1 (24.85° ± 15.8° vs. 16.17° ± 11.28°; p < 0.001). The number of head movements during the spatial training did not change across the 19 participants (p = 0.79); nonetheless, hand-pointing errors and reaction times significantly decreased by the end of the spatial training (p < 0.001). This study suggests that audiovisual spatial training can induce spatial adaptation to a monaural deficit through the optimization of effective head movements. Virtual reality systems are relevant tools that can be used in clinics to develop training programs for patients with hearing impairments.

12.
Ear Hear ; 44(1): 189-198, 2023.
Article in English | MEDLINE | ID: mdl-35982520

ABSTRACT

OBJECTIVES: We assessed whether spatial hearing training improves sound localization in bilateral cochlear implant (BCI) users and whether its benefits generalize to untrained sound localization tasks. DESIGN: In 20 BCI users, we assessed the effects of two training procedures (spatial versus nonspatial control training) on two different tasks performed before and after training (head-pointing to sound and audiovisual attention orienting). In the spatial training, participants identified sound position by reaching toward the sound sources with their hand; in the nonspatial training, comparable reaching movements served to identify sound amplitude modulations. A crossover randomized design allowed comparison of the training procedures within the same participants. Spontaneous head movements while listening to the sounds were allowed and tracked, to correlate them with localization performance. RESULTS: During spatial training, BCI users reduced their sound localization errors in azimuth and adapted their spontaneous head movements as a function of sound eccentricity. These effects generalized to the head-pointing sound localization task, as revealed by a greater reduction of sound localization error in azimuth and a more accurate first head-orienting response, compared with the control nonspatial training. BCI users benefited from auditory spatial cues for orienting visual attention, but the spatial training did not enhance this multisensory attention ability. CONCLUSIONS: Sound localization in BCI users improves with spatial reaching-to-sound training, with benefits to a nontrained sound localization task. These findings pave the way for novel rehabilitation procedures in clinical contexts.


Subject(s)
Cochlear Implantation , Cochlear Implants , Sound Localization , Humans , Auditory Perception/physiology , Cochlear Implantation/methods , Hearing/physiology , Hearing Tests/methods , Sound Localization/physiology , Cross-Over Studies
13.
Front Hum Neurosci ; 16: 1026056, 2022.
Article in English | MEDLINE | ID: mdl-36310849

ABSTRACT

Moving the head while a sound is playing improves its localization in human listeners, both children and adults, with or without hearing problems. It remains to be ascertained whether this benefit also extends to aging adults with hearing loss, a population in which spatial hearing difficulties are often documented and intervention solutions are scant. Here we examined the performance of elderly adults (61-82 years old) with symmetrical or asymmetrical age-related hearing loss while they localized sounds with their head fixed or free to move. Using motion tracking in combination with free-field sound delivery in visual virtual reality, we tested participants in two auditory spatial tasks: front-back discrimination and 3D sound localization in front space. Front-back discrimination was easier for participants with symmetrical compared with asymmetrical hearing loss, yet both groups reduced their front-back errors when head movements were allowed. In 3D sound localization, free head movements reduced errors in the horizontal dimension and in a composite measure of error in 3D space. Errors in 3D space decreased for participants with asymmetrical hearing impairment when the head was free to move. These preliminary findings extend to aging adults with hearing loss the literature on the advantages of head movements for sound localization, and suggest that the disparity of auditory cues at the two ears can modulate this benefit. The results point to the possibility of exploiting self-regulation strategies and active behavior when promoting spatial hearing skills.

14.
Cereb Cortex Commun ; 3(3): tgac031, 2022.
Article in English | MEDLINE | ID: mdl-36072709

ABSTRACT

We constantly face situations involving interactions with others that require us to automatically adjust our physical distance to avoid discomfort or anxiety. A previous case study demonstrated that the integrity of both amygdalae is essential to regulate interpersonal distance. Although unilateral lesions to the amygdala, as to other sectors of the medial temporal cortex, are known to affect social behavior, their role in the regulation of interpersonal distance has never been investigated. Here, we sought to fill this gap by testing three patients with unilateral temporal lesions following surgical resection, including one patient with a lesion mainly centered on the amygdala and two with lesions to the adjacent medial temporal cortex, on two versions of the stop-distance paradigm (in a virtual reality environment and in a real setting). Our results showed that all three patients set shorter interpersonal distances than neurotypical controls. In addition, unlike controls, none of the patients adjusted their physical distance depending on facial emotional expressions, despite a preserved ability to categorize them. Finally, the patients' heart rate responses to approaching faces differed from those of controls. Our findings provide compelling evidence that unilateral lesions within the medial temporal cortex, not necessarily restricted to the amygdala, are sufficient to alter interpersonal distance, shedding new light on the neural circuitry regulating distance in social interactions.

15.
PLoS One ; 17(4): e0263509, 2022.
Article in English | MEDLINE | ID: mdl-35421095

ABSTRACT

Localising sounds means being able to process auditory cues deriving from the interplay among sound waves, the head, and the ears. When auditory cues change because of temporary or permanent hearing loss, sound localization becomes difficult and uncertain. The brain can adapt to altered auditory cues throughout life, and multisensory training can promote the relearning of spatial hearing skills. Here, we studied the training potential of sound-oriented motor behaviour, testing whether training based on manual actions toward sounds can produce learning effects that generalize to different auditory spatial tasks. We assessed spatial hearing relearning in normal-hearing adults with a plugged ear, using visual virtual reality and body motion tracking. Participants performed two auditory tasks that entail explicit and implicit processing of sound position (head-pointing sound localization and audio-visual attention cueing, respectively), before and after a spatial training session in which they identified sound position by reaching to auditory sources nearby. Using a crossover design, the effects of this spatial training were compared with a control condition involving the same physical stimuli but different task demands (a non-spatial discrimination of amplitude modulations in the sound). Spatial hearing in ear-plugged participants improved more after the reaching-to-sounds training than after the control condition. Training by reaching also modified head-movement behaviour during listening. Crucially, the improvements observed during training generalized to a different sound localization task, possibly as a consequence of newly acquired head-movement strategies.


Subject(s)
Cues , Sound Localization , Acoustic Stimulation , Adaptation, Physiological , Adult , Auditory Perception , Cross-Over Studies , Hearing , Humans
16.
J Cogn Neurosci ; 34(4): 675-686, 2022 03 05.
Article in English | MEDLINE | ID: mdl-35061032

ABSTRACT

The sense of touch is not restricted to the body but can also extend to external objects. When we use a handheld tool to contact an object, we feel the touch on the tool, not in the hand holding it. The ability to perceive touch on a tool actually extends along its entire surface, allowing the user to localize where it is touched much as they would on their own body. Although the neural mechanisms underlying the ability to localize touch on the body have been extensively investigated, those underlying touch localization on a tool are still unknown. We aimed to fill this gap by recording the electroencephalography signal of participants while they localized tactile stimuli on a handheld rod. We focused on oscillatory activity in the alpha (7-14 Hz) and beta (15-30 Hz) ranges, as these have previously been linked to distinct spatial codes used to localize touch on the body: beta activity reflects the mapping of touch in skin-based coordinates, whereas alpha activity reflects the mapping of touch in external space. We found that alpha activity alone was modulated by the location of tactile stimuli applied to a handheld rod. Source reconstruction suggested that this alpha power modulation was localized in a network of fronto-parietal regions previously implicated in higher-order tactile and spatial processing. These findings are the first to implicate alpha oscillations in tool-extended sensing and suggest an important role for processing touch in external space when localizing touch on a tool.
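
For readers unfamiliar with band-limited analyses, a minimal sketch of one standard way to extract alpha-band (7-14 Hz) and beta-band (15-30 Hz) amplitude from a single EEG channel (band-pass filter plus Hilbert envelope, via scipy; the sampling rate and data are placeholders, and this is not the authors' analysis pipeline):

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def band_envelope(eeg, fs, low, high, order=4):
        """Band-pass filter one EEG channel, return its amplitude envelope."""
        b, a = butter(order, [low, high], btype="bandpass", fs=fs)
        filtered = filtfilt(b, a, eeg)    # zero-phase band-pass
        return np.abs(hilbert(filtered))  # instantaneous amplitude

    # Placeholder data: 2 s of noise at 500 Hz with a 10 Hz (alpha) component.
    fs = 500
    t = np.arange(0, 2, 1 / fs)
    rng = np.random.default_rng(0)
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

    alpha_amp = band_envelope(eeg, fs, 7, 14)
    beta_amp = band_envelope(eeg, fs, 15, 30)
    print(f"mean alpha amplitude: {alpha_amp.mean():.2f}")
    print(f"mean beta amplitude: {beta_amp.mean():.2f}")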


Subject(s)
Spatial Processing , Touch Perception , Hand , Humans , Parietal Lobe , Space Perception , Touch
17.
Proc Natl Acad Sci U S A ; 119(1)2022 01 04.
Article in English | MEDLINE | ID: mdl-34983835

ABSTRACT

Perhaps the most recognizable sensory map in all of neuroscience is the somatosensory homunculus. Although it seems straightforward, this simple representation belies the complex link between an activation in a somatotopic map and the associated touch location on the body. Any isolated activation is spatially ambiguous without a neural decoder that can read its position within the entire map, but how this is computed by neural networks is unknown. We propose that the somatosensory system implements multilateration, a common computation used by surveying and global positioning systems to localize objects. Specifically, to decode touch location on the body, multilateration estimates the relative distance between the afferent input and the boundaries of a body part (e.g., the joints of a limb). We show that a simple feedforward neural network, which captures several fundamental receptive field properties of cortical somatosensory neurons, can implement a Bayes-optimal multilateral computation. Simulations demonstrated that this decoder produced a pattern of localization variability between two boundaries that was unique to multilateration. Finally, we identify this computational signature of multilateration in actual psychophysical experiments, suggesting that it is a candidate computational mechanism underlying tactile localization.
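
In equation form (our notation; a standard two-cue Gaussian fusion sketch, not copied from the paper): given two location estimates x_1 and x_2 derived from the distances to the two boundaries, with noise standard deviations sigma_1 and sigma_2 that grow with those distances, the Bayes-optimal combination is

    \hat{x} = \frac{\sigma_2^{2}\, x_1 + \sigma_1^{2}\, x_2}{\sigma_1^{2} + \sigma_2^{2}},
    \qquad
    \operatorname{Var}(\hat{x}) = \frac{\sigma_1^{2}\, \sigma_2^{2}}{\sigma_1^{2} + \sigma_2^{2}}.

Var(x-hat) vanishes as either sigma does, i.e., at the boundaries, and peaks between them, reproducing the inverted-U pattern of localization variability identified above as the computational signature of multilateration.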


Subject(s)
Neural Networks, Computer , Touch Perception/physiology , Touch/physiology , Adult , Animals , Bayes Theorem , Brain Mapping , Female , Humans , Mice , Models, Neurological , Neurons/physiology , Physical Stimulation , Somatosensory Cortex/physiology , Young Adult
18.
Ear Hear ; 43(1): 192-205, 2022.
Article in English | MEDLINE | ID: mdl-34225320

ABSTRACT

OBJECTIVES: The aim of this study was to assess three-dimensional (3D) spatial hearing abilities in the reaching space of children and adolescents fitted with bilateral cochlear implants (BCI). The study also investigated the impact of spontaneous head movements on sound localization abilities. DESIGN: BCI children (N = 18, aged between 8 and 17) and age-matched normal-hearing (NH) controls (N = 18) took part in the study. Tests were performed using immersive virtual reality equipment that allowed control over visual information and initial eye position, as well as real-time 3D motion tracking of head and hand position with subcentimeter accuracy. The experiment exploited these technical features to achieve trial-by-trial exact positioning, in head-centered coordinates, of a single loudspeaker used for real, near-field sound delivery, reproducible across trials and participants. Using this novel approach, broadband sounds were delivered at different azimuths within the participants' arm length, in front and back space, at two different distances from the head. Continuous head monitoring allowed us to compare two listening conditions: "head immobile" (no head movements allowed) and "head moving" (spontaneous head movements allowed). Sound localization performance was assessed by computing the mean 3D error (i.e., the distance in space between the X-Y-Z position of the loudspeaker and the participant's final hand position used to indicate the perceived location of the sound source), as well as the percentage of front-back and left-right confusions in azimuth, and the discriminability between two nearby distances. Several clinical factors (age at test, interimplant interval, and duration of binaural experience) were also correlated with the mean 3D error. Finally, the Speech Spatial and Qualities of Hearing Scale was administered to BCI participants and their parents. RESULTS: Although BCI participants distinguished well between left and right sound sources, near-field spatial hearing remained challenging, particularly under the "head immobile" condition. Without visual priors of the sound position, response accuracy was lower than that of their NH peers, as evidenced by the mean 3D error (BCI: 55 cm, NH: 24 cm, p = 0.008). The BCI group mainly pointed along the interaural axis, corresponding to the position of their CI microphones, which led to frequent front-back confusions (44.6%). Distance discrimination also remained challenging for BCI users, mostly because of the sound compression applied by their processors. Notably, BCI users benefitted from head movements under the "head moving" condition, with a significant decrease of the 3D error when pointing to front targets (p < 0.001). Interimplant interval was correlated with the 3D error (p < 0.001), whereas no correlation with self-assessed spatial hearing difficulties emerged (p = 0.9). CONCLUSIONS: In reaching space, BCI children and adolescents are able to extract enough auditory cues to discriminate sound side. However, without visual cues or spontaneous head movements during sound emission, their localization abilities are substantially impaired for front-back and distance discrimination. Exploring the environment with head movements was a valuable strategy for improving sound localization in individuals with different clinical backgrounds. These novel findings could open new perspectives for understanding the maturation of sound localization in BCI children and, more broadly, in patients with hearing loss.
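
As a concrete illustration of the mean 3D error metric defined above (a minimal sketch; the coordinates are made up, not data from the study):

    import numpy as np

    # Hypothetical head-centered X-Y-Z positions in cm, one row per trial.
    speaker = np.array([[30.0, 10.0, 0.0],
                        [-25.0, 5.0, 10.0],
                        [40.0, -10.0, 5.0]])
    hand = np.array([[55.0, 20.0, 5.0],
                     [-10.0, 0.0, 15.0],
                     [35.0, -5.0, 0.0]])

    # Per-trial 3D error = Euclidean distance between the loudspeaker and
    # the final hand position; the mean across trials is the summary metric.
    errors = np.linalg.norm(speaker - hand, axis=1)
    print(f"per-trial 3D errors (cm): {np.round(errors, 1)}")
    print(f"mean 3D error (cm): {errors.mean():.1f}")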


Subject(s)
Cochlear Implantation , Cochlear Implants , Hearing Loss , Sound Localization , Speech Perception , Adolescent , Child , Cochlear Implantation/methods , Head Movements , Hearing , Humans
19.
J Exp Psychol Gen ; 151(4): 872-884, 2022 Apr.
Article in English | MEDLINE | ID: mdl-34694859

ABSTRACT

The conscious body image includes the visual representation of body-part size; whether this component of body perception can flexibly adapt to changes in the sense of ownership of one's body parts remains to be demonstrated. The present study addresses this issue, showing that ownership of a novel hand affects the conscious visual perception of the size of one's real hand. Through a series of experiments in healthy adults, we assessed how the embodiment of fake hands of different sizes (via the Rubber Hand Illusion, RHI) affects visual size estimation of one's own hand. Our results demonstrate that the embodiment of a fake hand bigger than one's own (Experiment 1), but not of a smaller fake hand (Experiment 2), affects the perceived similarity in size between the own hand and a visual model of it, with a tendency to overestimate the size of the hand exposed to the RHI (Experiment 1). The illusory ownership of a bigger hand does not affect visual estimates of object size (Experiment 3). These findings reveal the tight link between the body image and the sense of ownership, the latter being able to change stored representations of body-part size. This evidence might pave the way for restoring pathological alterations of body image through strategies accessing the body schema.


Subject(s)
Body Image , Body Size , Ownership , Visual Perception , Adult , Body Image/psychology , Hand , Humans , Illusions
20.
Eur J Neurosci ; 55(1): 189-200, 2022 01.
Article in English | MEDLINE | ID: mdl-34796553

ABSTRACT

Reorganization of the sensorimotor cortex following permanent (e.g., amputation) or temporary (e.g., local anaesthesia) deafferentation of the hand has revealed large-scale plastic changes between the hand and face representations that are accompanied by perceptual correlates. The physiological mechanisms underlying this reorganization remain poorly understood. The aim of this study was to investigate sensorimotor interactions between the face and hand using an afferent inhibition transcranial magnetic stimulation protocol in which the motor evoked potential elicited by the magnetic pulse is inhibited when it is preceded by an afferent stimulus. We hypothesized that if face and hand representations in the sensorimotor cortex are functionally coupled, then electrocutaneous stimulation of the face would inhibit hand muscle motor responses. In two separate experiments, we delivered an electrocutaneous stimulus to either the skin over the right upper lip (Experiment 1) or right cheek (Experiment 2) and recorded muscular activity from the right first dorsal interosseous. Both lip and cheek stimulation inhibited right first dorsal interosseous motor evoked potentials. To investigate the specificity of this effect, we conducted two additional experiments in which electrocutaneous stimulation was applied to either the right forearm (Experiment 3) or right upper arm (Experiment 4). Forearm and upper arm stimulation also significantly inhibited the right first dorsal interosseous motor evoked potentials, but this inhibition was less robust than the inhibition associated with face stimulation. These findings provide the first evidence for face-to-hand afferent inhibition.


Subject(s)
Motor Cortex , Electric Stimulation , Evoked Potentials, Motor/physiology , Hand/physiology , Motor Cortex/physiology , Muscle, Skeletal/physiology , Neural Inhibition/physiology , Transcranial Magnetic Stimulation