Results 1 - 2 of 2
1.
Exp Brain Res; 240(11): 2817-2833, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36071210

ABSTRACT

In everyday life, sound localization entails more than just the extraction and processing of auditory cues. When determining sound position in three dimensions, the brain also considers the available visual information (e.g., visual cues to sound position) and resolves perceptual ambiguities through active listening behavior (e.g., spontaneous head movements while listening). Here, we examined to what extent spontaneous head movements improve sound localization in 3D (azimuth, elevation, and depth) by comparing static vs. active listening postures. To this aim, we developed a novel approach to sound localization based on sounds delivered in the environment and brought into alignment by means of a VR system. Our system proved effective for delivering sounds at predetermined and repeatable positions in 3D space, without imposing a physically constrained posture and with minimal training. In addition, it allowed participant behavior (hand, head, and eye position) to be measured in real time. We report that active listening improved 3D sound localization, primarily by improving the accuracy and reducing the variability of responses in azimuth and elevation. The more spontaneous head movements participants made, the better their 3D sound localization performance. Thus, we provide proof of concept for a novel approach to the study of spatial hearing, with potential for clinical and industrial applications.


Subject(s)
Sound Localization, Humans, Sound Localization/physiology, Auditory Perception/physiology, Hearing/physiology, Head Movements, Cues
2.
Psychon Bull Rev; 28(6): 1894-1905, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34159525

ABSTRACT

Peripersonal space (PPS) is a multisensory representation of the space near body parts that facilitates interactions with the close environment. Studies on non-human and human primates agree in showing that PPS is a body part-centered representation that guides actions. Because of these characteristics, growing confusion surrounds peripersonal space and arm-reaching space (ARS), that is, the space one's arm can reach. Despite neuroanatomical evidence favoring their distinction, no study has directly contrasted their respective extents and behavioral features. Here, in five experiments (N = 140), we found that PPS differs from ARS, as evidenced both by participants' spatial and temporal performance and by its modeling. We mapped PPS and ARS using both their respective gold-standard tasks and a novel multisensory facilitation paradigm. Results show that: (1) PPS is smaller than ARS; (2) multivariate analyses of spatial patterns of multisensory facilitation predict participants' hand locations within ARS; and (3) the multisensory facilitation map shifts isomorphically with hand position, revealing hand-centered coding of PPS and thus pointing to a functional similarity to the receptive fields of monkeys' multisensory neurons. A control experiment further corroborated these results and ruled out the orienting of attention as the mechanism driving the increased multisensory facilitation near the hand. In sharp contrast, ARS mapping yields a larger spatial extent, with indistinguishable patterns across hand positions, cross-validating the conclusion that PPS and ARS are distinct spatial representations. These findings call for a refinement of theoretical models of PPS, which is relevant to constructs as diverse as self-representation, social interpersonal distance, and motor control.


Subject(s)
Personal Space, Space Perception, Body Image, Hand, Neurons