Results 1 - 19 of 19
1.
Nat Commun ; 13(1): 2489, 2022 05 05.
Article in English | MEDLINE | ID: mdl-35513362

ABSTRACT

Neural mechanisms that arbitrate between integrating and segregating multisensory information are essential for complex scene analysis and for the resolution of the multisensory correspondence problem. However, these mechanisms and their dynamics remain largely unknown, partly because classical models of multisensory integration are static. Here, we used the Multisensory Correlation Detector, a model that provides good explanatory power for human behavior while incorporating dynamic computations. Participants judged whether sequences of auditory and visual signals originated from the same source (causal inference) or whether one modality was leading the other (temporal order), while being recorded with magnetoencephalography. First, we confirmed that the Multisensory Correlation Detector explains causal inference and temporal order behavioral judgments well. Second, we found strong fits of brain activity to the two outputs of the Multisensory Correlation Detector in temporo-parietal cortices. Finally, we report an asymmetry in the goodness of the fits, which were more reliable during the causal inference task than during the temporal order judgment task. Overall, our results suggest the existence of multisensory correlation detectors in the human brain, which explain why and how causal inference is strongly driven by the temporal correlation of multisensory signals.


Subject(s)
Auditory Perception , Visual Perception , Acoustic Stimulation , Brain , Humans , Magnetoencephalography , Parietal Lobe , Photic Stimulation
2.
J Neurophysiol ; 126(4): 1375-1390, 2021 Oct 01.
Article in English | MEDLINE | ID: mdl-34495782

ABSTRACT

Besides providing information on elementary properties of objects, such as texture, roughness, and softness, the sense of touch is also important in building a representation of object movement and the movement of our hands. Neural and behavioral studies have shed light on the mechanisms and limits of our sense of touch in the perception of texture and motion, and on its role in the control of hand movements. This article discusses the interplay between the geometrical and mechanical properties of touched objects, such as shape and texture, the movement of the hand exploring the object, and the motion felt by touch. Interestingly, the interaction between motion and texture can generate perceptual illusions in touch. For example, the orientation and spacing of the texture elements on a static surface induce the illusion of surface motion when we move our hand over it, or can elicit the perception of a curved trajectory during sliding, straight hand movements. In this work, we present a multiperspective view that encompasses both the perceptual and the motor aspects, as well as the response of peripheral and central nerve structures, to analyze and better understand the complex mechanisms underpinning the tactile representation of texture and motion. A better understanding of the spatiotemporal features of the tactile stimulus can reveal novel transdisciplinary applications in neuroscience and haptics.


Subject(s)
Illusions/physiology , Motion Perception/physiology , Somatosensory Cortex/physiology , Touch Perception/physiology , Touch/physiology , Humans
3.
Proc Biol Sci ; 284(1858)2017 Jul 12.
Article in English | MEDLINE | ID: mdl-28701558

ABSTRACT

Unlike in vision, the mechanisms underlying auditory motion perception are poorly understood. Here we describe an auditory motion illusion revealing a novel cue to auditory speed perception: the temporal frequency of amplitude modulation (AM-frequency), typical of rattling sounds. In nature, corrugated objects sliding across each other generate rattling sounds whose AM-frequency tends to correlate directly with speed. We found that AM-frequency modulates auditory speed perception in a highly systematic fashion: moving sounds with higher AM-frequency are perceived as moving faster than sounds with lower AM-frequency. Even more interestingly, sounds with higher AM-frequency also induce stronger motion aftereffects. This reveals the existence of specialized neural mechanisms for auditory motion perception that are sensitive to AM-frequency. Thus, in spatial hearing, the brain successfully capitalizes on the AM-frequency of rattling sounds to estimate the speed of moving objects. This tightly parallels previous findings in motion vision, where the spatiotemporal frequency of moving displays systematically affects both speed perception and the magnitude of motion aftereffects. Such an analogy with vision suggests that motion detection may rely on canonical computations, with similar neural mechanisms shared across the different modalities.


Subject(s)
Acoustic Stimulation , Auditory Perception , Cues , Motion Perception , Hearing , Humans , Sound
4.
PLoS Comput Biol ; 13(7): e1005546, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28692700

ABSTRACT

Sensory information about the state of the world is generally ambiguous. Understanding how the nervous system resolves such ambiguities to infer the actual state of the world is a central quest for sensory neuroscience. However, the computational principles of perceptual disambiguation are still poorly understood: what drives perceptual decision-making between multiple equally valid solutions? Here we investigate how humans gather and combine sensory information, within and across modalities, to disambiguate motion perception in an ambiguous audiovisual display, where two moving stimuli could appear either as streaming through or as bouncing off each other. By combining psychophysical classification tasks with reverse correlation analyses, we identified the particular spatiotemporal stimulus patterns that elicit a stream or a bounce percept, respectively. From that, we developed and tested a computational model for uni- and multisensory perceptual disambiguation that tightly replicates human performance. Specifically, disambiguation relies on knowledge of prototypical bouncing events that contain characteristic patterns of motion energy in the dynamic visual display. Next, the visual information is linearly integrated with auditory cues and prior knowledge about the history of recent perceptual interpretations. What is more, we demonstrate that perceptual decision-making with ambiguous displays is systematically driven by noise, whose random patterns not only promote alternation, but also provide signal-like information that biases perception in a highly predictable fashion.


Subject(s)
Auditory Perception/physiology , Decision Making/physiology , Visual Perception/physiology , Acoustic Stimulation , Adult , Algorithms , Computational Biology , Female , Humans , Male , Models, Psychological , Photic Stimulation , Psychophysics , Young Adult
6.
Multisens Res ; 29(1-3): 7-28, 2016.
Article in English | MEDLINE | ID: mdl-27311289

ABSTRACT

Crossmodal correspondences refer to the systematic associations often found across seemingly unrelated sensory features from different sensory modalities. Such phenomena constitute a universal trait of multisensory perception even in non-human species, and seem to result, at least in part, from the adaptation of sensory systems to natural scene statistics. Despite recent developments in the study of crossmodal correspondences, there are still a number of standing questions about their definition, their origins, their plasticity, and their underlying computational mechanisms. In this paper, I will review such questions in the light of current research on sensory cue integration, where crossmodal correspondences can be conceptualized in terms of natural mappings across different sensory cues that are present in the environment and learnt by the sensory systems. Finally, I will provide some practical guidelines for the design of experiments that might shed new light on crossmodal correspondences.


Subject(s)
Association , Attention , Perception , Cues , Humans , Psychophysics
7.
Nat Commun ; 7: 11543, 2016 06 06.
Article in English | MEDLINE | ID: mdl-27265526

ABSTRACT

The brain efficiently processes multisensory information by selectively combining related signals across the continuous stream of multisensory inputs. To do so, it needs to detect correlation, lag and synchrony across the senses; optimally integrate related information; and dynamically adapt to spatiotemporal conflicts across the senses. Here we show that all these aspects of multisensory perception can be jointly explained by postulating an elementary processing unit akin to the Hassenstein-Reichardt detector-a model originally developed for visual motion perception. This unit, termed the multisensory correlation detector (MCD), integrates related multisensory signals through a set of temporal filters followed by linear combination. Our model can tightly replicate human perception as measured in a series of empirical studies, both novel and previously published. MCDs provide a unified general theory of multisensory processing, which simultaneously explains a wide spectrum of phenomena with a simple, yet physiologically plausible model.
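The filter-and-combine architecture described above (temporal filters on each modality followed by a simple combination) can be sketched in a few lines. The following toy model is only an illustration of the idea: the exponential filters, the time constants, and the exact subunit arithmetic are assumptions made for the sketch, not the published parameters of the MCD.

```python
import numpy as np

def lowpass(signal, tau, dt=0.001):
    """Causal exponential low-pass filter with time constant tau (seconds)."""
    alpha = dt / (tau + dt)
    out = np.zeros_like(signal, dtype=float)
    for i in range(1, len(signal)):
        out[i] = out[i - 1] + alpha * (signal[i] - out[i - 1])
    return out

def mcd(audio, video, tau_fast=0.1, tau_slow=0.2, dt=0.001):
    """Toy multisensory correlation detector: two mirror-symmetric subunits
    multiply differently filtered versions of the auditory and visual inputs."""
    a_fast, a_slow = lowpass(audio, tau_fast, dt), lowpass(audio, tau_slow, dt)
    v_fast, v_slow = lowpass(video, tau_fast, dt), lowpass(video, tau_slow, dt)
    u1 = a_slow * v_fast            # subunit tuned to audio-leading patterns
    u2 = a_fast * v_slow            # subunit tuned to video-leading patterns
    correlation = np.mean(u1 * u2)  # output usable for causal inference
    lag = np.mean(u2 - u1)          # signed output usable for temporal order
    return correlation, lag
```

Fed with two pulse trains, the correlation output peaks when the trains share temporal structure and drops as they are shifted apart, while the lag output changes sign with the leading modality; this is the qualitative behavior the abstract attributes to the MCD.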


Subject(s)
Auditory Perception/physiology , Sensation , Visual Perception/physiology , Acoustic Stimulation , Adult , Computer Simulation , Cues , Female , Humans , Judgment , Male , Models, Neurological , Photic Stimulation , Reproducibility of Results , Time Factors , Young Adult
8.
Sci Rep ; 5: 14054, 2015 Sep 15.
Article in English | MEDLINE | ID: mdl-26370720

ABSTRACT

Perception can often be described as a statistically optimal inference process whereby noisy and incomplete sensory evidence is combined with prior knowledge about natural scene statistics. Previous evidence has shown that humans tend to underestimate the speed of unreliable moving visual stimuli. This finding has been interpreted in terms of a Bayesian prior favoring low speed, given that in natural visual scenes objects are mostly stationary or slowly moving. Here we investigated whether an analogous tendency to underestimate speed also occurs in audition: even if the statistics of the visual environment seem to favor low speed, the statistics of the stimuli reaching the individual senses may differ across modalities, potentially leading to different priors. We observed a systematic bias toward underestimating the speed of unreliable moving sounds. This finding suggests the existence of a slow-motion prior in audition, analogous to the one previously found in vision. The nervous system might encode the overall statistics of the world, rather than the specific properties of the signals reaching the individual senses.
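The slow-motion prior described above can be illustrated with a one-line conjugate Gaussian update. This is a generic textbook sketch, not the model fitted in the study: the Gaussian likelihood and prior, and any specific variances, are illustrative assumptions.

```python
def bayes_speed_estimate(measured_speed, sigma_meas, sigma_prior):
    """Posterior-mean speed under a Gaussian likelihood centered on the
    measurement and a zero-mean Gaussian 'slow-motion' prior."""
    w = sigma_prior**2 / (sigma_prior**2 + sigma_meas**2)
    return w * measured_speed  # shrinks toward zero speed
```

The less reliable the sensory measurement (the larger `sigma_meas`), the more the estimate is pulled toward zero speed, which qualitatively reproduces the underestimation of unreliable moving stimuli reported for both vision and audition.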

9.
Exp Brain Res ; 233(6): 1921-9, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25850405

ABSTRACT

Watching self-generated unilateral hand movements reflected in a mirror (oriented along the midsagittal plane) enhances the excitability of the primary motor cortex (M1) ipsilateral to the moving hand of the observer. Mechanisms detecting sensory-motor conflicts generated by the mirror reflection of such movements might mediate this effect; if so, cortical excitability should be modulated by the magnitude of the sensory-motor conflict. To this end, we explored the modulatory effects of altered visual feedback on M1 excitability in a mirror-box-like paradigm by increasing or decreasing the speed of the observed movement. Healthy subjects performed movements with their left index finger while watching a video of a hand superimposed on their static right hand, which was hidden from view. The hand observed in the video executed the same movement as the observer's left hand, but at a slower, equal, or faster pace. Motor evoked potentials (MEPs) induced by transcranial magnetic stimulation were measured from the first dorsal interosseous and the abductor digiti minimi of the participant's hidden resting hand. The excitability of the M1 ipsilateral to the moving hand was systematically modulated by the speed of the observed hand movement: the slower the observed movement, the greater the MEP amplitude in both muscles. This evidence shows that the magnitude of visual-motor conflicts can be used to adjust the activity of the observer's motor system. Hence, an appropriate alteration of the visual feedback, here a reduction in movement speed, may be useful to increase its modulatory effect on motor cortical excitability.


Subject(s)
Evoked Potentials, Motor/physiology , Feedback, Sensory/physiology , Functional Laterality/physiology , Motor Cortex/physiology , Movement/physiology , Adult , Analysis of Variance , Electromyography , Female , Fourier Analysis , Hand/innervation , Humans , Male , Photic Stimulation , Reaction Time , Transcranial Magnetic Stimulation , Young Adult
10.
Proc Natl Acad Sci U S A ; 111(16): 6104-8, 2014 Apr 22.
Article in English | MEDLINE | ID: mdl-24711409

ABSTRACT

Human perception, cognition, and action are laced with seemingly arbitrary mappings. In particular, sound has a strong spatial connotation: Sounds are high and low, melodies rise and fall, and pitch systematically biases perceived sound elevation. The origins of such mappings are unknown. Are they the result of physiological constraints, do they reflect natural environmental statistics, or are they truly arbitrary? We recorded natural sounds from the environment, analyzed the elevation-dependent filtering of the outer ear, and measured frequency-dependent biases in human sound localization. We find that auditory scene statistics reveal a clear mapping between frequency and elevation. Perhaps more interestingly, this natural statistical mapping is tightly mirrored in both ear-filtering properties and in perceived sound location. This suggests that both sound localization behavior and ear anatomy are fine-tuned to the statistics of natural auditory scenes, likely providing the basis for the spatial connotation of human hearing.


Subject(s)
Acoustic Stimulation , Auditory Perception/physiology , Hearing/physiology , Adult , Female , Humans , Male , Space Perception/physiology , Young Adult
11.
PLoS One ; 9(3): e91688, 2014.
Article in English | MEDLINE | ID: mdl-24621793

ABSTRACT

Our body is made of flesh and bones. We know it, and in our daily lives all the senses constantly provide converging information about this simple, factual truth. But is this always the case? Here we report a surprising bodily illusion demonstrating that humans rapidly update their assumptions about the material qualities of their body, based on their recent multisensory perceptual experience. To induce a misperception of the material properties of the hand, we repeatedly gently hit participants' hand with a small hammer, while progressively replacing the natural sound of the hammer against the skin with the sound of a hammer hitting a piece of marble. After five minutes, the hand started feeling stiffer, heavier, harder, less sensitive, unnatural, and showed enhanced Galvanic skin response (GSR) to threatening stimuli. Notably, such a change in skin conductivity positively correlated with changes in perceived hand stiffness. Conversely, when hammer hits and impact sounds were temporally uncorrelated, participants did not spontaneously report any changes in the perceived properties of the hand, nor did they show any modulation in GSR. In two further experiments, we ruled out that mere audio-tactile synchrony is the causal factor triggering the illusion, further demonstrating the key role of material information conveyed by impact sounds in modulating the perceived material properties of the hand. This novel bodily illusion, the 'Marble-Hand Illusion', demonstrates that the perceived material of our body, surely the most stable attribute of our bodily self, can be quickly updated through multisensory integration.


Subject(s)
Calcium Carbonate , Hand/physiology , Illusions/physiology , Adult , Humans , Male , Mechanical Phenomena , Skin , Surveys and Questionnaires , Touch Perception , Young Adult
12.
Multisens Res ; 26(3): 307-16, 2013.
Article in English | MEDLINE | ID: mdl-23964482

ABSTRACT

Humans are equipped with multiple sensory channels that provide both redundant and complementary information about the objects and events in the world around them. A primary challenge for the brain is therefore to solve the 'correspondence problem', that is, to bind those signals that likely originate from the same environmental source, while keeping separate those unisensory inputs that likely belong to different objects/events. Whether multiple signals have a common origin or not must, however, be inferred from the signals themselves through a causal inference process. Recent studies have demonstrated that cross-correlation, that is, the similarity in temporal structure between unimodal signals, represents a powerful cue for solving the correspondence problem in humans. Here we provide further evidence for the role of the temporal correlation between auditory and visual signals in multisensory integration. Capitalizing on the well-known fact that sensitivity to crossmodal conflict is inversely related to the strength of coupling between the signals, we measured sensitivity to crossmodal spatial conflicts as a function of the cross-correlation between the temporal structures of the audiovisual signals. Observers' performance was systematically modulated by the cross-correlation, with lower sensitivity to crossmodal conflict being measured for correlated as compared to uncorrelated audiovisual signals. These results therefore provide support for the claim that cross-correlation promotes multisensory integration. A Bayesian framework is proposed to interpret the present results, whereby stimulus correlation is represented on the prior distribution of expected crossmodal co-occurrence.
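The cue discussed above, the similarity in temporal structure between unimodal signals, can be quantified as the peak of the normalized cross-correlation between the two signal envelopes over a window of candidate lags. The sketch below is a generic implementation of that measure, not the analysis pipeline used in the study; the lag window is an arbitrary choice.

```python
import numpy as np

def temporal_correlation(audio_env, visual_env, max_lag=50):
    """Peak normalized cross-correlation between two equal-length signal
    envelopes, scanned over lags in [-max_lag, max_lag] (in samples)."""
    a = (audio_env - audio_env.mean()) / (audio_env.std() + 1e-12)
    v = (visual_env - visual_env.mean()) / (visual_env.std() + 1e-12)
    n = len(a)
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            r = np.dot(a[lag:], v[:n - lag]) / (n - lag)
        else:
            r = np.dot(a[:n + lag], v[-lag:]) / (n + lag)
        best = max(best, r)
    return best
```

Signals sharing the same temporal structure score near 1, while unrelated signals score near 0, mirroring the correlated vs. uncorrelated conditions compared in the experiment.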


Subject(s)
Auditory Perception/physiology , Bayes Theorem , Cues , Visual Perception/physiology , Acoustic Stimulation/methods , Brain Mapping/methods , Humans , Photic Stimulation/methods
13.
Iperception ; 3(7): 410-2, 2012.
Article in English | MEDLINE | ID: mdl-23145291

ABSTRACT

In a recent article, N. Bien, S. ten Oever, R. Goebel, and A. T. Sack (2012) used event-related potentials to investigate the consequences of crossmodal correspondences (the "natural" mapping of features, or dimensions, of experience across sensory modalities) on the time course of neural information processing. Then, by selectively lesioning the right intraparietal cortex using transcranial magnetic stimulation, these researchers went on to demonstrate (for the first time) that it is possible to temporarily eliminate the effect of crossmodal congruency on multisensory integration (specifically on the spatial ventriloquism effect). These results are especially exciting given the possibility that the cognitive neuroscience methodology utilized by Bien et al. (2012) holds promise for dissociating between putatively different kinds of crossmodal correspondence in future research.

14.
Exp Brain Res ; 220(3-4): 319-33, 2012 Aug.
Article in English | MEDLINE | ID: mdl-22706551

ABSTRACT

A growing body of empirical research on the topic of multisensory perception now shows that even non-synaesthetic individuals experience crossmodal correspondences, that is, apparently arbitrary compatibility effects between stimuli in different sensory modalities. In the present study, we replicated a number of classic results from the literature on crossmodal correspondences and highlight the existence of two new crossmodal correspondences using a modified version of the implicit association test (IAT). Given that only a single stimulus was presented on each trial, these results rule out selective attention and multisensory integration as possible mechanisms underlying the reported compatibility effects on speeded performance. The crossmodal correspondences examined in the present study all gave rise to very similar effect sizes, and the compatibility effect had a very rapid onset, thus speaking to the automatic detection of crossmodal correspondences. These results are further discussed in terms of the advantages of the IAT over traditional techniques for assessing the strength and symmetry of various crossmodal correspondences.


Subject(s)
Association , Attention/physiology , Auditory Perception/physiology , Visual Perception/physiology , Acoustic Stimulation , Adolescent , Adult , Female , Humans , Male , Photic Stimulation , Reaction Time/physiology , Symbolism
15.
Curr Biol ; 22(1): 46-9, 2012 Jan 10.
Article in English | MEDLINE | ID: mdl-22177899

ABSTRACT

Inferring which signals have a common underlying cause, and hence should be integrated, represents a primary challenge for a perceptual system dealing with multiple sensory inputs [1-3]. This challenge is often referred to as the correspondence problem or causal inference. Previous research has demonstrated that spatiotemporal cues, along with prior knowledge, are exploited by the human brain to solve this problem [4-9]. Here we explore the role of correlation between the fine temporal structure of auditory and visual signals in causal inference. Specifically, we investigated whether correlated signals are inferred to originate from the same distal event and hence are integrated optimally [10]. In a localization task with visual, auditory, and combined audiovisual targets, the improvement in precision for combined relative to unimodal targets was statistically optimal only when audiovisual signals were correlated. This result demonstrates that humans use the similarity in the temporal structure of multisensory signals to solve the correspondence problem, hence inferring causation from correlation.
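The benchmark of "statistically optimal" integration referenced in this abstract is the standard maximum-likelihood cue-combination rule under independent Gaussian noise. The sketch below states that rule generically; the variances in the example are illustrative, not values from the study.

```python
def optimal_fusion(x_a, sigma_a, x_v, sigma_v):
    """Reliability-weighted (maximum-likelihood) fusion of an auditory and a
    visual location estimate, assuming independent Gaussian noise."""
    w_a = sigma_v**2 / (sigma_a**2 + sigma_v**2)        # weight on audition
    x_fused = w_a * x_a + (1 - w_a) * x_v
    sigma_fused = (sigma_a**2 * sigma_v**2 /
                   (sigma_a**2 + sigma_v**2)) ** 0.5    # always < min(sigma)
    return x_fused, sigma_fused
```

The fused standard deviation is always below that of either single cue, which is the precision improvement the study observed for combined audiovisual targets, but only when the signals were temporally correlated.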


Subject(s)
Auditory Perception , Visual Perception , Acoustic Stimulation , Humans , Nontherapeutic Human Experimentation , Photic Stimulation
16.
Exp Brain Res ; 214(3): 373-80, 2011 Oct.
Article in English | MEDLINE | ID: mdl-21901453

ABSTRACT

The question of the arbitrariness of language is among the oldest in the cognitive sciences, and it relates to the nature of the associations between vocal sounds and their meaning. Growing evidence seems to support sound symbolism, arguing for a naturally constrained mapping of meaning onto sounds. Most of this evidence, however, comes from studies based on the interpretation of pseudowords, and to date there is little empirical evidence that sound symbolism can affect phonatory behavior. In the present study, we asked participants to utter the letter /a/ in response to visual stimuli varying in shape, luminance, and size, and we observed consistent sound-symbolic effects on vocalizations. The loudness of the utterances was modulated by stimulus shape and luminance. Moreover, stimulus shape consistently modulated the frequency of the third formant (F3). This finding reveals an automatic mapping of specific visual attributes onto phonological features of vocalizations. Furthermore, it suggests that sound-meaning associations are reciprocal, affecting active (production) as well as passive (comprehension) linguistic behavior.


Subject(s)
Auditory Perception/physiology , Phonation/physiology , Semantics , Speech Perception/physiology , Symbolism , Visual Perception/physiology , Acoustic Stimulation/methods , Adult , Female , Humans , Imagination/physiology , Language Tests/standards , Male , Middle Aged , Photic Stimulation/methods , Young Adult
17.
Conscious Cogn ; 19(1): 364-79, 2010 Mar.
Article in English | MEDLINE | ID: mdl-20056554

ABSTRACT

The law of prior entry was one of E.B. Titchener's seven fundamental laws of attention. According to Titchener (1908, p. 251): "the object of attention comes to consciousness more quickly than the objects which we are not attending to." Although researchers have been studying prior entry for more than a century now, progress in understanding the effect has been hindered by the many methodological confounds present in early research. As a consequence, it is unclear whether the behavioral effects reported in the majority of published studies in this area should be attributed to attention, decisional response biases, and/or, in the case of exogenous spatial cuing studies of the prior-entry effect, to sensory facilitation effects instead. In this article, the literature on the prior-entry effect is reviewed, the confounds present in previous research highlighted, current consensus summarized, and some of the key questions for future research outlined. In particular, recent research has now provided compelling psychophysical and electrophysiological evidence to support the claim that attending to a sensory modality, spatial location, or stimulus feature/attribute can all give rise to a relative speeding-up of the time of arrival of attended, as compared to relatively less attended (or unattended), stimuli. Prior-entry effects have now been demonstrated following both the endogenous and exogenous orienting of attention, though prior-entry effects tend to be smaller in magnitude when assessed by means of participants' performance on simultaneity judgment (SJ) tasks than when assessed by means of their performance on temporal order judgment (TOJ) tasks.


Subject(s)
Attention/physiology , Brain/physiology , Consciousness/physiology , Cues , Auditory Perception/physiology , Cognition/physiology , Differential Threshold/physiology , Humans , Judgment/physiology , Space Perception/physiology , Time Perception/physiology , Touch/physiology , Visual Perception/physiology
18.
PLoS One ; 4(5): e5664, 2009 May 27.
Article in English | MEDLINE | ID: mdl-19471644

ABSTRACT

BACKGROUND: Synesthesia is a condition in which the stimulation of one sense elicits an additional experience, often in a different (i.e., unstimulated) sense. Although only a small proportion of the population is synesthetic, there is growing evidence to suggest that neurocognitively normal individuals also experience some form of synesthetic association between the stimuli presented to different sensory modalities (e.g., between auditory pitch and visual size, where lower-frequency tones are associated with large objects and higher-frequency tones with small objects). While previous research has highlighted crossmodal interactions between synesthetically corresponding dimensions, the possible role of synesthetic associations in multisensory integration has not been considered previously. METHODOLOGY: Here we investigate the effects of synesthetic associations by presenting pairs of asynchronous or spatially discrepant visual and auditory stimuli that were either synesthetically matched or mismatched. In a series of three psychophysical experiments, participants reported the relative temporal order of presentation or the relative spatial locations of the two stimuli. PRINCIPAL FINDINGS: The reliability of non-synesthetic participants' estimates of both audiovisual temporal asynchrony and spatial discrepancy was lower for pairs of synesthetically matched as compared to synesthetically mismatched audiovisual stimuli. CONCLUSIONS: Recent studies of multisensory integration have shown that the reduced reliability of perceptual estimates regarding intersensory conflicts constitutes the marker of a stronger coupling between the unisensory signals. Our results therefore indicate a stronger coupling of synesthetically matched vs. mismatched stimuli and provide the first psychophysical evidence that synesthetic congruency can promote multisensory integration. Synesthetic crossmodal correspondences therefore appear to play a crucial (if unacknowledged) role in the multisensory integration of auditory and visual information.


Subject(s)
Acoustic Stimulation , Photic Stimulation , Humans , Pitch Discrimination , Space Perception , Time Factors
19.
Neurosci Lett ; 442(3): 257-61, 2008 Sep 19.
Article in English | MEDLINE | ID: mdl-18638522

ABSTRACT

People sometimes find it easier to judge the temporal order in which two visual stimuli have been presented if one tone is presented before the first visual stimulus and a second tone is presented after the second visual stimulus. This enhancement of people's visual temporal sensitivity has been attributed to the temporal ventriloquism of the visual stimuli toward the temporally proximate sounds, resulting in an expansion of the perceived interval between the two visual events. In the present study, we demonstrate that the synesthetic congruency between the auditory and visual stimuli (in particular, between the relative pitch of the sounds and the relative size of the visual stimuli) can modulate the magnitude of this multisensory integration effect: The auditory capture of vision is larger for pairs of auditory and visual stimuli that are synesthetically congruent than for pairs of stimuli that are synesthetically incongruent, as reflected by participants' increased sensitivity in discriminating the temporal order of the visual stimuli. These results provide the first evidence that multisensory temporal integration can be affected by the synesthetic congruency between the auditory and visual stimuli that happen to be presented.


Subject(s)
Auditory Perception/physiology , Brain/physiology , Visual Perception/physiology , Adolescent , Adult , Female , Humans , Male