Results 1 - 12 of 12
1.
Multisens Res ; 32(3): 215-234, 2019 Jan 1.
Article in English | MEDLINE | ID: mdl-31071679

ABSTRACT

Recent exposure to asynchronous multisensory signals has been shown to shift perceived timing between the sensory modalities, a phenomenon known as 'temporal recalibration'. Van der Burg et al. (2013, J Neurosci, 33, pp. 14633-14637) reported that recalibration to asynchronous audiovisual events can occur extremely rapidly. In an extended series of variously asynchronous trials, simultaneity judgements were analysed based on the modality order in the preceding trial, revealing that shifts in the point of subjective synchrony occurred almost instantaneously, from one trial to the next. Here we replicate the finding that shifts in perceived timing occur following exposure to a single asynchronous audiovisual stimulus, and, by manipulating the spatial location of the audiovisual events, we demonstrate that recalibration occurs even when the adapting stimulus is presented in a different location. Timing shifts were also observed when the adapting audiovisual pair was defined only by temporal proximity, with the auditory component presented over headphones rather than collocated with the visual stimulus. Combined with previous findings that timing shifts are independent of stimulus features such as colour and pitch, our finding that recalibration is not spatially specific provides strong evidence for a rapid recalibration process that depends solely on recent temporal information, regardless of feature or location. These rapid, automatic shifts in perceived synchrony may allow our sensory systems to adjust flexibly to variation in the timing of neural signals caused by delayed environmental transmission and by differing neural latencies for processing vision and audition.
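
As an annotation on the inter-trial analysis described above: the sketch below shows one conventional way to estimate the point of subjective simultaneity (PSS) separately for trials preceded by auditory-leading versus vision-leading stimuli. The Gaussian model of the simultaneity-judgment curve, the sign convention (positive SOA = auditory leading), and all data are illustrative assumptions, not the authors' code or results.

```python
# Sketch: rapid recalibration via inter-trial analysis of
# simultaneity judgments (SJ). All data are simulated; the Gaussian
# SJ model and the convention SOA > 0 = auditory-leading are
# assumptions for illustration.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

def p_simultaneous(soa, pss, sigma, amp):
    # Gaussian-shaped probability of responding "simultaneous".
    return amp * np.exp(-(soa - pss) ** 2 / (2 * sigma ** 2))

# Simulate a trial series whose PSS is pulled 15 ms toward the
# previous trial's modality order (purely illustrative numbers).
levels = np.arange(-300, 301, 50)
soas = rng.choice(levels, size=4000)
prev_auditory_led = np.roll(soas > 0, 1)
true_pss = np.where(prev_auditory_led, 15.0, -15.0)
responded_sim = rng.random(soas.size) < p_simultaneous(
    soas, true_pss, 120.0, 0.95)

for label, mask in [("previous trial auditory-leading", prev_auditory_led),
                    ("previous trial vision-leading", ~prev_auditory_led)]:
    rates = [responded_sim[mask & (soas == s)].mean() for s in levels]
    (pss, sigma, amp), _ = curve_fit(p_simultaneous, levels, rates,
                                     p0=(0.0, 100.0, 0.9))
    print(f"{label}: PSS = {pss:+.1f} ms")
```

A rapid-recalibration effect shows up as the two conditioned PSS estimates falling on opposite sides of zero.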


Subject(s)
Adaptation, Physiological/physiology; Auditory Perception/physiology; Time Perception/physiology; Visual Perception/physiology; Acoustic Stimulation; Female; Humans; Judgment/physiology; Male; Photic Stimulation
2.
J Assoc Res Otolaryngol ; 17(3): 209-21, 2016 Jun.
Article in English | MEDLINE | ID: mdl-27033087

ABSTRACT

The location of a sound is derived computationally from acoustical cues rather than being inherent in the topography of the input signal, as in vision. Since Lord Rayleigh, the descriptions of that representation have swung between "labeled line" and "opponent process" models. Employing a simple variant of a two-point separation judgment using concurrent speech sounds, we found that spatial discrimination thresholds changed nonmonotonically as a function of the overall separation. Rather than increasing with separation, spatial discrimination thresholds first declined as two-point separation increased before reaching a turning point and increasing thereafter with further separation. This "dipper" function, with a minimum at 6° of separation, was seen for regions around the midline as well as for more lateral regions (30° and 45°). The discrimination thresholds for the binaural localization cues were linear over the same range, so these cannot explain the shape of these functions. These data and a simple computational model indicate that the perception of auditory space involves a local code or multichannel mapping emerging subsequent to the binaural cue coding.
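
The statement that the binaural localization cues are linear over this range can be illustrated with the classic Woodworth spherical-head approximation of the interaural time difference, ITD(θ) = (r/c)(θ + sin θ). The sketch below is illustrative only; the head radius and speed of sound are conventional textbook values, not parameters from the paper.

```python
# Sketch: Woodworth's spherical-head approximation of the interaural
# time difference, ITD(theta) = (r / c) * (theta + sin(theta)),
# checked for near-linearity over the tested azimuths.
import numpy as np

R = 0.0875   # head radius (m), conventional value
C = 343.0    # speed of sound (m/s)

def itd_us(azimuth_deg):
    theta = np.deg2rad(azimuth_deg)
    return 1e6 * (R / C) * (theta + np.sin(theta))

az = np.linspace(0.0, 51.0, 18)          # midline out past 45 deg
slope, intercept = np.polyfit(az, itd_us(az), 1)
residual = itd_us(az) - (slope * az + intercept)
print(f"ITD slope: {slope:.1f} us/deg; "
      f"max deviation from linearity: {np.abs(residual).max():.1f} us")
```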


Subject(s)
Auditory Perception; Sound Localization; Adult; Auditory Threshold; Cues; Female; Humans; Male
3.
J Exp Psychol Hum Percept Perform ; 42(7): 953-64, 2016 Jul.
Article in English | MEDLINE | ID: mdl-26766511

ABSTRACT

The brain integrates signals from multiple modalities to provide a reliable estimate of environmental events. A temporally cluttered environment presents a challenge for sensory integration because of the risk of misbinding, yet it also provides scope for cross-modal binding to greatly enhance performance by highlighting multimodal events. We present a tactile search task in which fingertips received pulsed vibrations and participants identified which finger was stimulated in synchrony with an auditory signal. Results showed that performance for identifying the target finger was impaired when other fingers were stimulated, even though all fingers were stimulated sequentially. When the number of fingers vibrated was fixed, we found that both spatial and temporal factors constrained performance, because events occurring close to the target vibration in either space or time reduced accuracy. When tactile search was compared with visual search, we found overall performance was lower in touch than in vision, although the cost of reducing temporal separation between stimuli or increasing the presentation rate was similar for both target modalities. Audiotactile performance benefitted from increasing spatial separation between target and distractors, with a particularly strong benefit for locating the target on a different hand to the distractors, whereas the spatial manipulations did not affect audiovisual performance. The similar trends in performance for temporal manipulations across vision and touch suggest a common supramodal binding mechanism that, when combining audition and touch, is limited by the poor resolution of the underlying unisensory representation of touch in cluttered settings.


Subject(s)
Auditory Perception/physiology; Psychomotor Performance/physiology; Touch Perception/physiology; Visual Perception/physiology; Adolescent; Adult; Female; Humans; Male; Young Adult
4.
Perception ; 45(4): 409-24, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26655643

ABSTRACT

Temporal ventriloquism is the shift in perceived timing of a visual stimulus that occurs when an auditory stimulus is presented close in time. This study investigated whether crossmodal correspondence between auditory pitch and visual elevation modulates temporal ventriloquism. Participants were presented with two visual stimuli (above and below fixation) across a range of stimulus onset asynchronies and were asked to judge the order of the events. A task-irrelevant auditory click was presented shortly before the first and another shortly after the second visual stimulus. Two pitches were used (low and high), and the congruency between the auditory and visual stimuli was manipulated. The results show that incongruent pairings between pitch and elevation abolish temporal ventriloquism. However, this crossmodal correspondence effect was absent when the direction of the pitch change was fixed within sessions, reducing the saliency of the pitch change. The results support previous studies suggesting that, in addition to spatial and temporal factors, crossmodal correspondences can influence binding of information across the senses, although these effects are likely to depend on the saliency of the crossmodal mapping.
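
For context, temporal-order judgments of this kind are conventionally reduced to a point of subjective simultaneity and a just-noticeable difference (JND) by fitting a cumulative Gaussian to the proportion of one response type as a function of stimulus onset asynchrony. The fit below uses invented data and an assumed 84%-correct JND criterion; it is not the study's own analysis.

```python
# Sketch: reduce temporal-order judgments (TOJ) to a PSS and JND by
# fitting a cumulative Gaussian to the proportion of "second visual
# stimulus perceived first" responses. Data and the 84% criterion
# are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cum_gauss(soa, mu, sigma):
    return norm.cdf(soa, loc=mu, scale=sigma)

soa = np.array([-120, -80, -40, 0, 40, 80, 120])            # ms
p_resp = np.array([0.05, 0.15, 0.35, 0.50, 0.70, 0.90, 0.97])

(mu, sigma), _ = curve_fit(cum_gauss, soa, p_resp, p0=(0.0, 50.0))
jnd = sigma * norm.ppf(0.84)     # 84%-correct criterion (assumed)
print(f"PSS = {mu:.1f} ms, JND = {jnd:.1f} ms")
```

Temporal ventriloquism would then appear as a smaller JND when the flanking clicks pull the visual events apart in time, and the congruency manipulation as a modulation of that JND change.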


Subject(s)
Pitch Perception/physiology; Time Perception/physiology; Visual Perception/physiology; Acoustic Stimulation; Adolescent; Adult; Auditory Perception/physiology; Female; Humans; Male; Photic Stimulation; Young Adult
5.
Sci Rep ; 5: 17467, 2015 Dec 1.
Article in English | MEDLINE | ID: mdl-26621493

ABSTRACT

Perception and behavior are fundamentally shaped by the integration of different sensory modalities into unique multisensory representations, a process governed by spatio-temporal correspondence. Prior work has characterized temporal perception using the point in time at which subjects are most likely to judge multisensory stimuli to be simultaneous (PSS) and the temporal binding window (TBW) over which participants are likely to do so. Here we examine the relationship between the PSS and the TBW within and between individuals, and within and between three sensory combinations: audiovisual, audiotactile and visuotactile. We demonstrate that TBWs correlate within individuals and across multisensory pairings, but PSSs do not. Further, we reveal that while the audiotactile and audiovisual pairings show tightly related TBWs, they also exhibit a differential relationship with respect to true and perceived multisensory synchrony. Thus, audiotactile and audiovisual temporal processing share mechanistic features yet are respectively functionally linked to objective and subjective synchrony.
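
A hedged sketch of how a PSS and TBW might be extracted from a simultaneity-judgment curve, with the TBWs then correlated across pairings. The Gaussian model, the full-width-at-half-maximum definition of the TBW, and the simulated subjects sharing a latent "window" factor are assumptions for illustration, not the paper's methods.

```python
# Sketch: per-subject PSS and TBW from a Gaussian fit to the
# simultaneity-judgment curve, then a TBW correlation across two
# pairings. FWHM-as-TBW and the shared latent 'window' factor are
# modelling assumptions for illustration.
import numpy as np
from scipy.optimize import curve_fit

def sj_curve(soa, pss, sigma, amp):
    return amp * np.exp(-(soa - pss) ** 2 / (2 * sigma ** 2))

def fit_pss_tbw(soa, p_sim):
    (pss, sigma, _), _ = curve_fit(sj_curve, soa, p_sim,
                                   p0=(0.0, 100.0, 0.9))
    return pss, 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma  # FWHM

rng = np.random.default_rng(7)
soa = np.linspace(-400, 400, 17)
latent_window = rng.normal(100.0, 20.0, size=20)   # per subject

tbw = {}
for pairing in ("audiovisual", "audiotactile"):
    widths = []
    for w in latent_window:
        p = sj_curve(soa, rng.normal(0.0, 30.0),
                     w + rng.normal(0.0, 10.0), 0.95)
        widths.append(fit_pss_tbw(soa, p)[1])
    tbw[pairing] = np.array(widths)

r = np.corrcoef(tbw["audiovisual"], tbw["audiotactile"])[0, 1]
print(f"AV-AT TBW correlation across subjects: r = {r:.2f}")
```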


Subject(s)
Sensation/physiology; Adult; Female; Humans; Male
6.
Multisens Res ; 28(3-4): 351-70, 2015.
Article in English | MEDLINE | ID: mdl-26288904

ABSTRACT

Following prolonged exposure to audiovisual asynchrony, an observer's point of subjective simultaneity (PSS) shifts in the direction of the leading modality. It has been debated whether other sensory pairings, such as vision and touch, lead to a similar temporal recalibration, and if so, whether the internal timing mechanism underlying visuotactile lag adaptation is centralised or distributed. To address these questions, we adapted observers to vision-leading and tactile-leading visuotactile asynchrony on either their left or right hand side in different blocks. In one test condition, participants performed a simultaneity judgment on the adapted side (unilateral) and in another they performed a simultaneity judgment on the non-adapted side (contralateral). In a third condition, participants adapted concurrently to equal and opposite asynchronies on each side and were tested randomly on either hand (bilateral opposed). Results from the first two conditions show that observers recalibrate to visuotactile asynchronies, and that the recalibration transfers to the non-adapted side. These findings suggest a centralised recalibration mechanism not linked to the adapted side, and predict no recalibration in the bilateral opposed condition, assuming the adaptation effects were equal on each side. This was confirmed in the group of participants that adapted to vision-leading asynchrony on the right and tactile-leading asynchrony on the left. However, the other group (vision-leading on the left and tactile-leading on the right) did show a recalibration effect, suggesting a distributed mechanism. We discuss these findings in terms of a hybrid model in which centralised and distributed timing mechanisms co-exist.
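
The contrasted predictions can be made explicit with a toy calculation: a centralised mechanism pools the two opposing adapted shifts and so predicts no net recalibration, while a distributed mechanism keeps a side-specific shift. The shift magnitude below is invented for illustration.

```python
# Toy sketch of the bilateral opposed predictions. Assume adapting
# one side alone would shift the PSS by +/- SHIFT ms (vision- vs.
# tactile-leading); the magnitude is invented for illustration.
SHIFT = 30.0   # ms, illustrative

# Centralised: one shared recalibration state driven by both sides
# at once, so equal and opposite shifts cancel.
central = (+SHIFT + -SHIFT) / 2.0
# Distributed: each side keeps its own recalibration state.
distributed = {"left": +SHIFT, "right": -SHIFT}

print(f"centralised prediction: {central:+.0f} ms on either hand")
print(f"distributed prediction: {distributed}")
```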


Subject(s)
Adaptation, Physiological; Auditory Perception/physiology; Judgment/physiology; Time Perception/physiology; Touch; Visual Perception/physiology; Adult; Female; Humans; Male; Photic Stimulation/methods; Reaction Time; Young Adult
7.
Exp Brain Res ; 233(1): 53-9, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25200176

ABSTRACT

Following prolonged exposure to asynchronous multisensory signals, the brain adapts to reduce the perceived asynchrony. Here, in three separate experiments, participants performed a synchrony judgment task on audiovisual, audiotactile or visuotactile stimuli and we used inter-trial analyses to examine whether temporal recalibration occurs rapidly on the basis of a single asynchronous trial. Even though all combinations used the same subjects, task and design, temporal recalibration occurred for audiovisual stimuli (i.e., the point of subjective simultaneity depended on the preceding trial's modality order), but none occurred when the same auditory or visual event was combined with a tactile event. Contrary to findings from prolonged adaptation studies showing recalibration for all three combinations, we show that rapid, inter-trial recalibration is unique to audiovisual stimuli. We conclude that recalibration occurs at two different timescales for audiovisual stimuli (fast and slow), but only on a slow timescale for audiotactile and visuotactile stimuli.


Subject(s)
Adaptation, Physiological/physiology; Auditory Perception/physiology; Brain/physiology; Touch Perception/physiology; Visual Perception/physiology; Acoustic Stimulation; Adult; Female; Humans; Judgment/physiology; Male; Photic Stimulation; Reaction Time/physiology; Time Factors; Young Adult
8.
Atten Percept Psychophys ; 77(3): 896-906, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25522831

ABSTRACT

Frequency modulation is critical to human speech. Evidence from psychophysics, neurophysiology, and neuroimaging suggests that there are neuronal populations tuned to this property of speech. Consistent with this, extended exposure to frequency change produces direction-specific aftereffects in frequency change detection. We show that this aftereffect occurs extremely rapidly, requiring only a single trial of just 100-ms duration. We demonstrate this using a long, randomized series of frequency sweeps (both upward and downward, by varying amounts) and analyzing intertrial adaptation effects. We show that the point of constant frequency is shifted systematically towards the previous trial's sweep direction (i.e., a frequency sweep aftereffect). Furthermore, the perception of glide direction is independently influenced by the glide presented two trials previously. The aftereffect is frequency tuned, as exposure to a frequency sweep from a set centered on 1,000 Hz does not influence a subsequent trial drawn from a set centered on 400 Hz. More generally, the rapidity of adaptation suggests the auditory system is constantly adapting and "tuning" itself to the most recent environmental conditions.
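
For concreteness, the sketch below synthesises a 100-ms linear frequency glide of the kind that could serve as a single adapting trial, with example sweeps drawn from sets centered on 1,000 Hz and 400 Hz. The sample rate and sweep extents are assumptions, not the study's stimulus parameters.

```python
# Sketch: a 100 ms linear frequency glide as a single adapting
# trial, with example sweeps from sets centered on 1,000 Hz and
# 400 Hz. Sample rate and sweep extents are assumed values.
import numpy as np

FS = 44100      # sample rate (Hz), assumed
DUR = 0.100     # sweep duration: 100 ms

def glide(f_start, f_end, fs=FS, dur=DUR):
    t = np.arange(int(fs * dur)) / fs
    # Phase is the time integral of the linearly ramping
    # instantaneous frequency f(t) = f_start + (f_end - f_start)*t/dur.
    phase = 2 * np.pi * (f_start * t
                         + 0.5 * (f_end - f_start) * t ** 2 / dur)
    return np.sin(phase)

upward_1k = glide(900.0, 1100.0)    # upward sweep, 1,000 Hz set
downward_400 = glide(440.0, 360.0)  # downward sweep, 400 Hz set
print(upward_1k.shape, downward_400.shape)
```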


Subject(s)
Auditory Perception/physiology; Hearing/physiology; Acoustic Stimulation; Adaptation, Physiological; Adult; Analysis of Variance; Female; Humans; Male; Psychophysics; Reaction Time/physiology; Reference Values; Sound; Young Adult
9.
PLoS One ; 9(7): e102864, 2014.
Article in English | MEDLINE | ID: mdl-25076211

ABSTRACT

Evidence that the auditory system contains specialised motion detectors is mixed. Many psychophysical studies confound speed cues with distance and duration cues and present sound sources that do not appear to move in external space. Here we use the 'discrimination contours' technique to probe the probabilistic combination of speed, distance and duration for stimuli moving in a horizontal arc around the listener in virtual auditory space. The technique produces a set of motion discrimination thresholds that define a contour in the distance-duration plane for different combinations of the three cues, based on a 3-interval oddity task. The orientation of the contour (typically elliptical in shape) reveals which cue or combination of cues dominates. If the auditory system contains specialised motion detectors, stimuli moving over different distances and durations but defining the same speed should be more difficult to discriminate. The resulting discrimination contours should therefore be oriented obliquely along iso-speed lines within the distance-duration plane. However, we found that over a wide range of speeds, distances and durations, the ellipses aligned with the distance-duration axes and were stretched vertically, suggesting that listeners were most sensitive to duration. A second experiment showed that listeners were able to make speed judgements when distance and duration cues were degraded by noise, but that performance was worse. Our results therefore suggest that speed is not a primary cue to motion in the auditory system, but that listeners can use speed to make discrimination judgements when distance and duration cues are unreliable.
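
A minimal sketch of the contour logic: fit a conic to threshold points around a pedestal in the distance-duration plane and read off the major-axis orientation. An oblique major axis along the iso-speed line would implicate speed coding; the invented points below mimic the vertically stretched (duration-dominated) contours reported.

```python
# Sketch: fit a conic to threshold points around a pedestal in the
# distance-duration plane and report the major-axis orientation.
# The points are invented to mimic a vertically stretched contour.
import numpy as np

# (distance deviation in deg, duration deviation in ms), centred
# on the pedestal.
pts = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 3.0], [0.0, -3.0],
                [0.7, 2.1], [-0.7, -2.1], [0.7, -2.1], [-0.7, 2.1]])
x, y = pts[:, 0], pts[:, 1]

# Least-squares conic: a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1.
design = np.column_stack([x ** 2, x * y, y ** 2, x, y])
(a, b, c, d, e), *_ = np.linalg.lstsq(design, np.ones(len(pts)),
                                      rcond=None)

# Axis lengths go as 1/sqrt(eigenvalue), so the major axis is the
# eigenvector of the *smaller* eigenvalue of the quadratic form.
quad = np.array([[a, b / 2.0], [b / 2.0, c]])
evals, evecs = np.linalg.eigh(quad)
major = evecs[:, np.argmin(evals)]
angle = np.degrees(np.arctan2(major[1], major[0]))
print(f"major axis at {angle:.1f} deg from the distance axis")
```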


Subject(s)
Cues; Discrimination, Psychological; Sound Localization; Speech Perception; Adult; Female; Humans; Male; Time Factors
10.
Multisens Res ; 26(4): 333-45, 2013.
Article in English | MEDLINE | ID: mdl-24319927

ABSTRACT

Information about the world is captured by our separate senses and must be integrated to yield a unified representation. This raises the issue of which signals should be integrated and which should remain separate, as inappropriate integration will lead to misrepresentation and distortions. One strong cue that separate signals arise from a single source is coincidence, in space and in time. We measured increment thresholds for discriminating spatial intervals defined by pairs of simultaneously presented targets, one visual flash and one sound, across a range of separations. We report a 'dipper function', in which thresholds follow a 'U-shaped' curve, initially decreasing with spatial interval and then increasing for larger separations. The presence of a dip in the audiovisual increment-discrimination function is evidence that the auditory and visual signals feed into a common mechanism encoding spatial separation, and a simple filter model with a sigmoidal transduction function simulated the results well. The function of an audiovisual spatial filter may be to detect coincidence, a fundamental cue guiding whether to integrate or segregate.
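
The abstract names a simple filter model with a sigmoidal transduction function; the sketch below shows how such a transducer generically produces a dipper, with the increment threshold defined as the separation change needed to move the internal response by a fixed criterion. The particular sigmoid and all parameter values are assumptions, not the paper's fitted model.

```python
# Sketch: a sigmoidal transducer of spatial separation produces a
# 'dipper' in increment thresholds. Threshold = the extra separation
# needed to change the internal response by a fixed criterion.
# The Naka-Rushton-style sigmoid and all parameters are assumed.
import numpy as np
from scipy.optimize import brentq

def transducer(s, p=2.4, q=2.0, z=4.0):
    return 10.0 * s ** p / (s ** q + z)

CRITERION = 0.1   # fixed change in internal response (assumed)

def increment_threshold(pedestal_deg):
    base = transducer(pedestal_deg)
    return brentq(
        lambda inc: transducer(pedestal_deg + inc) - base - CRITERION,
        1e-6, 50.0)

for ped in [0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0]:
    print(f"pedestal {ped:5.1f} deg -> "
          f"threshold {increment_threshold(ped):.3f} deg")
```

The printed thresholds fall from the zero pedestal to a minimum at small separations and then rise again, reproducing the qualitative 'dipper' shape.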


Subject(s)
Auditory Perception/physiology; Cues; Reaction Time/physiology; Visual Perception/physiology; Acoustic Stimulation/methods; Female; Humans; Male; Photic Stimulation/methods
11.
Atten Percept Psychophys ; 75(8): 1892-905, 2013 Nov.
Article in English | MEDLINE | ID: mdl-23979813

ABSTRACT

Recently, Guzman-Martinez, Ortega, Grabowecky, Mossbridge, and Suzuki (Current Biology, 22(5), 383-388, 2012) reported that observers could systematically match auditory amplitude modulations and tactile amplitude modulations to visual spatial frequencies, proposing that these cross-modal matches produced automatic attentional effects. Using a series of visual search tasks, we investigated whether informative auditory, tactile, or bimodal cues can guide attention toward a visual Gabor of matched spatial frequency (among others with different spatial frequencies). These cues improved visual search for some but not all frequencies. Auditory cues improved search only for the lowest and highest spatial frequencies, whereas tactile cues were more effective than auditory cues and more frequency specific, though less effective than visual cues. Importantly, although tactile cues could produce efficient search when informative, they had no effect when uninformative. This suggests that cross-modal frequency matching occurs at a cognitive rather than sensory level and, therefore, influences visual search through voluntary, goal-directed behavior rather than automatic attentional capture.
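
As background for "efficient search": search efficiency is conventionally summarised as the slope of response time against set size, with informative cues expected to flatten the slope. The sketch below illustrates that computation on invented response times; it is not the study's data or analysis.

```python
# Sketch: search efficiency as the slope of response time against
# set size; an informative cue that guides attention should flatten
# the slope. Response times are invented for illustration.
import numpy as np

set_sizes = np.array([4, 8, 12, 16])
rt_ms = {
    "uninformative cue": np.array([820., 1010., 1190., 1390.]),
    "informative tactile cue": np.array([760., 800., 850., 890.]),
}

for label, rt in rt_ms.items():
    slope, _ = np.polyfit(set_sizes, rt, 1)
    print(f"{label}: {slope:.1f} ms/item")
```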


Subject(s)
Attention/physiology; Auditory Perception/physiology; Cues; Pattern Recognition, Visual; Touch/physiology; Visual Perception/physiology; Adolescent; Adult; Female; Humans; Male; Young Adult
12.
J Vis ; 13(3), 2013 Apr 15.
Article in English | MEDLINE | ID: mdl-23589802

ABSTRACT

A recent study by Guzman-Martinez, Ortega, Grabowecky, Mossbridge, and Suzuki (2012) showed that participants match the frequency of an amplitude-modulated auditory stimulus to visual spatial frequency with a linear relationship, and suggested that this crossmodal matching guides attention to specific spatial frequencies. Here, we replicated this matching relationship and used the visual search paradigm to investigate whether auditory signals guide attention to matched visual spatial frequencies. Participants were presented with a search display of Gabors, all with different spatial frequencies. When the auditory signal was informative, improved search efficiency occurred for some spatial frequencies. However, when it was uninformative, a matched auditory signal produced no effect on visual search performance whatsoever. Moreover, search benefits were also observed when the auditory signal was informative but did not match the target's spatial frequency. Together, these findings suggest that an amplitude-modulated auditory signal can influence visual selection of a matched spatial frequency, but that the effect is due to top-down knowledge rather than automatic attentional capture derived from low-level mapping.


Subject(s)
Attention; Auditory Perception/physiology; Cues; Orientation; Space Perception/physiology; Acoustic Stimulation/methods; Adult; Female; Humans; Male; Middle Aged; Photic Stimulation/methods; Reaction Time; Young Adult