1.
Iperception ; 13(1): 20416695211070616, 2022.
Article in English | MEDLINE | ID: mdl-35024134

ABSTRACT

This paper reports on the deterioration of sound-localization accuracy during listeners' head and body movements. We investigated sound-localization accuracy during passive body rotations at speeds ranging from 0.625 to 5 °/s. Participants were asked to judge the position of a 30-ms noise stimulus relative to their subjective straight-ahead reference. Results indicated that sound-localization resolution degraded during passive rotation, irrespective of the rotation speed, even at the slowest speed of 0.625 °/s.

2.
J Acoust Soc Am ; 140(3): EL285, 2016 09.
Article in English | MEDLINE | ID: mdl-27914430

ABSTRACT

Spherical harmonics, which are commonly used for spatial descriptions of the head-related transfer function (HRTF), consider all directions simultaneously. In perceptual studies, however, it is necessary to model HRTFs with different angular resolutions in different directions. To this end, an alternative spatial representation of the HRTF, based on local analysis functions, is introduced. The proposed representation is shown to have the potential to describe local features of the HRTF. This is verified by comparing the reconstruction error achieved by the proposal with that of the spherical harmonic decomposition when reconstructing the HRTF inside a spherical cap.
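
For readers unfamiliar with the spherical-harmonic baseline, the sketch below shows how HRTF values sampled over the sphere can be fitted with a truncated spherical-harmonic series by least squares and how the resulting reconstruction error can be quantified. The measurement directions, the stand-in HRTF values, and the chosen order are illustrative assumptions; this is not the paper's local-analysis-function representation.

```python
# Minimal sketch: least-squares spherical-harmonic (SH) fit of HRTF samples and its
# reconstruction error. Synthetic data and the SH order are illustrative assumptions.
import numpy as np
from scipy.special import sph_harm

def sh_matrix(order, azimuth, colatitude):
    # One column per SH term Y_n^m, evaluated at every measurement direction.
    cols = [sph_harm(m, n, azimuth, colatitude)
            for n in range(order + 1) for m in range(-n, n + 1)]
    return np.stack(cols, axis=1)

rng = np.random.default_rng(0)
az = rng.uniform(0.0, 2 * np.pi, 500)           # azimuth of 500 measurement directions
col = np.arccos(rng.uniform(-1.0, 1.0, 500))    # colatitude, uniform on the sphere
hrtf = np.cos(3 * az) * np.sin(col) ** 2        # stand-in for one frequency bin of an HRTF

Y = sh_matrix(8, az, col)                       # order-8 SH basis (81 terms)
coeffs, *_ = np.linalg.lstsq(Y, hrtf.astype(complex), rcond=None)
recon = (Y @ coeffs).real
err_db = 10 * np.log10(np.mean((hrtf - recon) ** 2) / np.mean(hrtf ** 2))
print(f"relative reconstruction error: {err_db:.1f} dB")
```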

3.
Iperception ; 7(5): 2041669516669614, 2016.
Article in English | MEDLINE | ID: mdl-27698993

ABSTRACT

Movement detection for a virtual sound source was measured during the listener's horizontal head rotation. Listeners were instructed to rotate their heads at a given speed. A trial consisted of two intervals. During each interval, a virtual sound source was presented 60° to the right or left of the listener, who was instructed to rotate the head to face the sound image position. In one of the two intervals, the sound position was shifted slightly in the middle of the rotation. Listeners were asked to judge in which interval of a trial the sound stimulus moved. Results suggest that detection thresholds are higher when listeners rotate their heads. Moreover, this effect was found to be independent of the rotation velocity.

4.
Multisens Res ; 27(1): 1-16, 2014.
Article in English | MEDLINE | ID: mdl-25102663

ABSTRACT

The brain apparently remaps the perceived locations of simultaneous auditory and visual events into a unified audio-visual space in order to integrate and/or compare multisensory inputs. However, there is little qualitative or quantitative data on how simultaneous auditory and visual events are localized in the peripheral visual field (i.e., more than a few degrees from the fovea). We presented a sound burst and a flashing light simultaneously, not only in the central visual field but also in the peripheral visual field, and measured the relative perceived locations of the sound and the flash. The results revealed that the sound and flash were perceived at the same location when the sound was presented 5 degrees peripheral to the flash, even when the participants' eyes were fixated. Measurements of the unisensory locations of the sound and the flash in a pointing task demonstrated that the perceived location of the sound shifted toward the front, while the perceived location of the flash shifted toward the periphery. As a result, the discrepancy between the perceived locations of the sound and the flash was around 4 degrees. This suggests that the brain maps the unisensory locations of auditory and visual events into a unified audio-visual space, enabling it to generate unified spatial information about the events.


Subject(s)
Auditory Perception/physiology , Feedback, Sensory/physiology , Visual Fields/physiology , Visual Perception/physiology , Adult , Eye Movements/physiology , Female , Humans , Male , Photic Stimulation , Young Adult
6.
Iperception ; 4(4): 253-64, 2013.
Article in English | MEDLINE | ID: mdl-24349686

ABSTRACT

We investigated the effects of listeners' head movements and proprioceptive feedback during sound localization practice on the subsequent accuracy of sound localization performance. The effects were examined under both restricted and unrestricted head movement conditions in the practice stage. In both cases, the participants were divided into two groups: a feedback group performed a sound localization drill with accurate proprioceptive feedback, while a control group performed it without feedback. Results showed that (1) sound localization practice with free head movement improved sound localization performance and decreased actual angular errors in the horizontal plane, and (2) proprioceptive feedback during practice decreased actual angular errors in the vertical plane. Our findings suggest that unrestricted head movement and proprioceptive feedback during sound localization training enhance perceptual-motor learning by enabling listeners to use variable auditory cues and proprioceptive information.

7.
PLoS One ; 7(6): e39402, 2012.
Article in English | MEDLINE | ID: mdl-22768076

ABSTRACT

BACKGROUND: Spatial inputs from the auditory periphery change with movements of the head or whole body relative to the sound source. Nevertheless, humans perceive a stable auditory environment and react appropriately to a sound source. This suggests that the inputs are reinterpreted in the brain and integrated with information about the movements. Little is known, however, about how these movements modulate auditory perceptual processing. Here, we investigate the effect of linear acceleration on auditory space representation. METHODOLOGY/PRINCIPAL FINDINGS: Participants were passively transported forward or backward at constant accelerations using a robotic wheelchair. An array of loudspeakers was aligned parallel to the motion direction along a wall to the right of the listener. A short noise burst was presented from one of the loudspeakers during the self-motion, at the moment the listener's physical coronal plane reached the location of one of the speakers (null point). In Experiments 1 and 2, the participants indicated in which direction the sound was presented, forward or backward relative to their subjective coronal plane. The results showed that the sound position aligned with the subjective coronal plane was displaced ahead of the null point only during forward self-motion, and that the magnitude of the displacement increased with increasing acceleration. Experiment 3 investigated the structure of auditory space in the traveling direction during forward self-motion. The sounds were presented at various distances from the null point. The participants indicated the perceived sound location by pointing with a rod. All sounds that were actually located in the traveling direction were perceived as biased toward the null point. CONCLUSIONS/SIGNIFICANCE: These results suggest a distortion of auditory space in the direction of movement during forward self-motion. The underlying mechanism might involve anticipatory spatial shifts of auditory receptive field locations driven by afferent signals from the vestibular system.


Subject(s)
Auditory Perception/physiology , Motion Perception/physiology , Movement/physiology , Acceleration , Acoustic Stimulation , Adult , Female , Humans , Male , Robotics , Sound Localization/physiology , Young Adult
8.
J Vis ; 12(3)2012 Mar 12.
Article in English | MEDLINE | ID: mdl-22410584

ABSTRACT

Auditory temporal or semantic information often modulates visual motion events. However, the effects of auditory spatial information on visual motion perception have been reported to be absent or small at the perceptual level. This could be caused by the superiority of vision over hearing in the reliability of motion information. Here, we manipulated the retinal eccentricity of visual motion and challenged the previous findings. Visual apparent motion stimuli were presented in conjunction with a sound delivered alternately from two horizontally or vertically aligned loudspeakers; the direction of the visual apparent motion was always perpendicular to the direction in which the sound alternated. We found that the perceived direction of visual motion could be consistent with the direction in which the sound alternated, or could lie between this direction and that of the actual visual motion. The deviation of the perceived direction of motion from the actual direction was more likely to occur at larger retinal eccentricities. These findings suggest that the auditory and visual modalities mutually influence one another in motion processing so that the brain obtains the best estimates of external events.


Subject(s)
Auditory Perception/physiology , Motion Perception/physiology , Perceptual Masking/physiology , Sound Localization/physiology , Acoustic Stimulation/methods , Cues , Eye Movements/physiology , Humans , Photic Stimulation/methods
9.
PLoS One ; 6(3): e17499, 2011 Mar 09.
Article in English | MEDLINE | ID: mdl-21408078

ABSTRACT

BACKGROUND: Vision provides the most salient information with regard to stimulus motion. However, it has recently been demonstrated that static visual stimuli are perceived as moving laterally when paired with alternating left-right sound sources. The underlying mechanism of this phenomenon remains unclear; it has not yet been determined whether auditory motion signals, rather than auditory positional signals, can directly contribute to visual motion perception. METHODOLOGY/PRINCIPAL FINDINGS: Static visual flashes were presented at retinal locations outside the fovea together with lateral auditory motion provided by a virtual stereo noise source smoothly shifting in the horizontal plane. The flash appeared to move with the auditory motion when the spatiotemporal position of the flash fell in the middle of the auditory motion trajectory. Furthermore, the lateral auditory motion altered visual motion perception in a global motion display, in which localized motion signals from multiple visual stimuli were combined to produce a coherent visual motion percept. CONCLUSIONS/SIGNIFICANCE: These findings suggest that direct interactions exist between auditory and visual motion signals, and that there might be common neural substrates for auditory and visual motion processing.
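
As a rough illustration of a noise source that shifts smoothly across the horizontal plane, the sketch below generates a stereo noise signal whose position sweeps from left to right using constant-power amplitude panning. This is a simplification for illustration only (the study's virtual source was presumably rendered with binaural cues), and the duration, sweep range, and sample rate are assumed values.

```python
# Minimal sketch: a smoothly moving stereo noise source via constant-power amplitude
# panning. Duration, sweep range, and sample rate are illustrative assumptions.
import numpy as np

fs = 44100
dur = 1.0                                        # seconds
n = int(fs * dur)
noise = np.random.default_rng(0).standard_normal(n)

pan = np.linspace(-1.0, 1.0, n)                  # sweep from full left (-1) to full right (+1)
theta = (pan + 1.0) * np.pi / 4.0                # map [-1, 1] onto [0, pi/2]
left = np.cos(theta) * noise                     # constant-power panning gains
right = np.sin(theta) * noise
stereo = np.stack([left, right], axis=1)         # shape (n, 2), ready to write to a WAV file
```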


Subject(s)
Acoustic Stimulation , Motion Perception/physiology , Motion , Humans
10.
Atten Percept Psychophys ; 72(8): 2215-26, 2010 Nov.
Article in English | MEDLINE | ID: mdl-21097864

ABSTRACT

In representational momentum (RM), the final position of a moving target is mislocalized in the direction of motion. Here, the effect of a concurrent sound on visual RM was demonstrated. A visual stimulus moved horizontally and disappeared at unpredictable positions. A complex tone without any motion cues was presented continuously from the beginning of the visual motion. Compared with a silent condition, the RM magnitude increased when the sound outlasted the visual motion and decreased when the sound ended before the visual motion did. However, the RM was unchanged when a brief complex tone was presented before or after the target disappeared (Experiment 2), or when the onset of the long-lasting sound was not synchronized with that of the visual motion (Experiments 3 and 4). These findings suggest that visual motion representation can be modulated by a sound if the visual motion information is firmly associated with the auditory information.


Subject(s)
Association , Attention , Auditory Perception , Cues , Motion Perception , Orientation , Pattern Recognition, Visual , Sound Localization , Time Perception , Adolescent , Adult , Female , Humans , Male , Psychophysics , Young Adult
11.
Vision Res ; 50(20): 2093-9, 2010 Sep 24.
Article in English | MEDLINE | ID: mdl-20682329

ABSTRACT

An abrupt change in a visual attribute (size) of apparently moving visual stimuli extends the time the changed stimulus remains visible even after its physical termination (visible persistence). In this study, we show that this elongation of visible persistence is enhanced by an abrupt change in an attribute (frequency) of sounds presented along with the size-changed, apparently moving visual stimuli. The auditory effect disappears when the sounds are not associated with the visual stimuli. These results suggest that a change in an auditory attribute can contribute to the establishment of a new object representation and that object-level audio-visual interactions can occur in motion perception.


Subject(s)
Motion Perception/physiology , Sound , Acoustic Stimulation , Humans , Photic Stimulation/methods
12.
Neurosci Lett ; 479(3): 221-5, 2010 Aug 02.
Article in English | MEDLINE | ID: mdl-20639000

ABSTRACT

The alternation of sounds in the left and right ears induces motion perception of a static visual stimulus (SIVM: sound-induced visual motion). In that case, binaural cues were of considerable benefit in perceiving the locations and movements of the sounds. The present study investigated how a spectral cue - another important cue for sound localization and motion perception - contributes to the SIVM. In the experiments, two alternating sound sources aligned in the vertical plane were presented in synchrony with a static visual stimulus. We found that the proportion of SIVM reports and the magnitude of the perceived movements of the static visual stimulus increased with increasing retinal eccentricity (1.875-30 degrees), indicating the influence of the spectral cue on the SIVM. These findings suggest that the SIVM generalizes to the whole two-dimensional audio-visual space, and strongly imply that there are common neural substrates for auditory and visual motion perception in the brain.


Subject(s)
Motion Perception , Sound , Acoustic Stimulation , Cues , Humans
13.
PLoS One ; 4(12): e8188, 2009 Dec 07.
Article in English | MEDLINE | ID: mdl-19997648

ABSTRACT

BACKGROUND: Audition provides important cues with regard to stimulus motion, although vision may provide the most salient information. It has been reported that a sound of fixed intensity tends to be judged as decreasing in intensity after adaptation to looming visual stimuli, or as increasing in intensity after adaptation to receding visual stimuli. This audiovisual interaction in motion aftereffects indicates that there are multimodal contributions to motion perception at early levels of sensory processing. However, there has been no report that sounds can induce the perception of visual motion. METHODOLOGY/PRINCIPAL FINDINGS: A visual stimulus blinking at a fixed location was perceived to be moving laterally when the flash onset was synchronized with an alternating left-right sound source. This illusory visual motion was strengthened with increasing retinal eccentricity (2.5 deg to 20 deg) and occurred more frequently when the onsets of the audio and visual stimuli were synchronized. CONCLUSIONS/SIGNIFICANCE: We clearly demonstrated that the alternation of sound location induces illusory visual motion when vision cannot provide accurate spatial information. The present findings strongly suggest that the neural representations of auditory and visual motion processing can bias each other, which yields the best estimates of external events in a complementary manner.


Subject(s)
Motion Perception/physiology , Sound , Acoustic Stimulation , Humans , Photic Stimulation
14.
Neuroreport ; 20(14): 1231-4, 2009 Sep 23.
Article in English | MEDLINE | ID: mdl-19629016

ABSTRACT

Effects of auditory training with bimodal audio-visual stimuli on monomodal aural speech intelligibility were examined in individuals with normal hearing using highly degraded noise-vocoded speech sounds. A visual cue presented simultaneously with the auditory stimuli during the training session significantly improved auditory speech intelligibility, not only for words used in the training session but also for untrained words, compared with auditory training using auditory stimuli alone. Visual information is generally considered to complement the insufficient speech information conveyed by the auditory system during audio-visual speech perception. However, the present results reveal another beneficial effect of audio-visual training: the visual cue enhances the auditory adaptation process to degraded new speech sounds, including sounds different from those presented during bimodal training.
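
Noise-vocoded speech replaces the spectral fine structure of speech with noise while preserving the slowly varying amplitude envelope in a small number of frequency bands. The sketch below shows one common way to construct such stimuli; the band edges, filter order, and the synthetic input signal are illustrative assumptions, not the stimulus code used in the study.

```python
# Minimal noise-vocoder sketch (band edges, filter order, and the synthetic input are
# illustrative assumptions).
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def noise_vocode(speech, fs, band_edges):
    """Replace the fine structure in each band with noise, keeping the band envelope."""
    rng = np.random.default_rng(0)
    carrier = rng.standard_normal(len(speech))
    out = np.zeros_like(speech)
    for lo, hi in zip(band_edges[:-1], band_edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, speech)
        envelope = np.abs(hilbert(band))                  # amplitude envelope of the band
        out += sosfiltfilt(sos, envelope * sosfiltfilt(sos, carrier))
    return out

fs = 16000
t = np.arange(0, 1.0, 1 / fs)
speech = np.sin(2 * np.pi * 200 * t) * (1 + np.sin(2 * np.pi * 3 * t))  # stand-in signal
vocoded = noise_vocode(speech, fs, band_edges=[100, 300, 700, 1500, 3000, 6000])
```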


Subject(s)
Adaptation, Psychological , Learning , Practice, Psychological , Speech Perception , Visual Perception , Acoustic Stimulation , Adult , Analysis of Variance , Comprehension , Cues , Feedback, Psychological , Female , Humans , Male , Photic Stimulation , Speech
15.
Neuroreport ; 20(5): 473-7, 2009 Mar 25.
Article in English | MEDLINE | ID: mdl-19240661

ABSTRACT

This study investigated the effects of intermodal timing differences and speed differences on the word intelligibility of auditory-visual speech. Words were presented under visual-only, auditory-only, and auditory-visual conditions. Two types of auditory-visual conditions were used: asynchronous and expansion conditions. In the asynchronous conditions, the audio lag was 0-400 ms. In the expansion conditions, the auditory signal was time-expanded (by 0-400 ms), whereas the visual signal was kept at the original speed. Results showed that word intelligibility was higher in the auditory-visual conditions than in the auditory-only condition. Analysis of the auditory-visual benefit revealed that the benefit at the end of words declined as the amount of time expansion increased, whereas it did not decline in the asynchronous conditions.
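
The expansion conditions require lengthening the audio without changing its pitch while the video runs at its original speed. A minimal sketch of such a manipulation, assuming a phase-vocoder time stretch (here via librosa) and a synthetic stand-in for a recorded word, is given below; it is not the authors' stimulus-preparation pipeline.

```python
# Minimal sketch: time-expanding an audio signal by 400 ms without changing its pitch.
# The synthetic "word" and the use of librosa's time stretch are assumptions.
import numpy as np
import librosa

fs = 16000
t = np.arange(0, 1.0, 1 / fs)
word = np.sin(2 * np.pi * 220 * t) * np.hanning(len(t))   # stand-in for a recorded word

expansion = 0.400                                  # 400-ms expansion, as in the largest condition
rate = len(word) / (len(word) + expansion * fs)    # rate < 1 lengthens the signal
word_expanded = librosa.effects.time_stretch(word, rate=rate)
print(f"duration: {len(word) / fs:.2f} s -> {len(word_expanded) / fs:.2f} s")
```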


Subject(s)
Auditory Perception , Speech Intelligibility , Speech Perception , Visual Perception , Acoustic Stimulation , Analysis of Variance , Face , Humans , Photic Stimulation , Young Adult
16.
J Acoust Soc Am ; 116(2): 918-33, 2004 Aug.
Article in English | MEDLINE | ID: mdl-15376658

ABSTRACT

Equal-loudness-level contours provide the foundation for theoretical and practical analyses of the intensity-frequency characteristics of auditory systems. Since 1956, equal-loudness-level contours based on the free-field measurements of Robinson and Dadson [Br. J. Appl. Phys. 7, 166-181 (1956)] have been widely accepted. However, in 1987 some questions about the general applicability of these contours were published [H. Fastl and E. Zwicker, Fortschritte der Akustik, DAGA '87, pp. 189-193 (1987)]. As a result, a new international effort to measure equal-loudness-level contours was undertaken. The present paper brings together the results of 12 studies, starting in the mid-1980s, to arrive at a new set of contours. The contours estimated in this study are compared with four sets of classic contours taken from the literature. The contours described by Fletcher and Munson [J. Acoust. Soc. Am. 5, 82-108 (1933)] exhibit some overall similarity to the contours proposed here in the mid-frequency range up to 60 phons. The contours described by Robinson and Dadson exhibit clear differences from the new contours. These differences are most pronounced below 500 Hz, where the discrepancy is often as large as 14 dB.


Subject(s)
Auditory Threshold/physiology , Loudness Perception/physiology , Hearing Tests , Humans , Psychoacoustics
17.
Int J Audiol ; 42(6): 297-302, 2003 Sep.
Article in English | MEDLINE | ID: mdl-14570236

ABSTRACT

Diplacusis is defined as the phenomenon of hearing the same tone at different pitches in the two ears. Although binaural pitch-matching using the method of adjustment has been employed in most studies, it is sometimes hard for subjects with impaired hearing to judge 'equal pitch' as a single frequency. To resolve this problem, a modified pitch-matching test was developed, in which the relation of pitch sensation between the two ears is assessed as a matched frequency 'range' using a randomized maximum likelihood sequential procedure. Eight subjects with unilaterally impaired hearing, as well as eight normal-hearing subjects, were examined to evaluate this new test procedure. In the present method, the matched frequency is assessed as a frequency range within which subjects cannot judge whether the pitch of the signal in one ear is higher or lower than that in the opposite ear. This method appears to be useful for assessing the characteristics of diplacusis in subjects with impaired hearing as well as in normal-hearing subjects.
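
To illustrate the idea of reporting a matched frequency 'range' rather than a single matched frequency, the sketch below simulates a listener whose 'higher/lower' judgments follow a logistic psychometric function and extracts the band of test frequencies where neither judgment is made reliably. The listener model, the 0.25-0.75 criterion, and all parameter values are illustrative assumptions, not the randomized maximum likelihood sequential procedure used in the study.

```python
# Minimal sketch: extracting a matched frequency range from a simulated listener.
# The logistic listener model and the 0.25-0.75 criterion are illustrative assumptions.
import numpy as np

def p_higher(f_test, f_match=1000.0, slope=0.02):
    """Probability that the test-ear tone is judged higher in pitch than the
    reference-ear tone, modelled as a logistic function of frequency (Hz)."""
    return 1.0 / (1.0 + np.exp(-slope * (f_test - f_match)))

# Scan candidate test frequencies and keep those where the simulated listener cannot
# reliably judge "higher" or "lower" -- the matched frequency range.
freqs = np.arange(900.0, 1100.0, 1.0)
probs = p_higher(freqs)
uncertain = freqs[(probs > 0.25) & (probs < 0.75)]
print(f"matched range: {uncertain.min():.0f}-{uncertain.max():.0f} Hz")
```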


Subject(s)
Hearing Disorders/diagnosis , Hearing Tests/methods , Pitch Perception , Adult , Female , Hearing Disorders/physiopathology , Humans , Male
18.
Tohoku J Exp Med ; 200(3): 129-35, 2003 Jul.
Article in English | MEDLINE | ID: mdl-14521255

ABSTRACT

It is well known that sound presented to the contralateral ear can elicit activity of the olivocochlear (OC) efferent system. In the present study, the effects of adding contralateral noise on psychophysical measurements of auditory thresholds were investigated in human subjects with normal hearing. The results indicate that the addition of contralateral noise at a level of only 20 or 30 dB sound pressure level (SPL) may cause a significant elevation of the auditory threshold in the mid-frequency region (usually 2-3 dB). When the level of the contralateral noise was raised, the elevation of the auditory threshold tended to be larger and the affected frequency region became wider. Although other factors that elevate auditory thresholds, such as cross-talk effects and the acoustic reflex of the middle-ear muscles, may be involved in this paradigm, especially when higher levels of contralateral noise are used, it is important to know the degree of OC-mediated threshold elevation in routine audiometric measurements.


Subject(s)
Auditory Threshold/physiology , Noise , Perceptual Masking/physiology , Audiometry, Pure-Tone/methods , Auditory Perception/physiology , Cochlear Nerve/physiology , Ear/physiology , Electronic Data Processing , Female , Hearing/physiology , Humans , Male , Psychoacoustics