Results 1 - 4 of 4
1.
J Assoc Res Otolaryngol; 2(1): 1-21, 2001 Mar.
Article in English | MEDLINE | ID: mdl-11545146

ABSTRACT

Specific cues in a sound signal are naturally linked to certain parameters in acoustic space. In the barn owl, interaural time difference (ITD) varies mainly with azimuth, while interaural level difference (ILD) varies mainly with elevation. Previous data suggested that ITD is indeed the main cue for azimuthal sound localization in this species, while ILD is an important cue for elevational sound localization. The exact contributions of these parameters could be tested only indirectly, because no stimulus could be generated that both contained all relevant spatial information and allowed a clean separation of these parameters. Virtual auditory worlds offer this opportunity. Here we show that barn owls responded to azimuthal variations in virtual space in the same way as to variations in free-field stimuli. We interpret the increase of turning angle with sound-source azimuth (up to +/- 140 degrees) as indicating that the owls did not experience front/back confusions with virtual stimuli. We then separated the influence of ITD from that of all other stimulus parameters by fixing the overall ITD in virtual stimuli to a constant value (+100 µs or -100 µs) while leaving all other sound characteristics unchanged. This manipulation influenced both azimuthal and elevational components of head turns. Since the owls' azimuthal head-turn amplitude always matched the value signified by the ITD, these data demonstrate that azimuthal sound localization is influenced only by ITD, both in the frontal hemisphere and in large parts of the rear hemisphere. ILDs did not influence the azimuthal components of head turns. While response latency to normal virtual stimuli was largely independent of stimulus position, response delays of the head turns became longer if the ITD information pointed into a different hemisphere than the other cues of the sound.
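The core manipulation in this study, imposing an interaural delay on a binaural stimulus while leaving all other cues untouched, can be sketched in a few lines. This is a minimal illustration only: the function name and sign convention (positive ITD delays the left ear) are assumptions, the delay is restricted to whole samples, and fully fixing the overall ITD as in the study would additionally require estimating and removing the stimulus's natural ITD first.

```python
import numpy as np

def impose_itd(left, right, itd_samples):
    """Add a fixed interaural time difference to a stereo pair by
    delaying one channel by a whole number of samples.  Positive
    itd_samples delays the left ear (illustrative convention); the
    other channel is zero-padded so both outputs stay equal length.
    Spectra and levels within each channel are left unchanged."""
    if itd_samples >= 0:
        left = np.concatenate([np.zeros(itd_samples), left])
        right = np.concatenate([right, np.zeros(itd_samples)])
    else:
        right = np.concatenate([np.zeros(-itd_samples), right])
        left = np.concatenate([left, np.zeros(-itd_samples)])
    return left, right
```

At a typical stimulus sampling rate of 50 kHz, one sample corresponds to 20 µs, so a +100 µs ITD would be a 5-sample delay; sub-sample ITDs would require fractional-delay filtering.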


Subject(s)
Ear/physiology, Head/physiology, Motor Activity/physiology, Sound Localization/physiology, Strigiformes/physiology, User-Computer Interface, Animals, Dominance, Cerebral, Models, Psychological, Psychoacoustics, Reaction Time/physiology, Reference Values, Time Factors
2.
J Comp Physiol A; 187(3): 225-33, 2001 Apr.
Article in English | MEDLINE | ID: mdl-11401202

ABSTRACT

Interaural level differences play an important role for elevational sound localization in barn owls. The changes of this cue with sound location are complex and frequency dependent. We exploited the opportunities offered by the virtual space technique to investigate the behavioral relevance of the overall interaural level difference by fixing this parameter in virtual stimuli to a constant value or introducing additional broadband level differences to normal virtual stimuli. Frequency-specific monaural cues in the stimuli were not manipulated. We observed an influence of the broadband interaural level differences on elevational, but not on azimuthal sound localization. Since results obtained with our manipulations explained only part of the variance in elevational turning angle, we conclude that frequency-specific cues are also important. The behavioral consequences of changes of the overall interaural level difference in a virtual sound depended on the combined interaural time difference contained in the stimulus, indicating an indirect influence of temporal cues on elevational sound localization as well. Thus, elevational sound localization is influenced by a combination of many spatial cues including frequency-dependent and temporal features.
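The abstract's second manipulation, introducing an additional broadband interaural level difference without altering frequency-specific monaural cues, amounts to a frequency-independent gain applied symmetrically to the two channels. A minimal sketch (the function name and the sign convention, positive ILD favoring the right ear, are illustrative assumptions, not from the paper):

```python
import numpy as np

def add_broadband_ild(left, right, ild_db):
    """Add a broadband interaural level difference of ild_db decibels
    (positive = right ear louder; illustrative convention) by scaling
    the channels symmetrically.  Each channel's spectral shape, and
    hence the frequency-specific monaural cues, is unchanged up to an
    overall gain."""
    g = 10.0 ** (ild_db / 40.0)   # half the difference (in dB) per ear
    return left / g, right * g
```

Splitting the gain between the ears keeps the binaural average level approximately constant, so the manipulation changes the level *difference* rather than overall loudness.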


Subject(s)
Auditory Perception/physiology, Locomotion, Strigiformes/physiology, Acoustics, Animals, Behavior, Animal, Head, Posture, User-Computer Interface
3.
J Neurosci Methods; 106(1): 29-38, 2001 Mar 30.
Article in English | MEDLINE | ID: mdl-11248338

ABSTRACT

The study of spatial processing in the auditory system usually requires complex experimental setups, using arrays of speakers or speakers mounted on moving arms. These devices, while allowing precision in the presentation of the spatial attributes of sound, are complex, expensive and limited. Alternative approaches rely on virtual space sound delivery. In this paper, we describe a virtual space algorithm that enables accurate reconstruction of eardrum waveforms for arbitrary sound sources moving along arbitrary trajectories in space. A physical validation of the synthesis algorithm is performed by comparing waveforms recorded during real motion with waveforms synthesized by the algorithm. As a demonstration of possible applications of the algorithm, virtual motion stimuli are used to reproduce psychophysical results in humans and for studying responses of barn owls to auditory motion stimuli.
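The abstract does not spell out the synthesis algorithm, but the general approach it describes, reconstructing eardrum waveforms for a moving source, can be sketched as piecewise-stationary filtering: cut the source signal into short blocks, filter each block with the head-related impulse response (HRIR) pair for the source position current at that time, and overlap-add the results. This is a simplified illustration, not the paper's algorithm: function names are invented, the nearest measured position stands in for interpolation, and delay-and-gain impulse responses stand in for measured HRIRs.

```python
import numpy as np

def render_moving_source(sig, hrir_l, hrir_r, pos_at, block=256):
    """Piecewise-stationary virtual-space rendering (sketch).
    sig:     mono source signal
    hrir_l/hrir_r: (n_positions, taps) banks of per-ear impulse responses
    pos_at:  callable mapping a sample index to the row index of the
             source position current at that time
    Returns the two synthesized eardrum waveforms."""
    n, taps = len(sig), hrir_l.shape[1]
    out_l = np.zeros(n + taps - 1)
    out_r = np.zeros(n + taps - 1)
    for start in range(0, n, block):
        seg = sig[start:start + block]
        k = pos_at(start)                     # position index for this block
        span = slice(start, start + len(seg) + taps - 1)
        out_l[span] += np.convolve(seg, hrir_l[k])   # overlap-add per ear
        out_r[span] += np.convolve(seg, hrir_r[k])
    return out_l, out_r
```

For a static source this reduces exactly to ordinary convolution with one HRIR pair; for a moving source, shorter blocks (or cross-fading between positions) trade computation for smoother trajectories.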


Subject(s)
Acoustic Stimulation/methods, Algorithms, Motion, Sound Localization/physiology, User-Computer Interface, Animals, Humans, Strigiformes/physiology
4.
Trends Neurosci; 20(12): 583-8, 1997 Dec.
Article in English | MEDLINE | ID: mdl-9416672

ABSTRACT

Motion provides one of the most important cues for survival, because it helps to break the camouflage of a predator or a prey and because it allows predictions about the future path of an object. Recent data on the processing of acoustic motion have yielded some astonishing findings, suggesting that the psychophysical, neurological and neurophysiological mechanisms underlying the detection and representation of acoustic motion are quite similar to those underlying the detection and representation in other modalities, especially in vision. A further comparison of these similarities and differences with respect to the different environmental constraints posed for the different modalities may help in understanding general problems associated with motion computations.


Subject(s)
Hearing/physiology, Motion Perception/physiology, Animals, Humans