Results 1 - 13 of 13
1.
Hear Res; 438: 108857, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37639922

ABSTRACT

Perception is sensitive to statistical regularities in the environment, including temporal characteristics of sensory inputs. Interestingly, implicit learning of temporal patterns in one modality can also improve their processing in another modality. However, it is unclear how cross-modal learning transfer affects neural responses to sensory stimuli. Here, we recorded neural activity of human volunteers using electroencephalography (EEG), while participants were exposed to brief sequences of randomly timed auditory or visual pulses. Some trials consisted of a repetition of the temporal pattern within the sequence, and subjects were tasked with detecting these trials. Unknown to the participants, some trials reappeared throughout the experiment across both modalities (Transfer) or only within a modality (Control), enabling implicit learning in one modality and its transfer. Using a novel method of analysis of single-trial EEG responses, we showed that learning temporal structures within and across modalities is reflected in neural learning curves. These putative neural correlates of learning transfer were similar both when temporal information learned in audition was transferred to visual stimuli and vice versa. The modality-specific mechanisms for learning of temporal information and general mechanisms which mediate learning transfer across modalities had distinct physiological signatures: temporal learning within modalities relied on modality-specific brain regions while learning transfer affected beta-band activity in frontal regions.


Subject(s)
Auditory Perception; Learning; Humans; Electroencephalography; Frontal Lobe; Healthy Volunteers
2.
Hear Res; 409: 108331, 2021 Sep 15.
Article in English | MEDLINE | ID: mdl-34416492

ABSTRACT

While a large body of literature has examined the encoding of binaural spatial cues in the auditory midbrain, studies that ask how quantitative measures of spatial tuning in midbrain neurons compare with an animal's psychoacoustic performance remain rare. Researchers have tried to explain deficits in spatial hearing in certain patient groups, such as binaural cochlear implant users, in terms of apparent reductions in the spatial tuning of midbrain neurons in animal models. However, the quality of spatial tuning can be quantified in many different ways, and in the absence of evidence that a given neural tuning measure correlates with psychoacoustic performance, the interpretation of such findings remains tentative. Here, we characterize ITD tuning in the rat inferior colliculus (IC) to acoustic pulse train stimuli with varying envelopes and rates, and explore whether the quality of tuning correlates with behavioral performance. We quantified both mutual information (MI) and neural d' as measures of ITD sensitivity. Neural d' values paralleled behavioral ones, declining with increasing click rates or when envelopes changed from rectangular to Hanning windows, and they correlated much better with behavioral performance than MI did. Meanwhile, MI values were larger in an older, more experienced cohort of animals than in naive animals, but neural d' did not differ between cohorts. However, the results obtained with neural d' and MI were highly correlated when ITD values were coded simply as left- or right-ear leading, rather than as specific ITD values. Thus, neural measures of lateralization ability (e.g., d' or left/right MI) appear to be highly predictive of psychoacoustic performance in a two-alternative forced choice task.
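The neural d' used here has a standard form: the difference between the mean responses to the two ITD conditions, divided by their pooled standard deviation. A minimal sketch (the spike counts and pooled-variance formula are illustrative assumptions, not the paper's actual analysis pipeline):

```python
import numpy as np

def neural_dprime(counts_a, counts_b):
    """Neural d': separation of two spike-count distributions
    (e.g. left- vs right-leading ITD) in pooled-SD units."""
    mu_a, mu_b = np.mean(counts_a), np.mean(counts_b)
    var_a, var_b = np.var(counts_a, ddof=1), np.var(counts_b, ddof=1)
    return (mu_b - mu_a) / np.sqrt(0.5 * (var_a + var_b))

# toy spike counts from a hypothetical IC neuron over repeated trials
rng = np.random.default_rng(0)
left = rng.poisson(8, size=200)    # ITD leading in the left ear
right = rng.poisson(14, size=200)  # ITD leading in the right ear
print(f"neural d' = {neural_dprime(left, right):.2f}")
```

In a two-alternative task a d' of 1 corresponds to roughly 76% correct, which is why a d'-style neural measure maps naturally onto behavioral threshold criteria.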


Subject(s)
Cochlear Implantation; Cochlear Implants; Inferior Colliculi; Acoustic Stimulation; Animals; Hearing; Rats; Sound Localization
3.
R Soc Open Sci; 7(3): 191194, 2020 Mar.
Article in English | MEDLINE | ID: mdl-32269783

ABSTRACT

Previous research has shown that musical beat perception is a surprisingly complex phenomenon involving widespread neural coordination across higher-order sensory, motor and cognitive areas. However, how low-level auditory processing necessarily shapes these dynamics, and therefore perception, is not well understood. Here, we present evidence that the auditory cortical representation of music, even in the absence of motor or top-down activations, already favours the beat that will be perceived. Extracellular firing rates in the rat auditory cortex were recorded in response to 20 musical excerpts diverse in tempo and genre, for which musical beat perception had been characterized by the tapping behaviour of 40 human listeners. We found that firing rates in the rat auditory cortex were on average higher on the beat than off the beat. This 'neural emphasis' distinguished the perceived beat from other possible interpretations, was predictive of the degree of tapping consensus across human listeners, and was accounted for by a spectrotemporal receptive field model. These findings strongly suggest that the 'bottom-up' processing of music performed by the auditory system predisposes the timing and clarity of the perceived musical beat.
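A spectrotemporal receptive field (STRF) model of the kind invoked above predicts a neuron's firing rate by summing a time-frequency kernel against the stimulus spectrogram at every lag. A minimal linear-STRF sketch (the array shapes and the impulse check are illustrative, not the study's fitted model):

```python
import numpy as np

def strf_predict(strf, spectrogram):
    """Linear STRF prediction:
    r(t) = sum_f sum_tau strf[f, tau] * spectrogram[f, t - tau]."""
    n_t = spectrogram.shape[1]
    rate = np.zeros(n_t)
    for f in range(strf.shape[0]):
        # convolving each frequency channel with its kernel implements the lagged sum
        rate += np.convolve(spectrogram[f], strf[f])[:n_t]
    return rate

# sanity check: an impulse in one channel reproduces that channel's kernel
strf = np.array([[1.0, 2.0, 3.0]])
spec = np.array([[1.0, 0.0, 0.0, 0.0, 0.0]])
print(strf_predict(strf, spec))  # [1. 2. 3. 0. 0.]
```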

4.
J Acoust Soc Am; 145(5): EL341, 2019 May.
Article in English | MEDLINE | ID: mdl-31153346

ABSTRACT

Currently, there is controversy around whether rats can use interaural time differences (ITDs) to localize sound. Here, naturalistic pulse train stimuli were used to evaluate the rat's sensitivity to onset and ongoing ITDs using a two-alternative forced choice sound lateralization task. Pulse rates between 50 Hz and 4.8 kHz with rectangular or Hanning windows were delivered with ITDs between ±175 µs over a near-field acoustic setup. Similar to other mammals, rats performed with 75% accuracy at ∼50 µs ITD, demonstrating that rats are highly sensitive to envelope ITDs.
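The 75%-correct criterion used above is the conventional way to read a lateralization threshold off a psychometric function. A sketch with hypothetical data points (the numbers below are illustrative, not the paper's measurements):

```python
import numpy as np

# hypothetical lateralization performance: percent correct vs ITD magnitude (µs)
itd_us = np.array([10.0, 25.0, 50.0, 100.0, 175.0])
pct_correct = np.array([55.0, 65.0, 76.0, 90.0, 97.0])

# threshold = ITD at which interpolated performance crosses 75% correct
threshold = np.interp(75.0, pct_correct, itd_us)
print(f"75%-correct ITD threshold ≈ {threshold:.0f} µs")
```

A full analysis would fit a sigmoid rather than interpolate linearly, but the threshold definition is the same.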


Subject(s)
Auditory Pathways/physiology; Reaction Time; Sound Localization/physiology; Sound; Acoustic Stimulation; Animals; Behavior, Animal/physiology; Female; Rats, Wistar
5.
J Acoust Soc Am; 145(3): EL222, 2019 Mar.
Article in English | MEDLINE | ID: mdl-31067970

ABSTRACT

Spatially rendering sounds using head-related transfer functions (HRTFs) is an important part of creating immersive audio experiences for virtual reality applications. However, elevation perception remains challenging when generic, non-personalized HRTFs are used. This study investigated whether digital audio effects applied to a generic set of HRTFs could improve sound localization in the vertical plane. Several of the tested effects significantly improved elevation judgment, and trial-by-trial variability in spectral energy between 2 and 10 kHz correlated strongly with perceived elevation. Digital audio effects may therefore be a promising strategy to improve elevation perception where personalized HRTFs are not available.
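The spectral-energy measure mentioned above (energy between 2 and 10 kHz) can be computed directly from an FFT of the rendered signal. A minimal sketch under that assumption (the test tone and band edges are illustrative, not the study's stimuli):

```python
import numpy as np

def band_energy_db(signal, fs, f_lo=2000.0, f_hi=10000.0):
    """Energy (dB) in a frequency band, e.g. the 2-10 kHz region
    associated with elevation cues."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    energy = np.sum(np.abs(spectrum[band]) ** 2)
    return 10 * np.log10(energy + 1e-12)

# toy check: boosting a 6 kHz tone by 20 dB raises the in-band energy by 20 dB
fs = 48000
t = np.arange(fs) / fs
quiet = 0.1 * np.sin(2 * np.pi * 6000 * t)
loud = 1.0 * np.sin(2 * np.pi * 6000 * t)
print(band_energy_db(loud, fs) - band_energy_db(quiet, fs))
```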

7.
Neuroscience; 389: 4-18, 2018 Oct 1.
Article in English | MEDLINE | ID: mdl-29108832

ABSTRACT

Music is a curious example of a temporally patterned acoustic stimulus, and a compelling pan-cultural phenomenon. This review strives to bring some insights from decades of music psychology and sensorimotor synchronization (SMS) literature into the mainstream auditory domain, arguing that musical rhythm perception is shaped in important ways by temporal processing mechanisms in the brain. The feature that unites these disparate disciplines is an appreciation of the central importance of timing, sequencing, and anticipation. Perception of musical rhythms relies on an ability to form temporal predictions, a general feature of temporal processing that is equally relevant to auditory scene analysis, pattern detection, and speech perception. By bringing together findings from the music and auditory literature, we hope to inspire researchers to look beyond the conventions of their respective fields and consider the cross-disciplinary implications of studying auditory temporal sequence processing. We begin by highlighting music as an interesting sound stimulus that may provide clues to how temporal patterning in sound drives perception. Next, we review the SMS literature and discuss possible neural substrates for the perception of, and synchronization to, musical beat. We then move away from music to explore the perceptual effects of rhythmic timing in pattern detection, auditory scene analysis, and speech perception. Finally, we review the neurophysiology of general timing processes that may underlie aspects of the perception of rhythmic patterns. We conclude with a brief summary and outlook for future research.


Subject(s)
Auditory Perception/physiology; Music; Animals; Humans; Psychoacoustics; Time Factors
8.
Proc Biol Sci; 284(1866), 2017 Nov 15.
Article in English | MEDLINE | ID: mdl-29118141

ABSTRACT

The ability to spontaneously feel a beat in music is a phenomenon widely believed to be unique to humans. Though beat perception involves the coordinated engagement of sensory, motor and cognitive processes in humans, the contribution of low-level auditory processing to the activation of these networks in a beat-specific manner is poorly understood. Here, we present evidence from a rodent model that midbrain preprocessing of sounds may already be shaping where the beat is ultimately felt. For the tested set of musical rhythms, on-beat sounds on average evoked higher firing rates than off-beat sounds, and this difference was a defining feature of the set of beat interpretations most commonly perceived by human listeners over others. Basic firing rate adaptation provided a sufficient explanation for these results. Our findings suggest that midbrain adaptation, by encoding the temporal context of sounds, creates points of neural emphasis that may influence the perceptual emergence of a beat.
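Firing-rate adaptation of the kind proposed above can be sketched with a two-parameter model: each sound depresses the neuron's gain, which then recovers exponentially, so an event arriving shortly after another evokes a weaker response than a well-separated one. Parameter values below are illustrative assumptions, not the paper's fitted model:

```python
import math

def adapted_rates(onsets, tau=0.3, depression=0.6):
    """Relative response to each sound onset under simple adaptation:
    gain drops by `depression` after every event and recovers
    exponentially toward 1 with time constant `tau` (seconds)."""
    gain, last_t, rates = 1.0, None, []
    for t in onsets:
        if last_t is not None:
            gain = 1.0 - (1.0 - gain) * math.exp(-(t - last_t) / tau)
        rates.append(gain)
        gain *= 1.0 - depression
        last_t = t
    return rates

# a sound 50 ms after another is suppressed; one arriving 950 ms later has recovered
print(adapted_rates([0.0, 0.05, 1.0]))
```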


Subject(s)
Auditory Perception/physiology; Gerbillinae/physiology; Inferior Colliculi/physiology; Music; Psychomotor Performance; Acoustic Stimulation; Adult; Animals; Female; Humans; Male; Middle Aged; Young Adult
10.
Neuron; 89(6): 1343-1354, 2016 Mar 16.
Article in English | MEDLINE | ID: mdl-26948895

ABSTRACT

Complex cognitive processes require sophisticated local processing but also interactions between distant brain regions. It is therefore critical to be able to study distant interactions between local computations and the neural representations they act on. Here we report two anatomically and computationally distinct learning signals in lateral orbitofrontal cortex (lOFC) and the dopaminergic ventral midbrain (VM) that predict trial-by-trial changes to a basic internal model in hippocampus. To measure local computations during learning and their interaction with neural representations, we coupled computational fMRI with trial-by-trial fMRI suppression. We find that suppression in a medial temporal lobe network changes trial-by-trial in proportion to stimulus-outcome associations. During interleaved choice trials, we identify learning signals that relate to outcome type in lOFC and to reward value in VM. These intervening choice feedback signals predicted the subsequent change to hippocampal suppression, suggesting a convergence of signals that update the flexible representation of stimulus-outcome associations.


Subject(s)
Hippocampus/physiology; Learning/physiology; Prefrontal Cortex/physiology; Adult; Brain Mapping; Choice Behavior; Computer Simulation; Feedback; Female; Functional Laterality; Healthy Volunteers; Hippocampus/blood supply; Humans; Linear Models; Male; Mesencephalon/blood supply; Mesencephalon/physiology; Models, Biological; Neural Pathways/blood supply; Neural Pathways/physiology; Oxygen/blood; Photic Stimulation; Predictive Value of Tests; Prefrontal Cortex/blood supply; Young Adult
11.
Front Neurosci; 10: 9, 2016.
Article in English | MEDLINE | ID: mdl-26858589

ABSTRACT

This study investigates the influence of temporal regularity on human listeners' ability to detect a repeating noise pattern embedded in statistically identical non-repeating noise. Human listeners were presented with white noise stimuli that either contained a frozen segment of noise that repeated in a temporally regular or irregular manner, or did not contain any repetition at all. Subjects were instructed to respond as soon as they detected any repetition in the stimulus. Pattern detection performance was best when repeated targets occurred in a temporally regular manner, suggesting that temporal regularity plays a facilitative role in pattern detection. A modulation filterbank model could account for these results.
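Stimuli of the kind described above can be sketched by embedding a frozen noise segment, repeated at regular or irregular intervals, inside statistically identical fresh noise. A minimal generator (durations and structure are illustrative assumptions, not the study's exact stimulus code):

```python
import numpy as np

def repeating_noise(fs=44100, seg_ms=500, n_repeats=4, regular=True, seed=0):
    """White noise containing a frozen segment that recurs at regular or
    irregular intervals, separated by fresh (non-repeating) noise."""
    rng = np.random.default_rng(seed)
    n = int(fs * seg_ms / 1000)
    frozen = rng.standard_normal(n)
    parts = []
    for _ in range(n_repeats):
        gap = n if regular else int(rng.integers(n // 2, 3 * n // 2))
        parts += [frozen, rng.standard_normal(gap)]
    return np.concatenate(parts)

x = repeating_noise(regular=True)
n = 22050  # samples per 500 ms segment at 44.1 kHz
print(np.allclose(x[:n], x[2 * n:3 * n]))  # frozen segment recurs: True
```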

12.
Elife; 4, 2015 Jun 16.
Article in English | MEDLINE | ID: mdl-26077825

ABSTRACT

Behavioral strategies employed for chemotaxis have been described across phyla, but the sensorimotor basis of this phenomenon has seldom been studied in naturalistic contexts. Here, we examine how signals experienced during free olfactory behaviors are processed by first-order olfactory sensory neurons (OSNs) of the Drosophila larva. We find that OSNs can act as differentiators that transiently normalize stimulus intensity, a property potentially derived from a combination of integral feedback and feed-forward regulation of olfactory transduction. In olfactory virtual reality experiments, we report that high activity levels of the OSN suppress turning, whereas low activity levels facilitate turning. Using a generalized linear model, we explain how peripheral encoding of olfactory stimuli modulates the probability of switching from a run to a turn. Our work clarifies the link between computations carried out at the sensory periphery and action selection underlying navigation in odor gradients.
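A generalized linear model of the kind described above links sensory drive to the run-to-turn switch probability through a logistic function. A minimal sketch (the weights are hypothetical placeholders, not the paper's fitted coefficients):

```python
import math

def turn_probability(osn_rate, w0=0.5, w1=-2.0):
    """Logistic (GLM-style) link from normalized OSN activity to the
    probability of switching from a run to a turn. A negative w1
    encodes 'high activity suppresses turning'."""
    return 1.0 / (1.0 + math.exp(-(w0 + w1 * osn_rate)))

# turning is suppressed under high OSN drive (e.g. moving up an odor gradient)
print(turn_probability(0.0) > turn_probability(1.0))  # True
```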


Subject(s)
Chemotaxis/physiology; Drosophila/physiology; Olfactory Receptor Neurons/physiology; Orientation/physiology; Sensory Receptor Cells/physiology; Smell/physiology; Action Potentials/physiology; Algorithms; Animals; Diffusion; Larva/physiology; Models, Theoretical; Motor Activity/physiology; Odorants
13.
J Acoust Soc Am; 134(1): EL98-104, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23862914

ABSTRACT

This study reports a role of temporal regularity on the perception of auditory streams. Listeners were presented with two-tone sequences in an A-B-A-B rhythm that was either regular or had a controlled amount of temporal jitter added independently to each of the B tones. Subjects were asked to report whether they perceived one or two streams. The percentage of trials in which two streams were reported substantially and significantly increased with increasing amounts of temporal jitter. This suggests that temporal predictability may serve as a binding cue during auditory scene analysis.
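The jitter manipulation described above can be sketched as follows: A tones keep a fixed inter-onset interval while each B tone is displaced independently by a uniform offset. All parameter values below are illustrative assumptions:

```python
import numpy as np

def abab_onsets(n_pairs=10, ioi=0.125, jitter=0.03, seed=0):
    """Onset times (s) for an A-B-A-B tone sequence. A tones are strictly
    periodic; each B tone is independently jittered by up to +/- `jitter`."""
    rng = np.random.default_rng(seed)
    a_onsets = np.arange(n_pairs) * 2 * ioi
    b_onsets = a_onsets + ioi + rng.uniform(-jitter, jitter, size=n_pairs)
    return a_onsets, b_onsets

a, b = abab_onsets()
# with jitter=0 the rhythm is isochronous; larger jitter promotes two-stream reports
print(np.all((b > a) & (b < a + 2 * 0.125)))  # True
```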


Subject(s)
Attention; Cues; Illusions; Pitch Discrimination; Sound Spectrography; Time Perception; Humans; Psychoacoustics