Results 1 - 17 of 17
1.
Nature ; 626(7999): 485-486, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38297041
2.
Elife ; 12: 2023 Nov 16.
Article in English | MEDLINE | ID: mdl-37970945

ABSTRACT

Grouping sets of sounds into relevant categories is an important cognitive ability that enables the association of stimuli with appropriate goal-directed behavioral responses. In perceptual tasks, the primary auditory cortex (A1) assumes a prominent role by concurrently encoding both sound sensory features and task-related variables. Here, we sought to explore the role of A1 in the initiation of sound categorization, shedding light on its involvement in this cognitive process. We trained ferrets to discriminate click trains of different rates in a Go/No-Go delayed categorization task and recorded neural activity during both active behavior and passive exposure to the same sounds. Purely categorical response components were extracted and analyzed separately from sensory responses to reveal their contributions to the overall population response throughout the trials. We found that categorical activity emerged during sound presentation in the population average and was present in both active behavioral and passive states. However, upon task engagement, categorical responses to the No-Go category became suppressed in the population code, leading to an asymmetrical representation of the Go stimuli relative to the No-Go sounds and pre-stimulus baseline. The population code underwent an abrupt change at stimulus offset, with sustained responses after the Go sounds during the delay period. Notably, the categorical responses observed during the stimulus period exhibited a significant correlation with those extracted from the delay epoch, suggesting an early involvement of A1 in stimulus categorization.


Subject(s)
Auditory Cortex , Auditory Perception , Animals , Auditory Perception/physiology , Auditory Cortex/physiology , Ferrets , Sound , Behavior, Animal/physiology , Acoustic Stimulation
3.
Nat Commun ; 14(1): 6837, 2023 10 27.
Article in English | MEDLINE | ID: mdl-37884507

ABSTRACT

Brains can gracefully weed out irrelevant stimuli to guide behavior. This feat is believed to rely on a progressive selection of task-relevant stimuli across the cortical hierarchy, but the specific across-area interactions enabling stimulus selection are still unclear. Here, we propose that population gating, occurring within primary auditory cortex (A1) but controlled by top-down inputs from the prelimbic region of the medial prefrontal cortex (mPFC), can support across-area stimulus selection. Examining single-unit activity recorded while rats performed an auditory context-dependent task, we found that A1 encoded relevant and irrelevant stimuli along a common dimension of its neural space. Yet the encoding of the relevant stimulus was enhanced along an extra dimension. In turn, mPFC encoded only the stimulus relevant to the ongoing context. To identify candidate mechanisms for stimulus selection within A1, we reverse-engineered low-rank RNNs trained on a similar task. Our analyses predicted that two context-modulated neural populations gated their preferred stimulus in opposite contexts, which we confirmed in further analyses of A1. Finally, we show in a two-region RNN how population gating within A1 could be controlled by top-down inputs from PFC, enabling flexible across-area communication despite fixed inter-areal connectivity.
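To make the low-rank RNN idea concrete, here is a minimal sketch (not the authors' code) of a rank-2 recurrent network in which the recurrent weights are outer products of low-dimensional connectivity vectors and a context input changes how the same stimulus drives the population; the network size, dimensions, and input structure are illustrative assumptions.

```python
# Illustrative sketch of a low-rank RNN (assumed architecture, not the paper's
# code): recurrent weights are outer products of low-dimensional vectors, and a
# context input changes how the same stimulus drives the population.
import numpy as np

N, rank, T, dt, tau = 500, 2, 200, 0.01, 0.1
rng = np.random.default_rng(0)

# Low-rank connectivity: J = m @ n.T / N (rank 2)
m = rng.standard_normal((N, rank))
n = rng.standard_normal((N, rank))
J = m @ n.T / N

# Fixed input vectors for one stimulus and one context cue (assumptions)
I_stim, I_ctx = rng.standard_normal((2, N))

def simulate(stim, ctx_sign):
    """Leaky rate dynamics: tau * dx/dt = -x + J @ tanh(x) + input."""
    x = np.zeros(N)
    for _ in range(T):
        x = x + dt / tau * (-x + J @ np.tanh(x) + stim + ctx_sign * I_ctx)
    return np.tanh(x)

# The same stimulus under two contexts; gating would appear as context-dependent
# projections of the population state onto the readout vectors n.
r1, r2 = simulate(I_stim, +1.0), simulate(I_stim, -1.0)
print(r1 @ n / N, r2 @ n / N)
```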


Subject(s)
Auditory Cortex , Brain , Rats , Animals , Acoustic Stimulation/methods , Prefrontal Cortex
4.
iScience ; 26(2): 106000, 2023 Feb 17.
Article in English | MEDLINE | ID: mdl-36798438

ABSTRACT

Perceptual decision-making in everyday life is informed by experience. In particular, temporal expectation can ease the detection of relevant events in noisy sensory streams. Here, we investigated whether humans can extract hidden temporal cues from the occurrences of probabilistic targets and use them to inform target detection in a complex acoustic stream. To understand which neural mechanisms implement the influence of temporal expectation on decision-making, we used pupillometry as a proxy for underlying neuromodulatory activity. We found that participants' detection strategy was influenced by the hidden temporal context and correlated with sound-evoked pupil dilation. A model of urgency fitted on false alarms predicted detection reaction time. Altogether, these findings suggest that temporal expectation informs decision-making and could be implemented through neuromodulatory-mediated urgency signals.
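As an illustration of the kind of urgency mechanism invoked here (a sketch, not the study's fitted model), momentary evidence can be multiplied by a growing urgency signal so that the same evidence triggers a detection sooner later in the trial; the functional form and all parameters below are assumptions.

```python
# Illustrative sketch of an urgency-gated detection rule (assumed form and
# parameters, not the study's fitted model).
import numpy as np

rng = np.random.default_rng(1)
dt = 0.01
t = np.arange(0.0, 6.0, dt)

def detection_time(target_onset, slope=0.4, threshold=1.2, noise_sd=0.3):
    evidence = (t >= target_onset).astype(float) + noise_sd * rng.standard_normal(t.size)
    urgency = 1.0 + slope * t                       # urgency grows with elapsed time
    decision = urgency * np.maximum(evidence, 0.0)  # gated momentary evidence
    crossed = np.flatnonzero(decision >= threshold)
    return t[crossed[0]] if crossed.size else np.nan

# Later targets are detected with shorter lags; noise-driven crossings before the
# target play the role of false alarms that constrain the urgency signal.
for onset in (1.0, 4.0):
    print(onset, detection_time(onset))
```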

5.
Elife ; 10: 2021 Nov 18.
Article in English | MEDLINE | ID: mdl-34792467

ABSTRACT

Little is known about how neural representations of natural sounds differ across species. For example, speech and music play a unique role in human hearing, yet it is unclear how auditory representations of speech and music differ between humans and other animals. Using functional ultrasound imaging, we measured responses in ferrets to a set of natural and spectrotemporally matched synthetic sounds previously tested in humans. Ferrets showed similar lower-level frequency and modulation tuning to that observed in humans. But while humans showed substantially larger responses to natural vs. synthetic speech and music in non-primary regions, ferret responses to natural and synthetic sounds were closely matched throughout primary and non-primary auditory cortex, even when tested with ferret vocalizations. This finding reveals that auditory representations in humans and ferrets diverge sharply at late stages of cortical processing, potentially driven by higher-order processing demands in speech and music.


Subject(s)
Auditory Cortex/physiology , Auditory Perception/physiology , Ferrets/physiology , Sound , Acoustic Stimulation , Animals , Humans
6.
J Acoust Soc Am ; 147(5): 3260, 2020 05.
Article in English | MEDLINE | ID: mdl-32486802

ABSTRACT

Natural soundscapes correspond to the acoustical patterns produced by biological and geophysical sound sources at different spatial and temporal scales for a given habitat. This pilot study aims to characterize the temporal-modulation information available to humans when perceiving variations in soundscapes within and across natural habitats. This is addressed by processing soundscapes from a previous study [Krause, Gage, and Joo. (2011). Landscape Ecol. 26, 1247] via models of human auditory processing that extract modulation cues at the output of cochlear filters. The soundscapes represent combinations of elevation, animal, and vegetation diversity in four habitats of the biosphere reserve in the Sequoia National Park (Sierra Nevada, USA). Bayesian statistical analysis and support vector machine classifiers indicate that: (i) amplitude-modulation (AM) and frequency-modulation (FM) spectra distinguish the soundscapes associated with each habitat; and (ii) for each habitat, diurnal and seasonal variations are associated with salient changes in AM and FM cues at rates between about 1 and 100 Hz in the low (<0.5 kHz) and high (>1-3 kHz) audio-frequency range. Support vector machine classifications further indicate that soundscape variations can be classified accurately based on these perceptually inspired representations.
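A hedged sketch of such a perceptually inspired pipeline (not the study's code): band-pass channels stand in for cochlear filters, their Hilbert envelopes yield amplitude-modulation spectra, and a linear support vector machine classifies the resulting features; the band edges, modulation range, and toy stimuli are assumptions.

```python
# Illustrative sketch (assumed pipeline, not the study's code): AM spectra from
# the envelopes of band-pass "cochlear-like" channels, classified with an SVM.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert
from sklearn.svm import SVC

fs = 16000
rng = np.random.default_rng(0)

def am_spectrum(x, bands=((100, 500), (1000, 3000)), n_mod=64):
    feats = []
    for lo, hi in bands:
        sos = butter(4, (lo, hi), btype='bandpass', fs=fs, output='sos')
        env = np.abs(hilbert(sosfiltfilt(sos, x)))   # channel envelope
        mod = np.abs(np.fft.rfft(env - env.mean()))  # AM spectrum (1-Hz bins here)
        feats.append(mod[1:n_mod + 1])
    return np.concatenate(feats)

def toy_soundscape(am_rate):
    """Noise carrier with slow vs. fast amplitude modulation (stand-in data)."""
    t = np.arange(fs) / fs
    return (1.0 + 0.8 * np.sin(2 * np.pi * am_rate * t)) * rng.standard_normal(fs)

X = np.array([am_spectrum(toy_soundscape(r)) for r in [2] * 20 + [40] * 20])
y = np.array([0] * 20 + [1] * 20)
print(SVC(kernel='linear').fit(X, y).score(X, y))
```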


Subject(s)
Cues , Sound , Animals , Bayes Theorem , Ecosystem , Humans , Pilot Projects
7.
Nat Commun ; 11(1): 3176, 2020 06 18.
Article in English | MEDLINE | ID: mdl-32555158

ABSTRACT

An amendment to this paper has been published and can be accessed via a link at the top of the paper.

8.
J Neurophysiol ; 122(4): 1606-1622, 2019 10 01.
Article in English | MEDLINE | ID: mdl-31411931

ABSTRACT

Rats use their whiskers to extract sensory information from their environment. While exploring, they analyze peripheral stimuli distributed over several whiskers. Previous studies have reported cross-whisker integration of information at several levels of the neuronal pathways from whisker follicles to the somatosensory cortex. In the present study, we investigated the possible coupling between whiskers at a preneuronal level, transmitted by the skin and muscles between follicles. First, we quantified the movement induced on one whisker by deflecting another whisker. Our results show significant mechanical coupling, predominantly when a given whisker's caudal neighbor in the same row is deflected. The magnitude of the effect was correlated with the diameter of the deflected whisker. In addition to changes in whisker angle, we observed curvature changes when the whisker shaft was constrained distally from the base. Second, we found that trigeminal ganglion neurons innervating a given whisker follicle fire action potentials in response to high-magnitude deflections of an adjacent whisker. This functional coupling also shows a bias toward the caudal neighbor located in the same row. Finally, we designed a two-whisker biomechanical model to investigate transmission of forces across follicles. Analysis of the whisker-follicle contact forces suggests that activation of mechanoreceptors in the ring sinus region could account for our electrophysiological results. The model can fully explain the observed caudal bias by the gradient in whisker diameter, with possible contribution of the intrinsic muscles connecting follicles. Overall, our study demonstrates the functional relevance of mechanical coupling on early information processing in the whisker system.

NEW & NOTEWORTHY Rodents explore their environment actively by touching objects with their whiskers. A major challenge is to understand how sensory inputs from different whiskers are merged together to form a coherent tactile percept. We demonstrate that external sensory events on one whisker can influence the position of another whisker and, importantly, that they can trigger the activity of mechanoreceptors at its base. This cross-whisker interaction occurs pre-neuronally, through mechanical transmission of forces in the skin.


Subject(s)
Mechanoreceptors/physiology , Movement , Touch Perception , Vibrissae/physiology , Action Potentials , Animals , Male , Rats , Rats, Wistar , Trigeminal Ganglion/cytology , Trigeminal Ganglion/physiology , Vibrissae/innervation
9.
Nat Commun ; 10(1): 2151, 2019 05 14.
Article in English | MEDLINE | ID: mdl-31089133

ABSTRACT

Performance on cognitive tasks during learning is used to measure knowledge, yet it remains controversial since such testing is susceptible to contextual factors. To what extent does performance during learning depend on the testing context, rather than underlying knowledge? We trained mice, rats and ferrets on a range of tasks to examine how testing context impacts the acquisition of knowledge versus its expression. We interleaved reinforced trials with probe trials in which we omitted reinforcement. Across tasks, each animal species performed remarkably better in probe trials during learning, and inter-animal variability was strikingly reduced. Reinforcement feedback is thus critical for learning-related behavioral improvements but, paradoxically, masks the expression of underlying knowledge. We capture these results with a network model in which learning occurs during reinforced trials while context modulates only the read-out parameters. Probing learning by omitting reinforcement thus uncovers latent knowledge and identifies context, not "smartness", as the major source of individual variability.
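A minimal sketch of such a model (illustrative, not the paper's implementation): stimulus weights are updated only on reinforced trials, while the testing context contributes only a response bias at read-out, so probe trials expose the learned weights without that bias; all parameter values are assumptions.

```python
# Illustrative sketch (assumed model, not the paper's implementation): weights
# are learned only on reinforced trials; the testing context only adds a
# response bias at read-out.
import numpy as np

rng = np.random.default_rng(2)
w = np.zeros(2)                               # "knowledge": stimulus weights
bias = {"reinforced": 0.8, "probe": 0.0}      # reward expectation biases responding

def trial(stim, go, context, lr=0.3):
    drive = w @ stim + bias[context] + 0.3 * rng.standard_normal()
    respond = int(drive > 0)
    if context == "reinforced":               # learning only when feedback is given
        w[:] += lr * (go - 1.0 / (1.0 + np.exp(-drive))) * stim
    return respond == go

go_stim, nogo_stim = np.array([1.0, 0.0]), np.array([0.0, 1.0])
acc = {"reinforced": [], "probe": []}
for i in range(600):
    stim, go = (go_stim, 1) if i % 2 == 0 else (nogo_stim, 0)
    ctx = "probe" if i % 10 == 9 else "reinforced"
    acc[ctx].append(trial(stim, go, ctx))

# The same weights yield higher accuracy on probe trials, where the
# reinforcement-driven bias is absent.
print(np.mean(acc["reinforced"]), np.mean(acc["probe"]))
```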


Subject(s)
Models, Neurological , Reinforcement, Psychology , Animals , Behavior, Animal/physiology , Biological Variation, Population/physiology , Female , Ferrets , Male , Mice , Models, Animal , Rats
10.
Elife ; 7: 2018 Jun 28.
Article in English | MEDLINE | ID: mdl-29952750

ABSTRACT

A major challenge in neuroscience is to longitudinally monitor whole brain activity across multiple spatial scales in the same animal. Functional UltraSound (fUS) is an emerging technology that offers images of cerebral blood volume over large brain portions. Here we show for the first time its capability to resolve the functional organization of sensory systems at multiple scales in awake animals, both within small structures by precisely mapping and differentiating sensory responses, and between structures by elucidating the connectivity scheme of top-down projections. We demonstrate that fUS provides stable (over days), yet rapid, highly-resolved 3D tonotopic maps in the auditory pathway of awake ferrets, thus revealing its unprecedented functional resolution (100/300 µm). This was performed in four different brain regions, including very small (1-2 mm³), deeply situated subcortical (8 mm deep) and previously undescribed structures in the ferret. Furthermore, we used fUS to map long-distance projections from frontal cortex, a key source of sensory response modulation, to auditory cortex.


Subject(s)
Auditory Cortex/diagnostic imaging , Auditory Pathways/diagnostic imaging , Brain Mapping/methods , Frontal Lobe/diagnostic imaging , Ultrasonography/methods , Acoustic Stimulation , Animals , Auditory Cortex/anatomy & histology , Auditory Cortex/physiology , Auditory Pathways/anatomy & histology , Auditory Pathways/physiology , Brain Mapping/instrumentation , Cerebrovascular Circulation/physiology , Electrodes, Implanted , Female , Ferrets , Frontal Lobe/anatomy & histology , Frontal Lobe/physiology , Stereotaxic Techniques , Ultrasonography/instrumentation , Wakefulness/physiology
11.
Nat Commun ; 9(1): 2529, 2018 06 28.
Article in English | MEDLINE | ID: mdl-29955046

ABSTRACT

Primary sensory cortices are classically considered to extract and represent stimulus features, while association and higher-order areas are thought to carry information about stimulus meaning. Here we show that this information can in fact be found in the neuronal population code of the primary auditory cortex (A1). A1 activity was recorded in awake ferrets while they either passively listened to or actively discriminated stimuli in a range of Go/No-Go paradigms, with different sounds and reinforcements. Population-level dimensionality reduction techniques reveal that task engagement induces a shift in stimulus encoding from a sensory to a behaviorally driven representation that specifically enhances the target stimulus in all paradigms. This shift partly relies on task-engagement-induced changes in spontaneous activity. Altogether, we show that A1 population activity bears strong similarities to frontal cortex responses. These findings indicate that primary sensory cortices implement a crucial change in the structure of population activity to extract task-relevant information during behavior.
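As an illustration of this kind of population-level analysis (a sketch on simulated data, not the authors' pipeline), passive and engaged responses can be projected into a shared principal-component space and the engagement-related shift compared between target and non-target stimuli; the data shapes and gains are assumptions.

```python
# Illustrative sketch on simulated data (assumed shapes, not the authors'
# analysis): project passive and engaged population responses into a shared
# PCA space and compare the engagement-related shift for target vs. reference.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_neurons, n_time = 80, 50

def fake_responses(target_gain):
    """Stand-in trial-averaged responses: (2 stimuli, time, neurons)."""
    r = rng.standard_normal((2, n_time, n_neurons))
    r[0] *= target_gain                      # stimulus 0 plays the target
    return r

passive = fake_responses(1.0)
engaged = fake_responses(2.0)                # target enhanced under engagement

pca = PCA(n_components=3).fit(np.concatenate([passive, engaged]).reshape(-1, n_neurons))

def shift(stim):
    """Mean distance between engaged and passive trajectories for one stimulus."""
    return np.linalg.norm(pca.transform(engaged[stim]) - pca.transform(passive[stim]),
                          axis=1).mean()

print(shift(0), shift(1))   # larger shift expected for the target stimulus
```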


Subject(s)
Auditory Cortex/physiology , Avoidance Learning/physiology , Conditioning, Classical/physiology , Frontal Lobe/physiology , Pattern Recognition, Physiological/physiology , Acoustic Stimulation , Animals , Auditory Cortex/anatomy & histology , Auditory Cortex/cytology , Choice Behavior/physiology , Electrodes, Implanted , Female , Ferrets , Frontal Lobe/anatomy & histology , Frontal Lobe/cytology , Multifactor Dimensionality Reduction , Neuronal Plasticity/physiology , Neurons/cytology , Neurons/physiology , Reinforcement, Psychology , Stereotaxic Techniques , Wakefulness/physiology
12.
eNeuro ; 5(2), 2018.
Article in English | MEDLINE | ID: mdl-29662943

ABSTRACT

Many natural sounds, for example wind, rain, or applause, can be well described at a statistical level. Even though the spectro-temporal profile of these acoustic textures is highly dynamic, changes in their statistics are indicative of relevant changes in the environment. Here, we investigated the neural representation of change detection in natural textures in humans, and specifically addressed whether active task engagement is required for the neural representation of this change in statistics. Subjects listened to natural textures whose spectro-temporal statistics were modified at variable times by a variable amount. Subjects were instructed to either report the detection of changes (active) or to passively listen to the stimuli. A subset of passive subjects had performed the active task before (passive-aware vs passive-naive). Psychophysically, longer exposure to pre-change statistics was correlated with faster reaction times and better discrimination performance. EEG recordings revealed that the build-up rate and size of parieto-occipital (PO) potentials reflected change size and change time. Reduced effects were observed in the passive conditions. While P2 responses were comparable across conditions, slope and height of PO potentials scaled with task involvement. Neural source localization identified a parietal source as the main contributor of change-specific potentials, in addition to more limited contributions from auditory and frontal sources. In summary, the detection of statistical changes in natural acoustic textures is predominantly reflected in parietal locations both on the skull and source level. The scaling in magnitude across different levels of task involvement suggests a context-dependent degree of evidence integration.


Subject(s)
Auditory Perception/physiology , Discrimination, Psychological/physiology , Electroencephalography/methods , Evoked Potentials, Auditory/physiology , Parietal Lobe/physiology , Adult , Auditory Cortex/physiology , Female , Frontal Lobe/physiology , Humans , Male , Young Adult
13.
Elife ; 6: 2017 Mar 06.
Article in English | MEDLINE | ID: mdl-28262095

ABSTRACT

Natural sounds, such as wind or rain, are characterized by the statistical occurrence of their constituents. Despite their complexity, listeners readily detect changes in these contexts. Here we address the neural basis of statistical decision-making using a combination of psychophysics, EEG and modelling. In a texture-based change-detection paradigm, human performance and reaction times improved with longer pre-change exposure, consistent with improved estimation of baseline statistics. Change-locked and decision-related EEG responses were found in a centro-parietal scalp location, whose slope depended on change size, consistent with sensory evidence accumulation. The potential's amplitude scaled with the duration of pre-change exposure, suggesting a time-dependent decision threshold. Auditory cortex-related potentials showed no response to the change. A dual-timescale statistical estimation model accounted for subjects' performance. Furthermore, a decision-augmented auditory cortex model accounted for performance and reaction times, suggesting that the primary cortical representation requires little post-processing to enable change detection in complex acoustic environments.
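A minimal sketch of a dual-timescale estimator of this kind (not the paper's fitted model): a fast and a slow running average of a sound statistic are compared, and a change is reported when they diverge beyond a criterion; the timescales and criterion are assumptions.

```python
# Illustrative sketch of a dual-timescale estimator (assumed timescales and
# criterion, not the paper's fitted model).
import numpy as np

rng = np.random.default_rng(3)
dt = 0.01
tau_fast, tau_slow, criterion = 0.1, 1.0, 0.5

# Toy statistic (e.g., level in one frequency band) stepping up at t = 3 s
t = np.arange(0.0, 6.0, dt)
stat = np.where(t < 3.0, 0.0, 1.0) + 0.3 * rng.standard_normal(t.size)

fast_est = slow_est = 0.0
detect_time = None
for ti, x in zip(t, stat):
    fast_est += dt / tau_fast * (x - fast_est)   # tracks the recent statistics
    slow_est += dt / tau_slow * (x - slow_est)   # holds the pre-change baseline
    if detect_time is None and abs(fast_est - slow_est) > criterion:
        detect_time = ti

print(detect_time)   # the lag shrinks with larger changes and longer exposure
```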


Subject(s)
Auditory Cortex/physiology , Auditory Perception , Acoustic Stimulation , Electroencephalography , Humans , Models, Neurological , Psychophysics
14.
Adv Exp Med Biol ; 894: 229-239, 2016.
Article in English | MEDLINE | ID: mdl-27080663

ABSTRACT

Many natural sounds have spectrotemporal signatures only on a statistical level, e.g. wind, fire or rain. While their local structure is highly variable, the spectrotemporal statistics of these auditory textures can be used for recognition. This suggests the existence of a neural representation of these statistics. To explore their encoding, we investigated the detectability of changes in the spectral statistics in relation to the properties of the change. To achieve precise parameter control, we designed a minimal sound texture, a modified cloud of tones, that retains the central property of auditory textures: solely statistical predictability. Listeners had to rapidly detect a change in the frequency marginal probability of the tone cloud occurring at a random time. Both the size of the change and the time available to sample the original statistics correlated positively with performance and negatively with reaction time, suggesting the accumulation of noisy evidence. In summary, we quantified dynamic aspects of change detection in statistically defined contexts, and found evidence of integration of statistical information.
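To illustrate the stimulus design (a sketch, not the study's stimulus code), a tone cloud can be generated whose only structure is the marginal probability over frequency bins, with that marginal switching at a random change time; the bin frequencies, tone density, and durations are assumptions.

```python
# Illustrative sketch of a minimal tone-cloud stimulus (assumed frequencies,
# density, and durations, not the study's stimulus code).
import numpy as np

fs = 16000
freqs = np.geomspace(300.0, 4800.0, 8)         # 8 frequency bins
tone_dur, n_tones = 0.03, 300                  # 30-ms tones, dense cloud
rng = np.random.default_rng(4)

def tone_cloud(p_before, p_after, change_time, total_dur=3.0):
    """Tones drawn from one frequency marginal before the change, another after."""
    x = np.zeros(int(total_dur * fs))
    n = int(tone_dur * fs)
    ramp = np.hanning(n)
    for _ in range(n_tones):
        onset = rng.uniform(0.0, total_dur - tone_dur)
        p = p_before if onset < change_time else p_after
        f = rng.choice(freqs, p=p)
        i0 = int(onset * fs)
        x[i0:i0 + n] += ramp * np.sin(2 * np.pi * f * np.arange(n) / fs)
    return x

uniform = np.full(8, 1 / 8)
tilted = np.array([0.02, 0.02, 0.06, 0.10, 0.15, 0.20, 0.22, 0.23])  # sums to 1
stim = tone_cloud(uniform, tilted, change_time=rng.uniform(0.5, 2.5))
```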


Subject(s)
Auditory Perception/physiology , Acoustic Stimulation , Adult , Female , Humans , Male , Reaction Time
15.
Front Behav Neurosci ; 10: 251, 2016.
Article in English | MEDLINE | ID: mdl-28119582

ABSTRACT

Rodents use their whiskers to locate nearby objects with extreme precision. To perform such tasks, they need to detect whisker/object contacts with high temporal accuracy. This contact detection is conveyed by classes of mechanoreceptors whose neural activity is sensitive to either slow or fast time-varying mechanical stresses acting at the base of the whiskers. We developed a biomimetic approach to separate and characterize slow quasi-static and fast vibrational stress signals acting on a whisker base in realistic exploratory phases, using experiments on both real and artificial whiskers. Both slow and fast mechanical inputs are successfully captured using a mechanical model of the whisker. We present and discuss consequences of the whisking process in purely mechanical terms and hypothesize that free whisking in air sets a mechanical threshold for contact detection. The time resolution and robustness of the contact detection strategies based on either slow or fast stress signals are determined. Contact detection based on the vibrational signal is faster and more robust to exploratory conditions than detection based on the slow quasi-static component, although both slow and fast components allow localizing the object.
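A hedged sketch of such a threshold rule (not the authors' method): the vibrational component is isolated with a high-pass filter, the free-whisking epoch sets the detection threshold, and a contact is flagged at the first suprathreshold sample; the filter band and threshold factor are assumptions.

```python
# Illustrative sketch of a free-whisking-based detection threshold (assumed
# filter band and threshold factor, not the authors' method).
import numpy as np
from scipy.signal import butter, filtfilt

fs = 10000
b, a = butter(4, 100.0, btype='highpass', fs=fs)   # keep only fast vibrations

def contact_time(stress, free_whisking_stress, n_sd=5.0):
    fast = filtfilt(b, a, stress)
    baseline = filtfilt(b, a, free_whisking_stress)
    thresh = n_sd * baseline.std()                 # free whisking sets the floor
    idx = np.flatnonzero(np.abs(fast) > thresh)
    return idx[0] / fs if idx.size else None

# Toy signals: slow whisking oscillation plus small noise; the "contact" adds a
# brief high-frequency transient at t = 0.2 s.
rng = np.random.default_rng(5)
t = np.arange(0, 0.5, 1 / fs)
free = np.sin(2 * np.pi * 8 * t) + 0.01 * rng.standard_normal(t.size)
contact = free.copy()
contact[2000:2050] += 0.5 * np.sin(2 * np.pi * 600 * t[:50])
print(contact_time(contact, free))
```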

16.
J Neurosci ; 34(33): 10832-43, 2014 Aug 13.
Article in English | MEDLINE | ID: mdl-25122886

ABSTRACT

Whisking rodents can discriminate finely textured objects using their vibrissae. The biomechanical and neural processes underlying such sensory tasks remain elusive. Here we combine the use of model micropatterned substrates and high-resolution videography of rats' whiskers during tactile exploration to study how texture information is mechanically encoded in the whisker motion. A biomechanical modeling of the whisker is developed, which yields quantitative predictions of the spectral and temporal characteristics of the observed whisker kinetics, for any given topography. These texture-induced whisker vibrations are then replayed via a multiwhisker stimulator while recording neuronal responses in the barrel field of the primary somatosensory cortex (S1bf). These results provide a comprehensive description of the transduction process at play during fine texture sensing in rats. They suggest that the sensory system operates through a vibratory amplitude modulation/demodulation scheme. Fine textural properties are encoded in the time-varying envelope of the whisker-resonant vibrations. This quantity is then recovered by neural demodulation, as it effectively drives the spiking-rate signal of a large fraction of S1 cortical neurons. This encoding/decoding scheme is shown to be robust against variations in exploratory conditions, such as the scanning speed or pad-to-substrate distance, thus allowing for reliable tactile discrimination in realistic conditions.
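As an illustration of the amplitude demodulation step described here (a sketch, not the study's analysis), the texture-related envelope can be recovered by band-pass filtering around the whisker resonance and taking the magnitude of the analytic signal; the resonance frequency and band are assumptions.

```python
# Illustrative sketch of envelope (AM) demodulation around an assumed whisker
# resonance (not the study's analysis pipeline).
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 50000
f_res = 400.0                                        # assumed resonance (Hz)
sos = butter(4, (0.5 * f_res, 2.0 * f_res), btype='bandpass', fs=fs, output='sos')

def texture_envelope(whisker_angle):
    """Carrier = resonant vibration; message = slowly varying texture envelope."""
    carrier = sosfiltfilt(sos, whisker_angle)
    return np.abs(hilbert(carrier))

# Toy signal: a 400-Hz resonance amplitude-modulated by a 20-Hz "texture" rhythm
t = np.arange(0, 0.5, 1 / fs)
texture = 1.0 + 0.8 * np.sin(2 * np.pi * 20 * t)
env = texture_envelope(texture * np.sin(2 * np.pi * f_res * t))
```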


Subject(s)
Neurons/physiology , Somatosensory Cortex/physiology , Touch Perception/physiology , Touch/physiology , Vibrissae/physiology , Action Potentials/physiology , Afferent Pathways/physiology , Animals , Biomechanical Phenomena/physiology , Discrimination, Psychological/physiology , Male , Rats , Rats, Wistar , Vibration
17.
Front Behav Neurosci ; 6: 74, 2012.
Article in English | MEDLINE | ID: mdl-23133410

ABSTRACT

Rats use their whiskers to extract a wealth of information about their immediate environment, such as the shape, position or texture of an object. The information is conveyed to mechanoreceptors located within the whisker follicle in the form of a sequence of whisker deflections induced by the whisker/object contact interaction. How the whiskers filter and shape the mechanical information and effectively participate in the coding of tactile features remains an open question. In the present article, a biomechanical model was developed that provides predictions of the whisker dynamics during active tactile exploration, amenable to quantitative experimental comparison. This model is based on a decomposition of the whisker profile into a slow, quasi-static sequence and rapid resonant small-scale vibrations. It was applied to the typical situation of a rat actively whisking across a solid object. Having derived the quasi-static sequence of whisker deformation, the resonant properties of the whisker were analyzed, taking into account the boundary conditions imposed by the whisker/surface contact. We then focused on two elementary mechanical events that are expected to trigger significant neural responses, namely (1) the whisker/object first contact and (2) the whisker detachment from the object. Both events were found to trigger a deflection wave propagating upward to the mystacial pad at a constant velocity of ≈3-5 m/s. This yielded a characteristic mechanical signature at the whisker base, in the form of a large peak of negative curvature occurring ≈4 ms after the event has been triggered. The dependence of the amplitude and lag of this mechanical signal on the main contextual parameters (such as radial or angular distance) was investigated. The model was validated experimentally by comparing its predictions to high-speed video recordings of shock-induced whisker deflections performed on anesthetized rats. The consequences of these results on possible tactile encoding schemes are briefly discussed.
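A quick back-of-envelope check of these numbers (illustrative distances only, not from the paper's code): at a propagation speed of 3-5 m/s, a contact 12-20 mm from the follicle produces a lag of roughly 4 ms at the whisker base.

```python
# Back-of-envelope check of the reported numbers (illustrative distances only):
# lag at the whisker base = contact distance / propagation speed.
for distance_mm in (12.0, 16.0, 20.0):
    for speed_m_s in (3.0, 5.0):
        lag_ms = distance_mm / speed_m_s            # mm / (m/s) = ms
        print(f"{distance_mm:.0f} mm at {speed_m_s:.0f} m/s -> {lag_ms:.1f} ms")
```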
