Results 1 - 9 of 9
1.
bioRxiv ; 2024 Jun 14.
Article in English | MEDLINE | ID: mdl-38915704

ABSTRACT

Methodological advances in neuroscience have enabled the collection of massive datasets which demand innovative approaches for scientific communication. Existing platforms for data storage lack intuitive tools for data exploration, limiting our ability to interact effectively with these brain-wide datasets. We introduce two public websites, Data and Atlas, developed for the International Brain Laboratory, which provide access to millions of behavioral trials and hundreds of thousands of individual neurons. These interfaces allow users to discover both the raw and processed brain-wide data released by the IBL at the scale of the whole brain, individual sessions, trials, and neurons. Because these data interfaces are hosted as websites, they are available cross-platform with no installation. Because each site's code is released as a modular open-source framework, other researchers can easily develop their own web interfaces to explore their own data. As neuroscience datasets continue to expand, customizable web interfaces offer a glimpse into a future of streamlined data exploration and act as blueprints for future tools.

2.
Neuron ; 111(15): 2432-2447.e13, 2023 08 02.
Article in English | MEDLINE | ID: mdl-37295419

ABSTRACT

The brain can combine auditory and visual information to localize objects. However, the cortical substrates underlying audiovisual integration remain uncertain. Here, we show that mouse frontal cortex combines auditory and visual evidence; that this combination is additive, mirroring behavior; and that it evolves with learning. We trained mice in an audiovisual localization task. Inactivating frontal cortex impaired responses to either sensory modality, while inactivating visual or parietal cortex affected only visual stimuli. Recordings from >14,000 neurons indicated that after task learning, activity in the anterior part of frontal area MOs (secondary motor cortex) additively encodes visual and auditory signals, consistent with the mice's behavioral strategy. An accumulator model applied to these sensory representations reproduced the observed choices and reaction times. These results suggest that frontal cortex adapts through learning to combine evidence across sensory cortices, providing a signal that is transformed into a binary decision by a downstream accumulator.
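The accumulator account described above lends itself to a simple simulation. The following is a minimal sketch of a bounded evidence accumulator that additively combines visual and auditory evidence to produce a choice and a reaction time; all parameters (threshold, noise level, time step) are illustrative, not the paper's fitted model:

```python
import numpy as np

rng = np.random.default_rng(0)

def accumulate(vis, aud, threshold=1.0, noise=0.5, dt=0.01, max_t=2.0):
    """Integrate additively combined audiovisual evidence to a bound;
    return (choice, reaction_time). Hypothetical parameters."""
    x = 0.0
    n_steps = int(max_t / dt)
    for step in range(1, n_steps + 1):
        drift = vis + aud  # additive combination of the two modalities
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        if abs(x) >= threshold:
            return (1 if x > 0 else -1), step * dt
    return (1 if x > 0 else -1), max_t  # no bound crossed: answer at timeout

choice, rt = accumulate(vis=0.8, aud=0.4)
```

Stronger combined evidence raises the drift rate, so the bound is reached sooner and errors become rarer, which is how such a model reproduces both choices and reaction times.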


Subject(s)
Auditory Cortex , Visual Perception , Animals , Mice , Visual Perception/physiology , Acoustic Stimulation/methods , Auditory Perception/physiology , Photic Stimulation/methods , Frontal Lobe , Auditory Cortex/physiology
3.
Nat Methods ; 20(3): 403-407, 2023 03.
Article in English | MEDLINE | ID: mdl-36864199

ABSTRACT

We describe an architecture for organizing, integrating, and sharing neurophysiology data within a single laboratory or across a group of collaborators. It comprises a database linking data files to metadata and electronic laboratory notes; a module collecting data from multiple laboratories into one location; a protocol for searching and sharing data; and a module for automatic analyses that populates a website. These modules can be used together or individually, by single laboratories or worldwide collaborations.
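The core of such an architecture is a relational link between data files and session metadata so that data can be searched across laboratories. A minimal sketch using SQLite (hypothetical schema and field names; not the published system's actual tables):

```python
import sqlite3

# In-memory database linking raw data files to session metadata.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE sessions (
    id INTEGER PRIMARY KEY, lab TEXT, subject TEXT, date TEXT)""")
con.execute("""CREATE TABLE datafiles (
    path TEXT, dataset_type TEXT,
    session_id INTEGER REFERENCES sessions(id))""")
con.execute("INSERT INTO sessions VALUES (1, 'lab_a', 'mouse_01', '2023-01-15')")
con.execute("INSERT INTO datafiles VALUES ('spikes.times.npy', 'spikes', 1)")

# Searching shared data then becomes a join over metadata:
rows = con.execute("""SELECT d.path FROM datafiles d
    JOIN sessions s ON d.session_id = s.id
    WHERE s.lab = 'lab_a' AND d.dataset_type = 'spikes'""").fetchall()
```

Keeping files on disk and only their paths plus metadata in the database is what lets the same query protocol serve one laboratory or a worldwide collaboration.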


Subject(s)
Laboratories , Neurophysiology , Databases, Factual
5.
Elife ; 10, 2021 05 20.
Article in English | MEDLINE | ID: mdl-34011433

ABSTRACT

Progress in science requires standardized assays whose results can be readily shared, compared, and reproduced across laboratories. Reproducibility, however, has been a concern in neuroscience, particularly for measurements of mouse behavior. Here, we show that a standardized task to probe decision-making in mice produces reproducible results across multiple laboratories. We adopted a task for head-fixed mice that assays perceptual and value-based decision making, and we standardized the training protocol, experimental hardware, software, and procedures. We trained 140 mice across seven laboratories in three countries, and we collected 5 million mouse choices into a publicly available database. Learning speed was variable across mice and laboratories, but once training was complete there were no significant differences in behavior across laboratories. Mice in different laboratories adopted similar reliance on visual stimuli, on past successes and failures, and on estimates of stimulus prior probability to guide their choices. These results reveal that a complex mouse behavior can be reproduced across multiple laboratories. They establish a standard for reproducible rodent behavior, and provide an unprecedented dataset and open-access tools to study decision-making in mice. More generally, they indicate a path toward achieving reproducibility in neuroscience through collaborative open-science approaches.


In science, it is of vital importance that multiple studies corroborate the same result. Researchers therefore need to know all the details of previous experiments in order to implement the procedures as exactly as possible. However, this is becoming a major problem in neuroscience, as animal studies of behavior have proven hard to reproduce, and most experiments are never replicated by other laboratories. Mice are increasingly being used to study the neural mechanisms of decision making, taking advantage of the genetic, imaging, and physiological tools available for mouse brains. Yet the lack of standardized behavioral assays is leading to inconsistent results between laboratories. This makes it challenging to carry out the kind of large-scale collaborations that have led to massive breakthroughs in other fields such as physics and genetics.

To help make these studies more reproducible, the International Brain Laboratory (a collaborative research group) developed a standardized approach for investigating decision making in mice that incorporates every step of the process, from the training protocol to the software used to analyze the data. In the experiment, mice were shown an image of varying contrast and had to indicate, using a steering wheel, whether it appeared on their right or left. The mice then received a drop of sugar water for every correct decision. When the image contrast was high, mice could rely on their vision. However, when the image contrast was very low or zero, they needed to consider the information from previous trials and choose the side that had recently appeared more frequently.

This method was used to train 140 mice in seven laboratories from three different countries. The results showed that learning speed differed across mice and laboratories, but once training was complete the mice behaved consistently, relying on visual stimuli or past experience to guide their choices in a similar way.
These results show that complex behaviors in mice can be reproduced across multiple laboratories, providing an unprecedented dataset and open-access tools for studying decision making. This work could serve as a foundation for other groups, paving the way to a more collaborative approach in the field of neuroscience that could help to tackle complex research challenges.


Subject(s)
Behavior, Animal , Biomedical Research/standards , Decision Making , Neurosciences/standards , Animals , Cues , Female , Learning , Male , Mice, Inbred C57BL , Models, Animal , Observer Variation , Photic Stimulation , Reproducibility of Results , Time Factors , Visual Perception
6.
Curr Biol ; 30(19): 3811-3817.e6, 2020 10 05.
Article in English | MEDLINE | ID: mdl-32763173

ABSTRACT

The visual responses of neurons in the primary visual cortex (V1) are influenced by the animal's position in the environment [1-5]. V1 responses encode positions that co-fluctuate with those encoded by place cells in hippocampal area CA1 [2, 5]. This correlation might reflect a common influence of non-visual spatial signals on both areas. Place cells in CA1, indeed, do not rely only on vision; their place preference depends on the physical distance traveled [6-11] and on the phase of the 6-9 Hz theta oscillation [12, 13]. Are V1 responses similarly influenced by these non-visual factors? We recorded V1 and CA1 neurons simultaneously while mice performed a spatial task in a virtual corridor by running on a wheel and licking at a reward location. By changing the gain that couples the wheel movement to the virtual environment, we found that ∼20% of V1 neurons were influenced by the physical distance traveled, as were ∼40% of CA1 place cells. Moreover, the firing rate of ∼24% of V1 neurons was modulated by the phase of theta oscillations recorded in CA1 and the response profiles of ∼7% of V1 neurons shifted spatially across the theta cycle, analogous to the phase precession observed in ∼37% of CA1 place cells. The influence of theta oscillations on V1 responses was more prominent in putative layer 6. These results reveal that, in a familiar environment, sensory processing in V1 is modulated by the key non-visual signals that influence spatial coding in the hippocampus.


Subject(s)
Spatial Behavior/physiology , Theta Rhythm/physiology , Visual Cortex/physiology , Action Potentials/physiology , Animals , CA1 Region, Hippocampal/physiology , Female , Hippocampus/physiology , Male , Mice , Mice, Inbred C57BL , Neurons/physiology , Pyramidal Cells/physiology , Reward , Visual Cortex/metabolism
7.
eNeuro ; 7(4)2020.
Article in English | MEDLINE | ID: mdl-32493756

ABSTRACT

Setting up an experiment in behavioral neuroscience is a complex process that is often managed with ad hoc solutions. To streamline this process, we developed Rigbox, a high-performance, open-source software toolbox that facilitates a modular approach to designing experiments (https://github.com/cortex-lab/Rigbox). Rigbox simplifies hardware input-output, time-aligns datastreams from multiple sources, communicates with remote databases, and implements visual and auditory stimulus presentation. Its main submodule, Signals, allows intuitive programming of behavioral tasks. Here we illustrate its function with two interactive examples: a human psychophysics experiment and the game of Pong. We give an overview of running experiments in Rigbox, provide benchmarks, and conclude with a discussion of the extensibility of the software and comparisons with similar toolboxes. Rigbox runs in MATLAB, with Java components to handle network communication and a C library to boost performance.
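The idea behind a Signals-style framework is reactive programming: a stimulus parameter is declared as a function of an input, and updates propagate automatically. A toy Python sketch of that idea (illustrative only; the actual Signals API is in MATLAB and differs in detail):

```python
class Signal:
    """Minimal reactive value: dependents created via map() are
    recomputed whenever a new value is posted."""
    def __init__(self, value=None):
        self.value = value
        self._listeners = []

    def map(self, fn):
        # Derive a new signal whose value tracks fn(self.value).
        out = Signal(fn(self.value) if self.value is not None else None)
        self._listeners.append(lambda v: out.post(fn(v)))
        return out

    def post(self, value):
        self.value = value
        for listener in self._listeners:
            listener(value)

# Couple a stimulus position to a (hypothetical) wheel input:
wheel = Signal(0)
stimulus_x = wheel.map(lambda deg: deg * 2)
wheel.post(15)  # stimulus_x now updates automatically
```

Declaring the task as a network of such dependencies, rather than an explicit event loop, is what makes behavioral tasks quick to write and modify.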


Subject(s)
Neurosciences , Software , Humans , Neurons , Psychophysics
8.
Neuron ; 105(4): 700-711.e6, 2020 02 19.
Article in English | MEDLINE | ID: mdl-31859030

ABSTRACT

Deciding between stimuli requires combining their learned value with one's sensory confidence. We trained mice in a visual task that probes this combination. Mouse choices reflected not only present confidence and past rewards but also past confidence. Their behavior conformed to a model that combines signal detection with reinforcement learning. In the model, the predicted value of the chosen option is the product of sensory confidence and learned value. We found precise correlates of this variable in the pre-outcome activity of midbrain dopamine neurons and of medial prefrontal cortical neurons. However, only the latter played a causal role: inactivating medial prefrontal cortex before outcome strengthened learning from the outcome. Dopamine neurons played a causal role only after outcome, when they encoded reward prediction errors graded by confidence, influencing subsequent choices. These results reveal neural signals that combine reward value with sensory confidence and guide subsequent learning.
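The model's key quantity, the predicted value of the chosen option as the product of sensory confidence and learned value, can be sketched in a few lines. This is an illustrative reconstruction under assumed Gaussian signal detection and a simple delta-rule update, with made-up parameter values, not the paper's fitted model:

```python
from math import erf, sqrt

def confidence(contrast, sigma=0.3):
    """Probability the percept favors the chosen side under a
    Gaussian signal-detection sketch (hypothetical form)."""
    return 0.5 * (1 + erf(abs(contrast) / (sigma * sqrt(2))))

q = {"left": 0.5, "right": 0.5}  # learned value of each option
alpha = 0.2                       # learning rate (illustrative)

def outcome_update(side, contrast, reward):
    """Predicted value = confidence x learned value; learning is driven
    by a confidence-graded reward prediction error."""
    v_pred = confidence(contrast) * q[side]
    rpe = reward - v_pred
    q[side] += alpha * rpe
    return rpe

# Low-contrast rewarded trial: low confidence keeps v_pred small,
# so the prediction error (and hence learning) stays large.
rpe = outcome_update("right", contrast=0.05, reward=1.0)
```

Grading the prediction error by confidence is what lets the model learn more from outcomes the animal could not have predicted from its percept alone.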


Subject(s)
Choice Behavior/physiology , Dopaminergic Neurons/metabolism , Learning/physiology , Prefrontal Cortex/metabolism , Reward , Animals , Dopaminergic Neurons/chemistry , Male , Mice , Mice, Inbred C57BL , Mice, Transgenic , Optogenetics/methods , Prefrontal Cortex/chemistry
9.
Cell Rep ; 20(10): 2513-2524, 2017 Sep 05.
Article in English | MEDLINE | ID: mdl-28877482

ABSTRACT

Research in neuroscience increasingly relies on the mouse, a mammalian species that affords unparalleled genetic tractability and brain atlases. Here, we introduce high-yield methods for probing mouse visual decisions. Mice are head-fixed, facilitating repeatable visual stimulation, eye tracking, and brain access. They turn a steering wheel to make two alternative choices, forced or unforced. Learning is rapid thanks to intuitive coupling of stimuli to wheel position. The mice's decisions yield high-quality psychometric curves for detection and discrimination and conform to the predictions of a simple probabilistic observer model. The task is readily paired with two-photon imaging of cortical activity. Optogenetic inactivation reveals that the task requires mice to use their visual cortex. Mice are motivated to perform the task by fluid reward or optogenetic stimulation of dopamine neurons. This stimulation elicits a larger number of trials and faster learning. These methods provide a platform to accurately probe mouse vision and its neural basis.
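A psychometric curve of the kind described here maps stimulus contrast to choice probability. A minimal sketch of a logistic observer with a lapse rate (hypothetical parameterization, not the paper's fitted observer model):

```python
import math

def psychometric(contrast, sensitivity=8.0, bias=0.0, lapse=0.02):
    """Probability of a 'rightward' choice as a function of signed
    contrast: logistic core, squeezed by a symmetric lapse rate."""
    p = 1 / (1 + math.exp(-sensitivity * (contrast - bias)))
    return lapse + (1 - 2 * lapse) * p
```

High positive contrast drives the choice probability toward 1 - lapse, high negative contrast toward the lapse rate, and zero contrast yields chance performance; fitting sensitivity, bias, and lapse to the observed choices is what makes the curves quantitative.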


Subject(s)
Choice Behavior/physiology , Dopaminergic Neurons/metabolism , Psychophysics/methods , Visual Cortex/metabolism , Visual Cortex/physiology , Animals , Female , Male , Mice , Photic Stimulation