Results 1 - 5 of 5
2.
Nat Neurosci; 24(7): 1010-1019, 2021 Jul.
Article in English | MEDLINE | ID: mdl-33986551

ABSTRACT

Synaptic plasticity is believed to be a key physiological mechanism for learning. It is well established that it depends on pre- and postsynaptic activity. However, models that rely solely on pre- and postsynaptic activity for synaptic changes have, so far, not been able to account for learning complex tasks that demand credit assignment in hierarchical networks. Here we show that if synaptic plasticity is regulated by high-frequency bursts of spikes, then pyramidal neurons higher in a hierarchical circuit can coordinate the plasticity of lower-level connections. Using simulations and mathematical analyses, we demonstrate that, when paired with short-term synaptic dynamics, regenerative activity in the apical dendrites and synaptic plasticity in feedback pathways, a burst-dependent learning rule can solve challenging tasks that require deep network architectures. Our results demonstrate that well-known properties of dendrites, synapses and synaptic plasticity are sufficient to enable sophisticated learning in hierarchical circuits.
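For intuition, a minimal sketch of what a burst-dependent update might look like is given below. This is illustrative Python only, not the authors' published model: the constants, the Poisson-like presynaptic input, and the random stand-in for the dendrite-driven burst signal are all assumptions. The point it illustrates is that the occurrence of a postsynaptic event gates plasticity, while whether that event is a burst or an isolated spike sets the sign of the weight change, scaled by a presynaptic eligibility trace.

```python
import numpy as np

# Hypothetical sketch of a burst-dependent plasticity rule (illustrative only).
rng = np.random.default_rng(0)
n_pre, eta, tau_e, tau_b = 50, 1e-3, 20.0, 200.0  # assumed constants

w = rng.normal(0.0, 0.1, n_pre)   # synaptic weights
e_trace = np.zeros(n_pre)         # presynaptic eligibility traces
p_bar = 0.2                       # running estimate of the burst fraction

for t in range(1000):
    pre_spikes = rng.random(n_pre) < 0.05     # Poisson-like presynaptic spikes
    e_trace += -e_trace / tau_e + pre_spikes  # low-pass filter of presynaptic activity

    event = rng.random() < 0.10               # did a postsynaptic event occur?
    if event:
        is_burst = rng.random() < 0.3         # stand-in for a dendrite-driven burst signal
        p_bar += (is_burst - p_bar) / tau_b   # slowly track the expected burst fraction
        # Potentiate when the event is a burst more often than expected,
        # depress when it is an isolated spike: the deviation carries the credit signal.
        w += eta * (float(is_burst) - p_bar) * e_trace
```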


Subject(s)
Deep Learning; Learning/physiology; Models, Neurological; Neuronal Plasticity/physiology; Pyramidal Cells/physiology; Animals; Humans
3.
Sci Rep; 11(1): 8148, 2021 Apr 14.
Article in English | MEDLINE | ID: mdl-33854104

ABSTRACT

We present BonZeb, a suite of modular Bonsai packages that allow high-resolution zebrafish tracking with dynamic visual feedback. Bonsai is an increasingly popular software platform that is accelerating the standardization of experimental protocols within the neurosciences due to its speed, flexibility, and minimal programming overhead. BonZeb can be integrated into novel and existing Bonsai workflows for online behavioral tracking and for offline tracking with batch processing. We demonstrate that BonZeb can run a variety of experimental configurations used for gaining insights into the neural mechanisms of zebrafish behavior. BonZeb supports head-fixed closed-loop and free-swimming virtual open-loop assays as well as multi-animal tracking, optogenetic stimulation, and calcium imaging during behavior. The combined performance, ease of use and versatility of BonZeb opens new experimental avenues for researchers seeking high-resolution behavioral tracking of larval zebrafish.
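As an orientation to the closed-loop pattern such assays follow, a schematic Python sketch is shown below. BonZeb itself is built from Bonsai's reactive, graphical workflows rather than Python, and acquire_frame and update_stimulus here are hypothetical placeholders: the sketch only illustrates the per-frame loop of segmenting a frame, extracting the animal's position, and updating the visual stimulus accordingly.

```python
import numpy as np

# Schematic closed-loop tracking-with-feedback loop (illustration only;
# not BonZeb's API). acquire_frame() and update_stimulus() are placeholders.

def acquire_frame():
    # Stand-in for a camera grab: a dark arena with one bright "fish" pixel.
    frame = np.zeros((480, 640))
    y, x = np.random.randint(0, 480), np.random.randint(0, 640)
    frame[y, x] = 255.0
    return frame

def track_centroid(frame, threshold=128.0):
    # Simple threshold tracker: centroid of pixels brighter than threshold.
    ys, xs = np.nonzero(frame > threshold)
    return (xs.mean(), ys.mean()) if xs.size else None

def update_stimulus(centroid):
    # Placeholder for dynamic visual feedback locked to the tracked position.
    print(f"stimulus centered at {centroid}")

for _ in range(100):                      # per-frame closed loop
    centroid = track_centroid(acquire_frame())
    if centroid is not None:
        update_stimulus(centroid)
```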


Subject(s)
Swimming/physiology; Video Recording/methods; Zebrafish/physiology; Animals; Behavior, Animal/physiology; Calcium/metabolism; Optogenetics/instrumentation; Software; Video Recording/instrumentation
4.
PLoS Comput Biol; 14(8): e1006315, 2018 Aug.
Article in English | MEDLINE | ID: mdl-30067746

ABSTRACT

Symptoms of schizophrenia may arise from a failure of cortical circuits to filter out irrelevant inputs. Schizophrenia has also been linked to disruptions in cortical inhibitory interneurons, consistent with the possibility that in the normally functioning brain, these cells are partly responsible for determining which sensory inputs are relevant versus irrelevant. Here, we develop a neural network model that demonstrates how the cortex may learn to ignore irrelevant inputs through plasticity processes affecting inhibition. The model is based on the proposal that the amount of excitatory output from a cortical circuit encodes the expected magnitude of reward or punishment ("relevance"), which can be trained using a temporal difference learning mechanism acting on feedforward inputs to inhibitory interneurons. In the model, irrelevant and blocked stimuli drive lower levels of excitatory activity compared with novel and relevant stimuli, and this difference in activity levels is lost following disruptions to inhibitory units. When excitatory units are connected to a competitive-learning output layer with a threshold, the relevance code can be shown to "gate" both learning and behavioral responses to irrelevant stimuli. Accordingly, the combined network is capable of recapitulating published experimental data linking inhibition in frontal cortex with fear learning and expression. Finally, the model demonstrates how relevance learning can take place in parallel with other types of learning, through plasticity rules involving inhibitory and excitatory components, respectively. Altogether, this work offers a theory of how the cortex learns to selectively inhibit inputs, providing insight into how relevance-assignment problems may emerge in schizophrenia.
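A minimal, hypothetical Python sketch of the relevance-learning idea follows. The network sizes, learning rate, and the simplified one-step prediction error are assumptions for illustration, not the paper's code: a prediction error between the outcome magnitude and the circuit's excitatory output trains the feedforward weights onto an inhibitory unit, so consistently irrelevant stimuli become suppressed while relevant ones stay disinhibited.

```python
import numpy as np

# Hypothetical sketch of relevance learning via plasticity of inhibition.
rng = np.random.default_rng(1)
n_stim, eta = 4, 0.1
w_exc = np.ones(n_stim) * 0.5      # feedforward weights onto the excitatory unit
w_inh = np.zeros(n_stim)           # feedforward weights onto the inhibitory unit

def circuit_output(x):
    # Excitatory drive minus learned inhibition, rectified to a firing rate.
    return max(w_exc @ x - w_inh @ x, 0.0)

relevance = np.array([1.0, 0.0, 1.0, 0.0])  # |reward or punishment| per stimulus

for trial in range(500):
    s = rng.integers(n_stim)
    x = np.zeros(n_stim); x[s] = 1.0
    v = circuit_output(x)                   # predicted relevance
    delta = relevance[s] - v                # simplified one-step prediction error
    # Irrelevant stimuli (delta < 0) grow their inhibitory weights, so the
    # circuit learns to filter them out.
    w_inh[s] = max(w_inh[s] - eta * delta, 0.0)

print(np.round([circuit_output(np.eye(n_stim)[i]) for i in range(n_stim)], 2))
```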


Subject(s)
Learning/physiology; Neuronal Plasticity/physiology; Schizophrenia/physiopathology; Action Potentials/physiology; Interneurons/physiology; Models, Biological; Models, Neurological; Models, Theoretical; Nerve Net/physiology; Neural Inhibition/physiology; Neural Networks, Computer
5.
Elife; 6, 2017 Dec 05.
Article in English | MEDLINE | ID: mdl-29205151

ABSTRACT

Deep learning has led to significant advances in artificial intelligence, in part by adopting strategies motivated by neurophysiology. However, it is unclear whether deep learning could occur in the real brain. Here, we show that a deep learning algorithm that utilizes multi-compartment neurons might help us to understand how the neocortex optimizes cost functions. Like neocortical pyramidal neurons, neurons in our model receive sensory information and higher-order feedback in electrotonically segregated compartments. Thanks to this segregation, neurons in different layers of the network can coordinate synaptic weight updates. As a result, the network learns to categorize images better than a single layer network. Furthermore, we show that our algorithm takes advantage of multilayer architectures to identify useful higher-order representations, the hallmark of deep learning. This work demonstrates that deep learning can be achieved using segregated dendritic compartments, which may help to explain the morphology of neocortical pyramidal neurons.
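To make the architecture concrete, here is a toy Python sketch in the spirit of the abstract. The layer sizes, nonlinearity, toy task, and learning rates are assumptions for illustration, not the authors' implementation: each hidden unit has a basal compartment driven by feedforward input and an apical compartment driven by top-down feedback through fixed random weights, and the apical signal determines the hidden-layer weight update.

```python
import numpy as np

# Toy sketch of learning with segregated basal/apical compartments (illustrative only).
rng = np.random.default_rng(2)
n_in, n_hid, n_out, eta = 20, 30, 5, 0.05

W0 = rng.normal(0, 0.1, (n_hid, n_in))   # feedforward: input -> basal compartments
W1 = rng.normal(0, 0.1, (n_out, n_hid))  # feedforward: hidden -> output
B  = rng.normal(0, 0.1, (n_hid, n_out))  # fixed feedback: output -> apical compartments

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(1000):
    x = rng.random(n_in)
    target = np.zeros(n_out)
    target[int(x.sum()) % n_out] = 1.0   # arbitrary toy classification task

    basal = W0 @ x                 # basal (feedforward) drive
    h = sigmoid(basal)             # hidden somatic firing rates
    y = sigmoid(W1 @ h)            # output firing rates

    apical_fwd = B @ y             # apical drive with the current output
    apical_tgt = B @ target        # apical drive when the output is nudged to the target
    # Local updates: the apical difference stands in for the credit signal.
    W0 += eta * np.outer((apical_tgt - apical_fwd) * h * (1 - h), x)
    W1 += eta * np.outer(target - y, h)
```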


Subject(s)
Artificial Intelligence; Machine Learning; Neural Networks, Computer; Models, Neurological