Results 1 - 20 of 39
1.
PLoS Comput Biol ; 20(6): e1012218, 2024 Jun 25.
Article in English | MEDLINE | ID: mdl-38917228

ABSTRACT

Ripples are a typical form of neural activity in hippocampal neural networks associated with the replay of episodic memories during sleep as well as with sleep-related plasticity and memory consolidation. The emergence of ripples has been observed both dependent on and independent of input from other brain areas and often coincides with dendritic spikes. Yet, it is unclear how input-evoked and spontaneous ripples, as well as dendritic excitability, affect plasticity and consolidation. Here, we use mathematical modeling to compare these cases. We find that consolidation as well as the emergence of spontaneous ripples depends on a reliable propagation of activity in the feed-forward structures which constitute memory representations. This propagation is facilitated by excitable dendrites, which entail that a few strong synapses are sufficient to trigger neuronal firing. In this situation, stimulation-evoked ripples lead to the potentiation of weak synapses within the feed-forward structure and, thus, to the consolidation of a more general sequence memory. However, spontaneous ripples that occur without stimulation only consolidate a sparse backbone of the existing strong feed-forward structure. Based on this, we test a recently hypothesized scenario in which the excitability of dendrites is transiently enhanced after learning, and show that such a transient increase can strengthen, restructure, and consolidate even weak hippocampal memories, which would otherwise be forgotten. Hence, a transient increase in dendritic excitability would indeed provide a mechanism for stabilizing memories.
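The role of excitable dendrites described above can be illustrated with a toy feed-forward chain. This is a minimal sketch under assumptions of our own (group sizes, thresholds, and weights are invented, not taken from the paper): a neuron fires either when its summed input crosses a somatic threshold or, with excitable dendrites, when a single strong synapse crosses a lower dendritic-spike threshold.

```python
import numpy as np

def propagate(chain, soma_thr=1.0, dend_thr=0.6, excitable=True):
    """Propagate activity along a feed-forward chain of weight matrices.

    chain[g][i, j] is the synapse from neuron j in group g to neuron i in
    group g+1. A neuron fires if its summed input reaches soma_thr or --
    with excitable dendrites -- if any single active synapse reaches
    dend_thr (a dendritic spike). Returns how many groups become active.
    """
    active = np.ones(chain[0].shape[1], dtype=bool)  # stimulate group 0
    reached = 1
    for W in chain:
        inp = W[:, active]                  # inputs from active neurons only
        fires = inp.sum(axis=1) >= soma_thr
        if excitable:
            fires |= (inp >= dend_thr).any(axis=1)
        if not fires.any():
            break
        active = fires
        reached += 1
    return reached

rng = np.random.default_rng(0)
chain = []
for _ in range(9):                                  # 10 groups of 20 neurons
    W = rng.uniform(0.0, 0.01, size=(20, 20))       # weak background synapses
    W[np.arange(20), rng.integers(0, 20, size=20)] = 0.7  # one strong synapse each
    chain.append(W)
```

With excitable dendrites the activity traverses all ten groups, whereas the purely somatic variant fails at the first stage, mirroring the claim that dendritic excitability enables reliable propagation through a sparse strong backbone.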

2.
Sci Rep ; 14(1): 11054, 2024 05 14.
Article in English | MEDLINE | ID: mdl-38744976

ABSTRACT

Brain machine interfaces (BMIs) can substantially improve the quality of life of elderly or disabled people. However, performing complex action sequences with a BMI system is onerous because commands must be issued sequentially. In a fundamentally different approach, we have designed a BMI system that reads out mental planning activity and issues commands proactively. To demonstrate this, we recorded brain activity from freely moving monkeys performing an instructed task and decoded it with an energy-efficient, small, and mobile field-programmable gate array hardware decoder triggering real-time action execution on smart devices. At its core is an adaptive decoding algorithm that can compensate for day-by-day neuronal signal fluctuations with minimal re-calibration effort. We show that open-loop planning-ahead control is possible using signals from primary and pre-motor areas, leading to a significant time gain in the execution of action sequences. This novel approach thus provides a stepping stone towards improved and more humane control of different smart environments with mobile brain machine interfaces.
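As a loose illustration of why continual adaptation matters, the following sketch (our own toy construction, not the paper's decoder) pits a static nearest-class-mean decoder against one whose class means are updated after every trial while the recorded signal slowly drifts:

```python
import numpy as np

def decode(means, x):
    """Nearest-class-mean decoding of the planned action."""
    return int(np.argmin(np.linalg.norm(means - x, axis=1)))

def adapt(means, x, label, eta=0.1):
    """Exponential-moving-average update of one class mean -- a toy
    stand-in for the paper's adaptive algorithm, showing how continual
    small updates can track slow signal drift."""
    means[label] = (1 - eta) * means[label] + eta * x
    return means

rng = np.random.default_rng(1)
base = np.array([[0.0, 0.0], [4.0, 4.0]])    # two planned actions
static, adaptive = base.copy(), base.copy()
ok_static = ok_adapt = 0
for t in range(400):
    drift = 6.0 * t / 400                    # slow shift of the recording
    label = t % 2
    x = base[label] + drift + rng.normal(0, 0.3, 2)
    ok_static += decode(static, x) == label
    ok_adapt += decode(adaptive, x) == label
    adaptive = adapt(adaptive, x, label)
```

The adaptive decoder stays near-perfect while the static one degrades once the drift exceeds half the class separation.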


Subject(s)
Algorithms , Brain-Computer Interfaces , Animals , Brain/physiology , Macaca mulatta
3.
PLoS Comput Biol ; 20(3): e1011926, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38442095

ABSTRACT

In many situations it is behaviorally relevant for an animal to respond to co-occurrences of perceptual, possibly polymodal features, while these features alone may have no importance. Thus, it is crucial for animals to learn such feature combinations even though they may occur with variable intensity and frequency. Here, we present a novel unsupervised learning mechanism that is largely independent of these contingencies and allows neurons in a network to achieve specificity for different feature combinations. This is achieved by a novel correlation-based (Hebbian) learning rule, which allows for linear weight growth and which is combined with a mechanism for gradually reducing the learning rate as soon as the neuron's response becomes feature-combination specific. In a set of control experiments, we show that other existing advanced learning rules cannot satisfactorily form ordered multi-feature representations. In addition, we show that networks that use this type of learning always stabilize and converge to subsets of neurons with different feature-combination specificity. Neurons with this property may thus serve as an initial stage for the processing of ecologically relevant real-world situations for an animal.
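The interplay of Hebbian growth and learning-rate annealing can be caricatured as follows. This is not the paper's rule; it is a deliberately simplified thresholded-Hebb toy (all parameters invented) that shows how reducing the learning rate with rising response specificity freezes a feature-combination preference:

```python
import numpy as np

rng = np.random.default_rng(2)
c0 = np.array([1.0, 1.0, 0.0, 0.0])    # feature combination A
c1 = np.array([0.0, 0.0, 1.0, 1.0])    # feature combination B
w = np.array([0.6, 0.6, 0.4, 0.4])     # slight initial bias toward A
eta, theta = 0.05, 1.0
for _ in range(300):
    # present one combination with variable intensity
    x = (c0 if rng.random() < 0.5 else c1) * rng.uniform(0.8, 1.2)
    y = float(w @ x)
    if y > theta:                       # thresholded Hebbian growth
        w += eta * y * x
    # anneal the learning rate as the response becomes combination-specific
    r_a, r_b = float(w @ c0), float(w @ c1)
    eta = 0.05 * max(0.0, 1.0 - abs(r_a - r_b) / (r_a + r_b))
```

The neuron ends up strongly selective for combination A, and the annealed learning rate prevents further runaway growth.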


Subject(s)
Models, Neurological , Unsupervised Machine Learning , Animals , Neurons/physiology
4.
Front Neuroinform ; 16: 1015624, 2022.
Article in English | MEDLINE | ID: mdl-36439945

ABSTRACT

Developing intelligent neuromorphic solutions remains a challenging endeavor. It requires a solid conceptual understanding of the hardware's fundamental building blocks. Beyond this, accessible and user-friendly prototyping is crucial to speed up the design pipeline. We developed an open-source Loihi emulator based on the neural network simulator Brian that can easily be incorporated into existing simulation workflows. We demonstrate errorless Loihi emulation in software for a single neuron and for a recurrently connected spiking neural network. On-chip learning is also reviewed and implemented, with only minor discrepancies due to stochastic rounding. This work provides a coherent presentation of Loihi's computational unit and introduces a new, easy-to-use Loihi prototyping package with the aim of helping to streamline the conceptualization and deployment of new algorithms.
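The stochastic rounding mentioned as the source of the remaining discrepancy works as follows (a generic sketch of the technique, not Loihi's exact fixed-point pipeline): a value is rounded up with probability equal to its fractional part, which makes the rounding unbiased in expectation.

```python
import numpy as np

def stochastic_round(x, rng):
    """Stochastic rounding: round x up with probability frac(x), down
    otherwise. Unlike round-to-nearest, it is unbiased in expectation,
    but two runs of a learning rule using it can diverge."""
    lo = np.floor(x)
    return lo + (rng.random(np.shape(x)) < (x - lo))

rng = np.random.default_rng(3)
rounded = stochastic_round(np.full(10000, 2.3), rng)  # mean stays near 2.3
```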

5.
Sci Rep ; 12(1): 17772, 2022 10 22.
Article in English | MEDLINE | ID: mdl-36273097

ABSTRACT

Events that are important to an individual's life trigger neuromodulator release in brain areas responsible for cognitive and behavioral function. While it is well known that the presence of neuromodulators such as dopamine and norepinephrine is required for memory consolidation, the impact of neuromodulator concentration is, however, less understood. In a recurrent spiking neural network model featuring neuromodulator-dependent synaptic tagging and capture, we study how synaptic memory consolidation depends on the amount of neuromodulator present in the minutes to hours after learning. We find that the storage of rate-based and spike timing-based information is controlled by the level of neuromodulation. Specifically, we find better recall of temporal information for high levels of neuromodulation, while we find better recall of rate-coded spatial patterns for lower neuromodulation, mediated by the selection of different groups of synapses for consolidation. Hence, our results indicate that in minutes to hours after learning, the level of neuromodulation may alter the process of synaptic consolidation to ultimately control which type of information becomes consolidated in the recurrent neural network.
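A minimal version of neuromodulator-gated synaptic tagging and capture can be sketched like this (our own toy dynamics and parameters, not the model in the paper): an early-phase weight change sets a tag, the neuromodulator level drives protein synthesis, and only the conjunction of both converts the change into a late-phase, consolidated weight.

```python
def run_stc(nm, steps=240, dt=1.0):
    """Simulate minutes-to-hours of consolidation at one synapse.
    nm: neuromodulator level; h: early-phase weight change; z: late-phase
    (consolidated) weight; p: plasticity-related protein amount."""
    h, z, p, tagged = 1.0, 0.0, 0.0, False
    for _ in range(steps):
        tagged = tagged or abs(h) > 0.5     # a strong change sets the tag
        p += dt * (nm - p / 30.0)           # neuromodulator drives synthesis
        if tagged and p > 1.0:              # capture: consolidate the change
            z += dt * 0.1 * h
        h -= dt * h / 60.0                  # early phase decays within hours
    return z

z_high = run_stc(nm=0.2)    # strong neuromodulation: change is consolidated
z_low = run_stc(nm=0.02)    # weak neuromodulation: change fades away
```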


Subject(s)
Dopamine , Models, Neurological , Neural Networks, Computer , Synapses , Neurotransmitter Agents , Norepinephrine , Neuronal Plasticity
6.
PLoS One ; 17(5): e0266679, 2022.
Article in English | MEDLINE | ID: mdl-35617161

ABSTRACT

Spike timing-dependent plasticity, related to differential Hebb rules, has become a leading paradigm in neuronal learning, because weights can grow or shrink depending on the timing of pre- and post-synaptic signals. Here we use this paradigm to reduce unwanted (acoustic) noise. Our system relies on heterosynaptic differential Hebbian learning, and we show that it can efficiently attenuate noise by up to 140 dB in multi-microphone setups under various conditions. The system learns quickly, most often within a few seconds, and is also robust to different geometrical microphone configurations. Hence, this theoretical study demonstrates that it is possible to successfully transfer differential Hebbian learning, derived from the neurosciences, into a technical domain.
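Since the paper's exact heterosynaptic rule is not reproduced here, the following sketch uses the classic LMS adaptive noise canceller as a stand-in to show the overall setup: a reference microphone near the noise source supplies the signal that an adaptive weight learns to subtract from the primary microphone.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 5000
noise = rng.standard_normal(T)                     # noise at its source
signal = np.sin(2 * np.pi * np.arange(T) / 50.0)   # wanted signal
mic = signal + 0.8 * noise                         # primary microphone
ref = noise                                        # reference microphone

# Adaptive noise cancellation with a plain LMS rule -- a classic stand-in,
# NOT the heterosynaptic differential Hebbian rule of the paper.
w, mu = 0.0, 0.001
out = np.empty(T)
for t in range(T):
    out[t] = mic[t] - w * ref[t]    # cleaned output
    w += mu * ref[t] * out[t]       # weight tracks the noise coupling

before = np.var(mic - signal)                   # noise power before
after = np.var(out[-1000:] - signal[-1000:])    # residual noise power
attenuation_db = 10.0 * np.log10(after / before)
```

The weight converges to the noise coupling (0.8) within a few thousand samples, attenuating the noise by well over 20 dB in this toy setting.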


Subject(s)
Learning , Neuronal Plasticity , Learning/physiology , Mathematics , Models, Neurological , Neuronal Plasticity/physiology , Neurons/physiology , Noise , Synapses/physiology
7.
Neuroscience ; 489: 290-300, 2022 05 01.
Article in English | MEDLINE | ID: mdl-34428499

ABSTRACT

BrainScaleS-2 is an accelerated and highly configurable neuromorphic system with physical models of neurons and synapses. Beyond networks of spiking point neurons, it allows for the implementation of user-defined neuron morphologies. Both passive propagation of electric signals between compartments as well as dendritic spikes and plateau potentials can be emulated. In this paper, three multi-compartment neuron morphologies are chosen to demonstrate passive propagation of postsynaptic potentials, spatio-temporal coincidence detection of synaptic inputs in a dendritic branch, and the replication of the BAC burst firing mechanism found in layer 5 pyramidal neurons of the neocortex.


Subject(s)
Neurons , Synapses , Action Potentials/physiology , Dendrites/physiology , Models, Neurological , Neurons/physiology , Pyramidal Cells
8.
Biology (Basel) ; 10(7)2021 Jun 24.
Article in English | MEDLINE | ID: mdl-34202473

ABSTRACT

Our brains process information using a layered hierarchical network architecture, with abundant connections within each layer and sparse long-range connections between layers. As these long-range connections are mostly unchanged after development, each layer has to locally self-organize in response to new inputs to enable information routing between the sparse in- and output connections. Here we demonstrate that this can be achieved by a well-established model of cortical self-organization based on a well-orchestrated interplay between several plasticity processes. After this self-organization, stimuli conveyed by sparse inputs can be rapidly read out from a layer using only very few long-range connections. To achieve this information routing, the neurons that are stimulated form feed-forward projections into the unstimulated parts of the same layer and recruit more neurons to represent the stimulus. Thereby, the plasticity processes ensure that each neuron receives projections from, and responds to, only one stimulus, such that the network is partitioned into parts with different preferred stimuli. In line with this, we show that the relation between the network activity and connectivity self-organizes into a biologically plausible regime. Finally, we argue how the emerging connectivity may minimize the metabolic cost for maintaining a network structure that rapidly transmits stimulus information despite sparse input and output connectivity.

9.
PLoS Comput Biol ; 17(3): e1008813, 2021 03.
Article in English | MEDLINE | ID: mdl-33750943

ABSTRACT

The maintenance of synaptic changes resulting from long-term potentiation (LTP) is essential for brain functions such as memory and learning. Different LTP phases have been associated with diverse molecular processes and pathways, and the molecular underpinnings of LTP on short as well as long time scales are well established. However, the principles on the intermediate time scale of 1-6 hours that mediate the early phase of LTP (E-LTP) remain elusive. We hypothesize that the interplay between specific features of postsynaptic receptor trafficking is responsible for sustaining synaptic changes during this LTP phase. We test this hypothesis by formalizing a biophysical model that integrates several experimentally motivated mechanisms. The model captures a wide range of experimental findings and predicts that synaptic changes are preserved for hours when the receptor dynamics are shaped by the interplay of structural changes of the spine in conjunction with increased trafficking from recycling endosomes and the cooperative binding of receptors. Furthermore, our model provides several predictions to verify our findings experimentally.
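The core idea, that cooperative receptor binding can make the potentiated state self-sustaining on the E-LTP timescale, can be sketched with a generic bistable toy model (our own construction; rates and thresholds are illustrative, not the paper's):

```python
def simulate(b0, coop=True, steps=20000, dt=0.01):
    """Bound-receptor dynamics at a synapse. b: bound receptor level.
    A constant slow supply fills slots and bound receptors unbind at rate 1.
    With cooperativity, binding is boosted once enough receptors are already
    bound, which creates a second, stable high-b state."""
    b = b0
    for _ in range(steps):
        supply = 0.01                                      # basal insertion
        boost = b * b / (0.25 + b * b) if coop else 0.0    # cooperative binding
        b += dt * (supply + boost - b)                     # unbinding at rate 1
    return b

# After LTP induction (high initial b), the cooperative synapse stays up:
maintained = simulate(0.8, coop=True)
# Without cooperativity, or after a too-small change, the state decays:
lost = simulate(0.8, coop=False)
subthreshold = simulate(0.2, coop=True)
```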


Subject(s)
Long-Term Potentiation/physiology , Models, Neurological , Animals , Computational Biology , Dendrites/metabolism , Endosomes/metabolism , Glutamic Acid/metabolism , Receptors, Glutamate/metabolism
10.
Neurosci Biobehav Rev ; 126: 398-412, 2021 07.
Article in English | MEDLINE | ID: mdl-33775693

ABSTRACT

Hippocampal region CA2 has received increased attention due to its importance in social recognition memory. While its specific function remains to be identified, there are indications that CA2 plays a major role in a variety of situations, widely extending beyond social memory. In this targeted review, we highlight lines of research which have begun to converge on a more fundamental role for CA2 in hippocampus-dependent memory processing. We discuss recent proposals that speak to the computations CA2 may perform within the hippocampal circuit.


Subject(s)
CA2 Region, Hippocampal , Memory , Cognition , Hippocampus , Humans , Recognition, Psychology
11.
Commun Biol ; 4(1): 275, 2021 03 03.
Article in English | MEDLINE | ID: mdl-33658641

ABSTRACT

The synaptic-tagging-and-capture (STC) hypothesis postulates that, at each synapse, the concurrence of a tag with protein synthesis yields the maintenance of changes induced by synaptic plasticity. This hypothesis provides a biological principle underlying the synaptic consolidation of memories, but it has not been verified for recurrent neural circuits. We developed a theoretical model integrating the mechanisms underlying the STC hypothesis with calcium-based synaptic plasticity in a recurrent spiking neural network. In the model, calcium-based synaptic plasticity yields the formation of strongly interconnected cell assemblies encoding memories, followed by consolidation through the STC mechanisms. Furthermore, we show for the first time that STC mechanisms modify the storage of memories such that after several hours memory recall is significantly improved. We identify two contributing processes: a merely time-dependent passive improvement, and an active improvement during recall. The described characteristics can provide a new principle for storing information in biological and artificial neural circuits.
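The calcium-based plasticity component can be sketched with the standard two-threshold picture (in the spirit of calcium-based models such as Graupner and Brunel's; parameters here are illustrative): calcium above a high threshold potentiates the synapse, calcium in an intermediate band depresses it.

```python
def calcium_rule(ca_trace, w0=0.5, theta_d=1.0, theta_p=1.3,
                 gamma_p=0.05, gamma_d=0.02):
    """Calcium-based plasticity sketch. ca_trace: calcium levels over time.
    Calcium >= theta_p drives potentiation toward w = 1; calcium between
    theta_d and theta_p drives depression toward w = 0."""
    w = w0
    for ca in ca_trace:
        if ca >= theta_p:
            w += gamma_p * (1.0 - w)
        elif ca >= theta_d:
            w -= gamma_d * w
    return w

strong = calcium_rule([1.5] * 100)   # mostly above theta_p: potentiation
weak = calcium_rule([1.1] * 100)     # only above theta_d: depression
```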


Subject(s)
Brain/physiology , Calcium Signaling , Memory Consolidation , Mental Recall , Models, Neurological , Nerve Net/physiology , Neuronal Plasticity , Synaptic Transmission , Brain/cytology , Humans , Nerve Net/cytology , Time Factors
12.
Sci Rep ; 11(1): 4012, 2021 02 17.
Article in English | MEDLINE | ID: mdl-33597561

ABSTRACT

Dendritic spines change their size and shape spontaneously, but the function of these fluctuations remains unclear. Here, we address this in a biophysical model of spine fluctuations, which reproduces experimentally measured fluctuations. For this, we characterize size and shape fluctuations from confocal microscopy image sequences using autoregressive models and a new set of shape descriptors derived from circular statistics. Using the biophysical model, we extrapolate into longer temporal intervals and find the presence of 1/f noise. When investigating its origins, the model predicts that the actin dynamics underlying shape fluctuations self-organizes into a critical state, which creates a fine balance between static actin filaments and free monomers. In a comparison against a non-critical model, we show that this state facilitates spine enlargement, which happens after LTP induction. Thus, ongoing spine shape fluctuations might be necessary to react quickly to plasticity events.
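The autoregressive description of such fluctuations can be sketched as follows (a generic AR(1) example with invented parameters; the paper fits richer models and shape descriptors):

```python
import numpy as np

def fit_ar1(x):
    """Least-squares estimate of the AR(1) coefficient phi in
    x[t] = phi * x[t-1] + noise."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return float(x[1:] @ x[:-1] / (x[:-1] @ x[:-1]))

rng = np.random.default_rng(5)
phi_true = 0.9
size = np.zeros(2000)        # spine-size fluctuation around its mean
for t in range(1, 2000):
    size[t] = phi_true * size[t - 1] + rng.standard_normal()
phi_hat = fit_ar1(size)      # recovers the generating coefficient
```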

13.
Neurosci Biobehav Rev ; 127: 946-957, 2021 08.
Article in English | MEDLINE | ID: mdl-33476672

ABSTRACT

The master clock, suprachiasmatic nucleus, is believed to control peripheral circadian oscillators throughout the brain and body. However, recent data suggest there is a circadian clock involved in learning and memory, potentially housed in the hippocampus, which is capable of acting independently of the master clock. Curiously, the hippocampal clock appears to be influenced by the master clock and by hippocampal dependent learning, while under certain conditions it may also revert to its endogenous circadian rhythm. Here we propose a mechanism by which the hippocampal clock could locally determine the nature of its entrainment. We introduce a novel theoretical framework, inspired by but extending beyond the hippocampal memory clock, which provides a new perspective on how circadian clocks throughout the brain coordinate their rhythms. Importantly, a local clock for memory would suggest that hippocampal-dependent learning at the same time every day should improve memory, opening up a range of possibilities for non-invasive therapies to alleviate the detrimental effects of circadian rhythm disruption on human health.


Subject(s)
Circadian Clocks , Brain , Circadian Rhythm , Humans , Learning , Suprachiasmatic Nucleus
15.
Front Neurorobot ; 14: 589532, 2020.
Article in English | MEDLINE | ID: mdl-33324191

ABSTRACT

Neuromorphic hardware has several promising advantages compared to von Neumann architectures and is highly interesting for robot control. However, despite the high speed and energy efficiency of neuromorphic computing, algorithms utilizing this hardware in control scenarios are still rare. One problem is the transition from fast spiking activity on the hardware, which acts on a timescale of a few milliseconds, to a control-relevant timescale on the order of hundreds of milliseconds. Another problem is the execution of complex trajectories, which requires spiking activity to contain sufficient variability, while at the same time, for reliable performance, network dynamics must be adequately robust against noise. In this study, we exploit a recently developed biologically inspired spiking neural network model, the so-called anisotropic network. We identified and transferred the core principles of the anisotropic network to neuromorphic hardware using Intel's neuromorphic research chip Loihi and validated the system on trajectories from a motor-control task performed by a robot arm. We developed a network architecture including the anisotropic network and a pooling layer which allows fast spike read-out from the chip and performs an inherent regularization. With this, we show that the anisotropic network on Loihi reliably encodes sequential patterns of neural activity, each representing a robotic action, and that the patterns allow the generation of multidimensional trajectories on control-relevant timescales. Taken together, our study presents a new algorithm that allows the generation of complex robotic movements as a building block for robotic control using state-of-the-art neuromorphic hardware.

16.
Int J Mol Sci ; 21(19)2020 Oct 02.
Article in English | MEDLINE | ID: mdl-33023247

ABSTRACT

Synapses play a central role for the processing of information in the brain and have been analyzed in countless biochemical, electrophysiological, imaging, and computational studies. The functionality and plasticity of synapses are nevertheless still difficult to predict, and conflicting hypotheses have been proposed for many synaptic processes. In this review, we argue that the cause of these problems is a lack of understanding of the spatiotemporal dynamics of key synaptic components. Fortunately, a number of emerging imaging approaches, going beyond super-resolution, should be able to provide required protein positions in space at different points in time. Mathematical models can then integrate the resulting information to allow the prediction of the spatiotemporal dynamics. We argue that these models, to deal with the complexity of synaptic processes, need to be designed in a sufficiently abstract way. Taken together, we suggest that a well-designed combination of imaging and modelling approaches will result in a far more complete understanding of synaptic function than currently possible.


Subject(s)
Brain/physiology , Models, Neurological , Models, Theoretical , Synapses/physiology , Animals , Humans , Motivation/physiology , Neuronal Plasticity/physiology , Synaptic Transmission/physiology , Synaptic Vesicles/physiology
17.
Front Neural Circuits ; 14: 541728, 2020.
Article in English | MEDLINE | ID: mdl-33117130

ABSTRACT

It is commonly assumed that memories about experienced stimuli are represented by groups of highly interconnected neurons called cell assemblies. This requires allocating and storing information in the neural circuitry, which happens through synaptic weight adaptations at different types of synapses. In general, memory allocation is associated with synaptic changes at feed-forward synapses, while memory storage is linked with the adaptation of recurrent connections. It remains, however, largely unknown how memory allocation and storage can be achieved, and how the adaptation of the different synapses involved can be coordinated to allow for a faithful representation of multiple memories without disruptive interference between them. In this theoretical study, using network simulations and phase space analyses, we show that the interplay between long-term synaptic plasticity and homeostatic synaptic scaling simultaneously organizes the adaptations of feed-forward and recurrent synapses such that a new stimulus forms a new memory and different stimuli are assigned to distinct cell assemblies. The resulting dynamics can reproduce experimental in-vivo data, focusing on how diverse factors, such as neuronal excitability and network connectivity, influence memory formation. Thus, the model presented here suggests that a few fundamental synaptic mechanisms may suffice to implement memory allocation and storage in neural circuitry.
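The claimed interplay can be sketched in a toy network (our own construction, with invented parameters): a Hebbian term strengthens synapses among co-active neurons, while multiplicative homeostatic scaling pulls average activity back toward a target, which keeps assembly weights bounded and erodes unused synapses.

```python
import numpy as np

def plasticity_step(w, pre, post, target=0.1, eta=0.01, kappa=0.05):
    """One combined step of Hebbian plasticity and homeostatic scaling."""
    w = w + eta * np.outer(post, pre)               # Hebbian growth
    w = w * (1.0 + kappa * (target - post.mean()))  # homeostatic scaling
    np.fill_diagonal(w, 0.0)                        # no self-synapses
    return np.clip(w, 0.0, 1.0)

rng = np.random.default_rng(6)
n = 20
w = rng.uniform(0.0, 0.1, (n, n))
stim = np.zeros(n)
stim[:10] = 1.0                              # stimulated cell assembly
for _ in range(200):
    w = plasticity_step(w, stim, stim)       # stimulated neurons fire
within = w[:10, :10].mean()                  # assembly weights: strengthened
outside = w[10:, 10:].mean()                 # unused weights: scaled away
```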


Subject(s)
Memory/physiology , Neural Pathways/physiology , Neuronal Plasticity/physiology , Neurons/physiology , Humans , Models, Neurological , Nerve Net , Neural Networks, Computer
18.
PLoS One ; 15(4): e0223743, 2020.
Article in English | MEDLINE | ID: mdl-32275703

ABSTRACT

In the course of everyday life, the brain must store and recall a huge variety of representations of stimuli which are presented in an ordered or sequential way. The processes by which the ordering of these various things is stored and recalled are moderately well understood. We use here a computational model of a cortex-like recurrent neural network adapted by a multitude of plasticity mechanisms. We first demonstrate the learning of a sequence. Then, we examine the influence of different types of distractors on the network dynamics during the recall of the encoded sequence. We broadly identify two distinct effect categories for distractors, arrive at a basic understanding of why this is so, and predict which distractors will fall into each category.


Subject(s)
Attention , Cerebral Cortex/physiology , Mental Recall , Models, Neurological , Arousal , Humans , Neuronal Plasticity
19.
Article in English | MEDLINE | ID: mdl-32218728

ABSTRACT

Dendritic spines are the morphological basis of excitatory synapses in the cortex, and their size and shape correlate with functional synaptic properties. Recent experiments show that spines exhibit large shape fluctuations that are not related to activity-dependent plasticity but nonetheless might influence memory storage at their synapses. To investigate the determinants of such spontaneous fluctuations, we propose a mathematical model for the dynamics of the spine shape and analyze it in 2D (related to experimental microscopic imagery) and in 3D. We show that the spine shape is governed by a local imbalance between membrane tension and the expansive force from actin bundles that originates from discrete actin polymerization foci. Experiments have shown that only a few such polymerization foci co-exist at any time in a spine, each with a limited lifetime. The model shows that the momentarily existing set of such foci pushes the membrane along certain directions until foci are replaced and other directions may be affected. We explore these relations in depth and use our model to predict the shape and temporal characteristics of spines from the different biophysical parameters involved in actin polymerization. Approximating the model by a single recursive equation, we finally demonstrate that the temporal evolution of the number of active foci is sufficient to predict the size of the model-spines. Thus, our model provides the first platform to study the relation between molecular and morphological properties of the spine with a high degree of biophysical detail.
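The final point, that the evolution of the number of active foci suffices to predict model-spine size, can be sketched with a pair of recursive equations (our own minimal caricature; rates are invented):

```python
import numpy as np

rng = np.random.default_rng(7)
T = 5000
n = np.zeros(T, dtype=int)   # number of active polymerization foci
v = np.zeros(T)              # model-spine size (arbitrary units)
v[0] = 1.0
for t in range(1, T):
    birth = rng.random() < 0.2              # a new focus appears
    deaths = rng.binomial(n[t - 1], 0.25)   # each focus has a limited lifetime
    n[t] = n[t - 1] + birth - deaths
    # recursive size update: foci push the membrane out, tension pulls it back
    v[t] = v[t - 1] + 0.05 * n[t] - 0.1 * (v[t - 1] - 1.0)
```

The spine size fluctuates around a baseline set by the mean focus count, and the focus-count trace is strongly correlated with the size trace.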

20.
Netw Neurosci ; 4(1): 174-199, 2020.
Article in English | MEDLINE | ID: mdl-32166207

ABSTRACT

Along sensory pathways, representations of environmental stimuli become increasingly sparse and expanded. If additionally the feed-forward synaptic weights are structured according to the inherent organization of stimuli, the increase in sparseness and expansion leads to a reduction of sensory noise. However, it is unknown how the synapses in the brain form the required structure, especially given the omnipresent noise of environmental stimuli. Here, we employ a combination of synaptic plasticity and intrinsic plasticity-adapting the excitability of each neuron individually-and present stimuli with an inherent organization to a feed-forward network. We observe that intrinsic plasticity maintains the sparseness of the neural code and thereby allows synaptic plasticity to learn the organization of stimuli in low-noise environments. Nevertheless, even high levels of noise can be handled after a subsequent phase of readaptation of the neuronal excitabilities by intrinsic plasticity. Interestingly, during this phase the synaptic structure has to be maintained. These results demonstrate that learning and recalling in the presence of noise requires the coordinated interplay between plasticity mechanisms adapting different properties of the neuronal circuit.
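The intrinsic-plasticity component can be sketched as a per-neuron threshold that tracks a sparse target rate (a generic sketch with invented parameters, not the paper's full model):

```python
import numpy as np

rng = np.random.default_rng(8)
n_in, n_out, target, eta = 50, 200, 0.05, 0.01
W = rng.uniform(0.0, 1.0, (n_out, n_in))     # fixed feed-forward weights
thr = np.zeros(n_out)                        # per-neuron firing thresholds
count = np.zeros(n_out)
for t in range(5000):
    x = (rng.random(n_in) < 0.2).astype(float)   # a noisy sparse stimulus
    y = (W @ x > thr).astype(float)              # binary responses
    thr += eta * (y - target)    # intrinsic plasticity: track the target rate
    if t >= 3000:                # measure rates after thresholds settle
        count += y
mean_rate = count.mean() / 2000.0
```

Each neuron's excitability settles so that the population fires at the sparse target rate regardless of the input statistics, which is the role intrinsic plasticity plays in the abstract above.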
