1. Proc Natl Acad Sci U S A; 117(34): 20881-20889, 2020 Aug 25.
Article in English | MEDLINE | ID: mdl-32788365

ABSTRACT

Language processing involves the ability to store and integrate pieces of information in working memory over short periods of time. According to the dominant view, information is maintained through sustained, elevated neural activity. Other work has argued that short-term synaptic facilitation can serve as a substrate of memory. Here we propose an account where memory is supported by intrinsic plasticity that downregulates neuronal firing rates. Single neuron responses are dependent on experience, and we show through simulations that these adaptive changes in excitability provide memory on timescales ranging from milliseconds to seconds. On this account, spiking activity writes information into coupled dynamic variables that control adaptation and move at slower timescales than the membrane potential. From these variables, information is continuously read back into the active membrane state for processing. This neuronal memory mechanism does not rely on persistent activity, excitatory feedback, or synaptic plasticity for storage. Instead, information is maintained in adaptive conductances that reduce firing rates and can be accessed directly without cued retrieval. Memory span is systematically related to both the time constant of adaptation and baseline levels of neuronal excitability. Interference effects within memory arise when adaptation is long lasting. We demonstrate that this mechanism is sensitive to context and serial order, which makes it suitable for temporal integration in sequence processing within the language domain. We also show that it enables the binding of linguistic features over time within dynamic memory registers. This work provides a step toward a computational neurobiology of language.


Subject(s)
Memory, Short-Term/physiology; Neuronal Plasticity/physiology; Neurons/physiology; Animals; Humans; Language; Models, Neurological; Neural Networks, Computer; Neurons/metabolism; Synapses/physiology
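The core idea of this abstract, that spikes write information into a slow adaptation variable which then lowers the firing rate, can be illustrated with a minimal leaky integrate-and-fire neuron carrying spike-triggered adaptation. This is an illustrative toy, not the authors' published model; all parameter values (time constants, increments, thresholds) are assumptions chosen only to make the slow memory trace visible.

```python
import numpy as np

def simulate_adaptive_lif(input_current, dt=1.0, tau_m=20.0, tau_a=500.0,
                          v_thresh=1.0, v_reset=0.0, delta_a=0.2):
    """Leaky integrate-and-fire neuron with a slow adaptation variable.

    Each spike increments the adaptation variable `a`, which decays with
    time constant tau_a (much slower than the membrane tau_m) and subtracts
    from the drive. After a stimulus ends, `a` still encodes how recently
    and how strongly the neuron fired -- the "memory" of the mechanism.
    """
    n = len(input_current)
    v, a = 0.0, 0.0
    spikes = np.zeros(n, dtype=bool)
    a_trace = np.zeros(n)
    for t in range(n):
        v += dt * (-v + input_current[t] - a) / tau_m
        a += dt * (-a / tau_a)           # slow decay: the memory trace
        if v >= v_thresh:
            spikes[t] = True
            v = v_reset
            a += delta_a                 # spike writes into the slow variable
        a_trace[t] = a
    return spikes, a_trace

# A brief current pulse leaves a trace in `a` long after the input ends.
I = np.zeros(2000)
I[200:400] = 2.0                         # 200 ms pulse (dt = 1 ms)
spikes, a_trace = simulate_adaptive_lif(I)
```

In this sketch, making `tau_a` larger lengthens the memory span and raising the baseline drive changes how quickly the trace is overwritten, mirroring the abstract's claim that memory span depends on the adaptation time constant and baseline excitability.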
2. PLoS Comput Biol; 12(6): e1004895, 2016 Jun.
Article in English | MEDLINE | ID: mdl-27309381

ABSTRACT

As the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed reactions to inputs and the self-sustained generation of complex activity patterns, e.g., for locomotion. Many such computations require the prior building of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. First, we derive constraints under which classes of spiking neural networks can serve as substrates of powerful general-purpose computing. The networks contain dendritic or synaptic nonlinearities and have constrained connectivity. We then combine such networks with learning rules for output or recurrent connections. We show that this makes it possible to learn even difficult benchmark tasks, such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them.


Subject(s)
Action Potentials/physiology; Learning/physiology; Models, Neurological; Animals; Computational Biology; Humans; Memory, Long-Term/physiology; Nerve Net/physiology; Neural Networks, Computer; Neurons/physiology; Nonlinear Dynamics; Synaptic Transmission/physiology