1.
Neural Netw ; 164: 275-309, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37163846

ABSTRACT

Neocortical pyramidal neurons have many dendrites, and such dendrites are capable, in isolation from one another, of generating a neuronal spike. It is also now understood that a large amount of dendritic growth occurs during the first years of a human's life, arguably a period of prodigious learning. These observations inspire the construction of a local, stochastic algorithm based on an earlier stochastic, homeostatic, Hebbian developmental theory. Here we investigate the neurocomputational advantages and limits of this novel algorithm, which combines dendritogenesis with supervised adaptive synaptogenesis. Neurons created with this algorithm have enhanced memory capacity, can avoid catastrophic interference (forgetting), and have the ability to unmix mixture distributions. In particular, individual dendrites develop within each class, in an unsupervised manner, to become feature clusters that correspond to the mixing elements of the class-conditional mixture distribution. Error-free classification is demonstrated with input perturbations of up to 40%. Although discriminative problems are used to understand the capabilities of the stochastic algorithm and the neuronal connectivity it produces, the algorithm is in the generative class; it thus seems ideal for decisions that require generalization, i.e., extrapolation beyond previous learning.
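As a rough illustration of the feature-cluster idea in this abstract, the sketch below builds one toy "neuron" per class whose dendrites act as competitive cluster prototypes over class-conditional mixture data. The winner-take-all update, the Gaussian mixture inputs, and every parameter are assumptions chosen only to make the example runnable; this is not the authors' published algorithm.

```python
# Illustrative sketch only: dendrites as per-class feature clusters that
# "unmix" a class-conditional mixture. Update rule and parameters assumed.
import numpy as np

rng = np.random.default_rng(0)

class DendriticNeuron:
    """One neuron per class; each dendrite is a cluster prototype."""
    def __init__(self, n_dendrites, n_inputs, lr=0.1):
        self.prototypes = rng.normal(size=(n_dendrites, n_inputs))
        self.lr = lr

    def dendritic_match(self, x):
        # Each dendrite's response: closeness of the input to its field.
        return -np.sum((self.prototypes - x) ** 2, axis=1)

    def learn(self, x):
        # Unsupervised, within-class competition: only the best-matching
        # dendrite moves toward the input (a local, Hebbian-style update).
        winner = np.argmax(self.dendritic_match(x))
        self.prototypes[winner] += self.lr * (x - self.prototypes[winner])

    def response(self, x):
        # The neuron answers with its strongest dendritic response.
        return np.max(self.dendritic_match(x))

# Each class is a mixture of two Gaussian "features" in 10 dimensions.
centers = {c: [rng.normal(size=10), rng.normal(size=10)] for c in (0, 1)}
neurons = {c: DendriticNeuron(n_dendrites=2, n_inputs=10) for c in (0, 1)}

for _ in range(2000):                      # within-class training
    c = int(rng.integers(2))
    x = centers[c][int(rng.integers(2))] + 0.1 * rng.normal(size=10)
    neurons[c].learn(x)

correct = 0
for _ in range(500):                       # classify by the strongest neuron
    c = int(rng.integers(2))
    x = centers[c][int(rng.integers(2))] + 0.1 * rng.normal(size=10)
    correct += int(max(neurons, key=lambda k: neurons[k].response(x)) == c)
print(f"accuracy: {correct / 500:.2f}")
```

With well-separated mixture components, each dendrite should settle on one mixing element of its class, and classification by the strongest neuron should then be nearly error-free, mirroring the unmixing behavior described above.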


Subject(s)
Dendrites; Synapses; Humans; Dendrites/physiology; Synapses/physiology; Neurons/physiology; Pyramidal Cells/physiology; Learning; Models, Neurological
2.
Neural Netw ; 122: 68-93, 2020 Feb.
Article in English | MEDLINE | ID: mdl-31675628

ABSTRACT

The immense complexity of the brain requires that it be built and controlled by intrinsic, self-regulating mechanisms. One such mechanism, the formation of new connections via synaptogenesis, plays a central role in neuronal connectivity and, ultimately, performance. Adaptive synaptogenesis networks combine synaptogenesis, associative synaptic modification, and synaptic shedding to construct sparse networks. Here, inspired by neuroscientific observations, novel aspects of brain development are incorporated into adaptive synaptogenesis. The extensions include: (i) multiple layers, (ii) neuron survival and death based on information transmission, and (iii) bigrade growth factor signaling to control the onset of synaptogenesis in succeeding layers and to control neuron survival and death in preceding layers. Also guiding this research is the assumption that brains must achieve a compromise between good performance and low energy expenditures. Simulations of the network model demonstrate the parametric and functional control of both performance and energy expenditures, where performance is measured in terms of information loss and classification errors, and energy expenditures are assumed to be a monotonically increasing function of the number of neurons. Major insights from this study include (a) the key role that a neural layer between two other layers plays in controlling synaptogenesis and neuron elimination, (b) the performance and energy-savings benefits of delaying the onset of synaptogenesis in a succeeding layer, and (c) how the elimination of neurons in a preceding layer provides energy savings and code compression, and can be accomplished without significantly degrading information transfer or classification performance.
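As a deliberately small companion to the three core mechanisms named above, the toy loop below combines stochastic synapse formation, a covariance-style associative weight update, and shedding of weak synapses, driven by input that contains one correlated "pattern" group. All rates, thresholds, and the firing rule are assumptions for a runnable sketch, not the parameters of the multi-layer network model simulated in the paper.

```python
# Toy adaptive-synaptogenesis loop: stochastic growth, associative
# modification, and shedding. Every constant here is an assumption.
import numpy as np

rng = np.random.default_rng(1)

n_pre, n_post = 40, 8     # 10 correlated "pattern" inputs + 30 background
n_pattern = 10
p_new = 0.001             # per-step chance a missing synapse forms (assumed)
eta, decay = 0.05, 0.02   # learning rate and weight decay (assumed)
shed_below = 0.01         # synapses weaker than this are shed (assumed)

connected = np.zeros((n_post, n_pre), dtype=bool)
w = np.zeros((n_post, n_pre))
p_y = np.zeros(n_post)    # running estimate of each neuron's firing rate

for step in range(5000):
    pattern_on = rng.random() < 0.5
    x = np.zeros(n_pre)
    x[:n_pattern] = float(pattern_on)                 # correlated group
    x[n_pattern:] = (rng.random(n_pre - n_pattern) < 0.05).astype(float)
    y = (w @ x > 0.08).astype(float)                  # assumed firing rule

    # 1) Synaptogenesis: missing synapses form at random, with nascent weight.
    grow = (~connected) & (rng.random((n_post, n_pre)) < p_new)
    connected |= grow
    w[grow] = 0.05

    # 2) Associative modification (covariance-style): inputs correlated with
    #    postsynaptic firing strengthen; uncorrelated ones drift and decay.
    p_y = 0.99 * p_y + 0.01 * y
    w += connected * (eta * np.outer(y - p_y, x) - decay * w)

    # 3) Shedding: synapses whose weight has decayed away are removed,
    #    keeping the network sparse.
    weak = connected & (w < shed_below)
    connected &= ~weak
    w[weak] = 0.0

print("pattern synapses kept:", connected[:, :n_pattern].sum(),
      "| background kept:", connected[:, n_pattern:].sum())
```

Over time, synapses onto the correlated inputs should be retained and strengthened while most randomly formed background synapses decay and are shed, illustrating in miniature the sparse, selective connectivity that adaptive synaptogenesis is designed to produce.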


Subject(s)
Homeostasis; Models, Neurological; Neural Networks, Computer; Brain/physiology; Humans; Synaptic Transmission
3.
J Neurosci ; 22(11): 4746-55, 2002 Jun 01.
Article in English | MEDLINE | ID: mdl-12040082

ABSTRACT

Organisms evolve as compromises, and many of these compromises can be expressed in terms of energy efficiency. For example, a compromise between the rate of information processing and the energy consumed might explain certain neurophysiological and neuroanatomical observations (e.g., average firing frequency and number of neurons). This perspective reveals that the randomness injected into neural processing by the statistical uncertainty of synaptic transmission optimizes one kind of information processing relative to energy use. A critical hypothesis and insight is that neuronal information processing is appropriately measured, first, by considering dendrosomatic summation as a Shannon-type channel (Shannon, 1948) and, second, by considering such uncertain synaptic transmission as part of the dendrosomatic computation rather than as part of axonal information transmission. Using such a model of neural computation and matching the information gathered by dendritic summation to the axonal information transmitted, H(p*), conditions are defined that guarantee that synaptic failures can improve the energetic efficiency of neurons. Further development provides a general expression relating the optimal failure rate, f, to the average firing rate, p*, that is consistent with physiologically observed values. This relationship, f ≈ 4^(−H(p*)), generalizes across activity levels and is independent of the number of inputs to a neuron.
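As a worked check of the final expression, the snippet below evaluates f ≈ 4^(−H(p*)) for an assumed average firing rate p* = 0.1, where H is the binary Shannon entropy in bits; the specific value of p* is an illustrative assumption, not a figure from the paper.

```python
# Worked example of the reported relationship f ≈ 4^(-H(p*)).
from math import log2

def binary_entropy(p):
    """Shannon entropy (in bits) of a Bernoulli(p) source."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

p_star = 0.1                 # assumed average firing rate (illustrative)
H = binary_entropy(p_star)   # ≈ 0.469 bits
f = 4 ** (-H)                # ≈ 0.52

print(f"H(p*) = {H:.3f} bits -> optimal failure rate f ≈ {f:.2f}")
```

A low firing rate around 10% thus predicts an optimal failure rate near 50%, the kind of high synaptic failure rate the abstract describes as physiologically observed.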


Subject(s)
Computer Simulation; Energy Metabolism/physiology; Models, Neurological; Neurons/physiology; Synapses/physiology; Entropy; Information Theory; Mathematics; Neocortex/physiology; Synaptic Transmission/physiology