Results 1 - 11 of 11
1.
Biosystems; 220: 104756, 2022 Oct.
Article in English | MEDLINE | ID: mdl-35940498

ABSTRACT

We consider a model for the propagation of electrical impulses or activity in a neuronal network. The vertices of a square lattice represent neurons, and the edges of the lattice represent the synaptic connections. Each vertex v is assigned a type: inhibitory or excitatory. The dynamics of propagation of the initial activity captures features of the "integrate-and-fire" model. We study the spread of activation in a large network and describe possible spatio-temporal limiting patterns depending on the initial activation. The rich palette of the limits with qualitatively different properties, including expanding patterns, fixed patterns, and patterns moving across the network, allows us to argue that this is a versatile model for the study of associative memory.
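A minimal sketch of this kind of dynamics (not the authors' exact model): excitatory and inhibitory types on a square lattice, with synchronous threshold updates driven by the active neighbours. The torus boundary, the 80% excitatory fraction, and the firing threshold of 1 are illustrative assumptions.

```python
import random

def step(active, types, size, threshold=1):
    """One synchronous update: a vertex fires if its net input
    (excitatory minus inhibitory active neighbours) meets the threshold."""
    new_active = set()
    for x in range(size):
        for y in range(size):
            drive = 0
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nb = ((x + dx) % size, (y + dy) % size)  # torus boundary
                if nb in active:
                    drive += 1 if types[nb] == "E" else -1
            if drive >= threshold:
                new_active.add((x, y))
    return new_active

random.seed(0)
size = 20
# assumed 80% excitatory / 20% inhibitory assignment
types = {(x, y): "E" if random.random() < 0.8 else "I"
         for x in range(size) for y in range(size)}
active = {(size // 2, size // 2)}  # a single initially activated neuron
for _ in range(5):
    active = step(active, types, size)
```

Depending on the type map, the initial activation either dies out or spreads as an expanding front, which is the kind of limiting-pattern distinction the abstract describes.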


Subject(s)
Cellular Automata , Models, Neurological , Neurons/physiology
2.
Front Comput Neurosci; 13: 51, 2019.
Article in English | MEDLINE | ID: mdl-31417386

ABSTRACT

A new approach to understanding the interaction between cortical areas is provided by a mathematical analysis of biased competition, which describes many interactions between cortical areas, including those involved in top-down attention. The analysis helps to elucidate the principles of operation of such cortical systems, and in particular the parameter values within which biased competition operates. The analytic results are supported by simulations that illustrate the operation of the system with parameters selected from the analysis. The findings provide a detailed mathematical analysis of the operation of these neural systems with nodes connected by feedforward (bottom-up) and feedback (top-down) connections. The analysis provides the critical value of the top-down attentional bias that enables biased competition to operate for a range of input values to the network, and derives this as a function of all the parameters in the model. The critical value of the top-down bias depends linearly on the value of the other inputs, but the coefficients in the function reveal non-linear relations between the remaining parameters. The results provide reasons why the backprojections should not be very much weaker than the forward connections between two cortical areas. The major advantage of the analytical approach is that it discloses relations between all the parameters of the model.
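The qualitative effect of a top-down bias can be illustrated with a toy rate model of two mutually inhibiting nodes. This is a drastic simplification of the paper's system; the inhibition weight, Euler step, and input values are illustrative, not the paper's parameters.

```python
def compete(i1, i2, bias, w_inh=1.0, steps=200, dt=0.1):
    """Two rate units with mutual inhibition; unit 1 also receives
    a top-down attentional bias. Euler integration of a relu rate model."""
    r1 = r2 = 0.0
    for _ in range(steps):
        r1 += dt * (-r1 + max(0.0, i1 + bias - w_inh * r2))
        r2 += dt * (-r2 + max(0.0, i2 - w_inh * r1))
    return r1, r2

# With equal bottom-up input, the top-down bias decides the competition.
r1, r2 = compete(1.0, 1.0, bias=0.5)
```

In this caricature the biased unit wins outright; the paper's contribution is deriving, analytically, the critical bias as a function of all model parameters rather than observing it in simulation.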

3.
J Math Biol; 79(5): 1639-1663, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31338567

ABSTRACT

We provide an analysis of a randomly grown 2-d network which models the morphological growth of dendritic and axonal arbors. From the stochastic geometry of this model we derive a dynamic graph of potential synaptic connections. We estimate standard network parameters such as degree distribution, average shortest path length and clustering coefficient, considering all of these parameters as functions of time. Our results show that even a simple model with just a few parameters is capable of representing a wide spectrum of architectures, capturing properties of well-known models, such as random graphs or small-world networks, depending on the stage of the network's development. The introduced model not only allows rather straightforward simulations but is also amenable to rigorous analysis. This provides a basis for further study of the formation of synaptic connections on such networks and of their dynamics due to plasticity.
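As an illustration of this class of model (not the authors' construction), here is a toy grown geometric graph in the unit square: nodes arrive one per time step, and a new node links to every existing node within a fixed radius, a crude stand-in for arbor overlap. The radius and step count are illustrative.

```python
import math
import random

def grow_network(steps, radius, seed=0):
    """Nodes arrive one per step at uniform random positions in the unit
    square; each new node links to every existing node within `radius`."""
    rng = random.Random(seed)
    pos, edges = [], []
    for t in range(steps):
        p = (rng.random(), rng.random())
        for i, q in enumerate(pos):
            if math.dist(p, q) <= radius:
                edges.append((i, t))  # older node i, newcomer t
        pos.append(p)
    return pos, edges

pos, edges = grow_network(200, 0.1)
# degree distribution as a function of arrival time falls out directly
degrees = [0] * len(pos)
for a, b in edges:
    degrees[a] += 1
    degrees[b] += 1
```

Tracking `degrees`, path lengths, and clustering at successive values of `steps` gives exactly the "network parameters as functions of time" viewpoint the abstract describes.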


Subject(s)
Connectome , Models, Neurological , Nerve Net/growth & development , Animals , Computer Simulation , Connectome/statistics & numerical data , Humans , Mathematical Concepts , Nerve Net/physiology , Neuronal Plasticity , Stochastic Processes
4.
Biosystems; 184: 103991, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31351994

ABSTRACT

We study connectivity in a model of a growing neuronal network in dimensions 2 and 3. Although axon-to-dendrite proximity is an insufficient condition for establishing a functional synapse, it is still a necessary one. Therefore we study connection probabilities at short distances between the randomly grown axon trees and somas as probabilities of potential connections between the corresponding neurons. Our results show that, contrary to a common belief, these probabilities do not necessarily decay polynomially or exponentially in distance: there are regimes of parameter values in which the probability of proximity is not sensitive to the distance. In particular, in 3 dimensions the Euclidean distance between neuronal cell body centers seems to play a very subtle role, as the probabilities of connection are practically constant within a certain finite range of distances. The model has a sufficient number of parameters to assess networks of neurons of different types. Our results give a firm basis for further modelling of neuronal connectivity that takes into account realistic bouton distributions for establishing synaptic connections.
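A Monte Carlo caricature of such a proximity probability, with the axon modelled as a 2-d persistent random walk from the soma and "potential connection" meaning the walk ever comes within reach of a target soma. The turning noise, step length, and reach are illustrative assumptions, not the paper's model.

```python
import math
import random

def connect_prob(dist, n_steps=100, step_len=0.05, reach=0.1,
                 trials=2000, seed=1):
    """Estimate the probability that a persistent random walk (a crude
    axon) starting at the origin ever passes within `reach` of a target
    soma placed at distance `dist` along the x-axis."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = y = 0.0
        ang = rng.uniform(0, 2 * math.pi)  # random initial outgrowth
        for _ in range(n_steps):
            ang += rng.gauss(0, 0.3)       # small random turning
            x += step_len * math.cos(ang)
            y += step_len * math.sin(ang)
            if math.hypot(x - dist, y) <= reach:
                hits += 1
                break
    return hits / trials

probs = [connect_prob(d) for d in (0.5, 1.0, 2.0)]
```

Sweeping `dist` against the estimated probability is the simplest way to see whether decay is polynomial, exponential, or flat over some range, which is the distinction at issue in the abstract.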


Subject(s)
Action Potentials/physiology , Algorithms , Models, Neurological , Nerve Net/physiology , Neurons/physiology , Synapses/physiology , Animals , Axons/physiology , Dendrites/physiology , Probability
5.
Biosystems; 136: 105-12, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26375356

ABSTRACT

We introduce a growing random network on a plane as a model of a growing neuronal network. The properties of the structure of the induced graph are derived, and we compare our results with available data. In particular, it is shown that, depending on the parameters of the model, the structure of the system passes through different phases over time. We conclude with a possible explanation of some empirical data on the connections between neurons.


Subject(s)
Models, Neurological , Models, Statistical , Nerve Net/physiology , Neuronal Plasticity/physiology , Neurons/physiology , Synaptic Transmission/physiology , Animals , Computer Simulation , Feedback, Physiological/physiology , Humans
6.
Math Biosci Eng; 11(1): 139-48, 2014 Feb.
Article in English | MEDLINE | ID: mdl-24245677

ABSTRACT

A model is considered for a neural network that is a stochastic process on a random graph. The neurons are represented by "integrate-and-fire" processes. The structure of the graph is determined by the probabilities of the connections, and it depends on the activity in the network. We investigate the dependence between the initial level of sparseness of the connections and the dynamics of activation in the network. A balanced regime between activity, i.e., the level of excitation in the network, and inhibition was found that allows the formation of synfire chains.
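A rough sketch of such a network, with integrate-and-fire units on a directed G(n, p) graph and a fixed inhibitory fraction; all parameter values here are illustrative, not the paper's, and the paper's graph additionally evolves with activity.

```python
import random

def simulate(n, p, frac_inh, threshold, steps, seed=0):
    """Threshold units on a directed Erdos-Renyi graph G(n, p). Each unit
    sums +1/-1 input from firing excitatory/inhibitory presynaptic
    neighbours and fires when the sum reaches `threshold`."""
    rng = random.Random(seed)
    sign = [1 if rng.random() > frac_inh else -1 for _ in range(n)]
    out = [[j for j in range(n) if j != i and rng.random() < p]
           for i in range(n)]
    firing = set(rng.sample(range(n), n // 10))  # initial activation
    history = [len(firing)]
    for _ in range(steps):
        drive = [0] * n
        for i in firing:
            for j in out[i]:
                drive[j] += sign[i]
        firing = {j for j in range(n) if drive[j] >= threshold}
        history.append(len(firing))
    return history

history = simulate(n=200, p=0.05, frac_inh=0.2, threshold=2, steps=20)
```

Plotting `history` for different values of `p` is the simplest way to probe the sparseness/activity dependence: too sparse and activity dies, too dense and it saturates, with a balanced regime in between.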


Subject(s)
Action Potentials/physiology , Brain/physiology , Neural Networks, Computer , Algorithms , Animals , Birds , Computer Simulation , Electrophysiology/methods , Humans , Models, Neurological , Neurons/physiology , Stochastic Processes , Synapses/physiology , Time Factors , Vocalization, Animal
7.
Brain Res; 1434: 277-84, 2012 Jan 24.
Article in English | MEDLINE | ID: mdl-21875700

ABSTRACT

We consider random synaptic pruning in an initially highly interconnected network. It is proved that a random network can maintain a self-sustained activity level for some parameter values. For such a set of parameters, a pruning is constructed so that in the resulting network each neuron/node has almost equal numbers of in- and out-connections. It is also shown that the set of parameters which admits a self-sustained activity level is rather small within the whole space of possible parameters. We point out that the threshold of connectivity for an auto-associative memory in a Hopfield model on a random graph coincides with the threshold for bootstrap percolation on the same random graph, and argue that this coincidence reflects the relations between the auto-associative memory mechanism and the properties of the underlying random network structure. This article is part of a Special Issue entitled "Neural Coding".
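Bootstrap percolation, invoked here for the connectivity threshold, can be sketched directly: a vertex activates once at least k of its neighbours are active, and activation never reverts. The G(n, p) construction and the parameter values below are illustrative.

```python
import random

def bootstrap_percolation(n, p, k, initial, seed=0):
    """Bootstrap percolation on an undirected G(n, p): iterate the rule
    'activate any vertex with >= k active neighbours' to a fixed point
    and return the final number of active vertices."""
    rng = random.Random(seed)
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    active = set(rng.sample(range(n), initial))
    changed = True
    while changed:
        changed = False
        for v in range(n):
            if v not in active and sum(u in active for u in adj[v]) >= k:
                active.add(v)
                changed = True
    return len(active)

final = bootstrap_percolation(n=300, p=0.05, k=2, initial=30)
```

Sweeping `p` (or `initial`) and watching `final` jump from near `initial` to near `n` locates the percolation threshold that the abstract identifies with the memory-retrieval threshold.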


Subject(s)
Association Learning , Neural Networks, Computer , Neurons , Association Learning/physiology , Humans , Neural Pathways/physiology , Neurons/physiology , Random Allocation , Synapses/physiology
8.
Biosystems; 89(1-3): 280-6, 2007.
Article in English | MEDLINE | ID: mdl-17292539

ABSTRACT

This paper presents an original mathematical framework based on graph theory which is a first attempt to investigate the dynamics of a model of neural networks with embedded spike-timing-dependent plasticity. The neurons correspond to integrate-and-fire units located at the vertices of a finite subset of a 2D lattice. There are two types of vertices, corresponding to the inhibitory and the excitatory neurons. The edges are directed and labelled by the discrete values of the synaptic strength. We assume that there is an initial firing pattern corresponding to a subset of units that generate a spike; the number of externally activated vertices is a small fraction of the entire network. The model presented here describes how such a pattern propagates throughout the network as a random walk on a graph. Several results are compared with computational simulations, and new data are presented for identifying critical parameters of the model.
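The pair-based STDP rule such models typically embed can be sketched as a weight-change curve: pre-before-post spiking potentiates, post-before-pre depresses, decaying with the spike-time difference. The amplitudes and time constant below are generic textbook values, not those of this paper, which uses discrete synaptic strengths.

```python
import math

def stdp(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Pair-based STDP weight change for spike-time difference
    dt = t_post - t_pre (ms): dt > 0 potentiates, dt < 0 depresses,
    both decaying exponentially with |dt| on timescale tau."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)
```

Applying a (discretized) version of this update to the edge labels after each propagation step is how plasticity couples to the random-walk dynamics of activity.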


Subject(s)
Action Potentials , Neuronal Plasticity , Models, Neurological
9.
Int J Syst Evol Microbiol; 53(Pt 1): 113-119, 2003 Jan.
Article in English | MEDLINE | ID: mdl-12656161

ABSTRACT

Polyphasic genotypic analysis of 25 Acidithiobacillus ferrooxidans strains isolated from ores and ore concentrates collected in different regions of the world showed considerable strain heterogeneity. Restriction patterns of the chromosomal DNA of these strains obtained by PFGE were specific for each strain. According to the degree of DNA relatedness, 17 of the 23 strains studied were divided into four genomovars. Six independent, considerably divergent strains could not be assigned to any of the genomovars. A comparison of nearly complete nucleotide sequences of the 16S rDNA of five representatives of the genomovars (including the type strain of A. ferrooxidans, ATCC 23270T) with those of species of the genus Acidithiobacillus available from GenBank showed that most of the A. ferrooxidans strains, together with the type strain and some other strains of the species Acidithiobacillus thiooxidans, comprised a monophyletic cluster. Within this major cluster, A. ferrooxidans strains fell into four phylogenetic groups that were equidistant from the phylogenetic group of A. thiooxidans strains. In general, the distribution of strains among the phylogenetic groups correlated with their distribution among the genomovars, except that the representatives of two different genomovars fell into one phylogenetic group. Thus, at least two levels of phylogenetic heterogeneity for A. ferrooxidans have been found. The phylogenetic heterogeneity of A. ferrooxidans strains, which are phenotypically indistinguishable, suggests the occurrence of microevolutionary processes in different econiches. This should be taken into account in the biohydrometallurgical applications of A. ferrooxidans strains.


Subject(s)
Thiobacillus/classification , Thiobacillus/genetics , Base Composition , DNA, Bacterial/chemistry , DNA, Bacterial/genetics , DNA, Ribosomal/genetics , Ecosystem , Electrophoresis, Gel, Pulsed-Field , Genome, Bacterial , Minerals , Molecular Sequence Data , Phylogeny , RNA, Bacterial/genetics , RNA, Ribosomal, 16S/genetics , Thiobacillus/isolation & purification
10.
Biosystems; 67(1-3): 281-6, 2002.
Article in English | MEDLINE | ID: mdl-12459308

ABSTRACT

The dynamical random graphs associated with a certain class of biological neural networks are introduced and studied. We describe the phase diagram revealing the parameters of a single neuron and of the synaptic strengths which allow the formation of large, stable, strongly connected groups of neurons. It is shown that cycles are the most stable structures when the Hebb rule is implemented in the dynamics of a network of excitatory neurons. We discuss the role of cycles in the synchronization of neuronal activity.
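Checking whether a network of strong excitatory connections contains such a cycle is a standard depth-first-search exercise; a minimal sketch (not the paper's analysis, which concerns the stability of cycles under the dynamics):

```python
def has_directed_cycle(n, edges):
    """DFS with three-colour marking: returns True iff the directed
    graph on vertices 0..n-1 with the given edge list contains a cycle."""
    out = [[] for _ in range(n)]
    for a, b in edges:
        out[a].append(b)
    state = [0] * n  # 0 = unseen, 1 = on current DFS path, 2 = done

    def dfs(v):
        state[v] = 1
        for w in out[v]:
            if state[w] == 1 or (state[w] == 0 and dfs(w)):
                return True  # back edge to the current path: a cycle
        state[v] = 2
        return False

    return any(state[v] == 0 and dfs(v) for v in range(n))

cyclic = has_directed_cycle(3, [(0, 1), (1, 2), (2, 0)])
acyclic = has_directed_cycle(3, [(0, 1), (1, 2)])
```

In a Hebbian excitatory network a directed cycle can keep re-exciting itself, which is the intuition behind cycles being the most stable structures and a natural substrate for synchronization.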


Subject(s)
Neural Networks, Computer , Neuronal Plasticity/physiology , Random Allocation , Stochastic Processes
11.
Phys Rev E Stat Nonlin Soft Matter Phys; 65(6 Pt 2): 066102, 2002 Jun.
Article in English | MEDLINE | ID: mdl-12188778

ABSTRACT

We study the large-time dynamics of a Markov process whose states are finite but unbounded graphs. The number of vertices is described by a supercritical branching process, and the edges follow a certain mean-field dynamics determined by the rates of appending and deleting: the older an edge is, the smaller the probability that it is still in the graph. The lifetime of any edge is distributed exponentially. We call its mean value (common for all edges) the parameter of memory, since it shows for how long the system keeps a particular connection between the vertices of the graph. We show that our model provides a bridge between two well-known models: when the parameter of memory goes to infinity, this is a generalized model of random growth, and when this parameter is zero, i.e., no memory, our model behaves as a random graph. Thus, by introducing a general class of dynamical graphs, we obtain a unified overview of rather different models and the relations between them. We find all the critical values of the parameters at which our model exhibits phase transitions and describe the properties of the phase diagram. Finally, we compare and discuss the efficiency of the corresponding networks.
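A discrete-time caricature of these edge dynamics, with deterministic vertex arrivals standing in for the branching process and a geometric survival probability standing in for the exponential lifetime; the attachment rate and lifetime below are illustrative.

```python
import random

def evolve(steps, attach_rate, mean_lifetime, seed=0):
    """Each step: every existing edge dies with probability
    1/mean_lifetime (geometric stand-in for the exponential lifetime),
    then one new vertex arrives and attaches to random older vertices."""
    rng = random.Random(seed)
    n = 1
    edges = set()
    for _ in range(steps):
        edges = {e for e in edges
                 if rng.random() > 1.0 / mean_lifetime}  # edge ageing
        for _ in range(attach_rate):
            edges.add((rng.randrange(n), n))            # (older, newcomer)
        n += 1
    return n, edges

n, edges = evolve(steps=100, attach_rate=2, mean_lifetime=10)
```

Taking `mean_lifetime` large makes early edges effectively permanent (the random-growth regime), while `mean_lifetime` near 1 forgets history almost immediately (the memoryless, random-graph-like regime), mirroring the two limits described in the abstract.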
