Results 1 - 5 of 5
1.
Neural Netw; 109: 90-102, 2019 Jan.
Article in English | MEDLINE | ID: mdl-30408697

ABSTRACT

Information needs to be appropriately encoded to be reliably transmitted over physical media. Similarly, neurons have their own codes to convey information in the brain. Even though it is well known that neurons exchange information using a pool of several spatio-temporal encoding protocols, the suitability of each code and its performance as a function of network parameters and external stimuli remain among the great mysteries in neuroscience. This paper sheds light on this question by modeling small networks of chemically and electrically coupled Hindmarsh-Rose spiking neurons. We focus on a class of temporal and firing-rate codes derived from the neurons' membrane potentials and phases, and quantify their performance numerically by estimating the Mutual Information Rate (MIR), the rate of information exchange. Our results suggest that the firing-rate and interspike-interval codes are more robust to additive Gaussian white noise. In a network of four interconnected neurons and in the absence of such noise, the pairs of neurons with the largest rate of information exchange under the interspike-interval and firing-rate codes are not adjacent in the network, whereas the spike-timing and phase (temporal) codes promote a large rate of information exchange for adjacent neurons. If this result could be extended to larger neural networks, it would suggest that small microcircuits preferably exchange information using temporal codes (spike timings and phases), whereas on the macroscopic scale, where pairs of neurons are typically not directly connected because of the brain's sparsity, the firing-rate and interspike-interval codes would be the most efficient.
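The central quantity above is the Mutual Information Rate (MIR). As a rough, hedged illustration of the simpler quantity underneath it, the sketch below gives a plug-in (histogram) estimate of the mutual information between two firing-rate series; the bin count, the toy Poisson series, and the function name are illustrative assumptions, not the authors' estimator, which evaluates the MIR (mutual information per unit time) on encodings of the Hindmarsh-Rose dynamics.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in estimate of the mutual information I(X;Y) in bits,
    from two scalar series, via a normalized joint histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1)   # marginal of x
    py = pxy.sum(axis=0)   # marginal of y
    nz = pxy > 0           # sum only over non-zero joint cells
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px[:, None] * py[None, :])[nz]))

# Toy example: two correlated "firing-rate" series.
rng = np.random.default_rng(0)
rate_a = rng.poisson(5.0, 10_000).astype(float)
rate_b = rate_a + rng.poisson(2.0, 10_000)
print(mutual_information(rate_a, rate_b))  # clearly above zero
```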


Subject(s)
Action Potentials , Neural Networks, Computer , Neurons , Action Potentials/physiology , Animals , Brain/physiology , Membrane Potentials , Models, Neurological , Neurons/physiology
2.
Chaos; 28(7): 075509, 2018 Jul.
Article in English | MEDLINE | ID: mdl-30070522

ABSTRACT

In a causal world, the direction of the time arrow dictates how past causal events in a variable X produce future effects in Y. X is said to cause an effect in Y if the predictability (uncertainty) about the future states of Y increases (decreases) once its own past and the past of X are taken into consideration. Causality is thus intrinsically dependent on observing the past events of both variables involved in order to predict (or reduce the uncertainty about) future events of the other variable. We show that this temporal notion of causality leads to another natural, spatiotemporal definition, which can be exploited to detect the arrow of influence from X to Y either by considering shorter time series of X and longer time series of Y (an approach that explores the temporal nature of causality) or by considering lower-precision measurements of X and higher-precision measurements of Y (an approach that explores the spatial nature of causality). Causality thus has space and time signatures, breaking the symmetry either in the topology of the probabilistic space or in the lengths of the measured time series, a consequence of the fact that information flows from X to Y.
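The operational definition above, that the past of X improves the prediction of Y, is in the same spirit as Granger-style causality. Below is a minimal sketch under that assumption, with least-squares autoregressive predictors and a hypothetical unidirectionally coupled pair; it is a generic illustration, not the authors' spatiotemporal method with its time-series-length or measurement-precision asymmetries.

```python
import numpy as np

def prediction_error(target, driver=None, lag=2):
    """Mean squared error of a least-squares predictor of target[t]
    from its own past (and, optionally, the past of driver)."""
    rows = []
    for t in range(lag, len(target)):
        past = list(target[t - lag:t])
        if driver is not None:
            past += list(driver[t - lag:t])
        rows.append(past)
    A = np.asarray(rows)
    b = target[lag:]
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.mean((A @ coef - b) ** 2)

rng = np.random.default_rng(1)
x = rng.normal(size=5_000)
y = np.zeros_like(x)
for t in range(1, len(x)):            # X drives Y, not the reverse
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

# Adding the past of X sharply reduces the error of predicting Y,
# while adding the past of Y barely helps in predicting X: hence X -> Y.
print(prediction_error(y), prediction_error(y, driver=x))
print(prediction_error(x), prediction_error(x, driver=y))
```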

3.
Sci Rep; 8(1): 3785, 2018 Feb 28.
Article in English | MEDLINE | ID: mdl-29491432

ABSTRACT

When the state of a whole reaction network can be inferred by measuring the dynamics of only a limited set of nodes, the system is said to be fully observable. However, because the number of all possible combinations of measured variables and time derivatives spanning the reconstructed state of the system increases exponentially with its dimension, assessing observability becomes a computationally prohibitive task. Our approach computes the observability coefficients from a symbolic Jacobian matrix whose elements encode the linear, nonlinear polynomial, or rational nature of the interactions among the variables. The novelty introduced in this paper, required for treating large-dimensional systems, is to identify from the symbolic Jacobian matrix the minimal set of variables (together with their time derivatives) that are candidates to be measured for completing the state-space reconstruction. Symbolic observability coefficients are then computed from the symbolic observability matrix. Our results agree with the analytical computations, evidencing the correctness of the approach. Its application to efficiently exploring the dynamics of real-world complex systems such as power grids, socioeconomic networks, or biological networks is quite promising.
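As a hedged sketch of the symbolic Jacobian idea, using the three-dimensional Rossler system as a stand-in for the large networks treated in the paper and a home-grown labelling rule rather than the authors' exact convention, each Jacobian entry can be classified as zero, constant (from a linear term), polynomial, or rational in the state variables:

```python
import sympy as sp

x, y, z, a, b, c = sp.symbols('x y z a b c')
state = sp.Matrix([x, y, z])
# Rossler system as a toy vector field standing in for a reaction network.
f = sp.Matrix([-y - z, x + a * y, b + z * (x - c)])
J = f.jacobian(state)

def symbol_type(expr):
    """Label a Jacobian entry: '0' zero, '1' constant (linear term),
    'p' polynomial, 'r' rational in the state variables."""
    if expr == 0:
        return '0'
    if any(sp.denom(sp.together(expr)).has(v) for v in state):
        return 'r'
    if any(expr.has(v) for v in state):
        return 'p'
    return '1'

for i in range(3):
    print([symbol_type(J[i, j]) for j in range(3)])
```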


Subject(s)
DNA Replication , Models, Theoretical , Neural Networks, Computer , Nonlinear Dynamics , Systems Theory , Algorithms , Humans , Observation
4.
Article in English | MEDLINE | ID: mdl-26172777

ABSTRACT

Observability is a very useful concept for determining whether the dynamics of a complicated system can be correctly reconstructed from a single (univariate or multivariate) time series. When the governing equations of a dynamical system are high-dimensional and/or rational, analytical computation of the observability coefficients produces large polynomial functions whose number of terms grows exponentially with the dimension and the nature of the system. To overcome this difficulty, we introduce here a symbolic observability coefficient based on a symbolic computation of the determinant of the observability matrix. Computing such coefficients is straightforward and can easily be carried out analytically, as demonstrated in this paper for a five-dimensional rational system.
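As a worked miniature of the object under discussion, not the five-dimensional rational system analyzed in the paper, the observability matrix is the Jacobian of the successive Lie derivatives of the measured variable, and a generically nonzero determinant indicates observability. sympy carries the computation symbolically for the three-dimensional Rossler system measured through x:

```python
import sympy as sp

x, y, z, a, b, c = sp.symbols('x y z a b c')
state = sp.Matrix([x, y, z])
f = sp.Matrix([-y - z, x + a * y, b + z * (x - c)])  # Rossler, as a small toy
h = x  # the measured variable

# Successive Lie derivatives of the measurement along the flow.
lie = [h]
for _ in range(2):
    lie.append(sp.Matrix([lie[-1]]).jacobian(state).dot(f))

O = sp.Matrix(lie).jacobian(state)  # observability matrix
print(sp.simplify(O.det()))         # nonzero determinant => observable from x
```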

5.
PLoS One; 9(2): e89585, 2014.
Article in English | MEDLINE | ID: mdl-24586891

ABSTRACT

We present novel results that relate energy and information transfer to sensitivity to initial conditions in chaotic multi-dimensional Hamiltonian systems. We show the relation among the Kolmogorov-Sinai entropy, the Lyapunov exponents, and upper bounds for the Mutual Information Rate calculated in the Hamiltonian phase space and on bi-dimensional subspaces. Our main result is that the net amount of energy transferred from kinetic to potential energy per unit of time is a power law of the upper bound for the Mutual Information Rate between kinetic and potential energies, and also a power law of the Kolmogorov-Sinai entropy. Transfer of energy is therefore related to both the transfer and the production of information. However, the power-law nature of this relation means that a small increment in the energy transferred leads to a relatively much larger increase in the information exchanged. We then propose an "experimental" implementation of a one-dimensional communication channel based on a Hamiltonian system, and calculate the actual rate at which information is exchanged between the first and last particles of the channel. Finally, a relation between our results and important quantities of thermodynamics is presented.
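As a hedged illustration of one ingredient, rather than the authors' multi-dimensional Hamiltonian setup and communication channel, the largest Lyapunov exponent of the area-preserving Chirikov standard map can be estimated with the tangent-map method; by Pesin's identity, the Kolmogorov-Sinai entropy equals the sum of the positive Lyapunov exponents, the quantity that bounds the Mutual Information Rate from above.

```python
import numpy as np

def lyapunov_standard_map(k, n=100_000):
    """Largest Lyapunov exponent of the Chirikov standard map
    (an area-preserving, Hamiltonian-like system), estimated by
    iterating the tangent map and renormalizing."""
    theta, p = 0.1, 0.2
    v = np.array([1.0, 0.0])  # tangent vector
    total = 0.0
    for _ in range(n):
        # Jacobian of the update, evaluated at the current state.
        J = np.array([[1 + k * np.cos(theta), 1.0],
                      [k * np.cos(theta), 1.0]])
        p = (p + k * np.sin(theta)) % (2 * np.pi)
        theta = (theta + p) % (2 * np.pi)
        v = J @ v
        norm = np.linalg.norm(v)
        total += np.log(norm)
        v /= norm
    return total / n

# In a 2-D symplectic map the exponents come as a (+l, -l) pair, so by
# Pesin's identity the KS entropy here is just max(l, 0).
print(lyapunov_standard_map(6.0))  # roughly log(k/2) for large k
```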


Subject(s)
Energy Transfer , Thermodynamics