1.
Phys Rev E; 104(2-1): 024204, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34525513

ABSTRACT

We investigate the role of bistability in the synchronization of a network of identical bursting neurons coupled through a generic electrical mean-field scheme. These neurons can exhibit distinct multistable states and, in particular, bistable behavior is observed when their sodium conductance is varied. Accordingly, we consider three different initialization compositions: (i) the whole network is in the same periodic state; (ii) half of the network is periodic and half is chaotic; (iii) half is periodic and half is in a different periodic state. We show that (i) and (ii) reach phase synchronization (PS) for all coupling strengths, whereas for (iii) weak coupling does not induce PS and different frequencies coexist instead. For stronger coupling, case (iii) also synchronizes, but only after (i) and (ii). Since PS requires all neurons to be in the same state (same frequencies), these different behaviors are governed by transitions between the states. We find that, during these transitions, (ii) and (iii) exhibit transient chimera states and that (iii) additionally exhibits breathing chimeras. By studying the stability of each state, we explain the observed transitions. Therefore, the bistability of neurons can play a major role in the synchronization of generic networks, with the simple initialization of the system being capable of drastically changing its asymptotic behavior.
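
As a rough illustration of the coupling and the synchronization measure described above, the sketch below uses identical Hindmarsh-Rose bursters as a stand-in for the paper's sodium-conductance-based model; the mean-field term eps*(<x> - x_i), all parameter values, and the Hilbert-transform phase definition are illustrative assumptions, not the authors' implementation.

# Minimal sketch (assumed, not the paper's model): identical Hindmarsh-Rose
# bursters coupled through an electrical mean-field term, with phase
# synchronization quantified by a time-averaged Kuramoto order parameter.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.signal import hilbert

def hr_meanfield(t, y, N, eps, I=3.25):
    """Hindmarsh-Rose network; every neuron feels eps*(<x> - x_i)."""
    x, w, z = y[:N], y[N:2*N], y[2*N:]
    dx = w - x**3 + 3*x**2 - z + I + eps*(x.mean() - x)
    dw = 1 - 5*x**2 - w
    dz = 0.006*(4*(x + 1.6) - z)
    return np.concatenate([dx, dw, dz])

def kuramoto_R(x_traj):
    """Time-averaged |R| from Hilbert-transform phases (rows = neurons)."""
    centered = x_traj - x_traj.mean(axis=1, keepdims=True)
    phases = np.angle(hilbert(centered, axis=1))
    return np.abs(np.exp(1j*phases).mean(axis=0)).mean()

N, eps = 20, 0.1
rng = np.random.default_rng(0)
y0 = np.concatenate([rng.uniform(-1, 1, N), rng.uniform(-1, 1, N),
                     3.0 + rng.uniform(0, 0.5, N)])
sol = solve_ivp(hr_meanfield, (0, 2000), y0, args=(N, eps), max_step=0.05)
x = sol.y[:N, sol.t > 500]                    # discard the transient
print("time-averaged order parameter R =", round(kuramoto_R(x), 3))

Sweeping eps and re-running with the different initial compositions listed in the abstract is the natural way to reproduce the qualitative comparison between cases (i)-(iii).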

2.
Chaos; 31(8): 083121, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34470242

ABSTRACT

In this work, we study the phase synchronization of a neural network and explore how heterogeneity in the neurons' dynamics can lead their phases to intermittently phase-lock and unlock. The neurons are connected through chemical excitatory synapses in a sparse random topology, receive no noise or external inputs, and have identical parameters except for different in-degrees. They follow a modified Hodgkin-Huxley model that adds details such as temperature dependence, and can burst either periodically or chaotically when uncoupled. Coupling makes them chaotic in all cases, but each uncoupled mode leads to a different transition to phase synchronization in the network as the synaptic strength increases. In almost all cases, the neurons' inter-burst intervals differ from one another, which indicates their dynamical heterogeneity and leads to their intermittent phase-locking. We then argue that this behavior arises from their chaotic dynamics and differing initial conditions. We also investigate how this intermittency affects the formation of clusters of neurons in the network and show that the clusters' compositions change at a rate that follows the degree of intermittency. Finally, we discuss how these results relate to studies in the neuroscience literature, especially regarding metastability.
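
A minimal sketch of the kind of burst-phase bookkeeping implied here (an assumed reconstruction, not the authors' analysis code): burst-onset times are mapped to phases that advance by 2*pi per inter-burst interval, and a time-resolved Kuramoto order parameter R(t) then exposes intermittent locking and unlocking. The toy burst trains and the 0.8 threshold below are arbitrary choices.

# Sketch (assumed): burst-onset times -> linearly interpolated burst phases
# -> time-resolved Kuramoto order parameter R(t); dips in R(t) flag epochs
# in which the burst phases unlock.
import numpy as np

def burst_phase(burst_times, t_grid):
    """Phase grows by 2*pi between consecutive burst onsets."""
    k = np.searchsorted(burst_times, t_grid, side='right') - 1
    k = np.clip(k, 0, len(burst_times) - 2)
    t0, t1 = burst_times[k], burst_times[k + 1]
    return 2*np.pi*(k + (t_grid - t0) / (t1 - t0))

def order_parameter(phases):
    """Time-resolved |R(t)| over the network (rows = neurons)."""
    return np.abs(np.exp(1j*phases).mean(axis=0))

# Toy data: 10 'neurons' whose mean inter-burst intervals differ slightly.
rng = np.random.default_rng(1)
t_grid = np.linspace(50, 950, 2000)
bursts = [np.cumsum(rng.normal(10 + 0.1*i, 0.5, 120)) for i in range(10)]
phases = np.array([burst_phase(b, t_grid) for b in bursts])
R_t = order_parameter(phases)
print("mean R =", R_t.mean().round(3),
      "| fraction of time with R < 0.8:", np.mean(R_t < 0.8).round(3))

Applying the same order parameter to subgroups of neurons, rather than the whole network, is one simple way to track how cluster compositions change over time.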


Subject(s)
Neural Networks, Computer , Neurons , Models, Neurological
3.
Sci Rep; 11(1): 15789, 2021 08 04.
Article in English | MEDLINE | ID: mdl-34349134

ABSTRACT

Extracting relevant properties of empirical signals generated by nonlinear, stochastic, and high-dimensional systems is a challenge of complex systems research. Open questions are how to differentiate chaotic signals from stochastic ones, and how to quantify nonlinear and/or high-order temporal correlations. Here we propose a new technique to reliably address both problems. Our approach follows two steps: first, we train an artificial neural network (ANN) with flicker (colored) noise to predict the value of the parameter α that determines the strength of the correlation of the noise. To predict α, the ANN input features are a set of probabilities extracted from the time series by symbolic ordinal analysis. Then, we feed the trained ANN the probabilities extracted from the time series of interest and analyze its output. We find that the α value returned by the ANN is informative of the temporal correlations present in the time series. To distinguish between stochastic and chaotic signals, we exploit the fact that the difference between the permutation entropy (PE) of a given time series and the PE of flicker noise with the same α is small when the time series is stochastic, but large when it is chaotic. We validate our technique by analyzing synthetic and empirical time series whose nature is well established. We also demonstrate the robustness of our approach with respect to the length of the time series and to the level of noise. We expect that our algorithm, which is freely available, will be very useful to the community.
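
The two-step pipeline above maps naturally onto a short script. The sketch below is an assumed reconstruction rather than the authors' released code: ordinal-pattern probabilities (embedding dimension D = 4, chosen here for brevity) are used to train a small MLP regressor on synthetic 1/f^alpha flicker noise, and the permutation entropy of a test series is then compared with that of flicker noise at the predicted alpha.

# Hedged sketch of the pipeline described in the abstract; the embedding
# dimension, noise generator, and MLP settings are illustrative assumptions.
import numpy as np
from itertools import permutations
from sklearn.neural_network import MLPRegressor

def ordinal_probabilities(x, D=4):
    """Relative frequencies of the D! ordinal patterns of length D."""
    patterns = {p: i for i, p in enumerate(permutations(range(D)))}
    counts = np.zeros(len(patterns))
    for i in range(len(x) - D + 1):
        counts[patterns[tuple(np.argsort(x[i:i + D]))]] += 1
    return counts / counts.sum()

def permutation_entropy(probs):
    """Shannon entropy of the ordinal distribution, normalized to [0, 1]."""
    p = probs[probs > 0]
    return -(p * np.log(p)).sum() / np.log(len(probs))

def flicker_noise(n, alpha, rng):
    """Gaussian noise with power spectrum ~ 1/f**alpha (spectral synthesis)."""
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                      # avoid division by zero at f = 0
    amp = rng.standard_normal(len(freqs)) + 1j*rng.standard_normal(len(freqs))
    return np.fft.irfft(amp / freqs**(alpha / 2), n)

rng = np.random.default_rng(0)

# Step 1: train the ANN on flicker noise labelled with its alpha exponent.
alphas = rng.uniform(0.0, 3.0, 200)
X = np.array([ordinal_probabilities(flicker_noise(3000, a, rng)) for a in alphas])
model = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000,
                     random_state=0).fit(X, alphas)

# Step 2: apply it to a series of interest (here a chaotic logistic map).
x = np.empty(3000); x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
p_sig = ordinal_probabilities(x)
alpha_hat = model.predict([p_sig])[0]

# Step 3: compare the PE of the series with the PE of flicker noise at the
# predicted alpha; a large gap points to a chaotic, not stochastic, signal.
p_ref = ordinal_probabilities(flicker_noise(3000, alpha_hat, rng))
gap = abs(permutation_entropy(p_sig) - permutation_entropy(p_ref))
print(f"predicted alpha ~ {alpha_hat:.2f}, |PE gap| = {gap:.3f}")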
