1.
bioRxiv ; 2024 Apr 27.
Article in English | MEDLINE | ID: mdl-38712237

ABSTRACT

The principle of efficient coding posits that sensory cortical networks are designed to encode maximal sensory information with minimal metabolic cost. Despite the major influence of efficient coding in neuroscience, it has remained unclear whether fundamental empirical properties of neural network activity can be explained solely based on this normative principle. Here, we rigorously derive the structural, coding, biophysical and dynamical properties of excitatory-inhibitory recurrent networks of spiking neurons that emerge directly from imposing that the network minimizes an instantaneous loss function and a time-averaged performance measure enacting efficient coding. The optimal network has biologically-plausible biophysical features, including realistic integrate-and-fire spiking dynamics, spike-triggered adaptation, and a non-stimulus-specific excitatory external input regulating metabolic cost. The efficient network has excitatory-inhibitory recurrent connectivity between neurons with similar stimulus tuning implementing feature-specific competition, similar to that recently found in visual cortex. Networks with unstructured connectivity cannot reach comparable levels of coding efficiency. The optimal biophysical parameters include a 4:1 ratio of excitatory to inhibitory neurons and a 3:1 ratio of mean inhibitory-to-inhibitory vs. excitatory-to-inhibitory connectivity, closely matching those of cortical sensory networks. The efficient network has biologically-plausible spiking dynamics, with a tight instantaneous E-I balance that enables efficient coding of external stimuli varying over multiple time scales. Together, these results explain how efficient coding may be implemented in cortical networks and suggest that key properties of biological neural networks may be accounted for by efficient coding.
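
As a concrete illustration of the model class, the sketch below simulates a recurrent excitatory-inhibitory network of leaky integrate-and-fire neurons with the 4:1 E:I ratio highlighted in the abstract. All weights, drives, and the random (unstructured) connectivity are illustrative placeholders, not the optimized, feature-tuned network derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes respecting the reported 4:1 E:I ratio; all numbers here
# are placeholders rather than the paper's optimized parameters.
NE, NI = 400, 100
N = NE + NI
dt, steps = 1e-4, 20000              # 2 s at 0.1 ms resolution
tau, V_th, V_r = 20e-3, 1.0, 0.0

# Sparse random weights: E synapses positive, I synapses negative and stronger.
p = 0.2
conn = rng.random((N, N)) < p
J = np.zeros((N, N))
J[:, :NE][conn[:, :NE]] = 0.10       # E -> * voltage jump per spike
J[:, NE:][conn[:, NE:]] = -0.45      # I -> * voltage jump per spike

mu = 1.3                             # non-stimulus-specific excitatory drive
V = rng.uniform(0.0, 1.0, N)
counts = np.zeros(N)
for _ in range(steps):
    spikes = V >= V_th
    counts += spikes
    V[spikes] = V_r
    V += dt * (mu - V) / tau + J @ spikes    # leak toward drive + delta synapses

T = steps * dt
print(f"E rate: {counts[:NE].mean()/T:.1f} Hz, I rate: {counts[NE:].mean()/T:.1f} Hz")
```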

2.
Sci Rep ; 14(1): 5713, 2024 03 08.
Article in English | MEDLINE | ID: mdl-38459077

ABSTRACT

Modeling the functionality of the human brain is a major goal in neuroscience for which many powerful methodologies have been developed over the last decade. The impact of working memory and the associated brain regions on brain dynamics is of particular interest because of their connection to many functions and malfunctions of the brain. In this context, the concept of brain flexibility has been developed for the characterization of brain functionality. We discuss the emergence of brain flexibility, which is commonly measured by identifying changes in the cluster structure of co-active brain regions. We provide evidence that brain flexibility can be modeled by a system of coupled FitzHugh-Nagumo oscillators whose network structure is obtained from human brain Diffusion Tensor Imaging (DTI). Additionally, we propose a straightforward and computationally efficient alternative macroscopic measure, derived from the Pearson distance of functional brain matrices. This metric exhibits patterns similar to the established measures of brain flexibility observed in prior investigations. Furthermore, we explore how the brain's network structure and the strength of connections between network nodes or brain regions associated with working memory shape the observed patterns of network flexibility. This work enriches our understanding of the interplay between the structure and function of dynamic brain networks and proposes a modeling strategy to study brain flexibility.


Subject(s)
Brain Mapping , Diffusion Tensor Imaging , Humans , Brain Mapping/methods , Diffusion Tensor Imaging/methods , Feasibility Studies , Brain/diagnostic imaging , Memory, Short-Term
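
The modeling strategy can be illustrated with a small system of coupled FitzHugh-Nagumo oscillators and the proposed Pearson-distance flexibility measure. A minimal sketch, with a random surrogate matrix standing in for the DTI-derived structural connectivity and all parameters chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Surrogate structural connectivity; the paper uses a DTI-derived human matrix.
n = 20
A = (rng.random((n, n)) < 0.3) * rng.random((n, n))
np.fill_diagonal(A, 0.0)

eps, a, sigma = 0.05, 0.7, 0.05      # FHN time-scale separation, excitability, coupling
dt, steps = 0.05, 40000
u = rng.uniform(-1.0, 1.0, n)
v = rng.uniform(-0.5, 0.5, n)
deg = A.sum(axis=1)
traj = np.empty((steps, n))
for t in range(steps):
    du = u - u**3 / 3 - v + sigma * (A @ u - deg * u)   # diffusive coupling
    dv = eps * (u + a)
    u, v = u + dt * du, v + dt * dv
    traj[t] = u

# Sliding-window functional connectivity and a Pearson-distance flexibility proxy
win = 4000
fcs = [np.corrcoef(traj[i:i + win].T) for i in range(0, steps - win + 1, win)]
iu = np.triu_indices(n, 1)
flex = [1 - np.corrcoef(fcs[k][iu], fcs[k + 1][iu])[0, 1] for k in range(len(fcs) - 1)]
print("window-to-window flexibility:", np.round(flex, 3))
```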
3.
PLoS Comput Biol ; 20(2): e1011886, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38377147

ABSTRACT

Hippocampal ripple oscillations have been implicated in important cognitive functions such as memory consolidation and planning. Multiple computational models have been proposed to explain the emergence of ripple oscillations, relying either on excitation or inhibition as the main pacemaker. Nevertheless, the generating mechanism of ripples remains unclear. An interesting dynamical feature of experimentally measured ripples, which may advance model selection, is intra-ripple frequency accommodation (IFA): a decay of the instantaneous ripple frequency over the course of a ripple event. So far, only a feedback-based inhibition-first model, which relies on delayed inhibitory synaptic coupling, has been shown to reproduce IFA. Here we use an analytical mean-field approach and numerical simulations of a leaky integrate-and-fire spiking network to explain the mechanism of IFA. We develop a drift-based approximation for the oscillation dynamics of the population rate and the mean membrane potential of interneurons under strong excitatory drive and strong inhibitory coupling. For IFA, the speed at which the excitatory drive changes is critical. We demonstrate that IFA arises due to a speed-dependent hysteresis effect in the dynamics of the mean membrane potential, when the interneurons receive transient, sharp wave-associated excitation. We thus predict that the IFA asymmetry vanishes in the limit of slowly changing drive, but is otherwise a robust feature of the feedback-based inhibition-first ripple model.


Subject(s)
Hippocampus , Interneurons , Hippocampus/physiology , Interneurons/physiology , Membrane Potentials
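
A rough sketch of the inhibition-first setup described in the abstract: a population of leaky integrate-and-fire interneurons with delayed inhibitory coupling receives a transient, sharp-wave-like drive, and the instantaneous oscillation frequency is read off from successive population-rate peaks. Parameters are ad hoc, and whether a clear IFA decay appears depends on tuning; the paper's drift-based analysis is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

N, dt = 200, 5e-5                    # interneurons; 0.05 ms time step
tau, V_th, V_r = 10e-3, 1.0, 0.0
D = int(1.2e-3 / dt)                 # ~1.2 ms synaptic delay, in steps
J = -8.0 / N                         # inhibitory voltage kick per delayed spike

T = 0.12
steps = int(T / dt)
t = np.arange(steps) * dt
drive = 3.5 * np.exp(-((t - 0.06) ** 2) / (2 * 0.012 ** 2))  # sharp-wave envelope

V = rng.uniform(0.0, 1.0, N)
buf = np.zeros(D)                    # ring buffer of past spike counts
rate = np.zeros(steps)
for i in range(steps):
    spk = V >= V_th
    V[spk] = V_r
    rate[i] = spk.sum() / (N * dt)
    inh = J * buf[i % D]             # spikes emitted D steps ago arrive now
    buf[i % D] = spk.sum()
    V += dt * (drive[i] - V) / tau + inh + 0.2 * np.sqrt(dt) * rng.standard_normal(N)

# Instantaneous ripple frequency from successive population-rate peaks
r = np.convolve(rate, np.ones(20) / 20, mode="same")
pk = [i for i in range(1, steps - 1)
      if r[i] > r[i - 1] and r[i] >= r[i + 1] and r[i] > 0.2 * r.max()]
print("inst. frequency (Hz):", np.round(1.0 / (np.diff(pk) * dt)))
```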
4.
PLoS Comput Biol ; 18(12): e1010809, 2022 12.
Article in English | MEDLINE | ID: mdl-36548392

ABSTRACT

Bottom-up models of functionally relevant patterns of neural activity provide an explicit link between neuronal dynamics and computation. A prime example of such functional activity patterns is hippocampal replay: propagating bursts of place-cell activity that are critical for memory consolidation. The sudden and repeated occurrences of these burst states during ongoing neural activity suggest metastable neural circuit dynamics. As metastability has been attributed to noise and/or slow fatigue mechanisms, we propose a concise mesoscopic model which accounts for both. Crucially, our model is bottom-up: it is analytically derived from the dynamics of finite-size networks of Linear-Nonlinear Poisson neurons with short-term synaptic depression. As such, noise is explicitly linked to stochastic spiking and network size, and fatigue is explicitly linked to synaptic dynamics. To derive the mesoscopic model, we first consider a homogeneous spiking neural network and follow the temporal coarse-graining approach of Gillespie to obtain a "chemical Langevin equation", which can be naturally interpreted as a stochastic neural mass model. The Langevin equation is computationally inexpensive to simulate and enables a thorough study of metastable dynamics in classical setups (population spikes and Up-Down state dynamics) by means of phase-plane analysis. An extension of the Langevin equation for small network sizes is also presented. The stochastic neural mass model constitutes the basic component of our mesoscopic model for replay. We show that the mesoscopic model faithfully captures the statistical structure of individual replayed trajectories in microscopic simulations and in previously reported experimental data. Moreover, compared to the deterministic Romani-Tsodyks model of place-cell dynamics, it exhibits a higher level of variability in the order, direction and timing of replayed trajectories, which seems biologically more plausible and could be functionally desirable. This variability is the product of a new dynamical regime where metastability emerges from a complex interplay between finite-size fluctuations and local fatigue.


Subject(s)
Models, Neurological , Neural Networks, Computer , Action Potentials/physiology , Neurons/physiology , Hippocampus
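
In the spirit of the stochastic neural mass model described above, here is a minimal Euler integration of a rate variable with synaptic depression and a Gaussian finite-size noise term of chemical-Langevin type. All parameters are invented for illustration; placing the system in the bistable Up-Down regime would require tuning.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stochastic neural mass with depression; the finite-size noise enters as a
# Gaussian (chemical-Langevin-style) term with variance ~ rate / (N * dt).
N, dt = 500, 1e-3
tau_h, tau_D, U = 10e-3, 0.5, 0.2    # input filter, depression recovery, release prob.
J0, mu = 0.1, 0.7                    # recurrent strength and external drive
f = lambda h: 50.0 / (1.0 + np.exp(-4.0 * (h - 1.0)))   # rate nonlinearity (Hz)

h, x = 0.0, 1.0
A = np.empty(20000)
for i in range(20000):
    r = f(h)
    A[i] = max(r + np.sqrt(r / (N * dt)) * rng.standard_normal(), 0.0)
    h += dt * (-h + mu + J0 * U * x * A[i]) / tau_h
    x += dt * (1.0 - x) / tau_D - dt * U * x * A[i]
print(f"mean activity: {A.mean():.1f} Hz, std: {A.std():.1f} Hz")
```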
5.
PLoS Comput Biol ; 17(12): e1009691, 2021 12.
Article in English | MEDLINE | ID: mdl-34968383

ABSTRACT

Assemblies of neurons, called concept cells, encode acquired concepts in the human Medial Temporal Lobe. Those concept cells that are shared between two assemblies have been hypothesized to encode associations between concepts. Here we test this hypothesis in a computational model of attractor neural networks. We find that for concepts encoded in sparse neural assemblies there is a minimal fraction c_min of neurons shared between assemblies below which associations cannot be reliably implemented, and a maximal fraction c_max of shared neurons above which single concepts can no longer be retrieved. In the presence of a periodically modulated background signal, such as hippocampal oscillations, recall takes the form of association chains reminiscent of those postulated by theories of free recall of words. Predictions of an iterative overlap-generating model match experimental data on the number of concepts to which a neuron responds.


Subject(s)
Memory/physiology , Models, Neurological , Neurons/cytology , Computational Biology , Hippocampus/cytology , Hippocampus/physiology , Humans , Nerve Net/cytology , Nerve Net/physiology , Temporal Lobe/cytology , Temporal Lobe/physiology
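
A toy version of the shared-neuron mechanism can be built with a sparse attractor network. In the sketch below, two assemblies share a fraction c of neurons, weights follow a covariance learning rule, and cueing one concept either retrieves it alone (small c) or ignites both assemblies (large c), qualitatively mirroring the c_max effect. Sizes, threshold, and the learning-rule convention are illustrative choices, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(4)

N, a = 1000, 0.05                    # network size and coding sparseness
K = int(a * N)                       # neurons per assembly

def two_assemblies(c):
    """Two sparse binary patterns sharing a fraction c of their K active cells."""
    ns = int(c * K)
    shared = rng.choice(N, ns, replace=False)
    rest = np.setdiff1d(np.arange(N), shared)
    pick = rng.choice(rest, 2 * (K - ns), replace=False)
    p1, p2 = np.zeros(N), np.zeros(N)
    p1[shared] = p2[shared] = 1.0
    p1[pick[:K - ns]] = 1.0
    p2[pick[K - ns:]] = 1.0
    return p1, p2

for c in (0.0, 0.2, 0.6):
    p1, p2 = two_assemblies(c)
    # covariance (Tsodyks-Feigel'man) learning rule for sparse patterns
    W = np.outer(p1 - a, p1 - a) + np.outer(p2 - a, p2 - a)
    np.fill_diagonal(W, 0.0)
    s = p1.copy()                    # cue the first concept
    for _ in range(20):              # synchronous threshold dynamics
        s = (W @ s > 0.35 * K).astype(float)
    print(f"c={c:.1f}: overlap with p1 = {s @ p1 / K:.2f}, with p2 = {s @ p2 / K:.2f}")
```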
6.
Biol Cybern ; 115(5): 539-562, 2021 10.
Article in English | MEDLINE | ID: mdl-34668051

ABSTRACT

Noise in spiking neurons is commonly modeled by a noisy input current or by generating output spikes stochastically with a voltage-dependent hazard rate ("escape noise"). While input noise lends itself to modeling biophysical noise processes, the phenomenological escape noise is mathematically more tractable. Using the level-crossing theory for differentiable Gaussian processes, we derive an approximate mapping between colored input noise and escape noise in leaky integrate-and-fire neurons. This mapping requires the first-passage-time (FPT) density of an overdamped Brownian particle driven by colored noise with respect to an arbitrarily moving boundary. Starting from the Wiener-Rice series for the FPT density, we apply the second-order decoupling approximation of Stratonovich to the case of moving boundaries and derive a simplified hazard-rate representation that is local in time and numerically efficient. This simplification requires the calculation of the non-stationary auto-correlation function of the level-crossing process: for exponentially correlated input noise (Ornstein-Uhlenbeck process), we obtain an exact formula for the zero-lag auto-correlation as a function of noise parameters, mean membrane potential and its speed, as well as an exponential approximation of the full auto-correlation function. The theory accurately predicts the FPT and interspike-interval densities as well as the population activities obtained from simulations with colored input noise and time-dependent stimulus or boundary. The agreement with simulations is strongly enhanced across the sub- and suprathreshold firing regimes compared to a first-order decoupling approximation that neglects correlations between level crossings. The second-order approximation also improves upon a previously proposed theory in the subthreshold regime. Depending on a simplicity-accuracy trade-off, all considered approximations represent useful mappings from colored input noise to escape noise, enabling progress in the theory of neuronal population dynamics.


Subject(s)
Models, Neurological , Neurons , Action Potentials , Normal Distribution , Stochastic Processes
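
For readers unfamiliar with the escape-noise formulation, the sketch below simulates a leaky integrate-and-fire neuron whose spikes are generated stochastically from an exponential hazard, a standard phenomenological choice. It illustrates the target description only; the paper's contribution, the mapping from colored input noise to such a hazard, is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(5)

# Leaky IF neuron with escape noise: spiking is a stochastic point process whose
# hazard grows with the distance of V from threshold.
dt, tau = 1e-4, 20e-3
mu, V_th, V_r = 0.8, 1.0, 0.0        # subthreshold mean drive
rho0, Delta = 100.0, 0.1             # hazard scale (1/s) and threshold softness

def hazard(V):
    """Exponential escape rate, a standard phenomenological choice."""
    return rho0 * np.exp((V - V_th) / Delta)

V, t, t_last, isis = V_r, 0.0, 0.0, []
while len(isis) < 1000:
    V += dt * (mu - V) / tau
    if rng.random() < 1.0 - np.exp(-hazard(V) * dt):   # spike with hazard probability
        isis.append(t - t_last)
        t_last, V = t, V_r
    t += dt

isis = np.array(isis)
print(f"mean ISI: {isis.mean()*1e3:.1f} ms, CV: {isis.std()/isis.mean():.2f}")
```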
7.
Phys Rev E ; 102(2-1): 022407, 2020 Aug.
Article in English | MEDLINE | ID: mdl-32942450

ABSTRACT

The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.


Subject(s)
Models, Neurological , Neurons/cytology , Action Potentials , Nerve Net/cytology , Nerve Net/physiology
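
For the Poisson neuron with absolute refractoriness mentioned in the abstract, the eigenvalue condition can be made fully explicit: the ISI density p(t) = r e^{-r(t-Delta)} for t >= Delta has Laplace transform r e^{-s Delta}/(s + r), and setting it to 1 yields a Lambert W equation. A short sketch (the normalization and branch-indexing conventions here are ours, not necessarily the paper's):

```python
import numpy as np
from scipy.special import lambertw

r, Delta = 100.0, 0.005      # Poisson rate (1/s) and absolute refractory period (s)

def p_hat(s):
    """Laplace transform of the ISI density p(t) = r*exp(-r*(t-Delta)), t >= Delta."""
    return r * np.exp(-s * Delta) / (s + r)

# Eigenvalues solve p_hat(s) = 1, i.e. (s + r) * exp(s * Delta) = r, which
# rearranges to a Lambert W equation with branch index k (k=0 gives s=0).
for k in (0, 1, -1, 2, -2):
    s = lambertw(r * Delta * np.exp(r * Delta), k) / Delta - r
    print(f"branch {k:+d}: s = {s:.1f}  (check p_hat(s) = {p_hat(s):.3f})")
```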
8.
J Math Neurosci ; 10(1): 5, 2020 Apr 06.
Article in English | MEDLINE | ID: mdl-32253526

ABSTRACT

Coarse-graining microscopic models of biological neural networks to obtain mesoscopic models of neural activities is an essential step towards multi-scale models of the brain. Here, we extend a recent theory for mesoscopic population dynamics with static synapses to the case of dynamic synapses exhibiting short-term plasticity (STP). The extended theory offers an approximate mean-field dynamics for the synaptic input currents arising from populations of spiking neurons and synapses undergoing Tsodyks-Markram STP. The approximate mean-field dynamics accounts for both finite number of synapses and correlation between the two synaptic variables of the model (utilization and available resources) and its numerical implementation is simple. Comparisons with Monte Carlo simulations of the microscopic model show that in both feedforward and recurrent networks, the mesoscopic mean-field model accurately reproduces the first- and second-order statistics of the total synaptic input into a postsynaptic neuron and accounts for stochastic switches between Up and Down states and for population spikes. The extended mesoscopic population theory of spiking neural networks with STP may be useful for a systematic reduction of detailed biophysical models of cortical microcircuits to numerically efficient and mathematically tractable mean-field models.
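
The microscopic ingredient of the theory is the Tsodyks-Markram synapse, whose utilization u and available resources x evolve as below. This sketch simulates a single synapse driven by Poisson spikes (one common update convention; the paper's object is the mean field over many such synapses):

```python
import numpy as np

rng = np.random.default_rng(6)

# Microscopic Tsodyks-Markram synapse: u (utilization) and x (available resources)
U, tau_f, tau_d = 0.2, 0.6, 0.3      # baseline release prob., facilitation, recovery
dt, rate = 1e-3, 20.0                # time step (s), presynaptic Poisson rate (Hz)

u, x, amps = U, 1.0, []
for _ in range(50000):
    u += dt * (U - u) / tau_f        # utilization relaxes to baseline
    x += dt * (1.0 - x) / tau_d      # resources recover
    if rng.random() < rate * dt:     # presynaptic spike
        u += U * (1.0 - u)           # facilitation (one common update convention)
        amps.append(u * x)           # transmitted amplitude is proportional to u*x
        x -= u * x                   # depletion of resources
print(f"first amplitudes: {np.round(amps[:5], 3)}, steady mean: {np.mean(amps[-200:]):.3f}")
```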

9.
Curr Opin Neurobiol ; 58: 155-166, 2019 10.
Article in English | MEDLINE | ID: mdl-31590003

ABSTRACT

The dominant modeling framework for understanding cortical computations is that of heuristic firing-rate models. Despite their success, these models fall short of capturing spike synchronization effects, linking to biophysical parameters, and describing finite-size fluctuations. In this opinion article, we propose that the refractory density method (RDM), also known as age-structured population dynamics or quasi-renewal theory, yields a powerful theoretical framework for building rate-based models of mesoscopic neural populations from realistic neuron dynamics at the microscopic level. We review recent advances achieved by the RDM in obtaining efficient population density equations for networks of generalized integrate-and-fire (GIF) neurons - a class of neuron models that has been successfully fitted to various cell types. The theory not only predicts the nonstationary dynamics of large populations of neurons but also permits an extension to finite-size populations and a systematic reduction to low-dimensional rate dynamics. The new types of rate models will allow a re-examination of models of cortical computations under biological constraints.


Subject(s)
Neurons , Action Potentials , Biophysics , Computer Simulation , Models, Neurological
10.
PLoS Comput Biol ; 15(6): e1007122, 2019 06.
Article in English | MEDLINE | ID: mdl-31181063

ABSTRACT

While most models of randomly connected neural networks assume single-neuron models with simple dynamics, neurons in the brain exhibit complex intrinsic dynamics over multiple timescales. We analyze how the dynamical properties of single neurons and recurrent connections interact to shape the effective dynamics in large randomly connected networks. A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate neurons shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single neurons. For the case of two-dimensional rate neurons with strong adaptation, we find that the network exhibits a state of "resonant chaos", characterized by robust, narrow-band stochastic oscillations. The coherence of stochastic oscillations is maximal at the onset of chaos and their correlation time scales with the adaptation timescale of single units. Surprisingly, the resonance frequency can be predicted from the properties of isolated neurons, even in the presence of heterogeneity in the adaptation parameters. In the presence of these internally-generated chaotic fluctuations, the transmission of weak, low-frequency signals is strongly enhanced by adaptation, whereas signal transmission is not influenced by adaptation in the non-chaotic regime. Our theoretical framework can be applied to other mechanisms at the level of single neurons, such as synaptic filtering, refractoriness or spike synchronization. These results advance our understanding of the interaction between the dynamics of single units and recurrent connectivity, which is a fundamental step toward the description of biologically realistic neural networks.


Subject(s)
Models, Neurological , Nerve Net/physiology , Neurons/physiology , Computational Biology , Computer Simulation , Signal-To-Noise Ratio
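
A minimal simulation of the setting: a strongly coupled random network of rate units with a slow adaptation variable, whose single-unit power spectrum develops a peak. Parameters are illustrative, and the dynamical mean-field theory itself is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(7)

# Randomly connected rate network with slow adaptation; strong coupling puts the
# network in the fluctuating (chaotic) regime.
N, g = 300, 2.5
J = g * rng.standard_normal((N, N)) / np.sqrt(N)
tau, tau_a, beta = 1.0, 10.0, 1.0     # unit and adaptation time constants, strength
dt, steps = 0.05, 20000

x = rng.standard_normal(N)
a = np.zeros(N)
rec = np.empty(steps)
for i in range(steps):
    x += dt * (-x - beta * a + J @ np.tanh(x)) / tau
    a += dt * (-a + x) / tau_a
    rec[i] = x[0]

# Power spectrum of one unit: adaptation concentrates power in a narrow band
spec = np.abs(np.fft.rfft(rec - rec.mean())) ** 2
freqs = np.fft.rfftfreq(steps, d=dt)
print(f"spectral peak at ~{freqs[1:][spec[1:].argmax()]:.3f} (units of 1/tau)")
```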
11.
PLoS Comput Biol ; 13(4): e1005507, 2017 04.
Article in English | MEDLINE | ID: mdl-28422957

ABSTRACT

Neural population equations such as neural mass or field models are widely used to study brain activity on a large scale. However, the relation of these models to the properties of single neurons is unclear. Here we derive an equation for several interacting populations at the mesoscopic scale starting from a microscopic model of randomly connected generalized integrate-and-fire neuron models. Each population consists of 50-2000 neurons of the same type but different populations account for different neuron types. The stochastic population equations that we find reveal how spike-history effects in single-neuron dynamics such as refractoriness and adaptation interact with finite-size fluctuations on the population level. Efficient integration of the stochastic mesoscopic equations reproduces the statistical behavior of the population activities obtained from microscopic simulations of a full spiking neural network model. The theory describes nonlinear emergent dynamics such as finite-size-induced stochastic transitions in multistable networks and synchronization in balanced networks of excitatory and inhibitory neurons. The mesoscopic equations are employed to rapidly integrate a model of a cortical microcircuit consisting of eight neuron types, which allows us to predict spontaneous population activities as well as evoked responses to thalamic input. Our theory establishes a general framework for modeling finite-size neural population dynamics based on single cell and synapse parameters and offers an efficient approach to analyzing cortical circuits and computations.


Subject(s)
Action Potentials/physiology , Models, Neurological , Neurons/physiology , Algorithms , Computational Biology , Neural Networks, Computer , Visual Cortex
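
The key mesoscopic ingredient, finite-size fluctuations of the population activity, can be caricatured in a few lines: the expected rate is realized as a Poisson spike count per time bin. This toy update omits the refractory and adaptation history that the actual theory tracks.

```python
import numpy as np

rng = np.random.default_rng(8)

# Finite-size stochastic population update: the conditional rate f(h) is realized
# as a Poisson spike count per bin, so fluctuations scale like 1/sqrt(N).
N, dt = 400, 1e-3
tau, J, mu = 10e-3, 1.5, 0.2
f = lambda h: 40.0 / (1.0 + np.exp(-5.0 * (h - 0.5)))   # rate function (Hz)

h = 0.0
A = np.empty(5000)
for i in range(5000):
    n = rng.poisson(N * f(h) * dt)       # stochastic spike count of the population
    A[i] = n / (N * dt)                  # empirical population activity (Hz)
    h += dt * (mu - h) / tau + J * n / N # leaky input + recurrent voltage kicks
print(f"mean activity: {A.mean():.1f} Hz, std: {A.std():.1f} Hz")
```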
12.
Front Neurosci ; 10: 206, 2016.
Article in English | MEDLINE | ID: mdl-27242410

ABSTRACT

Reconsolidation of memories has mostly been studied at the behavioral and molecular level. Here, we put forward a simple extension of existing computational models of synaptic consolidation to capture hippocampal slice experiments that have been interpreted as reconsolidation at the synaptic level. The model implements reconsolidation through stabilization of consolidated synapses by stabilizing entities combined with an activity-dependent reservoir of stabilizing entities that are immune to protein synthesis inhibition (PSI). We derive a reduced version of our model to explore the conditions under which synaptic reconsolidation does or does not occur, often referred to as the boundary conditions of reconsolidation. We find that our computational model of synaptic reconsolidation displays complex boundary conditions. Our results suggest that a limited resource of hypothetical stabilizing molecules or complexes, which may be implemented by protein phosphorylation or different receptor subtypes, can underlie the phenomenon of synaptic reconsolidation.

13.
Article in English | MEDLINE | ID: mdl-26565154

ABSTRACT

Networks of fast nonlinear elements may display slow fluctuations if interactions are strong. We find a transition in the long-term variability of a sparse recurrent network of perfect integrate-and-fire neurons at which the Fano factor switches from zero to infinity and the correlation time is minimized. This corresponds to a bifurcation in a linear map arising from the self-consistency of temporal input and output statistics. More realistic neural dynamics with a leak current and refractory period lead to smoothed transitions and modified critical couplings that can be theoretically predicted.

14.
J Comput Neurosci ; 39(1): 29-51, 2015 Aug.
Article in English | MEDLINE | ID: mdl-25936628

ABSTRACT

Nerve cells in the brain generate sequences of action potentials with complex statistics. Theoretical attempts to understand these statistics were largely limited to the case of temporally uncorrelated input (Poissonian shot noise) from the neurons in the surrounding network. However, the stimulation from thousands of other neurons has various sorts of temporal structure. Firstly, input spike trains are temporally correlated because their firing rates can carry complex signals and because of cell-intrinsic properties like neural refractoriness, bursting, or adaptation. Secondly, at the connections between neurons, the synapses, usage-dependent changes in the synaptic weight (short-term plasticity) further shape the correlation structure of the effective input to the cell. From the theoretical side, it is poorly understood how these correlated stimuli, so-called colored noise, affect the spike train statistics. In particular, no standard method exists to solve the associated first-passage-time problem for the interspike-interval statistics with arbitrarily colored noise. Assuming that input fluctuations are weaker than the mean neuronal drive, we derive simple formulas for the essential interspike-interval statistics for a canonical model of a tonically firing neuron subjected to arbitrarily correlated input from the network. We verify our theory by numerical simulations for three paradigmatic situations that lead to input correlations: (i) rate-coded naturalistic stimuli in presynaptic spike trains; (ii) presynaptic refractoriness or bursting; (iii) synaptic short-term plasticity. In all cases, we find strong effects on the interval statistics. Our results provide a framework for the interpretation of firing statistics measured in vivo in the brain.


Subject(s)
Action Potentials/physiology , Brain/cytology , Models, Neurological , Neurons/physiology , Synaptic Transmission/physiology , Animals , Humans , Stochastic Processes , Time Factors
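
A simple numerical counterpart to this setup: a perfect integrate-and-fire neuron driven by a mean current plus slowly correlated Ornstein-Uhlenbeck noise, with the ISI coefficient of variation and lag-one serial correlation estimated from the spike train. Parameters are arbitrary illustrations of the weak, slow-noise regime.

```python
import numpy as np

rng = np.random.default_rng(9)

# Perfect integrate-and-fire neuron driven by a mean input plus slow OU noise.
mu, V_th, dt = 1.0, 1.0, 1e-3        # mean ISI is ~1 time unit
tau_c, sigma = 2.0, 0.2              # noise correlation time >> ISI, weak amplitude

V, eta, t, t_last, isis = 0.0, 0.0, 0.0, 0.0, []
while len(isis) < 2000:
    eta += dt * (-eta) / tau_c + sigma * np.sqrt(2 * dt / tau_c) * rng.standard_normal()
    V += dt * (mu + eta)
    if V >= V_th:
        isis.append(t - t_last)
        t_last, V = t, V - V_th
    t += dt

isis = np.array(isis)
d = isis - isis.mean()
scc1 = (d[:-1] * d[1:]).mean() / d.var()
print(f"CV: {isis.std()/isis.mean():.2f}, SCC at lag 1: {scc1:.2f}")  # positive for slow noise
```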
15.
J Comput Neurosci ; 38(3): 589-600, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25894991

ABSTRACT

We study the spike statistics of an adaptive exponential integrate-and-fire neuron stimulated by white Gaussian current noise. We derive analytical approximations for the coefficient of variation and the serial correlation coefficient of the interspike interval assuming that the neuron operates in the mean-driven tonic firing regime and that the stochastic input is weak. Our result for the serial correlation coefficient has the form of a geometric sequence and is confirmed by the comparison to numerical simulations. The theory predicts various patterns of interval correlations (positive or negative at lag one, monotonically decreasing or oscillating) depending on the strength of the spike-triggered and subthreshold components of the adaptation current. In particular, for pure subthreshold adaptation we find strong positive ISI correlations that are usually ascribed to positive correlations in the input current. Our results i) provide an alternative explanation for interspike-interval correlations observed in vivo, ii) may be useful in fitting point neuron models to experimental data, and iii) may be instrumental in exploring the role of adaptation currents for signal detection and signal transmission in single neurons.


Subject(s)
Adaptation, Physiological/physiology , Models, Neurological , Neurons/physiology , Algorithms , Computer Simulation , Electrophysiological Phenomena/physiology , Normal Distribution , Stochastic Processes , Synaptic Transmission/physiology
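
A simulation sketch of the mechanism, using an adaptive leaky integrate-and-fire neuron (the exponential spike-initiation term of the AdEx model is dropped for brevity) with a spike-triggered adaptation current and weak white noise:

```python
import numpy as np

rng = np.random.default_rng(10)

# Adaptive leaky IF neuron with weak white noise, in the mean-driven tonic regime.
dt, tau, tau_w = 5e-5, 10e-3, 100e-3
mu, V_th, V_r = 2.0, 1.0, 0.0
b, D = 0.05, 1e-4                     # spike-triggered adaptation jump, noise intensity

V, w, t, t_last, isis = 0.0, 0.0, 0.0, 0.0, []
while len(isis) < 5000:
    V += dt * (mu - V - w) / tau + np.sqrt(2 * D * dt) / tau * rng.standard_normal()
    w += dt * (-w) / tau_w
    if V >= V_th:
        isis.append(t - t_last)
        t_last, V = t, V_r
        w += b                        # spike-triggered adaptation
    t += dt

d = np.array(isis) - np.mean(isis)
scc = [(d[:-k] * d[k:]).mean() / d.var() for k in range(1, 6)]
print("SCC at lags 1-5:", np.round(scc, 2))   # negative at lag one, decaying pattern
```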
16.
Article in English | MEDLINE | ID: mdl-26764723

ABSTRACT

The calculation of the steady-state probability density for multidimensional stochastic systems that do not obey detailed balance is a difficult problem. Here we present the analytical derivation of the stationary joint and various marginal probability densities for a stochastic neuron model with adaptation current. Our approach assumes weak noise but is valid for arbitrary adaptation strength and time scale. The theory predicts several effects of adaptation on the statistics of the membrane potential of a tonically firing neuron: (i) a membrane potential distribution with a convex shape, (ii) a strongly increased probability of hyperpolarized membrane potentials induced by strong and fast adaptation, and (iii) a maximized variability associated with the adaptation current at a finite adaptation time scale.

17.
Article in English | MEDLINE | ID: mdl-25615126

ABSTRACT

Finite-sized populations of spiking elements are fundamental to brain function and also appear in many areas of physics. Here we present a theory of the dynamics of finite-sized populations of spiking units, based on a quasirenewal description of neurons with adaptation. We derive an integral equation with colored noise that governs the stochastic dynamics of the population activity in response to time-dependent stimulation and calculate the spectral density in the asynchronous state. We show that systems of coupled populations with adaptation can generate a frequency band in which sensory information is preferentially encoded. The theory is applicable to fully as well as randomly connected networks, and to leaky integrate-and-fire as well as generalized spiking neurons with adaptation on multiple time scales.


Subject(s)
Adaptation, Physiological , Models, Neurological , Neurons/cytology
18.
Front Comput Neurosci ; 7: 164, 2013.
Article in English | MEDLINE | ID: mdl-24348372

ABSTRACT

Neural firing is often subject to negative feedback by adaptation currents. These currents can induce strong correlations among the time intervals between spikes. Here we study analytically the interval correlations of a broad class of noisy neural oscillators with spike-triggered adaptation of arbitrary strength and time scale. Our weak-noise theory provides a general relation between the correlations and the phase-response curve (PRC) of the oscillator, proves anti-correlations between neighboring intervals for adapting neurons with type I PRC and identifies a single order parameter that determines the qualitative pattern of correlations. Monotonically decaying or oscillating correlation structures can be related to qualitatively different voltage traces after spiking, which can be explained by the phase plane geometry. At high firing rates, the long-term variability of the spike train associated with the cumulative interval correlations becomes small, independent of model details. Our results are verified by comparison with stochastic simulations of the exponential, leaky, and generalized integrate-and-fire models with adaptation.

19.
PLoS Comput Biol ; 9(8): e1003170, 2013.
Article in English | MEDLINE | ID: mdl-23966844

ABSTRACT

Stochastic signals with pronounced oscillatory components are frequently encountered in neural systems. Input currents to a neuron in the form of stochastic oscillations could be of exogenous origin, e.g. sensory input or synaptic input from a network rhythm. They shape spike firing statistics in a characteristic way, which we explore theoretically in this report. We consider a perfect integrate-and-fire neuron that is stimulated by a constant base current (to drive regular spontaneous firing), along with Gaussian narrow-band noise (a simple example of stochastic oscillations), and a broadband noise. We derive expressions for the nth-order interval distribution, its variance, and the serial correlation coefficients of the interspike intervals (ISIs) and confirm these analytical results by computer simulations. The theory is then applied to experimental data from electroreceptors of paddlefish, which have two distinct types of internal noisy oscillators, one forcing the other. The theory provides an analytical description of their afferent spiking statistics during spontaneous firing, and replicates a pronounced dependence of ISI serial correlation coefficients on the relative frequency of the driving oscillations, and furthermore allows extraction of certain parameters of the intrinsic oscillators embedded in these electroreceptors.


Subject(s)
Fishes/physiology , Models, Neurological , Sensory Receptor Cells/physiology , Action Potentials/physiology , Animals , Computational Biology , Computer Simulation , Models, Statistical
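
The model ingredients are easy to simulate: a perfect integrate-and-fire neuron receives a constant base current plus narrow-band Gaussian noise, generated here as an underdamped stochastic harmonic oscillator, and the ISI serial correlation coefficients are estimated from the resulting spike train. All parameters, and the noise-generation scheme, are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)

# Perfect IF neuron driven by a base current plus narrow-band Gaussian noise.
dt, mu, V_th = 1e-3, 1.0, 1.0         # base rate ~1 spike per time unit
f0, gamma, amp = 0.8, 0.5, 0.3        # oscillation frequency, bandwidth, amplitude
om = 2 * np.pi * f0

y, ydot = 0.0, 0.0                    # underdamped stochastic harmonic oscillator
V, t, t_last, isis = 0.0, 0.0, 0.0, []
while len(isis) < 2000:
    ydot += dt * (-2 * gamma * ydot - om**2 * y) + om * amp * np.sqrt(dt) * rng.standard_normal()
    y += dt * ydot
    V += dt * (mu + y)
    if V >= V_th:
        isis.append(t - t_last)
        t_last, V = t, V - V_th
    t += dt

d = np.array(isis) - np.mean(isis)
scc = [(d[:-k] * d[k:]).mean() / d.var() for k in range(1, 6)]
print("SCC at lags 1-5:", np.round(scc, 2))  # oscillatory pattern set by f0 vs. firing rate
```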
20.
Article in English | MEDLINE | ID: mdl-23882211

ABSTRACT

Signals from different sensory modalities may converge on a single neuron. We study theoretically a setup in which one signal is transmitted via facilitating synapses (F signal) and another via depressing synapses (D signal). When both signals are present, the postsynaptic cell preferentially encodes information about slow components of the F signal and fast components of the D signal, whereas for a single signal, transmission is broadband. We also show that, in the fluctuation-driven regime, the rate of information transmission may be increased through stochastic resonance (SR). Remarkably, the role of the beneficial noise is played by another signal, which is itself represented in the spike train of the postsynaptic cell.
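
The two synapse types can be illustrated with the Tsodyks-Markram model under facilitation-dominated versus depression-dominated parameters: the mean transmitted amplitude grows with presynaptic rate for the F synapse and shrinks for the D synapse. This sketches only the synaptic front end, not the paper's information-theoretic analysis; all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(12)

def tm_efficacy(rate, U, tau_f, tau_d, T=200.0, dt=1e-3):
    """Mean transmitted amplitude u*x of a Tsodyks-Markram synapse at a Poisson rate."""
    u, x, amps = U, 1.0, []
    for _ in range(int(T / dt)):
        u += dt * (U - u) / tau_f    # utilization relaxes to baseline
        x += dt * (1.0 - x) / tau_d  # resources recover
        if rng.random() < rate * dt:
            u += U * (1.0 - u)       # facilitation (one common update convention)
            amps.append(u * x)
            x -= u * x               # depression
    return np.mean(amps) if amps else 0.0

fac = dict(U=0.1, tau_f=1.0, tau_d=0.05)    # facilitation-dominated parameters
dep = dict(U=0.5, tau_f=0.05, tau_d=0.5)    # depression-dominated parameters
for r in (2, 10, 50):
    print(f"rate {r:>2} Hz: F amp {tm_efficacy(r, **fac):.3f}, "
          f"D amp {tm_efficacy(r, **dep):.3f}")
```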
