Results 1 - 12 of 12
1.
PLoS Comput Biol ; 18(9): e1010086, 2022 09.
Article in English | MEDLINE | ID: mdl-36074778

ABSTRACT

Sustainable research on computational models of neuronal networks requires published models to be understandable, reproducible, and extendable. Missing details or ambiguities about mathematical concepts and assumptions, algorithmic implementations, or parameterizations hinder progress. Such flaws are unfortunately frequent, and one reason is a lack of readily applicable standards and tools for model description. Our work aims not only to advance complete and concise descriptions of network connectivity but also to guide the implementation of connection routines in simulation software and neuromorphic hardware systems. We first review models made available by the computational neuroscience community in the repositories ModelDB and Open Source Brain, and investigate the corresponding connectivity structures and their descriptions in both manuscript and code. The review comprises the connectivity of networks with diverse levels of neuroanatomical detail and exposes how connectivity is abstracted in existing description languages and simulator interfaces. We find that a substantial proportion of the published descriptions of connectivity are ambiguous. Based on this review, we derive a set of connectivity concepts for deterministically and probabilistically connected networks and also address networks embedded in metric space. Besides these mathematical and textual guidelines, we propose a unified graphical notation for network diagrams to facilitate an intuitive understanding of network properties. Examples of representative network models demonstrate the practical use of the ideas. We hope that the proposed standardizations will contribute to unambiguous descriptions and reproducible implementations of neuronal network connectivity in computational neuroscience.
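The connectivity concepts discussed in this abstract can be made concrete in code. The sketch below uses plain NumPy and is illustrative only (the function names and parameters are ours, not part of the reviewed description languages or simulators); it implements two common probabilistic connection rules, pairwise Bernoulli and fixed in-degree:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def pairwise_bernoulli(n_source, n_target, p):
    """Connect each (source, target) pair independently with probability p.
    Returns a binary matrix A with A[i, j] = 1 iff source j projects to target i."""
    return (rng.random((n_target, n_source)) < p).astype(int)

def fixed_indegree(n_source, n_target, k):
    """Each target neuron draws exactly k distinct sources (no multapses)."""
    A = np.zeros((n_target, n_source), dtype=int)
    for i in range(n_target):
        A[i, rng.choice(n_source, size=k, replace=False)] = 1
    return A

A_bern = pairwise_bernoulli(100, 50, p=0.1)   # expected in-degree 10, variable
A_fix = fixed_indegree(100, 50, k=10)         # in-degree exactly 10 by construction
```

The two rules yield the same mean in-degree but different in-degree distributions, which is exactly the kind of detail that is often left ambiguous in published model descriptions.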


Subject(s)
Models, Neurological , Neurosciences , Computer Simulation , Neurons/physiology , Software
2.
Cell Rep ; 39(11): 110948, 2022 06 14.
Article in English | MEDLINE | ID: mdl-35705055

ABSTRACT

Dendrites are essential determinants of the input-output relationship of single neurons, but their role in network computations is not well understood. Here, we use a combination of dendritic patch-clamp recordings and in silico modeling to determine how dendrites of parvalbumin (PV)-expressing basket cells contribute to network oscillations in the gamma frequency band. Simultaneous soma-dendrite recordings from PV basket cells in the dentate gyrus reveal that the slope, or gain, of the dendritic input-output relationship is exceptionally low, thereby reducing the cell's sensitivity to changes in its input. By simulating gamma oscillations in detailed network models, we demonstrate that the low gain is key to increase spike synchrony in PV basket cell assemblies when cells are driven by spatially and temporally heterogeneous synaptic inputs. These results highlight the role of inhibitory neuron dendrites in synchronized network oscillations.


Subject(s)
Interneurons , Parvalbumins , Action Potentials/physiology , Dendrites/physiology , Interneurons/physiology , Neurons
3.
Proc Natl Acad Sci U S A ; 117(41): 25505-25516, 2020 10 13.
Article in English | MEDLINE | ID: mdl-33008882

ABSTRACT

An elemental computation in the brain is to identify the best in a set of options and report its value. It is required for inference, decision-making, optimization, action selection, consensus, and foraging. Neural computing is considered powerful because of its parallelism; however, it is unclear whether neurons can perform this max-finding operation in a way that improves upon the prohibitively slow optimal serial max-finding computation (which takes [Formula: see text] time for N noisy candidate options) by a factor of N, the benchmark for parallel computation. Biologically plausible architectures for this task are winner-take-all (WTA) networks, where individual neurons inhibit each other so only those with the largest input remain active. We show that conventional WTA networks fail the parallelism benchmark and, worse, in the presence of noise, altogether fail to produce a winner when N is large. We introduce the nWTA network, in which neurons are equipped with a second nonlinearity that prevents weakly active neurons from contributing inhibition. Without parameter fine-tuning or rescaling as N varies, the nWTA network achieves the parallelism benchmark. The network reproduces experimentally observed phenomena like Hick's law without needing an additional readout stage or adaptive N-dependent thresholds. Our work bridges scales by linking cellular nonlinearities to circuit-level decision-making, establishes that distributed computation saturating the parallelism benchmark is possible in networks of noisy, finite-memory neurons, and shows that Hick's law may be a symptom of near-optimal parallel decision-making with noisy input.
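The nWTA mechanism described above can be caricatured in a few lines. The sketch below is a hypothetical rate-based toy model, not the authors' published implementation, and all parameter values are illustrative: each unit integrates its input plus its own activity minus a shared inhibition, and only units whose activity exceeds a threshold theta contribute to that inhibition (theta = 0 recovers a conventional WTA network).

```python
import numpy as np

def nwta_step(x, b, beta=2.0, theta=0.5, dt=0.02):
    """One Euler step of a rate-based winner-take-all toy network.

    Only units with activity above theta contribute to the shared
    inhibition -- a stand-in for the second nonlinearity of the nWTA
    network that silences weakly active units.
    """
    contrib = np.where(x > theta, x, 0.0)        # weakly active units are silent
    inhibition = beta * contrib.sum()
    dx = -x + np.maximum(b + x - inhibition, 0.0)
    return x + dt * dx

b = np.array([1.0, 0.6, 0.5, 0.4])               # candidate option values
x = np.zeros_like(b)
for _ in range(2000):
    x = nwta_step(x, b)

winner = int(np.argmax(x))                       # unit 0 carries the largest input
```

In this toy dynamics the unit with the largest input crosses theta first, its inhibition suppresses the rest, and the losers decay without ever feeding inhibition back.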


Subject(s)
Decision Making/physiology , Models, Neurological , Neural Networks, Computer , Neurons/physiology , Nerve Net/physiology , Nonlinear Dynamics
4.
J Comput Neurosci ; 45(2): 103-132, 2018 10.
Article in English | MEDLINE | ID: mdl-30146661

ABSTRACT

Capturing the response behavior of spiking neuron models with rate-based models facilitates the investigation of neuronal networks using powerful methods for rate-based network dynamics. To this end, we investigate the responses of two widely used neuron model types, the Izhikevich and augmented multi-adaptive threshold (AMAT) models, to spiking inputs ranging from step responses to natural spike data. We find (i) that linear-nonlinear firing rate models fitted to test data can be used to describe the firing-rate responses of AMAT and Izhikevich spiking neuron models in many cases; (ii) that firing-rate responses are generally too complex to be captured by first-order low-pass filters but require bandpass filters instead; (iii) that linear-nonlinear models capture the responses of AMAT models better than those of Izhikevich models; (iv) that the wide range of response types evoked by current-injection experiments collapses to few response types when neurons are driven by stationary or sinusoidally modulated Poisson input; and (v) that AMAT and Izhikevich models show different responses to spike input despite identical responses to current injections. Together, these findings suggest that rate-based models of network dynamics may capture a wider range of neuronal response properties by incorporating second-order bandpass filters fitted to responses of spiking model neurons. These models may contribute to bringing rate-based network modeling closer to the reality of biological neuronal networks.
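As an illustration of point (ii), a bandpass linear stage can be built as the difference of two first-order low-pass filters followed by a static nonlinearity. This is a generic sketch with made-up parameter values, not one of the fitted models from the study:

```python
import numpy as np

def linear_nonlinear_rate(input_rate, dt=0.001, tau_fast=0.01, tau_slow=0.05,
                          gain=1.0, baseline=5.0):
    """Linear-nonlinear rate model with a bandpass linear stage.

    The bandpass filter is the difference of two first-order low-pass
    filters (tau_fast < tau_slow), followed by rectification. All
    parameter values are illustrative.
    """
    fast = slow = 0.0
    out = np.empty_like(input_rate)
    for t, r in enumerate(input_rate):
        fast += dt / tau_fast * (r - fast)       # fast low-pass stage
        slow += dt / tau_slow * (r - slow)       # slow low-pass stage
        out[t] = max(baseline + gain * (fast - slow), 0.0)
    return out

# Step in input rate: the bandpass stage yields a transient overshoot
# that relaxes back to baseline, unlike a pure low-pass response.
step = np.concatenate([np.full(500, 10.0), np.full(1500, 30.0)])
response = linear_nonlinear_rate(step)
```

A first-order low-pass model would settle at a new steady rate after the step; the bandpass model instead responds transiently, which is the qualitative behavior the abstract argues is needed.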


Subject(s)
Action Potentials/physiology , Models, Neurological , Neurons/physiology , Animals , Computer Simulation , Electric Stimulation , Linear Models , Nerve Net , Nonlinear Dynamics
5.
Front Comput Neurosci ; 8: 136, 2014.
Article in English | MEDLINE | ID: mdl-25400575

ABSTRACT

Random networks of integrate-and-fire neurons with strong current-based synapses can, contrary to previous belief, assume stable states of sustained asynchronous and irregular firing, even without external random background or pacemaker neurons. We analyze the mechanisms underlying the emergence, lifetime, and irregularity of such self-sustained activity states. We first demonstrate how the competition between the mean and the variance of the synaptic input leads to a non-monotonic firing-rate transfer in the network. Thus, by increasing the synaptic coupling strength, the system can become bistable: in addition to the quiescent state, a second stable fixed point at moderate firing rates can emerge via a saddle-node bifurcation. Inherently generated fluctuations of the population firing rate around this non-trivial fixed point can trigger transitions into the quiescent state. Hence, the trade-off between the magnitude of the population-rate fluctuations and the size of the basin of attraction of the non-trivial rate fixed point determines the onset and the lifetime of self-sustained activity states. During self-sustained activity, individual neuronal activity is moreover highly irregular, switching between long periods of low firing rate and short burst-like states. We show that this is an effect of the strong synaptic weights and the finite time constant of synaptic and neuronal integration, and can actually serve to stabilize the self-sustained state.

6.
J Comput Neurosci ; 35(3): 359-75, 2013 Dec.
Article in English | MEDLINE | ID: mdl-23783890

ABSTRACT

Firing-rate models provide a practical tool for studying signal processing in the early visual system, permitting more thorough mathematical analysis than spike-based models. We show here that essential response properties of relay cells in the lateral geniculate nucleus (LGN) can be captured by surprisingly simple firing-rate models consisting of a low-pass filter and a nonlinear activation function. The starting points for our analysis are two spiking neuron models based on experimental data: a spike-response model fitted to data from macaque (Carandini et al. J. Vis., 20(14), 1-2011, 2007), and a model with conductance-based synapses and afterhyperpolarizing currents fitted to data from cat (Casti et al. J. Comput. Neurosci., 24(2), 235-252, 2008). We obtained the nonlinear activation function by stimulating the model neurons with stationary stochastic spike trains, while we characterized the linear filter by fitting a low-pass filter to responses to sinusoidally modulated stochastic spike trains. To account for the non-Poisson nature of retinal spike trains, we performed all analyses with spike trains with higher-order gamma statistics in addition to Poissonian spike trains. Interestingly, the properties of the low-pass filter depend only on the average input rate, but not on the modulation depth of sinusoidally modulated input. Thus, the response properties of our model are fully specified by just three parameters (low-frequency gain, cutoff frequency, and delay) for a given mean input rate and input regularity. This simple firing-rate model reproduces the response of spiking neurons to a step in input rate very well for Poissonian as well as for non-Poissonian input. We also found that the cutoff frequencies, and thus the filter time constants, of the rate-based model are unrelated to the membrane time constants of the underlying spiking models, in agreement with similar observations for simpler models.
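The three-parameter model described above (low-frequency gain, cutoff frequency, and delay) can be sketched as a delayed first-order low-pass step response. The parameter values below are illustrative placeholders, not the fits to the macaque or cat data:

```python
import numpy as np

def lgn_rate_response(t, rate_step, gain=0.5, cutoff_hz=20.0, delay=0.003):
    """Step response of a three-parameter firing-rate model.

    Low-frequency gain, cutoff frequency, and delay fully specify the
    filter; the time constant follows from the cutoff frequency. All
    values here are illustrative.
    """
    tau = 1.0 / (2.0 * np.pi * cutoff_hz)        # low-pass time constant
    t_eff = np.maximum(t - delay, 0.0)           # pure delay stage
    return gain * rate_step * (1.0 - np.exp(-t_eff / tau))

t = np.linspace(0.0, 0.1, 1001)                  # 100 ms after a rate step
r = lgn_rate_response(t, rate_step=40.0)         # response to a 40 spikes/s step
```

After the delay, the response rises monotonically toward the steady-state value gain * rate_step, with the rise time set entirely by the cutoff frequency rather than by any membrane time constant.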


Subject(s)
Geniculate Bodies/physiology , Neurons/physiology , Algorithms , Animals , Computer Simulation , Electric Stimulation , Electrophysiological Phenomena/physiology , Excitatory Postsynaptic Potentials/physiology , Membrane Potentials/physiology , Models, Neurological , Nonlinear Dynamics , Synaptic Transmission/physiology
7.
Front Comput Neurosci ; 7: 187, 2013.
Article in English | MEDLINE | ID: mdl-24501591

ABSTRACT

Pattern formation, i.e., the generation of an inhomogeneous spatial activity distribution in a dynamical system with translation-invariant structure, is a well-studied phenomenon in neuronal network dynamics, specifically in neural field models. These are population models that describe the spatio-temporal dynamics of large groups of neurons in terms of macroscopic variables such as population firing rates. Though neural field models are often deduced from and equipped with biophysically meaningful properties, a direct mapping to simulations of individual spiking neuron populations is rarely considered. Neurons have a distinct identity defined by their action on their postsynaptic targets. In its simplest form, they act either excitatorily or inhibitorily. When the distribution of neuron identities is assumed to be periodic, pattern formation can be observed, provided the coupling strength is supracritical, i.e., larger than a critical weight. We find that this critical weight depends strongly on the characteristics of the neuronal input, i.e., on whether neurons are mean- or fluctuation-driven, and different limits in linearizing the full non-linear system apply in order to assess stability. In particular, if neurons are mean-driven, the linearization has a very simple form and becomes independent of both the fixed-point firing rate and the variance of the input current, while in the very strongly fluctuation-driven regime the fixed-point rate, as well as the input mean and variance, are important parameters in the determination of the critical weight. We demonstrate that, interestingly, even in "intermediate" regimes, when the system is technically fluctuation-driven, the simple linearization neglecting the variance of the input can yield a better prediction of the critical coupling strength. We moreover analyze the effects of structural randomness, introduced by rewiring individual synapses or redistributing weights, as well as of coarse-graining, on the formation of inhomogeneous activity patterns.

8.
Chaos ; 22(3): 033143, 2012 Sep.
Article in English | MEDLINE | ID: mdl-23020482

ABSTRACT

Under which conditions can a network of pulse-coupled oscillators sustain stable collective activity states? Previously, it was shown that stability of the simplest pattern conceivable, i.e., global synchrony, in networks of symmetrically pulse-coupled oscillators can be decided in a rigorous mathematical fashion if interactions either all advance or all retard oscillation phases ("mono-interaction networks"). Yet many real-world networks, for example neuronal circuits, are asymmetric and, moreover, crucially feature both types of interactions. Here, we study complex networks of excitatory (phase-advancing) and inhibitory (phase-retarding) leaky integrate-and-fire (LIF) oscillators. We show that for small coupling strength, previous results for mono-interaction networks also apply here: pulse-time perturbations eventually decay if they are smaller than a transmission delay and if all eigenvalues of the linear stability operator have absolute value smaller than or equal to one. In this case, the level of inhibition must typically be significantly stronger than that of excitation to ensure local stability of synchrony. For stronger coupling, however, network synchrony eventually becomes unstable to any finite perturbation, even if inhibition is strong and all eigenvalues of the stability operator are at most unity. This new type of instability occurs when any oscillator, in spite of receiving inhibitory input from the network on average, can by chance receive sufficient excitatory input to fire a pulse before all other pulses in the system are delivered, thus breaking the near-synchronous perturbation pattern.

9.
Network ; 23(4): 131-49, 2012.
Article in English | MEDLINE | ID: mdl-22994683

ABSTRACT

As computational neuroscience matures, many simulation environments are available that are useful for neuronal network modeling. However, methods for successfully documenting models for publication and for exchanging models and model components among these projects are still under development. Here we briefly review existing software and applications for network model creation, documentation and exchange. Then we discuss a few of the larger issues facing the field of computational neuroscience regarding network modeling and suggest solutions to some of these problems, concentrating in particular on standardized network model terminology, notation, and descriptions and explicit documentation of model scaling. We hope this will enable and encourage computational neuroscientists to share their models more systematically in the future.


Subject(s)
Computer Simulation , Documentation/methods , Information Dissemination/methods , Models, Neurological , Nerve Net/physiology , Software , Terminology as Topic , Animals , Humans , Programming Languages
10.
Article in English | MEDLINE | ID: mdl-21344004

ABSTRACT

Networks of well-known dynamical units but unknown interaction topology arise across various fields of biology, including genetics, ecology, and neuroscience. The collective dynamics of such networks is often sensitive to the presence (or absence) of individual interactions, but there is usually no direct way to probe for their existence. Here we present an explicit method for reconstructing interaction networks of leaky integrate-and-fire neurons from the spike patterns they exhibit in response to external driving. Provided the dynamical parameters are known, the approach works well for networks in simple collective states and is also applicable to networks exhibiting complex spatio-temporal spike patterns. In particular, stationarity of the spiking time series is not required.

11.
J Comput Neurosci ; 27(2): 177-200, 2009 Oct.
Article in English | MEDLINE | ID: mdl-19568923

ABSTRACT

Can the topology of a recurrent spiking network be inferred from observed activity dynamics? Which statistical parameters of network connectivity can be extracted from firing rates, correlations, and related measurable quantities? To approach these questions, we analyze distance-dependent correlations of the activity in small-world networks of neurons with current-based synapses, derived from a simple ring topology. We find that, in particular, the distribution of correlation coefficients of subthreshold activity can distinguish random networks from networks with distance-dependent connectivity. Such distributions can be estimated by sampling from random pairs. We also demonstrate the crucial role of the weight distribution, most notably compliance with Dale's principle, for the activity dynamics in recurrent networks of different types.
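Estimating a correlation distribution by sampling random pairs, as the abstract suggests, might look as follows. This is a generic sketch on surrogate data with a common signal component, not the study's ring-network simulations:

```python
import numpy as np

rng = np.random.default_rng(7)

def sampled_correlation_distribution(activity, n_pairs=1000):
    """Estimate the distribution of pairwise correlation coefficients
    by sampling random neuron pairs instead of computing all pairs.
    `activity` is an (n_neurons, n_timesteps) array of traces."""
    n = activity.shape[0]
    coeffs = np.empty(n_pairs)
    for k in range(n_pairs):
        i, j = rng.choice(n, size=2, replace=False)
        coeffs[k] = np.corrcoef(activity[i], activity[j])[0, 1]
    return coeffs

# Surrogate data: a shared signal induces correlations of about
# 0.25 / (0.25 + 1) = 0.2 between any two traces.
shared = rng.standard_normal(5000)
activity = 0.5 * shared + rng.standard_normal((200, 5000))
coeffs = sampled_correlation_distribution(activity)
```

Sampling pairs scales linearly in the number of samples, while the full pairwise matrix scales quadratically in the number of neurons, which is why the sampling estimate is practical for large recordings.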


Subject(s)
Action Potentials/physiology , Central Nervous System/physiology , Nerve Net/physiology , Neural Networks, Computer , Synaptic Transmission/physiology , Algorithms , Animals , Computer Simulation , Humans , Neural Pathways/physiology , Neurons/physiology , Synapses/physiology
12.
Neural Comput ; 20(9): 2185-226, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18439141

ABSTRACT

The function of cortical networks depends on the collective interplay between neurons and neuronal populations, which is reflected in the correlation of signals that can be recorded at different levels. To correctly interpret these observations it is important to understand the origin of neuronal correlations. Here we study how cells in large recurrent networks of excitatory and inhibitory neurons interact and how the associated correlations affect stationary states of idle network activity. We demonstrate that the structure of the connectivity matrix of such networks induces considerable correlations between synaptic currents as well as between subthreshold membrane potentials, provided Dale's principle is respected. If, in contrast, synaptic weights are randomly distributed, input correlations can vanish, even for densely connected networks. Although correlations are strongly attenuated when proceeding from membrane potentials to action potentials (spikes), the resulting weak correlations in the spike output can cause substantial fluctuations in the population activity, even in highly diluted networks. We show that simple mean-field models that take the structure of the coupling matrix into account can adequately describe the power spectra of the population activity. The consequences of Dale's principle on correlations and rate fluctuations are discussed in the light of recent experimental findings.
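The role of Dale's principle for input correlations can be reproduced in a static toy model: when every column of the weight matrix has a consistent sign, inputs to different neurons share positively weighted common sources, whereas independently signed weights make the mean input correlation vanish. The sketch below is illustrative only; the network size, weights, and surrogate activity are made up and the model is static, unlike the paper's recurrent spiking networks:

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_input_correlation(dale):
    """Mean pairwise correlation of summed synaptic input in a random network.

    With Dale's principle each presynaptic neuron is purely excitatory or
    purely inhibitory (one sign per column); without it, synaptic signs
    are drawn independently per synapse.
    """
    n, p = 400, 0.1
    conn = rng.random((n, n)) < p                        # random connectivity
    if dale:
        sign = np.where(rng.random(n) < 0.8, 1.0, -4.0)  # one sign per neuron
        w = conn * sign[None, :]
    else:
        w = conn * np.where(rng.random((n, n)) < 0.8, 1.0, -4.0)
    rates = rng.random((n, 2000))                        # surrogate activity
    inputs = w @ rates                                   # summed input per neuron
    c = np.corrcoef(inputs)
    return c[~np.eye(n, dtype=bool)].mean()

c_dale = mean_input_correlation(dale=True)      # clearly positive
c_random = mean_input_correlation(dale=False)   # close to zero
```

With Dale's principle, each shared presynaptic partner contributes the square of its weight to the input covariance, so the contributions cannot cancel; with independent signs, the expected contribution per shared partner is zero, mirroring the vanishing input correlations described above.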


Subject(s)
Cerebral Cortex/anatomy & histology , Models, Neurological , Nerve Net/physiology , Neurons/physiology , Population Dynamics , Statistics as Topic , Animals , Humans , Neural Networks, Computer