Results 1 - 20 of 35
1.
Biol Cybern ; 118(1-2): 39-81, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38583095

ABSTRACT

Stochastic models of synaptic plasticity must confront the corrosive influence of fluctuations in synaptic strength on patterns of synaptic connectivity. To solve this problem, we have proposed that synapses act as filters, integrating plasticity induction signals and expressing changes in synaptic strength only upon reaching filter threshold. Our earlier analytical study calculated the lifetimes of quasi-stable patterns of synaptic connectivity with synaptic filtering. We showed that the plasticity step size in a stochastic model of spike-timing-dependent plasticity (STDP) acts as a temperature-like parameter, exhibiting a critical value below which neuronal structure formation occurs. The filter threshold scales this temperature-like parameter downwards, cooling the dynamics and enhancing stability. A key step in this calculation was a resetting approximation, essentially reducing the dynamics to one-dimensional processes. Here, we revisit our earlier study to examine this resetting approximation, with the aim of understanding in detail why it works so well by comparing it, and a simpler approximation, to the system's full dynamics consisting of various embedded two-dimensional processes without resetting. Comparing the full system to the simpler approximation, to our original resetting approximation, and to a one-afferent system, we show that their equilibrium distributions of synaptic strengths and critical plasticity step sizes are all qualitatively similar, and increasingly quantitatively similar as the filter threshold increases. This increasing similarity is due to the decorrelation in changes in synaptic strength between different afferents caused by our STDP model, and the amplification of this decorrelation with larger synaptic filters.
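
The integrate-and-express mechanism at the heart of this and several of the following papers is easy to sketch. Below is a minimal NumPy toy, not the STDP model analysed in the paper: the threshold, step size, and balanced ±1 induction stream are all illustrative. It shows how threshold-gated expression suppresses fluctuations in synaptic strength.

```python
import numpy as np

rng = np.random.default_rng(1)

def filtered_synapse(inductions, theta, step=0.01):
    """Integrate-and-express synapse: +/-1 induction signals accumulate in a
    filter counter; a strength step is expressed (and the counter reset)
    only when the counter reaches +/-theta. theta=1 means no filtering."""
    s, c, trace = 1.0, 0, []
    for e in inductions:
        c += e
        if c >= theta:
            s += step; c = 0
        elif c <= -theta:
            s -= step; c = 0
        trace.append(s)
    return np.array(trace)

inductions = rng.choice([-1, 1], size=5000)   # balanced induction stream
for theta in (1, 8):
    tr = filtered_synapse(inductions, theta)
    print(f"theta={theta}: s.d. of strength along the path = {tr.std():.4f}")
# The filtered synapse wanders far less: rare, threshold-gated expression
# tames the fluctuations that would otherwise corrode connectivity patterns.
```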


Subject(s)
Models, Neurological , Neuronal Plasticity , Stochastic Processes , Synapses , Neuronal Plasticity/physiology , Synapses/physiology , Animals , Neurons/physiology , Humans , Action Potentials/physiology
2.
Biol Cybern ; 116(3): 327-362, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35286444

ABSTRACT

Models of associative memory with discrete state synapses learn new memories by forgetting old ones. In the simplest models, memories are forgotten exponentially quickly. Sparse population coding ameliorates this problem, as do complex models of synaptic plasticity that posit internal synaptic states, giving rise to synaptic metaplasticity. We examine memory lifetimes in both simple and complex models of synaptic plasticity with sparse coding. We consider our own integrative, filter-based model of synaptic plasticity, and examine the cascade and serial synapse models for comparison. We explore memory lifetimes at both the single-neuron and the population level, allowing for spontaneous activity. Memory lifetimes are defined using either a signal-to-noise ratio (SNR) approach or a first passage time (FPT) method, although we use the latter only for simple models at the single-neuron level. All studied models exhibit a decrease in the optimal single-neuron SNR memory lifetime, optimised with respect to sparseness, as the probability of synaptic updates decreases or, equivalently, as synaptic complexity increases. This holds regardless of spontaneous activity levels. In contrast, at the population level, even a low but nonzero level of spontaneous activity is critical in facilitating an increase in optimal SNR memory lifetimes with increasing synaptic complexity, but only in filter and serial models. However, SNR memory lifetimes are valid only in an asymptotic regime in which a mean field approximation is valid. By considering FPT memory lifetimes, we find that this asymptotic regime is not satisfied for very sparse coding, violating the conditions for the optimisation of single-perceptron SNR memory lifetimes with respect to sparseness. Similar violations are also expected for complex models of synaptic plasticity.
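
A toy, back-of-envelope version of an SNR memory lifetime with sparse coding helps fix ideas; this caricature is not any of the paper's filter, cascade, or serial models, and the rewrite probability f**2 and the SNR form are assumptions of the sketch.

```python
import numpy as np

N = 1e9                          # number of synapses (illustrative)
f = np.logspace(-4, -0.3, 400)   # candidate coding levels (sparseness)

# Assumed toy update rule: a new memory rewrites a synapse only when both its
# pre- and postsynaptic neurons are active in that memory, i.e. w.p. f**2.
p = f**2
snr0 = p * np.sqrt(N)            # initial signal-to-noise ratio (assumed form)

# SNR(t) = snr0 * (1 - p)**t ; the SNR lifetime is when this falls to 1.
tau = np.where(snr0 > 1.0, np.log(snr0) / -np.log1p(-p), 0.0)

best = np.argmax(tau)
print(f"optimal coding level f* ~ {f[best]:.4f}, "
      f"lifetime ~ {tau[best]:.3g} memory presentations")
# Sparser coding slows forgetting (smaller p) but weakens the initial imprint
# (smaller snr0); the optimal coding level trades these off.
```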


Subject(s)
Memory , Models, Neurological , Humans , Learning , Memory/physiology , Neuronal Plasticity/physiology , Synapses/physiology
3.
Neural Comput ; 32(6): 1069-1143, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32343647

ABSTRACT

Models of associative memory with discrete state synapses learn new memories by forgetting old ones. In contrast to non-integrative models of synaptic plasticity, models with integrative, filter-based synapses exhibit an initial rise in the fidelity of recall of stored memories. This rise to a peak is driven by a transient process and is then followed by a return to equilibrium. In a series of papers, we have employed a first passage time (FPT) approach to define and study memory lifetimes, incrementally developing our methods, from both simple and complex binary-strength synapses to simple multistate synapses. Here, we complete this work by analyzing FPT memory lifetimes in multistate, filter-based synapses. To achieve this, we integrate out the internal filter states so that we can work with transitions only in synaptic strength. We then generalize results on polysynaptic generating functions from binary strength to multistate synapses, allowing us to examine the dynamics of synaptic strength changes in an ensemble of synapses rather than just a single synapse. To derive analytical results for FPT memory lifetimes, we partition the synaptic dynamics into two distinct phases: the first, pre-peak phase studied with a drift-only approximation, and the second, post-peak phase studied with approximations to the full strength transition probabilities. These approximations capture the underlying dynamics very well, as demonstrated by the extremely good agreement between results obtained by simulating our model and results obtained from the Fokker-Planck or integral equation approaches to FPT processes.
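
The step of integrating out the internal filter states can be illustrated with an absorbing-Markov-chain (phase-type) computation; the thresholds and the 55/45 induction bias below are illustrative, not taken from the paper.

```python
import numpy as np

def strength_step_probs(theta, q_up):
    """Integrate out the filter: from the reset state c = 0, compute the
    probability that the *next* expressed strength transition is up vs
    down, absorbing the +/-1 induction walk at the thresholds +/-theta."""
    states = np.arange(-theta + 1, theta)
    n = len(states)
    Q = np.zeros((n, n))              # transient filter-state transitions
    R = np.zeros((n, 2))              # absorption into [down-step, up-step]
    for i, c in enumerate(states):
        if c + 1 < theta:  Q[i, i + 1] = q_up
        else:              R[i, 1] = q_up
        if c - 1 > -theta: Q[i, i - 1] = 1 - q_up
        else:              R[i, 0] = 1 - q_up
    B = np.linalg.solve(np.eye(n) - Q, R)
    return B[theta - 1]               # row for the reset state c = 0

for theta in (1, 4, 8):
    down, up = strength_step_probs(theta, q_up=0.55)
    print(f"theta={theta}: P(next strength step is up) = {up:.3f}")
# A weak 55/45 bias in induction signals is amplified by the filter into an
# increasingly directional strength transition; these absorption probabilities
# are exactly what is needed to work with strength transitions alone.
```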


Subject(s)
Association Learning/physiology , Memory/physiology , Models, Neurological , Synapses/physiology , Animals , Humans , Neuronal Plasticity/physiology
4.
Neural Comput ; 31(11): 2212-2251, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31525308

ABSTRACT

Repeated stimuli that are spaced apart in time promote the transition from short- to long-term memory, while massing repetitions together does not. Previously, we showed that a model of integrative synaptic plasticity, in which plasticity induction signals are integrated by a low-pass filter before plasticity is expressed, gives rise to a natural timescale at which to repeat stimuli, hinting at a partial account of this spacing effect. The account was only partial because the important role of neuromodulation was not considered. We now show that by extending the model to allow dynamic integrative synaptic plasticity, the model permits synapses to robustly discriminate between spaced and massed repetition protocols, suppressing the response to massed stimuli while maintaining that to spaced stimuli. This is achieved by dynamically coupling the filter decay rate to neuromodulatory signaling in a very simple model of the signaling cascades downstream from cAMP production. In particular, the model's parameters may be interpreted as corresponding to the duration and amplitude of the waves of activity in the MAPK pathway. We identify choices of parameters and repetition times for stimuli in this model that optimize the ability of synapses to discriminate between spaced and massed repetition protocols. The model is very robust to reasonable changes around these optimal parameters and times, but for large changes in parameters, the model predicts that massed and spaced stimuli cannot be distinguished or that the responses to both patterns are suppressed. A model of dynamic integrative synaptic plasticity therefore explains the spacing effect under normal conditions and also predicts its breakdown under abnormal conditions.
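
The paper's MAPK-pathway model is not reproduced here, but a toy in the same spirit conveys the idea of dynamically coupling the filter's decay rate to a stimulus-triggered modulatory wave. All parameter values, and the residual-trace readout itself, are invented for illustration.

```python
import numpy as np

def residual_trace(stim_times, lam0=2e-4, lam1=0.05, mu=0.05, t_end=2000):
    """Toy dynamic filter: each stimulus adds 1 to the filter F and launches
    a modulatory wave m (standing in for signalling downstream of cAMP) that
    transiently raises the filter's decay rate lambda(t) = lam0 + lam1*m."""
    F, m = 0.0, 0.0
    for t in range(t_end):
        if t in stim_times:
            F += 1.0
            m += 1.0
        F *= np.exp(-(lam0 + lam1 * m))
        m *= np.exp(-mu)
    return F

massed = residual_trace({0, 2, 4, 6, 8})
spaced = residual_trace({0, 200, 400, 600, 800})
print(f"residual filter trace: massed {massed:.3f}, spaced {spaced:.3f}")
# With these illustrative parameters the spaced protocol leaves a much larger
# persistent trace: massed stimuli all decay through one another's overlapping
# modulatory waves, whereas spaced stimuli arrive after earlier waves have died.
```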


Subject(s)
Brain/physiology , Memory, Long-Term/physiology , Models, Neurological , Neuronal Plasticity/physiology , Animals , Humans
5.
Neural Comput ; 31(1): 8-67, 2019 Jan.
Article in English | MEDLINE | ID: mdl-30576617

ABSTRACT

Models of associative memory with discrete-strength synapses are palimpsests, learning new memories by forgetting old ones. Memory lifetimes can be defined by the mean first passage time (MFPT) for a perceptron's activation to fall below firing threshold. By imposing the condition that the vector of possible strengths available to a synapse is a left eigenvector of the stochastic matrix governing transitions in strength, we previously derived results for MFPTs and first passage time (FPT) distributions in models with simple, multistate synapses. This condition permits jump moments to be computed via a 1-dimensional Fokker-Planck approach. Here, we study memory lifetimes in the absence of this condition. To do so, we must introduce additional variables, including the perceptron activation, that parameterize synaptic configurations, permitting Markovian dynamics in these variables to be formulated. FPT problems in these variables require solving multidimensional partial differential or integral equations. However, the FPT dynamics can be analytically well approximated by focusing on the slowest eigenmode in this higher-dimensional space. We may also obtain a much better approximation by restricting to the two dominant variables in this space, the restriction making numerical methods tractable. Analytical and numerical methods are in excellent agreement with simulation data, validating our methods. These methods prepare the ground for the study of FPT memory lifetimes with complex rather than simple, multistate synapses.
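
The workhorse MFPT computation is a linear solve against the transient part of a transition matrix. A toy birth-death chain for the perceptron's activation, not the eigenvector-conditioned model of the paper, makes the recipe concrete.

```python
import numpy as np

# Toy birth-death chain for a perceptron's activation h = 1..H, absorbed
# (memory declared forgotten) once h falls to 0, reflecting at the ceiling H.
H, p_up, p_dn = 50, 0.45, 0.55        # net downward drift: memories overwritten
Q = np.zeros((H, H))                  # transitions among transient states
for i in range(H):                    # index i corresponds to h = i + 1
    if i + 1 < H: Q[i, i + 1] = p_up
    else:         Q[i, i]    += p_up  # reflect at the ceiling
    if i > 0:     Q[i, i - 1] = p_dn  # from h = 1, "down" exits the chain

# Mean first passage times solve the linear system (I - Q) tau = 1.
tau = np.linalg.solve(np.eye(H) - Q, np.ones(H))
print(f"MFPT from h=10: {tau[9]:7.1f} storage events")
print(f"MFPT from h=50: {tau[-1]:7.1f} storage events")
```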

6.
Neural Comput ; 29(12): 3219-3259, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28957028

ABSTRACT

Memory models based on synapses with discrete and bounded strengths store new memories by forgetting old ones. Memory lifetimes in such memory systems may be defined in a variety of ways. A mean first passage time (MFPT) definition overcomes much of the arbitrariness and many of the problems associated with the more usual signal-to-noise ratio (SNR) definition. We have previously computed MFPT lifetimes for simple, binary-strength synapses that lack internal, plasticity-related states. In simulation we have also seen that for multistate synapses, optimality conditions based on SNR lifetimes are absent with MFPT lifetimes, suggesting that such conditions may be artifactual. Here we extend our earlier work by computing the entire first passage time (FPT) distribution for simple, multistate synapses, from which all statistics, including the MFPT lifetime, may be extracted. For this, we develop a Fokker-Planck equation using the jump moments for perceptron activation. Two models are considered that satisfy a particular eigenvector condition that this approach requires. In these models, MFPT lifetimes do not exhibit optimality conditions, while in one but not the other, SNR lifetimes do exhibit optimality. Thus, not only are such optimality conditions artifacts of the SNR approach, but they are also strongly model dependent. By examining the variance in the FPT distribution, we may identify regions in which memory storage is subject to high variability, although MFPT lifetimes are nevertheless robustly positive. In such regions, SNR lifetimes are typically (defined to be) zero. FPT-defined memory lifetimes therefore provide an analytically superior approach and also have the virtue of being directly related to a neuron's firing properties.
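
Beyond the mean, the whole FPT distribution follows by propagating survival probabilities through an absorbing chain; a toy chain with illustrative parameters shows how the mean and the variance are then read off.

```python
import numpy as np

# The same absorbing-chain idea yields the whole FPT distribution, not just
# its mean, by propagating the survival probabilities step by step.
H, p_up, p_dn = 30, 0.48, 0.52
Q = np.zeros((H, H))
for i in range(H):
    if i + 1 < H: Q[i, i + 1] = p_up
    else:         Q[i, i]    += p_up
    if i > 0:     Q[i, i - 1] = p_dn          # from the lowest state, absorb

v = np.zeros(H); v[9] = 1.0                   # start ten steps from threshold
fpt = []
while v.sum() > 1e-9:                         # until (almost) surely absorbed
    w = v @ Q
    fpt.append(v.sum() - w.sum())             # probability absorbed this step
    v = w

f = np.array(fpt); t = np.arange(1, len(f) + 1)
mean = (f * t).sum(); sd = np.sqrt((f * t**2).sum() - mean**2)
print(f"MFPT = {mean:.0f} events, s.d. = {sd:.0f}: the lifetime is highly "
      f"variable even though its mean is robustly positive")
```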

7.
Neural Comput ; 29(6): 1468-1527, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28333590

ABSTRACT

Memory models that store new memories by forgetting old ones have memory lifetimes that are rather short and grow only logarithmically in the number of synapses. Attempts to overcome these deficits include "complex" models of synaptic plasticity in which synapses possess internal states governing the expression of synaptic plasticity. Integrate-and-express, filter-based models of synaptic plasticity propose that synapses act as low-pass filters, integrating plasticity induction signals before expressing synaptic plasticity. Such mechanisms enhance memory lifetimes, leading to an initial rise in the memory signal that is in radical contrast to other related, but nonintegrative, memory models. Because of the complexity of models with internal synaptic states, however, their dynamics can be more difficult to extract compared to "simple" models that lack internal states. Here, we show that by focusing only on processes that lead to changes in synaptic strength, we can integrate out internal synaptic states and effectively reduce complex synapses to simple synapses. For binary-strength synapses, these simplified dynamics then allow us to work directly with the transitions in perceptron activation induced by memory storage rather than with the underlying transitions in synaptic configurations. This permits us to write down master and Fokker-Planck equations that may be simplified under certain, well-defined approximations. These methods allow us to see that memory based on synaptic filters can be viewed as an initial transient that leads to memory signal rise, followed by the emergence of Ornstein-Uhlenbeck-like dynamics that return the system to equilibrium. We may use this approach to compute mean first passage time-defined memory lifetimes for complex models of memory storage.
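
The reduction of a complex synapse to a simple one can be checked numerically: under balanced induction, a filter of threshold theta expresses at a mean rate of 1/theta**2 per induction signal, so on long timescales it behaves like a simple synapse rewritten at that rate. A sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(7)
theta, n_steps, n_syn = 8, 20_000, 2000

# Complex synapse: binary strength plus an internal filter counter c.
# Count expression events (threshold crossings) per synapse.
c = np.zeros(n_syn, dtype=int)
events = 0
for _ in range(n_steps):
    c += rng.choice([-1, 1], size=n_syn)
    hit = (c >= theta) | (c <= -theta)
    events += hit.sum()
    c[hit] = 0

print(f"complex synapse : {events / n_syn:.1f} expression events per synapse")
print(f"reduced synapse : {n_steps / theta**2:.1f} (rate 1/theta**2 per signal)")
# On timescales much longer than one filter passage, the complex synapse acts
# like a simple synapse whose strength is rewritten (here to +1 or -1 with
# equal probability) at the mean rate 1/theta**2 per induction signal.
```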


Subject(s)
Memory/physiology , Models, Neurological , Neuronal Plasticity/physiology , Perception/physiology , Synapses/physiology , Animals , Computer Simulation , Humans , Nonlinear Dynamics , Probability , Time Factors
8.
Neural Comput ; 28(11): 2393-2460, 2016 Nov.
Article in English | MEDLINE | ID: mdl-27626970

ABSTRACT

Integrate-and-express models of synaptic plasticity propose that synapses integrate plasticity induction signals before expressing synaptic plasticity. By discerning trends in their induction signals, synapses can control destabilizing fluctuations in synaptic strength. In a feedforward perceptron framework with binary-strength synapses for associative memory storage, we have previously shown that such a filter-based model outperforms other, nonintegrative, "cascade"-type models of memory storage in most regions of biologically relevant parameter space. Here, we consider some natural extensions of our earlier filter model, including one specifically tailored to binary-strength synapses and one that demands a fixed, consecutive number of same-type induction signals rather than merely an excess before expressing synaptic plasticity. With these extensions, we show that filter-based models outperform nonintegrative models in all regions of biologically relevant parameter space except for a small sliver in which all models encode memories only weakly. In this sliver, which model is superior depends on the metric used to gauge memory lifetimes (whether a signal-to-noise ratio or a mean first passage time). After comparing and contrasting these various filter models, we discuss the multiple mechanisms and timescales that underlie both synaptic plasticity and memory phenomena and suggest that multiple, different filtering mechanisms may operate at single synapses.
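
The two filter variants compared here are easy to prototype; the sketch below contrasts the excess-counting filter with the consecutive-run filter on the same balanced induction stream (parameters illustrative).

```python
import numpy as np

rng = np.random.default_rng(8)

def excess_filter(inductions, theta):
    """Express when the running sum of +/-1 signals reaches an excess of
    theta of one type over the other (the original filter); then reset."""
    c, events = 0, 0
    for e in inductions:
        c += e
        if abs(c) >= theta:
            events += 1; c = 0
    return events

def run_filter(inductions, theta):
    """Express only after theta consecutive same-type induction signals."""
    run, last, events = 0, 0, 0
    for e in inductions:
        run = run + 1 if e == last else 1
        last = e
        if run >= theta:
            events += 1; run = 0
    return events

stream = rng.choice([-1, 1], size=100_000)
for theta in (3, 6):
    print(f"theta={theta}: excess filter {excess_filter(stream, theta):6d} "
          f"events, consecutive-run filter {run_filter(stream, theta):6d}")
# On balanced streams the excess filter expresses ~1/theta**2 per signal while
# the run filter's rate falls exponentially, ~1/(2**theta - 1): demanding
# consecutive same-type signals damps noise far harder at large theta.
```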

9.
Neural Comput ; 28(9): 1927-1984, 2016 Sep.
Article in English | MEDLINE | ID: mdl-27391686

ABSTRACT

Integrate-and-express models of synaptic plasticity propose that synapses may act as low-pass filters, integrating synaptic plasticity induction signals in order to discern trends before expressing synaptic plasticity. We have previously shown that synaptic filtering strongly controls destabilizing fluctuations in developmental models. When applied to palimpsest memory systems that learn new memories by forgetting old ones, we have also shown that with binary-strength synapses, integrative synapses lead to an initial memory signal rise before its fall back to equilibrium. Such an initial rise is in dramatic contrast to nonintegrative synapses, in which the memory signal falls monotonically. We now extend our earlier analysis of palimpsest memories with synaptic filters to consider the more general case of discrete state, multilevel synapses. We derive exact results for the memory signal dynamics and then consider various simplifying approximations. We show that multilevel synapses enhance the initial rise in the memory signal and then delay its subsequent fall by inducing a plateau-like region in the memory signal. Such dynamics significantly increase memory lifetimes, defined by a signal-to-noise ratio (SNR). We derive expressions for optimal choices of synaptic parameters (filter size, number of strength states, number of synapses) that maximize SNR memory lifetimes. However, we find that with memory lifetimes defined via mean-first-passage times, such optimality conditions do not exist, suggesting that optimality may be an artifact of SNRs.


Subject(s)
Learning , Models, Neurological , Neuronal Plasticity , Humans , Memory , Synapses
10.
Neural Comput ; 26(9): 1873-1923, 2014 Sep.
Article in English | MEDLINE | ID: mdl-24877738

ABSTRACT

We study memory lifetimes in a perceptron-based framework with binary synapses, using the mean first passage time for the perceptron's total input to fall below firing threshold to define memory lifetimes. Working with the simplest memory-related model of synaptic plasticity, we may obtain exact results for memory lifetimes or, working in the continuum limit, good analytical approximations that afford either much qualitative insight or extremely good quantitative agreement. In one particular limit, we find that memory dynamics reduce to the well-understood Ornstein-Uhlenbeck process. We show that asymptotically, the lifetimes of memories grow logarithmically in the number of synapses when the perceptron's firing threshold is zero, reproducing standard results from signal-to-noise ratio analyses. However, this is only an asymptotically valid result, and we show that extending its application outside the range of its validity leads to a massive overestimate of the minimum number of synapses required for successful memory encoding. In the case that the perceptron's firing threshold is positive, we find the remarkable result that memory lifetimes are strictly bounded from above. Asymptotically, the dependence of memory lifetimes on the number of synapses drops out entirely, and this asymptotic result provides a strict upper bound on memory lifetimes away from this asymptotic regime. The classic logarithmic growth of memory lifetimes in the simplest, palimpsest memories is therefore untypical and nongeneric: memory lifetimes are typically strictly bounded from above.
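
Both headline results, logarithmic growth of lifetimes in N at zero threshold and an N-independent bound at positive threshold, can be seen in a small simulation of the simplest palimpsest; the per-synapse signal normalisation and the rewrite probability p are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(10)

def lifetime(N, threshold, p=0.02, n_trials=100, t_max=3000):
    """Store a memory in N binary (+/-1) synapses, then keep storing: each
    later memory rewrites each synapse with probability p to a random sign.
    Lifetime = first time the tracked memory's per-synapse signal
    h(t) = (1/N) sum_i w_i(t) xi_i falls below the firing threshold."""
    times = []
    for _ in range(n_trials):
        w = np.ones(N)               # w_i * xi_i = +1 right after storage
        for t in range(1, t_max + 1):
            upd = rng.random(N) < p
            w[upd] = rng.choice([-1.0, 1.0], size=upd.sum())
            if w.mean() < threshold:
                times.append(t)
                break
    return np.mean(times)

for N in (100, 1000, 10000):
    print(f"N={N:6d}: lifetime(theta=0) ~ {lifetime(N, 0.0):6.1f}, "
          f"lifetime(theta=0.05) ~ {lifetime(N, 0.05):6.1f} memories")
# With theta = 0 the lifetime grows like ln(N)/(2p): logarithmic in N. With
# theta > 0 the mean signal exp(-p*t) hits theta at t = ln(1/theta)/p,
# independently of N: lifetimes are bounded above, however many synapses.
```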


Subject(s)
Memory/physiology , Neural Networks, Computer , Neuronal Plasticity/physiology , Action Potentials , Algorithms , Binomial Distribution , Computer Simulation , Normal Distribution , Probability , Stochastic Processes , Synapses/physiology , Time
11.
Neural Comput ; 26(9): 1924-1972, 2014 Sep.
Article in English | MEDLINE | ID: mdl-24922502

ABSTRACT

A recent model of intrinsic plasticity coupled to Hebbian synaptic plasticity proposes that adaptation of a neuron's threshold and gain in a sigmoidal response function to achieve a sparse, exponential output firing rate distribution facilitates the discovery of heavy-tailed or supergaussian sources in the neuron's inputs. We show that the exponential output distribution is irrelevant to these dynamics and that, furthermore, while sparseness is sufficient, it is not necessary. The intrinsic plasticity mechanism drives the neuron's threshold large and positive, and we prove that in such a regime, the neuron will find supergaussian sources; equally, however, if the threshold is large and negative (an antisparse regime), it will also find supergaussian sources. Away from such extremes, the neuron can also discover subgaussian sources. By examining a neuron with a fixed sigmoidal nonlinearity and considering the synaptic strength fixed-point structure in the two-dimensional parameter space defined by the neuron's threshold and gain, we show that this space is carved up into sub- and supergaussian-input-finding regimes, possibly with regimes of simultaneous stability of sub- and supergaussian sources or regimes of instability of all sources; a single gaussian source may also be stabilized by the presence of a nongaussian source. A neuron's operating point (essentially its threshold and gain coupled with its input statistics) therefore critically determines its computational repertoire. Intrinsic plasticity mechanisms induce trajectories in this parameter space but do not fundamentally modify it. Unless the trajectories cross critical boundaries in this space, intrinsic plasticity is irrelevant and the neuron's nonlinearity may be frozen with identical receptive field refinement dynamics.


Subject(s)
Action Potentials/physiology , Adaptation, Physiological/physiology , Models, Neurological , Neuronal Plasticity/physiology , Neurons/physiology , Algorithms , Nonlinear Dynamics , Normal Distribution , Synapses/physiology
12.
Neural Comput ; 24(10): 2604-2654, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22734492

ABSTRACT

Plasticity-inducing stimuli must typically be presented many times before synaptic plasticity is expressed, perhaps because induction signals gradually accumulate before overt strength changes occur. We consider memory dynamics in a mathematical model with synapses that integrate plasticity induction signals before expressing plasticity. We find that the memory trace initially rises before reaching a maximum and then falling. The memory signal dissociates into separate oblivescence and reminiscence components, with reminiscence initially dominating recall. In radical contrast, related but nonintegrative models exhibit only a highly problematic oblivescence. Synaptic integration mechanisms possess natural timescales, depending on the statistics of the induction signals. Together with neuromodulation, these timescales may therefore also begin to provide a natural account of the well-known spacing effect in the transition to late-phase plasticity. Finally, we propose experiments that could distinguish between integrative and nonintegrative synapses. Such experiments should further elucidate the synaptic signal processing mechanisms postulated by our model.
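
The rise-then-fall of the memory trace emerges from even a crude simulation of integrative synapses: storing the tracked memory primes each filter without expressing it, and subsequent (noise) storage events convert that priming into expressed strength changes. All parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(12)
N, theta, T = 20_000, 5, 400

w = rng.choice([-1, 1], size=N)        # binary strengths at equilibrium
c = np.zeros(N, dtype=int)             # filter counters
xi = rng.choice([-1, 1], size=N)       # the tracked memory pattern

signal = []
for t in range(T):
    e = xi if t == 0 else rng.choice([-1, 1], size=N)  # later memories: noise
    c += e
    up, dn = c >= theta, c <= -theta
    w[up], w[dn] = 1, -1
    c[up | dn] = 0
    signal.append((w * xi).mean())

s = np.array(signal)
print(f"signal just after storage: {s[0]:+.3f}, "
      f"peak {s.max():+.3f} at t={s.argmax()}, final {s[-1]:+.3f}")
# Storage merely *primes* each filter one step toward the memory's sign; later
# noise storage pushes primed filters over threshold, so recall first rises
# (reminiscence) before relaxing back to equilibrium (oblivescence), unlike
# nonintegrative models, whose signal only decays.
```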


Subject(s)
Memory/physiology , Models, Neurological , Neuronal Plasticity , Synapses/physiology , Animals , Humans , Probability , Time Factors
13.
Neural Comput ; 24(2): 455-522, 2012 Feb.
Article in English | MEDLINE | ID: mdl-22023195

ABSTRACT

Linear models of synaptic plasticity provide a useful starting-point for examining the dynamics of neuronal development and learning, but their inherent problems are well known. Models of synaptic plasticity that embrace the demands of biological realism are therefore typically nonlinear. Viewed from a more abstract perspective, nonlinear models of synaptic plasticity are a subset of nonlinear dynamical systems. As such, they may therefore exhibit bifurcations under the variation of control parameters, including noise and errors in synaptic updates. One source of noise or error is the cross-talk that occurs during otherwise Hebbian plasticity. Under cross-talk, stimulation of a set of synapses can induce or modify plasticity in adjacent, unstimulated synapses. Here, we analyze two nonlinear models of developmental synaptic plasticity and a model of independent component analysis in the presence of a simple model of cross-talk. We show that cross-talk does indeed induce bifurcations in these models, entirely destroying their ability to acquire either developmentally or learning-related patterns of fixed points. Importantly, the critical level of cross-talk required to induce bifurcations in these models is very sensitive to the statistics of the afferents' activities and the number of afferents synapsing on a postsynaptic cell. In particular, the critical level can be made arbitrarily small. Because bifurcations are inevitable in nonlinear models, our results likely apply to many nonlinear models of synaptic plasticity, although the precise details vary by model. Hence, many nonlinear models of synaptic plasticity are potentially fatally compromised by the toxic influence of cross-talk and other sources of noise and errors more generally. We conclude by arguing that biologically realistic models of synaptic plasticity must be robust against noise-induced bifurcations and that biological systems may have evolved strategies to circumvent their possible dangers.
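
A standard toy form of the cross-talk error matrix makes the fixed-point deformation visible in the linear (Hebbian/PCA) case; the uniform-leakage matrix E below is an assumption, and the nonlinear models analysed in the paper bifurcate rather than rotate smoothly.

```python
import numpy as np

rng = np.random.default_rng(13)
n = 10

# Random input covariance with a well-defined principal component.
A = rng.normal(size=(n, n))
C = A @ A.T / n
v = np.linalg.eigh(C)[1][:, -1]          # true principal eigenvector

for eps in (0.0, 0.1, 0.3, 0.6, 0.8):
    # Cross-talk error matrix: a fraction eps of each intended synaptic update
    # leaks uniformly onto the other n-1 synapses (an assumed toy form).
    E = (1 - eps) * np.eye(n) + (eps / (n - 1)) * (np.ones((n, n)) - np.eye(n))
    # Under a linear Hebbian rule the fixed-point structure is governed by
    # E @ C rather than C; its leading eigenvector is the developmental outcome.
    vals, vecs = np.linalg.eig(E @ C)
    u = np.real(vecs[:, np.argmax(np.real(vals))])
    print(f"eps={eps:.1f}: |cosine with cross-talk-free outcome| = {abs(u @ v):.3f}")
# The fixed point rotates away from the cross-talk-free outcome as eps grows;
# in the nonlinear rules of the paper this deformation appears abruptly, as a
# bifurcation, at a critical eps that can be very small.
```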


Subject(s)
Learning/physiology , Neuronal Plasticity/physiology , Nonlinear Dynamics , Synapses/physiology , Models, Neurological , Neurons/physiology , Stochastic Processes
14.
Neural Comput ; 23(1): 124-159, 2011 Jan.
Article in English | MEDLINE | ID: mdl-20964546

ABSTRACT

Stochastic models of synaptic plasticity propose that single synapses perform a directed random walk of fixed step sizes in synaptic strength, thereby embracing the view that the mechanisms of synaptic plasticity constitute a stochastic dynamical system. However, fluctuations in synaptic strength present a formidable challenge to such an approach. We have previously proposed that single synapses must interpose an integration and filtering mechanism between the induction of synaptic plasticity and the expression of synaptic plasticity in order to control fluctuations. We analyze a class of three such mechanisms in the presence of possibly non-Markovian plasticity induction processes, deriving expressions for the mean expression time in these models. One of these filtering mechanisms constitutes a discrete low-pass filter that could be implemented on a small collection of molecules at single synapses, such as CaMKII, and we analyze this discrete filter in some detail. After considering Markov induction processes, we examine our own stochastic model of spike-timing-dependent plasticity, for which the probability density functions of the induction of plasticity steps have previously been derived. We determine the dependence of the mean time to express a plasticity step on pre- and postsynaptic firing rates in this model, and we also consider, numerically, the long-term stability against fluctuations of patterns of neuronal connectivity that typically emerge during neuronal development.
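
The mean expression time of the discrete low-pass filter follows from an absorbing-chain linear solve, here wrapped to expose its dependence on the potentiation and depression induction rates. The mapping from pre- and postsynaptic firing rates to these Poisson induction rates is model-specific and not reproduced; the rates are passed in directly.

```python
import numpy as np

def mean_expression_time(rate_pot, rate_dep, theta):
    """Mean time for a discrete low-pass filter (counter reset at 0,
    thresholds +/-theta) to express a plasticity step, when potentiation
    and depression induction signals arrive as Poisson streams."""
    q = rate_pot / (rate_pot + rate_dep)     # P(next induction is +1)
    states = np.arange(-theta + 1, theta)
    n = len(states)
    Q = np.zeros((n, n))
    for i, c in enumerate(states):
        if c + 1 < theta:  Q[i, i + 1] = q
        if c - 1 > -theta: Q[i, i - 1] = 1 - q
    steps = np.linalg.solve(np.eye(n) - Q, np.ones(n))[theta - 1]  # from c=0
    return steps / (rate_pot + rate_dep)     # convert events to seconds

for r in (1.0, 5.0, 20.0):                   # depression rate fixed at 5 Hz
    print(f"potentiation rate {r:4.1f} Hz: mean expression time "
          f"{mean_expression_time(r, 5.0, theta=6):.2f} s")
# Balanced induction (5 vs 5 Hz) maximises the expression time (theta**2
# events); any imbalance adds drift and expresses a step sooner.
```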


Subject(s)
Brain/physiology , Neural Networks, Computer , Neuronal Plasticity/physiology , Neurons/physiology , Synaptic Transmission/physiology , Action Potentials/physiology , Algorithms , Animals , Calcium-Calmodulin-Dependent Protein Kinase Type 2/physiology , Humans , Markov Chains , Models, Neurological , Neural Pathways/physiology , Stochastic Processes , Synaptic Membranes/physiology
15.
Neural Comput ; 23(3): 674-734, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21162665

ABSTRACT

In stochastic models of synaptic plasticity based on a random walk, the control of fluctuations is imperative. We have argued that synapses could act as low-pass filters, filtering plasticity induction steps before expressing a step change in synaptic strength. Earlier work showed, in simulation, that such a synaptic filter tames fluctuations very well, leading to patterns of synaptic connectivity that are stable for long periods of time. Here, we approach this problem analytically. We explicitly calculate the lifetime of meta-stable states of synaptic connectivity using a Fokker-Planck formalism in order to understand the dependence of this lifetime on both the plasticity step size and the filtering mechanism. We find that our analytical results agree very well with simulation results, despite having to make two approximations. Our analysis reveals, however, a deeper significance to the filtering mechanism and the plasticity step size. We show that a filter scales the step size into a smaller, effective step size. This scaling suggests that the step size may itself play the role of a temperature parameter, so that a filter cools the dynamics, thereby reducing the influence of fluctuations. Using the master equation, we explicitly demonstrate a bifurcation at a critical step size, confirming this interpretation. At this critical point, spontaneous symmetry breaking occurs in the class of stochastic models of synaptic plasticity that we consider.
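
The step-size-scaling claim can be verified directly in simulation: with a filter of threshold theta, the diffusion of synaptic strength drops by a factor of about theta**2, as if the step size T were replaced by an effective T/theta. A sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(15)

def strength_variance(theta, step=1.0, n_steps=20_000, n_syn=2000):
    """Ensemble variance of synaptic strength after n_steps balanced +/-1
    induction signals, with a filter of threshold theta gating expression."""
    s = np.zeros(n_syn)
    c = np.zeros(n_syn, dtype=int)
    for _ in range(n_steps):
        c += rng.choice([-1, 1], size=n_syn)
        up, dn = c >= theta, c <= -theta
        s[up] += step
        s[dn] -= step
        c[up | dn] = 0
    return s.var()

v1 = strength_variance(theta=1)       # unfiltered reference
for theta in (4, 8):
    ratio = v1 / strength_variance(theta)
    print(f"theta={theta}: variance ratio = {ratio:5.1f}  (theta**2 = {theta**2})")
# The filter divides strength diffusion by ~theta**2, i.e. it rescales the
# plasticity step size T to an effective T/theta: if T plays the role of a
# temperature, filtering cools the dynamics.
```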

16.
Neural Comput ; 22(1): 244-272, 2010 Jan.
Article in English | MEDLINE | ID: mdl-19764870

ABSTRACT

A stochastic model of spike-timing-dependent plasticity (STDP) postulates that single synapses presented with a single spike pair exhibit all-or-none quantal jumps in synaptic strength. The amplitudes of the jumps are independent of spike timing, but their probabilities do depend on spike timing. By making the amplitudes of both upward and downward transitions equal, synapses then occupy only a discrete set of states of synaptic strength. We explore the impact of a finite, discrete set of strength states on our model, finding three principal results. First, a finite set of strength states limits the capacity of a single synapse to express the standard, exponential STDP curve. We derive the expression for the expected change in synaptic strength in response to a standard, experimental spike pair protocol, finding a deviation from exponential behavior. We fit our prediction to recent data from single dendritic spine heads, finding results that are somewhat better than exponential fits. Second, we show that the fixed-point dynamics of our model regulate the upward and downward transition probabilities so that these are on average equal, leading to a uniform distribution of synaptic strength states. However, third, under long-term potentiation (LTP) and long-term depression (LTD) protocols, these probabilities are unequal, skewing the distribution away from uniformity. If the number of states of strength is at least of order 10, then we find that three effective states of synaptic strength appear, consistent with some experimental data on ternary-strength synapses. On this view, LTP and LTD protocols may therefore be saturating protocols.
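
A protocol-level simulation shows how a small number of strength states clips the expected STDP curve; the exponential form of the jump probabilities and all parameter values are assumptions of this sketch, not fits from the paper.

```python
import numpy as np

rng = np.random.default_rng(16)

def protocol_change(dt, n_states, n_pairs=60, a=0.6, tau=20.0, n_trials=4000):
    """Mean strength change (in levels) after n_pairs pre-before-post pairings.
    Assumed toy form: each pairing causes a fixed-amplitude UP jump with
    probability a*exp(-dt/tau); strength is confined to n_states levels.
    (Post-before-pre pairings would mirror this with DOWN jumps.)"""
    p = a * np.exp(-dt / tau)
    s0 = (n_states - 1) // 2                 # start mid-range
    s = np.full(n_trials, s0)
    for _ in range(n_pairs):
        s = np.minimum(s + (rng.random(n_trials) < p), n_states - 1)
    return s.mean() - s0

for dt in (5, 20, 40, 80):
    pred = 60 * 0.6 * np.exp(-dt / 20.0)
    print(f"dt=+{dt:2d} ms: unbounded exponential {pred:5.1f} levels, "
          f"201 states {protocol_change(dt, 201):5.1f}, "
          f"5 states {protocol_change(dt, 5):4.2f}")
# A small number of strength states clips the response: the measured curve
# flattens and deviates from the exponential timing dependence, so the
# pairing protocol saturates the synapse.
```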


Subject(s)
Action Potentials/physiology , Brain/physiology , Nerve Net/physiology , Neuronal Plasticity/physiology , Neurons/physiology , Synaptic Transmission/physiology , Computer Simulation , Dendritic Spines/physiology , Long-Term Potentiation/physiology , Long-Term Synaptic Depression/physiology , Mathematical Concepts , Neural Networks, Computer , Stochastic Processes , Synapses/physiology , Time Factors
17.
Neural Comput ; 22(5): 1180-1230, 2010 May.
Article in English | MEDLINE | ID: mdl-20028229

ABSTRACT

A stochastic model of spike-timing-dependent plasticity (STDP) proposes that spike timing influences the probability but not the amplitude of synaptic strength change at single synapses. The classic, biphasic STDP profile emerges as a spatial average over many synapses presented with a single spike pair or as a temporal average over a single synapse presented with many spike pairs. We have previously shown that the model accounts for a variety of experimental data, including spike triplet results, and has a number of desirable theoretical properties, including being entirely self-stabilizing in all regions of parameter space. Our earlier analyses of the model have employed cumbersome spike-to-spike averaging arguments to derive results. Here, we show that the model can be reformulated as a non-Markovian random walk in synaptic strength, the step sizes being fixed as postulated. This change of perspective greatly simplifies earlier calculations by integrating out the proposed switch mechanism by which changes in strength are driven and instead concentrating on the changes in strength themselves. Moreover, this change of viewpoint is generative, facilitating further calculations that would be intractable, if not impossible, with earlier approaches. We prepare the machinery here for these later calculations but also briefly indicate how this machinery may be used by considering two particular applications.


Subject(s)
Action Potentials/physiology , Models, Neurological , Neuronal Plasticity/physiology , Stochastic Processes , Synapses/physiology , Algorithms , Animals , Neurons/physiology , Poisson Distribution , Probability , Time Factors
18.
Neural Comput ; 21(12): 3363-3407, 2009 Dec.
Article in English | MEDLINE | ID: mdl-19635017

ABSTRACT

A stochastic model of spike-timing-dependent plasticity proposes that single synapses express fixed-amplitude jumps in strength, the amplitudes being independent of the spike time difference. However, the probability that a jump in strength occurs does depend on spike timing. Although the model has a number of desirable features, the stochasticity of response of a synapse introduces potentially large fluctuations into changes in synaptic strength. These can destabilize the segregated patterns of afferent connectivity characteristic of neuronal development. Previously we have taken these jumps to be small relative to overall synaptic strengths to control fluctuations, but doing so increases developmental timescales unacceptably. Here, we explore three alternative ways of taming fluctuations. First, a calculation of the variance for the change in synaptic strength shows that the mean change eventually dominates fluctuations, but on timescales that are too long. Second, it is possible that fluctuations in strength may cancel between synapses, but we show that correlations between synapses emasculate the law of large numbers. Finally, by separating plasticity induction and expression, we introduce a temporal window during which induction signals are low-pass-filtered before expression. In this way, fluctuations in strength are tamed, stabilizing segregated states of afferent connectivity.


Subject(s)
Action Potentials/physiology , Models, Neurological , Neurons/physiology , Nonlinear Dynamics , Stochastic Processes , Animals , Neural Networks, Computer , Synapses/physiology , Synaptic Transmission/physiology , Time Factors
19.
Network ; 20(1): 1-31, 2009.
Article in English | MEDLINE | ID: mdl-19229731

ABSTRACT

Adaptation is a ubiquitous property of sensory neurons. Multisensory neurons, receiving convergent input from different sensory modalities, also likely exhibit adaptation. The responses of multisensory superior colliculus neurons have been extensively studied, but the impact of adaptation on these responses has not been examined. Multisensory neurons in the superior colliculus exhibit cross-modal enhancement, an often non-linear and non-additive increase in response when a stimulus in one modality is paired with a stimulus in a different modality. We examine the possible impact of adaptation on cross-modal enhancement within the framework of a simple model of adaptation for a neuron employing a saturating, logistic response function. We consider how adaptation to an input's mean and standard deviation affects cross-modal enhancement, and also how the statistical correlations between two different modalities influence cross-modal enhancement. We determine the optimal bimodal stimuli to present to a bimodal neuron to evoke the largest changes in cross-modal enhancement under adaptation to input statistics. The model requires separate gains for each modality, unless the statistics specific to each modality have been standardised by prior adaptation in earlier, unisensory neurons. The model also predicts that increasing the correlation coefficient between two modalities reduces a multisensory neuron's overall gain.
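
The core enhancement computation is a one-liner once a response function is fixed; the logistic parameters below are illustrative, and the enhancement index is the standard percentage measure.

```python
import numpy as np

def rate(x, theta=1.0, gain=0.25, r_max=100.0):
    """Saturating logistic response function (illustrative parameters)."""
    return r_max / (1.0 + np.exp(-(x - theta) / gain))

for v, a in [(0.4, 0.4), (1.0, 1.0), (2.0, 2.0)]:   # visual & auditory drives
    uni = max(rate(v), rate(a))       # best unimodal response
    bi = rate(v + a)                  # response to convergent bimodal input
    cme = 100.0 * (bi - uni) / uni    # cross-modal enhancement index (%)
    print(f"inputs ({v}, {a}): enhancement = {cme:+7.1f}%")
# Weak, near-threshold stimuli combine superadditively (large enhancement);
# strong stimuli drive the response into saturation and enhancement collapses.
# Adapting theta and gain to the input statistics therefore reshapes
# enhancement systematically.
```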


Subject(s)
Adaptation, Physiological/physiology , Models, Neurological , Sensation , Sensory Receptor Cells/physiology , Animals , Neural Networks, Computer , Superior Colliculi/cytology
20.
Network ; 19(3): 213-235, 2008.
Article in English | MEDLINE | ID: mdl-18946837

ABSTRACT

Sensory neurons adapt to changes in the natural statistics of their environments through processes such as gain control and firing threshold adjustment. It has been argued that neurons early in sensory pathways adapt according to information-theoretic criteria, perhaps maximising their coding efficiency or information rate. Here, we draw a distinction between how a neuron's preferred operating point is determined and how its preferred operating point is maintained through adaptation. We propose that a neuron's preferred operating point can be characterised by the probability density function (PDF) of its output spike rate, and that adaptation maintains an invariant output PDF, regardless of how this output PDF is initially set. Considering a sigmoidal transfer function for simplicity, we derive simple adaptation rules for a neuron with one sensory input that permit adaptation to the lower-order statistics of the input, independent of how the preferred operating point of the neuron is set. Thus, if the preferred operating point is, in fact, set according to information-theoretic criteria, then these rules nonetheless maintain a neuron at that point. Our approach generalises from the unimodal case to the multimodal case, for a neuron with inputs from distinct sensory channels, and we briefly consider this case too.
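
The proposed separation, an operating point set once and then maintained by adaptation, can be sketched with toy online rules; the specific mean- and spread-tracking updates below are assumed forms, not the paper's derived rules.

```python
import numpy as np

rng = np.random.default_rng(20)

def adapt(x_stream, eta=0.01):
    """Toy online rules (assumed forms): the threshold theta tracks the input
    mean and the gain parameter beta tracks the input's mean absolute
    deviation, so the sigmoid y = 1/(1 + exp(-(x - theta)/beta)) sees
    standardised input and its output PDF is invariant to shifts and
    rescalings of the input."""
    theta, beta = 0.0, 1.0
    ys = []
    for x in x_stream:
        y = 1.0 / (1.0 + np.exp(-(x - theta) / beta))
        ys.append(y)
        theta += eta * (x - theta)              # track the mean
        beta += eta * (abs(x - theta) - beta)   # track the spread
    return np.array(ys)

# Same source statistics, two different operating environments.
x1 = rng.normal(0.0, 1.0, 50_000)
x2 = rng.normal(7.0, 3.0, 50_000)          # shifted and rescaled input
y1 = adapt(x1)[-20_000:]                   # discard adaptation transients
y2 = adapt(x2)[-20_000:]
print("output quartiles, env 1:", np.percentile(y1, [25, 50, 75]).round(3))
print("output quartiles, env 2:", np.percentile(y2, [25, 50, 75]).round(3))
# After adaptation the two output distributions coincide: the preferred
# operating point is maintained regardless of the input's lower-order stats.
```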


Subject(s)
Action Potentials/physiology , Models, Neurological , Sensory Receptor Cells/physiology , Sensory Thresholds/physiology , Animals , Computer Simulation , Humans , Models, Statistical