1.
Elife ; 9, 2020 07 30.
Article in English | MEDLINE | ID: mdl-32729828

ABSTRACT

Signalling pathways leading to post-synaptic plasticity have been examined in many types of experimental studies, but a unified picture of how multiple biochemical pathways collectively shape neocortical plasticity is missing. We built a biochemically detailed model of post-synaptic plasticity describing CaMKII, PKA, and PKC pathways and their contribution to synaptic potentiation or depression. We developed a statistical AMPA-receptor-tetramer model, which permits the estimation of the AMPA-receptor-mediated maximal synaptic conductance based on the numbers of GluR1s and GluR2s predicted by the biochemical signalling model. We show that our model reproduces neuromodulator-gated spike-timing-dependent plasticity as observed in the visual cortex and can be fit to data from many cortical areas, uncovering the biochemical contributions of the pathways pinpointed by the underlying experimental studies. Our model explains the dependence of different forms of plasticity on the availability of different proteins and can be used for the study of mental disorder-associated impairments of cortical plasticity.
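As a rough illustration of the kind of statistical tetramer calculation the abstract describes, the sketch below estimates a maximal synaptic conductance from GluR1/GluR2 counts by assuming unbiased random tetramer assembly and an illustrative single-channel conductance for each subunit composition; the subunit counts, conductance values, and assembly assumption are placeholders, not the published model's parameters.

```python
# Hypothetical sketch: estimate maximal AMPA conductance from GluR1/GluR2 counts by
# assuming receptors assemble as tetramers drawn at random from the subunit pool, with
# single-channel conductance increasing with GluR1 content. All values are illustrative.
from math import comb

def tetramer_composition_probs(n_glur1, n_glur2):
    """Probability that a tetramer contains k GluR1 subunits (k = 0..4),
    assuming subunits are drawn without bias from the pooled population."""
    p1 = n_glur1 / (n_glur1 + n_glur2)
    return [comb(4, k) * p1**k * (1 - p1)**(4 - k) for k in range(5)]

def max_synaptic_conductance(n_glur1, n_glur2,
                             g_single=(4e-12, 8e-12, 12e-12, 16e-12, 20e-12)):
    """Expected maximal conductance (S): number of assembled tetramers times the
    expected single-channel conductance, indexed by GluR1 content."""
    n_tetramers = (n_glur1 + n_glur2) // 4
    probs = tetramer_composition_probs(n_glur1, n_glur2)
    return n_tetramers * sum(p * g for p, g in zip(probs, g_single))

print(max_synaptic_conductance(n_glur1=60, n_glur2=40))
```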


Subject(s)
Neuronal Plasticity , Signal Transduction , Visual Cortex/physiology , Animals , Computational Biology , Mice , Models, Neurological , Rats , Receptors, AMPA/metabolism
2.
J Neurophysiol ; 120(5): 2532-2541, 2018 11 01.
Article in English | MEDLINE | ID: mdl-29975165

ABSTRACT

Transcranial magnetic stimulation (TMS) is a technique that enables noninvasive manipulation of neural activity and holds promise in both clinical and basic research settings. The effect of TMS on the motor cortex is often measured by electromyography (EMG) recordings from a small hand muscle. However, the details of how TMS generates responses measured with EMG are not completely understood. We aim to develop a biophysically detailed computational model to study the potential mechanisms underlying the generation of EMG signals following TMS. Our model comprises a feed-forward network of cortical layer 2/3 cells, which drive morphologically detailed layer 5 corticomotoneuronal cells, which in turn project to a pool of motoneurons. EMG signals are modeled as the sum of motor unit action potentials. EMG recordings from the first dorsal interosseous muscle were performed in four subjects and compared with simulated EMG signals. Our model successfully reproduces several characteristics of the experimental data. The simulated EMG signals match experimental EMG recordings in shape and size, and change with stimulus intensity and contraction level as in experimental recordings. They exhibit cortical silent periods that are close to the biological values and reveal an interesting dependence on inhibitory synaptic transmission properties. Our model predicts several characteristics of the firing patterns of neurons along the entire pathway from cortical layer 2/3 cells down to spinal motoneurons and should be considered as a viable tool for explaining and analyzing EMG signals following TMS. NEW & NOTEWORTHY A biophysically detailed model of EMG signal generation following transcranial magnetic stimulation (TMS) is proposed. Simulated EMG signals match experimental EMG recordings in shape and amplitude. Motor-evoked potential and cortical silent period properties match experimental data. The model is a viable tool to analyze, explain, and predict EMG signals following TMS.
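The sketch below illustrates only the final step named in the abstract, modelling the EMG as a sum of motor unit action potentials; the MUAP waveform, firing statistics, and motor unit pool size are synthetic stand-ins rather than the fitted model.

```python
# Minimal sketch of "EMG = sum of motor unit action potentials". Waveforms and firing
# statistics are illustrative, not the published model's values.
import numpy as np

def muap(t, t_spike, amp=1.0, tau=0.002):
    """A simple biphasic motor unit action potential waveform (arbitrary units)."""
    dt = t - t_spike
    w = np.zeros_like(t)
    mask = dt >= 0
    w[mask] = amp * (dt[mask] / tau) * np.exp(1 - dt[mask] / tau) \
              * np.sin(2 * np.pi * dt[mask] / (4 * tau))
    return w

fs = 10_000.0                       # sampling rate (Hz)
t = np.arange(0, 0.5, 1 / fs)       # 500 ms window
rng = np.random.default_rng(0)

emg = np.zeros_like(t)
for unit in range(50):              # each motor unit fires quasi-regularly
    rate = rng.uniform(8, 15)                           # firing rate (Hz)
    spikes = np.cumsum(rng.normal(1 / rate, 0.01, 10))  # jittered spike times
    amp = rng.lognormal(mean=0.0, sigma=0.5)            # amplitude varies across the pool
    for ts in spikes:
        emg += muap(t, ts, amp=amp)

print("simulated EMG peak-to-peak:", emg.max() - emg.min())
```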


Subject(s)
Evoked Potentials, Motor , Models, Neurological , Muscle, Skeletal/physiology , Adult , Computer Simulation , Electromyography , Female , Humans , Male , Motor Cortex/cytology , Motor Cortex/physiology , Motor Neurons/physiology , Muscle Contraction , Muscle, Skeletal/innervation , Transcranial Magnetic Stimulation
3.
Int J Neural Syst ; 28(7): 1850004, 2018 Sep.
Article in English | MEDLINE | ID: mdl-29631506

ABSTRACT

Existing computational models of the retina often compromise between biophysical accuracy and a hardware-adaptable methodology of implementation. When compared to the current modes of vision restoration, algorithmic models often capture a greater correlation between stimuli and the affected neural network, but lack physical hardware practicality. Thus, if the present processing methods are adapted to complement very-large-scale circuit design techniques, it is anticipated that this will engender a more feasible approach to the physical construction of the artificial retina. The computational model presented in this research serves to provide a fast and accurate predictive model of the retina, a deeper understanding of neural responses to visual stimulation, and an architecture that can realistically be transformed into a hardware device. Traditionally, implicit (or semi-implicit) ordinary differential equation (ODE) solvers have been used for optimal speed and accuracy. We present a novel approach that requires the effective integration of different dynamical time scales within a unified framework of neural responses, where the rod, cone, amacrine, bipolar, and ganglion cells correspond to the implemented pathways. Furthermore, we show that adopting numerical integration can both accelerate retinal pathway simulations by more than 50% when compared with traditional ODE solvers in some cases, and prove to be a more realizable solution for the hardware implementation of predictive retinal models.
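As a generic illustration of why implicit solvers are traditionally preferred for stiff membrane equations, and of the trade-off any alternative integration scheme must address, the toy below compares a forward (explicit) Euler step with a backward (semi-implicit) Euler step on a simple leak equation; it is not the retinal model or the paper's integration method.

```python
# Illustrative only: forward vs backward Euler on dV/dt = -(V - E)/tau.
# Forward Euler is unstable when dt > 2*tau; backward Euler is unconditionally stable.
def forward_euler(v, dt, tau=1e-3, e_rest=-65e-3):
    return v + dt * (-(v - e_rest) / tau)

def backward_euler(v, dt, tau=1e-3, e_rest=-65e-3):
    # Solve v_new = v + dt * (-(v_new - e_rest)/tau) analytically for v_new.
    return (v + dt * e_rest / tau) / (1 + dt / tau)

v_exp, v_imp, dt = 0.0, 0.0, 3e-3      # dt deliberately larger than 2*tau
for _ in range(20):
    v_exp = forward_euler(v_exp, dt)
    v_imp = backward_euler(v_imp, dt)
print(f"explicit Euler: {v_exp:.3e} (blows up)   backward Euler: {v_imp:.3e} (stable)")
```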


Subject(s)
Models, Neurological , Retina/physiology , Action Potentials , Algorithms , Animals , Computer Simulation , Nonlinear Dynamics , Time Factors , Vertebrates , Vision, Ocular/physiology , Visual Pathways/physiology
4.
Front Comput Neurosci ; 11: 42, 2017.
Article in English | MEDLINE | ID: mdl-28649195

ABSTRACT

The ability of cortical neurons to adapt their input/output characteristics and information processing capabilities ultimately relies on the interplay between synaptic plasticity, synapse location, and the nonlinear properties of the dendrite. Collectively, these shape both the strengths and spatial arrangements of convergent afferent inputs to neuronal dendrites. Recent experimental and theoretical studies support a clustered plasticity model, a view that synaptic plasticity promotes the formation of clusters or hotspots of synapses sharing similar properties. We have previously shown that spike timing-dependent plasticity (STDP) can lead to synaptic efficacies being arranged into spatially segregated clusters. This effectively partitions the dendritic tree into a tessellated imprint which we have called a dendritic mosaic. Here, using a biophysically detailed neuron model of a reconstructed layer 2/3 pyramidal cell and STDP learning, we investigated the impact of altered STDP balance on forming such a spatial organization. We show that cluster formation and extent depend on several factors, including the balance between potentiation and depression, the afferents' mean firing rate and, crucially, the dendritic morphology. We find that STDP balance plays an important role in this emergent mode of spatial organization, since any imbalance leads to severe degradation, and in some cases even destruction, of the mosaic. Our model suggests that, over a broad range of STDP parameters, synaptic plasticity shapes the spatial arrangement of synapses, favoring the formation of clustered efficacy engrams.
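For reference, a minimal pair-based STDP window with an explicit potentiation/depression balance parameter, of the kind whose balance is varied in this study, might be written as follows; the amplitudes and time constants are illustrative, not the paper's values.

```python
# Pair-based STDP window with a balance parameter. delta_t = t_post - t_pre (s).
# balance = A_minus/A_plus sets whether depression or potentiation dominates on average.
import numpy as np

def stdp_weight_change(delta_t, a_plus=0.01, balance=1.05,
                       tau_plus=0.017, tau_minus=0.034):
    a_minus = balance * a_plus
    return np.where(delta_t > 0,
                    a_plus * np.exp(-delta_t / tau_plus),     # pre before post: LTP
                    -a_minus * np.exp(delta_t / tau_minus))   # post before pre: LTD

dts = np.linspace(-0.1, 0.1, 9)
print(np.round(stdp_weight_change(dts), 5))
```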

5.
J Comput Neurosci ; 41(2): 193-206, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27480847

ABSTRACT

Neural spike trains are commonly characterized as a Poisson point process. However, the Poisson assumption is a poor model for spiking in auditory nerve fibres because it is known that interspike intervals display positive correlation over long time scales and negative correlation over shorter time scales. We have therefore developed a biophysical model based on the well-known Meddis model of the peripheral auditory system, to produce simulated auditory nerve fibre spiking statistics that more closely match the firing correlations observed in empirical data. We achieve this by introducing biophysically realistic ion channel noise to an inner hair cell membrane potential model that includes fractal fast potassium channels and deterministic slow potassium channels. We succeed in producing simulated spike train statistics that match empirically observed firing correlations. Our model thus replicates macro-scale stochastic spiking statistics in the auditory nerve fibres due to modeling stochasticity at the micro-scale of potassium channels.


Subject(s)
Action Potentials , Cochlear Nerve , Ion Channels/physiology , Models, Neurological , Neurons , Potassium Channels
6.
Network ; 26(2): 35-71, 2015.
Article in English | MEDLINE | ID: mdl-25760433

ABSTRACT

Stochastic resonance (SR) is said to be observed when the presence of noise in a nonlinear system enables an output signal from the system to better represent some feature of an input signal than it does in the absence of noise. The effect has been observed in models of individual neurons, and in experiments performed on real neural systems. Despite the ubiquity of biophysical sources of stochastic noise in the nervous system, however, it has not yet been established whether neuronal computation mechanisms involved in performance of specific functions such as perception or learning might exploit such noise as an integral component, such that removal of the noise would diminish performance of these functions. In this paper we revisit the methods used to demonstrate stochastic resonance in models of single neurons. This includes a previously unreported observation in a multicompartmental model of a CA1 pyramidal cell. We also discuss, as a contrast to these classical studies, a form of 'stochastic facilitation' known as inverse stochastic resonance. We draw on the reviewed examples to argue why new approaches to studying 'stochastic facilitation' in neural systems need to be developed.
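The classic single-threshold demonstration that the review revisits can be sketched as follows: a subthreshold periodic signal crosses a fixed threshold only when noise is added, and the output power at the signal frequency is typically largest at an intermediate noise level. All parameters are illustrative.

```python
# Threshold-detector demonstration of stochastic resonance.
import numpy as np

rng = np.random.default_rng(1)
fs, f_sig, threshold = 1000.0, 5.0, 1.0
t = np.arange(0, 10, 1 / fs)
signal = 0.8 * np.sin(2 * np.pi * f_sig * t)     # subthreshold on its own

def output_power_at_signal_freq(noise_sd):
    """Threshold-crossing output, then power in the DFT bin nearest the signal frequency."""
    x = signal + rng.normal(0, noise_sd, t.size)
    y = (x > threshold).astype(float)
    spectrum = np.fft.rfft(y - y.mean())
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    return np.abs(spectrum[np.argmin(np.abs(freqs - f_sig))]) ** 2

for sd in (0.05, 0.25, 5.0):
    print(f"noise sd={sd:4}: signal-band power = {output_power_at_signal_freq(sd):.1f}")
```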


Subject(s)
Computer Simulation , Models, Neurological , Neurons/physiology , Stochastic Processes , Animals , Humans
7.
PLoS One ; 9(12): e114503, 2014.
Article in English | MEDLINE | ID: mdl-25486535

ABSTRACT

Complex networks are frequently characterized by metrics for which particular subgraphs are counted. One statistic from this category, which we refer to as motif-role fingerprints, differs from global subgraph counts in that the number of subgraphs in which each node participates is counted. As with global subgraph counts, it can be important to distinguish between motif-role fingerprints that are 'structural' (induced subgraphs) and 'functional' (partial subgraphs). Here we show mathematically that a vector of all functional motif-role fingerprints can readily be obtained from an arbitrary directed adjacency matrix, and then converted to structural motif-role fingerprints by multiplying that vector by a specific invertible conversion matrix. This result demonstrates that a unique structural motif-role fingerprint exists for any given functional motif-role fingerprint. We demonstrate a similar result for the cases of functional and structural motif-fingerprints without node roles, and global subgraph counts that form the basis of standard motif analysis. We also explicitly highlight that motif-role fingerprints are elemental to several popular metrics for quantifying the subgraph structure of directed complex networks, including motif distributions, directed clustering coefficient, and transitivity. The relationships between each of these metrics and motif-role fingerprints also suggest new subtypes of directed clustering coefficients and transitivities. Our results have potential utility in analyzing directed synaptic networks constructed from neuronal connectome data, such as in terms of centrality. Other potential applications include anomaly detection in networks, identification of similar networks and identification of similar nodes within networks. Matlab code for calculating all stated metrics following calculation of functional motif-role fingerprints is provided as S1 Matlab File.
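The paper's own code is provided in Matlab (S1 Matlab File); purely as a hypothetical illustration of counting functional (partial-subgraph) motif roles per node from a directed adjacency matrix, restricted to dyads for brevity, one could write:

```python
# Hypothetical Python illustration: functional dyadic motif-role counts per node, taken
# directly from a directed adjacency matrix A where A[i, j] = 1 means an edge from i to j.
# Functional counting means a sub-pattern is counted whether or not extra edges are present.
import numpy as np

def dyadic_motif_role_fingerprints(a):
    a = np.array(a, dtype=int)                 # work on a copy
    np.fill_diagonal(a, 0)                     # ignore self-loops
    reciprocal = a & a.T                       # both directions present
    out_roles = a.sum(axis=1)                  # node is the source of an edge
    in_roles = a.sum(axis=0)                   # node is the target of an edge
    recip_roles = reciprocal.sum(axis=1)       # node belongs to a reciprocal dyad
    return np.column_stack([out_roles, in_roles, recip_roles])

A = np.array([[0, 1, 1],
              [1, 0, 0],
              [0, 0, 0]])
print(dyadic_motif_role_fingerprints(A))
```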


Subject(s)
Algorithms , Caenorhabditis elegans/genetics , Gene Regulatory Networks , Models, Biological , Neural Pathways , Neurons/metabolism , Protein Interaction Mapping , Animals , Cluster Analysis , Computational Biology , Computer Simulation
8.
PLoS One ; 9(8): e102601, 2014.
Article in English | MEDLINE | ID: mdl-25148478

ABSTRACT

Finding the rules underlying how axons of cortical neurons form neural circuits and modify their corresponding synaptic strength is still the subject of intense research. Experiments have shown that internal calcium concentration, and both the precise timing and temporal order of pre- and postsynaptic action potentials, are important constituents governing whether the strength of a synapse located on the dendrite is increased or decreased. In particular, previous investigations focusing on spike timing-dependent plasticity (STDP) have typically observed an asymmetric temporal window governing changes in synaptic efficacy. Such a temporal window emphasizes that if a presynaptic spike, arriving at the synaptic terminal, precedes the generation of a postsynaptic action potential, then the synapse is potentiated; however, if the temporal order is reversed, then depression occurs. Furthermore, recent experimental studies have demonstrated that the temporal window also depends on the dendritic location of the synapse. Specifically, it was shown that in distal regions of the apical dendrite, the magnitude of potentiation was smaller and the window for depression was broader, when compared to observations from the proximal region of the dendrite. To date, the underlying mechanisms for such a distance-dependent effect remain unknown. Here, using the ionic cable theory framework in conjunction with the standard calcium-based plasticity model, we show for the first time that such distance-dependent inhomogeneities in the temporal learning window for STDP can be largely explained by both the spatial and active properties of the dendrite.
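A minimal sketch of a calcium-threshold plasticity rule of the general kind the abstract refers to is given below: potentiation while the local calcium transient exceeds an upper threshold, depression while it sits between a lower and the upper threshold. The thresholds, rates, and calcium traces are illustrative placeholders, chosen only to show how an attenuated distal transient can shift the outcome toward depression.

```python
# Calcium-threshold plasticity sketch. All parameter values are illustrative.
import numpy as np

def calcium_threshold_plasticity(ca, dt, w0=0.5, theta_d=0.3, theta_p=0.6,
                                 gamma_p=0.15, gamma_d=0.05):
    w = w0
    for c in ca:
        if c >= theta_p:
            w += dt * gamma_p * (1.0 - w)      # potentiation toward w = 1
        elif c >= theta_d:
            w -= dt * gamma_d * w              # depression toward w = 0
    return w

t = np.arange(0, 0.5, 1e-3)
ca_proximal = 0.9 * np.exp(-t / 0.05)          # large, brief calcium transient
ca_distal = 0.45 * np.exp(-t / 0.05)           # attenuated transient farther from soma
print("proximal:", round(calcium_threshold_plasticity(ca_proximal, 1e-3), 3),
      "distal:", round(calcium_threshold_plasticity(ca_distal, 1e-3), 3))
```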


Subject(s)
Models, Neurological , Neuronal Plasticity/physiology , Synapses/physiology , Algorithms , Animals , Calcium/metabolism , Cerebral Cortex/cytology , Cerebral Cortex/physiology , Dendrites/physiology , Humans , Neurons/physiology , Synaptic Potentials
9.
PLoS One ; 9(2): e88326, 2014.
Article in English | MEDLINE | ID: mdl-24551089

ABSTRACT

Cortical circuits in the brain have long been recognised for their information processing capabilities and have been studied both experimentally and theoretically via spiking neural networks. Neuromorphic engineers are primarily concerned with translating the computational capabilities of biological cortical circuits, using the Spiking Neural Network (SNN) paradigm, into in silico applications that can mimic the behaviour and capabilities of real biological circuits/systems. These capabilities include low power consumption, compactness, and relevant dynamics. In this paper, we propose a new accelerated-time circuit that has several advantages over its previous neuromorphic counterparts in terms of compactness, power consumption, and capability to mimic the outcomes of biological experiments. Circuit simulation results demonstrate that, compared with previously published synaptic plasticity circuits, the new circuit achieves a reduced silicon area and lower energy consumption per spike. In addition, it can be tuned to closely mimic the outcomes of various spike timing- and rate-based synaptic plasticity experiments. The proposed circuit is also investigated and compared to other designs in terms of tolerance to mismatch and process variation. Monte Carlo simulation results show that the proposed design is much more stable than its previous counterparts in terms of vulnerability to transistor mismatch, which is a significant challenge in analog neuromorphic design. All these features make the proposed design an ideal circuit for use in large-scale SNNs, which aim to implement neuromorphic systems with an inherent capability to adapt to a continuously changing environment, thus leading to systems with significant learning and computational abilities.


Subject(s)
Hippocampus/physiology , Models, Neurological , Nerve Net/physiology , Neural Networks, Computer , Visual Cortex/physiology , Action Potentials/physiology , Computer Simulation , Hippocampus/cytology , Humans , Nerve Net/cytology , Neuronal Plasticity , Neurons/cytology , Neurons/physiology , Synapses/physiology , Synaptic Transmission , Visual Cortex/cytology
10.
Front Comput Neurosci ; 8: 163, 2014.
Article in English | MEDLINE | ID: mdl-25566047

ABSTRACT

We propose several modifications to an existing computational model of stochastic vesicle release in inner hair cell ribbon synapses, with the aim of producing simulated auditory nerve fiber spiking data that more closely matches empirical data. Specifically, we studied the inter-spike-interval (ISI) distribution, and long and short term ISI correlations in spontaneous spiking in post-synaptic auditory nerve fibers. We introduced short term plasticity to the pre-synaptic release probability, in a manner analogous to standard stochastic models of cortical short term synaptic depression. This modification resulted in a similar distribution of vesicle release intervals to that estimated from empirical data. We also introduced a biophysical stochastic model of calcium channel opening and closing, but showed that this model is insufficient for generating a match with empirically observed spike correlations. However, by combining a phenomenological model of channel noise and our short term depression model, we generated short and long term correlations in auditory nerve spontaneous activity that qualitatively match empirical data.
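A minimal sketch of the modification described, adding short-term depression to a stochastic release probability in the style of standard cortical depression models, might look like the following; the baseline release probability, depletion factor, and recovery time constant are illustrative rather than the paper's fitted values.

```python
# Stochastic vesicle release with short-term depression of the release probability.
import numpy as np

rng = np.random.default_rng(2)

def simulate_release(n_events=10_000, rate=60.0, p0=0.35, depletion=0.6, tau_rec=0.2):
    """Candidate release events arrive as a Poisson process; each successful release
    transiently lowers the release probability, which recovers with tau_rec."""
    intervals = rng.exponential(1.0 / rate, n_events)
    p, release_times, t = p0, [], 0.0
    for isi in intervals:
        t += isi
        p = p0 - (p0 - p) * np.exp(-isi / tau_rec)   # recovery toward p0
        if rng.random() < p:
            release_times.append(t)
            p *= depletion                            # short-term depression
    return np.array(release_times)

isis = np.diff(simulate_release())
print("mean release interval:", isis.mean(), "CV:", isis.std() / isis.mean())
```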

11.
Neural Netw ; 45: 70-82, 2013 Sep.
Article in English | MEDLINE | ID: mdl-23566339

ABSTRACT

Triplet-based Spike Timing Dependent Plasticity (TSTDP) is a powerful synaptic plasticity rule that acts beyond conventional pair-based STDP (PSTDP). The TSTDP rule is capable of reproducing the outcomes of a variety of biological experiments that the PSTDP rule fails to reproduce. Additionally, it has been shown that the behaviour inherent to the spike rate-based Bienenstock-Cooper-Munro (BCM) synaptic plasticity rule can also emerge from the TSTDP rule. This paper proposes an analogue implementation of the TSTDP rule. The proposed VLSI circuit has been designed using the AMS 0.35 µm CMOS process and has been simulated using design kits for Synopsys and Cadence tools. Simulation results demonstrate how well the proposed circuit can alter synaptic weights according to the timing differences amongst a set of different patterns of spikes. Furthermore, the circuit is shown to give rise to a BCM-like learning rule, which is a rate-based rule. To mimic an implementation environment, a 1000-run Monte Carlo (MC) analysis was conducted on the proposed circuit. The presented MC simulation analysis and the simulation results from fine-tuned circuits show that it is possible to mitigate the effect of process variations in the proof-of-concept circuit; however, a practical variation-aware design technique is required to guarantee high circuit performance in a large-scale neural network. We believe that the proposed design can play a significant role in future VLSI implementations of both spike timing- and rate-based neuromorphic learning systems.
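A software restatement of the triplet STDP rule (Pfister and Gerstner, 2006) that the circuit implements is sketched below, using spike-triggered traces; the amplitudes and time constants are illustrative and are not the values realised by the proposed VLSI circuit.

```python
# Triplet STDP: fast and slow pre-/postsynaptic traces; depression uses a pair term plus
# a pre-triplet term, potentiation uses a pair term plus a post-triplet term.
import numpy as np

def triplet_stdp(pre_times, post_times, w0=0.5,
                 a2p=5e-3, a2m=7e-3, a3p=6e-3, a3m=2e-4,
                 tau_p=0.017, tau_m=0.034, tau_x=0.101, tau_y=0.125):
    events = sorted([(t, 'pre') for t in pre_times] + [(t, 'post') for t in post_times])
    r1 = r2 = o1 = o2 = 0.0            # pre traces (r) and post traces (o)
    w, t_last = w0, events[0][0]
    for t, kind in events:
        dt = t - t_last
        r1 *= np.exp(-dt / tau_p); r2 *= np.exp(-dt / tau_x)
        o1 *= np.exp(-dt / tau_m); o2 *= np.exp(-dt / tau_y)
        if kind == 'pre':
            w -= o1 * (a2m + a3m * r2)     # depression at a presynaptic spike
            r1 += 1.0; r2 += 1.0
        else:
            w += r1 * (a2p + a3p * o2)     # potentiation at a postsynaptic spike
            o1 += 1.0; o2 += 1.0
        t_last = t
    return w

print(triplet_stdp(pre_times=[0.00, 0.05, 0.10], post_times=[0.01, 0.06, 0.11]))
```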


Subject(s)
Action Potentials/physiology , Artificial Intelligence , Neuronal Plasticity , Neurons/physiology , Numerical Analysis, Computer-Assisted/instrumentation , Synapses/physiology , Animals , Humans , Semiconductors , Time Factors
12.
PLoS One ; 7(12): e51756, 2012.
Article in English | MEDLINE | ID: mdl-23300566

ABSTRACT

Minkowski famously introduced the concept of a space-time continuum in 1908, merging the three dimensions of space with an imaginary time dimension [Formula: see text], with the unit imaginary producing the correct spacetime distance [Formula: see text], and the results of Einstein's then recently developed theory of special relativity, thus providing an explanation for Einstein's theory in terms of the structure of space and time. As an alternative to a planar Minkowski space-time of two space dimensions and one time dimension, we replace the unit imaginary [Formula: see text], with the Clifford bivector [Formula: see text] for the plane that also squares to minus one, but which can be included without the addition of an extra dimension, as it is an integral part of the real Cartesian plane with the orthonormal basis [Formula: see text] and [Formula: see text]. We find that with this model of planar spacetime, using a two-dimensional Clifford multivector, the spacetime metric and the Lorentz transformations follow immediately as properties of the algebra. This also leads to momentum and energy being represented as components of a multivector and we give a new efficient derivation of Compton's scattering formula, and a simple formulation of Dirac's and Maxwell's equations. Based on the mathematical structure of the multivector, we produce a semi-classical model of massive particles, which can then be viewed as the origin of the Minkowski spacetime structure and thus a deeper explanation for relativistic effects. We also find a new perspective on the nature of time, which is now given a precise mathematical definition as the bivector of the plane.
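A hedged reconstruction of the relations behind the [Formula: see text] placeholders, written in one space dimension for brevity (the paper works in the full plane), is:

```latex
% In the Clifford algebra of the plane with orthonormal basis e_1, e_2, the bivector
% iota = e_1 e_2 squares to -1 and can replace the unit imaginary without adding a dimension:
\[
  \iota = e_{1}e_{2}, \qquad
  \iota^{2} = e_{1}e_{2}e_{1}e_{2} = -e_{1}e_{1}e_{2}e_{2} = -1 .
\]
% A spacetime event is then a real multivector whose square gives the Minkowski interval,
% with no imaginary time coordinate needed:
\[
  X = x\,e_{1} + c t\,\iota, \qquad
  X^{2} = x^{2} - c^{2}t^{2},
\]
% since the cross terms cancel: e_1\iota + \iota e_1 = e_2 - e_2 = 0.
```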


Subject(s)
Extraterrestrial Environment , Models, Theoretical , Physical Phenomena , Particle Size , Time Factors
13.
Article in English | MEDLINE | ID: mdl-20725522

ABSTRACT

Synapse location, dendritic active properties and synaptic plasticity are all known to play some role in shaping the different input streams impinging onto a neuron. It remains unclear, however, how the magnitude and spatial distribution of synaptic efficacies emerge from this interplay. Here, we investigate this interplay using a biophysically detailed neuron model of a reconstructed layer 2/3 pyramidal cell and spike timing-dependent plasticity (STDP). Specifically, we focus on the issue of how the efficacies of synapses contributed by different input streams are spatially represented in dendrites after STDP learning. We construct a simple feed-forward network in which a detailed model neuron receives synaptic inputs independently from multiple, equally sized groups of afferent fibers with correlated activity, mimicking the spike activity from different neuronal populations encoding, for example, different sensory modalities. Interestingly, following STDP learning, we observe that for all afferent groups, STDP leads to synaptic efficacies arranged into spatially segregated clusters effectively partitioning the dendritic tree. These segregated clusters possess a characteristic global organization in space, where they form a tessellation in which each group dominates mutually exclusive regions of the dendrite. Put simply, the dendritic imprint from different input streams left after STDP learning effectively forms what we term a "dendritic efficacy mosaic." Furthermore, we show how variations of the inputs and of the STDP rule affect such an organization. Our model suggests that STDP may be an important mechanism for creating a clustered plasticity engram, which shapes how different input streams are spatially represented in the dendrite.

14.
J Integr Neurosci ; 6(2): 241-77, 2007 Jun.
Article in English | MEDLINE | ID: mdl-17622981

ABSTRACT

In order to gain a better theoretical understanding of the interaction between voltage and calcium influx, we present simulation results for saltatory transmission in a sparsely excitable model of a continuous cylindrical segment of nerve fiber, where calcium diffuses internally and various ion channels are distributed as hotspots along the cable. A standard set of ion channel descriptions is used to illustrate how different numbers and distributions of ion channel hotspots affect the propagation and transmission of a single action potential and/or a spike train, and how such hotspots affect calcium influx and diffusion within the fiber.


Subject(s)
Action Potentials/physiology , Calcium/metabolism , Ion Channels/metabolism , Nerve Fibers/physiology , Synaptic Transmission/physiology , Animals , Computer Simulation , Diffusion , Electrophysiology , Humans , Models, Neurological , Nerve Fibers/metabolism , Nonlinear Dynamics
15.
J Integr Neurosci ; 5(2): 249-72, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16783871

ABSTRACT

The interaction between membrane potential and internal calcium concentration plays many important roles in regulating synaptic integration and neuronal firing. In order to gain a better theoretical understanding of the voltage-calcium interaction, a nonlinear cable equation with calcium dynamics is solved analytically. This general reaction-diffusion system represents a model of a cylindrical dendritic segment in which calcium diffuses internally in the presence of buffers, pumps and exchangers, and where ion channels are sparsely distributed over the membrane, in the form of hotspots, acting as point current sources along the dendritic membrane. In order to proceed, the reaction-diffusion system is recast into a system of coupled nonlinear integral equations, and a perturbative expansion in dimensionless voltage and calcium concentration is used to find analytical solutions to this general system. The resulting solutions can be used to investigate the interaction between the membrane potential and the underlying calcium dynamics in a natural (non-discretized) setting.
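Schematically, and with the specific channel, buffer, pump and exchanger terms abbreviated, a reaction-diffusion cable system of the kind described here has the following hedged form, with hotspots entering as point current sources via Dirac deltas at positions x_k:

```latex
% Hedged sketch of the voltage-calcium cable system; I_k, J_buf, J_pump, J_ex stand in
% for the paper's specific channel, buffer, pump and exchanger descriptions.
\begin{align*}
  c_m \frac{\partial V}{\partial t}
    &= \frac{a}{2 R_i}\,\frac{\partial^{2} V}{\partial x^{2}}
       - g_L\,(V - E_L)
       - \sum_{k} I_{k}(V, \mathrm{Ca})\,\delta(x - x_{k}), \\
  \frac{\partial \mathrm{Ca}}{\partial t}
    &= D_{\mathrm{Ca}}\,\frac{\partial^{2} \mathrm{Ca}}{\partial x^{2}}
       - J_{\mathrm{buf}}(\mathrm{Ca}) - J_{\mathrm{pump}}(\mathrm{Ca})
       - J_{\mathrm{ex}}(\mathrm{Ca})
       + \sum_{k} \phi\, I_{\mathrm{Ca},k}(V)\,\delta(x - x_{k}).
\end{align*}
```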


Subject(s)
Calcium/metabolism , Membrane Potentials/physiology , Models, Neurological , Neurons/physiology , Nonlinear Dynamics , Animals , Dendrites/physiology , Humans , Neural Conduction/physiology , Synaptic Transmission/physiology
16.
Neurosci Lett ; 403(1-2): 24-9, 2006 Jul 31.
Article in English | MEDLINE | ID: mdl-16762502

ABSTRACT

The role of spike-timing-dependent plasticity (STDP) in shaping the strength of a synapse located on the dendritic tree has gained recent interest. Previous theoretical studies using STDP have mostly used simplified integrate-and-fire models to investigate the evolution of synaptic efficacy with time. Such studies usually show that the final weight distribution is unimodal or bimodal, resulting from a multiplicative or additive STDP rule, respectively. However, very little is known about how STDP shapes the spatial organization of synaptic efficacies. Here, for the first time, we demonstrate that spatial clustering of synaptic efficacies can occur on the dendrite via STDP, where changes in synaptic efficacy are driven by timing differences between synaptic inputs and the generation of local dendritic spikes. Specifically, when the model neuron is stimulated by two independent groups of correlated afferent inputs, the synaptic efficacies from each group are not only spatially clustered on the dendrite but also spatially complementary to each other.


Subject(s)
Action Potentials , Models, Neurological , Neuronal Plasticity , Neurons/physiology , Synapses/physiology , Animals , Dendrites/physiology , Rats , Somatosensory Cortex/physiology , Synaptic Transmission
17.
Math Biosci ; 188: 117-32, 2004.
Article in English | MEDLINE | ID: mdl-14766097

ABSTRACT

We have developed a non-linear stochastic PDE (partial differential equation) model of a rat layer 2/3 somatosensory pyramidal neuron which approximates several of the dynamical properties of these cells. The model distinguishes telodendrites, a myelinated axon, initial segment, hillock, soma and a simplified dendritic tree. Distributions and properties of excitatory and inhibitory synapses were included, in accordance with recent anatomical and physiological findings. Using simulation methods, we aim to show that the spatial separation between regions of spatially distributed, randomly activated excitatory and inhibitory synaptic inputs may be an important parameter influencing neuronal firing properties. Due to the complexity of the problem with respect to configurations of spatially and temporally activated excitatory and inhibitory synaptic inputs, we consider two simple configurations: one in which the spatial regions of activated excitatory and inhibitory synaptic inputs overlap, and one in which they are far from each other. In the first, denoted configuration A, activated excitatory and inhibitory synapses were located close to the soma. In the second, denoted configuration B, active inhibitory synapses were close to the soma, while active excitatory synapses were located on distal regions of the dendrite. For the first configuration, we find that increases in the mean rate of inhibition result in an increase in the width of the firing rate tuning curves, and that for particular mean input frequencies of excitation, increasing the mean input rate of inhibition does not always imply that the neuron fires at a slower rate. Furthermore, we observed, for mean input frequencies of excitation between 15 and 60 Hz, that increasing the mean rate of inhibition resulted in the linearization of the firing rate over this interval. For configuration B, neither an increase in width nor a linearization effect via inhibition was observed. These differences indicate that the distance between regions of active excitatory and inhibitory synapses may be an important factor in determining how the interaction between excitation and inhibition contributes to neuronal firing.
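As a much-simplified toy of the question posed here, the sketch below drives a leaky integrate-and-fire point neuron (standing in for the full stochastic PDE model) with Poisson excitation and inhibition and asks how the output rate varies with the mean excitatory rate at two inhibitory rates; all parameters are illustrative and the spatial configurations A and B are not represented.

```python
# LIF point-neuron toy: output rate as a function of excitatory and inhibitory input rates.
import numpy as np

rng = np.random.default_rng(3)

def lif_rate(exc_rate, inh_rate, n_exc=100, n_inh=25, w_exc=0.4e-3, w_inh=1.6e-3,
             tau=0.02, v_th=15e-3, t_sim=5.0, dt=1e-4):
    v, spikes = 0.0, 0
    p_e = exc_rate * dt                 # per-afferent spike probability per step
    p_i = inh_rate * dt
    for _ in range(int(t_sim / dt)):
        drive = w_exc * rng.binomial(n_exc, p_e) - w_inh * rng.binomial(n_inh, p_i)
        v += dt * (-v / tau) + drive
        if v >= v_th:
            spikes += 1
            v = 0.0                     # reset after a spike
    return spikes / t_sim

for inh in (10.0, 40.0):
    rates = [lif_rate(exc, inh) for exc in (15.0, 30.0, 60.0)]
    print(f"inhibition {inh:4.0f} Hz -> output rates {np.round(rates, 1)} Hz")
```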


Subject(s)
Models, Neurological , Pyramidal Cells/physiology , Somatosensory Cortex/physiology , Animals , Rats , Somatosensory Cortex/cytology , Stochastic Processes , Synaptic Transmission