1.
Neuron; 90(2): 400-409, 2016 Apr 20.
Article in English | MEDLINE | ID: mdl-27041502

ABSTRACT

Humans and monkeys have access to an accurate representation of visual space despite a constantly moving eye. One mechanism by which the brain accomplishes this is by remapping visual receptive fields around the time of a saccade. In this process, a neuron can be excited by a probe stimulus in the current receptive field and also, simultaneously, by a probe stimulus in the location that will be brought into the neuron's receptive field by the saccade (the future receptive field), even before the saccade begins. Here we show that perisaccadic neuronal excitability is not limited to the current and future receptive fields but encompasses the entire region of visual space across which the current receptive field will be swept by the saccade. A computational model shows that this receptive field expansion is consistent with the propagation of a wave of activity across the cerebral cortex as saccade planning and remapping proceed.
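
As a loose cartoon of the wave idea only (not the authors' model of parietal cortex), the Python/NumPy toy below sweeps a packet of activity across a one-dimensional ring network by moving its driving input, and records the transient response of a probe neuron lying between the start and end points of the sweep. The network equations are a standard divisive-normalization attractor model, and every parameter value is an assumption.

import numpy as np

# Toy: a packet of activity is swept across a 1D ring network by a moving
# drive, transiently exciting a probe neuron that lies between the start and
# end points of the sweep.  All parameter values are assumptions.
N, a, tau, k, J0 = 128, 0.5, 1.0, 0.1, 1.0
x  = np.linspace(-np.pi, np.pi, N, endpoint=False)
dx = 2 * np.pi / N
wrap = lambda d: (d + np.pi) % (2 * np.pi) - np.pi       # angular difference on the ring
J = J0 / (np.sqrt(2 * np.pi) * a) * np.exp(-wrap(x[:, None] - x[None, :])**2 / (2 * a**2))

u, dt = np.zeros(N), 0.05
probe = int(np.argmin(np.abs(x - 0.75)))     # neuron roughly halfway along the sweep
trace = []
for t in range(1200):
    z = min(t * dt * 0.125, 1.5)             # drive position sweeps from 0 to 1.5 rad, then stops
    I = 2.0 * np.exp(-wrap(x - z)**2 / (4 * a**2))
    v = np.maximum(u, 0.0) ** 2
    r = v / (1.0 + k * dx * v.sum())         # divisive normalization
    u += dt * (-u + dx * J @ r + I) / tau
    trace.append(r[probe])

peak_t = int(np.argmax(trace)) * dt
print(f"probe neuron at x = {x[probe]:.2f} rad responds most strongly at t = {peak_t:.1f},"
      f" around the time the drive passes its location (t = {0.75 / 0.125:.1f})")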


Subject(s)
Parietal Lobe/physiology , Saccades/physiology , Visual Fields/physiology , Animals , Macaca mulatta , Models, Neurological , Neurons/physiology , Photic Stimulation
2.
F1000Res; 5, 2016.
Article in English | MEDLINE | ID: mdl-26937278

ABSTRACT

Owing to its many computationally desirable properties, the model of continuous attractor neural networks (CANNs) has been successfully applied to describe the encoding of simple continuous features in neural systems, such as orientation, moving direction, head direction, and spatial location of objects. Recent experimental and computational studies revealed that complex features of external inputs may also be encoded by low-dimensional CANNs embedded in the high-dimensional space of neural population activity. The new experimental data also confirmed the existence of the M-shaped correlation between neuronal responses, which is a correlation structure associated with the unique dynamics of CANNs. This body of evidence, which is reviewed in this report, suggests that CANNs may serve as a canonical model for neural information representation.
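
To make the bump-attractor picture concrete, here is a minimal Python/NumPy sketch of a one-dimensional CANN with divisive normalization encoding an angular feature such as head direction. The parameter values (N, a, k, J0 and the input strength) are illustrative assumptions, not values taken from the review.

import numpy as np

# Minimal 1D continuous attractor network on a ring (illustrative parameters).
N, a, tau, k, J0 = 128, 0.5, 1.0, 0.1, 1.0
x  = np.linspace(-np.pi, np.pi, N, endpoint=False)   # preferred stimuli
dx = 2 * np.pi / N

def wrap(d):
    """Wrap an angular difference onto [-pi, pi)."""
    return (d + np.pi) % (2 * np.pi) - np.pi

# Gaussian recurrent coupling, translation-invariant on the ring.
J = J0 / (np.sqrt(2 * np.pi) * a) * np.exp(-wrap(x[:, None] - x[None, :])**2 / (2 * a**2))

def rates(u):
    v = np.maximum(u, 0.0) ** 2
    return v / (1.0 + k * dx * v.sum())              # divisive normalization

def step(u, I_ext, dt=0.05):
    return u + dt * (-u + dx * J @ rates(u) + I_ext) / tau

# Drive the network with a Gaussian input centred on stimulus angle z0 and
# read the encoded angle back from the population activity.
z0 = 0.8
I  = 2.0 * np.exp(-wrap(x - z0)**2 / (4 * a**2))
u  = np.zeros(N)
for _ in range(400):
    u = step(u, I)

decoded = np.angle(np.sum(rates(u) * np.exp(1j * x)))    # population-vector readout
print(f"stimulus = {z0:.2f} rad, decoded bump position = {decoded:.2f} rad")

Shifting z0 moves the bump continuously along the ring, which is the sense in which such a network encodes a continuous feature.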

3.
Article in English | MEDLINE | ID: mdl-26465541

ABSTRACT

In continuous attractor neural networks (CANNs), spatially continuous information such as orientation, head direction, and spatial location is represented by Gaussian-like tuning curves that can be displaced continuously in the space of the neurons' preferred stimuli. We investigate how short-term synaptic depression (STD) can reshape the intrinsic dynamics of the CANN model and its responses to a single static input. In particular, we show that CANNs with STD can support various complex firing patterns and chaotic behaviors. These chaotic behaviors have the potential to encode various stimuli in neural systems.
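
Below is a Python/NumPy sketch of how short-term synaptic depression can be added to a standard divisive-normalization CANN: each presynaptic neuron carries a resource variable p that recovers on a slow timescale tau_d and is consumed in proportion to the firing rate. The values of beta and tau_d are guesses; depending on them the nucleated bump may stay static, travel, or behave more irregularly, and the printout reports which regime the run lands in.

import numpy as np

# 1D CANN with short-term synaptic depression (STD).  All values are assumptions.
N, a, tau, k, J0 = 128, 0.5, 1.0, 0.05, 1.0
tau_d, beta = 50.0, 0.005                             # STD time constant and strength (assumed)
x  = np.linspace(-np.pi, np.pi, N, endpoint=False)
dx = 2 * np.pi / N
wrap = lambda d: (d + np.pi) % (2 * np.pi) - np.pi
J = J0 / (np.sqrt(2 * np.pi) * a) * np.exp(-wrap(x[:, None] - x[None, :])**2 / (2 * a**2))

def rates(u):
    v = np.maximum(u, 0.0) ** 2
    return v / (1.0 + k * dx * v.sum())

u, p, dt = np.zeros(N), np.ones(N), 0.05
for t in range(4000):
    I = 2.0 * np.exp(-x**2 / (4 * a**2)) if t * dt < 5.0 else 0.0   # brief nucleating input
    r = rates(u)
    u += dt * (-u + dx * J @ (p * r) + I) / tau       # recurrent input is depressed by p
    p += dt * (1.0 - p - tau_d * beta * p * r) / tau_d
    if t % 800 == 0:
        amp = r.max()
        ctr = np.angle(np.sum(r * np.exp(1j * x))) if amp > 1e-6 else float("nan")
        print(f"t = {t*dt:6.1f}   peak rate = {amp:.4f}   bump centre = {ctr:+.2f} rad")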


Subject(s)
Models, Neurological , Neural Networks, Computer , Neuronal Plasticity/physiology , Neurons/physiology , Action Potentials/physiology , Nonlinear Dynamics
4.
Article in English | MEDLINE | ID: mdl-26382448

ABSTRACT

Anticipation is a strategy used by neural fields to compensate for transmission and processing delays during the tracking of dynamical information. It can be achieved by slow, localized, inhibitory feedback mechanisms such as short-term synaptic depression, spike-frequency adaptation, or inhibitory feedback from other layers. Based on the translational symmetry of the mobile network states, we derive generic fluctuation-response relations that provide unified predictions linking the tracking behavior of neural fields in the presence of external stimuli to their intrinsic dynamics in the absence of stimuli.
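
The Python/NumPy sketch below adds spike-frequency adaptation, one of the slow negative-feedback mechanisms listed above, to a one-dimensional CANN and measures whether the bump lags or leads a stimulus moving at constant speed; a positive mean displacement indicates anticipative tracking. All parameter values are assumptions.

import numpy as np

# 1D CANN with spike-frequency adaptation (SFA) tracking a moving stimulus.
# The adaptation current v is a low-pass filtered copy of the synaptic input u
# fed back negatively.  All values are assumptions.
N, a, tau, k, J0 = 128, 0.5, 1.0, 0.05, 1.0
tau_v, m = 50.0, 0.04              # adaptation time constant and strength
v_s = 0.02                         # stimulus speed (rad per unit time)
x  = np.linspace(-np.pi, np.pi, N, endpoint=False)
dx = 2 * np.pi / N
wrap = lambda d: (d + np.pi) % (2 * np.pi) - np.pi
J = J0 / (np.sqrt(2 * np.pi) * a) * np.exp(-wrap(x[:, None] - x[None, :])**2 / (2 * a**2))

u, v, dt, disp = np.zeros(N), np.zeros(N), 0.05, []
for t in range(20000):
    z = wrap(v_s * t * dt)                            # current stimulus position
    I = 0.5 * np.exp(-wrap(x - z)**2 / (4 * a**2))
    w = np.maximum(u, 0.0) ** 2
    r = w / (1.0 + k * dx * w.sum())
    u += dt * (-u - v + dx * J @ r + I) / tau
    v += dt * (-v + m * u) / tau_v
    if t > 10000:                                     # measure in the steady tracking phase
        c = np.angle(np.sum(r * np.exp(1j * x)))
        disp.append(wrap(c - z))

print(f"mean bump displacement relative to the stimulus: {np.mean(disp):+.4f} rad"
      " (positive means the bump leads, i.e. anticipates)")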


Subject(s)
Models, Neurological , Neurons/physiology , Feedback , Neural Inhibition/physiology , Neuronal Plasticity/physiology , Synaptic Transmission/physiology
5.
Neural Comput; 27(3): 507-547, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25602773

ABSTRACT

Attractor models are simplified models used to describe the dynamics of the firing-rate profiles of a pool of neurons. The firing-rate profile, or the neuronal activity, is thought to carry information. Continuous attractor neural networks (CANNs) describe the neural processing of continuous information such as object position, object orientation, and the direction of object motion. Recently, it was found that in one-dimensional CANNs, short-term synaptic depression can destabilize bump-shaped attractor activity profiles. In this article, we study two-dimensional CANNs with short-term synaptic depression and spike-frequency adaptation. We find that the dynamics of CANNs with short-term synaptic depression and those with spike-frequency adaptation are qualitatively similar, and that in both kinds of CANNs a perturbative approach can be used to predict phase diagrams, dynamical variables, and the speed of spontaneous motion.
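
Here is a compact two-dimensional sketch on a torus, written in Python/NumPy with FFT-based circular convolution so that no N^2-by-N^2 coupling matrix is needed; spike-frequency adaptation is included as the slow feedback. Whether the nucleated bump stays put, drifts spontaneously, or dies out depends on the guessed parameters, and the printout shows which regime is obtained.

import numpy as np

# 2D CANN on a torus with spike-frequency adaptation, integrated with
# FFT-based circular convolution.  Illustrative sketch; all values are guesses.
N, a, tau, tau_v, k, J0, m = 64, 0.5, 1.0, 50.0, 0.05, 1.0, 0.04
x  = np.linspace(-np.pi, np.pi, N, endpoint=False)
dA = (2 * np.pi / N) ** 2
wrap = lambda d: (d + np.pi) % (2 * np.pi) - np.pi
gx, gy = np.meshgrid(x, x, indexing="ij")

# Coupling kernel as a function of displacement, centred on grid index (0, 0),
# so that multiplying FFTs implements the circular recurrent convolution.
d1 = wrap(x + np.pi)
Kx, Ky = np.meshgrid(d1, d1, indexing="ij")
Kf = np.fft.fft2(J0 / (2 * np.pi * a**2) * np.exp(-(Kx**2 + Ky**2) / (2 * a**2)))

u, v, dt = np.zeros((N, N)), np.zeros((N, N)), 0.05
for t in range(6000):
    I = 2.0 * np.exp(-(gx**2 + gy**2) / (4 * a**2)) if t * dt < 5.0 else 0.0   # transient input
    w = np.maximum(u, 0.0) ** 2
    r = w / (1.0 + k * dA * w.sum())
    rec = np.real(np.fft.ifft2(Kf * np.fft.fft2(r))) * dA
    u += dt * (-u - v + rec + I) / tau
    v += dt * (-v + m * u) / tau_v
    if t % 1000 == 0:
        if r.max() > 1e-6:
            cx = np.angle(np.sum(r * np.exp(1j * gx)))
            cy = np.angle(np.sum(r * np.exp(1j * gy)))
            print(f"t = {t*dt:6.1f}   peak rate = {r.max():.4f}   centre = ({cx:+.2f}, {cy:+.2f})")
        else:
            print(f"t = {t*dt:6.1f}   activity has died out")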


Subject(s)
Computer Simulation , Models, Neurological , Neural Networks, Computer , Neurons/physiology , Nonlinear Dynamics , Action Potentials/physiology , Humans , Nerve Net/physiology
6.
Article in English | MEDLINE | ID: mdl-23781197

ABSTRACT

Conventionally, information is represented by spike rates in the neural system. Here, we consider the ability of temporally modulated activities in neuronal networks to carry information beyond spike rates. These temporal modulations, commonly known as population spikes, arise from the presence of synaptic depression in a neuronal network model. We discuss their relevance to an experiment on transparent motion in macaque monkeys by Treue et al. in 2000. They found that if the moving directions of objects are too close, the firing-rate profile is very similar to that evoked by a single direction; when the difference between the moving directions is large enough, the network responds in a way that enhances the resolution of the objects' moving directions. In this paper, we propose that this behavior can be reproduced by neural networks with dynamical synapses when there are multiple external inputs. We demonstrate how resolution enhancement can be achieved, and discuss the conditions under which temporally modulated activities can enhance information-processing performance in general.
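
As a toy probe of the resolution question, the Python/NumPy sketch below drives a depressing CANN with two Gaussian inputs separated by delta and counts whether the time-averaged response has one or two peaks. The model variant, the crude peak-counting heuristic, and all parameter values are assumptions, not those of the paper.

import numpy as np

# Depressing CANN driven by two Gaussian inputs a distance delta apart; count
# the peaks of the time-averaged response.  Toy model, all values assumed.
N, a, tau, k, J0 = 128, 0.5, 1.0, 0.1, 1.0
tau_d, beta = 50.0, 0.005
x  = np.linspace(-np.pi, np.pi, N, endpoint=False)
dx = 2 * np.pi / N
wrap = lambda d: (d + np.pi) % (2 * np.pi) - np.pi
J = J0 / (np.sqrt(2 * np.pi) * a) * np.exp(-wrap(x[:, None] - x[None, :])**2 / (2 * a**2))

def response_peaks(delta, dt=0.05, steps=6000):
    u, p = np.zeros(N), np.ones(N)
    I = (np.exp(-wrap(x - delta / 2)**2 / (4 * a**2)) +
         np.exp(-wrap(x + delta / 2)**2 / (4 * a**2)))
    acc = np.zeros(N)
    for t in range(steps):
        v = np.maximum(u, 0.0) ** 2
        r = v / (1.0 + k * dx * v.sum())
        u += dt * (-u + dx * J @ (p * r) + I) / tau
        p += dt * (1.0 - p - tau_d * beta * p * r) / tau_d
        if t >= steps // 2:
            acc += r                          # time-average over the second half of the run
    m = acc / acc.max()
    is_peak = (m > np.roll(m, 1)) & (m > np.roll(m, -1)) & (m > 0.5)
    return int(is_peak.sum())

for delta in (0.3, 0.8, 1.5, 2.5):
    print(f"input separation {delta:.1f} rad -> {response_peaks(delta)} peak(s) in the mean response")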

7.
Neural Comput; 24(5): 1147-1185, 2012 May.
Article in English | MEDLINE | ID: mdl-22295986

ABSTRACT

Experimental data have revealed that neuronal connection efficacy exhibits two forms of short-term plasticity: short-term depression (STD) and short-term facilitation (STF). They have time constants residing between fast neural signaling and rapid learning and may serve as substrates for neural systems manipulating temporal information on relevant timescales. This study investigates the impact of STD and STF on the dynamics of continuous attractor neural networks and their potential roles in neural information processing. We find that STD endows the network with slow-decaying plateau behaviors: a network that is initially stimulated to an active state decays to a silent state very slowly, on the timescale of STD rather than on that of neural signaling. This provides a mechanism for neural systems to hold sensory memory easily and shut off persistent activities gracefully. With STF, we find that the network can hold a memory trace of external inputs in the facilitated neuronal interactions, which provides a way to stabilize the network response to noisy inputs, leading to improved accuracy in population decoding. Furthermore, we find that STD increases the mobility of the network states. The increased mobility enhances the tracking performance of the network in response to time-varying stimuli, leading to anticipative neural responses. In general, we find that STD and STF tend to have opposite effects on network dynamics and complementary computational advantages, suggesting that the brain may employ a strategy of weighting them differentially depending on the computational purpose.
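
One way to probe the slow-decaying plateau numerically is sketched below in Python/NumPy: measure the time from stimulus offset until the peak firing rate falls below 1% of its value at offset, for increasing depression strengths. The parameter values are guesses and may need tuning to land in the plateau regime; the point is the measurement procedure rather than the specific numbers.

import numpy as np

# Measure how long the network stays active after the stimulus is switched
# off, for several depression strengths.  All values are assumptions.
N, a, tau, J0 = 128, 0.5, 1.0, 1.0
k = 0.12                          # guessed to lie just above the bump-stability threshold
tau_d = 50.0
x  = np.linspace(-np.pi, np.pi, N, endpoint=False)
dx = 2 * np.pi / N
wrap = lambda d: (d + np.pi) % (2 * np.pi) - np.pi
J = J0 / (np.sqrt(2 * np.pi) * a) * np.exp(-wrap(x[:, None] - x[None, :])**2 / (2 * a**2))

def decay_time(beta, dt=0.05, t_max=500.0, t_off=10.0):
    """Time after stimulus offset until the peak rate drops below 1% of its offset value."""
    u, p, t, ref = np.zeros(N), np.ones(N), 0.0, None
    while t < t_max:
        I = 2.0 * np.exp(-x**2 / (4 * a**2)) if t < t_off else 0.0
        v = np.maximum(u, 0.0) ** 2
        r = v / (1.0 + k * dx * v.sum())
        u += dt * (-u + dx * J @ (p * r) + I) / tau
        p += dt * (1.0 - p - tau_d * beta * p * r) / tau_d
        t += dt
        if ref is None and t >= t_off:
            ref = r.max()                      # peak rate at stimulus offset
        elif ref is not None and r.max() < 0.01 * ref:
            return t - t_off
    return float("inf")                        # still active at the end of the run

for beta in (0.0, 0.005, 0.01, 0.02):
    print(f"beta = {beta:5.3f}   decay time after offset = {decay_time(beta):.1f}")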


Subject(s)
Memory/physiology , Neural Networks, Computer , Neuronal Plasticity/physiology , Neurons/physiology , Synapses/physiology , Brain/physiology , Computer Simulation , Nerve Net/physiology , Signal Processing, Computer-Assisted
8.
Neural Comput; 22(3): 752-792, 2010 Mar.
Article in English | MEDLINE | ID: mdl-19922292

ABSTRACT

Understanding how the dynamics of a neural network is shaped by the network structure and, consequently, how the network structure facilitates the functions implemented by the neural system is at the core of using mathematical models to elucidate brain functions. This study investigates the tracking dynamics of continuous attractor neural networks (CANNs). Due to the translational invariance of neuronal recurrent interactions, CANNs can hold a continuous family of stationary states. They form a continuous manifold in which the neural system is neutrally stable. We systematically explore how this property facilitates the tracking performance of a CANN, which is believed to have clear correspondence with brain functions. By using the wave functions of the quantum harmonic oscillator as the basis, we demonstrate how the dynamics of a CANN is decomposed into different motion modes, corresponding to distortions in the amplitude, position, width, or skewness of the network state. We then develop a perturbation approach that utilizes the dominating movement of the network's stationary states in the state space. This method allows us to approximate the network dynamics up to an arbitrary accuracy depending on the order of perturbation used. We quantify the distortions of a Gaussian bump during tracking and study their effects on tracking performance. Results are obtained on the maximum speed for a moving stimulus to be trackable and the reaction time for the network to catch up with an abrupt change in the stimulus.
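
A small numerical illustration of the mode decomposition described above, in Python/NumPy: the distortion of a Gaussian bump is projected onto the wave functions of the quantum harmonic oscillator, whose first four modes correspond to changes in height, position, width, and skewness. The distorted bump below is synthetic and its distortion amplitudes are arbitrary choices.

import numpy as np
from numpy.polynomial.hermite import hermval
from math import factorial

# Project the distortion of a Gaussian bump onto the wave functions of the
# quantum harmonic oscillator.  The shift, widening and skew amplitudes below
# are arbitrary illustrative choices.
a, N = 0.5, 512
x  = np.linspace(-np.pi, np.pi, N, endpoint=False)
dx = x[1] - x[0]

def mode(n, x, z=0.0):
    # n-th basis function: H_n((x - z) / (sqrt(2) a)) * exp(-(x - z)^2 / (4 a^2)), normalised.
    y = (x - z) / (np.sqrt(2) * a)
    c = np.zeros(n + 1)
    c[n] = 1.0
    norm = np.sqrt(np.sqrt(2 * np.pi) * a * 2**n * factorial(n))
    return hermval(y, c) * np.exp(-y**2 / 2) / norm

u0 = np.exp(-x**2 / (4 * a**2))                                                # reference bump
u  = np.exp(-(x - 0.05)**2 / (4 * a**2 * 1.06)) * (1 + 0.1 * np.tanh(x / a))   # shifted, widened, skewed

for n, name in enumerate(["height", "position", "width", "skew"]):
    coeff = np.sum((u - u0) * mode(n, x)) * dx          # numerical inner product
    print(f"mode {n} ({name}): coefficient = {coeff:+.4f}")

For mild distortions the low-order coefficients dominate, which is what makes a low-order truncation of this basis useful when approximating the tracking dynamics.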


Subject(s)
Neural Networks, Computer , Algorithms , Computer Simulation , Motion Perception , Normal Distribution , Reaction Time