Results 1 - 20 of 22
1.
Phys Rev X ; 12(1)2022.
Article in English | MEDLINE | ID: mdl-35923858

ABSTRACT

Cortical neurons are characterized by irregular firing and a broad distribution of rates. The balanced state model explains these observations with a cancellation of mean excitatory and inhibitory currents, which makes fluctuations drive firing. In networks of neurons with current-based synapses, the balanced state emerges dynamically if coupling is strong, i.e., if the mean number of synapses per neuron K is large and synaptic efficacy is of the order of 1/√K. When synapses are conductance-based, current fluctuations are suppressed when coupling is strong, questioning the applicability of the balanced state idea to biological neural networks. We analyze networks of strongly coupled conductance-based neurons and show that asynchronous irregular activity and broad distributions of rates emerge if synaptic efficacy is of the order of 1/log(K). In such networks, unlike in the standard balanced state model, current fluctuations are small and firing is maintained by a drift-diffusion balance. This balance emerges dynamically, without fine-tuning, if inputs are smaller than a critical value, which depends on synaptic time constants and coupling strength, and is significantly more robust to connection heterogeneities than the classical balanced state model. Our analysis makes experimentally testable predictions of how the network response properties should evolve as input increases.
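The classical 1/√K cancellation referenced above can be illustrated numerically: with excitation and inhibition nearly balanced, the mean input stays small while the input fluctuations remain of order one as K grows. A minimal sketch (all parameter values are illustrative, not taken from the paper):

```python
import math

def input_stats(K, g=1.05, j0=1.0, rate=0.01):
    """Mean and SD of the summed synaptic input to one neuron (arbitrary units).

    K excitatory and K inhibitory synapses, each of efficacy j0/sqrt(K), with
    inhibition g times stronger -- the classic 1/sqrt(K) balanced scaling.
    """
    j = j0 / math.sqrt(K)
    mean = K * j * rate - K * g * j * rate          # E and I nearly cancel
    var = K * j**2 * rate + K * (g * j)**2 * rate   # variances add
    return mean, math.sqrt(var)

for K in (100, 1000, 10000):
    mu, sigma = input_stats(K)
    print(f"K={K:6d}  mean={mu:+.4f}  sd={sigma:.4f}")
```

The standard deviation is independent of K (since K·j² = j0²), so firing stays fluctuation-driven at arbitrarily strong coupling.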

2.
Neuroscience ; 323: 43-61, 2016 May 26.
Article in English | MEDLINE | ID: mdl-25862587

ABSTRACT

Synaptic plasticity is the capacity of a preexisting connection between two neurons to change in strength as a function of neural activity. Because synaptic plasticity is the major candidate mechanism for learning and memory, elucidating its constituent mechanisms is of crucial importance for many aspects of normal and pathological brain function. In particular, a prominent aspect that remains debated is how the plasticity mechanisms, which span a broad spectrum of temporal and spatial scales, come to play together in a concerted fashion. Here we review and discuss evidence that points to a possible non-neuronal, glial candidate for such orchestration: the regulation of synaptic plasticity by astrocytes.


Subject(s)
Astrocytes/physiology , Neuronal Plasticity/physiology , Animals , Humans , Neurons/physiology , Synapses/physiology , Synaptic Transmission/physiology
3.
J Exp Bot ; 63(5): 2217-30, 2012 Mar.
Article in English | MEDLINE | ID: mdl-22223812

ABSTRACT

A novel category of major intrinsic proteins sharing only weak similarity with previously identified aquaporin subfamilies was recently identified in land plants and named X (for unrecognized) intrinsic proteins (XIPs). Because XIPs are still ranked as uncharacterized proteins, their further molecular characterization is required. Herein, a systematic fine-scale analysis of XIP sequences found in flowering plant databases revealed that XIPs fall into at least five groups. The phylogenetic relationship of these five groups with the phylogenetic organization of angiosperms revealed an original pattern of evolution for the XIP subfamily, through distinct angiosperm taxon-specific clades. Of all flowering plants having XIPs, the genus Populus encompasses the broadest panel and the highest polymorphism of XIP isoforms, with nine PtXIP sequences distributed within three XIP groups. Comprehensive PtXIP gene expression patterns showed that only two isoforms (PtXIP2;1 and PtXIP3;2) were transcribed in vegetative tissues. However, their patterns contrast: PtXIP2;1 accumulated ubiquitously, whereas PtXIP3;2 was predominantly detected in wood and, to a lesser extent, in roots. Furthermore, only PtXIP2;1 exhibited differential expression in leaves and stems of drought-, salicylic acid-, or wounding-challenged plants. Unexpectedly, the PtXIPs displayed different abilities to alter water transport upon expression in Xenopus laevis oocytes: PtXIP2;1 and PtXIP3;3 transported water, while the other PtXIPs did not.


Subject(s)
Aquaporins/genetics , Evolution, Molecular , Magnoliopsida/genetics , Phylogeny , Polymorphism, Genetic/genetics , Populus/genetics , Amino Acid Sequence , Animals , Aquaporins/classification , Aquaporins/metabolism , Biological Transport , Droughts , Environment , Gene Expression Regulation, Plant/physiology , Magnoliopsida/metabolism , Magnoliopsida/physiology , Molecular Sequence Data , Multigene Family , Organ Specificity , Plant Leaves/genetics , Plant Leaves/metabolism , Plant Leaves/physiology , Plant Proteins/classification , Plant Proteins/genetics , Plant Proteins/metabolism , Plant Roots/genetics , Plant Roots/metabolism , Plant Roots/physiology , Plant Stems/genetics , Plant Stems/metabolism , Plant Stems/physiology , Populus/metabolism , Populus/physiology , Protein Isoforms , Sequence Alignment , Water/metabolism , Wood/genetics , Wood/metabolism , Wood/physiology , Xenopus laevis/genetics , Xenopus laevis/metabolism
4.
J Comput Neurosci ; 11(1): 63-85, 2001.
Article in English | MEDLINE | ID: mdl-11524578

ABSTRACT

Experimental evidence suggests that the maintenance of an item in working memory is achieved through persistent activity in selective neural assemblies of the cortex. To understand the mechanisms underlying this phenomenon, it is essential to investigate how persistent activity is affected by external inputs or neuromodulation. We have addressed these questions using a recurrent network model of object working memory. Recurrence is dominated by inhibition, although persistent activity is generated through recurrent excitation in small subsets of excitatory neurons. Our main findings are as follows. (1) Because of the strong feedback inhibition, persistent activity shows an inverted U shape as a function of increased external drive to the network. (2) A transient external excitation can switch off a network from a selective persistent state to its spontaneous state. (3) The maintenance of the sample stimulus in working memory is not affected by intervening stimuli (distractors) during the delay period, provided the stimulation intensity is not large. On the other hand, if stimulation intensity is large enough, distractors disrupt sample-related persistent activity, and the network is able to maintain a memory only of the last shown stimulus. (4) A concerted modulation of GABA(A) and NMDA conductances leads to a decrease of spontaneous activity but an increase of persistent activity; the enhanced signal-to-noise ratio is shown to increase the resistance of the network to distractors. (5) Two mechanisms are identified that produce an inverted U shaped dependence of persistent activity on modulation. The present study therefore points to several mechanisms that enhance the signal-to-noise ratio in working memory states. These mechanisms could be implemented in the prefrontal cortex by dopaminergic projections from the midbrain.


Subject(s)
Afferent Pathways/physiology , Cerebral Cortex/physiology , Memory, Short-Term/physiology , Models, Neurological , Nerve Net/physiology , Neural Inhibition/physiology , Neurons/physiology , Action Potentials/drug effects , Action Potentials/physiology , Afferent Pathways/cytology , Afferent Pathways/drug effects , Cerebral Cortex/cytology , Cerebral Cortex/drug effects , Dopamine/metabolism , Excitatory Postsynaptic Potentials/physiology , Feedback , Memory, Short-Term/drug effects , Nerve Net/cytology , Nerve Net/drug effects , Neural Inhibition/drug effects , Neurons/cytology , Neurons/drug effects , Pattern Recognition, Visual/physiology , Photic Stimulation/methods , Psychomotor Performance/physiology , Receptors, AMPA/drug effects , Receptors, AMPA/metabolism , Receptors, Dopamine/drug effects , Receptors, Dopamine/metabolism , Receptors, GABA-A/drug effects , Receptors, GABA-A/metabolism , Receptors, N-Methyl-D-Aspartate/drug effects , Receptors, N-Methyl-D-Aspartate/metabolism , Synapses/drug effects , Synapses/physiology , Synaptic Transmission/drug effects , Synaptic Transmission/physiology
5.
Phys Rev Lett ; 86(10): 2186-9, 2001 Mar 05.
Article in English | MEDLINE | ID: mdl-11289886

ABSTRACT

Noise can have a significant impact on the response dynamics of a nonlinear system. For neurons, the primary source of noise comes from background synaptic input activity. If this is approximated as white noise, the amplitude of the modulation of the firing rate in response to an input current oscillating at frequency ω decreases as 1/√ω and lags the input by 45° in phase. However, if filtering due to realistic synaptic dynamics is included, the firing rate is modulated by a finite amount even in the limit ω → ∞, and the phase lag is eliminated. Thus, through its effect on noise inputs, realistic synaptic dynamics can ensure unlagged neuronal responses to high-frequency inputs.
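The high-frequency behaviour described in this abstract (gain falling as 1/√ω with a 45° phase lag) follows from a transfer function that, up to a prefactor, behaves asymptotically as 1/√(iωτ). A hypothetical sketch of that asymptotic form, with an assumed time constant:

```python
import cmath
import math

def white_noise_gain(omega, tau=0.02):
    """High-frequency rate response of a neuron with white-noise input
    (asymptotic form only, up to an overall prefactor).

    Returns (gain, phase lag in degrees): gain falls as 1/sqrt(omega*tau)
    and the lag approaches 45 degrees, as stated in the abstract.
    """
    h = 1.0 / cmath.sqrt(1j * omega * tau)
    return abs(h), -math.degrees(cmath.phase(h))

for w in (10.0, 100.0, 1000.0):
    gain, lag = white_noise_gain(w)
    print(f"omega={w:7.1f}  gain={gain:.4f}  lag={lag:.1f} deg")
```

Quadrupling ω halves the gain while the 45° lag is frequency-independent; with synaptic filtering of the noise, the paper's point is that this decay and lag disappear at high frequency.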


Subject(s)
Models, Neurological , Neurons/physiology , Synapses/physiology , Action Potentials/physiology , Mathematical Computing , Synaptic Transmission/physiology
6.
Network ; 11(4): 261-80, 2000 Nov.
Article in English | MEDLINE | ID: mdl-11128167

ABSTRACT

Neurophysiological experiments indicate that working memory of an object is maintained by the persistent activity of cells in the prefrontal cortex and infero-temporal cortex of the monkey. This paper considers a cortical network model in which this persistent activity appears due to recurrent synaptic interactions. The conditions under which the magnitudes of spontaneous and persistent activity are close to one another (as is found empirically) are investigated using a simplified mean-field description in which firing rates in these states are given by the intersections of a straight line with the f-I curve of a single pyramidal cell. The present analysis relates a network phenomenon - persistent activity in a 'working memory' state - to single-cell data which are accessible to experiment. It predicts that, in networks of the cerebral cortex in which persistent activity phenomena are observed, average synaptic inputs in both spontaneous and persistent activity should bring the cells close to firing threshold. Cells should be slightly sub-threshold in spontaneous activity, and slightly supra-threshold in persistent activity. The results are shown to be robust to the inclusion of inhomogeneities that produce wide distributions of firing rates, in both spontaneous and working memory states.
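The graphical self-consistency described here, with rates given by intersections of a straight line with the single-cell f-I curve, can be sketched numerically: self-consistent rates solve ν = f(I_ext + w·ν), equivalently crossings of the line ν = (I − I_ext)/w with f(I). The sigmoidal f-I curve and all parameter values below are assumptions for illustration, not the paper's:

```python
import math

def f_I(I, theta=1.0, beta=4.0, nu_max=100.0):
    """Assumed sigmoidal single-cell f-I curve (spikes/s)."""
    return nu_max / (1.0 + math.exp(-beta * (I - theta)))

def fixed_points(I_ext, w, n=100000):
    """Rates where the line nu = (I - I_ext)/w crosses the f-I curve,
    i.e. sign changes of f_I(I_ext + w*nu) - nu on a fine grid."""
    roots, prev = [], None
    for k in range(n + 1):
        nu = 100.0 * k / n
        d = f_I(I_ext + w * nu) - nu
        if prev is not None and (prev > 0) != (d > 0):
            roots.append(nu)
        prev = d
    return roots

print(fixed_points(I_ext=0.2, w=0.02))
```

With these values there are three crossings: a low stable rate (spontaneous activity), an unstable middle branch, and a high stable rate (persistent activity), matching the bistability the abstract describes.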


Subject(s)
Memory, Short-Term/physiology , Models, Neurological , Nerve Net/physiology , Neural Pathways/physiology , Neurons/physiology , Prefrontal Cortex/physiology , Temporal Lobe/physiology , Action Potentials/physiology , Animals , Humans , Interneurons/cytology , Interneurons/physiology , Nerve Net/cytology , Neural Pathways/cytology , Neurons/cytology , Prefrontal Cortex/cytology , Pyramidal Cells/cytology , Pyramidal Cells/physiology , Reaction Time/physiology , Synapses/physiology , Temporal Lobe/cytology , Visual Perception/physiology
7.
Cereb Cortex ; 10(9): 910-23, 2000 Sep.
Article in English | MEDLINE | ID: mdl-10982751

ABSTRACT

Single-neuron recordings from behaving primates have established a link between working memory processes and information-specific neuronal persistent activity in the prefrontal cortex. Using a network model endowed with a columnar architecture and based on the physiological properties of cortical neurons and synapses, we have examined the synaptic mechanisms of selective persistent activity underlying spatial working memory in the prefrontal cortex. Our model reproduces the phenomenology of the oculomotor delayed-response experiment of Funahashi et al. (S. Funahashi, C.J. Bruce and P.S. Goldman-Rakic, Mnemonic coding of visual space in the monkey's dorsolateral prefrontal cortex. J Neurophysiol 61:331-349, 1989). To observe stable spontaneous and persistent activity, we find that recurrent synaptic excitation should be primarily mediated by NMDA receptors, and that overall recurrent synaptic interactions should be dominated by inhibition. Isodirectional tuning of adjacent pyramidal cells and interneurons can be accounted for by a structured pyramid-to-interneuron connectivity. Robust memory storage against random drift of the tuned persistent activity and against distractors (intervening stimuli during the delay period) may be enhanced by neuromodulation of recurrent synapses. Experimentally testable predictions concerning the neural basis of working memory are discussed.


Subject(s)
Memory, Short-Term/physiology , Models, Neurological , Prefrontal Cortex/physiology , Synapses/physiology , Animals , Attention/physiology , Behavior, Animal/physiology , Haplorhini , Interneurons/physiology , Neural Inhibition/physiology , Prefrontal Cortex/chemistry , Prefrontal Cortex/cytology , Pyramidal Cells/physiology , Receptors, AMPA/analysis , Receptors, AMPA/physiology , Receptors, N-Methyl-D-Aspartate/analysis , Receptors, N-Methyl-D-Aspartate/physiology
8.
J Comput Neurosci ; 8(3): 183-208, 2000.
Article in English | MEDLINE | ID: mdl-10809012

ABSTRACT

The dynamics of networks of sparsely connected excitatory and inhibitory integrate-and-fire neurons are studied analytically. The analysis reveals a rich repertoire of states, including synchronous states in which neurons fire regularly; asynchronous states with stationary global activity and very irregular individual cell activity; and states in which the global activity oscillates but individual cells fire irregularly, typically at rates lower than the global oscillation frequency. The network can switch between these states, provided the external frequency, or the balance between excitation and inhibition, is varied. Two types of network oscillations are observed. In the fast oscillatory state, the network frequency is almost fully controlled by the synaptic time scale. In the slow oscillatory state, the network frequency depends mostly on the membrane time constant. Finite size effects in the asynchronous state are also discussed.
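A minimal simulation in the spirit of such sparse excitatory-inhibitory integrate-and-fire networks can reproduce asynchronous irregular firing. This sketch assumes NumPy is available; parameter values are illustrative guesses, not the paper's, and delta-pulse synapses without delays are a simplification:

```python
import numpy as np

rng = np.random.default_rng(0)

NE, NI = 400, 100          # excitatory / inhibitory counts
N = NE + NI
eps = 0.1                  # connection probability (sparse coupling)
J, g = 0.2, 5.0            # EPSP size (mV); relative strength of inhibition
tau_m, theta, V_r = 20.0, 20.0, 10.0   # membrane time constant (ms), threshold, reset (mV)
mu_ext, sigma_ext = 24.0, 1.0          # suprathreshold external drive + noise (mV)
dt, T = 0.1, 100.0         # time step and duration (ms)

# W[post, pre]: J for excitatory presynaptic neurons, -g*J for inhibitory ones
W = np.where(rng.random((N, N)) < eps, J, 0.0)
W[:, NE:] *= -g
np.fill_diagonal(W, 0.0)

V = rng.uniform(V_r, theta, N)
n_spikes = 0
for _ in range(int(T / dt)):
    V += dt / tau_m * (mu_ext - V) + sigma_ext * np.sqrt(dt) * rng.standard_normal(N)
    fired = V >= theta
    V[fired] = V_r
    V += W @ fired          # delta-pulse recurrent input from this step's spikes
    n_spikes += int(fired.sum())

rate = n_spikes / N / (T / 1000.0)   # mean rate in spikes/s
print(f"mean firing rate ~ {rate:.1f} spikes/s")
```

With inhibition dominating the recurrence (CI·g·J > CE·J here), individual neurons fire irregularly while the population rate stays stationary, the asynchronous regime the abstract describes; varying the external drive or g moves the network between the listed states.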


Subject(s)
Brain/physiology , Excitatory Postsynaptic Potentials/physiology , Nerve Net/physiology , Neural Pathways/physiology , Neurons/physiology , Action Potentials/physiology , Biological Clocks/physiology , Brain/cytology , Cortical Synchronization/methods , Electric Stimulation , Hippocampus/cytology , Hippocampus/physiology , Interneurons/cytology , Interneurons/physiology , Linear Models , Models, Neurological , Neural Inhibition/physiology , Neural Pathways/cytology , Neurons/cytology , Pyramidal Cells/cytology , Pyramidal Cells/physiology , Synapses/physiology , Synapses/ultrastructure , Time Factors
9.
J Physiol Paris ; 94(5-6): 445-63, 2000.
Article in English | MEDLINE | ID: mdl-11165912

ABSTRACT

Recent advances in the understanding of the dynamics of populations of spiking neurones are reviewed. These studies shed light on how a population of neurones can follow arbitrary variations in input stimuli, how the dynamics of the population depends on the type of noise, and how recurrent connections influence the dynamics. The importance of inhibitory feedback for the generation of irregularity in single cell behaviour is emphasized. Examples of computation that recurrent networks with excitatory and inhibitory cells can perform are then discussed. Maintenance of a network state as an attractor of the system is discussed as a model for working memory function, in both object and spatial modalities. These models can be used to interpret and make predictions about electrophysiological data in the awake monkey.


Subject(s)
Cerebral Cortex/physiology , Memory/physiology , Models, Neurological , Nerve Net/physiology , Neurons/physiology , Pyramidal Cells/physiology , Acoustic Stimulation , Animals , Evoked Potentials , Excitatory Postsynaptic Potentials , Feedback , Humans , Interneurons/physiology , Membrane Potentials , Space Perception , Synapses/physiology
10.
Neural Comput ; 11(7): 1621-71, 1999 Oct 01.
Article in English | MEDLINE | ID: mdl-10490941

ABSTRACT

We study analytically the dynamics of a network of sparsely connected inhibitory integrate-and-fire neurons in a regime where individual neurons emit spikes irregularly and at a low rate. In the limit where the number of neurons N → ∞, the network exhibits a sharp transition between a stationary and an oscillatory global activity regime in which neurons are weakly synchronized. The activity becomes oscillatory when the inhibitory feedback is strong enough. The period of the global oscillation is found to be mainly controlled by synaptic times but depends also on the characteristics of the external input. In large but finite networks, the analysis shows that global oscillations of finite coherence time generically exist both above and below the critical inhibition threshold. Their characteristics are determined as functions of system parameters in these two different regions. The results are found to be in good agreement with numerical simulations.


Subject(s)
Neural Networks, Computer , Neurons/physiology , Algorithms , Computer Simulation , Electrophysiology , Linear Models , Models, Neurological , Nonlinear Dynamics , Synapses/physiology
11.
Network ; 9(1): 123-52, 1998 Feb.
Article in English | MEDLINE | ID: mdl-9861982

ABSTRACT

We study unsupervised Hebbian learning in a recurrent network in which synapses have a finite number of stable states. Stimuli received by the network are drawn at random at each presentation from a set of classes. Each class is defined as a cluster in stimulus space, centred on the class prototype. The presentation protocol is chosen to mimic the protocols of visual memory experiments in which a set of stimuli is presented repeatedly in a random way. The statistics of the input stream may be stationary, or changing. Each stimulus induces, in a stochastic way, transitions between stable synaptic states. Learning dynamics is studied analytically in the slow learning limit, in which a given stimulus has to be presented many times before it is memorized, i.e. before synaptic modifications enable a pattern of activity correlated with the stimulus to become an attractor of the recurrent network. We show that in this limit the synaptic matrix becomes more correlated with the class prototypes than with any of the instances of the class. We also show that the number of classes that can be learned increases sharply when the coding level decreases, and determine the speeds of learning and forgetting of classes in the case of changes in the statistics of the input stream.
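A toy version of such learning with discrete synaptic states, reduced here to binary synapses and a single class, shows the abstract's key point: with slow stochastic transitions, the stored synaptic state becomes more correlated with the class prototype than with any single noisy instance. All parameters and the update rule below are illustrative simplifications:

```python
import random

rng = random.Random(0)
n = 2000                     # number of synapses
q = 0.02                     # transition probability per presentation (slow learning)
noise = 0.1                  # fraction of prototype bits flipped in each instance

prototype = [rng.random() < 0.5 for _ in range(n)]
syn = [0] * n                # binary synaptic states (two stable states)

for _ in range(500):         # repeated presentations of noisy instances
    instance = [b ^ (rng.random() < noise) for b in prototype]
    last = instance
    for i, bit in enumerate(instance):
        if rng.random() < q:             # stochastic state-to-state transition
            syn[i] = 1 if bit else 0

overlap_proto = sum(s == b for s, b in zip(syn, prototype)) / n
overlap_last = sum(s == b for s, b in zip(syn, last)) / n
print(f"overlap with prototype: {overlap_proto:.3f}, with last instance: {overlap_last:.3f}")
```

Each synapse ends up reflecting the instance bit at its last update, which matches the prototype with probability 1 − noise; averaging over independent noise makes the synaptic vector closer to the prototype than to any one instance.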


Subject(s)
Generalization, Stimulus/physiology , Neural Networks, Computer , Stochastic Processes , Pattern Recognition, Visual/physiology , Time Factors
12.
Network ; 9(2): 207-17, 1998 May.
Article in English | MEDLINE | ID: mdl-9861986

ABSTRACT

We prove that maximization of mutual information between the output and the input of a feedforward neural network leads to full redundancy reduction under the following sufficient conditions: (i) the input signal is a (possibly nonlinear) invertible mixture of independent components; (ii) there is no input noise; (iii) the activity of each output neuron is a (possibly) stochastic variable with a probability distribution depending on the stimulus through a deterministic function of the inputs (where both the probability distributions and the functions can be different from neuron to neuron); (iv) optimization of the mutual information is performed over all these deterministic functions. This result extends that obtained by Nadal and Parga (1994) who considered the case of deterministic outputs.


Subject(s)
Information Theory , Neural Networks, Computer , Nonlinear Dynamics , Stochastic Processes , Feedback/physiology , Neurons/physiology
13.
J Theor Biol ; 195(1): 87-95, 1998 Nov 07.
Article in English | MEDLINE | ID: mdl-9802952

ABSTRACT

We consider a model of an integrate-and-fire neuron with synaptic current dynamics, in which the synaptic time constant τ′ is much smaller than the membrane time constant τ. We calculate analytically the firing frequency of such a neuron for inputs described by a random Gaussian process. We find that the first-order correction to the frequency due to τ′ is proportional to the square root of the ratio between these time constants, √(τ′/τ). This implies that the correction is important even when the synaptic time constant is small compared with that of the potential. The frequency of a neuron with τ′ > 0 can be reduced to that of the basic integrate-and-fire neuron (corresponding to τ′ = 0) using an "effective" threshold which has a linear dependence on √(τ′/τ). Numerical simulations show very good agreement with the analytical result and permit an extrapolation of the "effective" threshold to higher orders in √(τ′/τ). The obtained frequency agrees with simulation data for a wide range of parameters.
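The effective-threshold idea can be used concretely with the standard diffusion ("Siegert") formula for the firing rate, shifting the threshold upward by an amount proportional to σ√(τ′/τ). This is a sketch: the prefactor α ≈ 1.46 is the value commonly quoted for this correction in later work and is an assumption here, not taken from the abstract.

```python
import math

def siegert_rate(mu, sigma, theta=20.0, v_r=10.0, tau=20.0, tau_rp=2.0, n=400):
    """Mean firing rate of an integrate-and-fire neuron driven by white noise
    (standard diffusion formula); rate is in spikes/ms when times are in ms."""
    a, b = (v_r - mu) / sigma, (theta - mu) / sigma
    # trapezoid rule for the integral of exp(u^2) * (1 + erf(u)) over [a, b]
    us = [a + (b - a) * k / n for k in range(n + 1)]
    f = [math.exp(u * u) * (1.0 + math.erf(u)) for u in us]
    integral = (b - a) / n * (sum(f) - 0.5 * (f[0] + f[-1]))
    return 1.0 / (tau_rp + tau * math.sqrt(math.pi) * integral)

ALPHA = 1.46  # assumed prefactor for the sqrt(tau_s/tau_m) threshold shift

def rate_with_synapse(mu, sigma, tau_s, tau=20.0, theta=20.0, **kw):
    """First-order effect of synaptic filtering as an effective threshold."""
    theta_eff = theta + ALPHA / 2.0 * sigma * math.sqrt(tau_s / tau)
    return siegert_rate(mu, sigma, theta=theta_eff, tau=tau, **kw)

r0 = siegert_rate(22.0, 5.0)
r1 = rate_with_synapse(22.0, 5.0, tau_s=2.0)
print(f"rate without filtering: {1000*r0:.1f} Hz, with filtering: {1000*r1:.1f} Hz")
```

Even a synaptic time constant one-tenth of the membrane time constant shifts the threshold by a non-negligible √(0.1) ≈ 0.32 factor of σ, which is the abstract's point about the √(τ′/τ) correction remaining important.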


Subject(s)
Computer Simulation , Models, Neurological , Neurons/physiology , Synaptic Transmission , Animals , Membrane Potentials/physiology
14.
C R Acad Sci III ; 321(2-3): 249-52, 1998.
Article in English | MEDLINE | ID: mdl-9759349

ABSTRACT

In this paper we summarize some of the main contributions of models of recurrent neural networks with associative memory properties. We compare the behavior of these attractor neural networks with empirical data from both physiology and psychology. This type of network could be used in models with more complex functions.


Subject(s)
Learning/physiology , Memory/physiology , Models, Neurological , Models, Psychological , Nerve Net/physiology , Humans , Neuronal Plasticity/physiology
16.
Neural Comput ; 10(7): 1731-57, 1998 Oct 01.
Article in English | MEDLINE | ID: mdl-9744895

ABSTRACT

In the context of parameter estimation and model selection, it is only quite recently that a direct link between the Fisher information and information-theoretic quantities has been exhibited. We give an interpretation of this link within the standard framework of information theory. We show that in the context of population coding, the mutual information between the activity of a large array of neurons and a stimulus to which the neurons are tuned is naturally related to the Fisher information. In the light of this result, we consider the optimization of the tuning curves parameters in the case of neurons responding to a stimulus represented by an angular variable.
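For independent Poisson neurons, the population Fisher information that enters this mutual-information relation has the closed form J(θ) = Σ_i f_i′(θ)²/f_i(θ). A sketch with assumed von Mises tuning curves (the tuning model and parameters are illustrative assumptions, not the paper's):

```python
import math

def fisher_info(theta, prefs, fmax=10.0, kappa=2.0):
    """Population Fisher information for independent Poisson neurons with
    von Mises tuning f_i(theta) = fmax * exp(kappa * (cos(theta - c_i) - 1)):
    J(theta) = sum_i f_i'(theta)**2 / f_i(theta)."""
    J = 0.0
    for c in prefs:
        f = fmax * math.exp(kappa * (math.cos(theta - c) - 1.0))
        f_prime = -kappa * math.sin(theta - c) * f
        J += f_prime ** 2 / f
    return J

N = 64  # preferred angles evenly tiling the circle
prefs = [2.0 * math.pi * i / N for i in range(N)]
print(f"J(0) = {fisher_info(0.0, prefs):.2f}")
```

With preferred angles tiling the circle evenly, J(θ) is essentially constant in θ, so tuning-curve parameters such as κ can be optimized for the whole population at once.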


Subject(s)
Cell Communication/physiology , Models, Neurological , Neurons/physiology , Action Potentials/physiology , Information Theory
17.
Hippocampus ; 8(6): 651-65, 1998.
Article in English | MEDLINE | ID: mdl-9882023

ABSTRACT

We propose a computational model of the CA3 region of the rat hippocampus that is able to reproduce the available experimental data concerning the dependence of directional selectivity of the place cell discharge on the environment and on the spatial task. The main feature of our model is a continuous, unsupervised Hebbian learning dynamics of recurrent connections, which is driven by the neuronal activities imposed upon the network by the environment-dependent external input. In our simulations, the environment and the movements of the rat are chosen to mimic those commonly observed in neurophysiological experiments. The environment is represented as local views that depend on both the position and the heading direction of the rat. We hypothesize that place cells are intrinsically directional, that is, they respond to local views. We show that the synaptic dynamics in the recurrent neural network rapidly modify the discharge correlates of the place cells: Cells tend to become omnidirectional place cells in open fields, while their directionality tends to get stronger in radial-arm mazes. We also find that the synaptic learning mechanisms account for other properties of place cell activity, such as an increase in the place cell peak firing rates as well as clustering of place fields during exploration. Our model makes several experimental predictions that can be tested using current techniques.


Subject(s)
Hippocampus/physiology , Maze Learning/physiology , Nerve Net/physiology , Neuronal Plasticity/physiology , Neurons/physiology , Animals , Computer Simulation , Exploratory Behavior , Mathematics , Models, Neurological , Neural Networks, Computer , Rats , Synapses/physiology
18.
Brain Res Cogn Brain Res ; 5(4): 273-82, 1997 Jun.
Article in English | MEDLINE | ID: mdl-9197514

ABSTRACT

The time to locate a difference between two artificial images presented side by side on a CRT screen was studied as a function of their complexity. The images were square lattices of black or white squares or quadrangles, in some cases delineated by a blue grid. Each pair differed at a single position, chosen at random. For images of size N × N, the median reaction time varied as cN², from N = 3-15, with c around 50 ms in the absence of a grid (i.e., when the quadrangles were associated into continuous shapes). For N ≤ 9, when the lattice was made irregular, performance did not deteriorate, up to a rather high level of irregularity. Furthermore, the presence of uncorrelated distortions in the left and right images did not affect performance for N ≤ 6. In the presence of a grid, the reaction times were on average higher by 20%. Taken together, the results indicate that the detection of differences does not proceed on a point-by-point basis and must be mediated by some abstract shape analysis, in agreement with current views on short-term visual memory (e.g., Phillips, W.A., On the distinction between sensory storage and short-term visual memory, Percept. Psychophys., 16 (1974) 283-290). In complementary experiments, the subjects had to judge whether two images presented side by side were the same or different, with N varying from 1 to 5. For N < 3, the "same" and "different" responses were similar in all their statistical aspects. For N ≥ 4, the "same" responses took significantly longer than the "different" responses and were accompanied by a significant increase in errors. The qualitative change from N = 3 to N = 4 is interpreted as a shift from a "single inspection" analysis to an obligatory scanning procedure. On the whole, we suggest that visual information in our simultaneous comparison task is extracted in chunks of about 12 ± 3 bits, and that the visual processing and matching tasks take about 50 ms per pair of quadrangles. In Section 4, we compare these values to the values obtained through other experimental paradigms.


Subject(s)
Neuropsychological Tests , Pattern Recognition, Visual , Humans , Photic Stimulation/methods , Reaction Time , Time Factors
19.
Cereb Cortex ; 7(3): 237-52, 1997.
Article in English | MEDLINE | ID: mdl-9143444

ABSTRACT

We investigate self-sustaining stable states (attractors) in networks of integrate-and-fire neurons. First, we study the stability of spontaneous activity in an unstructured network. It is shown that the stochastic background activity, of 1-5 spikes/s, is unstable if all neurons are excitatory. On the other hand, spontaneous activity becomes self-stabilizing in the presence of local inhibition, given reasonable values of the network parameters. Second, in a network sustaining physiological spontaneous rates, we study the effect of learning in a local module, expressed in synaptic modifications in specific populations of synapses. We find that if the average synaptic potentiation (LTP) is too low, no stimulus-specific activity manifests itself in the delay period. Instead, following the presentation and removal of any stimulus there is, in the local module, a delay activity in which all neurons selective (responding visually) to any of the stimuli presented for learning have rates that gradually increase with the amplitude of synaptic potentiation. When the average LTP increases beyond a critical value, specific local attractors (stable states) appear abruptly against the background of the global uniform spontaneous attractor. In this case the local module has two available types of collective delay activity: if the stimulus is unfamiliar, the activity is spontaneous; if it is similar to a learned stimulus, delay activity is selective. These new attractors reflect the synaptic structure developed during learning. In each of them a small population of neurons has elevated rates, which depend on the strength of LTP. The remaining neurons of the module have their activity at spontaneous rates. The predictions made in this paper could be checked by single-unit recordings in delayed-response experiments.


Subject(s)
Cerebral Cortex/physiology , Neural Networks, Computer , Cerebral Cortex/cytology , Electrophysiology , Feedback/physiology , Long-Term Potentiation/physiology , Models, Neurological , Motor Activity/physiology , Neurons/physiology , Neurons, Afferent/physiology , Poisson Distribution , Synaptic Membranes/physiology
20.
Neural Comput ; 8(8): 1677-710, 1996 Nov 15.
Article in English | MEDLINE | ID: mdl-8888613

ABSTRACT

Single-electrode recordings in the inferotemporal cortex of monkeys during delayed visual memory tasks provide evidence for attractor dynamics in the observed region. The persistent elevated delay activities could be internal representations of features of the learned visual stimuli shown to the monkey during training. When uncorrelated stimuli are presented during training in a fixed sequence, these experiments display significant correlations between the internal representations. Recently, a simple attractor neural network model has reproduced quantitatively the measured correlations. An underlying assumption of the model is that the synaptic matrix formed during the training phase contains in its efficacies information about the contiguity of persistent stimuli in the training sequence. We present here a simple unsupervised learning dynamics that produces such a synaptic matrix if sequences of stimuli are repeatedly presented to the network in a fixed order. The resulting matrix is then shown to convert temporal correlations during training into spatial correlations between attractors. The scenario is that, in the presence of selective delay activity, at the presentation of each stimulus the activity distribution in the neural assembly contains information about both the current stimulus and the previous one (carried by the attractor). Thus the recurrent synaptic matrix can code not only for each of the stimuli presented to the network but also for their context. We combine the idea that, for learning to be effective, synaptic modification should be stochastic, with the fact that attractors provide learnable information about two consecutive stimuli. We calculate explicitly the probability distribution of synaptic efficacies as a function of the training protocol, that is, the order in which stimuli are presented to the network. We then solve for the dynamics of a network composed of integrate-and-fire excitatory and inhibitory neurons with a matrix of synaptic collaterals resulting from the learning dynamics. The network has stable spontaneous activity, and stable delay activity develops after a critical learning stage. The availability of a learning dynamics makes possible a number of experimental predictions for the dependence of the delay-activity distributions, and the correlations between them, on the learning stage and the learning protocol. In particular, it makes specific predictions for pair-associate delay experiments.


Subject(s)
Learning/physiology , Memory/physiology , Nerve Net/physiology , Reaction Time/physiology , Temporal Lobe/physiology , Animals , Haplorhini , Models, Neurological , Models, Psychological , Models, Statistical , Synapses/physiology