Results 1 - 11 of 11
1.
bioRxiv ; 2023 Jan 18.
Article in English | MEDLINE | ID: mdl-36711713

ABSTRACT

Categorization is a fundamental cognitive process by which the brain assigns stimuli to behaviorally meaningful groups. Investigations of visual categorization in primates have identified a hierarchy of cortical areas that are involved in the transformation of sensory information into abstract category representations. However, categorization behaviors are ubiquitous across diverse animal species, even those without a neocortex, motivating the possibility that subcortical regions may contribute to abstract cognition in primates. One candidate structure is the superior colliculus (SC), an evolutionarily conserved midbrain region that, although traditionally thought to mediate only reflexive spatial orienting, is involved in cognitive tasks that require spatial orienting. Here, we reveal a novel role of the primate SC in abstract, higher-order visual cognition. We compared neural activity in the SC and the posterior parietal cortex (PPC), a region previously shown to causally contribute to category decisions, while monkeys performed a visual categorization task in which they report their decisions with a hand movement. The SC exhibits stronger and shorter-latency category encoding than the PPC, and inactivation of the SC markedly impairs monkeys' category decisions. These results extend SC's established role in spatial orienting to abstract, non-spatial cognition.

2.
Neuron ; 97(6): 1219-1234, 2018 Mar 21.
Article in English | MEDLINE | ID: mdl-29566792

ABSTRACT

Throughout the history of modern neuroscience, the parietal cortex has been associated with a wide array of sensory, motor, and cognitive functions. The use of non-human primates as model organisms has been instrumental in our current understanding of how areas in the posterior parietal cortex (PPC) modulate our perception and influence our behavior. In this Perspective, we highlight a series of influential studies over the last five decades examining the role of the PPC in visual perception and motor planning. We also integrate long-standing views of PPC function with more recent evidence to propose a more general framework for the integrative sensory, motor, and cognitive functions of the PPC.


Subject(s)
Cognition/physiology , Nerve Net/physiology , Parietal Lobe/physiology , Psychomotor Performance/physiology , Visual Perception/physiology , Animals , Humans
3.
Elife ; 6, 2017 Apr 24.
Article in English | MEDLINE | ID: mdl-28418332

ABSTRACT

Decisions about the behavioral significance of sensory stimuli often require comparing sensory inferences about what we are looking at with internal models of what we are looking for. Here, we test how neuronal selectivity for visual features is transformed into decision-related signals in the posterior parietal cortex (area LIP). Monkeys performed a visual matching task that required them to detect target stimuli composed of conjunctions of color and motion direction. Neuronal recordings from area LIP revealed two main findings. First, the sequential processing of visual features and the selection of target stimuli suggest that LIP is involved in transforming sensory information into decision-related signals. Second, the patterns of color and motion selectivity and their impact on decision-related encoding suggest that LIP plays a role in detecting target stimuli by comparing bottom-up sensory inputs (what the monkeys were looking at) with top-down cognitive inputs (what the monkeys were looking for).


Subject(s)
Decision Making , Motion Perception , Parietal Lobe/physiology , Visual Perception , Animals , Electroencephalography , Macaca mulatta
4.
Neuron ; 91(4): 931-943, 2016 Aug 17.
Article in English | MEDLINE | ID: mdl-27499082

ABSTRACT

Lateral intraparietal (LIP) neurons encode a vast array of sensory and cognitive variables. Recently, we proposed that the flexibility of feature representations in LIP reflects the bottom-up integration of sensory signals from upstream feature-selective cortical neurons, modulated by feature-based attention (FBA). LIP activity is also strongly modulated by the position of space-based attention (SBA). However, the mechanisms by which SBA and FBA interact to facilitate the representation of task-relevant spatial and non-spatial features in LIP remain unclear. We recorded from LIP neurons while monkeys performed a task that required them to detect specific conjunctions of color, motion direction, and stimulus position. Here we show that FBA and SBA potentiate each other's effects in a manner consistent with attention gating the flow of visual information along the cortical visual pathway. Our results suggest that linear bottom-up integrative mechanisms allow LIP neurons to emphasize task-relevant spatial and non-spatial features.


Subject(s)
Attention/physiology , Color Perception/physiology , Motion Perception/physiology , Neurons/physiology , Parietal Lobe/cytology , Parietal Lobe/physiology , Animals , Macaca mulatta , Photic Stimulation , Space Perception/physiology
5.
J Neurosci ; 35(7): 3174-89, 2015 Feb 18.
Article in English | MEDLINE | ID: mdl-25698752

ABSTRACT

Despite ever-growing knowledge of how parietal and prefrontal neurons encode low-level spatial and color information or higher-level information, such as spatial attention, an understanding of how these cortical regions process neuronal information at the population level is still missing. A simple assumption would be that the function and temporal response profiles of these neuronal populations match those of their constituent individual cells. However, several recent studies suggest that this is not necessarily the case and that the single-cell approach overlooks dynamic changes in how information is distributed over the neuronal population. Here, we use a time-resolved population pattern analysis to explore how spatial position, spatial attention, and color information are differentially encoded and maintained in the macaque monkey prefrontal (frontal eye fields) and parietal (lateral intraparietal area) cortex. Overall, our work yields three novel observations. First, we show that parietal and prefrontal populations operate in two distinct regimes for the encoding of sensory and cognitive information: a stationary mode and a dynamic mode. Second, we show that the temporal dynamics of a heterogeneous neuronal population carry information complementary to that of its functional subpopulations; thus, both need to be investigated in parallel. Last, we show that identifying the neuronal configuration in which a population encodes given information can serve to reveal this same information in a different context. Altogether, this work challenges common views on neural coding in the parietofrontal network.
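To make the idea of a time-resolved population pattern analysis concrete, here is a minimal Python (scikit-learn) sketch of a cross-temporal decoding analysis, in which a classifier trained on population activity in one time bin is tested on every other bin. The data dimensions, the binary label, and the use of logistic regression are illustrative assumptions, not the study's actual pipeline.

```python
# Minimal sketch of cross-temporal population decoding (assumed data and decoder).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical data: trials x neurons x time bins of spike counts,
# plus one label per trial (e.g., cued spatial position).
n_trials, n_neurons, n_bins = 200, 60, 20
X = rng.poisson(5.0, size=(n_trials, n_neurons, n_bins)).astype(float)
y = rng.integers(0, 2, size=n_trials)

train_idx, test_idx = train_test_split(
    np.arange(n_trials), test_size=0.25, stratify=y, random_state=0)

# Cross-temporal generalization: train at time t, test at time t'.
# A diagonal-dominant accuracy matrix points to a dynamic code;
# broad off-diagonal generalization points to a stationary code.
accuracy = np.zeros((n_bins, n_bins))
for t_train in range(n_bins):
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    clf.fit(X[train_idx, :, t_train], y[train_idx])
    for t_test in range(n_bins):
        accuracy[t_train, t_test] = clf.score(X[test_idx, :, t_test], y[test_idx])

print(accuracy.round(2))
```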


Subject(s)
Attention/physiology , Color , Frontal Lobe/cytology , Neurons/physiology , Nonlinear Dynamics , Parietal Lobe/cytology , Space Perception/physiology , Action Potentials/physiology , Animals , Cues , Female , Macaca mulatta , Male , Models, Neurological , Photic Stimulation , ROC Curve , Reaction Time , Statistics, Nonparametric
6.
Neuron ; 83(6): 1468-80, 2014 Sep 17.
Article in English | MEDLINE | ID: mdl-25199703

ABSTRACT

The primate visual system consists of multiple hierarchically organized cortical areas, each specialized for processing distinct aspects of the visual scene. For example, color and form are encoded in ventral pathway areas such as V4 and inferior temporal cortex, while motion is preferentially processed in dorsal pathway areas such as the middle temporal area. Such representations often need to be integrated perceptually to solve tasks that depend on multiple features. We tested the hypothesis that the lateral intraparietal area (LIP) integrates disparate task-relevant visual features by recording from LIP neurons in monkeys trained to identify target stimuli composed of conjunctions of color and motion features. We show that LIP neurons exhibit integrative representations of both color and motion features when these features are task relevant, as well as task-dependent shifts of both direction and color tuning. This suggests that LIP plays a role in flexibly integrating task-relevant sensory signals.


Subject(s)
Neurons/physiology , Parietal Lobe/physiology , Visual Perception/physiology , Animals , Electrophysiology , Macaca mulatta , Male , Photic Stimulation
7.
PLoS One ; 9(1): e86314, 2014.
Article in English | MEDLINE | ID: mdl-24466019

ABSTRACT

Decoding neuronal information is important in neuroscience, both as a basic means of understanding how neuronal activity relates to cerebral function and as a processing stage in driving neuroprosthetic effectors. Here, we compare the readout performance of six commonly used classifiers at decoding two different variables encoded by the spiking activity of the non-human primate frontal eye fields (FEF): the spatial position of a visual cue and the instructed orientation of the animal's attention. While the first variable is exogenously driven by the environment, the second corresponds to the interpretation of the instruction conveyed by the cue; it is endogenously driven and corresponds to the output of internal cognitive operations performed on the visual attributes of the cue. These two variables were decoded using a regularized optimal linear estimator in its explicit formulation, an optimal linear artificial neural network estimator, a non-linear artificial neural network estimator, a non-linear naïve Bayesian estimator, a non-linear Reservoir recurrent network classifier, or a non-linear Support Vector Machine classifier. Our results suggest that endogenous information such as the orientation of attention can be decoded from the FEF with the same accuracy as exogenous visual information. However, the classifiers did not all behave equally well with respect to population size and heterogeneity, the number of available training and testing trials, the subject's behavior, and the temporal structure of the variable of interest. In most situations, the regularized optimal linear estimator and the non-linear Support Vector Machine classifier outperformed the other tested decoders.
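As a concrete illustration of this kind of decoder comparison, here is a minimal Python (scikit-learn) sketch that cross-validates two of the decoder families named above, a regularized linear classifier and a non-linear Support Vector Machine, on synthetic spike-count data. The data dimensions, labels, and hyperparameters are assumptions for illustration and do not reproduce the published analysis.

```python
# Minimal sketch comparing a regularized linear decoder and a non-linear SVM
# on assumed FEF-like spike-count data (not the published pipeline).
import numpy as np
from sklearn.linear_model import RidgeClassifier
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(1)

# Hypothetical trials x neurons matrix of spike counts, with one label
# per trial (e.g., cue position or attended location).
n_trials, n_neurons = 300, 80
X = rng.poisson(4.0, size=(n_trials, n_neurons)).astype(float)
y = rng.integers(0, 4, size=n_trials)  # e.g., four possible locations

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=1)
decoders = {
    "regularized linear": make_pipeline(StandardScaler(), RidgeClassifier(alpha=1.0)),
    "non-linear SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
}
for name, clf in decoders.items():
    scores = cross_val_score(clf, X, y, cv=cv)
    print(f"{name}: {scores.mean():.2f} +/- {scores.std():.2f}")
```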


Subject(s)
Cognition , Prefrontal Cortex/physiology , Action Potentials , Animals , Bayes Theorem , Computer Simulation , Female , Macaca mulatta , Male , Models, Neurological , Neural Networks, Computer , Neurons/physiology , Photic Stimulation , Prefrontal Cortex/cytology , Support Vector Machine
8.
J Neurosci ; 33(19): 8359-69, 2013 May 08.
Article in English | MEDLINE | ID: mdl-23658175

ABSTRACT

Although we are confronted with an ever-changing environment, we do not have the capacity to analyze all incoming sensory information. Perception is selective and is guided both by salient events occurring in our visual field and by cognitive premises about what needs our attention. Although the lateral intraparietal area (LIP) and frontal eye field (FEF) are known to represent the position of visual attention, their respective contributions to its control are still unclear. Here, we report LIP and FEF neuronal activities recorded while monkeys performed a voluntary attention-orientation target-detection task. We show that both encode behaviorally significant events, but that the FEF plays a specific role in mapping abstract cue instructions onto a spatial priority map to voluntarily guide attention. On the basis of a latency analysis, we show that the coding of stimulus identity and position precedes the emergence of an explicit attentional signal within the FEF. We also describe dynamic temporal hierarchies between LIP and FEF: stimuli carrying the highest intrinsic saliency are signaled by LIP before FEF, whereas stimuli carrying the highest extrinsic saliency are signaled in FEF before LIP. This suggests that whereas the parietofrontal attentional network most probably processes visual information in a recurrent way, exogenous processing predominates in the parietal cortex and the endogenous control of attention takes place in the FEF.


Subject(s)
Attention/physiology , Brain Mapping , Decision Making/physiology , Frontal Lobe/physiology , Neural Pathways/physiology , Parietal Lobe/physiology , Action Potentials/physiology , Animals , Cues , Female , Frontal Lobe/cytology , Functional Laterality , Macaca mulatta , Magnetic Resonance Imaging , Male , Neurons/physiology , Parietal Lobe/cytology , Photic Stimulation , ROC Curve , Reaction Time/physiology
9.
J Physiol Paris ; 105(1-3): 115-22, 2011.
Article in English | MEDLINE | ID: mdl-21986475

ABSTRACT

While sensory and motor systems have attracted most of the research effort in the field of neuroprosthetics, little attention has been devoted to higher-order cortical processes. Here, we propose a first step toward applying neural decoding to the study and manipulation of visuospatial attention, an endogenous process at the interface between sensory and motor functions. To this aim, we investigate whether the offline activity of a population of non-human primate frontal eye field (FEF) neurons in response to an endogenous cue can be read out on a trial-by-trial basis to provide a precise description of the cue's attributes, namely its location and identity, as well as the allocation of attention following its interpretation. Using a linear decoder, we reach up to 86% correct predictions for the different decoded variables, including the spatial allocation of endogenous attention. We show that decoding performance drops on incorrect trials, indicating that cue encoding contributes to the animal's behavioral performance. Last, we show that the temporal resolution of the decoding influences readout performance. These results strongly indicate that endogenous variables can be read out by standard decoding algorithms, even from a suboptimal dataset; their validity, however, remains to be demonstrated in a real-time setting.
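To illustrate how the temporal resolution of the decoding can influence readout performance, here is a minimal Python (scikit-learn) sketch that bins synthetic spike trains at several bin widths and cross-validates a simple linear decoder at each resolution. The spike trains, labels, bin widths, and decoder choice are illustrative assumptions rather than the paper's actual data or method.

```python
# Minimal sketch: readout accuracy as a function of spike-binning resolution
# (assumed synthetic data and a generic linear decoder).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

n_trials, n_neurons, trial_dur = 250, 50, 1.0  # trial duration in seconds
# Hypothetical spike times: list (trials) of lists (neurons) of spike-time arrays.
spikes = [[np.sort(rng.uniform(0.0, trial_dur, rng.poisson(8)))
           for _ in range(n_neurons)] for _ in range(n_trials)]
y = rng.integers(0, 2, size=n_trials)  # e.g., attended hemifield

def bin_spikes(spikes, bin_width, trial_dur):
    """Return a trials x (neurons * bins) matrix of spike counts."""
    edges = np.arange(0.0, trial_dur + bin_width, bin_width)
    return np.array([
        np.concatenate([np.histogram(st, bins=edges)[0] for st in trial])
        for trial in spikes
    ], dtype=float)

for bin_width in (0.4, 0.2, 0.1, 0.05):
    X = bin_spikes(spikes, bin_width, trial_dur)
    clf = LogisticRegression(max_iter=1000)
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"bin width {bin_width * 1000:.0f} ms: accuracy {acc:.2f}")
```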


Subject(s)
Attention/physiology , Brain/physiology , Neurons/physiology , Animals , Cues , Macaca , Photic Stimulation , Reaction Time/physiology , User-Computer Interface , Vision, Ocular/physiology , Visual Fields/physiology
10.
PLoS One ; 4(8): e6716, 2009 Aug 21.
Article in English | MEDLINE | ID: mdl-19696923

ABSTRACT

Several studies have addressed the question of how long it takes for attention to shift from one position in space to another. Here we present a behavioural paradigm that offers direct access to an estimate of voluntary shift time by comparing, within the same task, a situation in which subjects are required to re-engage their attention at the same spatial location with a situation in which they need to shift their attention to another location, all other sensory, cognitive, and motor parameters being equal; the shift time is thus estimated from the difference between the two conditions. We show that spatial attention takes on average 55 ms to voluntarily shift from one hemifield to the other and 38 ms to shift within the same hemifield. In addition, we show that attentional processes differ across and within hemifields. In particular, dividing the attentional spotlight appears to be more difficult within than across hemifields.


Subject(s)
Attention , Visual Fields , Cognition , Humans , Motor Activity , Reaction Time
11.
J Neurosci ; 26(16): 4228-35, 2006 Apr 19.
Article in English | MEDLINE | ID: mdl-16624943

ABSTRACT

The frontal eye field (FEF) has long been regarded as a cortical area critically involved in the execution of voluntary saccadic eye movements. However, recent studies have suggested that the FEF may also play a role in orienting attention. To address this issue, we reversibly inactivated the FEF with multiple microinjections of muscimol, a GABAA agonist, in two macaque monkeys performing visually guided saccades to a single target. The effects of FEF inactivation were also studied in a covert visual search task that required monkeys to search for a target presented among several distractors without making any eye movements. As expected, inactivating the FEF caused spatially selective deficits in executing visually guided saccades, but it also altered the ability to detect a visual target presented among distractors when no eye movements were permitted. These results allow us to conclude that the FEF is involved in both oculomotor and attentional functions. A comparison of the present results with those of a similar experiment conducted in the lateral intraparietal (LIP) area revealed qualitatively different deficits, suggesting that the two areas make distinct contributions to selective attention.


Subject(s)
Attention/physiology , Saccades/physiology , Visual Fields/physiology , Visual Perception/physiology , Animals , Macaca fascicularis , Macaca mulatta , Photic Stimulation/methods