1.
Multisens Res ; 35(4): 291-308, 2022 03 09.
Article in English | MEDLINE | ID: mdl-35263712

ABSTRACT

The integration of information from different sensory modalities is crucial for successful navigation through an environment. Among others, self-motion induces distinct optic flow patterns on the retina, vestibular signals, and tactile flow, which contribute to determining traveled distance (path integration) or movement direction (heading). While the processing of combined visual-vestibular information is the subject of a growing body of literature, the processing of visuo-tactile signals in the context of self-motion has received comparatively little attention. Here, we investigated whether visual heading perception is influenced by behaviorally irrelevant tactile flow. In the visual modality, we simulated an observer's self-motion across a horizontal ground plane (optic flow). Tactile self-motion stimuli were delivered by air flow from head-mounted nozzles (tactile flow). In blocks of trials, we presented only visual or only tactile stimuli, and subjects had to report their perceived heading. In another block of trials, tactile and visual stimuli were presented simultaneously, with the tactile flow within ±40° of the visual heading (bimodal condition). Here, importantly, participants had to report their perceived visual heading. Perceived self-motion direction in all conditions revealed a centripetal bias, i.e., heading directions were perceived as compressed toward straight ahead. In the bimodal condition, we found a small but systematic influence of task-irrelevant tactile flow on visually perceived headings as a function of their directional offset. We conclude that tactile flow is more tightly linked to self-motion perception than previously thought.
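For illustration only, a minimal toy model (not the authors' analysis) of how a centripetal bias and a small, offset-dependent pull from task-irrelevant tactile flow could jointly shape the reported visual heading; all parameter values are hypothetical:

```python
# Toy model: perceived visual heading = centripetally compressed visual heading
# plus a small pull from the tactile-flow offset (parameters are hypothetical).
import numpy as np

def perceived_heading(visual_deg, tactile_deg, c=0.3, w=0.05):
    """visual_deg: simulated visual heading (0 = straight ahead);
    tactile_deg: direction of the tactile flow; c: centripetal compression;
    w: weight of the task-irrelevant tactile offset."""
    compressed = (1.0 - c) * visual_deg             # compression toward straight ahead
    tactile_pull = w * (tactile_deg - visual_deg)   # influence of the directional offset
    return compressed + tactile_pull

# Example: visual heading 20 deg to the right, tactile flow offset +40 deg
print(perceived_heading(20.0, 60.0))   # ~16 deg with these hypothetical parameters
```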


Subject(s)
Motion Perception; Optic Flow; Vestibule, Labyrinth; Humans; Visual Perception; Vision, Ocular
2.
Brain Struct Funct ; 226(8): 2707-2723, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34468861

ABSTRACT

The oculomotor system can initiate remarkably accurate saccades towards moving targets (interceptive saccades), the processing of which is still under debate. The generation of these saccades requires the oculomotor centers to have information about the motion parameters of the target, which must then be extrapolated to bridge the inherent processing delays. We investigated to what degree information about the motion of a saccade target is available in the lateral intraparietal area (area LIP) of macaque monkeys for the generation of accurate interceptive saccades. When a multi-layer neural network was trained on neural discharges from area LIP around the time of saccades towards stationary targets, it was also able to predict the end points of saccades directed towards moving targets. This prediction, however, lagged behind the actual post-saccadic position of the moving target by ~80 ms when the whole sample of 105 neurons was used. We further found that single neurons differentially code for the motion of the target. Selecting neurons with the strongest representation of target motion reduced this lag to ~30 ms, which corresponds to the position of the moving target approximately at the onset of the interceptive saccade. We conclude that, similarly to recent findings from the superior colliculus (Goffart et al., J Neurophysiol 118(5):2890-2901), there is a continuum of contributions of individual LIP neurons to the accuracy of interceptive saccades. A contribution of other gaze control centers (such as the cerebellum or the frontal eye field) that further increases saccadic accuracy is, however, likely.
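As a rough sketch of the decoding approach described above (synthetic data; the variable names, network size, and firing-rate format are assumptions, not taken from the paper):

```python
# Train a multi-layer network on perisaccadic LIP-like activity for saccades to
# stationary targets, then read out end points for interceptive saccades.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_trials, n_neurons = 200, 105                      # the study reports 105 neurons

# Hypothetical firing rates and saccade end points (x, y in deg)
rates_stationary = rng.poisson(5.0, (n_trials, n_neurons)).astype(float)
endpoints_stationary = rng.uniform(-15, 15, (n_trials, 2))

net = MLPRegressor(hidden_layer_sizes=(50,), max_iter=2000, random_state=0)
net.fit(rates_stationary, endpoints_stationary)

# Apply the trained network to interceptive-saccade trials; comparing the
# predicted end points with the moving target's trajectory gives the temporal lag.
rates_interceptive = rng.poisson(5.0, (n_trials, n_neurons)).astype(float)
predicted_endpoints = net.predict(rates_interceptive)
```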


Subject(s)
Macaca; Saccades; Animals; Haplorhini; Parietal Lobe; Photic Stimulation; Superior Colliculi
3.
Prog Neurobiol ; 205: 102117, 2021 10.
Article in English | MEDLINE | ID: mdl-34224808

ABSTRACT

The visually based control of self-motion is a challenging task, requiring, if needed, immediate adjustments to keep on track. Accordingly, it would appear advantageous if the processing of self-motion direction (heading) were predictive, thereby accelerating the encoding of unexpected changes, and unimpaired by attentional load. We tested this hypothesis by recording EEG in humans and macaque monkeys with similar experimental protocols. Subjects viewed a random dot pattern simulating self-motion across a ground plane in an oddball EEG paradigm. Standard and deviant trials differed only in their simulated heading direction (forward-left vs. forward-right). Event-related potentials (ERPs) were compared in order to test for the occurrence of a visual mismatch negativity (vMMN), a component that reflects preattentive and likely also predictive processing of sensory stimuli. Analysis of the ERPs revealed signatures of a prediction mismatch for deviant stimuli in both humans and monkeys. In humans, a vMMN was observed starting 110 ms after self-motion onset. In monkeys, peak response amplitudes following deviant stimuli were enhanced relative to the standard already 100 ms after self-motion onset. We consider our results strong evidence for preattentive processing of visual self-motion information in humans and monkeys, allowing for ultrafast adjustments of heading direction.
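A minimal sketch of the mismatch analysis implied above, i.e., the deviant-minus-standard ERP difference wave (synthetic data; sampling rate and trial counts are assumptions):

```python
# Compute a vMMN-style difference wave from an oddball design.
import numpy as np

fs = 500                                  # Hz, assumed sampling rate
t = np.arange(-0.1, 0.5, 1 / fs)          # time relative to self-motion onset (s)
rng = np.random.default_rng(1)

standard_trials = rng.normal(0, 1, (400, t.size))   # frequent standard headings
deviant_trials = rng.normal(0, 1, (80, t.size))     # rare deviant headings

erp_standard = standard_trials.mean(axis=0)
erp_deviant = deviant_trials.mean(axis=0)
vmmn = erp_deviant - erp_standard         # mismatch component, expected to emerge ~100-110 ms
```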


Subject(s)
Electroencephalography; Animals; Attention; Evoked Potentials; Haplorhini; Humans
4.
J Neurophysiol ; 125(6): 2432-2443, 2021 06 01.
Article in English | MEDLINE | ID: mdl-34010579

ABSTRACT

Successful interaction with the environment requires the dissociation of self-induced from externally induced sensory stimulation. Temporal proximity of action and effect is often used as an indicator of whether an observed event should be interpreted as a result of one's own actions or not. We tested how the delay between an action (press of a touch bar) and an effect (onset of simulated self-motion) influences the processing of visually simulated self-motion in the ventral intraparietal area (VIP) of macaque monkeys. We found that a delay between the action and the start of the self-motion stimulus led to a rise of activity above baseline before motion onset in a subpopulation of 21% of the investigated neurons. In the responses to the stimulus, we found significantly lower sustained activity when the press of the touch bar and the motion onset were contiguous compared with the condition in which the motion onset was delayed. We speculate that this weak inhibitory effect might be part of a mechanism that sharpens the tuning of VIP neurons during self-induced motion and thus has the potential to increase the precision of the heading information required to adjust the direction of self-motion in everyday navigational tasks.

NEW & NOTEWORTHY Neurons in the macaque ventral intraparietal area (VIP) respond to sensory stimulation related to self-motion, e.g., visual optic flow. Here, we found that self-motion-induced activation depends on the sense of agency, i.e., it differed when optic flow was perceived as self-induced or externally induced. This demonstrates that area VIP is well suited for studying the interplay between active behavior and sensory processing during self-motion.


Subject(s)
Kinesthesis/physiology; Motion Perception/physiology; Motor Activity/physiology; Optic Flow/physiology; Parietal Lobe/physiology; Animals; Electrocorticography; Macaca mulatta; Male; Neurons/physiology
5.
Physiol Rep ; 6(22): e13921, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30450739

ABSTRACT

Self-motion induces spontaneous eye movements which serve the purpose of stabilizing the visual image on the retina. Previous studies have mainly focused on their reflexive nature and how the perceptual system disentangles visual flow components caused by eye movements and self-motion. Here, we investigated the role of eye movements in distance reproduction (path integration). We used bimodal (visual-auditory)-simulated self-motion: visual optic flow was paired with an auditory stimulus whose frequency was scaled with simulated speed. The task of the subjects in each trial was, first, to observe the simulated self-motion over a certain distance (Encoding phase) and, second, to actively reproduce the observed distance using only visual, only auditory, or bimodal feedback (Reproduction phase). We found that eye positions and eye speeds were strongly correlated between the Encoding and the Reproduction phases. This was the case even when reproduction relied solely on auditory information and thus no visual stimulus was presented. We believe that these correlations are indicative of a contribution of eye movements to path integration.
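A bare-bones sketch of the reported Encoding-Reproduction comparison (synthetic eye traces; real traces would first be aligned to a common distance or time base):

```python
# Correlate horizontal eye-position traces between the Encoding and
# Reproduction phases of a trial.
import numpy as np

rng = np.random.default_rng(2)
encoding_eye = rng.normal(0, 0.2, 500).cumsum()             # hypothetical eye-position trace
reproduction_eye = encoding_eye + rng.normal(0, 0.5, 500)   # similar trace plus noise

r = np.corrcoef(encoding_eye, reproduction_eye)[0, 1]
print(f"Encoding-Reproduction eye-position correlation: r = {r:.2f}")
```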


Subject(s)
Auditory Perception; Eye Movements; Psychomotor Performance; Feedback, Physiological; Female; Humans; Male; Optic Flow; Young Adult
6.
J Eye Mov Res ; 11(4)2018 Nov 09.
Article in English | MEDLINE | ID: mdl-33828708

ABSTRACT

Direct comparison of results from humans and monkeys is often complicated by differences in experimental conditions. We replicated in head-unrestrained macaques the experiments of a recent study comparing human directional precision during smooth pursuit eye movements (SPEM) and saccades to moving targets (Braun & Gegenfurtner, 2016). Directional precision of human SPEM follows an exponential decay function reaching optimal values of 1.5°-3° within 300 ms after target motion onset, whereas precision of initial saccades to moving targets is slightly better. As in humans, we found general agreement in the development of directional precision of SPEM over time and in the differences between directional precision of initial saccades and SPEM initiation. However, monkeys showed overall lower precision in SPEM compared to humans. This was most likely due to differences in experimental conditions, such as head stabilization, which was provided by a chin and head rest in human subjects but absent in the unrestrained monkeys.
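The exponential-decay description of directional precision can be written as sigma(t) = sigma_inf + (sigma_0 - sigma_inf) * exp(-t / tau); below is a hedged fitting sketch with made-up data points (only the asymptote of roughly 1.5°-3° within ~300 ms is taken from the cited human work):

```python
# Fit an exponential decay to directional precision over time after motion onset.
import numpy as np
from scipy.optimize import curve_fit

def precision(t_ms, sigma_inf, sigma_0, tau):
    return sigma_inf + (sigma_0 - sigma_inf) * np.exp(-t_ms / tau)

t = np.array([50, 100, 150, 200, 250, 300, 400])           # ms after target motion onset
sd_deg = np.array([12.0, 7.5, 5.0, 3.5, 2.8, 2.3, 2.0])    # hypothetical directional SD

params, _ = curve_fit(precision, t, sd_deg, p0=(2.0, 15.0, 80.0))
print(params)   # fitted asymptote, initial SD, and time constant
```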

7.
Nat Commun ; 8(1): 920, 2017 10 13.
Article in English | MEDLINE | ID: mdl-29030557

ABSTRACT

Perceptual illusions help to understand how sensory signals are decoded in the brain. Here we report that the opposite approach is also applicable, i.e., results from decoding neural activity from monkey extrastriate visual cortex correctly predict a hitherto unknown perceptual illusion in humans. We record neural activity from the monkey medial superior temporal (MST) and ventral intraparietal (VIP) areas during presentation of self-motion stimuli and concurrent reflexive eye movements. A heading decoder performs veridically during slow eye movements. During fast eye movements (saccades), however, the decoder erroneously reports compression of heading toward straight ahead. Functional equivalents of macaque areas MST and VIP have been identified in humans, implying a perceptual correlate (illusion) of this perisaccadic decoding error. Indeed, a behavioral experiment in humans shows that perceived heading is perisaccadically compressed toward the direction of gaze. Response properties of primate areas MST and VIP are consistent with being the substrate of the newly described visual illusion.

Macaque higher visual areas MST and VIP encode heading direction based on self-motion stimuli. Here the authors show that, while making saccades, the heading direction decoded from the neural responses is compressed toward straight ahead, and independently demonstrate a perceptual illusion in humans based on this perisaccadic decoding error.
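A population-vector style sketch of the heading read-out idea (tuning curves, unit count, and noise are synthetic; the study trained a decoder on recorded MST/VIP activity rather than on the analytic tuning assumed here):

```python
# Decode heading from a population of heading-tuned units by a response-weighted
# average of their preferred headings.
import numpy as np

rng = np.random.default_rng(3)
preferred = rng.uniform(-40, 40, 100)          # preferred headings (deg) of 100 units

def population_response(heading_deg, width=15.0):
    return np.exp(-0.5 * ((preferred - heading_deg) / width) ** 2)   # Gaussian tuning

def decode(responses):
    responses = np.clip(responses, 0.0, None)
    return np.sum(responses * preferred) / np.sum(responses)

r = population_response(10.0) + rng.normal(0, 0.05, preferred.size)
print(decode(r))   # close to 10 deg; perisaccadic response changes would bias this read-out
```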


Subject(s)
Eye Movements/physiology; Motion Perception/physiology; Saccades/physiology; Visual Perception/physiology; Adult; Animals; Female; Head Movements/physiology; Humans; Macaca mulatta; Male; Parietal Lobe/physiology; Photic Stimulation; Temporal Lobe/physiology
8.
J Neurophysiol ; 118(3): 1650-1663, 2017 09 01.
Article in English | MEDLINE | ID: mdl-28659463

ABSTRACT

In the natural world, self-motion always stimulates several different sensory modalities. Here we investigated the interplay between a visual optic flow stimulus simulating self-motion and a tactile stimulus (air flow resulting from self-motion) while human observers were engaged in a distance reproduction task. We found that adding congruent tactile information (i.e., speed of the air flow and speed of visual motion are directly proportional) to the visual information significantly improves the precision of the actively reproduced distances. This improvement, however, was smaller than predicted for an optimal integration of visual and tactile information. In contrast, incongruent tactile information (i.e., speed of the air flow and speed of visual motion are inversely proportional) did not improve subjects' precision, indicating that incongruent tactile information and visual information were not integrated. One possible interpretation of the results is a link to properties of neurons in the ventral intraparietal area, which have been shown to have spatially and action-congruent receptive fields for visual and tactile stimuli.

NEW & NOTEWORTHY This study shows that tactile and visual information can be integrated to improve estimates of the parameters of self-motion. This, however, happens only if the two sources of information are congruent, as they are in a natural environment. In contrast, an incongruent tactile stimulus is still used as a source of information about self-motion, but it is not integrated with visual information.
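For reference, the benchmark against which such bimodal improvements are usually compared is the maximum-likelihood (optimal) cue-combination prediction; the abstract does not state the formula, so the standard textbook form is given here:

```latex
% Predicted variance of the combined visuo-tactile estimate under optimal
% integration of independent visual (V) and tactile (T) cues:
\sigma^2_{VT} = \frac{\sigma^2_{V}\,\sigma^2_{T}}{\sigma^2_{V} + \sigma^2_{T}},
\qquad
w_V = \frac{1/\sigma^2_V}{1/\sigma^2_V + 1/\sigma^2_T}, \quad w_T = 1 - w_V
```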


Subject(s)
Motion Perception; Touch Perception; Adult; Female; Humans; Male; Movement; Parietal Lobe/physiology
9.
Front Integr Neurosci ; 10: 30, 2016.
Article in English | MEDLINE | ID: mdl-27630547

ABSTRACT

Primates perform saccadic eye movements in order to bring the image of an interesting target onto the fovea. Compared to stationary targets, saccades toward moving targets are computationally more demanding since the oculomotor system must use speed and direction information about the target as well as knowledge about its own processing latency to program an adequate, predictive saccade vector. In monkeys, different brain regions have been implicated in the control of voluntary saccades, among them the lateral intraparietal area (LIP). Here we asked whether activity in area LIP reflects the distance between the fovea and the saccade target, the amplitude of an upcoming saccade, or both. We recorded single-unit activity in area LIP of two macaque monkeys. First, we determined each neuron's preferred saccade direction. Then, monkeys performed visually guided saccades along the preferred direction toward either stationary or moving targets in pseudo-randomized order. LIP population activity allowed us to decode both the distance between the fovea and the saccade target and the size of the upcoming saccade. Previous work has shown comparable results for saccade direction (Graf and Andersen, 2014a,b). Hence, LIP population activity allows prediction of any two-dimensional saccade vector. Functional equivalents of macaque area LIP have been identified in humans. Accordingly, our results provide further support for the concept of using activity from area LIP as the neural basis for the control of an oculomotor brain-machine interface.

10.
J Neurosci ; 34(14): 4760-5, 2014 Apr 02.
Article in English | MEDLINE | ID: mdl-24695696

ABSTRACT

Corollary discharge signals are found in the nervous systems of many animals, where they serve a large variety of functions related to the integration of sensory and motor signals. In humans, an important corollary discharge signal is generated by oculomotor structures and communicated to sensory systems in concert with the execution of each saccade. This signal is thought to serve a number of purposes related to the maintenance of accurate visual perception. The properties of the oculomotor corollary discharge can be probed by asking subjects to localize stimuli that are flashed briefly around the time of a saccade. The results of such experiments typically reveal large errors in localization. Here, we have exploited these well-known psychophysical effects to assess the potential dysfunction of corollary discharge signals in people with schizophrenia. In a standard perisaccadic localization task, we found that, compared with controls, patients with schizophrenia exhibited larger errors in localizing visual stimuli. The pattern of errors could be modeled as an overdamped corollary discharge signal that encodes instantaneous eye position. The dynamics of this signal predicted symptom severity among patients, suggesting a possible mechanistic basis for widely observed behavioral manifestations of schizophrenia.
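As a loose illustration of the modeling idea (a first-order lag is used here as a much-simplified stand-in for the overdamped eye-position signal; all parameters are hypothetical):

```python
# Internal (corollary-discharge) eye-position estimate lags the actual saccadic
# step, producing a time course of predicted perisaccadic localization errors.
import numpy as np

dt, tau = 0.001, 0.060                         # s; tau sets how sluggish the estimate is
t = np.arange(-0.1, 0.3, dt)
actual_eye = np.where(t < 0, 0.0, 10.0)        # idealized 10-deg saccade at t = 0

internal_eye = np.zeros_like(t)
for i in range(1, t.size):                     # first-order lag (simplification)
    internal_eye[i] = internal_eye[i - 1] + dt / tau * (actual_eye[i] - internal_eye[i - 1])

localization_error = actual_eye - internal_eye # a slower internal estimate predicts larger errors
```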


Subject(s)
Perceptual Disorders/etiology; Saccades/physiology; Schizophrenia/complications; Space Perception/physiology; Adult; Female; Humans; Male; Models, Biological; Photic Stimulation; Psychiatric Status Rating Scales; Psychophysics; Reaction Time
11.
J Neurophysiol ; 108(10): 2653-67, 2012 Nov.
Article in English | MEDLINE | ID: mdl-22933722

ABSTRACT

Saccades are useful for directing the high-acuity fovea to visual targets that are of behavioral relevance. The selection of visual targets for eye movements involves the superior colliculus (SC), where many neurons respond to visual stimuli. Many of these neurons are also activated before and during saccades of specific directions and amplitudes. Although the role of the SC in controlling eye movements has been thoroughly examined, far less is known about the nature of the visual responses in this area. We have, therefore, recorded from neurons in the intermediate layers of the macaque SC, while using a sparse-noise mapping procedure to obtain a detailed characterization of the spatiotemporal structure of visual receptive fields. We find that SC responses to flashed visual stimuli start roughly 50 ms after the onset of the stimulus and last on average ~70 ms. About 50% of these neurons are strongly suppressed by visual stimuli flashed at certain locations flanking the excitatory center, and the spatiotemporal pattern of suppression exerts a predictable influence on the timing of saccades. This suppression may, therefore, contribute to the filtering of distractor stimuli during target selection. We also find that saccades affect the processing of visual stimuli by SC neurons in a manner that is quite similar to the saccadic suppression and postsaccadic enhancement that has been observed in the cortex and in perception. However, in contrast to what has been observed in the cortex, decreased visual sensitivity was generally associated with increased firing rates, while increased sensitivity was associated with decreased firing rates. Overall, these results suggest that the processing of visual stimuli by SC receptive fields can influence oculomotor behavior and that oculomotor signals originating in the SC can shape perisaccadic visual perception.
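A compact sketch of how a sparse-noise mapping analysis typically proceeds (spike-triggered averaging over synthetic data; grid size, frame timing, and spike probability are assumptions):

```python
# Estimate a spatiotemporal receptive field by averaging the probe positions
# that preceded each spike at several temporal lags.
import numpy as np

rng = np.random.default_rng(4)
n_frames, grid = 5000, 16
stimulus = np.zeros((n_frames, grid, grid))
idx = rng.integers(0, grid, (n_frames, 2))                  # one probe per frame
stimulus[np.arange(n_frames), idx[:, 0], idx[:, 1]] = 1.0

spikes = rng.random(n_frames) < 0.05                        # hypothetical spike train
lags = range(1, 8)                                          # frames preceding each spike

sta = np.stack([stimulus[:-lag][spikes[lag:]].mean(axis=0) for lag in lags])
# sta[k] is the average probe map k frames before a spike (RF estimate at that lag)
```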


Subject(s)
Neurons/physiology; Superior Colliculi/physiology; Visual Fields/physiology; Animals; Fixation, Ocular; Macaca; Male; Neurons/classification; Photic Stimulation; Psychomotor Performance; Saccades/physiology; Superior Colliculi/cytology
12.
PLoS One ; 7(12): e52195, 2012.
Article in English | MEDLINE | ID: mdl-23284931

ABSTRACT

Visual neurons have spatial receptive fields that encode the positions of objects relative to the fovea. Because foveate animals execute frequent saccadic eye movements, this position information is constantly changing, even though the visual world is generally stationary. Interestingly, visual receptive fields in many brain regions have been found to exhibit changes in strength, size, or position around the time of each saccade, and these changes have often been suggested to be involved in the maintenance of perceptual stability. Crucial to the circuitry underlying perisaccadic changes in visual receptive fields is the superior colliculus (SC), a brainstem structure responsible for integrating visual and oculomotor signals. In this work we have studied the time-course of receptive field changes in the SC. We find that the distribution of the latencies of SC responses to stimuli placed outside the fixation receptive field is bimodal: The first mode is comprised of early responses that are temporally locked to the onset of the visual probe stimulus and stronger for probes placed closer to the classical receptive field. We suggest that such responses are therefore consistent with a perisaccadic rescaling, or enhancement, of weak visual responses within a fixed spatial receptive field. The second mode is more similar to the remapping that has been reported in the cortex, as responses are time-locked to saccade onset and stronger for stimuli placed in the postsaccadic receptive field location. We suggest that these two temporal phases of spatial updating may represent different sources of input to the SC.


Subject(s)
Macaca/physiology; Superior Colliculi/physiology; Animals; Photic Stimulation; Visual Fields/physiology; Visual Perception/physiology
13.
J Vis ; 11(12)2011 Oct 06.
Article in English | MEDLINE | ID: mdl-21980187

ABSTRACT

In primates, inspection of a visual scene is typically interrupted by frequent gaze shifts, occurring at an average rate of three to five times per second. Perceptually, these gaze shifts are accompanied by a compression of visual space toward the saccade target, which may be attributed to an oculomotor signal that transiently influences visual processing. While previous studies of compression have focused exclusively on saccadic eye movements made with the head artificially immobilized, many brain structures involved in saccade generation also encode combined eye-head gaze shifts. Thus, in order to understand the interaction between gaze motor and visual signals, we studied perception during eye-head gaze shifts and found a powerful compression of visual space that was spatially directed toward the intended gaze (and not the eye movement) target location. This perceptual compression was nearly constant in duration across gaze shift amplitudes, suggesting that the signal that triggers compression is largely independent of the size and kinematics of the gaze shift. The spatial pattern of results could be captured by a model that involves interactions, on a logarithmic map of visual space, between two loci of neural activity that encode the gaze shift vector and visual stimulus position relative to the fovea.


Subject(s)
Models, Neurological; Saccades/physiology; Space Perception/physiology; Superior Colliculi/physiology; Vision, Binocular/physiology; Animals; Biomechanical Phenomena/physiology; Fixation, Ocular/physiology; Fovea Centralis/physiology; Head; Humans; Immobilization/methods; Male; Photic Stimulation/methods; Primates; Psychomotor Performance/physiology
14.
J Neurophysiol ; 106(4): 1862-74, 2011 Oct.
Article in English | MEDLINE | ID: mdl-21753030

ABSTRACT

Our perception of the positions of objects in our surroundings is surprisingly unaffected by movements of the eyes, head, and body. This suggests that the brain has a mechanism for maintaining perceptual stability, based either on the spatial relationships among visible objects or internal copies of its own motor commands. Strong evidence for the latter mechanism comes from the remapping of visual receptive fields that occurs around the time of a saccade. Remapping occurs when a single neuron responds to visual stimuli placed presaccadically in the spatial location that will be occupied by its receptive field after the completion of a saccade. Although evidence for remapping has been found in many brain areas, relatively little is known about how it interacts with sensory context. This interaction is important for understanding perceptual stability more generally, as the brain may rely on extraretinal signals or visual signals to different degrees in different contexts. Here, we have studied the interaction between visual stimulation and remapping by recording from single neurons in the superior colliculus of the macaque monkey, using several different visual stimulus conditions. We find that remapping responses are highly sensitive to low-level visual signals, with the overall luminance of the visual background exerting a particularly powerful influence. Specifically, although remapping was fairly common in complete darkness, such responses were usually decreased or abolished in the presence of modest background illumination. Thus the brain might make use of a strategy that emphasizes visual landmarks over extraretinal signals whenever the former are available.


Subject(s)
Saccades/physiology; Space Perception/physiology; Superior Colliculi/physiology; Animals; Darkness; Fixation, Ocular; Light; Macaca mulatta; Male; Photic Stimulation; Psychomotor Performance/physiology; Visual Fields
15.
Vision Res ; 51(6): 577-84, 2011 Mar 25.
Article in English | MEDLINE | ID: mdl-21300079

ABSTRACT

Amblyopia is characterised by visual deficits in both spatial vision and motion perception. While the spatial deficits are thought to result from deficient processing at both low and higher level stages of visual processing, the deficits in motion perception appear to result primarily from deficits involving higher level processing. Specifically, it has been argued that the motion deficit in amblyopia occurs when local motion information is pooled spatially and that this process is abnormally susceptible to the presence of noise elements in the stimulus. Here we investigated motion direction discrimination for abruptly presented two-frame Gabor stimuli in a group of five strabismic amblyopes and five control observers. Motion direction discrimination for this stimulus is inherently noisy and relies on the signal/noise processing of motion detectors. We varied viewing condition (monocular vs. binocular), stimulus size (5.3-18.5°) and stimulus contrast (high vs. low) in order to assess the effects of binocular summation, spatial summation and contrast on task performance. No differences were found for the high contrast stimuli; however, the low contrast stimuli revealed differences between the control and amblyopic groups and between fellow fixing and amblyopic eyes. Control participants exhibited pronounced binocular summation for this task (on average a factor of 3.7), whereas amblyopes showed no such effect. In addition, the spatial summation that occurred for control eyes and the fellow eye of amblyopes was significantly attenuated for the amblyopic eyes relative to fellow eyes. Our results support the hypothesis that pooling of local motion information from amblyopic eyes is abnormal and highly sensitive to noise.


Subject(s)
Amblyopia/physiopathology; Discrimination, Psychological/physiology; Motion Perception/physiology; Strabismus/physiopathology; Vision, Binocular/physiology; Vision, Monocular/physiology; Adult; Analysis of Variance; Contrast Sensitivity/physiology; Female; Humans; Male; Middle Aged; Size Perception/physiology; Young Adult
16.
Neurosci Lett ; 469(3): 411-5, 2010 Jan 29.
Article in English | MEDLINE | ID: mdl-20035830

ABSTRACT

Analyses of neural mechanisms of duration processing are essential for the understanding of psychological phenomena which evolve in time. Different mechanisms are presumably responsible for the processing of shorter (below 500 ms) and longer (above 500 ms) events but have not yet been the subject of investigation with functional magnetic resonance imaging (fMRI). In the present study, we show a greater involvement of several brain regions - including right-hemispheric midline structures and left-hemispheric lateral regions - in the processing of visual stimuli of shorter as compared to longer duration. We propose a greater involvement of lower-level cognitive mechanisms in the processing of shorter events as opposed to higher-level mechanisms of cognitive control involved in longer events.


Subject(s)
Brain/physiology; Discrimination, Psychological/physiology; Time Perception/physiology; Visual Perception/physiology; Adult; Brain Mapping; Cognition/physiology; Female; Humans; Magnetic Resonance Imaging; Male; Neuropsychological Tests; Photic Stimulation; Time Factors; Young Adult
17.
J Vis ; 9(4): 15.1-15, 2009 Apr 20.
Article in English | MEDLINE | ID: mdl-19757924

ABSTRACT

Recent psychophysical work has shown that performance on a direction discrimination task decreases with increasing stimulus size, provided the stimulus is high in contrast. This psychophysical surround suppression has been linked to the inhibitory spatial surrounds that have been observed throughout the primate visual system. In this work we have examined a temporal factor that may also contribute to psychophysical surround suppression. Consistent with previous work, we found that psychophysical surround suppression is strongest when a high-contrast motion stimulus is presented very briefly so that the appearance of the stimulus coincided with its motion. However, when a brief delay was inserted between the stimulus onset and the onset of motion, the counterintuitive effects of stimulus size disappeared. The effect of the motion onset asynchrony (MOA) was strongest when the stationary stimulus immediately preceded the stimulus motion and when stimulus orientation during the MOA was very similar to that during the motion presentation. We conclude that psychophysical surround suppression is partially linked to the temporal structure of the stimulus, more precisely to a masking effect caused by sudden stimulus onsets (and to a smaller degree stimulus offsets).


Subject(s)
Motion Perception/physiology; Neural Inhibition/physiology; Psychophysics; Reaction Time/physiology; Retina/physiology; Adult; Contrast Sensitivity/physiology; Humans; Male; Orientation/physiology; Perceptual Masking/physiology; Photic Stimulation/methods; Sensory Thresholds/physiology; Space Perception/physiology
18.
J Neurosci ; 29(32): 10160-70, 2009 Aug 12.
Article in English | MEDLINE | ID: mdl-19675250

ABSTRACT

Our ability to explore our surroundings requires a combination of high-resolution vision and frequent rotations of the visual axis toward objects of interest. Such gaze shifts are themselves a source of powerful retinal stimulation, and so the visual system appears to have evolved mechanisms to maintain perceptual stability during movements of the eyes in space. The mechanisms underlying this perceptual stability can be probed in the laboratory by briefly presenting a stimulus around the time of a saccadic eye movement and asking subjects to report its position. Under such conditions, there is a systematic misperception of the probes toward the saccade end point. This perisaccadic compression of visual space has been the subject of much research, but few studies have attempted to relate it to specific brain mechanisms. Here, we show that the magnitude of perceptual compression for a wide variety of probe stimuli and saccade amplitudes is quantitatively predicted by a simple heuristic model based on the geometry of retinotopic representations in the primate brain. Specifically, we propose that perisaccadic compression is determined by the distance between the probe and saccade end point on a map that has a logarithmic representation of visual space, similar to those found in numerous cortical and subcortical visual structures. Under this assumption, the psychophysical data on perisaccadic compression can be appreciated intuitively by imagining that, around the time of a saccade, the brain confounds nearby oculomotor and sensory signals while attempting to localize the position of objects in visual space.
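A toy numerical illustration of the logarithmic-map heuristic (not the paper's fitted model; the mapping, gain, and width below are invented to show the qualitative effect):

```python
# Mislocalization pulls a briefly flashed probe toward the saccade end point,
# with a strength that falls off with their separation on a log-scaled map.
import numpy as np

def log_map(x_deg, x0=1.0):
    """Map eccentricity (deg) to a cortical-like logarithmic coordinate."""
    return np.sign(x_deg) * np.log(1.0 + np.abs(x_deg) / x0)

def perceived_position(probe_deg, target_deg, gain=0.6, sigma=1.0):
    d = log_map(probe_deg) - log_map(target_deg)            # distance on the log map
    attraction = gain * np.exp(-0.5 * (d / sigma) ** 2)     # stronger near the end point
    return probe_deg + attraction * (target_deg - probe_deg)

print(perceived_position(np.array([2.0, 6.0, 14.0]), 10.0))  # compression toward 10 deg
```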


Subject(s)
Models, Neurological; Saccades; Visual Perception; Adult; Algorithms; Brain/physiology; Eye Movement Measurements; Humans; Male; Photic Stimulation; Psychophysics; Space Perception; Time Factors
19.
Article in English | MEDLINE | ID: mdl-19105053

ABSTRACT

The present paper investigates the effects of age, sex, and cognitive factors on temporal-order perception. Nine temporal-order tasks were employed using two and four stimuli presented in the auditory and visual modalities. Significantly increased temporal-order thresholds (TOT) in the elderly were found for almost all tasks, while sex differences were only observed for two tasks. Multiple regression analyses show that the performance on most temporal-order tasks can be predicted by cognitive factors, such as speed of fluid reasoning, short-term memory, and attention. However, age was a significant predictor of TOT in three tasks using visual stimuli. We conclude (1) that age-related differences can often be attributed to cognitive factors involved in temporal-order perception, and (2) that the concept of temporal-order perception is more complex than implied by the current models.
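A short sketch of the multiple-regression approach described above (synthetic data; the predictors and their scales are placeholders, and standardized coefficients would normally be reported):

```python
# Predict temporal-order thresholds (TOT) from cognitive predictors and age.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
n = 120
X = np.column_stack([
    rng.normal(100, 15, n),    # speed of fluid reasoning (arbitrary scale)
    rng.normal(6, 1.5, n),     # short-term memory span
    rng.normal(0, 1, n),       # attention score (z-scored)
    rng.uniform(20, 85, n),    # age in years
])
tot_ms = 40 + 0.5 * X[:, 3] - 2.0 * X[:, 2] + rng.normal(0, 10, n)   # toy generative model

model = LinearRegression().fit(X, tot_ms)
print(model.coef_, model.score(X, tot_ms))
```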


Subject(s)
Aging/psychology; Cognition; Sex Characteristics; Time Perception; Acoustic Stimulation; Adult; Aged; Aged, 80 and over; Analysis of Variance; Female; Humans; Male; Middle Aged; Photic Stimulation; Regression Analysis; Young Adult