Results 1 - 20 of 52
1.
Trends Cogn Sci ; 2024 May 18.
Article in English | MEDLINE | ID: mdl-38763804

ABSTRACT

Our ability to perceive multiple objects is mysterious. Sensory neurons are broadly tuned, producing potential overlap in the populations of neurons activated by each object in a scene. This overlap raises questions about how distinct information is retained about each item. We present a novel signal switching theory of neural representation, which posits that neural signals may interleave representations of individual items across time. Evidence for this theory comes from new statistical tools that overcome the limitations inherent to standard time-and-trial-pooled assessments of neural signals. Our theory has implications for diverse domains of neuroscience, including attention, figure binding/scene segregation, oscillations, and divisive normalization. The general concept of switching between functions could also lend explanatory power to theories of grounded cognition.
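
A toy simulation can make the switching idea concrete. The sketch below is illustrative only (all rates, bin sizes, and variable names are hypothetical, not taken from the work above): it contrasts a neuron that interleaves two single-item firing rates across time bins with one that responds at their average. The two match in mean response but differ in bin-to-bin dispersion, which is the kind of trial-resolved signature the statistical tools mentioned above are designed to detect.

```python
import numpy as np

rng = np.random.default_rng(0)

rate_a, rate_b = 40.0, 10.0   # hypothetical firing rates (spikes/s) to items A and B alone
bin_s, n_bins = 0.05, 400     # 50 ms bins; many bins for a stable variance estimate

# Switching neuron: each bin is driven by item A or item B, never both at once
drive = rng.choice([rate_a, rate_b], size=n_bins)
switching_counts = rng.poisson(drive * bin_s)

# Averaging neuron: every bin reflects the mean of the two rates
averaging_counts = rng.poisson(0.5 * (rate_a + rate_b) * bin_s)

# Similar mean counts, but switching inflates variance beyond the Poisson
# expectation (var/mean ~ 1), an overdispersion signature of interleaving
for name, c in [("switching", switching_counts), ("averaging", averaging_counts)]:
    print(f"{name}: mean={c.mean():.2f}, var/mean={c.var() / c.mean():.2f}")
```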

2.
Elife ; 13, 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38489224

ABSTRACT

How neural representations preserve information about multiple stimuli is mysterious. Because tuning of individual neurons is coarse (e.g., visual receptive field diameters can exceed perceptual resolution), the populations of neurons potentially responsive to each individual stimulus can overlap, raising the question of how information about each item might be segregated and preserved in the population. We recently reported evidence for a potential solution to this problem: when two stimuli were present, some neurons in the macaque visual cortical areas V1 and V4 exhibited fluctuating firing patterns, as if they responded to only one individual stimulus at a time (Jun et al., 2022). However, whether such an information encoding strategy is ubiquitous in the visual pathway and thus could constitute a general phenomenon remains unknown. Here, we provide new evidence that such fluctuating activity is also evoked by multiple stimuli in visual areas responsible for processing visual motion (middle temporal visual area, MT), and faces (middle fundus and anterolateral face patches in inferotemporal cortex - areas MF and AL), thus extending the scope of circumstances in which fluctuating activity is observed. Furthermore, consistent with our previous results in the early visual area V1, MT exhibits fluctuations between the representations of two stimuli when these form distinguishable objects but not when they fuse into one perceived object, suggesting that fluctuating activity patterns may underlie visual object formation. Taken together, these findings point toward an updated model of how the brain preserves sensory information about multiple stimuli for subsequent processing and behavioral action.


Subject(s)
Visual Cortex , Visual Pathways , Visual Pathways/physiology , Visual Cortex/physiology , Visual Fields , Neurons/physiology , Photic Stimulation
3.
Proc Natl Acad Sci U S A ; 120(48): e2303562120, 2023 Nov 28.
Article in English | MEDLINE | ID: mdl-37988462

ABSTRACT

Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect neural pathways from the ear through auditory cortex and beyond, but how these signals contribute to computing the locations of sounds with respect to the visual scene is poorly understood. Here, we evaluated the information contained in eye movement-related eardrum oscillations (EMREOs), pressure changes recorded in the ear canal that occur in conjunction with simultaneous eye movements. We show that EMREOs contain parametric information about horizontal and vertical eye displacement as well as initial/final eye position with respect to the head. The parametric information in the horizontal and vertical directions can be modeled as combining linearly, allowing accurate prediction of the EMREOs associated with oblique (diagonal) eye movements. Target location can also be inferred from the EMREO signals recorded during eye movements to those targets. We hypothesize that the (currently unknown) mechanism underlying EMREOs could impose a two-dimensional eye-movement-related transfer function on any incoming sound, permitting subsequent processing stages to compute the positions of sounds in relation to the visual scene.
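
The linearity claim lends itself to a short numerical sketch. The code below is a minimal illustration, not the paper's analysis pipeline: it regresses synthetic ear-canal pressure traces on horizontal and vertical eye displacement at each time point, then predicts the waveform for an oblique movement as a weighted sum of the fitted components. All waveforms, displacement ranges, and names are fabricated stand-ins.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: ear-canal pressure traces (trials x time) with known
# horizontal (dh) and vertical (dv) eye displacements on each trial.
n_trials, n_time = 200, 100
dh = rng.uniform(-18, 18, n_trials)            # degrees
dv = rng.uniform(-12, 12, n_trials)
t = np.linspace(0, 0.1, n_time)
wh = np.sin(2 * np.pi * 30 * t)                # stand-in horizontal basis waveform
wv = 0.5 * np.cos(2 * np.pi * 30 * t)          # stand-in vertical basis waveform
pressure = np.outer(dh, wh) + np.outer(dv, wv) + rng.normal(0, 0.5, (n_trials, n_time))

# Regress pressure at each time point on [dh, dv, 1]
X = np.column_stack([dh, dv, np.ones(n_trials)])
coef, *_ = np.linalg.lstsq(X, pressure, rcond=None)   # coef: 3 x n_time

# Predict the trace for an oblique saccade as a linear combination
dh_new, dv_new = 10.0, -8.0
predicted = np.array([dh_new, dv_new, 1.0]) @ coef
print(predicted.shape)  # (100,): predicted EMREO waveform for the oblique movement
```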


Subject(s)
Eye Movements , Saccades , Movement , Ocular Physiological Phenomena , Sound
4.
Hear Res ; 440: 108899, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37979436

ABSTRACT

We recently discovered a unique type of otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscillations (EMREOs) are generated is unknown, with a role in visual-auditory integration being the likeliest candidate. Clues to both the drivers of EMREOs and their purpose can be gleaned by examining responses in normal hearing human subjects. Do EMREOs occur in all individuals with normal hearing? If so, what components of the response occur most consistently? Understanding which attributes of EMREOs are similar across participants and which show more variability will provide the groundwork for future comparisons with individuals with hearing abnormalities affecting the ear's various motor components. Here we report that in subjects with normal hearing thresholds and normal middle ear function, all ears show (a) measurable EMREOs (mean: 58.7 dB SPL; range 45-67 dB SPL for large contralateral saccades), (b) a phase reversal for contra- versus ipsilaterally-directed saccades, (c) a large peak in the signal occurring soon after saccade onset, (d) an additional large peak time-locked to saccade offset and (e) evidence that saccade duration is encoded in the signal. We interpret the attributes of EMREOs that are most consistent across subjects as the ones that are most likely to play an essential role in their function. The individual differences likely reflect normal variation in individuals' auditory system anatomy and physiology, much like traditional measures of auditory function such as auditory-evoked OAEs, tympanometry and auditory-evoked potentials. Future work will compare subjects with different types of auditory dysfunction to population data from normal hearing subjects. Overall, these findings provide important context for the widespread observations of visual- and eye-movement related signals found in cortical and subcortical auditory areas of the brain.
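
As a rough illustration of how attributes (b) and (c) might be checked, the sketch below builds saccade-onset-aligned averages from synthetic ear-canal recordings and quantifies the contra/ipsi phase reversal as a negative correlation between the direction-specific mean waveforms. Signal shapes, trial counts, and noise levels are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical saccade-onset-aligned ear-canal recordings (trials x time),
# built with a sign flip between contralateral and ipsilateral directions.
n_time = 120
t = np.linspace(0, 0.12, n_time)
emreo = np.sin(2 * np.pi * 30 * t) * np.exp(-t / 0.05)   # stand-in waveform
contra = emreo + rng.normal(0, 0.8, (80, n_time))
ipsi = -emreo + rng.normal(0, 0.8, (80, n_time))

mean_contra, mean_ipsi = contra.mean(axis=0), ipsi.mean(axis=0)

# A phase reversal shows up as a strong negative correlation between the
# direction-specific average waveforms.
r = np.corrcoef(mean_contra, mean_ipsi)[0, 1]
print(f"contra-vs-ipsi waveform correlation: {r:.2f}")  # approaches -1 for a full reversal
```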


Subject(s)
Hearing , Tympanic Membrane , Humans , Hearing/physiology , Otoacoustic Emissions, Spontaneous/physiology , Acoustic Impedance Tests , Sound
5.
Philos Trans R Soc Lond B Biol Sci ; 378(1886): 20220340, 2023 09 25.
Article in English | MEDLINE | ID: mdl-37545299

ABSTRACT

Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory spatial signals. The recent discovery of eye movement-related eardrum oscillations (EMREOs) suggests that this process could begin as early as the auditory periphery. How this reconciliation might happen remains poorly understood. Because humans and monkeys both have mobile eyes and therefore both must perform this shift of reference frames, comparison of the EMREO across species can provide insight into shared, and therefore important, parameters of the signal. Here we show that rhesus monkeys, like humans, have a consistent, significant EMREO signal that carries parametric information about eye displacement as well as onset times of eye movements. The dependence of the EMREO on the horizontal displacement of the eye is its most consistent feature, and is shared across behavioural tasks, subjects and species. Differences chiefly involve the waveform frequency (higher in monkeys than in humans), patterns of individual variation (more prominent in monkeys than in humans), and the waveform of the EMREO when factors due to horizontal and vertical eye displacements were controlled for. This article is part of the theme issue 'Decision and control processes in multisensory perception'.


Subject(s)
Eye Movements , Tympanic Membrane , Humans , Cues , Movement
6.
bioRxiv ; 2023 Jul 19.
Article in English | MEDLINE | ID: mdl-37502939

ABSTRACT

How neural representations preserve information about multiple stimuli is mysterious. Because tuning of individual neurons is coarse (for example, visual receptive field diameters can exceed perceptual resolution), the populations of neurons potentially responsive to each individual stimulus can overlap, raising the question of how information about each item might be segregated and preserved in the population. We recently reported evidence for a potential solution to this problem: when two stimuli were present, some neurons in the macaque visual cortical areas V1 and V4 exhibited fluctuating firing patterns, as if they responded to only one individual stimulus at a time. However, whether such an information encoding strategy is ubiquitous in the visual pathway and thus could constitute a general phenomenon remains unknown. Here we provide new evidence that such fluctuating activity is also evoked by multiple stimuli in visual areas responsible for processing visual motion (middle temporal visual area, MT), and faces (middle fundus and anterolateral face patches in inferotemporal cortex - areas MF and AL), thus extending the scope of circumstances in which fluctuating activity is observed. Furthermore, consistent with our previous results in the early visual area V1, MT exhibits fluctuations between the representations of two stimuli when these form distinguishable objects but not when they fuse into one perceived object, suggesting that fluctuating activity patterns may underlie visual object formation. Taken together, these findings point toward an updated model of how the brain preserves sensory information about multiple stimuli for subsequent processing and behavioral action. Impact Statement: We find neural fluctuations in multiple areas along the visual cortical hierarchy that could allow the brain to represent distinct co-occurring visual stimuli.

7.
bioRxiv ; 2023 Aug 06.
Article in English | MEDLINE | ID: mdl-36945521

ABSTRACT

We recently discovered a unique type of low-frequency otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscillations (EMREOs) are generated is unknown, with a role in visual-auditory integration being the likeliest candidate. Clues to both the drivers of EMREOs and their purpose can be gleaned by examining responses in normal hearing human subjects. Do EMREOs occur in all individuals with normal hearing? If so, what components of the response occur most consistently? Understanding which attributes of EMREOs are similar across participants and which show more variability will provide the groundwork for future comparisons with individuals with hearing abnormalities affecting the ear's various motor components. Here we report that in subjects with normal hearing thresholds and normal middle ear function, all ears show (a) measurable EMREOs (mean: 58.7 dB SPL; range 45-67 dB SPL for large contralateral saccades), (b) a phase reversal for contra- versus ipsilaterally-directed saccades, (c) a large peak in the signal occurring soon after saccade onset, (d) an additional large peak time-locked to saccade offset and (e) evidence that saccade duration is encoded in the signal. We interpret the attributes of EMREOs that are most consistent across subjects as the ones that are most likely to play an essential role in their function. The individual differences likely reflect normal variation in individuals' auditory system anatomy and physiology, much like traditional measures of auditory function such as auditory-evoked OAEs, tympanometry and auditory-evoked potentials. Future work will compare subjects with different types of auditory dysfunction to population data from normal hearing subjects. Overall, these findings provide important context for the widespread observations of visual- and eye-movement related signals found in cortical and subcortical auditory areas of the brain.

8.
bioRxiv ; 2023 May 22.
Article in English | MEDLINE | ID: mdl-36945629

ABSTRACT

Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory spatial signals. The recent discovery of eye movement-related eardrum oscillations (EMREOs) suggests that this process could begin as early as the auditory periphery. How this reconciliation might happen remains poorly understood. Because humans and monkeys both have mobile eyes and therefore both must perform this shift of reference frames, comparison of the EMREO across species can provide insight into shared, and therefore important, parameters of the signal. Here we show that rhesus monkeys, like humans, have a consistent, significant EMREO signal that carries parametric information about eye displacement as well as onset times of eye movements. The dependence of the EMREO on the horizontal displacement of the eye is its most consistent feature, and is shared across behavioral tasks, subjects, and species. Differences chiefly involve the waveform frequency (higher in monkeys than in humans), patterns of individual variation (more prominent in monkeys than in humans), and the waveform of the EMREO when factors due to horizontal and vertical eye displacements were controlled for.

9.
Elife ; 11, 2022 11 29.
Article in English | MEDLINE | ID: mdl-36444983

ABSTRACT

Sensory receptive fields are large enough that they can contain more than one perceptible stimulus. How, then, can the brain encode information about each of the stimuli that may be present at a given moment? We recently showed that when more than one stimulus is present, single neurons can fluctuate between coding one vs. the other(s) across some time period, suggesting a form of neural multiplexing of different stimuli (Caruso et al., 2018). Here, we investigate (a) whether such coding fluctuations occur in early visual cortical areas; (b) how coding fluctuations are coordinated across the neural population; and (c) how coordinated coding fluctuations depend on the parsing of stimuli into separate vs. fused objects. We found coding fluctuations do occur in macaque V1 but only when the two stimuli form separate objects. Such separate objects evoked a novel pattern of V1 spike count ('noise') correlations involving distinct distributions of positive and negative values. This bimodal correlation pattern was most pronounced among pairs of neurons showing the strongest evidence for coding fluctuations or multiplexing. Whether a given pair of neurons exhibited positive or negative correlations depended on whether the two neurons both responded better to the same object or had different object preferences. Distinct distributions of spike count correlations based on stimulus preferences were also seen in V4 for separate objects but not when two stimuli fused to form one object. These findings suggest multiple objects evoke different response dynamics than those evoked by single stimuli, lending support to the multiplexing hypothesis and suggesting a means by which information about multiple objects can be preserved despite the apparent coarseness of sensory coding.
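
A toy simulation shows how the bimodal correlation pattern described above can arise. Purely for illustration, suppose each neuron in a pair encodes exactly one of the two objects on each trial: pairs that switch together yield positive spike-count correlations, and pairs that switch oppositely yield negative ones. Rates, counting window, and trial counts are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

n_trials = 300
rate_a, rate_b = 30.0, 10.0   # hypothetical single-object rates (spikes/s), 1 s window

# On each dual-object trial, each neuron encodes one object. "Same-preference"
# pairs follow the same trial-by-trial switch; "anti" pairs follow opposite ones.
pick_a = rng.random(n_trials) < 0.5   # True -> this trial's switch lands on object A

def counts(encodes_a):
    return rng.poisson(np.where(encodes_a, rate_a, rate_b))

same_1, same_2 = counts(pick_a), counts(pick_a)    # both neurons track the same switch
anti_1, anti_2 = counts(pick_a), counts(~pick_a)   # neurons track opposite switches

print("same-preference pair r:", np.corrcoef(same_1, same_2)[0, 1])  # strongly positive
print("anti-preference pair r:", np.corrcoef(anti_1, anti_2)[0, 1])  # strongly negative
```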


Subject(s)
Visual Cortex , Animals , Neurons , Macaca , Brain
10.
Eur J Neurosci ; 55(2): 528-548, 2022 01.
Article in English | MEDLINE | ID: mdl-34844286

ABSTRACT

How we distinguish multiple simultaneous stimuli is uncertain, particularly given that such stimuli sometimes recruit largely overlapping populations of neurons. One commonly proposed hypothesis is that the sharpness of tuning curves might change to limit the number of stimuli driving any given neuron when multiple stimuli are present. To test this hypothesis, we recorded the activity of neurons in the inferior colliculus while monkeys made saccades to either one or two simultaneous sounds differing in frequency and spatial location. Although monkeys easily distinguished simultaneous sounds (~90% correct performance), the frequency selectivity of inferior colliculus neurons on dual-sound trials did not improve in any obvious way. Instead, frequency selectivity was degraded on dual-sound trials compared with single-sound trials: neural response functions broadened, and frequency accounted for less of the variance in firing rate. These changes in neural firing led a maximum-likelihood decoder to perform worse on dual-sound trials than on single-sound trials. These results fail to support the hypothesis that changes in frequency response functions serve to reduce the overlap in the representation of simultaneous sounds. Instead, they suggest that alternative possibilities, such as recent evidence of alternations in firing rate between the rates corresponding to each of the two stimuli, offer a more promising approach.
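
The decoding comparison can be sketched as follows. This is not the paper's decoder; it is a minimal Poisson maximum-likelihood example in which broadened tuning stands in for the degradation observed on dual-sound trials, and all tuning parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical frequency-tuned population: Gaussian tuning over an arbitrary
# frequency axis. "Broadened" tuning below is a stand-in for the dual-sound
# degradation; none of these parameters come from the paper.
freqs = np.linspace(0, 1, 9)            # candidate stimulus frequencies
centers = np.linspace(0, 1, 20)         # preferred frequencies of 20 model neurons
window = 0.05                           # 50 ms counting window (s)

def rates(f, width):
    return 5 + 40 * np.exp(-0.5 * ((f - centers) / width) ** 2)   # spikes/s

def ml_decode(counts, width):
    # Poisson log likelihood of each candidate frequency given the counts
    ll = [np.sum(counts * np.log(rates(f, width) * window) - rates(f, width) * window)
          for f in freqs]
    return freqs[int(np.argmax(ll))]

true_f = freqs[2]
for label, width in [("single-sound (sharp tuning)", 0.08), ("dual-sound (broadened)", 0.20)]:
    hits = sum(ml_decode(rng.poisson(rates(true_f, width) * window), width) == true_f
               for _ in range(500))
    print(f"{label}: {hits / 500:.0%} correct")   # sharp tuning decodes better
```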


Subject(s)
Inferior Colliculi , Sound Localization , Acoustic Stimulation , Animals , Inferior Colliculi/physiology , Macaca mulatta , Sound , Sound Localization/physiology
11.
Ann Appl Stat ; 15(1): 41-63, 2021 Mar.
Article in English | MEDLINE | ID: mdl-34413921

ABSTRACT

Conventional analysis of neuroscience data involves computing average neural activity over a group of trials and/or a period of time. This approach may be particularly problematic when assessing the response patterns of neurons to more than one simultaneously presented stimulus. In such cases the brain must represent each individual component of the stimuli bundle, but trial-and-time-pooled averaging methods are fundamentally unequipped to address the means by which multi-item representation occurs. We introduce and investigate a novel statistical analysis framework that relates the firing pattern of a single cell, exposed to a stimuli bundle, to the ensemble of its firing patterns under each constituent stimulus. Existing statistical tools focus on what may be called "first order stochasticity": trial-to-trial variation in the form of unstructured noise around a fixed firing rate curve associated with a given stimulus. Our analysis is based upon the theoretical premise that exposure to a stimuli bundle induces additional stochasticity in the cell's response pattern in the form of a stochastically varying recombination of its single stimulus firing rate curves. We discuss challenges to statistical estimation of such "second order stochasticity" and address them with a novel dynamic admixture point process (DAPP) model. DAPP is a hierarchical point process model that decomposes second order stochasticity into a Gaussian stochastic process and a random vector of interpretable features and facilitates borrowing of information on the latter across repeated trials through latent clustering. We illustrate the utility and accuracy of the DAPP analysis with synthetic data simulation studies. We present real-world evidence of second order stochastic variation with an analysis of monkey inferior colliculus recordings under auditory stimuli.
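
A simplified generative sketch conveys the flavor of the model. This is not the DAPP implementation (the actual model decomposes the latent process into interpretable features and clusters them across trials); it simply draws a dual-stimulus firing rate as a convex mixture of two single-stimulus rate curves, with the mixing weight driven by a smooth latent Gaussian process, and emits Poisson spikes.

```python
import numpy as np

rng = np.random.default_rng(5)

n_time = 200
t = np.linspace(0, 1, n_time)
rate_a = 40 * np.ones(n_time)        # hypothetical single-stimulus rate curves (spikes/s)
rate_b = 10 * np.ones(n_time)

# Smooth latent process: one sample from a squared-exponential Gaussian process
ell = 0.15
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / ell ** 2) + 1e-8 * np.eye(n_time)
eta = np.linalg.cholesky(K) @ rng.normal(size=n_time)

alpha = 1 / (1 + np.exp(-3 * eta))   # time-varying mixing weight in (0, 1)
dual_rate = alpha * rate_a + (1 - alpha) * rate_b

# Poisson spiking given the mixed rate: the "second order stochasticity" is the
# variation of the rate itself across time and trials, beyond Poisson noise
spikes = rng.poisson(dual_rate * (t[1] - t[0]))
print(spikes.sum(), "spikes this trial; mean alpha =", alpha.mean().round(2))
```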

12.
Annu Rev Vis Sci ; 7: 201-223, 2021 09 15.
Article in English | MEDLINE | ID: mdl-34242053

ABSTRACT

Coordination between different sensory systems is a necessary element of sensory processing. Where and how signals from different sense organs converge onto common neural circuitry have become topics of increasing interest in recent years. In this article, we focus specifically on visual-auditory interactions in areas of the mammalian brain that are commonly considered to be auditory in function. The auditory cortex and inferior colliculus are two key points of entry where visual signals reach the auditory pathway, and both contain visual- and/or eye movement-related signals in humans and other animals. The visual signals observed in these auditory structures reflect a mixture of visual modulation of auditory-evoked activity and visually driven responses that are selective for stimulus location or features. These key response attributes also appear in the classic visual pathway but may play a different role in the auditory pathway: to modify auditory rather than visual perception. Finally, while this review focuses on two particular areas of the auditory pathway where this question has been studied, robust descending as well as ascending connections within this pathway suggest that undiscovered visual signals may be present at other stages as well.


Subject(s)
Auditory Cortex , Animals , Auditory Cortex/physiology , Mammals , Sensation , Sense Organs , Visual Pathways , Visual Perception/physiology
13.
J Neurophysiol ; 126(1): 82-94, 2021 07 01.
Article in English | MEDLINE | ID: mdl-33852803

ABSTRACT

Stimulus locations are detected differently by different sensory systems, but ultimately they yield similar percepts and behavioral responses. How the brain transcends initial differences to compute similar codes is unclear. We quantitatively compared the reference frames of two sensory modalities, vision and audition, across three interconnected brain areas involved in generating saccades, namely the frontal eye fields (FEF), lateral and medial parietal cortex (M/LIP), and superior colliculus (SC). We recorded from single neurons in head-restrained monkeys performing auditory- and visually guided saccades from variable initial fixation locations and evaluated whether their receptive fields were better described as eye-centered, head-centered, or hybrid (i.e., not anchored uniquely to head or eye orientation). We found a progression of reference frames across areas and across time, with considerable hybrid-ness and persistent differences between modalities during most epochs/brain regions. For both modalities, the SC was more eye-centered than the FEF, which in turn was more eye-centered than the predominantly hybrid M/LIP. In all three areas and temporal epochs from stimulus onset to movement, visual signals were more eye-centered than auditory signals. In the SC and FEF, auditory signals became more eye-centered at the time of the saccade than they were initially after stimulus onset, but only in the SC at the time of the saccade did the auditory signals become "predominantly" eye-centered. The results indicate that visual and auditory signals both undergo transformations, ultimately reaching the same final reference frame but via different dynamics across brain regions and time.

NEW & NOTEWORTHY Models for visual-auditory integration posit that visual signals are eye-centered throughout the brain, whereas auditory signals are converted from head-centered to eye-centered coordinates. We show instead that both modalities largely employ hybrid reference frames: neither fully head- nor eye-centered. Across three hubs of the oculomotor network (intraparietal cortex, frontal eye field, and superior colliculus), visual and auditory signals evolve from hybrid to a common eye-centered format via different dynamics across brain areas and time.
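
One common way to operationalize the reference-frame question is sketched below: measure tuning to the same targets from two fixation positions and ask whether the curves superimpose better in head-centered or eye-centered alignment. This toy example is not the paper's analysis; the tuning function, fixation positions, and noise are invented, and the simulated neuron is eye-centered by construction.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical single-neuron test: tuning to the same head-centered targets
# measured from two fixations. For an eye-centered neuron the curve moves with
# the eyes, so realigning by fixation should superimpose the two measurements.
targets = np.arange(-24, 25, 6)        # target locations (deg, head-centered)
fix_left, fix_right = -6, 6            # the two initial fixation positions (deg)

def response(targets_eye):             # ground truth: eye-centered Gaussian tuning
    return 10 + 50 * np.exp(-0.5 * ((targets_eye - 5) / 8) ** 2)

tune_l = response(targets - fix_left) + rng.normal(0, 2, targets.size)
tune_r = response(targets - fix_right) + rng.normal(0, 2, targets.size)

# Head frame: compare curves at identical head-centered locations.
r_head = np.corrcoef(tune_l, tune_r)[0, 1]
# Eye frame: shift one curve by the fixation difference (12 deg = 2 samples).
shift = (fix_right - fix_left) // 6
r_eye = np.corrcoef(tune_l[:-shift], tune_r[shift:])[0, 1]
print(f"head-frame r = {r_head:.2f}, eye-frame r = {r_eye:.2f}")  # eye frame wins here
```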


Subject(s)
Auditory Perception/physiology , Frontal Lobe/physiology , Parietal Lobe/physiology , Saccades/physiology , Superior Colliculi/physiology , Visual Perception/physiology , Acoustic Stimulation/methods , Animals , Macaca mulatta , Photic Stimulation/methods , Time Factors
14.
J Neurophysiol ; 124(3): 715-727, 2020 09 01.
Article in English | MEDLINE | ID: mdl-32727263

ABSTRACT

The environment is sampled by multiple senses, which are woven together to produce a unified perceptual state. However, optimally unifying such signals requires assigning particular signals to the same or different underlying objects or events. Many prior studies (especially in animals) have assumed fusion of cross-modal information, whereas recent work in humans has begun to probe the appropriateness of this assumption. Here we present results from a novel behavioral task in which both monkeys (Macaca mulatta) and humans localized visual and auditory stimuli and reported their perceived sources through saccadic eye movements. When the locations of visual and auditory stimuli were widely separated, subjects made two saccades, whereas when the two stimuli were presented at the same location they made only a single saccade. Intermediate levels of separation produced mixed response patterns: a single saccade to an intermediate position on some trials or separate saccades to both locations on others. The distribution of responses was well described by a hierarchical causal inference model that accurately predicted both the explicit "same vs. different" source judgments as well as biases in localization of the source(s) under each of these conditions. The results from this task are broadly consistent with prior work in humans across a wide variety of analogous tasks, extending the study of multisensory causal inference to nonhuman primates and to a natural behavioral task with both a categorical assay of the number of perceived sources and a continuous report of the perceived position of the stimuli.

NEW & NOTEWORTHY We developed a novel behavioral paradigm for the study of multisensory causal inference in both humans and monkeys and found that both species make causal judgments in the same Bayes-optimal fashion. To our knowledge, this is the first demonstration of behavioral causal inference in animals, and this cross-species comparison lays the groundwork for future experiments using neuronal recording techniques that are impractical or impossible in human subjects.
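
The hierarchical causal inference model has a standard closed form for Gaussian likelihoods (in the spirit of Kording et al., 2007). The sketch below is illustrative rather than the paper's fitted model; the sensory and prior widths and the common-cause prior are hypothetical values.

```python
import numpy as np

def gauss(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

# Hypothetical parameters: visual, auditory, and prior variances (deg^2)
var_v, var_a, var_p = 2.0**2, 8.0**2, 15.0**2
p_common, mu_p = 0.5, 0.0          # prior probability of one shared source; prior mean

def causal_inference(xv, xa):
    # Likelihood of the cue pair under one source (source location integrated out)
    denom = var_v * var_a + var_v * var_p + var_a * var_p
    l1 = np.exp(-0.5 * ((xv - xa) ** 2 * var_p + (xv - mu_p) ** 2 * var_a
                        + (xa - mu_p) ** 2 * var_v) / denom) / (2 * np.pi * np.sqrt(denom))
    # Likelihood under two independent sources
    l2 = gauss(xv, mu_p, var_v + var_p) * gauss(xa, mu_p, var_a + var_p)
    pc1 = p_common * l1 / (p_common * l1 + (1 - p_common) * l2)

    # Reliability-weighted estimates under each causal structure
    s_common = (xv / var_v + xa / var_a + mu_p / var_p) / (1 / var_v + 1 / var_a + 1 / var_p)
    s_a_alone = (xa / var_a + mu_p / var_p) / (1 / var_a + 1 / var_p)
    # Model-averaged auditory estimate: pulled toward vision when pc1 is high
    return pc1, pc1 * s_common + (1 - pc1) * s_a_alone

for sep in [0, 8, 24]:
    pc1, s_hat = causal_inference(xv=float(sep), xa=0.0)
    print(f"separation {sep:2d} deg: P(one source)={pc1:.2f}, auditory estimate={s_hat:+.1f}")
```

As in the behavioral data, increasing the cue separation lowers the inferred probability of a common source, which in turn shrinks the visual bias on the auditory estimate.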


Subject(s)
Auditory Perception/physiology , Saccades/physiology , Space Perception/physiology , Thinking/physiology , Visual Perception/physiology , Adult , Animals , Eye-Tracking Technology , Female , Humans , Male , Sound Localization/physiology
15.
Article in English | MEDLINE | ID: mdl-34505116

ABSTRACT

We recently reported the existence of fluctuations in neural signals that may permit neurons to code multiple simultaneous stimuli sequentially across time [1]. This required deploying a novel statistical approach to permit investigation of neural activity at the scale of individual trials. Here we present tests using synthetic data to assess the sensitivity and specificity of this analysis. We fabricated datasets to match each of several potential response patterns derived from single-stimulus response distributions. In particular, we simulated dual stimulus trial spike counts that reflected fluctuating mixtures of the single stimulus spike counts, stable intermediate averages, single stimulus winner-take-all, or response distributions that were outside the range defined by the single stimulus responses (such as summation or suppression). We then assessed how well the analysis recovered the correct response pattern as a function of the number of simulated trials and the difference between the simulated responses to each "stimulus" alone. We found excellent recovery of the mixture, intermediate, and outside categories (>97% correct), and good recovery of the single/winner-take-all category (>90% correct) when the number of trials was >20 and the single-stimulus response rates were 50 Hz and 20 Hz, respectively. Both larger numbers of trials and greater separation between the single stimulus firing rates improved categorization accuracy. These results provide a benchmark, and guidelines for data collection, for use of this method to investigate coding of multiple items at the individual-trial time scale.
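
Fabricating the four response patterns is straightforward. The sketch below follows the abstract's 50 Hz / 20 Hz example with a 1 s counting window; the paper's exact simulation settings may differ.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative generation of the four synthetic dual-stimulus response patterns
rate_a, rate_b, n_trials = 50.0, 20.0, 25   # spikes/s; 1 s counting window

patterns = {
    "mixture":      rng.poisson(rng.choice([rate_a, rate_b], n_trials)),  # trial-wise switching
    "intermediate": rng.poisson(0.5 * (rate_a + rate_b), n_trials),       # stable average
    "single":       rng.poisson(rate_a, n_trials),                        # winner-take-all
    "outside":      rng.poisson(rate_a + rate_b, n_trials),               # e.g., summation
}
for name, counts in patterns.items():
    print(f"{name:12s} mean={counts.mean():5.1f}  var={counts.var():6.1f}")
```

Note how the mixture pattern matches the intermediate pattern in mean but not in variance; separating those two cases is exactly what the benchmarked analysis must do.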

16.
J Acoust Soc Am ; 146(2): EL177, 2019 08.
Article in English | MEDLINE | ID: mdl-31472570

ABSTRACT

Visual calibration of auditory space requires re-alignment of representations differing in (1) format (auditory hemispheric channels vs visual maps) and (2) reference frames (head-centered vs eye-centered). Here, a ventriloquism paradigm from Kopco, Lin, Shinn-Cunningham, and Groh [J. Neurosci. 29, 13809-13814 (2009)] was used to examine these processes in humans for ventriloquism induced within one spatial hemifield. Results show that (1) the auditory representation can be adapted even by aligned audio-visual stimuli, and (2) the spatial reference frame is primarily head-centered, with a weak eye-centered modulation. These results support the view that the ventriloquism aftereffect is driven by multiple spatially non-uniform, hemisphere-specific processes.


Subject(s)
Figural Aftereffect , Functional Laterality , Sound Localization , Brain/physiology , Cues , Eye Movements , Humans , Illusions/physiology , Speech Perception , Young Adult
17.
J Eye Mov Res ; 12(7), 2019 Nov 25.
Article in English | MEDLINE | ID: mdl-33828768

ABSTRACT

Keynote by Jenny Groh (Duke University) at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019. Information about eye movements with respect to the head is required for reconciling visual and auditory space. This keynote presentation describes recent findings concerning how eye movements affect early auditory processing via motor processes in the ear (eye movement-related eardrum oscillations, or EMREOs). Computational efforts to understand how eye movements are factored into auditory processing to produce a reference frame aligned with visual space uncovered a second critical issue: sound location is not mapped but is instead rate (meter) coded in the primate brain, unlike visual space. Meter coding would appear to limit the representation of multiple simultaneous sounds. The second part of this presentation concerns how such a meter code could use fluctuating activity patterns to circumvent this limitation. Video stream: https://vimeo.com/356576513

18.
Nat Commun ; 9(1): 2715, 2018 07 13.
Article in English | MEDLINE | ID: mdl-30006598

ABSTRACT

How the brain preserves information about multiple simultaneous items is poorly understood. We report that single neurons can represent multiple stimuli by interleaving signals across time. We record single units in an auditory region, the inferior colliculus, while monkeys localize 1 or 2 simultaneous sounds. During dual-sound trials, we find that some neurons fluctuate between firing rates observed for each single sound, either on a whole-trial or on a sub-trial timescale. These fluctuations are correlated in pairs of neurons, can be predicted by the state of local field potentials prior to sound onset, and, in one monkey, can predict which sound will be reported first. We find corroborating evidence of fluctuating activity patterns in a separate dataset involving responses of inferotemporal cortex neurons to multiple visual stimuli. Alternation between activity patterns corresponding to each of multiple items may therefore be a general strategy to enhance the brain processing capacity, potentially linking such disparate phenomena as variable neural firing, neural oscillations, and limits in attentional/memory capacity.
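
At the whole-trial timescale, the fluctuation question can be posed as a simple model comparison. The toy below asks whether simulated dual-sound spike counts are better explained by a 50/50 mixture of the two single-sound Poisson distributions or by a single Poisson at the intermediate rate; the rates are hypothetical, and the paper's actual analysis is a fuller Bayesian comparison across competing hypotheses.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(8)

mu_a, mu_b = 30.0, 8.0   # hypothetical single-sound mean counts
dual = rng.poisson(rng.choice([mu_a, mu_b], size=100))   # simulated fluctuating neuron

# Log likelihood under a mixture of the two single-sound distributions vs a
# single Poisson at the intermediate rate
ll_mix = np.sum(np.log(0.5 * poisson.pmf(dual, mu_a) + 0.5 * poisson.pmf(dual, mu_b)))
ll_int = np.sum(poisson.logpmf(dual, 0.5 * (mu_a + mu_b)))
print(f"log-likelihood mixture={ll_mix:.1f} vs intermediate={ll_int:.1f}")
# mixture >> intermediate here, flagging trial-to-trial switching
```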


Subject(s)
Action Potentials/physiology , Auditory Cortex/physiology , Auditory Perception/physiology , Inferior Colliculi/physiology , Neurons/physiology , Acoustic Stimulation , Animals , Attention/physiology , Auditory Cortex/cytology , Electrodes, Implanted , Female , Inferior Colliculi/cytology , Macaca mulatta , Neurons/cytology , Single-Cell Analysis , Sound , Stereotaxic Techniques
19.
J Neurophysiol ; 119(4): 1411-1421, 2018 04 01.
Article in English | MEDLINE | ID: mdl-29357464

ABSTRACT

We accurately perceive the visual scene despite moving our eyes ~3 times per second, an ability that requires incorporation of eye position and retinal information. In this study, we assessed how this neural computation unfolds across three interconnected structures: frontal eye fields (FEF), intraparietal cortex (LIP/MIP), and the superior colliculus (SC). Single-unit activity was assessed in head-restrained monkeys performing visually guided saccades from different initial fixations. As previously shown, the receptive fields of most LIP/MIP neurons shifted to novel positions on the retina for each eye position, and these locations were not clearly related to each other in either eye- or head-centered coordinates (defined as hybrid coordinates). In contrast, the receptive fields of most SC neurons were stable in eye-centered coordinates. In FEF, visual signals were intermediate between those patterns: around 60% were eye-centered, whereas the remainder showed changes in receptive field location, boundaries, or responsiveness that rendered the response patterns hybrid or occasionally head-centered. These results suggest that FEF may act as a transitional step in an evolution of coordinates between LIP/MIP and SC. The persistence across cortical areas of mixed representations that do not provide unequivocal location labels in a consistent reference frame has implications for how these representations must be read out.

NEW & NOTEWORTHY How we perceive the world as stable using mobile retinas is poorly understood. We compared the stability of visual receptive fields across different fixation positions in three visuomotor regions. Irregular changes in receptive field position were ubiquitous in intraparietal cortex, evident but less common in the frontal eye fields, and negligible in the superior colliculus (SC), where receptive fields shifted reliably across fixations. Only the SC provides a stable labeled-line code for stimuli across saccades.


Subject(s)
Electroencephalography/methods , Electrophysiological Phenomena , Frontal Lobe/physiology , Parietal Lobe/physiology , Saccades/physiology , Superior Colliculi/physiology , Visual Perception/physiology , Animals , Macaca mulatta
20.
Proc Natl Acad Sci U S A ; 115(6): E1309-E1318, 2018 02 06.
Article in English | MEDLINE | ID: mdl-29363603

ABSTRACT

Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in three subjects) performing a saccadic eye movement task to visual targets indicated that the eardrum moves in conjunction with the eye movement. The eardrum motion was oscillatory and began as early as 10 ms before saccade onset in humans or with saccade onset in monkeys. These eardrum movements, which we dub eye movement-related eardrum oscillations (EMREOs), occurred in the absence of a sound stimulus. The amplitude and phase of the EMREOs depended on the direction and horizontal amplitude of the saccade. They lasted throughout the saccade and well into subsequent periods of steady fixation. We discuss the possibility that the mechanisms underlying EMREOs create eye movement-related binaural cues that may aid the brain in evaluating the relationship between visual and auditory stimulus locations as the eyes move.


Subject(s)
Auditory Pathways/physiology , Brain/physiology , Hearing/physiology , Saccades/physiology , Tympanic Membrane/physiology , Adolescent , Adult , Animals , Female , Humans , Macaca mulatta , Male , Photic Stimulation , Young Adult