Results 1 - 20 of 508
1.
Nat Commun ; 15(1): 4829, 2024 Jun 06.
Article in English | MEDLINE | ID: mdl-38844438

ABSTRACT

Orientation or axial selectivity, the property of neurons in the visual system to respond preferentially to certain angles of visual stimuli, plays a pivotal role in our understanding of visual perception and information processing. This computation is performed as early as the retina, and although much work has established the cellular mechanisms of retinal orientation selectivity, how this computation is organized across the retina is unknown. Using a large dataset collected across the mouse retina, we demonstrate functional organization rules of retinal orientation selectivity. First, we identify three major functional classes of retinal cells that are orientation selective and match previous descriptions. Second, we show that one orientation is predominantly represented in the retina and that this predominant orientation changes as a function of retinal location. Third, we demonstrate that neural activity plays little role in the organization of retinal orientation selectivity. Lastly, we use in silico modeling followed by validation experiments to demonstrate that the overrepresented orientation aligns along concentric axes. These results demonstrate that, similar to direction selectivity, orientation selectivity is organized in a functional map as early as the retina.
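
A rough geometric illustration of what "aligns along concentric axes" means is sketched below (this is not the authors' code; the map centre at the origin and the helper concentric_orientation are assumptions made purely for illustration):

```python
# Hypothetical sketch: for a cell at retinal position (x, y) relative to an
# assumed map centre, the "concentric" orientation is tangential, i.e.
# perpendicular to the radial direction, folded into the 0-180 deg range.
import numpy as np

def concentric_orientation(x, y, centre=(0.0, 0.0)):
    """Tangential (concentric) axis at (x, y), in degrees within [0, 180)."""
    radial = np.degrees(np.arctan2(y - centre[1], x - centre[0]))
    return (radial + 90.0) % 180.0

# Cells placed around a ring prefer orientations tangent to that ring, so the
# predominant orientation changes systematically with retinal location.
for angle in (0, 45, 90, 135):
    x, y = np.cos(np.radians(angle)), np.sin(np.radians(angle))
    print(f"location {angle:3d} deg around the centre -> "
          f"concentric orientation {concentric_orientation(x, y):5.1f} deg")
```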


Subject(s)
Orientation , Retina , Animals , Retina/physiology , Mice , Orientation/physiology , Photic Stimulation , Mice, Inbred C57BL , Computer Simulation , Visual Perception/physiology , Models, Neurological , Orientation, Spatial/physiology , Retinal Ganglion Cells/physiology
2.
Sci Rep ; 14(1): 10164, 2024 05 03.
Article in English | MEDLINE | ID: mdl-38702338

ABSTRACT

Orientation processing is one of the most fundamental functions in both visual and somatosensory perception. Converging findings suggest that orientation processing in both modalities is closely linked: somatosensory neurons share an orientation organisation similar to that of visual neurons, and the visual cortex has been found to be heavily involved in tactile orientation perception. Hence, we hypothesized that somatosensation would exhibit a similar orientation adaptation effect, and that this adaptation effect would be transferable between the two modalities, considering the above-mentioned connection. The tilt aftereffect (TAE) is a demonstration of orientation adaptation and is used widely in behavioural experiments to investigate orientation mechanisms in vision. By testing the classic TAE paradigm in both tactile and crossmodal orientation tasks between vision and touch, we were able to show that tactile perception of orientation shows a very robust TAE, similar to its visual counterpart. We further show that orientation adaptation in touch transfers to produce a TAE when tested in vision, but not vice versa. Additionally, when examining the test sequence following adaptation for serial effects, we observed another asymmetry between the two conditions, where the visual test sequence displayed a repulsive intramodal serial dependence effect while the tactile test sequence exhibited an attractive serial dependence. These findings provide concrete evidence that vision and touch engage a similar orientation processing mechanism. However, the asymmetry in the crossmodal transfer of TAE and serial dependence points to a non-reciprocal connection between the two modalities, providing further insights into the underlying processing mechanism.


Subject(s)
Adaptation, Physiological , Touch Perception , Visual Perception , Humans , Male , Female , Adult , Touch Perception/physiology , Visual Perception/physiology , Young Adult , Orientation/physiology , Touch/physiology , Orientation, Spatial/physiology , Vision, Ocular/physiology , Visual Cortex/physiology
3.
Proc Natl Acad Sci U S A ; 121(23): e2312851121, 2024 Jun 04.
Article in English | MEDLINE | ID: mdl-38771864

ABSTRACT

The way goal-oriented birds adjust their travel direction and route in response to wind significantly affects their travel costs. This is expected to be particularly pronounced in pelagic seabirds, which utilize a wind-dependent flight style called dynamic soaring. Dynamic soaring seabirds in situations without a definite goal, e.g., searching for prey, are known to preferentially fly with crosswinds or quartering-tailwinds to increase the speed and search area, and reduce travel costs. However, little is known about their reaction to wind when heading to a definite goal, such as homing. Homing tracks of wandering albatrosses (Diomedea exulans) vary from beelines to zigzags, which are similar to those of sailboats. Here, given that both albatrosses and sailboats travel slower in headwinds and tailwinds, we tested whether the time-minimizing strategies used by yacht racers can be compared to the locomotion patterns of wandering albatrosses. We predicted that when the goal is located upwind or downwind, albatrosses should deviate their travel directions from the goal on the mesoscale and increase the number of turns on the macroscale. Both hypotheses were supported by track data from albatrosses and racing yachts in the Southern Ocean, confirming that albatrosses qualitatively employ the same strategy as yacht racers. Nevertheless, albatrosses did not strictly minimize their travel time, likely making their flight robust against wind fluctuations to reduce flight costs. Our study provides empirical evidence of tacking in albatrosses and demonstrates that man-made movement strategies provide a new perspective on the laws underlying wildlife movement.
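
The time-minimizing logic referenced above can be made concrete with a toy velocity-made-good calculation (the speed polar below is invented for illustration and is not derived from the albatross or yacht data): when ground speed drops in head- and tailwinds, the heading that maximizes progress toward an upwind goal deviates substantially from the direct bearing.

```python
# Toy calculation (not from the paper): why a traveller that is slow directly
# up/downwind should not aim straight at an upwind goal. Assumed polar:
# 4 m/s straight up/downwind, 16 m/s in crosswind.
import numpy as np

theta = np.radians(np.arange(0, 91))          # deviation from the upwind goal (deg)
speed = 4.0 + (16.0 - 4.0) * np.sin(theta)    # assumed ground-speed polar
vmg = speed * np.cos(theta)                   # velocity made good toward the goal

best = int(np.argmax(vmg))
print(f"heading straight at the goal: VMG = {vmg[0]:.1f} m/s")
print(f"optimal deviation ~{np.degrees(theta[best]):.0f} deg: VMG = {vmg[best]:.1f} m/s")
```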


Subject(s)
Birds , Flight, Animal , Wind , Animals , Flight, Animal/physiology , Birds/physiology , Orientation/physiology , Homing Behavior/physiology , Orientation, Spatial/physiology , Animal Migration/physiology
4.
Behav Processes ; 218: 105041, 2024 May.
Article in English | MEDLINE | ID: mdl-38692460

ABSTRACT

A previous study demonstrated that rodents on an inclined square platform traveled straight vertically or horizontally and avoided diagonal travel. Through this behavior, they aligned their head with the horizontal plane, acquiring similar bilateral vestibular cues, a basic requirement for spatial orientation and a salient feature of animals in motion. This behavior had previously been shown to be conspicuous in Tristram's jirds. Here, therefore, jirds were challenged by testing their travel behavior on a circular arena inclined at 0°-75°. Our hypothesis was that if, as is typical of rodents, the jirds would follow the curved arena wall, they would need to display a compensating mechanism to enable traveling in such a path shape, which involves a tilted frontal head axis and unbalanced bilateral vestibular cues. We found that with the increase in inclination, the jirds remained more in the lower section of the arena (geotaxis). When tested on the steep inclinations, however, their travel away from the arena wall was strictly straight up or down, in contrast to the curved paths that followed the circular arena wall. We suggest that traveling along a circular path while maintaining contact with the wall (thigmotaxis) provided tactile information that compensated for the unbalanced bilateral vestibular cues present when traveling along such curved inclined paths. In the latter case, the frontal plane of the head was in a diagonal posture in relation to gravity, a posture that was avoided when traveling away from the wall.


Subject(s)
Cues , Orientation, Spatial , Vestibule, Labyrinth , Animals , Vestibule, Labyrinth/physiology , Orientation, Spatial/physiology , Male , Touch/physiology , Posture/physiology , Touch Perception/physiology
5.
J Vis ; 24(5): 2, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38691087

ABSTRACT

Historically, in many perceptual learning experiments, only a single stimulus is practiced, and learning is often specific to the trained feature. Our prior work has demonstrated that multi-stimulus learning (e.g., training-plus-exposure procedure) has the potential to achieve generalization. Here, we investigated two important characteristics of multi-stimulus learning, namely, roving and feature variability, and their impacts on multi-stimulus learning and generalization. We adopted a feature detection task in which an oddly oriented target bar differed by 16° from the background bars. The stimulus onset asynchrony threshold between the target and the mask was measured with a staircase procedure. Observers were trained with four target orientation search stimuli, either with a 5° deviation (30°-35°-40°-45°) or with a 45° deviation (30°-75°-120°-165°), and the four reference stimuli were presented in a roving manner. The transfer of learning to the swapped target-background orientations was evaluated after training. We found that multi-stimulus training with a 5° deviation resulted in significant learning improvement, but learning failed to transfer to the swapped target-background orientations. In contrast, training with a 45° deviation slowed learning but produced a significant generalization to swapped orientations. Furthermore, a modified training-plus-exposure procedure, in which observers were trained with four orientation search stimuli with a 5° deviation and simultaneously passively exposed to orientations with high feature variability (45° deviation), led to significant orientation learning generalization. Learning transfer also occurred when the four orientation search stimuli with a 5° deviation were presented in separate blocks. These results help us to specify the conditions under which multi-stimulus learning produces generalization, which holds potential for real-world applications of perceptual learning, such as vision rehabilitation and expert training.
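
For readers unfamiliar with the threshold procedure mentioned above, the sketch below shows a generic adaptive staircase of the kind used to estimate a stimulus onset asynchrony (SOA) threshold (the 3-down/1-up rule, step size, and simulated observer are illustrative assumptions, not the study's exact parameters):

```python
import random

def run_staircase(p_correct_at, soa=200.0, step=20.0, n_trials=80):
    """3-down/1-up staircase on the target-mask SOA (ms); returns a threshold estimate."""
    streak, reversals, last_dir = 0, [], None
    for _ in range(n_trials):
        if random.random() < p_correct_at(soa):      # simulated observer response
            streak += 1
            if streak < 3:
                continue
            streak, direction = 0, -1                # 3 correct -> harder (shorter SOA)
        else:
            streak, direction = 0, +1                # 1 error -> easier (longer SOA)
        if last_dir is not None and direction != last_dir:
            reversals.append(soa)                    # record reversal points
        last_dir = direction
        soa = max(10.0, soa + direction * step)
    tail = reversals[-6:] or [soa]
    return sum(tail) / len(tail)                     # mean of the last reversals

# Dummy observer whose accuracy grows with SOA, purely for illustration.
print(f"estimated SOA threshold ~{run_staircase(lambda s: min(0.99, 0.5 + s / 400.0)):.0f} ms")
```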


Subject(s)
Photic Stimulation , Humans , Young Adult , Male , Female , Adult , Photic Stimulation/methods , Learning/physiology , Transfer, Psychology/physiology , Orientation, Spatial/physiology , Orientation/physiology
6.
Nature ; 626(8000): 819-826, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38326621

ABSTRACT

To navigate, we must continuously estimate the direction we are headed in, and we must correct deviations from our goal1. Direction estimation is accomplished by ring attractor networks in the head direction system2,3. However, we do not fully understand how the sense of direction is used to guide action. Drosophila connectome analyses4,5 reveal three cell populations (PFL3R, PFL3L and PFL2) that connect the head direction system to the locomotor system. Here we use imaging, electrophysiology and chemogenetic stimulation during navigation to show how these populations function. Each population receives a shifted copy of the head direction vector, such that their three reference frames are shifted approximately 120° relative to each other. Each cell type then compares its own head direction vector with a common goal vector; specifically, it evaluates the congruence of these vectors via a nonlinear transformation. The output of all three cell populations is then combined to generate locomotor commands. PFL3R cells are recruited when the fly is oriented to the left of its goal, and their activity drives rightward turning; the reverse is true for PFL3L. Meanwhile, PFL2 cells increase steering speed, and are recruited when the fly is oriented far from its goal. PFL2 cells adaptively increase the strength of steering as directional error increases, effectively managing the tradeoff between speed and accuracy. Together, our results show how a map of space in the brain can be combined with an internal goal to generate action commands, via a transformation from world-centric coordinates to body-centric coordinates.
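
A toy numerical sketch of the comparison described above is given below (the exact phase offsets, the form of the nonlinearity, and the readout are illustrative assumptions, not the fitted circuit model):

```python
import numpy as np

# Assumed phase offsets of the three reference frames (deg), ~120 deg apart.
OFFSETS = {"PFL3L": -60.0, "PFL3R": +60.0, "PFL2": 180.0}

def activity(heading, goal, offset, power=2):
    """Toy nonlinear congruence between a shifted heading copy and the goal."""
    return ((1.0 + np.cos(np.radians(heading + offset - goal))) / 2.0) ** power

def steering(heading, goal):
    a = {name: activity(heading, goal, off) for name, off in OFFSETS.items()}
    turn = a["PFL3R"] - a["PFL3L"]     # >0: rightward turning drive
    speed = a["PFL2"]                  # grows with distance from the goal
    return turn, speed

# Sign convention assumed: negative error = fly oriented to the left of its goal.
for err in (-150, -90, -30, 0, 30, 90, 150):
    turn, speed = steering(heading=err, goal=0.0)
    print(f"heading error {err:+4d} deg -> turn {turn:+.2f}, steering strength {speed:.2f}")
```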


Subject(s)
Brain , Drosophila melanogaster , Goals , Head , Neurons , Orientation, Spatial , Spatial Navigation , Animals , Brain/cytology , Brain/physiology , Connectome , Drosophila melanogaster/cytology , Drosophila melanogaster/physiology , Head/physiology , Locomotion/physiology , Neurons/classification , Neurons/physiology , Orientation, Spatial/physiology , Spatial Navigation/physiology , Time Factors
7.
Nature ; 626(8000): 808-818, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38326612

ABSTRACT

Neuronal signals that are relevant for spatial navigation have been described in many species1-10. However, a circuit-level understanding of how such signals interact to guide navigational behaviour is lacking. Here we characterize a neuronal circuit in the Drosophila central complex that compares internally generated estimates of the heading and goal angles of the fly, both of which are encoded in world-centred (allocentric) coordinates, to generate a body-centred (egocentric) steering signal. Past work has suggested that the activity of EPG neurons represents the fly's moment-to-moment angular orientation, or heading angle, during navigation2,11. An animal's moment-to-moment heading angle, however, is not always aligned with its goal angle, that is, the allocentric direction in which it wishes to progress forward. We describe FC2 cells12, a second set of neurons in the Drosophila brain with activity that correlates with the fly's goal angle. Focal optogenetic activation of FC2 neurons induces flies to orient along experimenter-defined directions as they walk forward. EPG and FC2 neurons connect monosynaptically to a third neuronal class, PFL3 cells12,13. We found that individual PFL3 cells show conjunctive, spike-rate tuning to both the heading angle and the goal angle during goal-directed navigation. Informed by the anatomy and physiology of these three cell classes, we develop a model that explains how this circuit compares allocentric heading and goal angles to build an egocentric steering signal in the PFL3 output terminals. Quantitative analyses and optogenetic manipulations of PFL3 activity support the model. Finally, using a new navigational memory task, we show that flies expressing disruptors of synaptic transmission in subsets of PFL3 cells have a reduced ability to orient along arbitrary goal directions, with an effect size in quantitative accordance with the prediction of our model. The biological circuit described here reveals how two population-level allocentric signals are compared in the brain to produce an egocentric output signal that is appropriate for motor control.
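
The population-level comparison can be illustrated with the following sketch, in which a heading bump (EPG-like) and a goal bump (FC2-like) are combined by conjunctively tuned left/right populations to yield a turn signal (bump shapes, the ±45° offsets, and the multiplicative conjunction are simplifying assumptions, not the authors' model):

```python
import numpy as np

angles = np.arange(0, 360, 22.5)                     # allocentric bins (deg)
bump = lambda centre: np.exp(np.cos(np.radians(angles - centre)) / 0.3)

def turn_signal(heading_deg, goal_deg, offset=45.0):
    goal = bump(goal_deg)                            # FC2-like goal bump
    # Left/right PFL3-like populations read the heading (EPG-like) bump at
    # shifted phases and conjunctively combine it with the goal bump.
    right = np.sum(bump(heading_deg + offset) * goal)
    left = np.sum(bump(heading_deg - offset) * goal)
    return right - left                              # >0: turn right

# Convention assumed: negative heading = fly oriented to the left of the goal.
for heading in (-60, -20, 0, 20, 60):                # goal fixed at 0 deg
    print(f"heading {heading:+4d} deg vs goal 0 deg -> "
          f"turn signal {turn_signal(heading, 0.0):+9.1f}")
```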


Subject(s)
Brain , Drosophila melanogaster , Goals , Head , Neural Pathways , Orientation, Spatial , Spatial Navigation , Animals , Action Potentials , Brain/cytology , Brain/physiology , Drosophila melanogaster/cytology , Drosophila melanogaster/physiology , Head/physiology , Locomotion , Neurons/metabolism , Optogenetics , Orientation, Spatial/physiology , Space Perception/physiology , Spatial Memory/physiology , Spatial Navigation/physiology , Synaptic Transmission
8.
Atten Percept Psychophys ; 86(3): 909-930, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38253985

ABSTRACT

Can synchrony in stimulation guide attention and aid perceptual performance? Here, in a series of three experiments, we tested the influence of visual and auditory synchrony on attentional selection during a novel human foraging task. Human foraging tasks are a recent extension of the classic visual search paradigm in which multiple targets must be located on a given trial, making it possible to capture a wide range of performance metrics. Experiment 1 was performed online, where the task was to forage for 10 (out of 20) vertical lines among 60 randomly oriented distractor lines that changed color between yellow and blue at random intervals. The targets either changed colors in visual synchrony or not. In another condition, a non-spatial sound additionally occurred synchronously with the color change of the targets. Experiment 2 was run in the laboratory (within-subjects) with the same design. When the targets changed color in visual synchrony, foraging times were significantly shorter than when they randomly changed colors, but there was no additional benefit for the sound synchrony, in contrast to predictions from the so-called "pip-and-pop" effect (Van der Burg et al., Journal of Experimental Psychology, 1053-1065, 2008). In Experiment 3, task difficulty was increased as participants foraged for as many 45° rotated lines as possible among lines of different orientations within 10 s, with the same synchrony conditions as in Experiments 1 and 2. Again, there was a large benefit of visual synchrony but no additional benefit for sound synchronization. Our results provide strong evidence that visual synchronization can guide attention during multiple target foraging. This likely reflects the local grouping of the synchronized targets. Importantly, there was no additional benefit for sound synchrony, even when the foraging task was quite difficult (Experiment 3).


Subject(s)
Attention , Color Perception , Pattern Recognition, Visual , Humans , Attention/physiology , Female , Color Perception/physiology , Male , Young Adult , Pattern Recognition, Visual/physiology , Adult , Auditory Perception/physiology , Reaction Time/physiology , Orientation, Spatial/physiology , Adolescent , Orientation
9.
J Vis ; 23(7): 5, 2023 Jul 03.
Article in English | MEDLINE | ID: mdl-37410493

ABSTRACT

Perception depends both on the current sensory input and on the preceding stimulus history, a mechanism referred to as serial dependence (SD). One interesting, and somewhat controversial, question is whether serial dependence originates at the perceptual stage, which should lead to a sensory improvement, or at a subsequent decisional stage, causing solely a bias. Here, we studied the effects of SD in a novel manner by leveraging the human capacity to spontaneously assess the quality of sensory information. Two noisy oriented Gabor stimuli were simultaneously presented along with two bars of the same orientation as the Gabor stimuli. Participants were asked to choose which Gabor stimulus to judge and then make a forced-choice judgment of its orientation by selecting the appropriate response bar. On all trials, one of the Gabor stimuli had the same orientation as the Gabor in the same position on the previous trial. We explored whether continuity in orientation and position affected choice and accuracy. Results show that continuity of orientation leads to a persistent (up to four trials back) accuracy advantage and a higher preference in the selection of stimuli with the same orientation, and this advantage accumulates over trials. In contrast, analysis of the continuity of the selected position indicated that participants had a strong tendency to choose stimuli in the same position, but this behavior did not lead to an improvement in accuracy.


Subject(s)
Decision Making , Visual Perception , Humans , Decision Making/physiology , Visual Perception/physiology , Orientation, Spatial/physiology , Bias , Judgment
10.
Conscious Cogn ; 113: 103547, 2023 08.
Article in English | MEDLINE | ID: mdl-37390767

ABSTRACT

The peripersonal space, that is, the limited space surrounding the body, involves multisensory coding and representation of the self in space. Previous studies have shown that peripersonal space representation and the visual perspective on the environment can be dramatically altered when neurotypical individuals self-identify with a distant avatar (i.e., in virtual reality) or during clinical conditions (i.e., out-of-body experience, heautoscopy, depersonalization). Despite its role in many cognitive/social functions, the perception of peripersonal space in dreams, and its relationship with the perception of other characters (interpersonal distance in dreams), remain largely uncharted. The present study aimed to explore the visuospatial properties of this space, which is likely to underlie self-location as well as self/other distinction in dreams. 530 healthy volunteers answered a web-based questionnaire to measure their dominant visuo-spatial perspective in dreams, the frequency of recall for felt distances between their dream self and other dream characters, and the dreamers' viewing angle of other dream characters. Most participants reported dream experiences from a first-person perspective (1PP) (82%) compared to a third-person perspective (3PP) (18%). Independent of their dream perspective, participants reported that they generally perceived other dream characters more often in their close space, that is, at distances of either 0-90 cm or 90-180 cm, than in farther spaces (180-270 cm). Regardless of the perspective (1PP or 3PP), both groups also reported more frequently seeing other dream characters from eye level (0° angle of viewing) than from above (30° and 60°) or below eye level (-30° and -60°). Moreover, the intensity of sensory experiences in dreams, as measured by the Bodily Self-Consciousness in Dreams Questionnaire, was higher in individuals who habitually see other dream characters closer to their personal dream self (i.e., within 0-90 cm and 90-180 cm). These preliminary findings offer a new, phenomenological account of space representation in dreams with regard to the felt presence of others. They might provide insights not only into our understanding of how dreams are formed, but also into the type of neurocomputations involved in self/other distinction.


Subject(s)
Dreams , Orientation , Dreams/physiology , Dreams/psychology , Surveys and Questionnaires , Consciousness/physiology , Humans , Orientation/physiology , Self Report , Regression Analysis , Orientation, Spatial/physiology , Mental Recall , Wakefulness/physiology , Male , Female , Adolescent , Young Adult , Adult
11.
Nature ; 615(7954): 892-899, 2023 03.
Article in English | MEDLINE | ID: mdl-36949190

ABSTRACT

The head direction (HD) system functions as the brain's internal compass1,2, classically formalized as a one-dimensional ring attractor network3,4. In contrast to a globally consistent magnetic compass, the HD system does not have a universal reference frame. Instead, it anchors to local cues, maintaining a stable offset when cues rotate5-8 and drifting in the absence of referents5,8-10. However, questions about the mechanisms that underlie anchoring and drift remain unresolved and are best addressed at the population level. For example, the extent to which the one-dimensional description of population activity holds under conditions of reorientation and drift is unclear. Here we performed population recordings of thalamic HD cells using calcium imaging during controlled rotations of a visual landmark. Across experiments, population activity varied along a second dimension, which we refer to as network gain, especially under circumstances of cue conflict and ambiguity. Activity along this dimension predicted realignment and drift dynamics, including the speed of network realignment. In the dark, network gain maintained a 'memory trace' of the previously displayed landmark. Further experiments demonstrated that the HD network returned to its baseline orientation after brief, but not longer, exposures to a rotated cue. This experience dependence suggests that memory of previous associations between HD neurons and allocentric cues is maintained and influences the internal HD representation. Building on these results, we show that continuous rotation of a visual landmark induced rotation of the HD representation that persisted in darkness, demonstrating experience-dependent recalibration of the HD system. Finally, we propose a computational model to formalize how the neural compass flexibly adapts to changing environmental cues to maintain a reliable representation of HD. These results challenge classical one-dimensional interpretations of the HD system and provide insights into the interactions between this system and the cues to which it anchors.
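
As a toy illustration of the two descriptive dimensions emphasized above (this is not the paper's analysis; the cell count and tuning width are arbitrary), HD population activity can be summarized by a bump position, the classic one-dimensional ring coordinate, and a bump amplitude corresponding to network gain:

```python
import numpy as np

n_cells = 40
pref = np.linspace(0, 2 * np.pi, n_cells, endpoint=False)   # preferred HDs (rad)

def summarise(activity):
    """Return (decoded HD in deg, network gain) from a population vector."""
    vec = np.sum(activity * np.exp(1j * pref))
    return np.degrees(np.angle(vec)) % 360, np.abs(vec) / n_cells

# Same decoded direction, different bump amplitude ("network gain").
for label, hd, amp in (("stable cue", 90.0, 1.0), ("cue conflict", 90.0, 0.4)):
    bump = amp * np.exp(np.cos(pref - np.radians(hd)) - 1.0)  # tuning-curve bump
    est_hd, est_gain = summarise(bump)
    print(f"{label:12s}: decoded HD {est_hd:5.1f} deg, network gain {est_gain:.3f}")
```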


Subject(s)
Cues , Head , Neurons , Orientation , Thalamus , Calcium Signaling , Head/physiology , Neurons/cytology , Neurons/physiology , Orientation/physiology , Orientation, Spatial/physiology , Rotation , Thalamus/cytology , Thalamus/physiology
12.
Nat Commun ; 13(1): 817, 2022 02 10.
Article in English | MEDLINE | ID: mdl-35145124

ABSTRACT

Social behaviours characterize cooperative, mutualistic, aggressive or parental interactions that occur among conspecifics. Although the Ventral Tegmental Area (VTA) has been identified as a key substrate for social behaviours, the input and output pathways dedicated to specific aspects of conspecific interaction remain understudied. Here, in male mice, we investigated the activity and function of two distinct VTA inputs from superior colliculus (SC-VTA) and medial prefrontal cortex (mPFC-VTA). We observed that SC-VTA neurons display social interaction anticipatory calcium activity, which correlates with orienting responses towards an unfamiliar conspecific. In contrast, mPFC-VTA neuron population activity increases after initiation of the social contact. While protracted phasic stimulation of the SC-VTA pathway promotes head/body movements and decreases social interaction, inhibition of this pathway increases social interaction. We further found that SC afferents mainly target a subpopulation of dorsolateral striatum (DLS)-projecting VTA dopamine (DA) neurons (VTADA-DLS). While VTADA-DLS pathway stimulation decreases social interaction, VTADA-Nucleus Accumbens stimulation promotes it. Altogether, these data support a model by which at least two largely anatomically distinct VTA sub-circuits oppositely control distinct aspects of social behaviour.


Subject(s)
Neural Pathways/physiology , Orientation, Spatial/physiology , Social Interaction , Superior Colliculi/pathology , Ventral Tegmental Area/physiology , Animals , Dopaminergic Neurons/physiology , Male , Mice , Mice, Inbred C57BL , Neurons/physiology , Nucleus Accumbens/physiology , Prefrontal Cortex/physiology , Social Behavior
13.
Curr Opin Neurobiol ; 73: 102514, 2022 04.
Article in English | MEDLINE | ID: mdl-35196623

ABSTRACT

Insects can perform impressive feats of navigation, suggesting a sophisticated sense of direction and an ability to choose appropriate trajectories toward ethological goals. The hypothesized substrate for these navigational abilities is the central complex (CX), a midline brain structure with orderly topology. The circuit transformations performed by the CX are now being concretely described by recent advances in the study of fruit fly neural circuits. An emerging theme is dynamic representation of navigational variables (e.g. heading or travel direction) computed in a manner distributed across specific neuronal populations. These representations are shaped by multimodal inputs whose weights evolve rapidly as surroundings change. Investigation of CX circuits is revealing with precise detail how structured wiring and synaptic plasticity enable neural circuits to flexibly subsample from the currently available sensory and motor cues to build a stable and accurate map of space. Given the sensory richness of natural environments, these findings are encouraging insect neuroscientists to no longer ask which cues insects use to navigate, but instead which cues insects can use, and in which contexts.


Subject(s)
Drosophila , Orientation, Spatial , Animals , Brain/physiology , Cues , Insecta/physiology , Neurons/physiology , Orientation, Spatial/physiology
14.
Nat Commun ; 13(1): 120, 2022 01 10.
Article in English | MEDLINE | ID: mdl-35013266

ABSTRACT

The vestibular system detects head motion to coordinate vital reflexes and provide our sense of balance and spatial orientation. A long-standing hypothesis has been that projections from the central vestibular system back to the vestibular sensory organs (i.e., the efferent vestibular system) mediate adaptive sensory coding during voluntary locomotion. However, direct proof for this idea has been lacking. Here we recorded from individual semicircular canal and otolith afferents during walking and running in monkeys. Using a combination of mathematical modeling and nonlinear analysis, we show that afferent encoding is actually identical across passive and active conditions, irrespective of context. Thus, taken together, our results are instead consistent with the view that the vestibular periphery relays robust information to the brain during primate locomotion, suggesting that context-dependent modulation instead occurs centrally to ensure that coding is consistent with behavioral goals during locomotion.
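
For context, afferent encoding during sinusoidal motion is commonly summarized by a gain and a phase relative to head velocity; the sketch below shows one generic way to estimate these quantities from synthetic data (the numbers and the least-squares fit are illustrative and are not the authors' modeling pipeline):

```python
import numpy as np

fs, f0, dur = 1000.0, 2.0, 5.0                      # Hz sampling, 2 Hz motion, 5 s
t = np.arange(0, dur, 1 / fs)
velocity = 40.0 * np.sin(2 * np.pi * f0 * t)        # head velocity (deg/s)
# Synthetic afferent: 0.8 (sp/s)/(deg/s) sensitivity, 20 deg phase lead, noise.
rate = 90 + 0.8 * 40.0 * np.sin(2 * np.pi * f0 * t + np.radians(20)) \
       + np.random.default_rng(0).normal(0, 5, t.size)

# Least-squares fit of rate onto sin/cos components at the stimulus frequency.
X = np.column_stack([np.sin(2 * np.pi * f0 * t), np.cos(2 * np.pi * f0 * t), np.ones_like(t)])
a, b, bias = np.linalg.lstsq(X, rate, rcond=None)[0]
gain = np.hypot(a, b) / 40.0                        # (sp/s) per (deg/s)
phase = np.degrees(np.arctan2(b, a))                # lead relative to velocity
print(f"estimated gain {gain:.2f} (sp/s)/(deg/s), phase {phase:+.1f} deg")
```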


Subject(s)
Locomotion/physiology , Neurons, Afferent/physiology , Orientation, Spatial/physiology , Semicircular Canals/physiology , Vestibule, Labyrinth/physiology , Animals , Brain/anatomy & histology , Brain/physiology , Electrodes, Implanted , Head Movements/physiology , Macaca mulatta , Male , Semicircular Canals/anatomy & histology , Space Perception/physiology , Vestibule, Labyrinth/anatomy & histology
15.
Sci Rep ; 12(1): 1430, 2022 01 26.
Article in English | MEDLINE | ID: mdl-35082357

ABSTRACT

The effect of varying sinusoidal linear acceleration on perception of human motion was examined using 4 motion paradigms: off-vertical axis rotation, variable radius centrifugation, linear lateral translation, and rotation about an earth-horizontal axis. The motion profiles for each paradigm included 6 frequencies (0.01-0.6 Hz) and 5 tilt amplitudes (5°-20°). Subjects verbally reported the perceived angle of their whole-body tilt and the peak-to-peak translation of their head in space and used a joystick capable of recording 2-axis motion in the sagittal and transversal planes to indicate the phase between the perceived and actual motions. The amplitudes of perceived tilt and translation were expressed in terms of gain, i.e., the ratio of perceived tilt to equivalent tilt angle, and the ratio of perceived translation to equivalent linear displacement. Tilt perception gain decreased, whereas translation perception gain increased, with increasing frequency. During off-vertical axis rotation, the phase of tilt perception and of translation perception did not vary across stimulus frequencies. These motion paradigms elicited similar responses in roll tilt and interaural perception of translation, with differences likely due to the influence of naso-occipital linear accelerations and input to the semicircular canals that varied across motion paradigms.
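
A worked example of the normalization implied by these gain definitions (the frequency, tilt amplitude, and perceived values below are invented for illustration): a sinusoidal lateral acceleration of peak a is equivalent to a tilt of arctan(a/g) relative to gravity and, integrated twice, to a sinusoidal displacement of amplitude a/ω².

```python
import numpy as np

g = 9.81
f = 0.1                                    # Hz, one of the tested frequencies
tilt_amp = np.radians(10.0)                # 10 deg equivalent tilt amplitude
a_peak = g * np.tan(tilt_amp)              # lateral acceleration giving that tilt
disp_peak = a_peak / (2 * np.pi * f) ** 2  # sinusoidal displacement amplitude

perceived_tilt, perceived_transl = 6.0, 1.5         # example reports (deg, m peak-to-peak)
print(f"equivalent tilt 10 deg <-> a_peak {a_peak:.2f} m/s^2, "
      f"displacement {2 * disp_peak:.2f} m peak-to-peak")
print(f"tilt gain = {perceived_tilt / 10.0:.2f}, "
      f"translation gain = {perceived_transl / (2 * disp_peak):.2f}")
```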


Subject(s)
Gravity Sensing/physiology , Head Movements/physiology , Head-Down Tilt/physiology , Motion Perception/physiology , Orientation, Spatial/physiology , Acceleration , Adult , Eye Movements/physiology , Female , Humans , Male , Middle Aged , Reflex, Vestibulo-Ocular/physiology , Rotation , Semicircular Canals/physiology
16.
J Neurosci ; 42(6): 1054-1067, 2022 02 09.
Article in English | MEDLINE | ID: mdl-34965979

ABSTRACT

Narrowband γ oscillations (NBG: ∼20-60 Hz) in visual cortex reflect rhythmic fluctuations in population activity generated by underlying circuits tuned for stimulus location, orientation, and color. A variety of theories posit a specific role for NBG in encoding and communicating this information within visual cortex. However, recent findings suggest a more nuanced role for NBG, given its dependence on certain stimulus feature configurations, such as coherent-oriented edges and specific hues. Motivated by these factors, we sought to quantify the independent and joint tuning properties of NBG to oriented and color stimuli using intracranial recordings from the human visual cortex (male and female). NBG was shown to display a cardinal orientation bias (horizontal) and also an end- and mid-spectral color bias (red/blue and green). When jointly probed, the cardinal bias for orientation was attenuated and an end-spectral preference for red and blue predominated. This loss of mid-spectral tuning occurred even for recording sites showing large responses to uniform green stimuli. Our results demonstrate the close, yet complex, link between the population dynamics driving NBG oscillations and known feature selectivity biases for orientation and color within visual cortex. Such a bias in stimulus tuning imposes new constraints on the functional significance of the visual γ rhythm. More generally, these biases in population electrophysiology will need to be considered in experiments using orientation or color features to examine the role of visual cortex in other domains, such as working memory and decision-making.

SIGNIFICANCE STATEMENT: Oscillations in electrophysiological activity occur in visual cortex in response to stimuli that strongly drive the orientation or color selectivity of visual neurons. The significance of this induced "γ rhythm" to brain function remains unclear. Answering this question requires understanding how and why some stimuli can reliably generate oscillatory γ activity while others do not. We examined how different orientations and colors independently and jointly modulate γ oscillations in the human brain. Our data show that γ oscillations are greatest for certain orientations and colors that reflect known response biases in visual cortex. Such findings complicate the functional significance of γ oscillations but open new avenues for linking circuits to population dynamics in visual cortex.


Subject(s)
Color Perception/physiology , Gamma Rhythm/physiology , Orientation, Spatial/physiology , Visual Cortex/physiology , Adult , Electrocorticography , Female , Humans , Male , Middle Aged
17.
Behav Brain Res ; 416: 113577, 2022 01 07.
Article in English | MEDLINE | ID: mdl-34506841

ABSTRACT

Astronauts undertaking deep space travel will receive chronic exposure to the mixed spectrum of particles that comprise Galactic Cosmic Radiation (GCR). Exposure to the different charged particles of varied fluence and energy that characterize GCR may impact neural systems that support performance on mission critical tasks. Indeed, growing evidence derived from years of terrestrial-based simulations of the space radiation environment using rodents has indicated that a variety of exposure scenarios can result in significant and long-lasting decrements to CNS functionality. Many of the behavioral tasks used to quantify radiation effects on the CNS depend on neural systems that support maintaining spatial orientation and organization of rodent open field behavior. The current study examined the effects of acute or chronic exposure to simulated GCR on the organization of open field behavior under conditions with varied access to environmental cues in male and female C57BL/6 J mice. In general, groups exhibited similar organization of open field behavior under dark and light conditions. Two exceptions were noted: the acute exposure group exhibited significantly slower and more circuitous homeward progressions relative to the chronic group under light conditions. These results demonstrate the potential of open field behavior organization to discriminate between the effects of select GCR exposure paradigms.


Subject(s)
Cosmic Radiation/adverse effects , Cues , Exploratory Behavior/physiology , Orientation, Spatial/physiology , Radiation Exposure/adverse effects , Animals , Female , Male , Mice , Mice, Inbred C57BL , Space Flight
18.
J Neurosci ; 42(1): 44-57, 2022 01 05.
Article in English | MEDLINE | ID: mdl-34759028

ABSTRACT

The primary somatosensory cortex (S1) is important for the control of movement as it encodes sensory input from the body periphery and external environment during ongoing movement. Mouse S1 consists of several distinct sensorimotor subnetworks that receive topographically organized corticocortical inputs from distant sensorimotor areas, including the secondary somatosensory cortex (S2) and primary motor cortex (M1). The role of the vibrissal S1 area and associated cortical connections during active sensing is well documented, but whether (and if so, how) non-whisker S1 areas are involved in movement control remains relatively unexplored. Here, we demonstrate that unilateral silencing of the non-whisker S1 area in both male and female mice disrupts hind paw movement during locomotion on a rotarod and a runway. S2 and M1 provide major long-range inputs to this S1 area. Silencing S2→non-whisker S1 projections alters the hind paw orientation during locomotion, whereas manipulation of the M1 projection has little effect. Using patch-clamp recordings in brain slices from male and female mice, we show that S2 projection preferentially innervates inhibitory interneuron subtypes. We conclude that interneuron-mediated S2-S1 corticocortical interactions are critical for efficient locomotion.

SIGNIFICANCE STATEMENT: Somatosensory cortex participates in controlling rhythmic movements, such as whisking and walking, but the neural circuitry underlying movement control by somatosensory cortex remains relatively unexplored. We uncover a corticocortical circuit in primary somatosensory cortex that regulates paw orientation during locomotion in mice. We identify neuronal elements that comprise these cortical pathways using pharmacology, behavioral assays, and circuit-mapping methods.


Subject(s)
Efferent Pathways/physiology , Interneurons/physiology , Orientation, Spatial/physiology , Somatosensory Cortex/physiology , Animals , Female , Locomotion/physiology , Male , Mice , Movement/physiology
19.
Sci Rep ; 11(1): 17574, 2021 09 02.
Article in English | MEDLINE | ID: mdl-34475474

ABSTRACT

Previous studies have shown that humans have a left spatial attention bias in cognition and behaviour. However, whether there exists a leftward perception bias of gaze direction has not been investigated. To address this gap, we conducted three behavioural experiments using a forced-choice gaze direction judgment task. The point of subjective equality (PSE) was employed to measure whether there was a leftward perception bias of gaze direction, and if there was, whether this bias was modulated by face emotion. The results of experiment 1 showed that the PSE of fearful faces was significantly greater than zero, and this effect was not found in angry, happy, and neutral faces, indicating that participants were more likely to judge the gaze direction of fearful faces as directed to their left-side space, namely a leftward perception bias. With the response keys counterbalanced between participants, experiment 2a replicated the findings in experiment 1. To further investigate whether the variation in gaze direction perception was driven by emotional or low-level features of faces, experiments 2b and 3 used inverted faces and inverted eyes, respectively. The results revealed similar leftward perception biases of gaze direction in all types of faces, indicating that gaze direction perception was biased by emotional information in faces rather than low-level facial features. Overall, our study demonstrates that there is a fear-specific leftward perception bias in processing gaze direction. These findings shed new light on cerebral lateralization in humans.
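
For reference, the PSE is typically read off a psychometric function fitted to the forced-choice data; the sketch below uses made-up responses and a cumulative-Gaussian fit purely to illustrate the measure (it is not the authors' data or fitting code):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

gaze_deg = np.array([-8, -6, -4, -2, 0, 2, 4, 6, 8], float)   # negative = leftward gaze
p_right = np.array([0.02, 0.05, 0.12, 0.30, 0.58, 0.80, 0.93, 0.97, 0.99])

# Cumulative Gaussian: PSE is the gaze angle judged "left" and "right" equally often;
# a reliable shift of the PSE away from 0 deg indicates a perceptual bias, with the
# sign depending on how responses are coded.
psychometric = lambda x, pse, sigma: norm.cdf(x, loc=pse, scale=sigma)
(pse, sigma), _ = curve_fit(psychometric, gaze_deg, p_right, p0=[0.0, 2.0])
print(f"PSE = {pse:+.2f} deg, slope sigma = {sigma:.2f} deg")
```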


Subject(s)
Emotions/physiology , Eye Movements/physiology , Facial Expression , Fear/physiology , Fixation, Ocular , Judgment/physiology , Orientation, Spatial/physiology , Adult , Female , Humans , Male , Young Adult
20.
J Neurosci ; 41(39): 8233-8248, 2021 09 29.
Article in English | MEDLINE | ID: mdl-34385361

ABSTRACT

Complex perceptual decisions, in which information must be integrated across multiple sources of evidence, are ubiquitous but are not well understood. Such decisions rely on sensory processing of each individual source of evidence, and are therefore vulnerable to bias if sensory processing resources are disproportionately allocated among visual inputs. To investigate this, we developed an implicit neurofeedback protocol embedded within a complex decision-making task to bias sensory processing in favor of one source of evidence over another. Human participants of both sexes (N = 30) were asked to report the average motion direction across two fields of oriented moving bars. Bars of different orientations flickered at different frequencies, thus inducing steady-state visual evoked potentials. Unbeknownst to participants, neurofeedback was implemented to implicitly reward attention to a specific "trained" orientation (rather than any particular motion direction). As attentional selectivity for this orientation increased, the motion coherence of both fields of bars increased, making the task easier without altering the relative reliability of the two sources of evidence. Critically, these neurofeedback trials were alternated with "test" trials in which motion coherence was not contingent on attentional selectivity, allowing us to assess the training efficacy. The protocol successfully biased sensory processing, resulting in earlier and stronger encoding of the trained evidence source. In turn, this evidence was weighted more heavily in behavioral and neural representations of the integrated average, although the two sources of evidence were always matched in reliability. These results demonstrate how biases in sensory processing can impact integrative decision-making processes.

SIGNIFICANCE STATEMENT: Many everyday decisions require active integration of different sources of sensory information, such as deciding when it is safe to cross a road, yet little is known about how the brain prioritizes sensory sources in the service of adaptive behavior, or whether such decisions can be altered through learning. Here we addressed these questions using a novel behavioral protocol that provided observers with real-time feedback of their own brain activity patterns in which sensory processing was implicitly biased toward a subset of the available information. We show that, while such biases are a normal and adaptive mechanism for humans to process complex visual information, they can also contribute to suboptimal decision-making.
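
The frequency-tagging logic can be sketched as follows (the tag frequencies, epoch length, and the selectivity index are assumptions for illustration, not the study's parameters): the EEG amplitude at each stimulus flicker frequency indexes processing of that stimulus, and their normalized difference gives an attentional-selectivity score of the kind that could drive feedback.

```python
import numpy as np

fs, dur = 256.0, 4.0                      # Hz sampling, 4 s epoch (assumed)
f_trained, f_other = 12.0, 15.0           # assumed tag frequencies (Hz)
t = np.arange(0, dur, 1 / fs)

# Synthetic single-channel EEG: stronger response at the "trained" tag, plus noise.
rng = np.random.default_rng(1)
eeg = 2.0 * np.sin(2 * np.pi * f_trained * t) + 1.2 * np.sin(2 * np.pi * f_other * t) \
      + rng.normal(0, 1.0, t.size)

spectrum = np.abs(np.fft.rfft(eeg)) / t.size          # amplitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)
amp = lambda f: spectrum[np.argmin(np.abs(freqs - f))]

selectivity = (amp(f_trained) - amp(f_other)) / (amp(f_trained) + amp(f_other))
print(f"SSVEP amp trained {amp(f_trained):.2f}, other {amp(f_other):.2f}, "
      f"selectivity index {selectivity:+.2f}")
```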


Subject(s)
Attention/physiology , Decision Making/physiology , Evoked Potentials, Visual/physiology , Neurofeedback/methods , Perception/physiology , Visual Perception/physiology , Adolescent , Adult , Brain/physiology , Electroencephalography , Female , Humans , Male , Orientation, Spatial/physiology , Reaction Time/physiology , Young Adult