Results 1 - 3 of 3
1.
bioRxiv ; 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-38352527

ABSTRACT

Even under spontaneous conditions and in the absence of changing environmental demands, awake animals alternate between periods of increased and decreased alertness. These changes in brain state can occur rapidly, on a timescale of seconds, and neuromodulators such as acetylcholine (ACh) are thought to play an important role in driving these spontaneous state transitions. Here, we perform the first simultaneous in vivo imaging of ACh sensors and GCaMP-expressing axons to examine the spatiotemporal properties of cortical ACh activity and release during spontaneous changes in behavioral state. We observed a high correlation between simultaneously recorded basal forebrain axon activity and neuromodulator sensor fluorescence around periods of locomotion and pupil dilation. Consistent with volume transmission of ACh, increases in axon activity were accompanied by increases in local ACh levels that fell off with distance from the nearest axon. GRAB-ACh fluorescence could be accurately predicted from axonal activity alone, providing the first validation that neuromodulator axon activity is a reliable proxy for nearby neuromodulator levels. Deconvolution of the fluorescence traces allowed us to account for the kinetics of the GRAB-ACh sensor and highlighted the rapid clearance of ACh during smaller transients outside of running periods. Finally, we trained a model to predict ACh fluctuations from the combination of pupil size and running speed; this model performed better than either variable alone and generalized well to unseen data. Overall, these results contribute to a growing understanding of the precise timing and spatial characteristics of cortical ACh release during fast brain state transitions.
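The abstract does not specify the model class used to predict ACh fluctuations from pupil size and running speed; as a minimal illustrative sketch, such a two-predictor model could be fit by ordinary least squares on synthetic data (all variable names and the linear form are assumptions, not the authors' method):

```python
import numpy as np

def fit_ach_model(pupil, speed, ach):
    """Fit a linear model predicting ACh fluorescence from pupil size
    and running speed (illustrative sketch, not the published model)."""
    # Design matrix: intercept column, pupil size, running speed
    X = np.column_stack([np.ones_like(pupil), pupil, speed])
    # Ordinary least squares: coefficients minimizing squared error
    coef, *_ = np.linalg.lstsq(X, ach, rcond=None)
    return coef

def predict_ach(coef, pupil, speed):
    """Apply the fitted coefficients to new behavioral data."""
    X = np.column_stack([np.ones_like(pupil), pupil, speed])
    return X @ coef

# Synthetic demonstration: simulated ACh signal depends on both
# behavioral variables plus noise (values are arbitrary)
rng = np.random.default_rng(0)
pupil = rng.uniform(0.0, 1.0, 500)   # normalized pupil size
speed = rng.uniform(0.0, 10.0, 500)  # running speed, cm/s
ach = 0.2 + 1.5 * pupil + 0.3 * speed + rng.normal(0.0, 0.05, 500)

coef = fit_ach_model(pupil, speed, ach)
pred = predict_ach(coef, pupil, speed)
```

Combining both predictors in one design matrix is what allows such a model to outperform either variable alone, mirroring the comparison reported in the abstract.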

2.
Nat Commun ; 12(1): 1539, 2021 03 09.
Article in English | MEDLINE | ID: mdl-33750784

ABSTRACT

Vagus nerve stimulation (VNS) is thought to affect neural activity by recruiting brain-wide release of neuromodulators. VNS is used in treatment-resistant epilepsy and is increasingly being explored for other disorders, such as depression, and as a cognitive enhancer. However, the promise of VNS is only partially fulfilled, owing to a lack of mechanistic understanding of the transfer function between stimulation parameters and neuromodulatory response, together with a lack of biosensors for assaying stimulation efficacy in real time. Here we develop an approach to VNS in head-fixed mice on a treadmill and show that pupil dilation is a reliable and convenient biosensor for VNS-evoked cortical neuromodulation. In an 'optimal' zone of stimulation parameters, current leakage and off-target effects are minimized, and the extent of pupil dilation tracks VNS-evoked basal forebrain cholinergic axon activity in neocortex. Thus, pupil dilation is a sensitive readout of the moment-by-moment, titratable effects of VNS on brain state.


Subject(s)
Pupil/physiology, Vagus Nerve/physiology, Animals, Brain, Cerebellar Cortex/physiology, Epilepsy/physiopathology, Female, Locus Coeruleus/pathology, Male, Mice, Mice, Inbred C57BL, Vagus Nerve Stimulation, Wakefulness/physiology
3.
Annu Int Conf IEEE Eng Med Biol Soc ; 2020: 2925-2928, 2020 07.
Article in English | MEDLINE | ID: mdl-33018619

ABSTRACT

An emerging body of research uses virtual reality (VR) to study the neural mechanisms underlying spatial navigation and decision making in rodents. These studies have primarily used visual stimuli to represent the virtual world. However, auditory cues play an important role in navigation for animals, especially when the visual system cannot detect objects or predators. We have developed a virtual reality environment for head-fixed mice defined exclusively by free-field acoustic landmarks. We trained animals to run in a virtual environment with three acoustic landmarks. We present evidence that they can learn to navigate in this context: we observed anticipatory licking and modest anticipatory slowing preceding the reward region. Furthermore, animals were highly sensitive to changes in landmark cues: licking behavior changed dramatically when the familiar virtual environment was switched to a novel one, and then rapidly reverted to normal when the familiar environment was re-introduced, all within the same session. Finally, while animals performed the task, we carried out in vivo calcium imaging in hippocampal area CA1 using a modified Miniscope.org system. Our experiments point to a future in which auditory virtual reality can be used to expand our understanding of the neural bases of audition in locomoting animals and of the variety of sensory cues that anchor spatial representations in a new virtual environment.


Subject(s)
Spatial Navigation, Virtual Reality, Animals, Cues, Mice, Space Perception, User-Computer Interface