Results 1 - 20 of 34
1.
bioRxiv ; 2024 Apr 09.
Article in English | MEDLINE | ID: mdl-38645204

ABSTRACT

Adaptive decision-making requires consideration of the objective risks and rewards associated with each option, as well as subjective preference for risky/safe alternatives. Inaccurate risk/reward estimations can engender excessive risk-taking, a central trait in many psychiatric disorders. The lateral orbitofrontal cortex (lOFC) has been linked to many disorders associated with excessively risky behavior and is ideally situated to mediate risky decision-making. Here, we used single-unit electrophysiology to measure neuronal activity in the lOFC of freely moving rats performing a punishment-based risky decision-making task. Subjects chose between a small, safe reward and a large reward associated with either 0% or 50% risk of concurrent punishment. lOFC activity repeatedly encoded current risk in the environment throughout the decision-making sequence, signaling risk before, during, and after a choice. In addition, lOFC encoded reward magnitude, although this information was only evident during action selection. A Random Forest classifier used neural data to accurately predict the risk of punishment in any given trial, and the ability to predict choice from lOFC activity differentiated between risk-preferring and risk-averse rats. Finally, risk-preferring subjects demonstrated reduced lOFC encoding of risk and increased encoding of reward magnitude. These findings suggest that the lOFC may serve as a central decision-making hub in which external, environmental information converges with internal, subjective information to guide decision-making in the face of punishment risk.
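As an illustration of the classification approach described above, the following is a minimal sketch (not the authors' code) of decoding trial risk from trial-averaged firing rates with a Random Forest; the array names, shapes, and simulated values are assumptions for demonstration only.

# Hypothetical data: trial-averaged lOFC firing rates (trials x neurons) and
# binary labels for the punishment risk in each trial (0 = 0% risk, 1 = 50% risk).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_neurons = 200, 40
firing_rates = rng.poisson(lam=5.0, size=(n_trials, n_neurons)).astype(float)
risk_labels = rng.integers(0, 2, size=n_trials)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
scores = cross_val_score(clf, firing_rates, risk_labels, cv=5)  # 5-fold cross-validation
print(f"Mean decoding accuracy: {scores.mean():.2f}")

With real recordings, above-chance cross-validated accuracy would indicate that trial risk is recoverable from the population activity.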

2.
Article in English | MEDLINE | ID: mdl-38052746

ABSTRACT

Effective decision-making involves careful consideration of all rewarding and aversive outcomes. Importantly, negative outcomes often occur later in time, leading to underestimation, or "discounting," of these consequences. Despite the frequent occurrence of delayed outcomes, little is known about the neurobiology underlying sensitivity to delayed punishment during decision-making. The Delayed Punishment Decision-making Task (DPDT) addresses this by assessing sensitivity to delayed versus immediate punishment in rats. Rats initially avoid punished reinforcers, then select this option more frequently when a delay precedes punishment. We used the DPDT to examine the effects of acute systemic administration of catecholaminergic drugs on sensitivity to delayed punishment in male and female adult rats. Cocaine did not affect choice of rewards with immediate punishment but caused a dose-dependent reduction in choice of delayed punishment. Neither activation nor blockade of D1-like dopamine receptors affected decision-making, but activation of D2-like dopamine receptors reduced choice of delayed punishment. D2 blockade did not attenuate cocaine's effects on decision-making, suggesting that cocaine's effects are not dependent on D2 receptor activation. Increasing synaptic norepinephrine via atomoxetine also reduced choice of delayed (but not immediate) punishment. Notably, when the DPDT was modified from ascending to descending pre-punishment delays, these drugs did not affect choice of delayed or immediate punishment, although high-dose quinpirole impaired behavioral flexibility. In summary, sensitivity to delayed punishment is regulated by both dopamine and norepinephrine transmission in a task-specific fashion. Understanding the neurochemical modulation of decision-making with delayed punishment is a critical step toward treating disorders characterized by aberrant sensitivity to negative consequences.

3.
Behav Neurosci ; 137(4): 254-267, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37104777

ABSTRACT

Substance use disorder (SUD) is associated with a cluster of cognitive disturbances that engender vulnerability to ongoing drug seeking and relapse. Two of these endophenotypes-risky decision-making and impulsivity-are amplified in individuals with SUD and are augmented by repeated exposure to illicit drugs. Identifying genetic factors underlying variability in these behavioral patterns is critical for early identification, prevention, and treatment of SUD-vulnerable individuals. Here, we compared risky decision-making and different facets of impulsivity between two fully inbred substrains of Lewis rats-LEW/NCrl and LEW/NHsd. We performed whole genome sequencing of both substrains to identify almost all relevant variants. We observed substantial differences in risky decision-making and impulsive behaviors. Relative to LEW/NHsd, the LEW/NCrl substrain accepts higher-risk options in a decision-making task and shows higher rates of premature responses in the differential reinforcement of low rates of responding task. These phenotypic differences were more pronounced in females than males. We defined a total of ∼9,000 polymorphisms between these substrains at 40× whole genome short-read coverage. Roughly half of the variants are located within a single 1.5 Mb region of Chromosome 8, but none impact protein-coding regions. In contrast, the remaining variants are widely distributed, and of these, 38 are predicted to be protein-coding variants. In conclusion, Lewis rat substrains differ significantly in risk-taking and impulsivity, and only a small number of easily mapped variants are likely to be causal. Sequencing combined with a reduced complexity cross should enable identification of one or more variants underlying multiple complex addiction-relevant behaviors. (PsycInfo Database Record (c) 2023 APA, all rights reserved).
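For readers unfamiliar with this kind of variant summary, the sketch below tallies substrain-distinguishing variants per chromosome and within a fixed genomic window by parsing a plain VCF; the file name and the chromosome 8 window coordinates are placeholders, not the values reported in the study.

from collections import Counter

per_chrom = Counter()
window_hits = 0
with open("lew_substrains.vcf") as vcf:            # hypothetical VCF of substrain variants
    for line in vcf:
        if line.startswith("#"):                   # skip VCF header lines
            continue
        chrom, pos = line.split("\t")[:2]
        per_chrom[chrom] += 1
        # Placeholder 1.5 Mb window on chromosome 8 (not the published coordinates)
        if chrom == "8" and 100_000_000 <= int(pos) < 101_500_000:
            window_hits += 1

print(per_chrom.most_common(5))
print("Variants in the chr8 window:", window_hits)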


Subject(s)
Behavior, Addictive , Substance-Related Disorders , Male , Rats , Animals , Female , Decision Making , Rats, Inbred Lew , Impulsive Behavior , Reinforcement, Psychology , Risk-Taking
4.
Exp Clin Psychopharmacol ; 31(1): 228-237, 2023 Feb.
Article in English | MEDLINE | ID: mdl-35084912

ABSTRACT

Cannabis exerts an indirect effect on dopamine (DA) output in the mesolimbic projection, a circuit implicated in reward processing and effort expenditure, and thus may be associated with aberrant effort-based decision making. The "amotivation syndrome" hypothesis suggests that regular cannabis use results in impaired capacity for goal-directed behavior. However, investigations of this hypothesis have used divergent methodology and have not controlled for key confounding variables. The present study extends these findings by examining the relation between cannabis use and effort-related decision making in a sample of college students. Cannabis-using (n = 25; 68% meeting criteria for Cannabis Use Disorder) and non-cannabis-using (n = 22) students completed the Effort Expenditure for Rewards Task (EEfRT). In generalized estimating equation models, reward magnitude, reward probability, and expected value predicted a greater likelihood of selecting a high-effort trial. Furthermore, past-month cannabis use days and cannabis use disorder symptoms predicted the likelihood of selecting a high-effort trial, such that greater levels of both use days and symptoms were associated with an increased likelihood of high-effort choices after controlling for Attention Deficit/Hyperactivity Disorder (ADHD) symptoms, distress tolerance, income, and delay discounting. The results provide preliminary evidence suggesting that college students who use cannabis are more likely to expend effort to obtain reward, even after controlling for the magnitude of the reward and the probability of reward receipt. Thus, these results do not support the amotivational syndrome hypothesis. Future research with a larger sample is required to evaluate possible associations between cannabis use and patterns of real-world effortful behavior over time. (PsycInfo Database Record (c) 2023 APA, all rights reserved).
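A minimal sketch of a trial-level generalized estimating equation (GEE) model of the kind described above, using statsmodels; the column names, simulated values, and predictor set are assumptions for illustration, not the study's variables or data.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, n_trials = 47, 50                       # 47 participants x 50 trials (assumed)
n = n_subj * n_trials
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_trials),
    "magnitude": rng.uniform(1.24, 4.30, n),    # approximate hard-task reward range
    "probability": rng.choice([0.12, 0.50, 0.88], n),
    "cannabis_days": np.repeat(rng.integers(0, 31, n_subj), n_trials),
    "hard_choice": rng.integers(0, 2, n),       # placeholder binary outcome
})

# Logistic GEE with an exchangeable working correlation within subject
model = smf.gee("hard_choice ~ magnitude + probability + cannabis_days",
                groups="subject", data=df,
                family=sm.families.Binomial())
print(model.fit().summary())

Clustering trials within subject is what distinguishes GEE from an ordinary logistic regression here; the coefficients describe population-average effects on the log-odds of a high-effort choice.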


Subject(s)
Cannabis , Hallucinogens , Marijuana Abuse , Humans , Decision Making , Motivation , Reward , Cannabinoid Receptor Agonists , Students
5.
eNeuro ; 2022 Aug 29.
Article in English | MEDLINE | ID: mdl-36038251

ABSTRACT

In real-world decision-making scenarios, negative consequences do not always occur immediately after a choice. This delay between action and outcome drives the underestimation, or "delay discounting", of punishment. While the neural substrates underlying sensitivity to immediate punishment have been well-studied, there has been minimal investigation of delayed consequences. Here, we assessed the role of lateral orbitofrontal cortex (LOFC) and basolateral amygdala (BLA), two regions implicated in cost/benefit decision-making, in sensitivity to delayed vs immediate punishment. The delayed punishment decision-making task (DPDT) was used to measure delay discounting of punishment in rodents. During DPDT, rats choose between a small, single pellet reward and a large, three pellet reward accompanied by a mild foot shock. As the task progresses, the shock is preceded by a delay that systematically increases or decreases throughout the session. We observed that rats avoid choices associated with immediate punishment, then shift preference toward these options when punishment is delayed. LOFC inactivation did not influence choice of rewards with immediate punishment, but decreased choice of delayed punishment. We also observed that BLA inactivation reduced choice of delayed punishment for ascending but not descending delays. Inactivation of either brain region produced comparable effects on decision-making in males and females, but there were sex differences observed in omissions and latency to make a choice. In summary, both LOFC and BLA contribute to the delay discounting of punishment and may serve as promising therapeutic targets to improve sensitivity to delayed punishment during decision-making. Significance Statement: Negative consequences occurring after a delay are often underestimated, which can lead to maladaptive decision-making. While sensitivity to immediate punishment during reward-seeking has been well-studied, the neural substrates underlying sensitivity to delayed punishment remain unclear. Here, we used the Delayed Punishment Decision-making Task to determine that lateral orbitofrontal cortex and basolateral amygdala both regulate the discounting of delayed punishment, suggesting that these regions may be potential targets to improve decision-making in psychopathology.

6.
Int J Mol Sci ; 23(3)2022 Jan 22.
Article in English | MEDLINE | ID: mdl-35163155

ABSTRACT

While the cognitive-enhancing effects of nicotine use have been well documented, nicotine has also been shown to impair decision making. The goal of this study was to determine whether exposure to nicotine vapor increases risky decision making. The study also aimed to investigate possible long-term effects of nicotine vapor exposure on the expression of genes coding for cholinergic and dopaminergic receptors in the brain. Thirty-two adult male Sprague Dawley rats were exposed to 24 mg/mL nicotine vapor or vehicle control, immediately followed by testing in the probability discounting task for 10 consecutive days. Fifty-four days after the 10-day vapor exposure, animals were sacrificed and expression of genes coding for the α4 and β2 cholinergic receptor subunits and the dopamine D1 and D2 receptors was analyzed using RT-PCR. Exposure to nicotine vapor caused an immediate and transient increase in risky choice. Analyses of gene expression identified significant reductions in CHRNB2 and DRD1 in the nucleus accumbens core and CHRNB2 and DRD2 in the medial prefrontal cortex of rats previously exposed to nicotine vapor, relative to vehicle controls. These results provide data on the negative cognitive effects of nicotine vapor exposure and identify cholinergic and dopaminergic mechanisms that may be affected by repeated use.
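The abstract does not state how expression was quantified; as background, the snippet below sketches the widely used 2^-ΔΔCt relative-quantification calculation for RT-PCR data, with hypothetical Ct values and an assumed reference gene.

def relative_expression(ct_target, ct_reference, ct_target_ctrl, ct_reference_ctrl):
    """Fold change of a target gene vs. vehicle controls via the 2^-delta-delta-Ct method."""
    delta_ct_treated = ct_target - ct_reference            # normalize to reference gene
    delta_ct_control = ct_target_ctrl - ct_reference_ctrl
    delta_delta_ct = delta_ct_treated - delta_ct_control
    return 2.0 ** (-delta_delta_ct)

# Example with hypothetical CHRNB2 Ct values (nicotine-exposed vs. vehicle control)
fold = relative_expression(ct_target=24.8, ct_reference=18.2,
                           ct_target_ctrl=23.9, ct_reference_ctrl=18.1)
print(f"CHRNB2 fold change vs. control: {fold:.2f}")       # < 1 indicates reduced expression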


Subject(s)
Choice Behavior/drug effects , Gene Expression Regulation/drug effects , Nicotine/toxicity , Prefrontal Cortex/metabolism , Receptors, Nicotinic/metabolism , Animals , Male , Nicotinic Agonists/toxicity , Prefrontal Cortex/drug effects , Rats , Rats, Sprague-Dawley , Receptors, Dopamine D1/genetics , Receptors, Dopamine D1/metabolism , Receptors, Dopamine D2/genetics , Receptors, Dopamine D2/metabolism , Receptors, Nicotinic/genetics
7.
Behav Pharmacol ; 33(1): 32-41, 2022 02 01.
Article in English | MEDLINE | ID: mdl-35007234

ABSTRACT

Epigallocatechin-3-gallate (EGCG) and caffeine are the two primary compounds found in green tea. While EGCG has anxiolytic and anti-inflammatory effects, its acute effects on cognition are not well understood. Furthermore, despite widespread green tea consumption, little is known about how EGCG and caffeine co-administration impacts behavior. Here, we investigated the effects of multiple doses of either EGCG or caffeine on a rat model of risk-taking. This was assessed using the risky decision-making task (RDT), in which rats choose between a small, well-tolerated reward and a large reward with escalating risk of mild footshock. Rats were tested in the RDT after acute systemic administration of EGCG, caffeine, or combined EGCG and caffeine. EGCG caused a dose-dependent reduction in risk-taking without affecting reward discrimination or task engagement. Caffeine did not impact risk-taking, but elevated locomotor activity and reduced task engagement at high doses. Finally, combined exposure to EGCG and caffeine had no effect on risk-taking, suggesting that low-dose caffeine is sufficient to mask the risk aversion caused by EGCG. These data suggest that EGCG is a potential therapeutic treatment for psychological disorders characterized by compulsive risky decision-making.


Subject(s)
Caffeine/pharmacology , Catechin/analogs & derivatives , Cognition/drug effects , Compulsive Behavior/chemically induced , Decision Making/drug effects , Risk-Taking , Tea , Animals , Anti-Anxiety Agents/pharmacology , Behavior, Animal/drug effects , Catechin/pharmacology , Dose-Response Relationship, Drug , Drug Monitoring/methods , Locomotion/drug effects , Models, Animal , Psychotropic Drugs/pharmacology , Rats , Tea/adverse effects , Tea/chemistry
8.
Psychopharmacology (Berl) ; 238(4): 991-1004, 2021 Apr.
Article in English | MEDLINE | ID: mdl-33410986

ABSTRACT

RATIONALE: Optimal decision-making necessitates evaluation of multiple rewards that are each offset by distinct costs, such as a high effort requirement or a high risk of failure. The neurotransmitter dopamine is fundamental to these cost-benefit analyses, and D1-like and D2-like dopamine receptors differentially modulate the reward-discounting effects of both effort and risk. However, measuring the role of dopamine in regulating decision-making between options associated with distinct costs exceeds the scope of traditional rodent economic decision-making paradigms. OBJECTIVES: We developed the effort vs probability economic conflict task (EvP) to model multimodal economic decision-making in rats. This task measures choice between two rewards of uniform magnitude associated with either a high effort requirement or a risk of reward omission. We then tested the modulatory effects of systemic cocaine and D1/D2 blockade or activation on the preference between high-effort and high-risk alternatives. METHODS: In the EvP, two reinforcers of equal magnitude are associated with either (1) an effort requirement that increases throughout the session (1, 5, 10, and 20 lever presses), or (2) a low probability of reward receipt (reward delivered on 25% of probabilistic choices). Critically, the reinforcer for each choice is comparable (one pellet), which eliminates the influence of magnitude discrimination on the decision-making process. After establishing the task, the dopamine transporter blocker cocaine and D1/D2 antagonists and agonists were administered prior to EvP performance. RESULTS: Preference shifted away from either effortful or probabilistic choice when either option became more costly, and this preference was highly variable between subjects and stable over time. Cocaine, D1 activation, and D2 blockade produced limited, dose-dependent shifts in choice preference contingent on high- or low-effort conditions. In contrast, D2 activation across multiple doses evoked a robust shift from effortful to risky choice that was evident even when clearly disadvantageous. CONCLUSIONS: The EvP demonstrates that rats can evaluate distinct effortful or risky costs associated with rewards of comparable magnitude and shift preference away from either option with increasing cost. This preference is more tightly linked to D2 than D1 receptor manipulation, suggesting D2-like receptors as a possible therapeutic target for maladaptive biases toward risk-taking over effort.


Subject(s)
Decision Making/physiology , Receptors, Dopamine D1/drug effects , Receptors, Dopamine D2/drug effects , Reward , Animals , Dopamine/metabolism , Dopamine Agonists/pharmacology , Dopamine Antagonists/pharmacology , Male , Probability , Rats , Rats, Long-Evans
9.
Curr Protoc Neurosci ; 93(1): e100, 2020 09.
Article in English | MEDLINE | ID: mdl-32687693

ABSTRACT

Deficits in decision making are at the heart of many psychiatric diseases, such as substance abuse disorders and attention deficit hyperactivity disorder. Consequently, rodent models of decision making are germane to understanding the neural mechanisms underlying adaptive choice behavior and how such mechanisms can become compromised in pathological conditions. A critical factor that must be integrated with reward value to ensure optimal decision making is the occurrence of consequences, which can differ based on probability (risk of punishment) and temporal contiguity (delayed punishment). This article will focus on two models of decision making that involve explicit punishment, both of which recapitulate different aspects of consequences during human decision making. We will discuss each behavioral protocol, the parameters to consider when designing an experiment, and finally how such animal models can be utilized in studies of psychiatric disease. © 2020 Wiley Periodicals LLC.
Basic Protocol 1: Behavioral training
Support Protocol: Equipment testing
Alternate Protocol: Reward discrimination
Basic Protocol 2: Risky decision-making task (RDT)
Basic Protocol 3: Delayed punishment decision-making task (DPDT)


Subject(s)
Behavior, Animal/physiology , Behavioral Research/methods , Conditioning, Operant/physiology , Decision Making/physiology , Neurosciences/methods , Punishment , Reward , Animals , Models, Animal , Rats
10.
Neuropsychopharmacology ; 45(2): 266-275, 2020 01.
Article in English | MEDLINE | ID: mdl-31546248

ABSTRACT

The risky decision-making task (RDT) measures risk-taking in a rat model by assessing preference between a small, safe reward and a large reward with increasing risk of punishment (mild foot shock). It is well-established that dopaminergic drugs modulate risk-taking; however, little is known about how differences in baseline phasic dopamine signaling drive individual differences in risk preference. Here, we used in vivo fixed potential amperometry in male Long-Evans rats to test if phasic nucleus accumbens shell (NACs) dopamine dynamics are associated with risk-taking. We observed a positive correlation between medial forebrain bundle-evoked dopamine release in the NACs and risky decision-making, suggesting that risk-taking is associated with elevated dopamine sensitivity. Moreover, "risk-taking" subjects were found to demonstrate greater phasic dopamine release than "risk-averse" subjects. Risky decision-making also predicted enhanced sensitivity to the dopamine reuptake inhibitor nomifensine, and elevated autoreceptor function. Importantly, this hyperdopaminergic phenotype was selective for risky decision-making, as delay discounting performance was not predictive of phasic dopamine release or dopamine supply. These data identify phasic NACs dopamine release as a possible therapeutic target for alleviating the excessive risk-taking observed across multiple forms of psychopathology.
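A minimal sketch of the kind of across-subject correlation described above (not the published analysis); the arrays are simulated placeholders, and the median-split grouping is an assumption about how "risk-taking" and "risk-averse" subjects might be defined.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_rats = 20
evoked_da = rng.normal(loc=1.0, scale=0.3, size=n_rats)            # e.g., peak evoked [DA], simulated
pct_risky_choice = 40 + 25 * evoked_da + rng.normal(0, 8, n_rats)  # simulated behavior, for illustration

# Across-subject relationship between dopamine release and risky choice
r, p = stats.pearsonr(evoked_da, pct_risky_choice)
print(f"r = {r:.2f}, p = {p:.3f}")

# Median split into "risk-taking" vs. "risk-averse" groups (an assumed criterion)
median = np.median(pct_risky_choice)
risk_takers = evoked_da[pct_risky_choice >= median]
risk_averse = evoked_da[pct_risky_choice < median]
t, p_t = stats.ttest_ind(risk_takers, risk_averse)
print(f"t = {t:.2f}, p = {p_t:.3f}")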


Subject(s)
Decision Making/physiology , Dopamine/metabolism , Nucleus Accumbens/metabolism , Risk-Taking , Animals , Delay Discounting/physiology , Forecasting , Male , Rats , Rats, Long-Evans
11.
eNeuro ; 6(4)2019.
Article in English | MEDLINE | ID: mdl-31387878

ABSTRACT

The majority of the research studying punishment has focused on an aversive stimulus delivered immediately after an action. However, in real-world decision-making, negative consequences often occur long after a decision has been made. This can engender myopic decisions that fail to appropriately respond to consequences. Whereas discounting of delayed rewards has been well studied in both human and animal models, systematic discounting of delayed consequences remains largely unexplored. To address this gap in the literature, we developed the delayed punishment decision-making task. Rats chose between a small, single-pellet reinforcer and a large, three-pellet reinforcer accompanied by a mild foot shock. The shock was preceded by a delay, which systematically increased throughout the session (0, 4, 8, 12, 16 s). On average, rats discounted the negative value of delayed punishment, as indicated by increased choice of the large, punished reward as the delay preceding the shock lengthened. Female rats discounted delayed punishment less than males, and this behavior was not influenced by estrous cycling. The addition of a cue light significantly decreased the undervaluation of delayed consequences for both sexes. Finally, there was no correlation between the discounting of delayed punishments and a traditional reward delay discounting task for either sex. These data indicate that the ability of punishment to regulate decision-making is attenuated when punishment occurs later in time. This task provides an avenue for exploration of the neural circuitry underlying the devaluation of delayed punishment and may assist in developing treatments for substance use disorders.
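One common way to quantify this kind of discounting is to fit a hyperbolic function to choice data; the sketch below shows such a fit, though the abstract does not state that this model was used, and the choice proportions are placeholders (only the delay values match the task).

import numpy as np
from scipy.optimize import curve_fit

delays = np.array([0.0, 4.0, 8.0, 12.0, 16.0])                # s between choice and shock
p_punished_choice = np.array([0.25, 0.40, 0.55, 0.65, 0.72])  # hypothetical choice proportions

def hyperbolic(delay, v0, k):
    # Subjective weight of the punishment declines hyperbolically with delay
    return v0 / (1.0 + k * delay)

# Treat (1 - choice of the punished option) as a proxy for the punishment's weight
params, _ = curve_fit(hyperbolic, delays, 1.0 - p_punished_choice, p0=[0.8, 0.1])
v0_fit, k_fit = params
print(f"v0 = {v0_fit:.2f}, discounting rate k = {k_fit:.3f}")  # larger k = steeper discounting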


Subject(s)
Cues , Delay Discounting , Punishment/psychology , Reward , Sex Characteristics , Animals , Behavior, Animal , Biobehavioral Sciences , Conditioning, Operant , Electroshock , Female , Male , Rats, Long-Evans
12.
Methods Mol Biol ; 2011: 79-92, 2019.
Article in English | MEDLINE | ID: mdl-31273694

ABSTRACT

Excessive preference for risky over safe options is a hallmark of several psychiatric disorders. Here we describe a behavioral task that models such risky decision making in rats. In this task, rats are given choices between small, safe rewards and large rewards accompanied by risk of footshock punishment. The risk of punishment changes within a test session, allowing quantification of decision making at different levels of risk. Importantly, this task can yield a wide degree of reliable individual variability, allowing the characterization of rats as "risk-taking" or "risk-averse." The task has been demonstrated to be effective for testing the effects of pharmacological agents and neurobiological manipulations, and the individual variability (which mimics the human population) allows assessment of behavioral and neurobiological distinctions among subjects based on their risk-taking profile.


Subject(s)
Behavior, Animal , Decision Making , Decision Support Techniques , Models, Theoretical , Rodentia/psychology , Animals , Data Interpretation, Statistical , Disease Models, Animal , Mental Disorders/etiology , Mental Disorders/psychology , Rats
13.
Cogn Affect Behav Neurosci ; 19(6): 1404-1417, 2019 12.
Article in English | MEDLINE | ID: mdl-31342271

ABSTRACT

Differences in the prevalence and presentation of psychiatric illnesses in men and women suggest that neurobiological sex differences confer vulnerability or resilience in these disorders. Rodent behavioral models are critical for understanding the mechanisms of these differences. Reward processing and punishment avoidance are fundamental dimensions of the symptoms of psychiatric disorders. Here we explored sex differences along these dimensions using multiple and distinct behavioral paradigms. We found no sex difference in reward-guided associative learning, but females showed faster punishment-avoidance learning. After learning, females were more sensitive than males to probabilistic punishment but less sensitive when punishment could be avoided with certainty. No sex differences were found in reward-guided cognitive flexibility. Thus, sex differences in goal-directed behaviors emerged selectively when there was an aversive context. These differences were critically sensitive to whether the punishment was certain or unpredictable. Our findings with these new paradigms provide conceptual and practical tools for investigating brain mechanisms that account for sex differences in susceptibility to anxiety and impulsivity. They may also provide insight for understanding the evolution of sex-specific optimal behavioral strategies in dynamic environments.


Subject(s)
Punishment , Reward , Sex Characteristics , Animals , Anxiety/chemically induced , Anxiety/psychology , Association Learning , Avoidance Learning/drug effects , Carbolines/pharmacology , Cognition , Conditioning, Operant , Dose-Response Relationship, Drug , Female , Male , Maze Learning , Rats , Uncertainty
14.
Behav Brain Res ; 359: 579-588, 2019 02 01.
Article in English | MEDLINE | ID: mdl-30296531

ABSTRACT

Excessive risk-taking is common in multiple psychiatric conditions, including substance use disorders. The risky decision-making task (RDT) models addiction-relevant risk-taking in rats by measuring preference for a small food reward vs. a large food reward associated with systematically increasing risk of shock. Here, we examined the relationship between risk-taking in the RDT and multiple addiction-relevant phenotypes. Risk-taking was associated with elevated impulsive action, but not impulsive choice or habit formation. Furthermore, risk-taking predicted locomotor sensitivity to first-time nicotine exposure and resilience to nicotine-evoked anxiety. These data demonstrate that risk preference in the RDT predicts other traits associated with substance use disorder, and may have utility for identification of neurobiological and genetic biomarkers that engender addiction vulnerability.


Subject(s)
Decision Making , Impulsive Behavior , Motor Activity/drug effects , Nicotine/pharmacology , Nicotinic Agonists/pharmacology , Risk-Taking , Animals , Anxiety/chemically induced , Behavior, Addictive/psychology , Habits , Male , Rats, Long-Evans , Substance-Related Disorders/psychology
15.
Front Behav Neurosci ; 11: 140, 2017.
Article in English | MEDLINE | ID: mdl-28848408

ABSTRACT

Multiple and unpredictable numbers of actions are often required to achieve a goal. In order to organize behavior and allocate effort so that optimal behavioral policies can be selected, it is necessary to continually monitor ongoing actions. Real-time processing of information related to actions and outcomes is typically assigned to the prefrontal cortex and basal ganglia, but also depends on midbrain regions, especially the ventral tegmental area (VTA). We were interested in how individual VTA neurons, as well as networks within the VTA, encode salient events when an unpredictable number of serial actions are required to obtain a reward. We recorded from ensembles of putative dopamine and non-dopamine neurons in the VTA as animals performed multiple cued trials in a recording session where, in each trial, serial actions were randomly rewarded. While averaging population activity did not reveal a response pattern, we observed that different neurons were selectively tuned to low, medium, or high numbered actions in a trial. This preferential tuning of putative dopamine and non-dopamine VTA neurons to different subsets of actions in a trial allowed information about binned action number to be decoded from the ensemble activity. At the network level, tuning curve similarity was positively associated with action-evoked noise correlations, suggesting that action number selectivity reflects functional connectivity within these networks. Analysis of phasic responses to cue and reward revealed that the requirement to execute multiple and uncertain numbers of actions weakens both cue-evoked responses and cue-reward response correlation. The functional connectivity and ensemble coding scheme that we observe here may allow VTA neurons to cooperatively provide a real-time account of ongoing behavior. These computations may be critical to cognitive and motivational functions that have long been associated with VTA dopamine neurons.
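A sketch of the pairwise analysis described above (tuning-curve similarity vs. action-evoked noise correlations), using simulated spike counts; the data, bin definitions, and variable names are illustrative assumptions, not the recorded dataset.

import numpy as np
from itertools import combinations
from scipy import stats

rng = np.random.default_rng(3)
n_trials, n_neurons, n_bins = 300, 25, 3          # low / medium / high action-number bins
action_bin = rng.integers(0, n_bins, n_trials)
counts = rng.poisson(5.0, size=(n_trials, n_neurons)).astype(float)

# Tuning curves: each neuron's mean spike count per action-number bin
tuning = np.array([counts[action_bin == b].mean(axis=0) for b in range(n_bins)]).T

# Noise correlations: correlate residuals after removing each bin's mean response
residuals = counts - tuning[:, action_bin].T

signal_corr, noise_corr = [], []
for i, j in combinations(range(n_neurons), 2):
    signal_corr.append(stats.pearsonr(tuning[i], tuning[j])[0])
    noise_corr.append(stats.pearsonr(residuals[:, i], residuals[:, j])[0])

r, p = stats.pearsonr(signal_corr, noise_corr)
print(f"tuning-similarity vs. noise-correlation: r = {r:.2f}, p = {p:.3f}")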

16.
Neuropsychopharmacology ; 42(8): 1590-1598, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28128335

ABSTRACT

Nicotine has strong addictive as well as procognitive properties. While a large body of research on nicotine continues to inform us about mechanisms related to its reinforcing effects, less is known about clinically relevant mechanisms that subserve its cognitive-enhancing properties. Understanding the latter is critical for developing optimal strategies for treating cognitive deficits. The primary brain region implicated in cognitive functions improved by nicotine is the prefrontal cortex (PFC). Here we assessed the impact of nicotine on unit activity and local field potential oscillations in the PFC of behaving rats. An acute dose of nicotine produced a predominantly inhibitory influence on population activity, a small increase in gamma oscillations, and a decrease in theta and beta oscillations. After a daily dosing regimen, a shift to excitatory-inhibitory balance in single-unit activity and stronger gamma oscillations began to emerge. This pattern of plasticity was specific to the gamma band as lower frequency oscillations were suppressed consistently across daily nicotine treatments. Gamma oscillations are associated with enhanced attentional capacity. Consistent with this mechanism, the repeat dosing regimen in a separate cohort of subjects led to improved performance in an attention task. These data suggest that procognitive effects of nicotine may involve development of enhanced gamma oscillatory activity and a shift to excitatory-inhibitory balance in PFC neural activity. In the context of the clinical use of nicotine and related agonists for treating cognitive deficits, these data suggest that daily dosing may be critical to allow for development of robust gamma oscillations.
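As background on the oscillation measures mentioned above, the sketch below estimates theta, beta, and gamma band power from a simulated local field potential with Welch's method; the sampling rate, band boundaries, and signal are assumptions, not the recorded data or the authors' pipeline.

import numpy as np
from scipy.signal import welch

fs = 1000.0                                    # sampling rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)                   # 60 s of simulated LFP
rng = np.random.default_rng(4)
lfp = (np.sin(2 * np.pi * 7 * t)               # theta-band component
       + 0.5 * np.sin(2 * np.pi * 55 * t)      # gamma-band component
       + rng.normal(0, 1, t.size))             # broadband noise

freqs, psd = welch(lfp, fs=fs, nperseg=2048)   # power spectral density estimate

def band_power(lo, hi):
    mask = (freqs >= lo) & (freqs <= hi)
    return np.trapz(psd[mask], freqs[mask])    # integrate PSD over the band

for name, (lo, hi) in {"theta": (4, 12), "beta": (15, 30), "gamma": (40, 80)}.items():
    print(f"{name}: {band_power(lo, hi):.3f}")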


Subject(s)
Attention/drug effects , Gamma Rhythm/drug effects , Nicotine/pharmacology , Prefrontal Cortex/drug effects , Action Potentials/physiology , Animals , Beta Rhythm/drug effects , Male , Neural Inhibition/drug effects , Neurons/physiology , Prefrontal Cortex/physiology , Rats , Theta Rhythm/drug effects
17.
Brain Res ; 1654(Pt B): 171-176, 2017 01 01.
Article in English | MEDLINE | ID: mdl-27431940

ABSTRACT

Ongoing development of the dopamine system during adolescence may provide a partial mechanism for behavioral and psychiatric vulnerabilities. Despite early evidence for a hyperactive adolescent dopaminergic system, recent data suggest that adolescent dopamine may be functionally hypoactive compared to that in adults. While this distinction has been established in response to dopaminergic drugs and natural rewards, little is known about age-related differences in the cognitive efficacy of dopaminergic drugs. Using a recently established Cued Response Inhibition Task, we tested the effects of acute systemic methylphenidate, commonly known as Ritalin, on response inhibition and response initiation in adolescent and adult rats. First, we replicated previous findings that adolescents inhibit a response to a cue on par with adults but are slower to produce a rewarded response after a stop cue. Next, we observed that methylphenidate modulated response inhibition in adult rats, with a low dose (0.3 mg/kg) improving inhibition and a high dose (3 mg/kg) impairing performance. This dose-response pattern is commonly observed with psychostimulant cognitive modulation. In adolescents, however, methylphenidate had no effect on response inhibition at any dose. The latency of response initiation after the stop cue was not affected by methylphenidate in either adult or adolescent rats. These data establish that the dose-response profile of a commonly prescribed psychostimulant medication differs between adolescents and adults. They further demonstrate that healthy adolescent response inhibition is not as sensitive to psychostimulants as it is in adults, supporting the idea that the dopamine system is hypoactive in adolescence. This article is part of a Special Issue entitled SI: Adolescent plasticity.


Subject(s)
Aging/drug effects , Aging/psychology , Central Nervous System Stimulants/pharmacology , Inhibition, Psychological , Methylphenidate/pharmacology , Analysis of Variance , Animals , Conditioning, Psychological/drug effects , Conditioning, Psychological/physiology , Cues , Dose-Response Relationship, Drug , Male , Motor Activity/drug effects , Motor Activity/physiology , Nonlinear Dynamics , Psychological Tests , Rats, Sprague-Dawley
18.
Biol Psychiatry ; 79(11): 878-86, 2016 06 01.
Article in English | MEDLINE | ID: mdl-26067679

ABSTRACT

BACKGROUND: Elucidating the neurobiology of the adolescent brain is fundamental to our understanding of the etiology of psychiatric disorders such as schizophrenia and addiction, the symptoms of which often manifest during this developmental period. Dopamine neurons in the ventral tegmental area (VTA) are strongly implicated in adolescent behavioral and psychiatric vulnerabilities, but little is known about how adolescent VTA neurons encode information during motivated behavior. METHODS: We recorded daily from VTA neurons in adolescent and adult rats during learning and maintenance of a cued, reward-motivated instrumental task and extinction from this task. RESULTS: During performance of the same motivated behavior, identical events were encoded differently by adult and adolescent VTA neurons. Adolescent VTA neurons with dopamine-like characteristics lacked a reward anticipation signal and showed a smaller response to reward delivery compared with adults. After extinction, however, these neurons maintained a strong phasic response to cues formerly predictive of reward opportunity. CONCLUSIONS: Anticipatory neuronal activity in the VTA supports preparatory attention and is implicated in error prediction signaling. Absence of this activity, combined with persistent representations of previously rewarded experiences, may provide a mechanism for rash decision making in adolescents.


Subject(s)
Anticipation, Psychological/physiology , Neurons/physiology , Reward , Ventral Tegmental Area/growth & development , Ventral Tegmental Area/physiology , Action Potentials , Animals , Attention/physiology , Conditioning, Operant/physiology , Cues , Electrodes, Implanted , Extinction, Psychological/physiology , Male , Motivation/physiology , Rats, Sprague-Dawley
19.
J Neurophysiol ; 114(6): 3374-85, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26467523

ABSTRACT

Internal representations of action-outcome relationships are necessary for flexible adaptation of motivated behavior in dynamic environments. Prefrontal cortex (PFC) is implicated in flexible planning and execution of goal-directed actions, but little is known about how information about action-outcome relationships is represented across functionally distinct regions of PFC. Here, we observe distinct patterns of action-evoked single unit activity in the medial prefrontal cortex (mPFC) and orbitofrontal cortex (OFC) during a task in which the relationship between outcomes and actions was independently manipulated. The mPFC encoded changes in the number of actions required to earn a reward, but not fluctuations in outcome magnitude. In contrast, OFC neurons decreased firing rates as outcome magnitude was increased, but were insensitive to changes in action requirement. A subset of OFC neurons also tracked outcome availability. Pre-outcome anticipatory activity in both mPFC and OFC was altered when reward expectation was reduced, but did not differ with outcome magnitude. These data provide novel evidence that PFC regions encode distinct information about the relationship between actions and impending outcomes during action execution.


Subject(s)
Action Potentials , Neurons/physiology , Prefrontal Cortex/physiology , Reward , Animals , Anticipation, Psychological , Male , Prefrontal Cortex/cytology , Rats , Rats, Sprague-Dawley
20.
Dev Cogn Neurosci ; 11: 145-54, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25524828

ABSTRACT

Immaturities in adolescent reward processing are thought to contribute to poor decision making and increased susceptibility to developing addictive and psychiatric disorders. Very little is known, however, about how the adolescent brain processes reward. The current mechanistic theories of reward processing are derived from adult models. Here we review recent research focused on understanding how the adolescent brain responds to rewards and reward-associated events. A critical aspect of this work is that age-related differences are evident in neuronal processing of reward-related events across multiple brain regions even when adolescent rats demonstrate behavior similar to adults. These include differences in reward processing between adolescent and adult rats in the orbitofrontal cortex and dorsal striatum. Surprisingly, minimal age-related differences are observed in the ventral striatum, which has been a focal point of developmental studies. We go on to discuss the implications of these differences for behavioral traits affected in adolescence, such as impulsivity, risk-taking, and behavioral flexibility. Collectively, this work suggests that reward-evoked neural activity differs as a function of age and that regions such as the dorsal striatum that are not traditionally associated with affective processing in adults may be critical for reward processing and psychiatric vulnerability in adolescents.


Subject(s)
Aging/physiology , Brain/physiology , Decision Making/physiology , Reward , Animals , Cerebral Cortex/physiology , Corpus Striatum/physiology , Frontal Lobe/physiology , Impulsive Behavior , Rats , Risk-Taking