Results 1 - 18 of 18
1.
J Stud Alcohol Drugs ; 85(1): 26-31, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37796622

ABSTRACT

OBJECTIVE: Contingency management (CM) is the gold standard treatment for stimulant use disorder but typically requires twice- to thrice-weekly in-person treatment visits to objectively verify abstinence and deliver therapeutic incentives. There has been growing interest in telehealth-based delivery of CM to support broad access to this essential intervention, a need that has been emphatically underscored by the COVID-19 pandemic. Herein, we present observations from initial efforts to develop and test a protocol for telehealth-based delivery of prize-based CM treatment incentivizing stimulant abstinence. METHOD: Four participants engaged in hybrid courses of CM, including one or more telehealth-based treatment sessions, involving self-administered oral fluid testing to confirm abstinence. Observations from initial participants informed iterative improvements to telehealth procedures, and a 12-week course of telehealth-based CM was subsequently offered to two additional participants to further evaluate preliminary feasibility and acceptability. RESULTS: In most cases, participants were able to successfully join telehealth treatment sessions, self-administer oral fluid testing, and share oral fluid test results to verify stimulant abstinence. However, further improvements in telehealth-based toxicology testing may be necessary to interpret test results accurately and reliably, especially when colorimetric immunoassay results reflect substance concentrations near the cutoff for point-of-care testing devices. CONCLUSIONS: Preliminary findings suggest that telehealth-based CM is sufficiently feasible and acceptable to support future development, in particular through improved methods for remote interpretation and verification of test results. This is especially important in CM, wherein accurate and reliable detection of both early and sustained abstinence is crucial for appropriate delivery of therapeutic incentives.


Subject(s)
COVID-19 , Substance-Related Disorders , Telemedicine , Humans , Substance-Related Disorders/therapy , Pandemics , Behavior Therapy/methods
2.
Addict Behav Rep ; 18: 100518, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37955039

ABSTRACT

Research examining episodic future thinking (EFT; i.e., imagining oneself in future contexts) in community samples has demonstrated reduced discounting of delayed rewards when personalized event cues are included to prompt EFT related to reward latencies. While this EFT effect was recently demonstrated in individuals with substance use disorders, it is not yet known if it manifests similarly in individuals with and without a significant incarceration history, the latter being at elevated risk for negative outcomes including criminal recidivism. Individuals with cocaine use disorder (n = 35) identified personally relevant future events and participated in a computerized delay discounting task, involving decisions between smaller immediate rewards and larger delayed rewards with and without EFT cues. Individuals with (n = 19) and without (n = 16) a significant history of incarceration were identified using the Addiction Severity Index-Lite. A significant reduction in discounting rates was observed when event cues were included to promote EFT (p = 0.02); however, there was no main effect of incarceration history on discounting behavior, or interaction between episodic future thinking condition and incarceration history. Results suggest personalized cues included to evoke EFT reduce discounting behavior in individuals with cocaine use disorder, regardless of incarceration history. EFT-based interventions may therefore have promise to reduce impulsive decision-making in individuals with cocaine use disorder with and without a significant history of incarceration, potentially supporting improved outcomes with respect to both substance use and future criminality.

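As an illustration of the analytic approach described above, the sketch below fits a hyperbolic discounting model, V = A / (1 + kD), to per-participant indifference points and compares log-transformed discounting rates between standard and EFT-cued conditions with a paired test. The data are simulated and the function names are hypothetical; this is not the authors' analysis code.

```python
# Illustrative sketch: estimate hyperbolic discounting rates (k) and compare
# standard vs. EFT-cued conditions. Simulated data; not the authors' code.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
delays = np.array([1.0, 7.0, 30.0, 90.0, 180.0, 365.0])   # days

def hyperbolic(delay, k, amount=100.0):
    # Subjective value of a delayed $100 reward: V = A / (1 + k*D)
    return amount / (1.0 + k * delay)

def fit_k(indiff_points):
    # Fit the discounting rate k to one participant's indifference points
    (k,), _ = curve_fit(hyperbolic, delays, indiff_points,
                        p0=[0.01], bounds=(1e-6, 10.0))
    return k

def simulate(ks):
    # Noisy indifference points implied by each participant's true k
    return [np.clip(hyperbolic(delays, k) + rng.normal(0, 3, delays.size), 1, 100)
            for k in ks]

# 35 simulated participants; EFT cues are assumed to flatten the discounting curve
true_k_standard = rng.lognormal(mean=np.log(0.02), sigma=0.5, size=35)
log_k_standard = np.log([fit_k(ip) for ip in simulate(true_k_standard)])
log_k_eft = np.log([fit_k(ip) for ip in simulate(true_k_standard * 0.7)])

t, p = ttest_rel(log_k_standard, log_k_eft)   # within-subject comparison of log(k)
print(f"Mean log(k): standard {log_k_standard.mean():.2f} vs. EFT {log_k_eft.mean():.2f} (p = {p:.3f})")
```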
3.
Am J Drug Alcohol Abuse ; 49(1): 5-20, 2023 01 02.
Article in English | MEDLINE | ID: mdl-36099534

ABSTRACT

Background: Tools predicting individual relapse risk would invaluably inform clinical decision-making (e.g. level-of-care) in substance use treatment. Studies of neuroprediction - use of neuromarkers to predict individual outcomes - have the dual potential to create such tools and inform etiological models leading to new treatments. However, financial limitations, statistical power demands, and related factors encourage restrictive selection criteria, yielding samples that do not fully represent the target population. This problem may be further compounded by a lack of statistical optimism correction in neuroprediction research, resulting in predictive models that are overfit to already-restricted samples. Objectives: This systematic review aims to identify potential threats to external validity related to restrictive selection criteria and underutilization of optimism correction in the existing neuroprediction literature targeting substance use treatment outcomes. Methods: Sixty-seven studies of neuroprediction in substance use treatment were identified and details of sample selection criteria and statistical optimism correction were extracted. Results: Most publications were found to report restrictive selection criteria (e.g. excluding psychiatric (94% of publications) and substance use comorbidities (69% of publications)) that would rule out a considerable portion of the treatment population. Furthermore, only 21% of publications reported optimism correction. Conclusion: Restrictive selection criteria and underutilization of optimism correction are common in the existing literature and may limit the generalizability of identified neural predictors to the target population whose treatment they would ultimately inform. Greater attention to the inclusivity and generalizability of addiction neuroprediction research, as well as new opportunities provided through open science initiatives, has the potential to address this issue.


Subject(s)
Behavior, Addictive , Substance-Related Disorders , Humans , Patient Selection , Research Design , Treatment Outcome
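The review's central statistical point, optimism correction, can be illustrated with a bootstrap procedure in the spirit of Harrell's method: the apparent performance of a model evaluated on its own training data is reduced by the average optimism estimated across bootstrap refits. The sketch below uses hypothetical data and a generic classifier; it is an illustration of the idea, not the review's methodology.

```python
# Illustrative bootstrap optimism correction for a model's apparent AUC.
# Data, sample size, and feature names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 20))        # e.g., 60 patients, 20 candidate neuromarkers
y = rng.integers(0, 2, size=60)      # e.g., relapse vs. sustained abstinence

def apparent_auc(X, y):
    # Optimistic estimate: the model is trained and evaluated on the same data
    model = LogisticRegression(max_iter=1000).fit(X, y)
    return roc_auc_score(y, model.predict_proba(X)[:, 1])

app = apparent_auc(X, y)

# Bootstrap optimism: refit in bootstrap samples and compare each refit's
# bootstrap-apparent AUC with its AUC on the original data.
optimism = []
for _ in range(200):
    idx = rng.integers(0, len(y), len(y))
    Xb, yb = X[idx], y[idx]
    if len(np.unique(yb)) < 2:
        continue                      # skip degenerate resamples with one class
    model = LogisticRegression(max_iter=1000).fit(Xb, yb)
    auc_boot = roc_auc_score(yb, model.predict_proba(Xb)[:, 1])
    auc_orig = roc_auc_score(y, model.predict_proba(X)[:, 1])
    optimism.append(auc_boot - auc_orig)

corrected = app - np.mean(optimism)
print(f"Apparent AUC {app:.2f}, optimism-corrected AUC {corrected:.2f}")
```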
4.
J Psychiatr Pract ; 28(6): 497-504, 2022 11 01.
Article in English | MEDLINE | ID: mdl-36355590

ABSTRACT

OBJECTIVE: Long-acting injectable antipsychotics (LAI-As) are a crucial treatment option for individuals with serious mental illness. However, due to the necessity of in-person administration of LAI-As, pandemics pose unique challenges for continuity of care in the population prescribed these medications. This project investigated the impact of the coronavirus disease 2019 (COVID-19) pandemic on LAI-A adherence at a Veterans Health Administration medical facility in the United States, as well as changes in LAI-A prescribing and administration practices during this period. METHODS: Electronic health records were evaluated for 101 patients prescribed LAI-As. A subset of 13 patients also participated in an interview and rated subjective concerns about pandemic-related barriers to medication adherence. RESULTS: Pandemic-related barriers to LAI-A adherence and/or changes to LAI-A medications were documented in 33% of the patients. Within-subjects comparison of an adherence metric computed from electronic health record data further suggested a somewhat higher incidence of missed or delayed LAI-A doses during the pandemic compared with before the pandemic. In contrast, only 2 of the 13 patients interviewed anticipated that pandemic-related concerns would interfere with medication adherence. CONCLUSIONS: The results of this study suggest that LAI-A access and adherence can be disrupted by pandemics and other public health emergencies but this finding may not generalize to other sites. As patients may not foresee the potential for disruption, psychiatric service providers may need to assist in proactively problem-solving barriers to access. Improved preparedness and additional safeguards against pandemic-related disruptions to LAI-A access and adherence may help mitigate adverse outcomes in the future. Identifying patients at elevated risk for such disruptions may help support these efforts.


Subject(s)
Antipsychotic Agents , COVID-19 , Schizophrenia , Humans , United States , Antipsychotic Agents/therapeutic use , Pandemics , Schizophrenia/drug therapy , Delayed-Action Preparations/therapeutic use , Injections , Medication Adherence
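The abstract does not specify how the EHR-based adherence metric was computed; one plausible metric is the proportion of injections administered more than a grace period beyond the scheduled interval, compared before and during the pandemic. The sketch below implements that hypothetical metric with made-up administration dates and assumed interval and grace parameters.

```python
# Hypothetical sketch: flag delayed long-acting injections from administration
# dates and compare pre-pandemic vs. pandemic periods. Not the study's metric.
from datetime import date

PANDEMIC_START = date(2020, 3, 11)
INTERVAL_DAYS = 28   # assumed dosing interval (e.g., a monthly LAI-A)
GRACE_DAYS = 7       # assumed allowable delay before a dose counts as late

def late_dose_rate(admin_dates):
    """Proportion of doses given more than INTERVAL + GRACE days after the previous dose."""
    gaps = [(b - a).days for a, b in zip(admin_dates, admin_dates[1:])]
    late = [g > INTERVAL_DAYS + GRACE_DAYS for g in gaps]
    return sum(late) / len(late) if late else float("nan")

# Hypothetical single-patient administration record
doses = [date(2019, 10, 1), date(2019, 10, 29), date(2019, 11, 26),
         date(2019, 12, 26), date(2020, 1, 24), date(2020, 2, 21),
         date(2020, 3, 30), date(2020, 5, 15), date(2020, 6, 12)]

pre = [d for d in doses if d < PANDEMIC_START]
# carry the last pre-pandemic dose forward so the first pandemic gap is counted
post = [pre[-1]] + [d for d in doses if d >= PANDEMIC_START]

print(f"Pre-pandemic late-dose rate:    {late_dose_rate(pre):.2f}")
print(f"During-pandemic late-dose rate: {late_dose_rate(post):.2f}")
```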
5.
Psychiatr Serv ; 73(5): 580-583, 2022 05.
Article in English | MEDLINE | ID: mdl-34496628

ABSTRACT

Individuals with psychiatric disorders often struggle to initiate and engage in treatment. Financial incentives improve treatment engagement, including treatment attendance, medication adherence, and abstinence from substance use. The U.S. Department of Veterans Affairs (VA) recently made the first large-scale, successful effort to implement incentive-based interventions in substance use disorder treatment. Health care systems, including the VA, can increase the impact of these interventions by extending them to target a range of psychiatric disorders, adapting them for specific clinical contexts, using insights from behavioral economics, and partnering with corporations to fund incentives and implement interventions.


Subject(s)
Substance-Related Disorders , Veterans , Humans , Motivation , Psychotherapy , Substance-Related Disorders/psychology , Substance-Related Disorders/therapy , United States , United States Department of Veterans Affairs , Veterans/psychology
6.
Contemp Clin Trials Commun ; 23: 100796, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34278041

ABSTRACT

BACKGROUND: Electrophysiological measures can predict and reflect substance use treatment response. Veterans are disproportionately affected by disorders of addiction, with cocaine use disorder (CUD) being particularly problematic due to high relapse rates and the absence of approved pharmacotherapies. Prize-based Contingency Management (PBCM) is an evidence-based behavioral intervention for CUD, involving incentives for cocaine abstinence, but treatment response is variable. Measurement-based adaptation of PBCM has promise to improve effectiveness but remains to be usefully developed. METHODS: This trial aims to determine if individuals with distinct neurocognitive profiles differentially benefit from one of two existing versions of PBCM. CUD patients will be randomized into treatment-as-usual or 12 weeks of PBCM using either monetary or tangible prize incentives. Prior to randomization, EEG will be used to assess response to monetary versus tangible reward; EEG and cognitive-behavioral measures of working memory, cognitive control, and episodic future thinking will also be acquired. Substance use and treatment engagement will be monitored throughout the treatment interval and assessments will be repeated at post-treatment. DISCUSSION: Results of this trial may elucidate individual differences contributing to PBCM treatment response and reveal predictors of differential benefits from existing treatment variants. The design also affords the opportunity to evaluate treatment-related changes in neurocognitive functioning over the course of PBCM. Our model posits that PBCM scaffolds future-oriented goal representation and self-control to support abstinence. Individuals with poorer functioning may be less responsive to abstract monetary reward and may therefore achieve better outcomes with respect to abstinence and treatment engagement when tangible incentives are utilized.

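Prize-based CM awards draws from a prize bowl for each verified-abstinent sample, with the number of draws typically escalating over consecutive negative samples and most slips being non-winning or small prizes. The sketch below illustrates that incentive logic; the bowl composition, escalation rule, and prize labels are illustrative assumptions, not this trial's actual schedule.

```python
# Illustrative prize-bowl logic for prize-based contingency management (PBCM).
# Bowl composition, escalation rule, and prize labels are assumptions.
import random

# A typical bowl mixes many non-winning slips with small, large, and rare jumbo prizes.
BOWL = (["Good job!"] * 250 + ["small prize"] * 209 +
        ["large prize"] * 40 + ["jumbo prize"] * 1)

def draws_earned(consecutive_negative_samples, start=1, increment=1, cap=10):
    """Escalating draws: more consecutive stimulant-negative samples -> more draws."""
    return min(start + increment * (consecutive_negative_samples - 1), cap)

def run_visit(consecutive_negative_samples, rng=random.Random(42)):
    """Draw slips from the bowl for one abstinent treatment visit."""
    n = draws_earned(consecutive_negative_samples)
    return [rng.choice(BOWL) for _ in range(n)]

for streak in (1, 3, 6):
    print(f"{streak} consecutive negative samples -> {draws_earned(streak)} draws:",
          run_visit(streak))
```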
7.
Am J Drug Alcohol Abuse ; 47(2): 199-208, 2021 03 04.
Article in English | MEDLINE | ID: mdl-33539190

ABSTRACT

Background: Episodic future thinking (EFT; i.e., envisioning oneself in future contexts) has been demonstrated to reduce discounting of future reward in healthy adults. While this approach has the potential to support future-oriented decision-making in substance use recovery, the impact of EFT on discounting behavior in illicit stimulant users has not yet been evaluated. Objectives: This pilot study aimed to (1) assess the feasibility of utilizing EFT methods in individuals with cocaine use disorder (CUD) and (2) conduct preliminary measurement of the EFT effect on discounting behavior in this population. Methods: Eighteen treatment-seeking individuals with CUD (17 males) were interviewed about positive and neutral events expected to occur at a range of future latencies. Future event information identified by participants was subsequently included on a subset of trials in an intertemporal choice task to promote EFT; within-subject differences in discounting between standard and EFT conditions were evaluated. Results: Participants identified relevant events and demonstrated decreased discounting of future reward when event descriptors were included (relative to discounting without event descriptors; p = .039). It was further noted that most events identified by participants were goals, rather than plans or significant dates. Conclusion: While methods previously used to study the effect of EFT on discounting behavior in healthy individuals are also effective in individuals with CUD, methodological factors - including types of events identified - should be carefully considered in future work. These preliminary findings suggest that EFT can reduce impulsive decision-making in cocaine use disorder and may therefore have therapeutic value.


Subject(s)
Cocaine-Related Disorders/psychology , Delay Discounting , Reward , Adolescent , Adult , Aged , Choice Behavior , Female , Humans , Impulsive Behavior , Male , Middle Aged , Pilot Projects , Thinking , Veterans/psychology , Young Adult
8.
J Subst Abuse Treat ; 100: 64-83, 2019 05.
Article in English | MEDLINE | ID: mdl-30898330

ABSTRACT

Contingency Management is an evidence-based treatment for substance use disorders with strong potential for measurement-based customization. Previous work has examined individual difference factors in Contingency Management treatment response of potential relevance to treatment targeting and adaptive implementation; however, a systematic review of such factors has not yet been conducted. Here, we summarize and evaluate the existing literature on patient-level predictors, mediators, and moderators of Contingency Management treatment response in stimulant and/or opioid using outpatients - clinical populations most frequently targeted in Contingency Management research and clinical practice. Our search strategy identified 648 unique, peer-reviewed publications, of which 39 met full inclusion criteria for the current review. These publications considered a variety of individual difference factors, including (1) motivation to change and substance use before and during treatment (8/39 publications), (2) substance use comorbidity and chronicity (8/39 publications), (3) psychiatric comorbidity and severity (8/39 publications), (4) medical, legal, and sociodemographic considerations (15/39 publications), and (5) cognitive-behavioral variables (1/39 publications). Contingency Management was generally associated with improved treatment outcomes (e.g., longer periods of continuous abstinence, better retention), regardless of individual difference factors; however, specific patient-level characteristics were associated with either an enhanced (e.g., more previous treatment attempts, history of sexual abuse, diagnosis of antisocial personality disorder) or diminished (e.g., complex post-traumatic stress symptoms, pretreatment benzodiazepine use) response to Contingency Management. Overall, the current literature is limited but existing evidence generally supports greater benefits of Contingency Management in patients who would otherwise have a poorer prognosis in standard outpatient care. It was also identified that the majority of previous work represents a posteriori analysis of pre-existing clinical samples and has therefore rarely considered pre-specified, hypothesis-driven individual difference factors. We therefore additionally highlight patient-level factors that are currently understudied, as well as promising future directions for measurement-based treatment adaptations that may directly respond to patient traits and states to improve Contingency Management effectiveness across individuals and over time.


Subject(s)
Behavior Therapy , Evidence-Based Practice , Individuality , Motivation , Outcome Assessment, Health Care , Reward , Substance-Related Disorders/therapy , Humans
9.
Drug Alcohol Depend ; 185: 93-105, 2018 04 01.
Article in English | MEDLINE | ID: mdl-29428325

ABSTRACT

BACKGROUND: Predicting relapse vulnerability can inform level-of-care and personalized substance use treatment. Few reliable predictors of relapse risk have been identified from traditional clinical, psychosocial, and demographic variables. However, recent neuroimaging findings highlight the potential prognostic import of brain-based signals, indexing the degree to which neural systems have been perturbed by addiction. These proposed "neuromarkers" forecast the likelihood, severity, and timing of relapse, but the reliability and generalizability of such effects remain to be established. METHODS: Activation likelihood estimation was used to conduct a preliminary quantitative, coordinate-based meta-analysis of the addiction neuroprediction literature; specifically, studies wherein baseline measures of regional cerebral blood flow were prospectively associated with substance use treatment outcomes. Consensus patterns of activation associated with relapse vulnerability (greater activation predicts poorer outcomes) versus resilience (greater activation predicts improved outcomes) were specifically investigated. RESULTS: Twenty-four eligible studies yielded 134 foci, representing 923 subjects. Consensus activation was identified in right putamen and claustrum (p < .05, cluster-corrected) in relation to positive and negative treatment outcomes - likely reflecting variability in measurement context (e.g., task, sample characteristics) across datasets. A single cluster in rostral-ventral anterior cingulate cortex (rACC) was associated with relapse resilience, specifically (p < .05, cluster-corrected); no significant vulnerability-related clusters were identified. CONCLUSIONS: Right putamen activation has been associated with relapse vulnerability and resilience, while increased baseline rACC activation has been consistently associated with improved treatment outcomes. Methodological heterogeneity within the existing literature, however, limits firm conclusions, and future work will be necessary to confirm and clarify these results.


Subject(s)
Brain/diagnostic imaging , Cerebrovascular Circulation/physiology , Magnetic Resonance Imaging , Neuroimaging , Substance-Related Disorders/diagnostic imaging , Behavior, Addictive/diagnostic imaging , Humans , Recurrence , Reproducibility of Results
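Activation likelihood estimation models each reported focus as a 3-D Gaussian probability distribution, builds a modeled activation map per study, and combines the maps across studies. The toy numpy sketch below shows that core computation on a coarse grid with hypothetical foci; real analyses use dedicated tools (e.g., GingerALE or NiMARE) with sample-size-dependent kernels and permutation-based cluster correction.

```python
# Toy sketch of the core ALE computation: per-study modeled activation (MA)
# maps from Gaussian-smoothed foci, combined across studies as 1 - prod(1 - MA).
import numpy as np

GRID = np.stack(np.meshgrid(*[np.arange(0, 50, 2.0)] * 3, indexing="ij"), axis=-1)

def modeled_activation(foci_mm, fwhm=10.0):
    """Per-study MA map: max Gaussian value over that study's foci at each voxel."""
    sigma = fwhm / 2.355
    ma = np.zeros(GRID.shape[:3])
    for focus in foci_mm:
        d2 = np.sum((GRID - np.asarray(focus)) ** 2, axis=-1)
        ma = np.maximum(ma, np.exp(-d2 / (2 * sigma ** 2)))
    return ma

# Hypothetical foci (mm coordinates) from three studies
studies = [
    [(24, 10, 2)],               # e.g., a putamen-like focus
    [(26, 8, 0), (4, 40, 6)],    # putamen-like plus rostral-ACC-like foci
    [(6, 42, 4)],                # rostral-ACC-like focus
]

ma_maps = np.stack([modeled_activation(f) for f in studies])
ale = 1.0 - np.prod(1.0 - ma_maps, axis=0)   # union of per-study probabilities
peak = np.unravel_index(np.argmax(ale), ale.shape)
print("Peak ALE value", ale.max().round(3), "at grid voxel", peak)
```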
10.
Psychopharmacology (Berl) ; 234(17): 2545-2562, 2017 Sep.
Article in English | MEDLINE | ID: mdl-28601965

ABSTRACT

BACKGROUND: Signals carried by the mesencephalic dopamine system and conveyed to anterior cingulate cortex are critically implicated in probabilistic reward learning and performance monitoring. A common evaluative mechanism purportedly subserves both functions, giving rise to homologous medial frontal negativities in feedback- and response-locked event-related brain potentials (the feedback-related negativity (FRN) and the error-related negativity (ERN), respectively), reflecting dopamine-dependent prediction error signals to unexpectedly negative events. Consistent with this model, the dopamine receptor antagonist, haloperidol, attenuates the ERN, but effects on FRN have not yet been evaluated. METHODS: ERN and FRN were recorded during a temporal interval learning task (TILT) following randomized, double-blind administration of haloperidol (3 mg; n = 18), diphenhydramine (an active control for haloperidol; 25 mg; n = 20), or placebo (n = 21) to healthy controls. Centroparietal positivities, the Pe and feedback-locked P300, were also measured and correlations between ERP measures and behavioral indices of learning, overall accuracy, and post-error compensatory behavior were evaluated. We hypothesized that haloperidol would reduce ERN and FRN, but that ERN would uniquely track automatic, error-related performance adjustments, while FRN would be associated with learning and overall accuracy. RESULTS: As predicted, ERN was reduced by haloperidol and in those exhibiting less adaptive post-error performance; however, these effects were limited to ERNs following fast timing errors. In contrast, the FRN was not affected by drug condition, although increased FRN amplitude was associated with improved accuracy. Significant drug effects on centroparietal positivities were also absent. CONCLUSIONS: Our results support a functional and neurobiological dissociation between the ERN and FRN.


Subject(s)
Dopamine Antagonists/pharmacology , Evoked Potentials/drug effects , Haloperidol/pharmacology , Learning/drug effects , Reward , Adult , Brain/drug effects , Brain Mapping/methods , Double-Blind Method , Electroencephalography/methods , Female , Humans , Male , Middle Aged , Young Adult
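The ERN and FRN are typically quantified as mean amplitudes in short post-response or post-feedback windows at frontocentral electrodes, often as error-minus-correct (or loss-minus-gain) difference measures. The sketch below illustrates that quantification on synthetic single-electrode epochs; the electrode, window, and baseline choices are assumptions, not the study's exact parameters.

```python
# Illustrative quantification of a response-locked ERN from epoched EEG
# (synthetic data; window, baseline, and electrode choices are assumptions).
import numpy as np

SFREQ = 500.0                                # Hz
TIMES = np.arange(-0.2, 0.6, 1 / SFREQ)      # epoch from -200 to +600 ms
rng = np.random.default_rng(2)

def make_epochs(n_trials, ern_amp):
    """Synthetic FCz-like epochs with a negativity peaking ~60 ms post-response."""
    noise = rng.normal(0, 2.0, size=(n_trials, TIMES.size))
    ern = -ern_amp * np.exp(-((TIMES - 0.06) ** 2) / (2 * 0.02 ** 2))
    return noise + ern

error_epochs = make_epochs(80, ern_amp=8.0)      # larger negativity on errors
correct_epochs = make_epochs(400, ern_amp=1.0)

def mean_amplitude(epochs, tmin=0.0, tmax=0.1, base=(-0.2, 0.0)):
    """Baseline-correct, average over trials, take mean amplitude in the window."""
    bmask = (TIMES >= base[0]) & (TIMES < base[1])
    wmask = (TIMES >= tmin) & (TIMES <= tmax)
    corrected = epochs - epochs[:, bmask].mean(axis=1, keepdims=True)
    return corrected.mean(axis=0)[wmask].mean()

ern = mean_amplitude(error_epochs) - mean_amplitude(correct_epochs)  # difference measure
print(f"ERN (error - correct, 0-100 ms): {ern:.1f} uV")
```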
11.
Addiction ; 112(5): 884-896, 2017 May.
Article in English | MEDLINE | ID: mdl-28029198

ABSTRACT

BACKGROUND AND AIMS: Patterns of brain activation have demonstrated promise as prognostic indicators in substance dependent individuals (SDIs) but have not yet been explored in SDIs typical of community-based treatment settings. DESIGN: Prospective clinical outcome design, evaluating baseline functional magnetic resonance imaging data from the Balloon Analogue Risk Task (BART) as a predictor of 3-month substance use treatment outcomes. SETTING: Community-based substance use programs in Bloomington, Indiana, USA. PARTICIPANTS: Twenty-three SDIs (17 male, aged 18-43 years) in an intensive outpatient or residential treatment program; abstinent 1-4 weeks at baseline. MEASUREMENTS: Event-related brain response, BART performance and self-report scores at treatment onset, substance use outcome measure (based on days of use). FINDINGS: Using voxel-level predictive modeling and leave-one-out cross-validation, an elevated response to unexpected negative feedback in bilateral amygdala and anterior hippocampus (Amyg/aHipp) at baseline successfully predicted greater substance use during the 3-month study interval (P ≤ 0.006, cluster-corrected). This effect was robust to inclusion of significant non-brain-based covariates. A larger response to negative feedback in bilateral Amyg/aHipp was also associated with faster reward-seeking responses after negative feedback (r(23)  = -0.544, P = 0.007; r(23)  = -0.588, P = 0.003). A model including Amyg/aHipp activation, faster reward-seeking after negative feedback and significant self-report scores accounted for 45% of the variance in substance use outcomes in our sample. CONCLUSIONS: An elevated response to unexpected negative feedback in bilateral amygdala and anterior hippocampus (Amyg/aHipp) appears to predict relapse to substance use in people attending community-based treatment.


Subject(s)
Brain/physiopathology , Substance-Related Disorders/rehabilitation , Adolescent , Adult , Ambulatory Care , Amygdala/diagnostic imaging , Amygdala/physiopathology , Brain/diagnostic imaging , Community Mental Health Services , Feedback , Female , Functional Neuroimaging , Hippocampus/diagnostic imaging , Hippocampus/physiopathology , Humans , Magnetic Resonance Imaging , Male , Prognosis , Prospective Studies , Residential Treatment , Reward , Substance-Related Disorders/diagnostic imaging , Substance-Related Disorders/physiopathology , Treatment Outcome , Young Adult
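The prognostic analysis rests on cross-validated prediction: each participant's outcome is predicted by a model trained on the remaining participants. The sketch below shows leave-one-out cross-validated prediction of a continuous substance-use outcome from baseline brain features using scikit-learn, with synthetic data; it illustrates the validation scheme only, not the study's voxel-level pipeline.

```python
# Minimal leave-one-out cross-validated prediction of a continuous outcome
# (e.g., days of substance use) from baseline brain features. Synthetic data.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
n_subjects, n_features = 23, 200
X = rng.normal(size=(n_subjects, n_features))      # e.g., responses to negative feedback
true_w = np.zeros(n_features)
true_w[:5] = 1.5                                   # a few informative features
y = X @ true_w + rng.normal(0, 2.0, n_subjects)    # e.g., days of use during follow-up

model = Ridge(alpha=10.0)
# Each subject is predicted by a model trained on the other 22 subjects
y_pred = cross_val_predict(model, X, y, cv=LeaveOneOut())

r, p = pearsonr(y, y_pred)
print(f"LOOCV predicted vs. observed: r = {r:.2f}, p = {p:.3f}")
```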
12.
Drug Alcohol Depend ; 168: 52-60, 2016 Nov 01.
Article in English | MEDLINE | ID: mdl-27620345

ABSTRACT

BACKGROUND: Few studies have explored longitudinal change in event-related brain responses during early recovery from addiction. Moreover, existing findings yield evidence of both increased and decreased signaling within reward and control centers over time. The current study explored reward- and control-related signals in a risky decision-making task and specifically investigated parametric modulations of the BOLD signal, rather than signal magnitude alone. It was hypothesized that risk-related signals during decision-making and outcome evaluation would reflect recovery and that change in specific signals would correspond with improved treatment outcomes. METHODS: Twenty-one substance dependent individuals were recruited upon enrollment in community-based substance use treatment programs, wherein they received treatment-as-usual. Participants completed functional neuroimaging assessments at baseline and 3-month follow-up while performing the Balloon Analogue Risk Task (BART). Risk- and reward-sensitive signals were identified using parametric modulators. Substance use was tracked throughout the 3-month study interval using the timeline follow-back procedure. RESULTS: Longitudinal contrasts of parametric modulators suggested improved formation of risk-informed outcome expectations at follow-up. Specifically, a greater response to high risk (low-likelihood) positive feedback was identified in caudal anterior cingulate cortex (ACC) and a greater response to low risk (low-likelihood) negative feedback was identified in caudal ACC and inferior frontal gyrus. In addition, attenuation of a ventromedial prefrontal cortex (vmPFC) "reward-seeking" signal (i.e., increasing response with greater reward) during risky decisions at follow-up was associated with less substance use during the study interval. CONCLUSIONS: Changes in risk- and reward-related signaling in ACC/vmPFC appear to reflect recovery and may support sobriety.


Subject(s)
Behavior, Addictive/diagnostic imaging , Decision Making/physiology , Gyrus Cinguli/diagnostic imaging , Prefrontal Cortex/diagnostic imaging , Reward , Substance-Related Disorders/therapy , Adult , Brain Mapping/methods , Female , Functional Neuroimaging , Humans , Magnetic Resonance Imaging/methods , Male , Neuropsychological Tests , Risk , Risk-Taking , Substance-Related Disorders/diagnostic imaging , Young Adult
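Parametric modulation analyses add regressors in which each event is weighted by a mean-centered trial-level variable (e.g., risk level) and convolved with a hemodynamic response function, so the fitted weight reflects how the BOLD response scales with that variable. The sketch below builds such a regressor with generic timing and HRF values; it is not the study's GLM specification.

```python
# Sketch of a parametric-modulator regressor: event onsets weighted by a
# mean-centered trial-level variable, convolved with a simple double-gamma HRF.
import numpy as np
from scipy.stats import gamma

TR, n_scans = 2.0, 200
t_hrf = np.arange(0, 32, TR)
hrf = gamma.pdf(t_hrf, 6) - 0.35 * gamma.pdf(t_hrf, 16)   # generic double-gamma shape
hrf /= hrf.max()

onsets = np.array([10, 40, 75, 120, 160, 220, 260, 310])  # seconds (hypothetical trials)
risk = np.array([1, 3, 5, 2, 4, 5, 1, 3], dtype=float)    # trial-level modulator

stick = np.zeros(n_scans)
stick[(onsets / TR).astype(int)] += 1.0                   # unmodulated event regressor
pmod = np.zeros(n_scans)
pmod[(onsets / TR).astype(int)] += risk - risk.mean()     # mean-centered parametric modulator

design_main = np.convolve(stick, hrf)[:n_scans]
design_pmod = np.convolve(pmod, hrf)[:n_scans]            # enters the GLM alongside the main regressor
print("Parametric regressor range:", design_pmod.min().round(2), "to", design_pmod.max().round(2))
```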
13.
Br J Nutr ; 113(4): 654-64, 2015 Feb 28.
Article in English | MEDLINE | ID: mdl-25630436

ABSTRACT

Dietary assessment in older adults can be challenging. The Novel Assessment of Nutrition and Ageing (NANA) method is a touch-screen computer-based food record that enables older adults to record their dietary intakes. The objective of the present study was to assess the relative validity of the NANA method for dietary assessment in older adults. For this purpose, three studies were conducted in which a total of ninety-four older adults (aged 65-89 years) used the NANA method of dietary assessment. On a separate occasion, participants completed a 4 d estimated food diary. Blood and 24 h urine samples were also collected from seventy-six of the volunteers for the analysis of biomarkers of nutrient intake. The results from all the three studies were combined, and nutrient intake data collected using the NANA method were compared against the 4 d estimated food diary and biomarkers of nutrient intake. Bland-Altman analysis showed a reasonable agreement between the dietary assessment methods for energy and macronutrient intake; however, there were small, but significant, differences for energy and protein intake, reflecting the tendency for the NANA method to record marginally lower energy intakes. Significant positive correlations were observed between urinary urea and dietary protein intake using both the NANA and the 4 d estimated food diary methods, and between plasma ascorbic acid and dietary vitamin C intake using the NANA method. The results demonstrate the feasibility of computer-based dietary assessment in older adults, and suggest that the NANA method is comparable to the 4 d estimated food diary, and could be used as an alternative to the food diary for the short-term assessment of an individual's dietary intake.


Subject(s)
Aging , Diet/adverse effects , Nutritional Status , Aged , Aged, 80 and over , Aging/blood , Aging/urine , Ascorbic Acid/administration & dosage , Ascorbic Acid/blood , Biomarkers/blood , Biomarkers/urine , Computers , Diet Records , Dietary Proteins/administration & dosage , Energy Intake , Feasibility Studies , Female , Geriatric Assessment , Humans , Internet , Male , Nutrition Assessment , Reproducibility of Results , United Kingdom , Urea/urine , User-Computer Interface
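Bland-Altman analysis assesses agreement between two measurement methods by computing, for each participant, the difference and the mean of the paired measurements, then summarizing the bias (mean difference) and 95% limits of agreement. The sketch below runs that calculation on hypothetical energy-intake data.

```python
# Bland-Altman agreement between two dietary assessment methods
# (hypothetical energy intakes, kcal/day).
import numpy as np

rng = np.random.default_rng(4)
diary = rng.normal(1900, 300, size=94)            # 4-day estimated food diary
nana = diary - 60 + rng.normal(0, 150, size=94)   # NANA assumed to record slightly lower intakes

diff = nana - diary
mean_pair = (nana + diary) / 2                    # x-axis of the Bland-Altman plot
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                     # half-width of the 95% limits of agreement

print(f"Bias (NANA - diary): {bias:.0f} kcal/day")
print(f"Limits of agreement: {bias - loa:.0f} to {bias + loa:.0f} kcal/day")
# For the plot, diff would be scattered against mean_pair with horizontal
# lines at the bias and at bias +/- 1.96 SD.
```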
14.
PLoS One ; 9(3): e90281, 2014.
Article in English | MEDLINE | ID: mdl-24603900

ABSTRACT

There has been accumulating evidence that cognitive control can be adaptively regulated by monitoring for processing conflict as an index of online control demands. However, it is not yet known whether top-down control mechanisms respond to processing conflict in a manner specific to the operative task context or confer a more generalized benefit. While previous studies have examined the taskset-specificity of conflict adaptation effects, yielding inconsistent results, control-related performance adjustments following errors have been largely overlooked. This gap in the literature underscores recent debate as to whether post-error performance represents a strategic, control-mediated mechanism or a nonstrategic consequence of attentional orienting. In the present study, evidence of generalized control following both high conflict correct trials and errors was explored in a task-switching paradigm. Conflict adaptation effects were not found to generalize across tasksets, despite a shared response set. In contrast, post-error slowing effects were found to extend to the inactive taskset and were predictive of enhanced post-error accuracy. In addition, post-error performance adjustments were found to persist for several trials and across multiple task switches, a finding inconsistent with attentional orienting accounts of post-error slowing. These findings indicate that error-related control adjustments confer a generalized performance benefit and suggest dissociable mechanisms of post-conflict and post-error control.


Subject(s)
Adaptation, Psychological/physiology , Cognition/physiology , Color Perception/physiology , Pattern Recognition, Visual/physiology , Adolescent , Analysis of Variance , Conflict, Psychological , Female , Humans , Male , Photic Stimulation , Psychomotor Performance/physiology , Reaction Time/physiology , Young Adult
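Post-error slowing is commonly computed as mean reaction time on trials following errors minus mean reaction time on trials following correct responses. The sketch below shows that simple version on simulated trial data; splitting the same comparison by task repetition versus switch would test whether the effect generalizes to the inactive taskset.

```python
# Post-error slowing (PES): RT after errors minus RT after correct responses.
# Simulated trial-level data; variable names are illustrative.
import numpy as np

rng = np.random.default_rng(5)
n_trials = 400
correct = rng.random(n_trials) > 0.1               # ~10% error trials
rt = rng.normal(450, 60, n_trials)                 # reaction times in ms

prev_error = np.concatenate(([False], ~correct[:-1]))    # current trial follows an error
prev_correct = np.concatenate(([False], correct[:-1]))   # current trial follows a correct response
rt[prev_error] += 40                                     # build slowing after errors into the simulation

pes = rt[prev_error].mean() - rt[prev_correct].mean()
print(f"Post-error slowing: {pes:.0f} ms")
```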
15.
J Am Geriatr Soc ; 60(9): 1645-54, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22880945

ABSTRACT

OBJECTIVES: To determine the effect of a dietary intervention and micronutrient supplementation on self-reported infections in older adults. DESIGN: A randomized, placebo-controlled intervention trial. SETTING: Community living older people in South Yorkshire, United Kingdom. PARTICIPANTS: Two-hundred seventeen older adults aged 65 to 85. INTERVENTION: Participants were randomized to a dietary intervention, a daily micronutrient supplement, or placebo for 3 months, with a 3-month follow-up. MEASUREMENTS: Self-reported measures of infection were reported over the 6-month study period. Secondary outcome measures were nutritional status, dietary intake, quality of life, and depression. RESULTS: Self-reported measures of infection over the 6-month duration of the study were significantly different between the treatment groups. The number of weeks in which illness affected life and the number of general practitioner and hospital visits were significantly lower in the food and micronutrient groups than in the placebo group. The number of weeks in which symptoms of an infection were described was significantly lower in the food group than the placebo and micronutrient groups. Significant improvements in biomarkers of micronutrient status were achieved in the food and micronutrient groups and showed significantly greater change than observed in the placebo group. Significant improvement in dietary intakes was observed in the food group only. CONCLUSION: Improving dietary intake and micronutrient status reduces the clinical impact of self-reported infections in older adults.


Subject(s)
Diet , Infections/epidemiology , Micronutrients/administration & dosage , Nutritional Status , Aged , Aged, 80 and over , Analysis of Variance , Anthropometry , Depression/epidemiology , England/epidemiology , Female , Humans , Male , Poisson Distribution , Quality of Life , Treatment Outcome
16.
J Cogn Neurosci ; 23(4): 923-35, 2011 Apr.
Article in English | MEDLINE | ID: mdl-20146615

ABSTRACT

Mechanisms by which the brain monitors and modulates performance are an important focus of recent research. The conflict-monitoring hypothesis posits that the ACC detects conflict between competing response pathways which, in turn, signals for enhanced control. The N2, an ERP component that has been localized to ACC, has been observed after high conflict stimuli. As a candidate index of the conflict signal, the N2 would be expected to be sensitive to the degree of response conflict present, a factor that depends on both the features of external stimuli and the internal control state. In the present study, we sought to explore the relationship between N2 amplitude and these variables through use of a modified Eriksen flankers task in which target-distracter compatibility was parametrically varied. We hypothesized that greater target-distracter incompatibility would result in higher levels of response conflict, as indexed by both behavior and the N2 component. Consistent with this prediction, there were parametric degradations in behavioral performance and increases in N2 amplitudes with increasing incompatibility. Further, increasingly incompatible stimuli led to the predicted parametric increases in control on subsequent incompatible trials as evidenced by enhanced performance and reduced N2 amplitudes. These findings suggest that the N2 component and associated behavioral performance are finely sensitive to the degree of response conflict present and to the control adjustments that result from modulations in conflict.


Subject(s)
Adaptation, Psychological/physiology , Cognition/physiology , Conflict, Psychological , Gyrus Cinguli/physiology , Adolescent , Analysis of Variance , Electroencephalography/methods , Evoked Potentials/physiology , Female , Humans , Male , Reaction Time/physiology , Statistics as Topic , Young Adult
17.
Neuroimage ; 55(1): 253-65, 2011 Mar 01.
Article in English | MEDLINE | ID: mdl-21094259

ABSTRACT

The medial prefrontal cortex (mPFC) is active in conditions of performance monitoring including error commission and response conflict, but the mechanisms underlying these effects remain in dispute. Recent work suggests that mPFC learns to predict the value of actions, and that error effects represent a discrepancy between actual and expected outcomes of an action. In general, expectation signals regarding the outcome of an action may have a temporal structure, given that outcomes are expected at specific times. Nonetheless, it is unknown whether and how mPFC predicts the timing as well as the valence of expected action outcomes. Here we show with fMRI that otherwise correct feedback elicits apparent error-related activity in mPFC when delivered later than expected, suggesting that mPFC predicts not only the valence but also the timing of expected outcomes of an action. Results of a model-based analysis of fMRI data suggested that regions in the caudal cingulate zone, dorsal mPFC, and dorsal anterior cingulate cortex were jointly responsive to unexpectedly delayed feedback and negative feedback outcomes. These results suggest that regions in anterior cingulate and mPFC may be more broadly responsive to outcome prediction errors, signaling violations of both predicted outcome valence and predicted outcome timing, and the results further constrain theories of performance monitoring and cognitive control pertaining to these regions.


Subject(s)
Choice Behavior/physiology , Magnetic Resonance Imaging , Nerve Net/physiology , Prefrontal Cortex/physiology , Task Performance and Analysis , Adolescent , Adult , Female , Humans , Male , Young Adult
18.
BMC Med Res Methodol ; 10: 17, 2010 Feb 22.
Article in English | MEDLINE | ID: mdl-20175903

ABSTRACT

BACKGROUND: The success of a human intervention trial depends upon the ability to recruit eligible volunteers. Many trials fail because of unrealistic recruitment targets and flawed recruitment strategies. In order to predict recruitment rates accurately, researchers need information on the relative success of various recruitment strategies. Few published trials include such information and the number of participants screened or approached is not always cited. METHODS: This paper will describe in detail the recruitment strategies employed to identify older adults for recruitment to a 6-month randomised controlled dietary intervention trial which aimed to explore the relationship between diet and immune function (The FIT study). The number of people approached and recruited, and the reasons for exclusion, will be discussed. RESULTS: Two hundred and seventeen participants were recruited to the trial. A total of 7,482 letters were sent to potential recruits using names and addresses that had been supplied by local Family (General) Practices. Eight hundred and forty-three potential recruits replied across all methods of recruitment (528 from GP letters and 315 from other methods). The eligibility of those who replied was determined using a screening telephone interview; 217 were found to be suitable and agreed to take part in the study. CONCLUSION: The study demonstrates the application of multiple recruitment methods to successfully recruit older people to a randomised controlled trial. The most successful recruitment method was contacting potential recruits by letter on NHS-headed notepaper using contacts provided by General Practices. Ninety percent of recruitment was achieved using this method. Adequate recruitment is fundamental to the success of a research project, and appropriate strategies must therefore be adopted in order to identify eligible individuals and achieve recruitment targets.


Subject(s)
Diet , Immunity/physiology , Patient Selection , Randomized Controlled Trials as Topic/methods , Aged , Aged, 80 and over , Female , Humans , Male
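The figures reported above support some simple yield arithmetic: the reply rate to the mailed letters, the share of replies attributable to the letters, and the conversion from reply to enrolment.

```python
# Worked recruitment-yield arithmetic from the figures reported in the abstract.
letters_sent = 7482
replies_total = 843
replies_from_letters = 528
recruited = 217

print(f"Reply rate to GP letters:       {replies_from_letters / letters_sent:.1%}")   # ~7.1%
print(f"Share of replies from letters:  {replies_from_letters / replies_total:.1%}")  # ~62.6%
print(f"Replies converted to enrolment: {recruited / replies_total:.1%}")             # ~25.7%
# The abstract notes that ~90% of the 217 enrolled came via the GP letters,
# i.e., roughly 195 participants.
```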