Results 1 - 20 of 31
1.
Brain Sci ; 13(12)2023 Nov 21.
Article in English | MEDLINE | ID: mdl-38137060

ABSTRACT

Anxiety and stress plague populations worldwide. Voluntary regulated breathing practices offer a tool to address this epidemic. We examined peer-reviewed published literature to understand effective approaches to and implementation of these practices. PubMed and ScienceDirect were searched to identify clinical trials evaluating isolated breathing-based interventions with psychometric stress/anxiety outcomes. Two independent reviewers conducted all screening and data extraction. Of 2904 unique articles, 731 abstracts, and 181 full texts screened, 58 met the inclusion criteria. Fifty-four of the studies' 72 interventions were effective. Components of effective and ineffective interventions were evaluated to develop a conceptual framework of factors associated with stress/anxiety reduction effectiveness. Effective breath practices avoided fast-only breath paces and sessions <5 min, while including human-guided training, multiple sessions, and long-term practice. Population, other breath paces, session duration ≥5 min, and group versus individual or at-home practices were not associated with effectiveness. Analysis of interventions that did not fit this framework revealed that extensive standing, interruptions, involuntary diaphragmatic obstruction, and inadequate training for highly technical practices may render otherwise promising interventions ineffective. Following this evidence-based framework can help maximize the stress/anxiety reduction benefits of breathing practices. Future research is warranted to further refine this easily accessible intervention for stress/anxiety relief.

2.
Braz J Otorhinolaryngol ; 89(1): 48-53, 2023.
Article in English | MEDLINE | ID: mdl-34716112

ABSTRACT

OBJECTIVE: To assess the value of a morphine Patient Controlled Intravenous Analgesia (PCIA) after Tonsillectomies (TE). METHODS: 30 adult patients were treated with oral analgesics (protocol group) and compared to 30 patients treated with a morphine PCIA for the first 3 Postoperative Days (PODs) after TE. Average and maximum pain severities (Numeric Rating Scale - NRS: 0-10) on PODs 1-3, analgesic score, quality of life, patient satisfaction and side effects were defined as outcome measures. RESULTS: Average pain severities of the protocol and the PCIA group were of similar magnitude (NRS) (POD1: 4.48 vs. 4.71 [p = 0.68], POD2: 4.75 vs. 4.22 [p = 0.32] and POD3: 4.44 vs. 4.25 [p = 0.71]). Maximum pain intensities on POD1 (p = 0.92), POD2 (p = 0.51) and POD3 (p = 0.36) were also comparable between both groups. Patients with a PCIA consumed significantly more opioids (p = 0.001) without significantly more side effects. CONCLUSION: The PCIA did not provide superior pain control compared to oral analgesics. In view of the considerable effort and the high opioid consumption, it cannot be recommended as a standardized application for pain control after TE.


Subject(s)
Morphine , Tonsillectomy , Adult , Humans , Morphine/adverse effects , Tonsillectomy/adverse effects , Quality of Life , Pain, Postoperative/drug therapy , Analgesics, Opioid/adverse effects , Analgesia, Patient-Controlled/adverse effects , Analgesia, Patient-Controlled/methods , Analgesics/therapeutic use
4.
Front Rehabil Sci ; 3: 864079, 2022.
Article in English | MEDLINE | ID: mdl-36189008

ABSTRACT

Purpose: Nearly one in three US adolescents meet the criteria for anxiety, an issue that has worsened with the COVID-19 pandemic. We developed a video-based slow diaphragmatic breathing stress-reduction curriculum for high school students and evaluated its feasibility, tolerability, and preliminary effectiveness. Methods: This cluster-randomized feasibility pilot compared 5-min slow diaphragmatic breathing for 5 weeks with treatment-as-usual control among four 12th-grade public high school classes. Students individually participated after school during COVID-19-related hybrid teaching, with slow diaphragmatic breathing three times/week and breath science education once/week. Feasibility was based on completion of breathing exercises, breath science education, and preliminary effectiveness assessments, and ease/tolerability was based on qualitative assessments. Preliminary effectiveness was measured with the State-Trait Anxiety Inventory (STAI) and a timed-exhale carbon dioxide tolerance test (CO2TT) of physiological stress response. Descriptive statistics and repeated analysis of variance were performed to quantify and compare outcomes between time periods. Human subjects research approval was granted through Western IRB-Copernicus Group (WCG IRB) [ClinicalTrials.gov, Identifier: NCT05266833.]. Results: Forty-three students consented to participate. Breath practice compliance ranged from 29 to 83% across classes and weeks, and decreased on average over the 5 weeks. Compliance with the breath science videos ranged from 43 to 86%, and that with the weekly STAI-State and CO2TT measures varied from 36 to 86%. Compliance with ease/tolerability assessments ranged from 0 to 60%. Preliminary effectiveness assessments' compliance varied across classes from 83 to 89% during baseline, and 29 to 72% at follow-up. The curriculum was rated as somewhat-to-definitely useful/beneficial, and definitely-to-very easy/tolerable. 
Students reported enjoying the diaphragmatic breathing, CO2TT, and breath science education; some found the extended exhales challenging and the curriculum and assessments time-consuming. Preliminary effectiveness analyses indicated no significant changes in STAI or CO2TT from baseline to follow-up or from before to after breathing exercises (p > 0.05 for all). Conclusions: Implementation of this 5-week slow breathing curriculum was feasible and tolerable to this cohort. Compliance, tolerability, and effectiveness may be improved with in-class participation. Future research on simple and accessible slow-breathing exercises is warranted to address today's adolescent stress-management crisis. Trial Registration: ClinicalTrials.gov, Identifier: NCT05266833.

5.
BMC Health Serv Res ; 22(1): 231, 2022 Feb 19.
Article in English | MEDLINE | ID: mdl-35183180

ABSTRACT

BACKGROUND: Among the over 5 million informal caregivers for patients with Alzheimer's disease (AD) in the United States (US), over 60% experience insomnia. Research on insomnia treatment efficacy in AD caregivers is limited. An ongoing randomized non-inferiority clinical trial, the Caregiver Sleep Research study, is evaluating whether mindfulness meditation is non-inferior to cognitive behavioral therapy for insomnia (CBT-I) in the treatment of insomnia in AD caregivers. The present report examines estimated intervention costs in this ongoing trial. METHODS: Micro-costing was used to itemize and abstract costs of the two interventions: a mindfulness-based intervention known as mindful awareness practices for insomnia (MAP-I); and CBT-I. This approach involves collecting detailed data on resources utilized and the unit costs of those resources, thereby revealing actual resource use and economic costs for each treatment arm. Personnel time, patient time, and supplies were inventoried, and unit costs were applied. Caregiver time costs, including travel, were based on US Labor Bureau home-health aide national mean hourly wages; instructor/staff costs were based on hourly wages. Per-participant and program costs were calculated assuming individual- and group-delivery to reflect real-world implementation. Sensitivity analyses evaluated robustness of estimates. RESULTS: From the societal perspective, per-participant MAP-I costs were $1884 for individual and $1377 for group delivery; for CBT-I, these costs were $3978 and $1981, respectively. Compared with CBT-I, MAP-I provided cost savings of $2094 (53%) and $604 (30%) per treated caregiver for individual and group delivery, respectively. From the US healthcare system perspective, MAP-I vs. CBT-I participant savings were $1872 (65%) for individual and $382 (44%) for group interventions, respectively. For MAP-I and CBT-I, instructor in-class time was the highest cost component. 
Results were most sensitive to combined instructor time costs. CONCLUSIONS: Treatment of insomnia with MAP-I, compared to CBT-I, yields substantial cost savings for society and the healthcare system. With this potential for cost savings, results of the ongoing non-inferiority trial have critical implications for insomnia treatment dissemination and its benefits to AD caregivers and other community populations with insomnia.
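The per-participant savings above follow directly from the micro-costing figures. A minimal sketch of the arithmetic, using only the societal-perspective costs quoted in the abstract (the function name is illustrative, not from the study):

```python
def cost_savings(comparator_cost: float, intervention_cost: float) -> tuple[float, float]:
    """Absolute and percentage savings of the intervention relative to the comparator."""
    absolute = comparator_cost - intervention_cost
    percent = absolute / comparator_cost * 100
    return absolute, percent

# Societal perspective, per treated caregiver: CBT-I vs. MAP-I
individual = cost_savings(3978, 1884)  # individual delivery
group = cost_savings(1981, 1377)       # group delivery
print(round(individual[0]), round(individual[1]))  # 2094 53
print(round(group[0]), round(group[1]))            # 604 30
```

The rounded percentages match the abstract's reported 53% and 30% savings.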


Subject(s)
Alzheimer Disease , Cognitive Behavioral Therapy , Meditation , Mindfulness , Sleep Initiation and Maintenance Disorders , Alzheimer Disease/therapy , Caregivers , Cognitive Behavioral Therapy/methods , Humans , Mindfulness/methods , Sleep Initiation and Maintenance Disorders/therapy , Treatment Outcome
6.
Epilepsia ; 61(2): 319-329, 2020 02.
Article in English | MEDLINE | ID: mdl-31953846

ABSTRACT

OBJECTIVE: The burden of caregiving for persons with epilepsy (PWEs) has not been examined previously in the United States. We assessed the clinical impact and direct and indirect economic costs for caregivers of PWEs. METHODS: An internet survey of 500 caregivers of PWEs was conducted from May to July 2015 using a combination of validated instruments and questions designed specifically for this survey. Caregivers were stratified by PWE age (adult/child) and disease severity (low: 0 vs high: 1+ seizures in the prior month). Annual self-reported direct and indirect costs were reported per caregiver and extrapolated to all US caregivers. The economic burden of caregiving for PWEs was defined as the difference between costs for caregivers and the general population. RESULTS: Caregivers reported that PWEs averaged 11.4 seizures in the prior month. Eighty percent of respondents were female and the average age was 44.3 years. Since becoming a caregiver, many reported anxiety (52.8%), depression (41.0%), and insomnia (30.8%). Annual mean direct medical costs for caregivers of children with low vs high seizure frequency were $4344 and $10 162, respectively. Costs for caregivers of adult PWEs were $4936 and $8518. Mean indirect costs associated with caregiving for a child with low vs high seizure frequency were $20 529 and $40 137; those for caregivers of an adult were $13 981 and $28 410. The cost estimates are higher vs the general US population; annual per-person healthcare utilization costs were $2740 and productivity loss costs were $5015. When extrapolating to the US population of PWE caregivers, annual costs exceeded $62 billion vs $14 billion for the general population, resulting in a caregiver burden of nearly $48 billion. SIGNIFICANCE: The clinical and economic burden on caregivers of PWEs was substantial, and greatest for those caring for children with frequent seizures. 
The impact on caregivers should be considered when estimating the value of interventions that control epilepsy.


Subject(s)
Caregivers/psychology , Epilepsy/economics , Adolescent , Adult , Child , Child, Preschool , Cost of Illness , Costs and Cost Analysis , Epilepsy/psychology , Female , Health Care Costs , Humans , Male , Middle Aged , Quality of Life , Surveys and Questionnaires , United States/epidemiology
7.
Inquiry ; 56: 46958019875562, 2019.
Article in English | MEDLINE | ID: mdl-31524024

ABSTRACT

The burden of complications associated with peripheral intravenous use is underevaluated, in part, due to the broad use, inconsistent coding, and lack of mandatory reporting of these devices. This study aimed to analyze the clinical and economic impact of peripheral intravenous-related complications on hospitalized patients. This analysis of Premier Perspective® Database US hospital discharge records included admissions occurring between July 1, 2013 and June 30, 2015 for pneumonia, chronic obstructive pulmonary disease, myocardial infarction, congestive heart failure, chronic kidney disease, diabetes with complications, and major trauma (hip, spinal, cranial fractures). Admissions were assumed to include a peripheral intravenous line. Admissions involving surgery, dialysis, or central venous lines were excluded. Multivariable analyses compared inpatient length of stay, cost, admission to intensive care unit, and discharge status of patients with versus without peripheral intravenous-related complications (bloodstream infection, cellulitis, thrombophlebitis, other infection, or extravasation). Models were conducted separately for congestive heart failure, chronic obstructive pulmonary disease, diabetes with complications, and overall (all 7 diagnoses) and adjusted for demographics, comorbidities, and hospital characteristics. We identified 588 375 qualifying admissions: mean (SD) age 66.1 (20.6) years; 52.4% female; and 95.2% urgent/emergent admissions. Overall, 1.76% of patients (n = 10 354) had peripheral intravenous-related complications. In adjusted analyses between patients with versus without peripheral intravenous complications, the mean (95% confidence interval) inpatient length of stay was 5.9 (5.8-6.0) days versus 3.9 (3.9-3.9) days; mean hospitalization cost was $10 895 ($10 738-$11 052) versus $7009 ($6988-$7031). 
Patients with complications were less likely to be discharged home versus those without (62.4% [58.6%-66.1%] vs 77.6% [74.6%-80.5%]) and were more likely to have died (3.6% [2.9%-4.2%] vs 0.7% [0.6%-0.9%]). Models restricted to single admitting diagnosis were consistent with overall results. Patients with peripheral intravenous-related complications have longer length of stay, higher costs, and greater risk of death than patients without such complications; this is true across diagnosis groups of interest. Future research should focus on reducing these complications to improve clinical and economic outcomes.


Subject(s)
Catheterization, Peripheral/adverse effects , Hospital Costs/statistics & numerical data , Infection Control , Length of Stay , Patient Discharge/statistics & numerical data , Aged , Databases, Factual , Female , Hospitalization , Humans , Length of Stay/economics , Length of Stay/statistics & numerical data , Male , Retrospective Studies , United States
8.
Am J Clin Oncol ; 41(1): 65-72, 2018 Jan.
Article in English | MEDLINE | ID: mdl-26398184

ABSTRACT

PURPOSE: We conducted a cost-effectiveness analysis incorporating recent phase III clinical trial (FIRE-3) data to evaluate clinical and economic tradeoffs associated with first-line treatments of KRAS wild-type (WT) metastatic colorectal cancer (mCRC). MATERIALS AND METHODS: A cost-effectiveness model was developed using FIRE-3 data to project survival and lifetime costs of FOLFIRI plus either cetuximab or bevacizumab. Hypothetical KRAS-WT mCRC patients initiated first-line treatment and could experience adverse events, disease progression warranting second-line treatment, or clinical response and hepatic metastasectomy. Model inputs were derived from FIRE-3 and published literature. Incremental cost-effectiveness ratios (ICERs) were reported as US$ per life year (LY) and quality-adjusted life year (QALY). Scenario analyses considered patients with extended RAS mutations and CALGB/SWOG 80405 data; 1-way and probabilistic sensitivity analyses were conducted. RESULTS: Compared with bevacizumab, KRAS-WT patients receiving first-line cetuximab gained 5.7 months of life at a cost of $46,266, for an ICER of $97,223/LY ($122,610/QALY). For extended RAS-WT patients, the ICER was $77,339/LY ($99,584/QALY). Cetuximab treatment was cost-effective 80.3% of the time, given a willingness-to-pay threshold of $150,000/LY. Results were sensitive to changes in survival, treatment duration, and product costs. CONCLUSIONS: Our analysis of FIRE-3 data suggests that first-line treatment with cetuximab and FOLFIRI in KRAS (and extended RAS) WT mCRC patients may improve health outcomes and use financial resources more efficiently than bevacizumab and FOLFIRI. This information, in combination with other studies investigating comparative effectiveness of first-line options, can be useful to clinicians, payers, and policymakers in making treatment and resource allocation decisions for mCRC patients.
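The ICERs reported above are the ratio of incremental cost to incremental effectiveness. A hedged sketch of the calculation using the abstract's headline figures; the small gap versus the published $97,223/LY plausibly comes from the survival gain being rounded to 5.7 months in the abstract:

```python
def icer(incremental_cost: float, incremental_effect: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per unit of extra benefit."""
    return incremental_cost / incremental_effect

# 5.7 months of life gained for an extra $46,266 (cetuximab vs. bevacizumab arm)
life_years_gained = 5.7 / 12
print(f"${icer(46266, life_years_gained):,.0f} per LY")  # $97,402 per LY
```

The same ratio, divided instead by incremental QALYs, yields the $/QALY figures quoted in the abstract.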


Subject(s)
Bevacizumab/economics , Cetuximab/economics , Colorectal Neoplasms/drug therapy , Colorectal Neoplasms/pathology , Cost-Benefit Analysis , Health Care Costs , Adult , Aged , Bevacizumab/administration & dosage , Cetuximab/administration & dosage , Clinical Decision-Making , Colorectal Neoplasms/mortality , Disease-Free Survival , Female , Humans , Male , Middle Aged , Neoplasm Invasiveness/pathology , Neoplasm Metastasis , Neoplasm Staging , Prognosis , Risk Assessment , Survival Analysis , Treatment Outcome
9.
Curr Med Res Opin ; 34(3): 459-473, 2018 03.
Article in English | MEDLINE | ID: mdl-29105492

ABSTRACT

OBJECTIVE: Based on randomized controlled trials (RCTs), non-fatal myocardial infarction (MI) rates range between 9 and 15 events per 1000 person-years, ischemic stroke between 4 and 6 per 1000 person-years, coronary heart disease (CHD) death rates between 5 and 7 events per 1000 person-years, and any major vascular event between 28 and 53 per 1000 person-years in patients with atherosclerotic cardiovascular disease (ASCVD). We reviewed global literature on the topic to determine whether the real-world burden of secondary major adverse cardiovascular events (MACEs) is higher among ASCVD patients. METHODS: We searched PubMed and Embase using MeSH/keywords including cardiovascular disease, secondary prevention and observational studies. Studies published in the last 5 years, in English, with ≥50 subjects with elevated low-density lipoprotein cholesterol (LDL-C) or on statins, and reporting secondary MACEs were included. The Newcastle-Ottawa Scale (NOS) was used to assess the quality of each included study. RESULTS: Of 4663 identified articles, 14 studies that reported MACE incidence rates per 1000 person-years were included in the review (NOS grades ranged from 8 to 9; 2 were prospective and 12 were retrospective studies). Reported incidence rates per 1000 person-years had a range (median) of 12.01-39.9 (26.8) for MI, 13.8-57.2 (41.5) for ischemic stroke, 1.0-94.5 (21.1) for CV-related mortality and 9.7-486 (52.6) for all-cause mortality. Rates were 25.8-211 (81.1) for composite of MACEs. Multiple event rates had a range (median) of 60-391 (183) events per 1000 person-years. CONCLUSIONS: Our review indicates that MACE rates observed in real-world studies are substantially higher than those reported in RCTs, suggesting that the secondary MACE burden and potential benefits of effective CVD management in ASCVD patients may be underestimated if real-world data are not taken into consideration.
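The person-year rates quoted throughout this review normalize event counts by total follow-up time rather than by number of patients. A small sketch of that normalization; the cohort numbers are hypothetical, chosen only to reproduce the review's median MI rate:

```python
def rate_per_1000_person_years(events: int, person_years: float) -> float:
    """Incidence rate expressed per 1000 person-years of observed follow-up."""
    return events / person_years * 1000

# Hypothetical cohort: 134 myocardial infarctions observed over 5000 person-years
print(round(rate_per_1000_person_years(134, 5000), 1))  # 26.8
```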


Subject(s)
Atherosclerosis/epidemiology , Cardiovascular Diseases/epidemiology , Myocardial Infarction/epidemiology , Cholesterol/blood , Cholesterol, LDL/blood , Humans , Hydroxymethylglutaryl-CoA Reductase Inhibitors/administration & dosage , Randomized Controlled Trials as Topic , Secondary Prevention , Stroke/epidemiology
10.
Clinicoecon Outcomes Res ; 9: 495-503, 2017.
Article in English | MEDLINE | ID: mdl-28860831

ABSTRACT

OBJECTIVE: With the introduction of new therapies, hospitals have to plan spending limited resources in a cost-effective manner. To assist in identifying the optimal treatment for patients with locally advanced or metastatic gastroenteropancreatic neuroendocrine tumors, budget impact modeling was used to estimate the financial implications of adoption and diffusion of somatostatin analogs (SSAs). PATIENTS AND METHODS: A hypothetical cohort of 500 gastroenteropancreatic neuroendocrine tumor patients was assessed in an economic model, with the proportion with metastatic disease treated with an SSA estimated using published data. Drug acquisition, preparation, and administration costs were based on national pricing databases and published literature. Octreotide dosing was based on published estimates of real-world data, whereas for lanreotide, real-world dosing was unavailable and we therefore used the highest indicated dosing. Alternative scenarios reflecting the proportion of patients receiving lanreotide or octreotide were considered to estimate the incremental budget impact to the hospital. RESULTS: In the base case, 313 of the initial 500 gastroenteropancreatic neuroendocrine tumor patients were treated with an SSA. The model-predicted per-patient cost was US$83,473 for lanreotide and US$89,673 for octreotide. With a hypothetical increase in lanreotide utilization from 5% to 30% of this population, the annual model-projected hospital costs decreased by US$488,615. When varying the inputs in one-way sensitivity analyses, the results were most sensitive to changes in dosing assumptions. CONCLUSION: Results suggest that factors beyond drug acquisition cost can influence the budget impact to a hospital. When considering preparation and administration time, and real-world dosing, use of lanreotide has the potential to reduce health care expenditures associated with metastatic gastroenteropancreatic neuroendocrine tumor treatments.
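The incremental budget impact reported above is, at its core, the number of patients shifting therapy multiplied by the per-patient cost difference. This simplified sketch uses the abstract's inputs; it lands close to, but not exactly on, the published $488,615 figure, presumably because the full model also varies preparation and administration assumptions:

```python
def incremental_budget_impact(n_treated: int, share_shift: float,
                              cost_from: float, cost_to: float) -> float:
    """Annual cost change when a share of treated patients switches therapies."""
    return n_treated * share_shift * (cost_to - cost_from)

# 313 SSA-treated patients; lanreotide utilization rising by 25 points (5% -> 30%)
delta = incremental_budget_impact(313, 0.25, cost_from=89673, cost_to=83473)
print(delta)  # -485150.0 (a cost decrease)
```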

11.
J Med Econ ; 20(7): 767-775, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28562126

ABSTRACT

AIMS: Cost-effectiveness analysis (CEA) is a useful tool for estimating the value of an intervention in relation to alternatives. In cardiovascular disease (CVD), CEA is especially important, given the high economic and clinical burden. One key driver of value is CVD mortality prevention. However, data used to inform CEA parameters can be limited, given the difficulty in demonstrating a statistically significant mortality benefit in randomized clinical trials (RCTs), due in part to the low frequency of fatal events and limited trial durations. This systematic review identifies and summarizes whether published CVD-related CEAs have incorporated mortality benefits, and the methodology among those that did. MATERIALS AND METHODS: A systematic literature review was conducted of CEAs of lipid-lowering therapies published between 2000 and 2017. Health technology assessments (HTA) and full-length manuscripts were included, and sources of mortality data and methods of applying mortality benefits were extracted. Results were summarized as proportions of articles to articulate common practices in CEAs of CVD. RESULTS: This review identified 100 studies for inclusion, comprising 93 full-length manuscripts and seven HTA reviews. Among these, 99% assumed a mortality benefit in the model. However, 87 of the studies that incorporated mortality differences did so even though the trials used to inform model parameters did not demonstrate statistically significant differences in mortality. None of the 12 studies that used statistically significant findings from an individual RCT were based on active control studies. In a sub-group analysis considering the 60 CEAs that incorporated a direct mortality benefit, 48 (80%) did not have RCT evidence for a statistically significant benefit in CVD mortality. 
LIMITATIONS AND CONCLUSIONS: The finding that few CEA models included mortality inputs from individual RCTs of lipid-lowering therapy may be surprising, as one might expect that treatment efficacy should be based on robust clinical evidence. However, regulatory requirements in CVD-related RCTs often lead to insufficient sample sizes and observation periods for detecting a difference in CVD mortality, which results in the use of intermediate outcomes, composite end-points, or meta-analysis to extrapolate long-term mortality benefit in a lifetime CEA.


Subject(s)
Cardiovascular Diseases/mortality , Cost-Benefit Analysis/methods , Dyslipidemias/drug therapy , Hypolipidemic Agents/economics , Humans , Hypolipidemic Agents/therapeutic use , Quality-Adjusted Life Years
12.
J Manag Care Spec Pharm ; 23(6-a Suppl): S34-S48, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28535104

ABSTRACT

BACKGROUND: Several organizations have developed frameworks to systematically assess the value of new drugs. OBJECTIVE: To evaluate the convergent validity and interrater reliability of 4 value frameworks to understand the extent to which these tools can facilitate value-based treatment decisions in oncology. METHODS: Eight panelists used the American Society of Clinical Oncology (ASCO), European Society for Medical Oncology (ESMO), Institute for Clinical and Economic Review (ICER), and National Comprehensive Cancer Network (NCCN) frameworks to conduct value assessments of 15 drugs for advanced lung and breast cancers and castration-refractory prostate cancer. Panelists received instructions and published clinical data required to complete the assessments, assigning each drug a numeric or letter score. Kendall's Coefficient of Concordance for Ranks (Kendall's W) was used to measure convergent validity by cancer type among the 4 frameworks. Intraclass correlation coefficients (ICCs) were used to measure interrater reliability for each framework across cancers. Panelists were surveyed on their experiences. RESULTS: Kendall's W across all 4 frameworks for breast, lung, and prostate cancer drugs was 0.560 (P= 0.010), 0.562 (P = 0.010), and 0.920 (P < 0.001), respectively. Pairwise, Kendall's W for breast cancer drugs was highest for ESMO-ICER and ICER-NCCN (W = 0.950, P = 0.019 for both pairs) and lowest for ASCO-NCCN (W = 0.300, P = 0.748). For lung cancer drugs, W was highest pairwise for ESMO-ICER (W = 0.974, P = 0.007) and lowest for ASCO-NCCN (W = 0.218, P = 0.839); for prostate cancer drugs, pairwise W was highest for ICER-NCCN (W = 1.000, P < 0.001) and lowest for ESMO-ICER and ESMO-NCCN (W = 0.900, P = 0.052 for both pairs). When ranking drugs on distinct framework subdomains, Kendall's W among breast cancer drugs was highest for certainty (ICER, NCCN: W = 0.908, P = 0.046) and lowest for clinical benefit (ASCO, ESMO, NCCN: W = 0.345, P = 0.436). 
Among lung cancer drugs, W was highest for toxicity (ASCO, ESMO, NCCN: W = 0.944, P < 0.001) and lowest for certainty (ICER, NCCN: W = 0.230, P = 0.827); and among prostate cancer drugs, it was highest for quality of life (ASCO, ESMO: W = 0.986, P = 0.003) and lowest for toxicity (ASCO, ESMO, NCCN: W = 0.200, P = 0.711). ICC (95% CI) for ASCO, ESMO, ICER, and NCCN were 0.800 (0.660-0.913), 0.818 (0.686-0.921), 0.652 (0.466-0.834), and 0.153 (0.045-0.371), respectively. When scores were rescaled to 0-100, NCCN provided the narrowest band of scores. When asked about their experiences using the ASCO, ESMO, ICER, and NCCN frameworks, panelists generally agreed that the frameworks were logically organized and reasonably easy to use, with NCCN rated somewhat easier. CONCLUSIONS: Convergent validity among the ASCO, ESMO, ICER, and NCCN frameworks was fair to excellent, increasing with clinical benefit subdomain concordance and simplicity of drug trial data. Interrater reliability, highest for ASCO and ESMO, improved with clarity of instructions and specificity of score definitions. Continued use, analyses, and refinements of these frameworks will bring us closer to the ultimate goal of using value-based treatment decisions to improve patient care and outcomes. DISCLOSURES: This work was funded by Eisai Inc. Copher and Knoth are employees of Eisai Inc. Bentley, Lee, Zambrano, and Broder are employees of Partnership for Health Analytic Research, a health services research company paid by Eisai Inc. to conduct this research. For this study, Cohen, Huynh, and Neville report fees from Partnership for Health Analytic Research. Outside of this study, Cohen receives grants and direct consulting fees from various companies that manufacture and market pharmaceuticals. Mei reports a grant from Eisai Inc. during this study. The other authors have no disclosures to report. Study concept and design were contributed by Bentley and Broder, with assistance from Elkin and Cohen. 
Bentley took the lead in data collection, along with Elkin, Huynh, Mukherjea, Neville, Mei, Popescu, Lee, and Zambrano. Data interpretation was performed by Bentley and Broder, along with Elkin, Cohen, Copher, and Knoth. The manuscript was written primarily by Bentley, along with Elkin and Broder, and revised by Bentley, Broder, Elkin, Cohen, Copher, and Knoth. Select components of this work's methods were presented at ISPOR 19th Annual European Congress held in Vienna, Austria, October 29-November 2, 2016, and Society for Medical Decision Making 38th Annual North American Meeting held in Vancouver, Canada, October 23-26, 2016.
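Kendall's W, the concordance statistic this study reports, is computed from the rank sums each item receives across raters. A minimal sketch for complete rankings without ties (any tie correction the authors applied is omitted here), with hypothetical data:

```python
def kendalls_w(rankings: list[list[int]]) -> float:
    """Kendall's coefficient of concordance for m raters each ranking n items.

    Assumes complete rankings 1..n with no ties; W = 1 means perfect agreement,
    W = 0 means no agreement beyond chance.
    """
    m, n = len(rankings), len(rankings[0])
    rank_sums = [sum(r[i] for r in rankings) for i in range(n)]
    mean_sum = m * (n + 1) / 2                       # expected rank sum per item
    s = sum((t - mean_sum) ** 2 for t in rank_sums)  # squared deviations
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Three hypothetical frameworks ranking four drugs in the same order
print(kendalls_w([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))  # 1.0
# Two frameworks in exact disagreement over three drugs
print(kendalls_w([[1, 2, 3], [3, 2, 1]]))  # 0.0
```

Values between these extremes, such as the 0.56-0.92 range reported above, indicate partial concordance.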


Subject(s)
Antineoplastic Agents/therapeutic use , Decision Support Techniques , Neoplasms/drug therapy , Antineoplastic Agents/economics , Humans , Models, Economic , Neoplasms/economics , Reproducibility of Results , United States , Value-Based Purchasing
13.
Oncologist ; 22(4): 379-385, 2017 04.
Article in English | MEDLINE | ID: mdl-28283585

ABSTRACT

BACKGROUND: Although hypomethylating agents (HMAs) are effective and approved therapies for patients with myelodysplastic syndromes (MDS), many patients do not benefit from treatment, and nearly all ultimately stop responding to HMAs. The incidence and cost burden of HMA failure are unknown yet needed to appreciate the magnitude and significance of such failure. METHODS: We analyzed a de-identified dataset of over 5 million individuals with private health insurance in the U.S. to estimate MDS incidence, prevalence, and treatments. Based on MDS provider interviews, a conceptual model of MDS patient management was constructed to create a new, claims-relevant and drug development-relevant definition of HMA treatment failure. This algorithm was used to define resource encumbrance of MDS patients in whom HMA treatment failed. RESULTS: We estimated an MDS incidence rate of ∼70 cases per 100,000 enrollees per year and a prevalence of 155 cases per 100,000 enrollees. The proportion of MDS patients receiving HMA treatment was low (∼3%), and treatment was typically initiated within 1 year of the first MDS claim. Notably, HMA-treated individuals were older and had more comorbidities than the overall MDS cohort. Total health care costs of managing MDS patients after HMA failure were high (∼$77,000 during the first 6 months) and were driven primarily by non-pharmacy costs. CONCLUSION: This study quantifies for the first time the burden of significant unmet need in caring for MDS patients following HMA treatment failure. The Oncologist 2017;22:379-385. Implications for Practice: U.S.-based treatment patterns among MDS patients demonstrate the significant clinical, financial, and health care burden associated with HMA failure and call for active therapies for this patient population.


Subject(s)
Antimetabolites, Antineoplastic/economics , Insurance, Health/economics , Myelodysplastic Syndromes/drug therapy , Myelodysplastic Syndromes/economics , DNA Methylation/genetics , Female , Health Resources/economics , Hematopoietic Stem Cell Transplantation/economics , Humans , Male , Myelodysplastic Syndromes/pathology , Treatment Failure
14.
Value Health ; 20(2): 200-205, 2017 02.
Article in English | MEDLINE | ID: mdl-28237195

ABSTRACT

BACKGROUND: Several organizations have developed frameworks to systematically assess the value of new drugs. These organizations include the American Society of Clinical Oncology (ASCO), the European Society for Medical Oncology (ESMO), the Institute for Clinical and Economic Review (ICER), and the National Comprehensive Cancer Network (NCCN). OBJECTIVES: To understand the extent to which these four tools can facilitate value-based treatment decisions in oncology. METHODS: In this pilot study, eight panelists conducted value assessments of five advanced lung cancer drugs using the ASCO, ESMO, and ICER frameworks. The panelists received instructions and the published clinical data required to complete the assessments. Published NCCN framework scores were abstracted. Kendall's W was used to measure convergent validity among the four frameworks. Intraclass correlation coefficients (ICCs) were used to measure inter-rater reliability among the ASCO, ESMO, and ICER frameworks. Sensitivity analyses were conducted. RESULTS: Drugs were ranked similarly by the four frameworks, with a Kendall's W of 0.703 (P = 0.006) across all four frameworks. Pairwise, Kendall's W was highest for ESMO-ICER (W = 0.974; P = 0.007) and ASCO-NCCN (W = 0.944; P = 0.022) and lowest for ICER-NCCN (W = 0.647; P = 0.315) and ESMO-NCCN (W = 0.611; P = 0.360). ICCs for the ASCO, ESMO, and ICER frameworks were 0.786 (95% CI 0.517-0.970), 0.804 (95% CI 0.545-0.973), and 0.281 (95% CI 0.055-0.799), respectively. When scores were rescaled to 0 to 100, the ICER framework provided the narrowest band of scores. CONCLUSIONS: The ASCO, ESMO, ICER, and NCCN frameworks demonstrated convergent validity despite differences in the conceptual approaches used. The ASCO framework's inter-rater reliability was high, although potentially at the cost of user burden. The ICER framework's inter-rater reliability was poor, possibly because of its failure to distinguish differential value among the sample of drugs tested. Refinement of all frameworks should continue on the basis of further testing and stakeholder feedback.
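The convergent-validity statistic reported above, Kendall's W, can be computed directly from a matrix of raters' rankings. A minimal sketch in Python/NumPy, without the tie correction and with illustrative data rather than the study's actual ratings:

```python
import numpy as np

def kendalls_w(ranks: np.ndarray) -> float:
    """Kendall's coefficient of concordance (no tie correction).

    ranks: (m, n) array with one row per rater; each row ranks n items 1..n.
    Returns W in [0, 1]: 0 = no agreement, 1 = perfect agreement.
    """
    m, n = ranks.shape
    rank_sums = ranks.sum(axis=0)                      # column sums across raters
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()    # deviation from mean rank sum
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Perfect agreement: three raters give four items identical rankings -> W = 1.0
perfect = np.array([[1, 2, 3, 4]] * 3)
print(kendalls_w(perfect))  # 1.0

# Complete disagreement between two raters -> W = 0.0
opposed = np.array([[1, 2, 3], [3, 2, 1]])
print(kendalls_w(opposed))  # 0.0
```

With real framework scores, ties are common and a tie-corrected denominator would be needed; the simple form above suffices to show the structure of the statistic.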


Subject(s)
Antineoplastic Agents/standards , Decision Support Techniques , Value-Based Purchasing , Medical Oncology , Pilot Projects , Reproducibility of Results
15.
AIDS Care ; 29(8): 1067-1073, 2017 08.
Article in English | MEDLINE | ID: mdl-28147708

ABSTRACT

Efavirenz (EFV) is a non-nucleoside reverse transcriptase inhibitor indicated for treatment of HIV-1 infection. Despite concern over EFV tolerability in clinical trials and practice, particularly related to central nervous system (CNS) adverse events, some observational studies have shown high rates of EFV continuation at one year and low rates of CNS-related EFV substitution. The objective of this study was to further examine the real-world rate of CNS-related EFV discontinuation in antiretroviral therapy-naïve HIV-1 patients. This retrospective cohort study used a nationally representative electronic medical records database to identify HIV-1 patients ≥12 years old treated with a first-line EFV-based regimen (single or combination antiretroviral tablet) from 1 January 2009 to 30 June 2013. Patients without a prior record of EFV use during the 6-month baseline (i.e., antiretroviral therapy-naïve) were followed for 12 months after medication initiation. CNS-related EFV discontinuation was defined as evidence of a switch to a replacement antiretroviral coupled with a record of a CNS symptom within the prior 30 days, in the absence of laboratory evidence of virologic failure. We identified 1742 first-line EFV patients. Mean age was 48 years, 22.7% were female, and 8.1% had a prior report of CNS symptoms. The first-year overall discontinuation rate among new users of EFV was 16.2%. Ten percent of patients (n = 174) reported a CNS symptom, and 1.1% (n = 19) discontinued EFV due to CNS symptoms: insomnia (n = 12), headache (n = 5), impaired concentration (n = 1), and somnolence (n = 1). The frequency of CNS symptoms was similar for patients who discontinued EFV and those who did not (10.3% vs. 9.9%; P = .86). Our study found that EFV discontinuation due to CNS symptoms was low, consistent with prior reports.


Subject(s)
Benzoxazines/adverse effects , Central Nervous System Diseases/chemically induced , HIV Infections/drug therapy , Reverse Transcriptase Inhibitors/adverse effects , Adult , Alkynes , Benzoxazines/administration & dosage , Cyclopropanes , Drug-Related Side Effects and Adverse Reactions , Electronic Health Records , Female , Follow-Up Studies , HIV Infections/complications , HIV Infections/psychology , HIV-1/drug effects , Humans , Male , Middle Aged , Retrospective Studies , Reverse Transcriptase Inhibitors/administration & dosage , Treatment Outcome , Young Adult
16.
Article in English | MEDLINE | ID: mdl-26589773

ABSTRACT

PURPOSE: Treatment choice is challenging for patients with high-risk myelodysplastic syndromes (MDS) who have failed first-line hypomethylating agent (HMA) therapy; we evaluated optimal salvage therapy in this population. METHODS: Using published literature and expert opinion, we developed a Markov model to evaluate the cost-effectiveness of current treatments for patients who failed first-line HMA therapy. The model predicted costs, life years, quality-adjusted life years, and incremental cost-effectiveness ratios. Sensitivity analyses were conducted to assess the impact of uncertainty in model inputs. RESULTS: Supportive care was the least expensive option ($65,704/patient) but offered the shortest survival (0.48 years). Low-intensity chemotherapy, high-intensity chemotherapy, and hematopoietic cell transplantation increased survival and costs, with incremental cost-effectiveness ratios of $108,808, $306,103, and $318,163 per life year, respectively. Switching to another HMA was more costly and less effective than low-intensity chemotherapy. CONCLUSIONS: Subsequent treatments for MDS patients who failed first-line HMA substantially increase costs and treatment-related morbidity while providing only marginal clinical benefit. Additional treatment options would benefit resource allocation, clinical decision-making, and patient outcomes.


Subject(s)
Antimetabolites, Antineoplastic/therapeutic use , Myelodysplastic Syndromes/drug therapy , Salvage Therapy/methods , Antimetabolites, Antineoplastic/economics , Clinical Decision-Making , Cost-Benefit Analysis , Humans , Markov Chains , Myelodysplastic Syndromes/economics , Quality-Adjusted Life Years , Resource Allocation , Salvage Therapy/economics , Survival , Uncertainty
17.
Expert Rev Pharmacoecon Outcomes Res ; 15(2): 357-64, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25363000

ABSTRACT

OBJECTIVES: We examined the cost-effectiveness of treating poorly controlled, severe, persistent asthma patients with bronchial thermoplasty (BT), a novel technology that uses thermal energy to reduce airway smooth muscle mass, with 5-year outcome data demonstrating a durable reduction in asthma exacerbations. STUDY DESIGN: We conducted a model-based cost-effectiveness analysis assessing 5-year healthcare utilization, patient quality of life, and adverse events. METHODS: We utilized Markov modeling to estimate the costs and quality-of-life impact of BT compared with high-dose combination therapy among poorly controlled, severe, persistent asthma patients: those requiring high-dose combination therapy who had experienced an asthma exacerbation-related ER visit in the past year. RESULTS: The cost-effectiveness of BT was US$5495 per quality-adjusted life year, and approximately 22% of sensitivity analysis iterations estimated BT to both reduce costs and increase quality of life. CONCLUSIONS: BT is a cost-effective treatment option for patients with poorly controlled, severe, persistent asthma.
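Several records in this listing rely on Markov cohort modeling like the analysis above. The technique can be sketched in a few lines of Python: a cohort distribution vector is advanced one cycle at a time through a transition matrix, accumulating discounted costs and QALYs. All transition probabilities, costs, and utilities below are hypothetical placeholders, not the published model's inputs:

```python
import numpy as np

# Illustrative 3-state cohort model (states: controlled, exacerbation, dead).
# Rows are "from" states; each row sums to 1.
P = np.array([
    [0.90, 0.08, 0.02],   # from controlled
    [0.50, 0.45, 0.05],   # from exacerbation
    [0.00, 0.00, 1.00],   # dead is absorbing
])
cycle_cost = np.array([1000.0, 4000.0, 0.0])   # hypothetical per-cycle cost by state
cycle_qaly = np.array([0.85, 0.50, 0.0])       # hypothetical per-cycle utility by state

def run_cohort(P, costs, qalys, cycles=5, discount=0.03):
    """Accumulate discounted costs and QALYs over a fixed number of cycles."""
    state = np.array([1.0, 0.0, 0.0])          # entire cohort starts controlled
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        df = 1.0 / (1.0 + discount) ** t       # discount factor for cycle t
        total_cost += df * state @ costs
        total_qaly += df * state @ qalys
        state = state @ P                      # advance the cohort one cycle
    return total_cost, total_qaly

cost, qaly = run_cohort(P, cycle_cost, cycle_qaly)
```

A real analysis layers on half-cycle correction, treatment-specific transition matrices, and probabilistic sensitivity analysis over the input distributions; the core bookkeeping is as above.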


Subject(s)
Asthma/therapy , Bronchoscopy/methods , Catheter Ablation/methods , Quality of Life , Asthma/economics , Asthma/physiopathology , Bronchoscopy/economics , Catheter Ablation/economics , Cost-Benefit Analysis , Humans , Markov Chains , Models, Economic , Quality-Adjusted Life Years , Severity of Illness Index
18.
J Med Econ ; 17(8): 527-37, 2014 Aug.
Article in English | MEDLINE | ID: mdl-24689556

ABSTRACT

OBJECTIVES: To estimate the clinical and economic trade-offs involved in using a molecular assay (92-gene assay, CancerTYPE ID) to aid in identifying the primary site of difficult-to-diagnose metastatic cancers and to explore whether the 92-gene assay can be used to standardize the diagnostic process and costs for clinicians, patients, and payers. METHODS: Four decision-analytic models were developed to project the lifetime clinical and economic impact of incorporating the 92-gene assay compared with standard care alone. For each model, total and incremental costs, life-years, quality-adjusted life-years (QALYs), incremental cost-effectiveness ratios (ICERs), and the proportion of patients treated correctly versus incorrectly were projected from the payer perspective. Model inputs were based on published literature, analyses of Surveillance, Epidemiology, and End Results (SEER) data, publicly available data, and interviews with clinical experts. RESULTS: In all four models, the 92-gene assay increased the proportion of patients treated correctly, decreased the proportion of patients treated with empiric therapy, and increased quality-adjusted survival. In the primary model, the ICER was $50,273/QALY; the 92-gene assay is therefore cost-effective at a societal willingness-to-pay threshold of $100,000/QALY. These findings were robust across sensitivity analyses. CONCLUSIONS: Use of the 92-gene assay for diagnosing metastatic tumors of uncertain origin is associated with reduced misdiagnoses, increased survival, and improved quality of life. Incorporating the assay into current practice is a cost-effective approach to standardizing diagnostic methods while improving patient care. Limitations of this analysis are the lack of data availability and the resulting modeling simplifications, although sensitivity analyses showed these were not key drivers of results.


Subject(s)
Genes, Neoplasm , Genetic Testing/economics , Neoplasm Metastasis/diagnosis , Neoplasm Metastasis/genetics , Cost-Benefit Analysis , DNA, Neoplasm/analysis , Databases, Genetic , Diagnostic Errors/prevention & control , Humans , Qualitative Research
19.
J Med Econ ; 17(8): 567-76, 2014 Aug.
Article in English | MEDLINE | ID: mdl-24758296

ABSTRACT

OBJECTIVE: To develop a decision-analytic model to estimate the cost-effectiveness of initiating maintenance treatment with aripiprazole once-monthly (AOM) vs paliperidone long-acting injectable (PLAI) once-monthly among patients with schizophrenia in the US. METHODS: A decision-analytic model was developed to evaluate a hypothetical cohort of patients initiating maintenance treatment with AOM or PLAI. Rates of relapse, adverse events (AEs), and direct medical costs were estimated for 1 year. Patients either remained on initial treatment or discontinued treatment due to lack of efficacy, AEs, or other reasons, including non-adherence. Data from placebo-controlled pivotal trials and product prescribing information (PI) were used to estimate treatment efficacy and AEs. Because dosing varies across practice settings, analyses were performed under clinical-trial, real-world, PI, and highest-available-therapeutic-dose assumptions. The main outcome of interest was incremental cost per schizophrenia hospitalization averted with AOM vs PLAI. RESULTS: Based on the placebo-controlled pivotal trials' dosing, AOM improved clinical outcomes by reducing schizophrenia relapses vs PLAI (0.181 vs 0.277 per person per year [pppy]) at an additional cost of US$1276 pppy, resulting in an incremental cost-effectiveness ratio (ICER) of US$13,280/relapse averted. When PI dosing was assumed, this ICER increased to US$19,968/relapse averted. When real-world dosing and the highest available dosing were assumed, AOM was associated with fewer relapses and lower overall treatment costs vs PLAI. CONCLUSIONS: AOM consistently provided favorable clinical benefits. Under the various dosing scenarios, AOM yielded either fewer relapses at lower overall costs or a reasonable incremental cost per relapse averted (i.e., less than the cost of a relapse-related hospitalization) vs PLAI. Given the heterogeneous nature of schizophrenia and variability in treatment response, health plans may consider open access for treatments like AOM. Since model inputs were based on data from separate placebo-controlled trials, generalization of results to the real-world setting is limited.
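The base-case ICER above can be reproduced from the abstract's own figures: the incremental cost of US$1276 pppy divided by the relapses averted (0.277 - 0.181 = 0.096 pppy) gives roughly US$13,292 per relapse averted, matching the reported US$13,280 up to rounding of the published inputs. A minimal check in Python:

```python
def icer(delta_cost: float, delta_effect: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per unit of extra effect."""
    return delta_cost / delta_effect

# Figures from the abstract's pivotal-trial base case:
relapses_averted = 0.277 - 0.181          # PLAI minus AOM relapses, per person-year
ratio = icer(1276.0, relapses_averted)
print(round(ratio))                       # ~13292; abstract reports $13,280 (rounded inputs)
```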


Subject(s)
Antipsychotic Agents/administration & dosage , Antipsychotic Agents/economics , Cost-Benefit Analysis , Isoxazoles/economics , Palmitates/economics , Piperazines/economics , Quinolones/economics , Schizophrenia/drug therapy , Antipsychotic Agents/adverse effects , Aripiprazole , Decision Support Techniques , Drug Administration Schedule , Humans , Injections, Intramuscular , Isoxazoles/administration & dosage , Paliperidone Palmitate , Palmitates/administration & dosage , Piperazines/administration & dosage , Quinolones/administration & dosage , Schizophrenia/economics , United States
20.
Clinicoecon Outcomes Res ; 5: 437-45, 2013.
Article in English | MEDLINE | ID: mdl-24039438

ABSTRACT

BACKGROUND: February 2013 US treatment guidelines recommend the once-daily tablet of efavirenz/emtricitabine/tenofovir (Atripla®) as a preferred regimen and the once-daily tablet of elvitegravir/cobicistat/emtricitabine/tenofovir (Stribild™) as an alternative regimen for first-line treatment of human immunodeficiency virus (HIV). This study assessed the clinical and economic trade-offs involved in using Atripla compared with Stribild as first-line antiretroviral therapy in HIV-infected US adults. METHODS: A Markov cohort model was developed to project lifetime health-related outcomes, costs, quality-adjusted life years (QALYs), and cost-effectiveness of Stribild compared with Atripla as first-line antiretroviral therapy in HIV-1-infected US patients. Patients progressed in 12-week cycles through second-line, third-line, and nonsuppressive therapies, acquired immune deficiency syndrome, and death. Baseline characteristics and first-line virologic suppression, change in CD4 count, and adverse effects (lipid, central nervous system, rash, renal) were based on 48-week clinical trial results. These results demonstrated equivalent virologic suppression between the two regimens. Point estimates for virologic suppression (favoring Stribild) were used in the base case, and equivalency was used in the scenario analysis. Published sources and expert opinion were used to estimate costs, utilities, risk of acquired immune deficiency syndrome, mortality, subsequent-line CD4 count, clinical efficacy, and adverse events. Costs were reported in 2012 US dollars. Sensitivity analyses were conducted to assess robustness of results. RESULTS: Compared with patients initiating Atripla, patients initiating Stribild were estimated to have higher lifetime costs. Stribild added 0.041 QALYs over a lifetime at an additional cost of $6,886, producing an incremental cost-effectiveness ratio of $166,287/QALY gained. 
Results were most sensitive to first-line response rates, product costs, and likelihood of renal adverse events. When equivalent efficacy was assumed, Atripla dominated Stribild with lower costs and greater QALYs. CONCLUSION: At a societal willingness to pay of $100,000/QALY, Stribild was not cost-effective in the base case compared with Atripla for first-line HIV treatment.
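The base-case conclusion can be checked against the abstract's rounded figures: $6,886 / 0.041 QALYs is approximately $167,951/QALY (the reported $166,287 reflects unrounded model inputs), and the net monetary benefit at the stated $100,000/QALY threshold is negative, which is exactly the "not cost-effective" verdict. A small sketch in Python:

```python
def net_monetary_benefit(delta_qaly: float, delta_cost: float, wtp: float) -> float:
    """NMB of the costlier option; positive means cost-effective at the threshold."""
    return delta_qaly * wtp - delta_cost

# Rounded figures from the abstract: Stribild adds 0.041 QALYs for $6,886 vs Atripla.
icer_estimate = 6886 / 0.041                            # ~$167,951/QALY from rounded inputs
nmb = net_monetary_benefit(0.041, 6886, wtp=100_000)    # negative: not cost-effective
```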
