Results 1 - 18 of 18
1.
J Pain Res ; 14: 2347-2357, 2021.
Article in English | MEDLINE | ID: mdl-34377015

ABSTRACT

BACKGROUND AND AIMS: Chronic pain affects more adults in the United States than any other condition. Opioid medications are widely used in the treatment of chronic pain, but there remains considerable risk and cost associated with their use. This study aims to characterize the effects of opioid prescribing for chronic pain and similar pain conditions on lost productivity in the United States. METHODS: This was a retrospective, longitudinal, observational study of chronic pain patients in 2011-2014. We identified patients with a diagnosis of musculoskeletal pain receiving an index prescription for opioids in administrative claims and studied disability absence in a linked health and productivity management database. Patients were grouped as de novo or continued-use opioid users based on opioid use before index, and by opioid dose in the year after index. Days of disability were compared before and after index with bootstrapping. The effect of opioid dose group on disability was evaluated with negative binomial regression. Lost productivity cost was compared before and after index. RESULTS: The cohort contained 16,273 de novo and 6604 continued-use patients. On average, de novo patients used 24.8 days of disability after index, an increase of 18.3 days compared to before (p < 0.001). Continued-use patients used 30.7 days after index, 9 more days than before (p < 0.001). There was a dose-response relationship between dose group and days of disability in de novo patients (p < 0.001). The weighted-average cost per person of lost productivity was $4344 higher in the year after index compared to the year before. CONCLUSION: Opioid prescriptions for pain patients were associated with significant disability use and lost productivity costs. With the evolution of opioid-prescribing practices, CDC recommendations, and the HHS Pain Management Best Practices, there is opportunity to use alternative pain therapies without the risks of opioid-induced side effects to improve work productivity.
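The before/after comparison of disability days in this abstract rests on a percentile bootstrap. A minimal sketch of that technique, using simulated patient counts rather than the study's claims data (the ~18-day mean increase is built into the simulation):

```python
import numpy as np

def bootstrap_mean_diff(before, after, n_boot=10_000, seed=0):
    """Percentile-bootstrap CI for the mean change in disability days,
    resampling patients with replacement."""
    rng = np.random.default_rng(seed)
    diffs = np.asarray(after, dtype=float) - np.asarray(before, dtype=float)
    n = len(diffs)
    # Each bootstrap replicate re-draws n patients and recomputes the mean.
    idx = rng.integers(0, n, size=(n_boot, n))
    boot_means = diffs[idx].mean(axis=1)
    lo, hi = np.percentile(boot_means, [2.5, 97.5])
    return diffs.mean(), (lo, hi)

# Toy cohort: disability days before and after an index prescription,
# with a built-in mean increase of ~18 days.
rng = np.random.default_rng(42)
before = rng.poisson(6.5, size=500)
after = before + rng.poisson(18.0, size=500)
mean_diff, (lo, hi) = bootstrap_mean_diff(before, after)
```

The percentile interval excluding zero corresponds to the paper's p < 0.001 before/after comparison.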

2.
J Heart Lung Transplant ; 40(5): 323-333, 2021 05.
Article in English | MEDLINE | ID: mdl-33744086

ABSTRACT

BACKGROUND: Several distinctly engineered left ventricular assist devices (LVADs) are in clinical use. However, contemporaneous real world comparisons have not been conducted, and clinical trials were not powered to evaluate differential survival outcomes across devices. OBJECTIVES: Determine real world survival outcomes and healthcare expenditures for commercially available durable LVADs. METHODS: Using a retrospective observational cohort design, Medicare claims files were linked to manufacturer device registration data to identify de-novo, durable LVAD implants performed between January 2014 and December 2018, with follow-up through December 2019. Survival outcomes were compared using a Cox proportional hazards model stratified by LVAD type and validated using propensity score matching. Healthcare resource utilization was analyzed across device types by using nonparametric bootstrap analysis methodology. Primary outcomes were survival at 1 year and total Part A Medicare payments. RESULTS: A total of 4,195 de-novo LVAD implants were identified in fee-for-service Medicare beneficiaries (821 HeartMate 3; 1,840 HeartMate II; and 1,534 Other-VADs). The adjusted hazard ratio for mortality at 1 year (confirmed in a propensity score matched analysis) for the HeartMate 3 vs HeartMate II was 0.64 (95% CI; 0.52-0.79, p < 0.001) and for the HeartMate 3 vs Other-VADs was 0.51 (95% CI; 0.42-0.63, p < 0.001). The HeartMate 3 cohort experienced fewer hospitalizations per patient-year vs Other-VADs (respectively, 2.8 vs 3.2 EPPY hospitalizations, p < 0.01) and 6.1 fewer hospital days on average (respectively, 25.2 vs 31.3 days, p < 0.01). The difference in Medicare expenditures, conditional on survival, for HeartMate 3 vs HeartMate II was -$10,722, p < 0.001 (17.4% reduction) and for HeartMate 3 vs Other-VADs was -$17,947, p < 0.001 (26.1% reduction). CONCLUSIONS: In this analysis of a large, real world, United States administrative dataset of durable LVADs, we observed that the HeartMate 3 had superior survival, reduced healthcare resource use, and lower healthcare expenditure compared to other contemporary commercially available LVADs.
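The validation step in this abstract is propensity score matching. A minimal, self-contained sketch of the idea with simulated covariates (a plain Newton logistic fit and greedy 1:1 matching; the study's actual model and matching algorithm are not specified at this level of detail):

```python
import numpy as np

def logistic_ps(X, treated, n_iter=25):
    """Propensity scores from a plain Newton (IRLS) logistic regression
    of treatment on covariates -- a stand-in for the study's model."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X1 @ beta))
        W = p * (1.0 - p)
        # Newton step: beta += (X'WX)^-1 X'(y - p); small ridge for stability.
        H = X1.T @ (X1 * W[:, None]) + 1e-8 * np.eye(X1.shape[1])
        beta += np.linalg.solve(H, X1.T @ (treated - p))
    return 1.0 / (1.0 + np.exp(-X1 @ beta))

def greedy_match(ps, treated, caliper=0.05):
    """1:1 nearest-neighbor match on propensity score, without replacement,
    discarding pairs farther apart than the caliper."""
    controls = list(np.where(treated == 0)[0])
    pairs = []
    for i in np.where(treated == 1)[0]:
        if not controls:
            break
        d = np.abs(ps[np.array(controls)] - ps[i])
        j = int(np.argmin(d))
        if d[j] <= caliper:
            pairs.append((i, controls.pop(j)))
    return pairs

# Simulated covariates (stand-ins for age, comorbidities) and device assignment.
rng = np.random.default_rng(7)
X = rng.normal(size=(400, 3))
logit = 0.8 * X[:, 0] - 0.5 * X[:, 1]
treated = (rng.random(400) < 1.0 / (1.0 + np.exp(-logit))).astype(float)
ps = logistic_ps(X, treated)
pairs = greedy_match(ps, treated)
```

Outcomes are then compared within the matched pairs, which is how the Cox-model hazard ratios above were confirmed.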


Subject(s)
Health Expenditures/statistics & numerical data , Heart Failure/therapy , Heart Transplantation , Heart Ventricles/physiopathology , Heart-Assist Devices , Propensity Score , Aged , Female , Follow-Up Studies , Heart Failure/mortality , Heart Failure/physiopathology , Humans , Male , Middle Aged , Retrospective Studies , Survival Rate/trends , Treatment Outcome , United States/epidemiology , Waiting Lists/mortality
3.
Clin Cardiol ; 43(12): 1501-1510, 2020 Dec.
Article in English | MEDLINE | ID: mdl-32949178

ABSTRACT

BACKGROUND: In heart failure (HF) patients, both natriuretic peptides (NP) and previous HF hospitalization (pHFH) have been used to predict prognosis. HYPOTHESIS: In a large real-world population, both NP levels and pHFH have independent and interdependent predictive value for clinical outcomes of HFH and all-cause mortality. METHODS: Linked electronic health records and insurance claims data from Decision Resource Group were used to identify HF patients who had a BNP or NT-proBNP result between January 2012 and December 2016. NT-proBNP was converted into BNP equivalents by dividing by 4. The index event was defined as the most recent NP result on or after 1 January 2012. Patients with incomplete records or age < 18 years were excluded. During one-year follow-up, HFH and mortality rates stratified by index BNP levels and pHFH are reported. RESULTS: Of 64 355 patients (74 ± 12 years old, 49% female) with available values, median BNP was 259 [IQR 101-642] pg/ml. The risk of both HFH and mortality was higher with increasing BNP levels. At each level of BNP, mortality was only slightly higher in patients with pHFH vs those without pHFH (RR 1.2 [95%CI 1.2,1.3], P < .001); however, at each BNP, HFH was markedly increased in patients with pHFH vs those without pHFH (RR 2.0 [95%CI 1.9,2.1], P < .001). CONCLUSION: In this large real-world heart failure population, higher BNP levels were associated with increased risk for both HFH and mortality. At any given level of BNP, pHFH added greater prognostic value for prediction of future HFH than for mortality.
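Two small computations carry this abstract: the divide-by-4 harmonization of NT-proBNP into BNP equivalents, and the risk ratios comparing patients with vs without a prior HF hospitalization. A minimal sketch (the example counts are illustrative, not the study's):

```python
def bnp_equivalent(value_pg_ml, assay):
    """Harmonize natriuretic peptide results as the study did:
    NT-proBNP is divided by 4 to give BNP-equivalent pg/ml."""
    if assay == "BNP":
        return float(value_pg_ml)
    if assay == "NT-proBNP":
        return value_pg_ml / 4.0
    raise ValueError(f"unknown assay: {assay!r}")

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Risk ratio, e.g. one-year HFH risk with vs without a prior
    HF hospitalization at a given BNP level."""
    return (events_exposed / n_exposed) / (events_unexposed / n_unexposed)
```

For example, `bnp_equivalent(1036, "NT-proBNP")` gives 259.0 pg/ml, the cohort's median BNP-equivalent value.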


Subject(s)
Heart Failure/blood , Hospitalization/statistics & numerical data , Natriuretic Peptide, Brain/blood , Peptide Fragments/blood , Aged , Aged, 80 and over , Biomarkers/blood , Female , Follow-Up Studies , Heart Failure/mortality , Humans , Male , Middle Aged , Prognosis , Protein Precursors , Retrospective Studies , Risk Factors , Survival Rate/trends , United States/epidemiology
4.
J Cardiovasc Electrophysiol ; 30(11): 2302-2309, 2019 11.
Article in English | MEDLINE | ID: mdl-31549456

ABSTRACT

AIMS: The TactiCath Contact Force Ablation Catheter Study for Atrial Fibrillation (TOCCASTAR) clinical trial compared clinical outcomes using a contact force (CF) sensing ablation catheter (TactiCath) with a catheter that lacked CF measurement. This analysis links recorded events in the TOCCASTAR study and a large claims database, IBM MarketScan®, to determine the economic impact of using CF sensing during atrial fibrillation (AF) ablation. METHODS AND RESULTS: Clinical events including repeat ablation, use of antiarrhythmic drugs, hospitalization, perforation, pericarditis, pneumothorax, pulmonary edema, pulmonary vein stenosis, tamponade, and vascular access complications were adjudicated in the year after ablation. CF was characterized as optimal if at least 90% of lesions were performed with at least 10 g of CF. A probabilistic 1:1 linkage was created for subjects in MarketScan® with the same events in the year after ablation, and the cost was evaluated over 10 000 iterations. Of the 279 subjects in TOCCASTAR, 145 were ablated using CF (57% with optimal CF), and 134 were ablated without CF. In the MarketScan® cohort, 9811 subjects who underwent AF ablation were used to determine events and costs. For subjects ablated with optimal CF, total cost was $19 271 ± 3705 in the year after ablation. For ablation lacking CF measurement, cost was $22 673 ± 3079 (difference of $3402, P < .001). In 73% of simulations, optimal CF was associated with lower cost in the year after ablation. CONCLUSION: Compared to ablation without CF, there was a decrease in healthcare cost of $3402 per subject in the first year after the procedure when optimal CF was used.
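The "optimal CF" definition above is a simple per-case classification rule. A minimal sketch of it (per-lesion force values are illustrative inputs; the trial's lesion-level data are not reproduced here):

```python
def optimal_cf(lesion_forces_g, min_force_g=10.0, min_fraction=0.90):
    """TOCCASTAR-style classification: a case counts as 'optimal CF' when
    at least 90% of lesions were delivered with at least 10 g of force."""
    if not lesion_forces_g:
        return False
    n_ok = sum(f >= min_force_g for f in lesion_forces_g)
    return n_ok / len(lesion_forces_g) >= min_fraction
```

So a case with 9 of 10 lesions at >=10 g qualifies, while 8 of 10 does not.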


Subject(s)
Atrial Fibrillation/economics , Atrial Fibrillation/surgery , Cardiac Catheterization/economics , Cardiac Catheters/economics , Catheter Ablation/economics , Health Care Costs , Transducers, Pressure/economics , Aged , Atrial Fibrillation/diagnosis , Atrial Fibrillation/physiopathology , Cardiac Catheterization/adverse effects , Cardiac Catheterization/instrumentation , Catheter Ablation/adverse effects , Catheter Ablation/instrumentation , Cost Savings , Cost-Benefit Analysis , Databases, Factual , Female , Humans , Male , Middle Aged , Postoperative Complications/economics , Postoperative Complications/therapy , Randomized Controlled Trials as Topic , Retrospective Studies , Risk Factors , Time Factors , Treatment Outcome , United States
5.
JAMA Cardiol ; 4(6): 556-563, 2019 06 01.
Article in English | MEDLINE | ID: mdl-31090869

ABSTRACT

Importance: In a randomized clinical trial, heart failure (HF) hospitalizations were lower in patients managed with guidance from an implantable pulmonary artery pressure sensor compared with usual care. It remains unclear if ambulatory monitoring could also improve long-term clinical outcomes in real-world practice. Objective: To determine the association between ambulatory hemodynamic monitoring and rates of HF hospitalization at 12 months in clinical practice. Design, Setting, and Participants: This matched cohort study of Medicare beneficiaries used claims data collected between June 1, 2014, and March 31, 2016. Medicare patients who received implants of a pulmonary artery pressure sensor were identified from the 100% Medicare claims database. Each patient who received an implant was matched to a control patient by demographic features, history of HF hospitalization, and number of all-cause hospitalizations. Propensity scoring based on comorbidities (arrhythmia, hypertension, diabetes, pulmonary disease, and renal disease) was used for additional matching. Data analysis was completed from July 2017 through January 2019. Exposures: Implantable pulmonary artery pressure monitoring system. Main Outcomes and Measures: The rates of HF hospitalization were compared using the Andersen-Gill method. Days lost owing to events were compared using a nonparametric bootstrap method. Results: The study cohort consisted of 1087 patients who received an implantable pulmonary artery pressure sensor and 1087 matched control patients. The treatment and control cohorts were well matched by age (mean [SD], 72.7 [10.2] years vs 72.9 [10.1] years) and sex (381 of 1087 female patients [35.1%] in each group), medical history, comorbidities, and timing of preimplant HF hospitalization. At 12 months postimplant, 616 HF hospitalizations occurred in the treatment cohort compared with 784 HF hospitalizations in the control cohort.
The rate of HF hospitalization was lower in the treatment cohort at 12 months postimplant (hazard ratio [HR], 0.76 [95% CI, 0.65-0.89]; P < .001). The percentage of days lost to HF hospitalizations or death were lower in the treatment group (HR, 0.73 [95% CI, 0.64-0.84]; P < .001) and the percentage of days lost owing to all-cause hospitalization or death were also lower (HR, 0.77 [95% CI, 0.68-0.88]; P < .001). Conclusions and Relevance: Patients with HF who were implanted with a pulmonary artery pressure sensor had lower rates of HF hospitalization than matched controls and spent more time alive out of hospital. Ambulatory hemodynamic monitoring may improve outcomes in patients with chronic HF.


Subject(s)
Blood Pressure , Heart Failure/therapy , Hospitalization/statistics & numerical data , Monitoring, Ambulatory/methods , Prostheses and Implants , Pulmonary Artery , Aged , Aged, 80 and over , Cohort Studies , Disease Management , Female , Heart Failure/diagnosis , Heart Failure/physiopathology , Hemodynamics , Humans , Kaplan-Meier Estimate , Male , Medicare , Patient Care Planning , Propensity Score , Proportional Hazards Models , United States
6.
Heart Rhythm ; 15(3): 355-362, 2018 03.
Article in English | MEDLINE | ID: mdl-29030235

ABSTRACT

BACKGROUND: Catheter ablation of ventricular tachycardia (VT) has been shown to reduce the number of recurrent shocks in patients with an implantable cardioverter-defibrillator (ICD). However, how VT ablation affects postprocedural medical and pharmaceutical usage remains unclear. OBJECTIVE: The purpose of this study was to investigate changes in health care resource utilization (HCRU) after VT ablation. METHODS: This large-scale, real-world, retrospective study used the MarketScan databases to identify patients in the United States with an ICD or cardiac resynchronization therapy-defibrillator (CRT-D) undergoing VT ablation. We calculated cumulative medical and pharmaceutical expenditures, office visits, hospitalizations, and emergency room (ER) visits in the 1-year periods before and after ablation. RESULTS: A total of 523 patients met the study inclusion criteria. After VT ablation, median annual cardiac rhythm-related medical expenditures decreased by $5,408. Moreover, the percentage of patients with at least 1 cardiac rhythm-related hospitalization and ER visit decreased from 53% and 41% before ablation to 28% and 26% after ablation, respectively. Similar changes were observed in the number of all-cause hospitalizations and ER visits, but there were no significant changes in all-cause medical expenditures. During the year before VT ablation, there was an increasing rate of health care resource utilization, followed by drastic slowing after ablation. CONCLUSION: This retrospective study demonstrated that catheter ablation seems to reduce hospitalization and overall health care utilization in VT patients with an ICD or CRT-D in place.


Subject(s)
Catheter Ablation , Health Expenditures/trends , Hospitalization/trends , Patient Acceptance of Health Care/statistics & numerical data , Tachycardia, Ventricular/surgery , Aged , Female , Humans , Male , Middle Aged , Retrospective Studies , Tachycardia, Ventricular/economics , United States
7.
Pain Med ; 19(4): 699-707, 2018 04 01.
Article in English | MEDLINE | ID: mdl-29244102

ABSTRACT

Study Design: Observational study using insurance claims. Objective: To quantify opioid usage leading up to spinal cord stimulation (SCS) and the potential impact on outcomes of SCS. Setting: SCS is an interventional therapy that often follows opioid usage in the care continuum for chronic pain. Methods: This study identified SCS patients using the Truven Health MarketScan databases from January 2010 to December 2014. The index event was the first occurrence of a permanent SCS implant. Indicators of opioid usage at implant were daily morphine equivalent dose (MED), number of unique pain drug classes, and diagnosis code for opioid abuse. System explant was used as a measure of ineffective SCS therapy. Multivariate logistic regression was used to analyze the effect of pre-implant medications on explants. Results: A total of 5,476 patients (56 ± 14 years; 60% female) were included. SCS system removal occurred in 390 patients (7.1%) in the year after implant. Number of drug classes (odds ratio [OR] = 1.11, P = 0.007) and MED level (5-90 vs < 5 mg/d: OR = 1.32, P = 0.043; ≥90 vs < 5 mg/d: OR = 1.57, P = 0.005) were independently predictive of system explant. Over the year before implant, MED increased in 54% (stayed the same in 21%, decreased in 25%) of patients who continued with SCS and increased in 53% (stayed the same in 20%, decreased in 27%) of explant patients (P = 0.772). Over the year after implant, significantly more patients with continued SCS had an MED decrease (47%) or stayed the same (23%) than before (P < 0.001). Conclusions: Chronic pain patients receive escalating opioid dosage prior to SCS implant, and high-dose opioid usage is associated with an increased risk of explant. Neuromodulation can stabilize or decrease opioid usage. Earlier consideration of SCS before escalated opioid usage has the potential to improve outcomes in complex chronic pain.
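The dose and drug-class predictors in this abstract map onto two small helpers: the MED bucketing used in the regression, and a multiplicative combination of the reported odds ratios. A hedged sketch (boundary handling at exactly 90 mg/d, and the baseline odds, are assumptions for illustration):

```python
def med_group(daily_med_mg):
    """Bucket daily morphine-equivalent dose as in the study's regression
    (<5, 5-90, >=90 mg/d); handling of exactly 90 mg/d is an assumption."""
    if daily_med_mg < 5:
        return "<5 mg/d"
    if daily_med_mg < 90:
        return "5-90 mg/d"
    return ">=90 mg/d"

def explant_odds(baseline_odds, group, extra_drug_classes):
    """Illustrative multiplication of the reported odds ratios: 1.11 per
    additional pain-drug class, 1.32 or 1.57 for the dose groups
    (point estimates only; CIs ignored)."""
    group_or = {"<5 mg/d": 1.0, "5-90 mg/d": 1.32, ">=90 mg/d": 1.57}
    return baseline_odds * group_or[group] * 1.11 ** extra_drug_classes
```

On this reading, a patient at >=90 mg/d enters SCS with roughly 1.57 times the explant odds of a patient under 5 mg/d, all else equal.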


Subject(s)
Analgesics, Opioid/therapeutic use , Chronic Pain/therapy , Spinal Cord Stimulation , Treatment Outcome , Adult , Aged , Female , Humans , Male , Middle Aged , Retrospective Studies
8.
J Am Heart Assoc ; 6(5)2017 May 10.
Article in English | MEDLINE | ID: mdl-28490521

ABSTRACT

BACKGROUND: Whether outcomes differ between sexes following treatment with pacemakers (PM), implantable cardioverter defibrillators, and cardiac resynchronization therapy (CRT) devices is unclear. METHODS AND RESULTS: Consecutive US patients with newly implanted PM, implantable cardioverter defibrillators, and CRT devices from a large remote monitoring database between 2008 and 2011 were included in this observational cohort study. Sex-specific all-cause survival postimplant was compared within each device type using a multivariable Cox proportional hazards model, stratified on age and adjusted for remote monitoring utilization and ZIP-based socioeconomic variables. A total of 269 471 patients were assessed over a median 2.9 [interquartile range, 2.2, 3.6] years. Unadjusted mortality rates (MR; deaths/100 000 patient-years) were similar between women versus men receiving PMs (n=115 076, 55% male; MR 4193 versus MR 4256, respectively; adjusted hazard ratio, 0.87; 95% CI, 0.84-0.90; P<0.001) and implantable cardioverter defibrillators (n=85 014, 74% male; MR 4417 versus MR 4479, respectively; adjusted hazard ratio, 0.98; 95% CI, 0.93-1.02; P=0.244). In contrast, survival was superior in women receiving CRT defibrillators (n=61 475, 72% male; MR 5270 versus male MR 7175; adjusted hazard ratio, 0.73; 95% CI, 0.70-0.76; P<0.001) and also CRT pacemakers (n=7906, 57% male; MR 5383 versus male MR 7625, adjusted hazard ratio, 0.69; 95% CI, 0.61-0.78; P<0.001). This relative difference increased with time. These results were unaffected by age or remote monitoring utilization. CONCLUSIONS: Women accounted for less than 30% of high-voltage implants and fewer than half of low-voltage implants in a large, nation-wide cohort. Survival for women and men receiving implantable cardioverter defibrillators and PMs was similar, but dramatically greater for women receiving both defibrillator- and PM-based CRT.
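The mortality rates quoted above (e.g., MR 4193 vs 4256 for women vs men with pacemakers) are in deaths per 100,000 patient-years. The unit conversion is a one-liner; the example numbers below are illustrative, not the study's:

```python
def mortality_rate(deaths, patient_years, per=100_000):
    """Crude mortality rate in the abstract's units:
    deaths per 100,000 patient-years of follow-up."""
    return deaths / patient_years * per
```

For instance, 42 deaths over 1,000 patient-years of follow-up gives a rate of 4,200 per 100,000 patient-years.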


Subject(s)
Cardiac Pacing, Artificial , Cardiac Resynchronization Therapy Devices , Cardiac Resynchronization Therapy , Defibrillators, Implantable , Electric Countershock/instrumentation , Healthcare Disparities , Heart Diseases/therapy , Pacemaker, Artificial , Aged , Aged, 80 and over , Cardiac Pacing, Artificial/adverse effects , Cardiac Pacing, Artificial/mortality , Cardiac Resynchronization Therapy/adverse effects , Cardiac Resynchronization Therapy/mortality , Databases, Factual , Electric Countershock/adverse effects , Electric Countershock/mortality , Female , Health Services Accessibility , Heart Diseases/diagnosis , Heart Diseases/mortality , Heart Diseases/physiopathology , Humans , Kaplan-Meier Estimate , Logistic Models , Male , Middle Aged , Multivariate Analysis , Propensity Score , Proportional Hazards Models , Retrospective Studies , Risk Factors , Sex Factors , Socioeconomic Factors , Time Factors , Treatment Outcome , United States
9.
J Am Coll Cardiol ; 69(19): 2357-2365, 2017 May 16.
Article in English | MEDLINE | ID: mdl-28330751

ABSTRACT

BACKGROUND: In the CHAMPION (CardioMEMS Heart Sensor Allows Monitoring of Pressure to Improve Outcomes in New York Heart Association [NYHA] Functional Class III Heart Failure Patients) trial, heart failure hospitalization (HFH) rates were lower in patients managed with guidance from an implantable pulmonary artery pressure sensor compared with usual care. OBJECTIVES: This study examined the effectiveness of ambulatory hemodynamic monitoring in reducing HFH outside of the clinical trial setting. METHODS: We conducted a retrospective cohort study using U.S. Medicare claims data from patients undergoing pulmonary artery pressure sensor implantation between June 1, 2014, and December 31, 2015. Rates of HFH during pre-defined periods before and after implantation were compared using the Andersen-Gill extension to the Cox proportional hazards model while accounting for the competing risk of death, ventricular assist device implantation, or cardiac transplantation. Comprehensive heart failure (HF)-related costs were compared over the same periods. RESULTS: Among 1,114 patients receiving implants, there were 1,020 HFHs in the 6 months before, compared with 381 HFHs, 139 deaths, and 17 ventricular assist device implantations and/or transplants in the 6 months after implantation (hazard ratio [HR]: 0.55; 95% confidence interval [CI]: 0.49 to 0.61; p < 0.001). This lower rate of HFH was associated with a 6-month comprehensive HF cost reduction of $7,433 per patient (IQR: $7,000 to $7,884), and was robust in analyses restricted to 6-month survivors. Similar reductions in HFH and costs were noted in the subset of 480 patients with complete data available for 12 months before and after implantation (HR: 0.66; 95% CI: 0.57 to 0.76; p < 0.001). CONCLUSIONS: As in clinical trials, use of ambulatory hemodynamic monitoring in clinical practice is associated with lower HFH and comprehensive HF costs. 
These benefits are sustained to 1 year and support the "real-world" effectiveness of this approach to HF management.


Subject(s)
Heart Failure/prevention & control , Hospitalization/statistics & numerical data , Monitoring, Ambulatory , Aged , Aged, 80 and over , Female , Hemodynamics , Humans , Male , Middle Aged , Retrospective Studies
10.
JACC Clin Electrophysiol ; 3(2): 129-138, 2017 02.
Article in English | MEDLINE | ID: mdl-29759385

ABSTRACT

OBJECTIVES: The purpose of this study was to compare health care costs associated with repeat ablation of atrial fibrillation (AF) with health care costs associated with a successful first procedure. BACKGROUND: Catheter ablation has become established as a rhythm control strategy for symptomatic paroxysmal and persistent AF. The economic impact of ablation is not completely understood, and it may be affected by repeat procedures performed for recurrent AF. METHODS: The source of data was the MarketScan (Truven Health, Ann Arbor, Michigan) administrative claims dataset from April 2008 to March 2013, including U.S. patients with private and Medicare supplemental insurance. Patients who underwent an outpatient atrial ablation procedure and a diagnosis of AF were identified. Total health care cost was calculated for 1 year before and after the ablation. Patients were categorized as having undergone a repeat ablation if an additional ablation was performed in the following year. RESULTS: Of 12,027 patients included in the study, repeat ablation was performed in 2,066 (17.2%) within 1 year. Patients with repeat ablation had higher rates of emergency department visits (43.4% vs. 32.2%; p < 0.001) and subsequent hospitalization (35.6% vs. 21.5%; p < 0.001), after excluding hospitalizations for the repeat procedure. Total medical cost was higher for patients with repeat ablation ($52,821 vs. $13,412; p < 0.001), and it remained 46% higher even after excluding the cost associated with additional ablations ($19,621 vs. $13,412; p < 0.001). CONCLUSIONS: Health care costs are significantly higher for patients with a repeat ablation for AF than for patients with only a single ablation procedure, even though both groups have similar baseline characteristics. The increased costs persist even after excluding the cost of the repeat ablation itself. These results emphasize the economic benefit of procedural success in AF ablation.
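The "46% higher" figure in this abstract is a straightforward ratio of the two cost totals after excluding the repeat procedure itself. Reproduced from the published numbers:

```python
def pct_higher(cost_a, cost_b):
    """Percent by which cost_a exceeds cost_b."""
    return (cost_a / cost_b - 1.0) * 100.0

# Abstract figures: repeat-ablation vs single-ablation total medical cost,
# after excluding the cost of the additional ablations themselves.
residual_excess = pct_higher(19_621, 13_412)  # rounds to the reported 46%
```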


Subject(s)
Atrial Fibrillation/surgery , Catheter Ablation/methods , Atrial Fibrillation/economics , Catheter Ablation/economics , Costs and Cost Analysis , Electric Countershock/economics , Electric Countershock/statistics & numerical data , Electrocardiography, Ambulatory/economics , Electrocardiography, Ambulatory/statistics & numerical data , Female , Hospitalization/economics , Hospitalization/statistics & numerical data , Humans , Male , Middle Aged , Recurrence , Reoperation/economics , Reoperation/statistics & numerical data , Retrospective Studies , Treatment Outcome
11.
Eur J Heart Fail ; 19(5): 652-660, 2017 05.
Article in English | MEDLINE | ID: mdl-27647784

ABSTRACT

AIMS: Haemodynamic-guided heart failure (HF) management effectively reduces decompensation events and need for hospitalizations. The economic benefit of clinical improvement requires further study. METHODS AND RESULTS: An estimate of the cost-effectiveness of haemodynamic-guided HF management was made based on observations published in the randomized, prospective single-blinded CHAMPION trial. A comprehensive analysis was performed including healthcare utilization event rates, survival, and quality of life demonstrated in the randomized portion of the trial (18 months). Markov modelling with Monte Carlo simulation was used to approximate comprehensive costs and quality-adjusted life years (QALYs) from a payer perspective. Unit costs were estimated using the Truven Health MarketScan database from April 2008 to March 2013. Over a 5-year horizon, patients in the Treatment group had average QALYs of 2.56 with a total cost of US$56 974; patients in the Control group had QALYs of 2.16 with a total cost of US$52 149. The incremental cost-effectiveness ratio (ICER) was US$12 262 per QALY. Using comprehensive cost modelling, including all anticipated costs of HF and non-HF hospitalizations, physician visits, prescription drugs, long-term care, and outpatient hospital visits over 5 years, the Treatment group had a total cost of US$212 004 and the Control group had a total cost of US$200 360. The ICER was US$29 593 per QALY. CONCLUSIONS: Standard economic modelling suggests that pulmonary artery pressure-guided management of HF using the CardioMEMS™ HF System is cost-effective from the US-payer perspective. This analysis provides the background for further modelling in specific country healthcare systems and cost structures.
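The central quantity in this cost-effectiveness abstract is the ICER. Recomputing it from the rounded totals published above (the study's reported ICERs of US$12,262 and US$29,593 per QALY come from unrounded model outputs, so these land close to, not exactly on, the published values):

```python
def icer(cost_treat, cost_control, qaly_treat, qaly_control):
    """Incremental cost-effectiveness ratio: incremental cost per
    incremental quality-adjusted life year (QALY)."""
    return (cost_treat - cost_control) / (qaly_treat - qaly_control)

# Rounded 5-year totals from the abstract.
device_icer = icer(56_974, 52_149, 2.56, 2.16)           # ~US$12,063/QALY
comprehensive_icer = icer(212_004, 200_360, 2.56, 2.16)  # ~US$29,110/QALY
```

Both figures sit well below common US willingness-to-pay thresholds, which is the basis of the abstract's cost-effectiveness conclusion.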


Subject(s)
Disease Management , Health Care Costs , Heart Failure/economics , Hospitalization/economics , Models, Economic , Pulmonary Wedge Pressure/physiology , Aged , Aged, 80 and over , Cost-Benefit Analysis , Female , Heart Failure/physiopathology , Heart Failure/therapy , Humans , Male , Patient Acceptance of Health Care , Prospective Studies , Quality of Life , Single-Blind Method , United States
12.
J Interv Card Electrophysiol ; 47(2): 189-195, 2016 Nov.
Article in English | MEDLINE | ID: mdl-27613184

ABSTRACT

BACKGROUND: Cardiac resynchronization therapy (CRT) is an effective treatment for heart failure (HF) with left ventricular systolic dysfunction and prolonged QRS interval. However, one third of patients do not benefit from treatment. This study compares the heart failure hospitalization (HFH) rates and corresponding costs between responders and non-responders to CRT. METHODS: At a single center in New Jersey, we enrolled patients with de novo CRT-D implants between January 2011 and July 2013. Medical history at implant and all subsequent hospitalizations were collected. A retrospective chart review of the cardiology visit at or closest to 12 months post-CRT implant was performed, and patients were classified into responders and non-responders. Universal billing records (UB-04), ICD-9-CM diagnoses, and procedure codes were used to determine whether each hospitalization was due to HF. For each heart failure hospitalization (HFH), an MS-DRG-based US national average Medicare reimbursement was determined. HFH rates and associated payor costs were compared between responders and non-responders using negative binomial regression and non-parametric bootstrapping (×10,000), respectively. RESULTS: CRT response was determined in 135 patients (n = 103 responders, n = 32 non-responders, average follow-up 1.4 years). Demographics, pre-implant HF characteristics, NYHA Class, QRS duration, ejection fraction (EF), left bundle branch block (LBBB) status, and co-morbidities were not statistically different between the two groups. The HFH rate was significantly lower in responders (0.43/patient year) compared to non-responders (0.96/patient year, IRR = 0.45, 95% CI (0.23-0.90), P = 0.0197). Average US national Medicare reimbursement for the responder group (US$7205/patient year) was 48 % lower than that for the non-responder group (US$13,861/patient year, P = 0.035).
CONCLUSION: In this single-center retrospective study, responders to CRT had a significantly lower post-implant heart failure hospitalization rate and reduced associated payor costs compared to non-responders. Therapies that increase CRT response rates can substantially reduce healthcare utilization.
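The IRR of 0.45 above comes from negative binomial regression; a cruder but illustrative version is an incidence rate ratio with a log-normal confidence interval. In the sketch below the event counts and person-years are reconstructed from the published rates and cohort sizes (103 responders, 32 non-responders, ~1.4 years of follow-up), so they are assumptions, not the paper's raw data:

```python
import math

def rate_ratio_ci(events_a, py_a, events_b, py_b, z=1.96):
    """Incidence rate ratio with a log-normal CI -- a crude stand-in
    for the study's negative binomial regression."""
    irr = (events_a / py_a) / (events_b / py_b)
    se_log = math.sqrt(1.0 / events_a + 1.0 / events_b)
    return irr, irr * math.exp(-z * se_log), irr * math.exp(z * se_log)

# Roughly consistent with 0.43 vs 0.96 HFH/patient-year.
irr, lo, hi = rate_ratio_ci(62, 144.2, 43, 44.8)
```

The point estimate lands near the published IRR of 0.45; the regression additionally adjusts for overdispersion, which this sketch does not.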


Subject(s)
Cost of Illness , Defibrillators, Implantable/economics , Health Expenditures/statistics & numerical data , Heart Failure/economics , Heart Failure/prevention & control , Hospitalization/economics , Patient Acceptance of Health Care/statistics & numerical data , Aged , Defibrillators, Implantable/statistics & numerical data , Female , Heart Failure/epidemiology , Hospitalization/statistics & numerical data , Humans , Male , New Jersey/epidemiology , Prevalence , Risk Factors , Treatment Failure , Utilization Review
13.
Heart Rhythm ; 13(12): 2279-2286, 2016 12.
Article in English | MEDLINE | ID: mdl-27544748

ABSTRACT

BACKGROUND: Remote monitoring (RM) of cardiac implantable electronic devices (CIEDs) improves patient survival. However, whether RM reduces health care utilization is unknown. OBJECTIVE: The purpose of this study was to determine whether RM was associated with reduced hospitalization and costs in clinical practice. METHODS: We conducted a nationwide cohort study using the Truven Health Analytics MarketScan database. Patients implanted with a CIED between March 31, 2009, and April 1, 2012, were included. All-cause hospitalization events were compared between those using RM and those not using RM by using Cox proportional hazards methods with Andersen-Gill extension and propensity scoring. We also compared health care costs (payments >30 days after CIED implantation). RESULTS: Overall, there were 92,566 patients (mean age 72 ± 13 years; 58,140 [63%] men) with a mean follow-up of 19 ± 12 months, including 54,520 (59%) pacemaker, 27,816 (30%) implantable cardioverter-defibrillator, and 10,230 (11%) cardiac resynchronization therapy patients. Only 37% of patients (34,259) used RM. Patients with RM had Charlson Comorbidity Index values similar to those not using RM but had lower adjusted risk of all-cause hospitalization (adjusted hazard ratio 0.82; 95% confidence interval 0.80-0.84; P < .001) and shorter mean length of hospitalization (5.3 days vs 8.1 days; P < .001) during follow-up. RM was associated with a 30% reduction in hospitalization costs ($8720 mean cost per patient-year vs $12,423 mean cost per patient-year). For every 100,000 patient-years of follow-up, RM was associated with 9810 fewer hospitalizations, 119,000 fewer days in hospital, and $370,270,000 lower hospital payments. CONCLUSION: RM is associated with reductions in hospitalization and health care utilization. Since only about a third of patients with CIEDs routinely use RM, this represents a major opportunity for quality improvement.
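The headline "$370,270,000 lower hospital payments per 100,000 patient-years" in this abstract is a linear scaling of the per-patient-year payment difference. Recomputed from the rounded figures (the abstract's exact total reflects unrounded inputs, so this lands close rather than exactly on it):

```python
def scaled_difference(per_py_with_rm, per_py_without_rm, py=100_000):
    """Scale a per-patient-year difference to the abstract's
    'per 100,000 patient-years' framing."""
    return (per_py_without_rm - per_py_with_rm) * py

# Hospitalization payments per patient-year: $8,720 with remote
# monitoring vs $12,423 without.
payment_savings = scaled_difference(8_720, 12_423)  # ~$370.3M
```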


Subject(s)
Cardiac Resynchronization Therapy , Heart Diseases , Hospitalization , Remote Sensing Technology , Aged , Aged, 80 and over , Cardiac Resynchronization Therapy/methods , Cardiac Resynchronization Therapy/statistics & numerical data , Cohort Studies , Defibrillators, Implantable/statistics & numerical data , Female , Health Care Costs/statistics & numerical data , Heart Diseases/economics , Heart Diseases/physiopathology , Heart Diseases/therapy , Hospitalization/economics , Hospitalization/statistics & numerical data , Humans , Male , Middle Aged , Outcome and Process Assessment, Health Care , Pacemaker, Artificial/statistics & numerical data , Patient Acceptance of Health Care/statistics & numerical data , Quality Improvement , Remote Sensing Technology/methods , Remote Sensing Technology/statistics & numerical data , Telemedicine/methods , United States
14.
Neuromodulation ; 19(5): 469-76, 2016 Jul.
Article in English | MEDLINE | ID: mdl-26923728

ABSTRACT

INTRODUCTION: A shorter delay time from chronic pain diagnosis to spinal cord stimulation (SCS) implantation may make it more likely to achieve lasting therapeutic efficacy with SCS. The objective of this analysis was to determine the impact of pain-to-SCS time on patients' post-implant healthcare resource utilization (HCRU). METHODS: A retrospective observational study was performed using a real-world patient cohort derived from the MarketScan® Commercial and Medicare Supplemental claims databases from April 2008 through March 2013. The predictor variable was the time from the first diagnosis of chronic pain to permanent SCS implant. Using multivariable analysis, we studied the impact of pain-to-SCS time on HCRU in the first year post-implant. For some regression tests, patients were grouped into terciles by HCRU. RESULTS: A total of 762 patients met inclusion criteria, with a median pain-to-SCS time of 1.35 years (Q1: 0.8, Q3: 1.9). For every one-year increase in pain-to-SCS time, the odds increased by 33% for being in the high medical expenditures group (upper tercile: $4133 or above) over the low group (lower tercile: $603 or less). The odds increased by 39% for being in the high opioid prescriptions group (10-58 prescriptions) over the low group (0-1). The odds increased by 44% and 55%, respectively, for being in the high office visits (8-77) or hospitalizations (3-28) group over the low office visits (0-2) or hospitalizations (0) group. CONCLUSIONS: HCRU increased in the year following SCS implantation with longer pain-to-SCS time. These results suggest that considering SCS earlier in the care continuum for chronic pain may improve patient outcomes, with reductions in hospitalizations, clinic visits, and opioid usage.
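The reported effects are odds ratios per one-year increase in pain-to-SCS time. In a logistic-type model, odds ratios compound multiplicatively across units of the predictor, so multi-year delays scale the odds geometrically. A small arithmetic illustration (this reuses only the published 33% figure; it is not the study's model):

```python
import math

# A 33% increase in odds per year corresponds to an odds ratio of 1.33,
# i.e. a model coefficient of ln(1.33) on pain-to-SCS time.
or_per_year = 1.33
beta = math.log(or_per_year)  # per-year coefficient on the log-odds scale

# Odds ratios compound multiplicatively across years of delay.
for years in (1, 2, 3):
    print(f"{years} extra year(s): odds multiplied by {or_per_year ** years:.2f}")
```

So a two-year delay multiplies the odds of landing in the high-expenditure tercile by roughly 1.77, not 1.66, under this model form.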


Subject(s)
Chronic Pain/therapy , Health Resources/statistics & numerical data , Spinal Cord Stimulation/methods , Spinal Cord Stimulation/statistics & numerical data , Adult , Aged , Chi-Square Distribution , Cohort Studies , Female , Humans , Male , Middle Aged , Pain Measurement , Regression Analysis , Treatment Outcome
15.
J Interv Card Electrophysiol ; 46(2): 129-36, 2016 Aug.
Article in English | MEDLINE | ID: mdl-26860839

ABSTRACT

PURPOSE: Guidelines advocate remote monitoring (RM) in patients with a cardiac implantable electronic device (CIED). However, it is not known when RM should be initiated. We hypothesized that prompt initiation of RM (within 91 days of implant) is associated with improved survival compared to delayed initiation. METHODS: This retrospective, national, observational cohort study evaluated patients receiving new implants of market-released St. Jude Medical™ pacemakers (PM), implantable cardioverter defibrillators (ICD), and cardiac resynchronization therapy (CRT) devices. Patients were assigned to one of two groups: an "RM Prompt" group, in which RM was initiated within 91 days of implant; and an "RM Delayed" group, in which RM was initiated >91 days but ≤365 days of implant. The primary endpoint was all-cause mortality. RESULTS: The cohort included 106,027 patients followed for a mean of 2.6 ± 0.9 years. Overall, 47,014 (44 %) patients had a PM, 31,889 (30 %) patients had an ICD, 24,005 (23 %) patients had a CRT-D, and 3119 (3 %) patients had a CRT-P. Remote monitoring was initiated promptly (median 4 weeks [IQR 2, 8 weeks]) in 66,070 (62 %) patients; in the other 39,957 (38 %) patients, RM initiation was delayed (median 24 weeks [IQR 18, 34 weeks]). In comparison to delayed initiation, prompt initiation of RM was associated with a lower mortality rate (4023 vs. 4679 per 100,000 patient-years, p < 0.001) and greater adjusted survival (HR 1.18 [95 % CI 1.13-1.22], p < 0.001). CONCLUSIONS: Our data, for the first time, show improved survival in patients enrolled promptly into RM following CIED implantation. This advantage was observed across all CIED device types.


Subject(s)
Defibrillators, Implantable/statistics & numerical data , Heart Failure/mortality , Heart Failure/prevention & control , Pacemaker, Artificial/statistics & numerical data , Remote Sensing Technology/statistics & numerical data , Time-to-Treatment/statistics & numerical data , Aged , Female , Heart Failure/diagnosis , Humans , Male , Patient Selection , Prevalence , Remote Sensing Technology/mortality , Retrospective Studies , Risk Factors , Survival Rate , Telemedicine/statistics & numerical data , Treatment Outcome , United States/epidemiology , Waiting Lists/mortality
16.
JACC Clin Electrophysiol ; 2(4): 426-433, 2016 Aug.
Article in English | MEDLINE | ID: mdl-29759861

ABSTRACT

OBJECTIVES: The study sought to compare survival, lead deactivation, and lead replacement with quadripolar versus bipolar leads using a retrospective cohort of patients with newly implanted cardiac resynchronization therapy (CRT) systems. BACKGROUND: In CRT, quadripolar left ventricular (LV) leads offer alternative pacing sites and vectors not available with bipolar LV leads, which may improve the effectiveness of the therapy. METHODS: Using nationwide data from device implant registration records of a single manufacturer, we identified patients with a de novo cardiac resynchronization therapy with defibrillation (CRT-D) implanted between November 30, 2011, and May 31, 2013. Patients were followed for up to 24 months. The primary predictor was LV lead type (quadripolar Quartet [St. Jude Medical, St. Paul, Minnesota] LV lead or bipolar LV lead). The primary outcome was death and the secondary outcomes were LV lead replacement and deactivation. RESULTS: Among 23,570 patients (69.5 ± 11.1 years of age; 28% female; median follow-up time 1.14 years), 18,406 had quadripolar and 5,164 had bipolar LV leads. The quadripolar and bipolar groups had 5.04 and 6.45 deaths per 100 patient-years, respectively (p < 0.001). After multivariate adjustment, the quadripolar lead was associated with a lower risk of deactivation (hazard ratio [HR]: 0.62; 95% confidence interval [CI]: 0.46 to 0.84; p = 0.002), replacement (HR: 0.67; 95% CI: 0.55 to 0.83; p < 0.001), and death (HR: 0.77; 95% CI: 0.69 to 0.86; p < 0.001). CONCLUSIONS: In this observational study of CRT-D devices, use of a quadripolar, compared to a bipolar LV lead, was associated with a reduction in LV lead deactivation, replacement, and mortality.

17.
J Am Coll Cardiol ; 65(24): 2601-2610, 2015 Jun 23.
Article in English | MEDLINE | ID: mdl-25983008

ABSTRACT

BACKGROUND: Remote monitoring (RM) technology embedded within cardiac rhythm devices permits continuous monitoring, which may result in improved patient outcomes. OBJECTIVES: This study used "big data" to assess whether RM is associated with improved survival and whether this is influenced by the type of cardiac device and/or its degree of use. METHODS: We studied 269,471 consecutive U.S. patients implanted between 2008 and 2011 with pacemakers (PMs), implantable cardioverter-defibrillators (ICDs), or cardiac resynchronization therapy (CRT) with pacing capability (CRT-P)/defibrillation capability (CRT-D) with wireless RM. We analyzed weekly use and all-cause survival for each device type by the percentage of time in RM (%TRM) stratified by age. Socioeconomic influences on %TRM were assessed using 8 census variables from 2012. RESULTS: The group had implanted PMs (n = 115,076; 43%), ICDs (n = 85,014; 32%), CRT-D (n = 61,475; 23%), and CRT-P (n = 7,906; 3%). When considered together, 127,706 patients (47%) used RM, of whom 67,920 (53%) had ≥75%TRM (high %TRM) and 59,786 (47%) <75%TRM (low %TRM); 141,765 (53%) never used RM (RM None). RM use was not affected by age or sex, but demonstrated wide geographic and socioeconomic variability. Survival was better in high %TRM versus RM None (hazard ratio [HR]: 2.10; p < 0.001), in high %TRM versus low %TRM (HR: 1.32; p < 0.001), and also in low %TRM versus RM None (HR: 1.58; p < 0.001). The same relationship was observed when assessed by individual device type. CONCLUSIONS: RM is associated with improved survival, irrespective of device type (including PMs), but demonstrates a graded relationship with the level of adherence. The results support the increased application of RM to improve patient outcomes.


Subject(s)
Cardiac Resynchronization Therapy/mortality , Defibrillators, Implantable , Electric Countershock/mortality , Pacemaker, Artificial , Patient Compliance , Remote Sensing Technology/mortality , Aged , Aged, 80 and over , Cardiac Resynchronization Therapy/methods , Cohort Studies , Electric Countershock/methods , Female , Follow-Up Studies , Humans , Male , Middle Aged , Remote Sensing Technology/methods , Retrospective Studies , Survival Rate/trends
18.
Europace ; 17(1): 101-7, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25371428

ABSTRACT

AIMS: This study compares, from a prospective, observational, non-randomized registry, the post-implant hospitalization rates and associated healthcare resource utilization of cardiac resynchronization therapy-defibrillator (CRT-D) patients with quadripolar (QUAD) vs. bipolar (BIP) left ventricular (LV) leads. METHODS AND RESULTS: Between January 2009 and December 2012, 193 consecutive patients receiving de novo CRT-D implants with either a QUAD (n = 116) or a BIP (n = 77) LV lead were enrolled at implant and followed until July 2013 at a single-centre, university hospital. Post-implant hospitalizations related to heart failure (HF) or LV lead surgical revision and associated payer costs were identified using ICD-9-CM diagnosis and procedure codes. Italian national reimbursement rates were determined. Propensity scores were estimated using a logistic regression model based upon 11 pre-implant baseline characteristics and were used to derive a 1:1 matched cohort of QUAD (n = 77) and BIP (n = 77) patients. Hospitalization rates for the two groups were compared using negative binomial regression and associated payer costs were compared using non-parametric bootstrapping (×10,000) and a one-sided hypothesis test. Hospitalization rates of the QUAD group [0.15/patient (pt)-year] were lower than those of the BIP group (0.32/pt-year); the incidence rate ratio was 0.46, P = 0.04. The hospitalization costs for the QUAD group (434 ± 128 €/pt-year) were lower than those for the BIP group (1136 ± 362 €/pt-year). The average difference was 718 €/pt-year, P = 0.016. CONCLUSIONS: In this comparative effectiveness assessment of well-matched groups of CRT-D patients with quadripolar and bipolar LV leads, QUAD patients experienced a lower rate of hospitalizations for HF and LV lead surgical revision, and a lower cost burden. This has important implications for LV pacing lead choice.
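The cost comparison above uses a non-parametric bootstrap with 10,000 resamples. A self-contained sketch of that procedure on synthetic data (the cost distributions below are invented; only the group sizes of 77 and the approximate group means echo the abstract):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic per-patient annual costs (€/pt-year); gamma distributions are an
# assumption here, chosen only for right-skewed, non-negative costs.
quad_costs = rng.gamma(shape=0.5, scale=868, size=77)    # mean ≈ 434
bip_costs = rng.gamma(shape=0.5, scale=2272, size=77)    # mean ≈ 1136

# Non-parametric bootstrap of the mean cost difference (BIP − QUAD):
# resample each group with replacement and recompute the difference.
n_boot = 10_000
diffs = np.empty(n_boot)
for i in range(n_boot):
    q = rng.choice(quad_costs, size=quad_costs.size, replace=True)
    b = rng.choice(bip_costs, size=bip_costs.size, replace=True)
    diffs[i] = b.mean() - q.mean()

# One-sided test: bootstrap probability that BIP costs do not exceed QUAD costs.
p_one_sided = (diffs <= 0).mean()
print(f"mean difference ≈ {diffs.mean():.0f} €/pt-year, one-sided p ≈ {p_one_sided:.3f}")
```

The hospitalization-rate comparison itself used negative binomial regression, which handles over-dispersed event counts better than a Poisson model.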


Subject(s)
Cardiac Resynchronization Therapy/economics , Defibrillators, Implantable/economics , Health Care Costs/statistics & numerical data , Heart Failure/economics , Hospitalization/economics , Hospitalization/statistics & numerical data , Aged , Defibrillators, Implantable/statistics & numerical data , Electrodes, Implanted/economics , Female , Heart Failure/epidemiology , Heart Failure/prevention & control , Humans , Italy/epidemiology , Male , Treatment Outcome , Utilization Review