Results 1 - 19 of 19
1.
Article in English | MEDLINE | ID: mdl-37556090

ABSTRACT

BACKGROUND: Real-world data have suggested inconsistent adherence to oral anticoagulation for thromboembolic event (TE) prevention in patients with nonvalvular atrial fibrillation (NVAF), yet it remains unclear whether event risk is elevated during gaps of nonadherence. OBJECTIVE: To compare differences in outcomes between direct oral anticoagulants (DOACs) and warfarin based on adherence to therapy in patients with NVAF. METHODS: Using the MarketScan claims data, patients receiving a prescription for warfarin or a DOAC for NVAF from January 2015 to June 2016 were included. Outcomes included hospitalization for TE (ischemic stroke or systemic embolism), hemorrhagic stroke, stroke of any kind, and major bleeding. Event rates were reported for warfarin and DOACs at higher adherence (proportion of days covered [PDC] > 80%) and lower adherence (PDC 40-80%). RESULTS: The cohort included 83,168 patients prescribed warfarin (51% [n = 42,639]) or a DOAC (49% [n = 40,529]). Lower adherence occurred in 36% (n = 15,330) of patients prescribed warfarin and 26% (n = 10,956) prescribed a DOAC. As compared to higher-adherence warfarin after multivariable adjustment, the risk of TE was highest with lower-adherence DOAC (HR, 1.26; 95% CI, 1.14-1.33) and lowest with higher-adherence DOAC (HR, 0.93; 95% CI, 0.88-0.99). There was a significantly higher risk of hemorrhagic stroke and stroke of any kind in the lower-adherence groups. Major bleeding was more common with lower-adherence DOAC (HR, 1.43; 95% CI, 1.35-1.52) and lower-adherence warfarin (HR, 1.32; 95% CI, 1.26-1.39). CONCLUSIONS: In this large real-world study, lower-adherence DOAC use was associated with a higher risk of TE as compared with both higher- and lower-adherence warfarin.
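Adherence in claims studies of this kind is typically measured as the proportion of days covered (PDC). A minimal sketch of one common way to compute PDC from fill dates and days supplied follows; this is illustrative only, since the abstract does not give the study's exact algorithm, and the fill data are hypothetical.

```python
from datetime import date, timedelta

def proportion_of_days_covered(fills, start, end):
    """Fraction of days in [start, end] covered by at least one fill.

    fills: list of (fill_date, days_supplied) tuples.
    Overlapping fills are not double-counted.
    """
    covered = set()
    for fill_date, days_supplied in fills:
        for i in range(days_supplied):
            day = fill_date + timedelta(days=i)
            if start <= day <= end:
                covered.add(day)
    total_days = (end - start).days + 1
    return len(covered) / total_days

# Hypothetical example: three 30-day fills over a 120-day window
fills = [(date(2015, 1, 1), 30), (date(2015, 2, 1), 30), (date(2015, 3, 15), 30)]
pdc = proportion_of_days_covered(fills, date(2015, 1, 1), date(2015, 4, 30))
# 90 covered days / 120 days = 0.75, i.e. the abstract's "lower adherence" band
```

A PDC of 0.75 would fall in the 40-80% lower-adherence stratum used above, while a fourth on-time fill would push the patient above the 80% higher-adherence threshold.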

2.
Neuromodulation ; 25(6): 817-828, 2022 Aug.
Article in English | MEDLINE | ID: mdl-34047410

ABSTRACT

OBJECTIVE: Published reports on directional deep brain stimulation (DBS) have been limited to small, single-center investigations. Therapeutic window (TW) is used to describe the range of stimulation amplitudes achieving symptom relief without side effects. This crossover study performed a randomized double-blind assessment of TW for directional and omnidirectional DBS in a large cohort of patients implanted with a DBS system in the subthalamic nucleus for Parkinson's disease. MATERIALS AND METHODS: Participants received omnidirectional stimulation for the first three months after initial study programming, followed by directional DBS for the following three months. The primary endpoint was a double-blind, randomized evaluation of TW for directional vs omnidirectional stimulation at three months after initial study programming. Additional data recorded at three- and six-month follow-ups included stimulation preference, therapeutic current strength, Unified Parkinson's Disease Rating Scale (UPDRS) part III motor score, and quality of life. RESULTS: The study enrolled 234 subjects (62 ± 8 years, 33% female). TW was wider using directional stimulation in 183 of 202 subjects (90.6%). The mean increase in TW with directional stimulation was 41% (2.98 ± 1.38 mA, compared to 2.11 ± 1.33 mA for omnidirectional). UPDRS part III motor score on medication improved 42.4% at three months (after three months of omnidirectional stimulation) and 43.3% at six months (after three months of directional stimulation) with stimulation on, compared to stimulation off. After six months, 52.8% of subjects blinded to stimulation type (102/193) preferred the period with directional stimulation, and 25.9% (50/193) preferred the omnidirectional period. The directional period was preferred by 58.5% of clinicians (113/193) vs 21.2% (41/193) who preferred the omnidirectional period. 
CONCLUSION: Directional stimulation yielded a wider TW compared to omnidirectional stimulation and was preferred by blinded subjects and clinicians.


Subject(s)
Deep Brain Stimulation , Parkinson Disease , Cross-Over Studies , Deep Brain Stimulation/methods , Female , Humans , Male , Parkinson Disease/drug therapy , Quality of Life , Treatment Outcome
3.
Neuromodulation ; 24(4): 708-718, 2021 Jun.
Article in English | MEDLINE | ID: mdl-32153073

ABSTRACT

OBJECTIVES: ACCURATE, a randomized controlled trial, compared the safety and effectiveness of stimulation of the dorsal root ganglion (DRG) vs. conventional spinal cord stimulation (SCS) in complex regional pain syndrome (CRPS-I and II) of the lower extremities. This analysis compares the cost-effectiveness of three modalities of treatment for CRPS, namely DRG stimulation, SCS, and comprehensive medical management (CMM). MATERIALS AND METHODS: The retrospective cost-utility analysis combined ACCURATE study data with claims data to compare cost-effectiveness between DRG stimulation, SCS, and CMM. Cost-effectiveness was evaluated using a Markov cohort model with a ten-year time horizon from the U.S. payer perspective. The incremental cost-effectiveness ratio (ICER) was reported as cost in 2017 U.S. dollars per gain in quality-adjusted life years (QALYs). Willingness-to-pay thresholds of $50,000/QALY and $100,000/QALY were used to define highly cost-effective and cost-effective therapies, respectively. RESULTS: Both DRG and SCS provided an increase in QALYs (4.96 ± 1.54 and 4.58 ± 1.35 QALYs, respectively) and an increase in costs ($153,992 ± $36,651 and $128,269 ± $27,771, respectively) compared to CMM (3.58 ± 0.91 QALYs, $106,173 ± $27,005) over the ten-year model lifetime. Both DRG stimulation ($34,695 per QALY) and SCS ($22,084 per QALY) were cost-effective compared to CMM. In the base case, the ICER for DRG vs. SCS was $68,095/QALY. CONCLUSIONS: DRG and SCS are cost-effective treatments for chronic pain secondary to CRPS-I and II compared to CMM. DRG accrued higher cost due to higher conversion from trial to permanent implant and shorter battery life, but DRG was the most beneficial therapy due to more patients receiving permanent implants and experiencing higher quality of life compared to SCS. New DRG technology has improved battery life, which we expect to make DRG more cost-effective compared to both CMM and SCS in the future.
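The reported ICERs can be approximately reproduced from the rounded mean costs and QALYs in the abstract; the small discrepancies against the published figures reflect rounding of the inputs.

```python
def icer(cost_a, qaly_a, cost_b, qaly_b):
    """Incremental cost-effectiveness ratio of therapy A vs. comparator B."""
    return (cost_a - cost_b) / (qaly_a - qaly_b)

# Rounded means from the abstract (ten-year horizon, 2017 USD)
drg = (153_992, 4.96)
scs = (128_269, 4.58)
cmm = (106_173, 3.58)

drg_vs_cmm = icer(*drg, *cmm)  # ~ $34,700/QALY (reported: $34,695)
scs_vs_cmm = icer(*scs, *cmm)  # ~ $22,100/QALY (reported: $22,084)
drg_vs_scs = icer(*drg, *scs)  # ~ $67,700/QALY (reported: $68,095)
```

Both therapy-vs.-CMM ratios land below the $50,000/QALY willingness-to-pay threshold, matching the abstract's conclusion that DRG and SCS are cost-effective relative to CMM.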


Subject(s)
Complex Regional Pain Syndromes , Spinal Cord Stimulation , Complex Regional Pain Syndromes/therapy , Cost-Benefit Analysis , Ganglia, Spinal , Humans , Quality of Life , Quality-Adjusted Life Years , Randomized Controlled Trials as Topic , Retrospective Studies
4.
J Cardiovasc Electrophysiol ; 30(11): 2302-2309, 2019 11.
Article in English | MEDLINE | ID: mdl-31549456

ABSTRACT

AIMS: The TactiCath Contact Force Ablation Catheter Study for Atrial Fibrillation (TOCCASTAR) clinical trial compared clinical outcomes using a contact force (CF) sensing ablation catheter (TactiCath) with a catheter that lacked CF measurement. This analysis links recorded events in the TOCCASTAR study and a large claims database, IBM MarketScan®, to determine the economic impact of using CF sensing during atrial fibrillation (AF) ablation. METHODS AND RESULTS: Clinical events including repeat ablation, use of antiarrhythmic drugs, hospitalization, perforation, pericarditis, pneumothorax, pulmonary edema, pulmonary vein stenosis, tamponade, and vascular access complications were adjudicated in the year after ablation. CF was characterized as optimal if at least 90% of lesions were created with at least 10 g of CF. A probabilistic 1:1 linkage was created for subjects in MarketScan® with the same events in the year after ablation, and the cost was evaluated over 10 000 iterations. Of the 279 subjects in TOCCASTAR, 145 were ablated using CF (57% with optimal CF), and 134 were ablated without CF. In the MarketScan® cohort, 9811 subjects who underwent AF ablation were used to determine events and costs. For subjects ablated with optimal CF, total cost was $19 271 ± 3705 in the year after ablation. For ablation lacking CF measurement, cost was $22 673 ± 3079 (difference of $3402, P < .001). In 73% of simulations, optimal CF was associated with lower cost in the year after ablation. CONCLUSION: Compared to ablation without CF, there was a decrease in healthcare cost of $3402 per subject in the first year after the procedure when optimal CF was used.


Subject(s)
Atrial Fibrillation/economics , Atrial Fibrillation/surgery , Cardiac Catheterization/economics , Cardiac Catheters/economics , Catheter Ablation/economics , Health Care Costs , Transducers, Pressure/economics , Aged , Atrial Fibrillation/diagnosis , Atrial Fibrillation/physiopathology , Cardiac Catheterization/adverse effects , Cardiac Catheterization/instrumentation , Catheter Ablation/adverse effects , Catheter Ablation/instrumentation , Cost Savings , Cost-Benefit Analysis , Databases, Factual , Female , Humans , Male , Middle Aged , Postoperative Complications/economics , Postoperative Complications/therapy , Randomized Controlled Trials as Topic , Retrospective Studies , Risk Factors , Time Factors , Treatment Outcome , United States
5.
Am J Cardiol ; 121(10): 1192-1199, 2018 05 15.
Article in English | MEDLINE | ID: mdl-29571722

ABSTRACT

Catheter ablation and antiarrhythmic drugs (AADs) are the most common rhythm-control strategies for atrial fibrillation (AF). Data comparing the rate of stroke and cardiovascular events between the treatment strategies are limited. Therefore, this observational study uses claims data to compare rates of cardiovascular hospitalization and stroke for patients with AF treated with ablation or AADs. Patients in the MarketScan dataset with AF between January 2010 and December 2014 were categorized in the ablation group if an atrial catheter ablation was performed, or in the AAD group if a relevant AAD was prescribed for AF but no ablation was performed. One year of history was required, and the index event was selected as the most recent ablation or AAD prescription closest to January 1, 2013. A 2:1 propensity score match was performed for age, gender, co-morbidities, and total medical cost in the year before the index event. Outcomes included thromboembolic event (ischemic stroke, transient ischemic attack, or systemic embolism) and all cardiovascular hospitalizations. Of the 164,639 patients in the AAD group, 29,456 were matched to the 14,728 ablation patients. There were no significant differences in age (64 ± 10 years in both groups), gender (58% male), or CHA2DS2-VASc score (3.2 ± 1.3). Risk of hospitalization with a primary diagnosis of thromboembolic event was 41% greater in the AAD group (p < 0.001), and cardiovascular hospitalizations were 13% more likely (p < 0.001). In conclusion, patients treated with catheter ablation of AF have lower risk of thromboembolic events and cardiovascular hospitalizations than a matched cohort of patients managed with AADs.
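The 2:1 propensity score match described above can be sketched as greedy nearest-neighbor selection on precomputed propensity scores. This is an illustrative sketch with hypothetical patient IDs, scores, and caliper; the abstract does not specify the study's actual matching algorithm.

```python
def greedy_match(treated, controls, ratio=2, caliper=0.05):
    """Greedily match each treated unit to `ratio` nearest unused controls.

    treated, controls: lists of (id, propensity_score) pairs.
    Returns {treated_id: [control_id, ...]}; treated units without `ratio`
    controls inside the caliper are dropped from the matched cohort.
    """
    available = dict(controls)  # control_id -> score
    matches = {}
    for tid, ts in sorted(treated, key=lambda x: x[1], reverse=True):
        nearest = sorted(available, key=lambda cid: abs(available[cid] - ts))
        chosen = [cid for cid in nearest
                  if abs(available[cid] - ts) <= caliper][:ratio]
        if len(chosen) == ratio:
            matches[tid] = chosen
            for cid in chosen:
                del available[cid]  # matching without replacement
    return matches

# Hypothetical scores: two ablation patients, five AAD candidates
ablation = [("a1", 0.62), ("a2", 0.35)]
aad = [("c1", 0.60), ("c2", 0.63), ("c3", 0.36), ("c4", 0.34), ("c5", 0.90)]
m = greedy_match(ablation, aad)
# each ablation patient gets its two nearest AAD controls; c5 is unmatched
```

In the study this step is what produces the 29,456 AAD patients matched to 14,728 ablation patients before outcome comparison.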


Subject(s)
Anti-Arrhythmia Agents/therapeutic use , Anticoagulants/therapeutic use , Atrial Fibrillation/therapy , Catheter Ablation , Ischemic Attack, Transient/epidemiology , Stroke/epidemiology , Thromboembolism/epidemiology , Aged , Aged, 80 and over , Atrial Fibrillation/complications , Cardiovascular Diseases/epidemiology , Cohort Studies , Female , Hospitalization , Humans , Ischemic Attack, Transient/etiology , Ischemic Attack, Transient/prevention & control , Male , Middle Aged , Retrospective Studies , Stroke/etiology , Stroke/prevention & control , Thromboembolism/etiology , Thromboembolism/prevention & control
6.
Heart Rhythm ; 15(3): 355-362, 2018 03.
Article in English | MEDLINE | ID: mdl-29030235

ABSTRACT

BACKGROUND: Catheter ablation of ventricular tachycardia (VT) has been shown to reduce the number of recurrent shocks in patients with an implantable cardioverter-defibrillator (ICD). However, how VT ablation affects postprocedural medical and pharmaceutical usage remains unclear. OBJECTIVE: The purpose of this study was to investigate changes in health care resource utilization (HCRU) after VT ablation. METHODS: This large-scale, real-world, retrospective study used the MarketScan databases to identify patients in the United States with an ICD or cardiac resynchronization therapy-defibrillator (CRT-D) undergoing VT ablation. We calculated cumulative medical and pharmaceutical expenditures, office visits, hospitalizations, and emergency room (ER) visits in the 1-year periods before and after ablation. RESULTS: A total of 523 patients met the study inclusion criteria. After VT ablation, median annual cardiac rhythm-related medical expenditures decreased by $5,408. Moreover, the percentage of patients with at least 1 cardiac rhythm-related hospitalization and ER visit decreased from 53% and 41% before ablation to 28% and 26% after ablation, respectively. Similar changes were observed in the number of all-cause hospitalizations and ER visits, but there were no significant changes in all-cause medical expenditures. During the year before VT ablation, there was an increasing rate of health care resource utilization, followed by drastic slowing after ablation. CONCLUSION: This retrospective study demonstrated that catheter ablation seems to reduce hospitalization and overall health care utilization in VT patients with an ICD or CRT-D in place.


Subject(s)
Catheter Ablation , Health Expenditures/trends , Hospitalization/trends , Patient Acceptance of Health Care/statistics & numerical data , Tachycardia, Ventricular/surgery , Aged , Female , Humans , Male , Middle Aged , Retrospective Studies , Tachycardia, Ventricular/economics , United States
7.
Pain Med ; 19(4): 699-707, 2018 04 01.
Article in English | MEDLINE | ID: mdl-29244102

ABSTRACT

Study Design: Observational study using insurance claims. Objective: To quantify opioid usage leading up to spinal cord stimulation (SCS) and the potential impact on outcomes of SCS. Setting: SCS is an interventional therapy that often follows opioid usage in the care continuum for chronic pain. Methods: This study identified SCS patients using the Truven Health MarketScan databases from January 2010 to December 2014. The index event was the first occurrence of a permanent SCS implant. Indicators of opioid usage at implant were daily morphine equivalent dose (MED), number of unique pain drug classes, and diagnosis code for opioid abuse. System explant was used as a measure of ineffective SCS therapy. Multivariate logistic regression was used to analyze the effect of pre-implant medications on explants. Results: A total of 5,476 patients (56 ± 14 years; 60% female) were included. SCS system removal occurred in 390 patients (7.1%) in the year after implant. Number of drug classes (odds ratio [OR] = 1.11, P = 0.007) and MED level (5-90 vs < 5 mg/d: OR = 1.32, P = 0.043; ≥90 vs < 5 mg/d: OR = 1.57, P = 0.005) were independently predictive of system explant. Over the year before implant, MED increased in 54% (stayed the same in 21%, decreased in 25%) of patients who continued with SCS and increased in 53% (stayed the same in 20%, decreased in 27%) of explant patients (P = 0.772). Over the year after implant, significantly more patients with continued SCS had an MED that decreased (47%) or stayed the same (23%) compared with the year before (P < 0.001). Conclusions: Chronic pain patients receive escalating opioid dosage prior to SCS implant, and high-dose opioid usage is associated with an increased risk of explant. Neuromodulation can stabilize or decrease opioid usage. Earlier consideration of SCS before escalated opioid usage has the potential to improve outcomes in complex chronic pain.
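Daily MED is conventionally derived by converting each opioid to oral morphine equivalents and summing across concurrent prescriptions. The sketch below uses commonly cited conversion factors and a hypothetical regimen; the abstract does not give the study's exact factor table, so treat the numbers as illustrative.

```python
# Commonly cited oral morphine-equivalent conversion factors (illustrative;
# the study's exact factor table is not given in the abstract)
MME_FACTOR = {"morphine": 1.0, "hydrocodone": 1.0, "oxycodone": 1.5,
              "codeine": 0.15, "tramadol": 0.1}

def daily_med(prescriptions):
    """Total daily morphine-equivalent dose across concurrent prescriptions.

    prescriptions: list of (drug, strength_mg, doses_per_day) tuples.
    """
    return sum(MME_FACTOR[drug] * strength_mg * doses_per_day
               for drug, strength_mg, doses_per_day in prescriptions)

# Hypothetical regimen: oxycodone 10 mg four times daily plus tramadol
# 50 mg twice daily
med = daily_med([("oxycodone", 10, 4), ("tramadol", 50, 2)])
# 10*4*1.5 + 50*2*0.1 = 70 mg/day, i.e. within the abstract's 5-90 mg/d band
```

A patient at 70 mg/d would fall in the 5-90 mg/d stratum (OR = 1.32 for explant vs. < 5 mg/d in the abstract), while crossing 90 mg/d would move them to the highest-risk group.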


Subject(s)
Analgesics, Opioid/therapeutic use , Chronic Pain/therapy , Spinal Cord Stimulation , Treatment Outcome , Adult , Aged , Female , Humans , Male , Middle Aged , Retrospective Studies
8.
Am J Cardiol ; 120(8): 1325-1331, 2017 Oct 15.
Article in English | MEDLINE | ID: mdl-28947249

ABSTRACT

Although higher rate detection and delayed detection improve survival in implantable cardioverter defibrillator clinical trials, their effectiveness in clinical practice has limited validation. To evaluate the effectiveness of programming strategies for reducing shocks and mortality, we conducted a nationwide assessment of patients with implantable cardioverter defibrillators or cardiac resynchronization therapy defibrillators with linked remote monitoring data. We categorized patients based on the presence or absence of high rate detection and delayed detection: higher rate delayed detection (HRDD), higher rate early detection (HRED), lower rate delayed detection (LRDD), and lower rate early detection (LRED). Cox regression was used to compare mortality and shock-free survival. There were 64,769 patients (age 68 ± 12 years; 27% female; 46% cardiac resynchronization therapy defibrillator; follow-up 1.7 ± 1.1 years). In the first year, 13% of HRDD, 14% of HRED, 18% of LRDD, and 20% of LRED patients experienced a shock. After adjustment, HRDD was associated with lower risk of shock than HRED (hazard ratio [HR] 0.93, 95% confidence interval [CI] 0.89 to 0.98, p = 0.002), LRDD (HR 0.63, 95% CI 0.60 to 0.66, p <0.001), and LRED (HR 0.58, 95% CI 0.55 to 0.61, p <0.001). HRDD was also associated with lower risk of mortality than HRED (adjusted HR 0.80, 95% CI 0.75 to 0.86, p <0.001), LRDD (HR 0.76, 95% CI 0.70 to 0.83, p <0.001), and LRED (HR 0.68, 95% CI 0.62 to 0.73, p <0.001). Similar results were observed in patients with or without a shock in the first 6 months after implant. In conclusion, higher rate detection programming is associated with lower risk of shocks and death compared with lower rate or early detection programming. Optimal outcomes were observed in patients programmed with both high rate and delayed detection.
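The shock-free survival comparison above relies on Cox models; as a self-contained illustration of the underlying nonparametric survival estimate such analyses start from, here is a minimal Kaplan-Meier sketch on toy data (not the study's code or data).

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times:  follow-up time for each patient
    events: 1 if the endpoint (e.g. first shock) occurred, 0 if censored
    Returns [(t, S(t))] at each distinct event time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # events at t
        n = sum(1 for tt, e in data if tt == t)             # leaving risk set
        if d:
            s *= 1 - d / at_risk
            curve.append((t, s))
        at_risk -= n
        i += n
    return curve

# Toy cohort of four patients: shocks at t=1 and t=2, censoring at t=2 and t=3
curve = kaplan_meier([1, 2, 2, 3], [1, 1, 0, 0])
# S(1) = 3/4; S(2) = 3/4 * (1 - 1/3) = 1/2
```

The Cox models in the study then compare such curves between the HRDD, HRED, LRDD, and LRED groups after covariate adjustment.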


Subject(s)
Tachycardia/diagnosis , Adolescent , Adult , Aged , Aged, 80 and over , Electric Countershock/methods , Follow-Up Studies , Humans , Incidence , Middle Aged , Retrospective Studies , Risk Factors , Survival Rate/trends , Tachycardia/epidemiology , Tachycardia/therapy , Time Factors , Treatment Outcome , United States/epidemiology , Young Adult
9.
Neuromodulation ; 20(7): 642-649, 2017 Oct.
Article in English | MEDLINE | ID: mdl-28834092

ABSTRACT

OBJECTIVES: Clinical trials of spinal cord stimulation (SCS) have largely focused on conversion from trial to permanent SCS and the first years after implant. This study evaluates the association of type of SCS and patient characteristics with longer-term therapy-related explants. MATERIALS AND METHODS: Implanting centers in three European countries conducted a retrospective chart review of SCS systems implanted from 2010 to 2013. Ethics approval or waiver was obtained, and informed consent was not required. The chart review recorded implants, follow-up visits, and date and reasons for any explants through mid-2016. Results are presented using Cox regression to determine factors associated with explant for inadequate pain relief. RESULTS: Four implanting centers in three countries evaluated 955 implants, with 8720 visits over 2259 years of follow-up. Median age was 53 years; 558 (58%) were female. Explant rate was 7.9% per year. Over half (94 of 180) of explants were for inadequate pain relief, including 32/462 (6.9%) of implants with conventional nonrechargeable SCS, 37/329 (11.2%) with conventional rechargeable and 22/155 (14.2%) with high-frequency (10 kHz) rechargeable SCS. A higher explant rate was found in univariate regression for conventional rechargeable (HR 1.98, p = 0.005) and high-frequency stimulation (HR 1.79, p = 0.035) than nonrechargeable SCS. After covariate adjustment, the elevated explant rate persisted for conventional rechargeable SCS (HR 1.95, p = 0.011), but was not significant for high-frequency stimulation (HR 1.71, p = 0.069). CONCLUSIONS: This international, real-world study found higher explant rates for conventional rechargeable and high-frequency SCS than nonrechargeable systems. The increased rate for conventional rechargeable stimulation persisted after covariate adjustment.


Subject(s)
Chronic Pain/therapy , Device Removal/statistics & numerical data , Adolescent , Adult , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Pain Management/methods , Retrospective Studies , Spinal Cord Stimulation/instrumentation , Young Adult
10.
Neuromodulation ; 20(6): 582-588, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28370724

ABSTRACT

OBJECTIVE: Concordant paresthesia coverage is an independent predictor of pain relief following spinal cord stimulation (SCS). Using aggregate data, our objective was to produce a map of paresthesia coverage as a function of electrode location in SCS. METHODS: This retrospective analysis used x-rays, SCS programming data, and paresthesia coverage maps from the EMPOWER registry of SCS implants for chronic neuropathic pain. Spinal level of dorsal column stimulation was determined by x-ray adjudication and active cathodes in patient programs. Likelihood of paresthesia coverage was determined as a function of stimulating electrode location. Segments of paresthesia coverage were grouped anatomically. Fisher's exact test was used to identify significant differences in likelihood of paresthesia coverage as a function of spinal stimulation level. RESULTS: In the 178 patients analyzed, the most prevalent areas of paresthesia coverage were buttocks, anterior and posterior thigh (each 98%), and low back (94%). Unwanted paresthesia at the ribs occurred in 8% of patients. There were significant differences in the likelihood of achieving paresthesia, with higher thoracic levels (T5, T6, and T7) more likely to achieve low back coverage but also more likely to introduce paresthesia felt at the ribs. Higher levels in the thoracic spine were associated with greater coverage of the buttocks, back, and thigh, and with lesser coverage of the leg and foot. CONCLUSION: This paresthesia atlas uses real-world, aggregate data to determine likelihood of paresthesia coverage as a function of stimulating electrode location. It represents an application of "big data" techniques, and a step toward achieving personalized SCS therapy tailored to the individual's chronic pain.
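Fisher's exact test on a 2×2 coverage-by-level table can be computed directly from the hypergeometric distribution. The counts below are hypothetical, chosen only to illustrate the calculation; they are not registry data.

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact p-value for the table [[a, b], [c, d]].

    Sums hypergeometric probabilities of all tables with the same margins
    that are no more probable than the observed table.
    """
    row1, col1, n = a + b, a + c, a + b + c + d

    def p(x):  # P(top-left cell = x) under the hypergeometric null
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs + 1e-12)

# Hypothetical counts: low-back coverage achieved in 45/50 patients stimulated
# at T5-T7 vs. 28/40 at lower thoracic levels (illustrative only)
p_value = fisher_exact_2x2(45, 5, 28, 12)
```

With these made-up counts the two-sided p-value comes out below 0.05, the kind of level-dependent difference the registry analysis tested for.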


Subject(s)
Atlases as Topic , Electrodes, Implanted , Paresthesia/therapy , Spinal Cord Stimulation/instrumentation , Spinal Cord Stimulation/methods , Adult , Aged , Electrodes, Implanted/standards , Female , Follow-Up Studies , Humans , Male , Middle Aged , Paresthesia/diagnosis , Registries , Retrospective Studies , Spinal Cord Stimulation/standards
11.
J Cardiovasc Electrophysiol ; 28(4): 416-422, 2017 Apr.
Article in English | MEDLINE | ID: mdl-28128491

ABSTRACT

AIMS: Antitachycardia pacing (ATP) is an effective treatment for ventricular tachycardia (VT) and can reduce the frequency of shocks in patients with an implantable cardioverter defibrillator (ICD). The association between survival and ATP, as compared to a shock, has not been confirmed in a large patient population. This study aims to determine if patients with an ICD receiving ATP have lower mortality, as compared to those receiving shock. METHODS: Sixty-nine thousand three hundred and sixty-eight patients underwent ICD implantation between October 2008 and May 2013 and were enrolled in the remote monitoring network Merlin.net™ (St. Jude Medical, St. Paul, MN, USA). Patients were categorized into three groups based on the type of ICD therapy received during follow-up: no therapy (N = 47,927), ATP (N = 8,049), and shock (N = 13,392) groups. Survival was determined by linking implant records to the Social Security Death Index. RESULTS: The no therapy (hazard ratio [HR] 0.60, 95% confidence interval [CI] 0.56-0.64, P < 0.001) and ATP (HR 0.70, 95% CI 0.64-0.77, P < 0.001) groups were associated with a lower mortality risk than the shock group. These results were unaffected by age, gender, device type, atrial fibrillation (AF) burden, or ventricular rate. ATP was effective in 85% of episodes and ATP effectiveness was dependent on the ventricular rate. CONCLUSIONS: Mortality rates were higher in ICD patients who received only ATP compared to no therapy, but ICD patients who received a shock had higher mortality compared to both groups. Furthermore, the data suggest that age, gender, device type, AF burden, and rate of arrhythmia do not change the trend of higher mortality in patients receiving ICD shock compared to ATP alone.


Subject(s)
Defibrillators, Implantable , Electric Countershock/instrumentation , Tachycardia, Ventricular/therapy , Adolescent , Adult , Aged , Aged, 80 and over , Electric Countershock/adverse effects , Electric Countershock/mortality , Female , Heart Rate , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Proportional Hazards Models , Prosthesis Failure , Retrospective Studies , Risk Factors , Tachycardia, Ventricular/diagnosis , Tachycardia, Ventricular/mortality , Tachycardia, Ventricular/physiopathology , Time Factors , Treatment Outcome , Young Adult
12.
JACC Clin Electrophysiol ; 3(2): 129-138, 2017 02.
Article in English | MEDLINE | ID: mdl-29759385

ABSTRACT

OBJECTIVES: The purpose of this study was to compare health care costs associated with repeat ablation of atrial fibrillation (AF) with health care costs associated with a successful first procedure. BACKGROUND: Catheter ablation has become established as a rhythm control strategy for symptomatic paroxysmal and persistent AF. The economic impact of ablation is not completely understood, and it may be affected by repeat procedures performed for recurrent AF. METHODS: The source of data was the MarketScan (Truven Health, Ann Arbor, Michigan) administrative claims dataset from April 2008 to March 2013, including U.S. patients with private and Medicare supplemental insurance. Patients who underwent an outpatient atrial ablation procedure and had a diagnosis of AF were identified. Total health care cost was calculated for 1 year before and after the ablation. Patients were categorized as having undergone a repeat ablation if an additional ablation was performed in the following year. RESULTS: Of 12,027 patients included in the study, repeat ablation was performed in 2,066 (17.2%) within 1 year. Patients with repeat ablation had higher rates of emergency department visits (43.4% vs. 32.2%; p < 0.001) and subsequent hospitalization (35.6% vs. 21.5%; p < 0.001), after excluding hospitalizations for the repeat procedure. Total medical cost was higher for patients with repeat ablation ($52,821 vs. $13,412; p < 0.001), and it remained 46% higher even after excluding the cost associated with additional ablations ($19,621 vs. $13,412; p < 0.001). CONCLUSIONS: Health care costs are significantly higher for patients with a repeat ablation for AF than for patients with only a single ablation procedure, even though both groups have similar baseline characteristics. The increased costs persist even after excluding the cost of the repeat ablation itself. These results emphasize the economic benefit of procedural success in AF ablation.


Subject(s)
Atrial Fibrillation/surgery , Catheter Ablation/methods , Atrial Fibrillation/economics , Catheter Ablation/economics , Costs and Cost Analysis , Electric Countershock/economics , Electric Countershock/statistics & numerical data , Electrocardiography, Ambulatory/economics , Electrocardiography, Ambulatory/statistics & numerical data , Female , Hospitalization/economics , Hospitalization/statistics & numerical data , Humans , Male , Middle Aged , Recurrence , Reoperation/economics , Reoperation/statistics & numerical data , Retrospective Studies , Treatment Outcome
13.
Circulation ; 134(16): 1130-1140, 2016 10 18.
Article in English | MEDLINE | ID: mdl-27754946

ABSTRACT

BACKGROUND: The RATE Registry (Registry of Atrial Tachycardia and Atrial Fibrillation Episodes) is a prospective, outcomes-oriented registry designed to document the prevalence of atrial tachycardia and/or fibrillation (AT/AF) of any duration in patients with pacemakers and implantable cardioverter defibrillators (ICDs) and evaluate associations between rigorously adjudicated AT/AF and predefined clinical events, including stroke. The appropriate clinical response to brief episodes of AT/AF remains unclear. METHODS: Rigorously adjudicated electrogram (EGM) data were correlated with adjudicated clinical events with logistic regression and Cox models. Long episodes of AT/AF were defined as episodes in which the onset and/or offset of AT/AF was not present within a single EGM recording. Short episodes of AT/AF were defined as episodes in which both the onset and offset of AT/AF were present within a single EGM recording. RESULTS: We enrolled 5379 patients with pacemakers (N=3141) or ICDs (N=2238) at 225 US sites (median follow-up 22.9 months). There were 359 deaths. There were 478 hospitalizations among 342 patients for clinical events. We adjudicated 37 531 EGMs; 50% of patients had at least one episode of AT/AF. Patients with clinical events were more likely than those without to have long AT/AF (31.9% vs. 22.1% for pacemaker patients and 28.7% vs. 20.2% for ICD patients; P<0.05 for both groups). Only short episodes of AT/AF were documented in 9% of pacemaker patients and 16% of ICD patients. Patients with clinical events were no more likely than those without to have short AT/AF (5.1% vs. 7.9% for pacemaker patients and 11.5% vs. 10.4% for ICD patients; P=0.21 and 0.66, respectively). CONCLUSIONS: In the RATE Registry, rigorously adjudicated short episodes of AT/AF, as defined, were not associated with increased risk of clinical events compared with patients without documented AT/AF. CLINICAL TRIAL REGISTRATION: URL: http://www.clinicaltrials.gov. Unique identifier: NCT00837798.


Subject(s)
Defibrillators, Implantable , Pacemaker, Artificial , Tachycardia/epidemiology , Tachycardia/etiology , Aged , Aged, 80 and over , Atrial Fibrillation/diagnosis , Atrial Fibrillation/epidemiology , Atrial Fibrillation/etiology , Atrial Fibrillation/therapy , Case-Control Studies , Comorbidity , Disease Management , Disease Progression , Electrocardiography, Ambulatory , Female , Humans , Incidence , Male , Middle Aged , Mortality , Odds Ratio , Population Surveillance , Registries , Tachycardia/diagnosis , Tachycardia/therapy , United States
14.
Neuromodulation ; 19(5): 469-76, 2016 Jul.
Article in English | MEDLINE | ID: mdl-26923728

ABSTRACT

INTRODUCTION: A shorter delay time from chronic pain diagnosis to spinal cord stimulation (SCS) implantation may make it more likely to achieve lasting therapeutic efficacy with SCS. The objective of this analysis was to determine the impact of pain-to-SCS time on patients' post-implant healthcare resource utilization (HCRU). METHODS: A retrospective observational study was performed using a real-world patient cohort derived from the MarketScan® Commercial and Medicare Supplemental claims databases from April 2008 through March 2013. The predictor variable was the time from the first diagnosis of chronic pain to permanent SCS implant. Using multivariable analysis, we studied the impact of pain-to-SCS time on HCRU in the first year post-implant. For some regression tests, patients were grouped into terciles by HCRU. RESULTS: A total of 762 patients met inclusion criteria, with a median pain-to-SCS time of 1.35 years (Q1: 0.8, Q3: 1.9). For every one-year increase in pain-to-SCS time, the odds increased by 33% for being in the high medical expenditures group (upper tercile: $4,133 or above) rather than the low group (lowest tercile: $603 or less). The odds increased by 39% for being in the high opioid prescriptions group (10-58 prescriptions) over the low group (0-1). The odds increased by 44% and 55%, respectively, for being in the high office visits (8-77) or hospitalizations (3-28) group over the low office visits (0-2) or hospitalizations (0) group. CONCLUSIONS: HCRU increased in the year following SCS implantation with longer pain-to-SCS time. These results suggest that considering SCS earlier in the care continuum for chronic pain may improve patient outcomes, with reductions in hospitalizations, clinic visits, and opioid usage.
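Under the logistic model used here, a per-year odds ratio compounds multiplicatively with delay; this is a property of the model worth making explicit, not an additional study result.

```python
def odds_multiplier(or_per_year, years):
    """Multiplicative change in odds implied by a per-year odds ratio."""
    return or_per_year ** years

# OR = 1.33 per year for the high medical expenditure tercile: a two-year
# delay implies roughly 1.33**2 = 1.77x the odds vs. immediate implant
two_year = odds_multiplier(1.33, 2)
```

So the 33% per-year figure above translates to roughly 77% higher odds over a two-year delay, holding the model's other covariates fixed.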


Subject(s)
Chronic Pain/therapy , Health Resources/statistics & numerical data , Spinal Cord Stimulation/methods , Spinal Cord Stimulation/statistics & numerical data , Adult , Aged , Chi-Square Distribution , Cohort Studies , Female , Humans , Male , Middle Aged , Pain Measurement , Regression Analysis , Treatment Outcome
15.
Sci Transl Med ; 7(319): 319ra207, 2015 Dec 23.
Article in English | MEDLINE | ID: mdl-26702095

ABSTRACT

Uncoordinated contraction from electromechanical delay worsens heart failure pathophysiology and prognosis, but restoring coordination with biventricular pacing, known as cardiac resynchronization therapy (CRT), improves both. However, not every patient qualifies for CRT. We show that heart failure with synchronous contraction is improved by inducing dyssynchrony for 6 hours daily by right ventricular pacing using an intracardiac pacing device, in a process we call pacemaker-induced transient asynchrony (PITA). In dogs with heart failure induced by 6 weeks of atrial tachypacing, PITA (starting on week 3) suppressed progressive cardiac dilation as well as chamber and myocyte dysfunction. PITA enhanced β-adrenergic responsiveness in vivo and normalized it in myocytes. Myofilament calcium response declined in dogs with synchronous heart failure, which was accompanied by sarcomere disarray and generation of myofibers with severely reduced function, and these changes were absent in PITA-treated hearts. The benefits of PITA were not replicated when the same number of right ventricular paced beats was randomly distributed throughout the day, indicating that continuity of dyssynchrony exposure is necessary to trigger the beneficial biological response upon resynchronization. These results suggest that PITA could bring the benefits of CRT to the many heart failure patients with synchronous contraction who are not CRT candidates.


Subject(s)
Disease Progression , Heart Failure/pathology , Heart Failure/therapy , Pacemaker, Artificial , Animals , Calcium/metabolism , Dogs , Heart Failure/physiopathology , Myocytes, Cardiac/metabolism , Myocytes, Cardiac/pathology , Myofibrils/metabolism , Proteomics , Receptors, Adrenergic, beta/metabolism , Sarcomeres/metabolism
16.
Int J Neurosci ; 125(7): 475-85, 2015.
Article in English | MEDLINE | ID: mdl-25526555

ABSTRACT

The proceedings of the 2nd Annual Deep Brain Stimulation Think Tank summarize the most contemporary clinical, electrophysiological, and computational work on DBS for the treatment of neurological and neuropsychiatric disease and represent the insights of a unique multidisciplinary ensemble of expert neurologists, neurosurgeons, neuropsychologists, psychiatrists, scientists, engineers and members of industry. Presentations and discussions covered a broad range of topics, including advocacy for DBS, improving clinical outcomes, innovations in computational models of DBS, understanding of the neurophysiology of Parkinson's disease (PD) and Tourette syndrome (TS) and evolving sensor and device technologies.


Subject(s)
Deep Brain Stimulation/methods , International Cooperation , Parkinson Disease/therapy , Tourette Syndrome/therapy , Animals , Brain/physiology , Humans
17.
J Card Fail ; 20(5): 365-72, 2014 May.
Article in English | MEDLINE | ID: mdl-24508810

ABSTRACT

BACKGROUND: Invasively measured maximum increase in left ventricular pressure (LV dP/dtmax) has been used to assess biventricular (BiV) pacing. We quantified extracardiac factors contributing to its variability, and developed a protocol to minimize these effects in an acute pacing experiment. METHODS AND RESULTS: Continuous pressure was recorded by a guidewire sensor placed in the LV. Four to six test pacing interventions were performed, each repeated 3 times and followed by a baseline pacing configuration. Maximum increase in LV dP/dtmax from any measurement of BiV pacing was median 20.3% in 25 patients, compared with BiV pacing off. When directly comparing sequential measurements with BiV pacing on and off, median increase was 7.4%. Noncardiac sources of modulation included respiratory variation (6.4%), drift from first to last baseline measurement (5.0%), and discrepancy among repeated recordings of the same pacing intervention (3.3%). Comparing test interventions to interleaved baseline measurements reduced discrepancy among recordings to 2.1%; P < .001. CONCLUSIONS: With repeated measurements of baseline state, and by comparing test interventions only to baseline measurements performed before and after, it is possible to minimize extracardiac factors and focus on the effects of test pacing interventions.


Subject(s)
Cardiac Resynchronization Therapy/methods , Hemodynamics/physiology , Research Design/standards , Ventricular Dysfunction, Left/physiopathology , Ventricular Dysfunction, Left/therapy , Aged , Female , Humans , Male , Middle Aged , Time Factors , Treatment Outcome , Ventricular Dysfunction, Left/diagnosis
18.
Europace ; 15(7): 984-91, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23447571

ABSTRACT

AIMS: Pacing from multiple sites in the left ventricle (LV) may bring about further resynchronization of the diseased heart compared with biventricular (BiV) pacing. We compared acute haemodynamic response (LV dP/dtmax) of multisite and BiV pacing using a quadripolar LV lead. METHODS AND RESULTS: In 21 patients receiving cardiac resynchronization therapy, a quadripolar LV lead and conventional right atrial and ventricular leads were connected to an external pacing system. A guidewire pressure sensor was placed in the LV for continuous dP/dt measurement. Four multisite pacing configurations were tested three times each and compared with BiV pacing using the distal LV electrode. Nineteen patients had useable haemodynamic data. Median increase in LV dP/dtmax with BiV vs. atrial-only pacing was 8.2% (interquartile range 2.3%, 15.7%). With multisite pacing using distal and proximal LV electrodes, median increase in LV dP/dtmax was 10.2% compared with atrial-only pacing (interquartile range 6.1%, 25.6%). In 16 of 19 patients (84%), two or more of the four multisite pacing configurations increased LV dP/dtmax compared with BiV pacing. Overall, 72% of all tested configurations of multisite pacing produced greater LV dP/dtmax than obtained with BiV pacing. Pacing from most distal and proximal electrodes was the most common optimal configuration, superior to BiV pacing in 74% of patients. CONCLUSION: In the majority of patients, multisite pacing improved acute systolic function further compared with BiV pacing. Pacing with the most distal and proximal electrodes of the quadripolar LV lead most commonly yielded greatest LV dP/dtmax.


Subject(s)
Cardiac Resynchronization Therapy Devices , Cardiac Resynchronization Therapy/methods , Heart Diseases/therapy , Hemodynamics , Ventricular Function, Left , Ventricular Function, Right , Aged , Cardiac Catheterization/instrumentation , Cardiac Catheters , Equipment Design , Female , Heart Diseases/diagnosis , Heart Diseases/physiopathology , Humans , Male , Middle Aged , Predictive Value of Tests , Prospective Studies , Recovery of Function , Time Factors , Transducers, Pressure , Treatment Outcome , Ventricular Pressure
19.
Europace ; 12(5): 751-3, 2010 May.
Article in English | MEDLINE | ID: mdl-20080902

ABSTRACT

A 60-year-old ischaemic patient presented for routine cardiac resynchronization therapy (CRT)-D implantation. An investigational quadripolar left ventricular lead was placed in the posterolateral vein. Phrenic nerve stimulation (PNS) was observed, but it occurred during pacing from only one of the four electrodes. A lead with multiple pacing electrodes is a potential alternative to physical adjustment of the lead or discontinuing CRT when PNS occurs.


Subject(s)
Cardiac Pacing, Artificial/methods , Cardiomyopathies/therapy , Electrodes , Pacemaker, Artificial , Phrenic Nerve/physiology , Cardiomyopathies/physiopathology , Electric Stimulation , Humans , Male , Middle Aged , Ventricular Function, Left/physiology