Results 1 - 20 of 97
1.
J Neuromuscul Dis ; 11(2): 425-442, 2024.
Article in English | MEDLINE | ID: mdl-38250783

ABSTRACT

Background: Long-term, real-world effectiveness and safety data of disease-modifying treatments for spinal muscular atrophy (SMA) are important for assessing outcomes and providing information for a larger number and broader range of SMA patients than included in clinical trials. Objective: We sought to describe patients with SMA treated with onasemnogene abeparvovec monotherapy in the real-world setting. Methods: RESTORE is a prospective, multicenter, multinational, observational registry that captures data from a variety of sources. Results: Recruitment started in September 2018. As of May 23, 2022, data were available for 168 patients treated with onasemnogene abeparvovec monotherapy. Median (IQR) age at initial SMA diagnosis was 1 (0-6) month and at onasemnogene abeparvovec infusion was 3 (1-10) months. Eighty patients (47.6%) had two and 70 (41.7%) had three copies of SMN2, and 98 (58.3%) were identified by newborn screening. Infants identified by newborn screening were younger at the final assessment (mean age, 11.5 months) and had a greater mean (SD) final CHOP INTEND score (57.0 [10.0] points) compared with clinically diagnosed patients (23.1 months; 52.1 [8.0] points). All patients maintained or achieved motor milestones. Eighty-one of 167 patients (48.5%) experienced at least one treatment-emergent adverse event (AE), and 31/167 (18.6%) experienced at least one serious AE, of which 8/31 were considered treatment-related. Conclusion: These real-world outcomes support findings from the interventional trial program and demonstrate the effectiveness of onasemnogene abeparvovec across a large patient population, consistent with initial clinical data and published 5-year follow-up data. Observed AEs were consistent with the established safety profile of onasemnogene abeparvovec.


Subject(s)
Biological Products , Muscular Atrophy, Spinal , Recombinant Fusion Proteins , Spinal Muscular Atrophies of Childhood , Infant , Infant, Newborn , Humans , Spinal Muscular Atrophies of Childhood/drug therapy , Prospective Studies , Genetic Therapy , Muscular Atrophy, Spinal/drug therapy , Registries
2.
J Neuromuscul Dis ; 7(2): 145-152, 2020.
Article in English | MEDLINE | ID: mdl-32039859

ABSTRACT

BACKGROUND: Dramatic improvements in spinal muscular atrophy (SMA) treatment have changed the prognosis for patients with this disease, leading to important new questions. Gathering representative, real-world data about the long-term efficacy and safety of emerging SMA interventions is essential to document their impact on patients and caregivers. OBJECTIVES: This registry will assess outcomes in patients with genetically confirmed SMA and provide information on the effectiveness and long-term safety of approved and emerging treatments. DESIGN AND METHODS: RESTORE is a prospective, multicenter, multinational observational registry. Patients will be managed according to usual clinical practice. Both newly recruited SMA treatment centers and sites involved in existing SMA registries, including iSMAC, Treat-NMD, the French SMA Assistance Publique-Hôpitaux de Paris (AP-HP) registry, Cure-SMA, and SMArtCARE, will be eligible to participate de novo; sites already participating in another registry may be included via consortium agreements. Data from patients enrolled in partnering registries will be shared with the RESTORE Registry, and data for newly diagnosed patients will be added upon enrollment. Patients will be enrolled over a 5-year period and followed for 15 years or until death. Assessments will include SMA history and treatment; pulmonary, nutritional, and motor milestones; healthcare resource utilization; work productivity; activity impairment; adverse events; quality of life; caregiver burden; and survival. STATUS: Recruitment started in September 2018. As of January 3, 2020, 64 patients were enrolled at 25 participating sites. CONCLUSIONS: The RESTORE Registry has begun recruiting recently diagnosed patients with genetically confirmed SMA, enabling assessment of both short- and long-term patient outcomes.


Subject(s)
Muscular Atrophy, Spinal , Registries , Research Design , Humans , Muscular Atrophy, Spinal/diagnosis , Muscular Atrophy, Spinal/genetics , Muscular Atrophy, Spinal/physiopathology , Muscular Atrophy, Spinal/therapy , Prospective Studies , Rare Diseases
3.
NPJ Digit Med ; 3: 8, 2020.
Article in English | MEDLINE | ID: mdl-31993506

ABSTRACT

The ability to identify patients who are likely to have an adverse outcome is an essential component of good clinical care. Therefore, predictive risk stratification models play an important role in clinical decision making. Determining whether a given predictive model is suitable for clinical use usually involves evaluating the model's performance on large patient datasets using standard statistical measures of success (e.g., accuracy, discriminatory ability). However, as these metrics correspond to averages over patients who have a range of different characteristics, it is difficult to discern whether an individual prediction on a given patient should be trusted using these measures alone. In this paper, we introduce a new method for identifying patient subgroups where a predictive model is expected to be poor, thereby highlighting when a given prediction is misleading and should not be trusted. The resulting "unreliability score" can be computed for any clinical risk model and is suitable in the setting of large class imbalance, a situation often encountered in healthcare settings. Using data from more than 40,000 patients in the Global Registry of Acute Coronary Events (GRACE), we demonstrate that patients with high unreliability scores form a subgroup in which the predictive model has both decreased accuracy and decreased discriminatory ability.
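The gap between average accuracy and discriminatory ability is easy to demonstrate. A minimal, illustrative sketch (not the paper's unreliability score; all data invented): under heavy class imbalance, a classifier that always predicts "no event" scores high on accuracy yet has only chance-level discrimination.

```python
# Illustrative sketch, not the paper's method: accuracy vs. discrimination
# under class imbalance, using invented data.
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def auc(y_true, scores):
    """Probability a random positive outranks a random negative (AUC)."""
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    pairs = [(p, n) for p in pos for n in neg]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p, n in pairs)
    return wins / len(pairs)

# 5% event rate, roughly the imbalance seen in many clinical registries
y = [1] * 5 + [0] * 95
trivial_scores = [0.0] * 100   # constant risk score: no discrimination
trivial_preds = [0] * 100      # always predict "no event"

print(accuracy(y, trivial_preds))  # 0.95 -- looks excellent
print(auc(y, trivial_scores))      # 0.5  -- chance-level discrimination
```

This is why subgroup-level reliability checks, like those the paper proposes, add information that overall accuracy cannot.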

4.
World J Transplant ; 9(7): 145-157, 2019 Nov 20.
Article in English | MEDLINE | ID: mdl-31850158

ABSTRACT

BACKGROUND: The Karnofsky Performance Status (KPS) scale has been widely validated in clinical practice for over 60 years. AIM: To examine the extent to which poor pre-transplant functional status, assessed using the KPS scale, is associated with increased risk of mortality and/or graft failure at 1 year post-transplantation. METHODS: This study included 38,278 United States adults who underwent first, non-urgent, liver-only transplantation from 2005 to 2014 (Scientific Registry of Transplant Recipients). Functional impairment/disability was categorized as severe, moderate, or none/normal. Analyses were conducted using multivariable-adjusted Cox survival regression models. RESULTS: The median age was 56 years, 31% were women, and the median pre-transplant Model for End-Stage Liver Disease score was 18. Functional impairment was present in 70%; one-quarter of the sample was severely disabled. After controlling for key recipient and donor factors, moderately and severely disabled patients had adjusted hazard ratios for 1-year mortality of 1.32 [95% confidence interval (CI): 1.21-1.44] and 1.73 (95% CI: 1.56-1.91), respectively, compared with patients with no impairment. Subjects with moderate and severe disability also had multivariable-adjusted hazard ratios for 1-year graft failure of 1.13 (95% CI: 1.02-1.24) and 1.16 (95% CI: 1.02-1.31), respectively. CONCLUSION: Pre-transplant functional status is a useful prognostic indicator for 1-year post-transplant patient and graft survival.

5.
Clin Appl Thromb Hemost ; 25: 1076029619880008, 2019.
Article in English | MEDLINE | ID: mdl-31588785

ABSTRACT

Major medical illnesses place patients at risk of venous thromboembolism (VTE). Some risk factors, including age ≥75 years or a history of cancer, place them at increased risk of VTE that extends for at least 5 to 6 weeks following hospital admission. Betrixaban thromboprophylaxis is now approved in the United States for this indication. We estimated the annual number of acutely ill medical patients at extended risk of VTE discharged from US hospitals. Major medical illnesses (stroke, respiratory failure/chronic obstructive pulmonary disease, heart failure, pneumonia, other infections, and rheumatologic disorders) and 2 common risk factors for extended VTE risk, namely age ≥75 years and history of cancer (active or past), were examined in 2014 US hospital discharges using the first 3 discharge diagnosis codes in the National Inpatient Sample (a database of acute-care hospital discharges from the US Agency for Healthcare Research and Quality). In 2014, there were 20.8 million discharges potentially at risk of nonsurgical-related VTE. Overall, 7.2 million (35%) discharges corresponded to major medical illnesses that warranted thromboprophylaxis according to the 2012 American College of Chest Physicians (ACCP) guideline. Among them, 2.79 million patients were aged ≥75 years and 1.36 million had a history of cancer (aged 40-74 years). Overall, 3.48 million discharges were at extended risk of VTE. Many medical inpatients at risk of VTE according to the 2012 ACCP guideline might benefit from awareness of continuing risk, and some of these patients might benefit from extended thromboprophylaxis, depending on the risk of bleeding and comorbidities.


Subject(s)
Patient Discharge , Premedication/methods , Risk Assessment , Venous Thromboembolism/prevention & control , Adult , Aged , Comorbidity , Female , Guideline Adherence , Hemorrhage , Hospitalization , Humans , Male , Middle Aged , United States , Venous Thromboembolism/drug therapy
6.
Am J Med ; 132(5): 588-595, 2019 05.
Article in English | MEDLINE | ID: mdl-30658087

ABSTRACT

BACKGROUND: The annual number of US hospital discharges at risk for venous thromboembolism and the impact of evolving American College of Chest Physicians (ACCP) consensus guidelines for prevention of venous thromboembolism are unknown. METHODS: Three risk-assessment algorithms based on 2004, 2008, and 2012 ACCP guidelines for prevention of venous thromboembolism were applied to the 2014 US National Inpatient Sample to derive estimates of the annual number of US inpatients at risk for venous thromboembolism. RESULTS: Of 35.4 million discharges from US acute-care hospitals in 2014, 25.3 million (71%) met study inclusion criteria of age ≥18 years and length of stay (LOS) ≥2 days. Among 7.5 million patients who underwent a procedure in an operating room, more than 4.4 million (59%) were at ACCP-defined risk for venous thromboembolism, irrespective of which version of the ACCP guidelines applied. With an additional 8.4/8.5/7.3 million eligible discharges meeting criteria for venous thromboembolism prophylaxis due to medical risk factors, the total annual numbers of inpatients at risk for venous thromboembolism were 12.8/12.9/11.7 million according to 2004/2008/2012 ACCP guidelines, respectively. CONCLUSIONS: Over half of adult patients who had an LOS ≥2 days in US acute-care hospitals met ACCP criteria for consideration of venous thromboembolism prophylaxis based on risk factors associated with surgery or acute medical illness. These data provide an objective basis for estimating the potential impact of venous thromboembolism prevention on patient care, together with associated costs, risks, and benefits.
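The screening logic described above (inclusion criteria, then risk flags) amounts to a two-stage filter. A hypothetical sketch with invented field names; the actual NIS variables and ACCP risk algorithms are far more detailed:

```python
# Hypothetical screening sketch: apply inclusion criteria (age >= 18,
# length of stay >= 2 days), then flag discharges meeting surgical or
# medical risk criteria. Field names are invented for illustration.
discharges = [
    {"age": 67, "los_days": 5, "or_procedure": True,  "medical_risk": False},
    {"age": 45, "los_days": 1, "or_procedure": False, "medical_risk": True},
    {"age": 82, "los_days": 9, "or_procedure": False, "medical_risk": True},
    {"age": 17, "los_days": 3, "or_procedure": True,  "medical_risk": False},
]

eligible = [d for d in discharges if d["age"] >= 18 and d["los_days"] >= 2]
at_risk = [d for d in eligible if d["or_procedure"] or d["medical_risk"]]

print(len(eligible), len(at_risk))  # 2 2
```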


Subject(s)
Guideline Adherence/statistics & numerical data , Patient Discharge/statistics & numerical data , Practice Guidelines as Topic , Risk Assessment/methods , Venous Thromboembolism , Algorithms , Eligibility Determination/methods , Eligibility Determination/statistics & numerical data , Female , Humans , Male , Middle Aged , Outcome and Process Assessment, Health Care , Preventive Health Services , Risk Factors , United States/epidemiology , Venous Thromboembolism/diagnosis , Venous Thromboembolism/epidemiology , Venous Thromboembolism/prevention & control
7.
World J Surg ; 42(1): 246-253, 2018 01.
Article in English | MEDLINE | ID: mdl-28744593

ABSTRACT

BACKGROUND: Clostridium difficile infection (CDI) has surpassed methicillin-resistant Staphylococcus aureus as the most common nosocomial infection, with recurrence reaching 30% and the elderly disproportionately affected. We hypothesized that post-discharge antibiotic therapy for continued CDI treatment reduces readmissions. STUDY DESIGN: We queried a 5% random sample of Medicare claims (2009-2011 Part A and Part D; n = 864,604) for hospitalizations with a primary or secondary diagnosis of CDI. We compared demographics, comorbidities, and post-discharge CDI treatment (no CDI treatment, oral metronidazole only, oral vancomycin only, or both) between patients readmitted with a primary diagnosis of CDI within 90 days and patients not readmitted for any reason, using univariate tests of association and multivariable models. RESULTS: Of 7042 patients discharged alive, 945 were readmitted within 90 days with CDI (13%), while 1953 were not readmitted for any reason (28%). Patients discharged on dual therapy had the highest rate of readmission (50%), followed by no post-discharge CDI treatment (43%), vancomycin only (28%), and metronidazole only (19%). Patients discharged on metronidazole only (OR 0.28) or vancomycin only (OR 0.42) had reduced odds of 90-day readmission compared with patients discharged on no CDI treatment. Patients discharged on dual therapy did not differ in odds of readmission. CONCLUSIONS: Thirteen percent of patients discharged with CDI were readmitted within 90 days. Patients discharged on single-drug therapy for CDI had lower readmission rates than patients discharged on no ongoing CDI treatment, suggesting that short-term monotherapy may be beneficial in inducing eradication and preventing relapse. Half of patients requiring dual therapy required readmission, suggesting that patients with symptoms severe enough to warrant discharge on dual therapy may benefit from longer hospitalization.


Subject(s)
Clostridioides difficile , Clostridium Infections/drug therapy , Colitis/drug therapy , Cross Infection/drug therapy , Patient Readmission , Aged , Aged, 80 and over , Anti-Bacterial Agents/therapeutic use , Drug Therapy, Combination , Female , Hospitalization , Humans , Male , Medicare , Metronidazole/therapeutic use , Recurrence , Retrospective Studies , United States , Vancomycin/therapeutic use
8.
Am J Cardiol ; 118(8): 1105-1110, 2016 Oct 15.
Article in English | MEDLINE | ID: mdl-27561191

ABSTRACT

The GRACE Risk Score is a well-validated tool for estimating short- and long-term risk in acute coronary syndrome (ACS). GRACE Risk Score 2.0 substitutes several variables that may be unavailable to clinicians and, thus, limit use of the GRACE Risk Score. GRACE Risk Score 2.0 performed well in the original GRACE cohort. We sought to validate its performance in a contemporary multiracial ACS cohort, in particular in black patients with ACS. We evaluated the performance of the GRACE Risk Score 2.0 simplified algorithm for predicting 1-year mortality in 2,131 participants in Transitions, Risks, and Actions in Coronary Events Center for Outcomes Research and Education (TRACE-CORE), a multiracial cohort of patients discharged alive after an ACS in 2011 to 2013 from 6 hospitals in Massachusetts and Georgia. The median age of study participants was 61 years, 67% were men, and 16% were black. Half (51%) of the patients experienced a non-ST-segment elevation myocardial infarction (NSTEMI) and 18% STEMI. Eighty patients (3.8%) died within 12 months of discharge. The GRACE Risk Score 2.0 simplified algorithm demonstrated excellent model discrimination for predicting 1-year mortality after hospital discharge in the TRACE-CORE cohort (c-index = 0.77). The c-index was 0.94 in patients with STEMI, 0.78 in those with NSTEMI, and 0.87 in black patients with ACS. In conclusion, the GRACE Risk Score 2.0 simplified algorithm for predicting 1-year mortality exhibited excellent model discrimination across the spectrum of ACS types and racial/ethnic subgroups and, thus, may be a helpful tool to guide routine clinical care for patients with ACS.


Subject(s)
Acute Coronary Syndrome , Algorithms , Black or African American/statistics & numerical data , Mortality , Non-ST Elevated Myocardial Infarction , ST Elevation Myocardial Infarction , White People/statistics & numerical data , Age Factors , Aged , Blood Pressure , Cohort Studies , Creatinine/blood , Diuretics/therapeutic use , Female , Georgia , Heart Arrest/epidemiology , Heart Rate , Hospitalization , Humans , Male , Massachusetts , Middle Aged , Proportional Hazards Models , Prospective Studies , Risk Assessment
9.
J Am Coll Surg ; 222(6): 1054-65, 2016 06.
Article in English | MEDLINE | ID: mdl-27178368

ABSTRACT

BACKGROUND: The central tenet of liver transplant organ allocation is to prioritize the sickest patients first. However, a 2007 Centers for Medicare and Medicaid Services regulatory policy, the Conditions of Participation (COP), which mandates publicly reported transplant center performance assessment and outcomes-based auditing, critically altered waitlist management and clinical decision making. We examine the extent to which COP implementation is associated with increased removal of the "sickest" patients from the liver transplant waitlist. STUDY DESIGN: This study included 90,765 adult (aged 18 years and older) deceased donor liver transplant candidates listed at 102 transplant centers from April 2002 through December 2012 (Scientific Registry of Transplant Recipients). We quantified the effect of COP implementation on trends in waitlist removal due to illness severity and 1-year post-transplant mortality using interrupted time series segmented Poisson regression analysis. RESULTS: We observed increasing trends in delisting due to illness severity in the setting of comparable demographic and clinical characteristics. Delisting abruptly increased by 16% at the time of COP implementation, and the likelihood of being delisted continued to increase by 3% per quarter thereafter, without attenuation (p < 0.001). Results remained consistent after stratifying on key variables (ie, Model for End-Stage Liver Disease score and age). The COP did not significantly impact 1-year post-transplant mortality (p = 0.38). CONCLUSIONS: Although the 2007 Centers for Medicare and Medicaid Services COP policy was a quality initiative designed to improve patient outcomes, in reality it failed to show beneficial effects in the liver transplant population. Patients who could potentially benefit from transplantation are increasingly being denied this lifesaving procedure while transplant mortality rates remain unaffected.
Policy makers and clinicians should strive to balance candidate and recipient needs from a population-benefit perspective when designing performance metrics and during clinical decision making for patients on the waitlist.


Subject(s)
Centers for Medicare and Medicaid Services, U.S./standards , Health Care Rationing/standards , Health Policy , Liver Transplantation/trends , Severity of Illness Index , Waiting Lists , Adolescent , Adult , Aged , Aged, 80 and over , Female , Humans , Liver Transplantation/standards , Male , Middle Aged , Poisson Distribution , Retrospective Studies , United States , Young Adult
10.
J Thromb Thrombolysis ; 41(3): 525-38, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26847621

ABSTRACT

Venous thromboembolism (VTE) has multiple risk factors and tends to recur. Despite the benefits of anticoagulation, the prevalence of, and case-fatality rate associated with, recurrent VTE remain a concern after an acute episode; the risk is particularly high during the acute treatment phase. We sought to quantify the magnitude of, identify predictors of, and develop a risk score calculator for recurrence within 3 years after first-time VTE. This was a population-based surveillance study among residents of central Massachusetts (MA), USA, diagnosed with an acute first-time pulmonary embolism and/or lower-extremity deep vein thrombosis from 1999 to 2009 in hospital and ambulatory settings in all 12 central MA hospitals. Medical records were reviewed and validated. The 2989 study patients were followed for 5836 person-years [mean follow-up 23.4 (median 30) months]. Mean age was 64.3 years, 44% were men, and 94% were white. The cumulative incidence rate of recurrent VTE within 3 years after an index VTE was 15% overall, and 25%, 13%, and 13% among patients with active cancer, provoked VTE, or unprovoked VTE, respectively. Multivariable regression indicated that active cancer, varicose vein stripping, and inferior vena cava filter placement were independent predictors of recurrence during both 3-month and 3-year follow-up. A risk score calculator was developed based on the 3-month prognostic model. In conclusion, the rate of VTE recurrence over 3 years of follow-up remained high. The risk score calculator may assist clinicians at the index encounter in determining the frequency of clinical surveillance and appropriate outpatient treatment of VTE during the acute treatment phase.


Subject(s)
Venous Thromboembolism/epidemiology , Aged , Aged, 80 and over , Female , Follow-Up Studies , Humans , Male , Middle Aged , Recurrence , Retrospective Studies , Risk Factors , Venous Thromboembolism/blood
11.
J Bone Miner Res ; 31(7): 1466-72, 2016 07.
Article in English | MEDLINE | ID: mdl-26861139

ABSTRACT

Increased fracture risk has been associated with weight loss in postmenopausal women, but the time course over which this occurs has not been established. The aim of this study was to examine the effects of unintentional weight loss of ≥10 lb (4.5 kg) in postmenopausal women on fracture risk at multiple sites up to 5 years after weight loss. Using data from the Global Longitudinal Study of Osteoporosis in Women (GLOW), we analyzed the relationships between self-reported unintentional weight loss of ≥10 lb at baseline, year 2, or year 3 and incident clinical fracture in the years after weight loss. Complete data were available in 40,179 women (mean age ± SD 68 ± 8.3 years). Five-year cumulative fracture rate was estimated using the Kaplan-Meier method, and adjusted hazard ratios for weight loss as a time-varying covariate were calculated from Cox multiple regression models. Unintentional weight loss at baseline was associated with a significantly increased risk of fracture of the clavicle, wrist, spine, rib, hip, and pelvis for up to 5 years after weight loss. Adjusted hazard ratios showed a significant association between unintentional weight loss and fracture of the hip, spine, and clavicle within 1 year of weight loss, and these associations were still present at 5 years. These findings demonstrate increased fracture risk at several sites after unintentional weight loss in postmenopausal women. This increase is found as early as 1 year after weight loss, emphasizing the need for prompt fracture risk assessment and appropriate management to reduce fracture risk in this population. © 2016 American Society for Bone and Mineral Research.
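The five-year cumulative fracture rate above was estimated with the Kaplan-Meier method. A minimal sketch of that estimator on invented (time, event) follow-up data, ignoring the tie conventions that real survival packages handle:

```python
# Minimal Kaplan-Meier sketch on invented follow-up data:
# (time, event) pairs, event = 1 for fracture, event = 0 for censoring.
def kaplan_meier(data):
    """Return [(event_time, survival_probability)] at each event time."""
    data = sorted(data)
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        events = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, _ in data if tt == t)  # events + censored
        if events:
            surv *= 1 - events / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
        i += removed
    return curve

toy = [(1, 1), (2, 0), (3, 1), (4, 1), (5, 0)]
print(kaplan_meier(toy))
# The cumulative event (fracture) rate at time t is 1 - survival(t).
```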


Subject(s)
Fractures, Bone/epidemiology , Postmenopause , Weight Loss , Aged , Female , Follow-Up Studies , Fractures, Bone/etiology , Humans , Longitudinal Studies , Middle Aged , Risk Assessment , Risk Factors , Time Factors
12.
PLoS One ; 10(3): e0121429, 2015.
Article in English | MEDLINE | ID: mdl-25816146

ABSTRACT

BACKGROUND: The risk of venous thromboembolism (VTE) can be reduced by appropriate use of anticoagulant prophylaxis. VTE prophylaxis does, however, remain substantially underused, particularly among acutely ill medical inpatients. We sought to evaluate the clinical and economic impact of increasing use of American College of Chest Physicians (ACCP)-recommended VTE prophylaxis among medical inpatients from a US healthcare system perspective. METHODS AND FINDINGS: In this retrospective database cost-effectiveness evaluation, a decision-tree model was developed to estimate deaths within 30 days of admission and outcomes attributable to VTE that might have been averted by use of low-molecular-weight heparin (LMWH) or unfractionated heparin (UFH). The incremental cost-effectiveness ratio was calculated using "no prophylaxis" as the comparator. Data from ENDORSE US medical inpatients and the US Nationwide Inpatient Sample (NIS) were used to estimate the annual number of eligible inpatients who failed to receive ACCP-recommended VTE prophylaxis. The cost-effectiveness analysis indicated that VTE-prevention strategies would reduce deaths by 0.5% and 0.3% for the LMWH and UFH strategies versus no prophylaxis, translating into savings of $50,637 and $25,714, respectively, per death averted. The ENDORSE findings indicated that 51.1% of US medical inpatients were at ACCP-defined VTE risk, 47.5% of whom received ACCP-recommended prophylaxis. Extrapolating these findings to the NIS and applying the cost-effectiveness analysis results, full implementation of the ACCP guidelines would reduce the number of deaths (by 15,875 with LMWH or 10,201 with UFH), with estimated cost reductions of $803M for LMWH and $262M for UFH. CONCLUSIONS: Efforts to improve VTE prophylaxis use in acutely ill inpatients are warranted, given the potential for reducing VTE-attributable deaths with net cost savings to healthcare systems.
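The incremental comparison against the "no prophylaxis" comparator reduces to simple arithmetic. A hedged sketch with invented per-1,000-patient figures, not the study's model inputs:

```python
# Hedged sketch of an incremental cost-effectiveness comparison; all
# figures are invented. A negative ratio means the strategy both saves
# money and averts deaths (it "dominates" the comparator).
def incremental_ratio(cost, deaths, cost_ref, deaths_ref):
    """Incremental cost per death averted vs. the reference strategy."""
    deaths_averted = deaths_ref - deaths
    return (cost - cost_ref) / deaths_averted

# Invented per-1,000-patient figures
no_ppx = {"cost": 1_000_000, "deaths": 20}
lmwh = {"cost": 900_000, "deaths": 15}

icer = incremental_ratio(lmwh["cost"], lmwh["deaths"],
                         no_ppx["cost"], no_ppx["deaths"])
print(icer)  # -20000.0 -> saves $20,000 per death averted
```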


Subject(s)
Anticoagulants/therapeutic use , Heparin/therapeutic use , Pre-Exposure Prophylaxis/economics , Venous Thromboembolism/prevention & control , Anticoagulants/economics , Cost-Benefit Analysis , Critical Care/economics , Cross-Sectional Studies , Heparin, Low-Molecular-Weight/therapeutic use , Humans , Pre-Exposure Prophylaxis/methods , Prospective Studies , United States , Venous Thromboembolism/economics
13.
Am J Med ; 128(7): 766-75, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25554379

ABSTRACT

PURPOSE: Short-term outcomes have been well characterized in acute coronary syndromes; however, longer-term follow-up for the entire spectrum of these patients, including ST-segment-elevation myocardial infarction, non-ST-segment-elevation myocardial infarction, and unstable angina, is more limited. Therefore, we describe the longer-term outcomes, procedures, and medication use in Global Registry of Acute Coronary Events (GRACE) hospital survivors undergoing 6-month and 2-year follow-up, and the performance of the discharge GRACE risk score in predicting 2-year mortality. METHODS: Between 1999 and 2007, 70,395 patients with a suspected acute coronary syndrome were enrolled. In 2004, 2-year prospective follow-up was undertaken in those with a discharge acute coronary syndrome diagnosis in 57 sites. RESULTS: From 2004 to 2007, 19,122 (87.2%) patients underwent follow-up; by 2 years postdischarge, 14.3% underwent angiography, 8.7% percutaneous coronary intervention, 2.0% coronary bypass surgery, and 24.2% were re-hospitalized. In patients with 2-year follow-up, acetylsalicylic acid (88.7%), beta-blocker (80.4%), renin-angiotensin system inhibitor (69.8%), and statin (80.2%) therapy was used. Heart failure occurred in 6.3%, (re)infarction in 4.4%, and death in 7.1%. Discharge-to-6-month GRACE risk score was highly predictive of all-cause mortality at 2 years (c-statistic 0.80). CONCLUSION: In this large multinational cohort of acute coronary syndrome patients, there were important later adverse consequences, including frequent morbidity and mortality. These findings were seen in the context of additional coronary procedures and despite continued use of evidence-based therapies in a high proportion of patients. The discriminative accuracy of the GRACE risk score in hospital survivors for predicting longer-term mortality was maintained.


Subject(s)
Acute Coronary Syndrome/epidemiology , Acute Coronary Syndrome/therapy , Angioplasty, Balloon, Coronary/methods , Cause of Death , Coronary Artery Bypass/methods , Registries , Acute Coronary Syndrome/diagnosis , Age Distribution , Aged , Angioplasty, Balloon, Coronary/mortality , Continuity of Patient Care , Coronary Artery Bypass/mortality , Female , Follow-Up Studies , Global Health , Hospital Mortality , Humans , Internationality , Male , Middle Aged , Patient Discharge/statistics & numerical data , Retrospective Studies , Risk Assessment , Severity of Illness Index , Sex Distribution , Survival Analysis , Time Factors , Treatment Outcome
14.
J Vasc Surg ; 61(1): 44-9, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25065583

ABSTRACT

OBJECTIVE: The Agency for Healthcare Research and Quality (AHRQ) Inpatient Quality Indicator (IQI) #11, abdominal aortic aneurysm (AAA) repair mortality rate, is a measure of hospital quality that is publicly reported but has not been externally validated. Because the IQI #11 overall mortality rate includes both intact and ruptured aneurysms and open and endovascular repair, we hypothesized that IQI #11 overall mortality rate does not provide accurate assessment of mortality risk after AAA repair and that AAA mortality cannot be accurately assessed by a single quality measure. METHODS: Using AHRQ IQI software version 4.2, we calculated observed (O) and expected (E) mortality rates for IQI #11 for all hospitals performing more than 10 AAA repairs per year in the Nationwide Inpatient Sample for the years 2007 to 2011. We used the Spearman correlation coefficient to compare expected rates, as determined by IQI #11 overall mortality rate risk-adjustment methodology, with observed rates for all AAA repairs in four cohorts stratified by aneurysm stability (ruptured vs intact) and method of repair (open vs endovascular). RESULTS: Among 187,773 AAA repairs performed at 1268 U.S. hospitals, hospitals' IQI #11 overall expected rates correlated poorly with their observed rates (E: 5.0% ± 4.4% vs O: 6.0% ± 9.8%; r = 0.49). For ruptured AAAs, IQI #11 overall mortality rate methodology underestimated the mortality risk of open repair (E: 34% ± 7.2% vs O: 40.1% ± 38.2%; r = 0.20) and endovascular repair (E: 24.8% ± 9% vs O: 27.3% ± 37.9%; r = 0.08). For intact AAA repair, IQI #11 overall mortality rate methodology underestimated the mortality risk of open repair (E: 4.3% ± 2.4% vs O: 6.3% ± 16.1%; r = 0.24) but overestimated the mortality risk of endovascular repair (E: 1.3% ± 0.8% vs O: 1.1% ± 3.7%; r = 0.25). Hospitals' observed mortality rates after intact AAA repair were not correlated with their mortality rates after ruptured AAA repair (r = 0.03).
CONCLUSIONS: IQI #11 overall mortality rate fails to provide accurate assessment of inpatient mortality risk after AAA repair. Thus, it is inappropriate to use IQI #11 overall mortality rate for quality reporting. The accuracy of separate quality measures that assess mortality risk after repair of ruptured and intact AAAs, stratified by the use of open or endovascular repair, should be examined.
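The Spearman coefficient used above to compare hospitals' expected (E) and observed (O) rates is simply the Pearson correlation of ranks. A small pure-Python sketch on invented toy rates:

```python
# Pure-Python Spearman correlation sketch; the E/O rates are invented.
def rank(xs):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Pearson correlation computed on the ranks of x and y."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

expected = [5.0, 4.4, 6.1, 3.2, 7.0]  # toy E rates (%)
observed = [6.0, 3.9, 9.8, 2.5, 7.5]  # toy O rates (%)
print(round(spearman(expected, observed), 3))  # 0.9
```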


Subject(s)
Aortic Aneurysm, Abdominal/surgery , Aortic Rupture/surgery , Blood Vessel Prosthesis Implantation/mortality , Decision Support Techniques , Endovascular Procedures/mortality , Hospital Mortality , Inpatients , Quality Indicators, Health Care/standards , Aortic Aneurysm, Abdominal/diagnosis , Aortic Aneurysm, Abdominal/mortality , Aortic Rupture/diagnosis , Aortic Rupture/mortality , Blood Vessel Prosthesis Implantation/adverse effects , Blood Vessel Prosthesis Implantation/standards , Databases, Factual , Endovascular Procedures/adverse effects , Endovascular Procedures/standards , Health Services Research , Hospitals/standards , Humans , Predictive Value of Tests , Reproducibility of Results , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome , United States
15.
Am J Med ; 127(9): 829-39.e5, 2014 Sep.
Article in English | MEDLINE | ID: mdl-24813864

ABSTRACT

BACKGROUND: The clinical epidemiology of venous thromboembolism has changed recently because of advances in identification, prophylaxis, and treatment. We sought to describe secular trends in the occurrence of venous thromboembolism among residents of the Worcester, Massachusetts, metropolitan statistical area. METHODS: Population-based methods were used to monitor trends in event rates of first-time or recurrent venous thromboembolism in 5025 Worcester, Massachusetts, metropolitan statistical area residents who were diagnosed with acute pulmonary embolism or lower-extremity deep vein thrombosis during 9 annual periods between 1985 and 2009. Medical records were reviewed by abstractors and validated by clinicians. RESULTS: Age- and sex-adjusted annual event rates for first-time venous thromboembolism increased from 73 (95% confidence interval [CI], 64-82) per 100,000 in 1985/1986 to 133 (CI, 122-143) in 2009, primarily because of an increase in pulmonary embolism. The rate of recurrent venous thromboembolism decreased from 39 (CI, 32-45) in 1985/1986 to 19 (CI, 15-23) in 2003, and then increased to 35 (CI, 29-40) in 2009. There was an increasing trend in using noninvasive diagnostic testing, with approximately half of tests being invasive in 1985/1986 and almost all noninvasive by 2009. CONCLUSIONS: Despite advances in identification, prophylaxis, and treatment between 1985 and 2009, the annual event rate of venous thromboembolism has increased and remains high. Although these increases partially may be due to increased sensitivity of diagnostic methods, especially for pulmonary embolism, they also may imply that current prevention and treatment strategies are less than optimal.


Subject(s)
Pulmonary Embolism/epidemiology , Venous Thromboembolism/epidemiology , Venous Thrombosis/epidemiology , Acute Disease , Adult , Aged , Aged, 80 and over , Cohort Studies , Cost of Illness , Female , Humans , Male , Massachusetts/epidemiology , Middle Aged , Poisson Distribution , Population Surveillance , Regression Analysis
16.
PLoS One ; 9(4): e91755, 2014.
Article in English | MEDLINE | ID: mdl-24717837

ABSTRACT

BACKGROUND: Anticoagulants reduce the risk of venous thromboembolism (VTE) after total joint replacement. However, concern remains that pharmacologic VTE prophylaxis can lead to bleeding, which may affect postoperative complications such as infections and reoperations. METHODS AND FINDINGS: From the Global Orthopedic Registry (GLORY), we reviewed 3,755 patients in the US who underwent elective primary total hip or knee arthroplasty, received either warfarin or low molecular weight heparin (LMWH) as VTE prophylaxis, and had up to 90 days of follow-up after discharge. We compared incidence rates of VTE, infections, and other complications between the LMWH and warfarin groups, and used multivariate analyses with propensity score weighting to generate odds ratios (ORs). Patients receiving LMWH tended to be older and to have higher American Society of Anesthesiologists grades. In contrast, warfarin was used more frequently, and for a longer duration, for hip arthroplasty and among patients with more pre-existing comorbidities (all P<0.02). A weight variable was created from the propensity score to account for differences in covariate distributions. Propensity score-weighted analyses showed no differences in VTE complications. However, compared with warfarin, LMWH was associated with significantly higher rates of bleeding (6.2% vs. 2.1%; OR = 3.82, 95% confidence interval [CI], 2.64 to 5.52), blood transfusion (29.4% vs. 22.0%; OR = 1.75, 95% CI, 1.51 to 2.04), reoperation (2.4% vs. 1.3%; OR = 1.77, 95% CI, 1.07 to 2.93), and infection (1.6% vs. 0.6%; OR = 2.79, 95% CI, 1.42 to 5.45). Similar results were obtained when analyses were restricted to guideline-compliant use of warfarin (26%) and LMWH (62%). Although surgical site infections were mostly superficial, the current study was underpowered to compare incidence rates of deep infections (<1.0%). CONCLUSIONS: Surgical site infections and reoperations in the 3 months following primary total joint arthroplasty may be associated with use of the anticoagulant that exhibited the higher bleeding risk. Long-term complications and deep wound infections remain to be studied.
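The odds ratios with confidence intervals reported in this abstract are the standard effect measure for such registry comparisons. As a minimal illustration of the unadjusted building block (the study itself used propensity-score-weighted multivariate models), the 2x2-table odds ratio with a Woolf log-method confidence interval can be sketched as follows; the counts are hypothetical, not the GLORY data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and 95% CI for a 2x2 table.

    a = exposed with event,   b = exposed without event,
    c = unexposed with event, d = unexposed without event.
    Confidence interval uses the Woolf (log) method; assumes
    all four cells are nonzero.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR): sqrt of summed reciprocal cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 10/100 events in the exposed group vs. 5/100 unexposed
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
```

Because the interval here spans 1, such a result would not be statistically significant; the adjusted ORs in the abstract (e.g., 3.82 with CI 2.64 to 5.52) exclude 1 and therefore are.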


Subject(s)
Anticoagulants/adverse effects , Anticoagulants/therapeutic use , Arthroplasty, Replacement, Hip/adverse effects , Arthroplasty, Replacement, Knee/adverse effects , Postoperative Complications/etiology , Surgical Wound Infection/etiology , Adolescent , Adult , Aged , Cohort Studies , Humans , Middle Aged , Multivariate Analysis , Registries , Treatment Outcome , United States , Young Adult
17.
J Am Coll Surg ; 218(6): 1141-1147.e1, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24755188

ABSTRACT

BACKGROUND: The incidence of community-acquired Clostridium difficile (CACD) is increasing in the United States. Many CACD infections occur in the elderly, who are predisposed to poor outcomes. We aimed to describe the epidemiology and outcomes of CACD in a nationally representative sample of Medicare beneficiaries. STUDY DESIGN: We queried a 5% random sample of Medicare beneficiaries (2009-2011 Part A inpatient and Part D prescription drug claims; n = 864,604) for any hospital admission with a primary ICD-9 diagnosis code for C difficile (008.45). We examined patient sociodemographic and clinical characteristics, preadmission exposure to oral antibiotics, earlier treatment with oral vancomycin or metronidazole, inpatient outcomes (eg, colectomy, ICU stay, length of stay, mortality), and subsequent admissions for C difficile. RESULTS: A total of 1,566 (0.18%) patients were admitted with CACD. Of these, 889 (56.8%) received oral antibiotics within 90 days of admission. Few were being treated with oral metronidazole (n = 123 [7.8%]) or vancomycin (n = 13 [0.8%]) at the time of admission. Although 223 (14%) patients required ICU admission, few (n = 15 [1%]) underwent colectomy. Hospital mortality was 9%. Median length of stay among survivors was 5 days (interquartile range 3 to 8 days). One fifth of survivors were readmitted with C difficile, with a median follow-up time of 393 days (interquartile range 129 to 769 days). CONCLUSIONS: Nearly half of the Medicare beneficiaries admitted with CACD have no recent antibiotic exposure. High mortality and readmission rates suggest that the burden of C difficile on patients and the health care system will increase as the US population ages. Additional efforts at primary prevention and eradication might be warranted.


Subject(s)
Clostridioides difficile , Clostridium Infections/epidemiology , Clostridium Infections/therapy , Medicare , Aged , Aged, 80 and over , Community-Acquired Infections/epidemiology , Community-Acquired Infections/therapy , Female , Humans , Male , Treatment Outcome , United States
18.
Am J Med ; 127(6): 503-11, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24561113

ABSTRACT

BACKGROUND: Current guidelines recommend early oral beta-blocker administration in the management of acute coronary syndromes for patients who are not at high risk of complications. METHODS: Data from patients enrolled between 2000 and 2007 in the Global Registry of Acute Coronary Events (GRACE) were used to evaluate hospital outcomes in 3 cohorts of patients admitted with ST-elevation myocardial infarction, defined by beta-blocker use: early (first 24 hours) intravenous (IV) (± oral), early oral only, or delayed (after the first 24 hours). RESULTS: Among 13,110 patients with ST-elevation myocardial infarction, 21% received any early IV beta-blockers, 65% received only early oral beta-blockers, and 14% received delayed (>24 hours) beta-blockers. Higher systolic blood pressure, higher heart rate, and chronic beta-blocker use were independent predictors of early beta-blocker use. Early beta-blocker use was less likely in older patients, in patients with moderate to severe left ventricular dysfunction, and in those presenting with inferior myocardial infarction or Killip class II or III heart failure. IV and delayed beta-blocker use were associated with higher rates of cardiogenic shock, sustained ventricular fibrillation/ventricular tachycardia, and acute heart failure compared with early oral beta-blocker use. In-hospital mortality was increased with IV beta-blocker use (propensity score-adjusted odds ratio, 1.41; 95% confidence interval, 1.03-1.92) but significantly reduced with delayed beta-blocker administration (propensity-adjusted odds ratio, 0.44; 95% confidence interval, 0.26-0.74). CONCLUSIONS: Early beta-blocker use is common in patients presenting with ST-elevation myocardial infarction, with oral administration being the most prevalent. Early oral beta-blockers were associated with a decreased risk of cardiogenic shock, ventricular arrhythmias, and acute heart failure. However, early receipt of any form of beta-blocker was associated with an increase in hospital mortality.


Subject(s)
Adrenergic beta-Antagonists/administration & dosage , Myocardial Infarction/drug therapy , Administration, Intravenous , Administration, Oral , Adolescent , Adrenergic beta-Antagonists/therapeutic use , Adult , Aged , Aged, 80 and over , Drug Administration Schedule , Electrocardiography , Female , Hospital Mortality , Humans , Logistic Models , Male , Middle Aged , Multivariate Analysis , Myocardial Infarction/complications , Myocardial Infarction/diagnosis , Myocardial Infarction/mortality , Propensity Score , Registries , Treatment Outcome , Young Adult
19.
J Clin Endocrinol Metab ; 99(3): 817-26, 2014 Mar.
Article in English | MEDLINE | ID: mdl-24423345

ABSTRACT

CONTEXT: Several fracture prediction models that combine fractures at different sites into a composite outcome are in current use. However, to the extent that individual fracture sites have differing risk factor profiles, model discrimination is impaired. OBJECTIVE: The objective of the study was to improve model discrimination by developing a 5-year composite fracture prediction model restricted to fracture sites that display similar risk profiles. DESIGN: This was a prospective, observational cohort study. SETTING: The study was conducted at primary care practices in 10 countries. PATIENTS: Women aged 55 years or older participated in the study. INTERVENTION: Self-administered questionnaires collected data on patient characteristics, fracture risk factors, and previous fractures. MAIN OUTCOME MEASURE: The main outcome was time to first clinical fracture of the hip, pelvis, upper leg, clavicle, or spine, each of which exhibits a strong association with advanced age. RESULTS: Of the four composite fracture models considered, model discrimination (c index) was highest for the age-related fracture model (c index of 0.75; 47,066 women) and lowest for the Fracture Risk Assessment Tool (FRAX) major fracture model and a 10-site model (c indices of 0.67 and 0.65). The unadjusted increase in fracture risk for an additional 10 years of age ranged from 80% to 180% for the individual bones in the age-associated model. Five other fracture sites not considered for the age-associated model (upper arm/shoulder, rib, wrist, lower leg, and ankle) had age associations, per additional 10 years of age, ranging from a 10% decrease to a 60% increase. CONCLUSIONS: After examining results for 10 different bone fracture sites, advanced age appeared to be the single best basis for uniting several different sites, resulting in an empirically based composite fracture risk model.


Subject(s)
Fractures, Bone/diagnosis , Fractures, Bone/etiology , Models, Statistical , Osteoporosis, Postmenopausal/diagnosis , Age Factors , Aged , Aged, 80 and over , Cohort Studies , Female , Fractures, Bone/epidemiology , Humans , Longitudinal Studies , Middle Aged , Osteoporosis, Postmenopausal/complications , Osteoporosis, Postmenopausal/epidemiology , Prognosis , Risk Factors
20.
Calcif Tissue Int ; 94(2): 223-31, 2014 Feb.
Article in English | MEDLINE | ID: mdl-24077896

ABSTRACT

Fractures may be associated with higher morbidity in obese postmenopausal women than in nonobese women. We compared health-care utilization, functional status, and health-related quality of life (HRQL) in obese, nonobese, and underweight women with fractures. Information from the GLOW study, started in 2006, was collected at baseline and at 1, 2, and 3 years. In this subanalysis, self-reported incident clinical fractures, health-care utilization, HRQL, and functional status were recorded and examined. Women in GLOW (n = 60,393) were aged ≥55 years, from 723 physician practices at 17 sites in 10 countries. Complete data for fracture and body mass index were available for 90 underweight, 3,270 nonobese, and 941 obese women with one or more incident clinical fractures during the 3-year follow-up. The median hospital length of stay, adjusted for age, comorbidities, and fracture type, was significantly greater in obese than nonobese women (6 vs. 5 days, p = 0.017). Physical function and vitality score were significantly worse in obese than in nonobese women, both before and after fracture; but changes after fracture were similar across groups. Use of antiosteoporosis medication was significantly lower in obese than in nonobese or underweight women. In conclusion, obese women with fracture undergo a longer period of hospitalization for treatment and have poorer functional status and HRQL than nonobese women. Whether these differences translate into higher economic costs and adverse effects on longer-term outcomes remains to be established.


Subject(s)
Health Resources/statistics & numerical data , Obesity/epidemiology , Osteoporosis, Postmenopausal/epidemiology , Osteoporotic Fractures/epidemiology , Quality of Life , Aged , Aged, 80 and over , Body Mass Index , Female , Health Status , Humans , Length of Stay/statistics & numerical data , Longitudinal Studies , Middle Aged , Obesity/complications , Obesity/therapy , Osteoporosis, Postmenopausal/complications , Osteoporosis, Postmenopausal/therapy , Osteoporotic Fractures/complications , Osteoporotic Fractures/therapy , Surveys and Questionnaires