1.
Ther Adv Cardiovasc Dis ; 18: 17539447241277382, 2024.
Article in English | MEDLINE | ID: mdl-39291696

ABSTRACT

BACKGROUND: Reperfusion injury, characterized by oxidative stress and inflammation, poses a significant challenge in cardiac surgery with cardiopulmonary bypass (CPB). Deferoxamine, an iron-chelating compound, has shown promise in mitigating reperfusion injury by inhibiting iron-dependent lipid peroxidation and reactive oxygen species (ROS) production. OBJECTIVES: The objective of our study was to evaluate the efficacy and safety of deferoxamine for ischemia-reperfusion (I/R) injury. DESIGN: The study was performed according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. DATA SOURCES AND METHODS: We conducted a systematic review following PRISMA guidelines to assess the efficacy and safety of deferoxamine in reducing I/R injury following CPB. A comprehensive search of electronic databases, namely PubMed, Scopus, and Embase, yielded relevant studies published until August 18, 2023. Included studies evaluated ROS production, lipid peroxidation, cardiac performance, and morbidity outcomes. RESULTS: (a) ROS production: Multiple studies demonstrated a statistically significant decrease in ROS production in patients treated with deferoxamine, highlighting its potential to reduce oxidative stress. (b) Lipid peroxidation: Deferoxamine was associated with decreased lipid peroxidation levels, indicating its ability to protect cardiac tissue from oxidative damage during CPB. (c) Cardiac performance: Some studies reported improvements in left ventricular ejection fraction and wall motion score index with deferoxamine. CONCLUSION: Our review shows that deferoxamine is an efficacious and safe drug for preventing myocardial I/R injury following CPB. It also highlights the need for larger-scale trials to develop strategies and guidelines on the use of deferoxamine for I/R injury.


Subject(s)
Cardiopulmonary Bypass , Deferoxamine , Myocardial Reperfusion Injury , Oxidative Stress , Reactive Oxygen Species , Humans , Deferoxamine/adverse effects , Deferoxamine/therapeutic use , Cardiopulmonary Bypass/adverse effects , Myocardial Reperfusion Injury/prevention & control , Oxidative Stress/drug effects , Treatment Outcome , Reactive Oxygen Species/metabolism , Male , Lipid Peroxidation/drug effects , Female , Middle Aged , Aged , Adult , Antioxidants/adverse effects , Antioxidants/administration & dosage
3.
Int J Cardiol ; 411: 132263, 2024 Sep 15.
Article in English | MEDLINE | ID: mdl-38878873

ABSTRACT

BACKGROUND: Atrial fibrillation (AF) increases stroke and mortality in patients with hypertrophic cardiomyopathy (HCM). Cardiac MRI (CMR) is increasingly used to detect late gadolinium enhancement (LGE) as a reliable indicator of left ventricular fibrosis, a potential predisposing factor of AF. Our research explored the correlation between left ventricular LGE and AF prevalence in HCM. METHODS: This retrospective study involved 351 HCM patients who underwent CMR. LGE percentages (0%, 1-5%, 6-14%, ≥15%) on CMR were compared with AF prevalence in HCM patients. Demographic, comorbidity, and imaging data were analyzed using appropriate univariate and multivariate analyses assessing for significant differences in AF prevalence. The predetermined significance level was p < 0.05. RESULTS: CMR demonstrated increased LGE in those with AF (p = 0.004). Increased LGE correlated with increased AF rates: 27.6% (0% LGE), 38.5% (1-5% LGE), 44.4% (6-14% LGE), and 54.7% (≥15% LGE) (p = 0.101, p = 0.043, p = 0.002, respectively, vs. 0% LGE). Adjusted for age, differences persisted and were most evident for LGE >15% (p = 0.001). Multivariate analysis, factoring age, gender, BMI, RVSP, and LVEF, supported LGE (odds ratio of 1.20, p = 0.036) and LAVI (odds ratio 1.05, 1.02-1.07, p < 0.001) as predictive markers for AF prevalence. CONCLUSIONS: Our study suggests a correlation between ventricular LGE and AF in patients with HCM. LGE exceeding 15% was associated with a significant increase in AF prevalence. These patients may require more frequent AF monitoring.
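As a rough illustration of the multivariable model reported above, the sketch below fits a logistic regression for AF prevalence and reports odds ratios with 95% confidence intervals. The variable names and synthetic data are assumptions for illustration, not the study's dataset.

```python
# Minimal sketch: multivariable logistic regression for AF prevalence.
# Column names and synthetic data are illustrative assumptions only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 351
df = pd.DataFrame({
    "lge_pct": rng.gamma(2.0, 4.0, n),   # % LGE on CMR
    "age": rng.normal(55, 12, n),
    "male": rng.integers(0, 2, n),
    "bmi": rng.normal(29, 5, n),
    "rvsp": rng.normal(35, 8, n),
    "lvef": rng.normal(62, 7, n),
    "lavi": rng.normal(40, 10, n),       # left atrial volume index
})
# Synthetic outcome loosely tied to LGE and LAVI, for illustration only
logit = -4 + 0.18 * df["lge_pct"] + 0.05 * df["lavi"]
df["af"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["lge_pct", "age", "male", "bmi", "rvsp", "lvef", "lavi"]])
fit = sm.Logit(df["af"], X).fit(disp=False)

# Odds ratios with 95% confidence intervals
or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table.round(2))
```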


Subject(s)
Atrial Fibrillation , Cardiomyopathy, Hypertrophic , Contrast Media , Gadolinium , Magnetic Resonance Imaging, Cine , Predictive Value of Tests , Humans , Atrial Fibrillation/diagnostic imaging , Cardiomyopathy, Hypertrophic/diagnostic imaging , Cardiomyopathy, Hypertrophic/complications , Male , Female , Middle Aged , Retrospective Studies , Magnetic Resonance Imaging, Cine/methods , Adult , Aged , Heart Ventricles/diagnostic imaging , Heart Ventricles/physiopathology , Heart Ventricles/pathology
4.
Pulm Circ ; 14(2): e12371, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38646412

ABSTRACT

Lung transplantation remains an important therapeutic option for idiopathic pulmonary arterial hypertension (IPAH), yet short-term survival is the poorest among the major diagnostic categories. We sought to develop a prediction model for 90-day mortality using the United Network for Organ Sharing database for adults with IPAH transplanted between 2005 and 2021. Variables with a p value ≤ 0.1 on univariate testing were included in multivariable analysis to derive the best subset model. The cohort comprised 693 subjects, of whom 71 died (10.2%) within 90 days of transplant. Significant independent predictors of early mortality were: extracorporeal circulatory support and/or mechanical ventilation at transplant (OR: 3; CI: 1.4-5), pulmonary artery diastolic pressure (OR: 1.3 per 10 mmHg; CI: 1.07-1.56), forced expiratory volume in the first second percent predicted (OR: 0.8 per 10%; CI: 0.7-0.94), recipient total bilirubin >2 mg/dL (OR: 3; CI: 1.4-7.2) and ischemic time >6 h (OR: 1.7, CI: 1.01-2.86). The predictive model was able to distinguish 25% of the cohort with a mortality of ≥20% from 49% with a mortality of ≤5%. We conclude that recipient variables associated with increasing severity of pulmonary vascular disease, including pretransplant advanced life support, and prolonged ischemic time are important risk factors for 90-day mortality after lung transplant for IPAH.
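The best-subset model described above can be approximated with an exhaustive AIC search over candidate predictors. This is a minimal sketch under assumed variable names and synthetic data, not the authors' code or dataset.

```python
# Minimal sketch: best-subset logistic model for 90-day mortality, selected by AIC.
# Predictor names and synthetic data are assumptions for illustration.
import itertools
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 693
df = pd.DataFrame({
    "ecls_or_mv": rng.binomial(1, 0.15, n),   # ECMO and/or ventilator at transplant
    "pa_diastolic": rng.normal(30, 8, n),
    "fev1_pct": rng.normal(60, 15, n),
    "bili_gt2": rng.binomial(1, 0.1, n),
    "ischemic_gt6h": rng.binomial(1, 0.4, n),
    "age": rng.normal(45, 12, n),
})
lin = -3 + 1.1 * df["ecls_or_mv"] + 0.03 * df["pa_diastolic"] - 0.02 * df["fev1_pct"]
df["died_90d"] = rng.binomial(1, 1 / (1 + np.exp(-lin)))

candidates = [c for c in df.columns if c != "died_90d"]
best_aic, best_subset = np.inf, None
for k in range(1, len(candidates) + 1):
    for subset in itertools.combinations(candidates, k):
        X = sm.add_constant(df[list(subset)])
        fit = sm.Logit(df["died_90d"], X).fit(disp=False)
        if fit.aic < best_aic:
            best_aic, best_subset = fit.aic, subset
print("Best subset by AIC:", best_subset, round(best_aic, 1))
```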

5.
Clin Infect Dis ; 78(6): 1551-1553, 2024 Jun 14.
Article in English | MEDLINE | ID: mdl-38640140

ABSTRACT

Among patients with pathologically proven infective endocarditis, the association of pathogen with occurrence of infection-related glomerulonephritis (IRGN) was examined in 48 case patients with IRGN and 192 propensity score-matched controls. Bartonella was very strongly associated with IRGN (odds ratio, 38.2 [95% confidence interval, 6.7-718.8]; P < .001); other microorganisms were not.


Subject(s)
Endocarditis , Glomerulonephritis , Humans , Glomerulonephritis/microbiology , Male , Female , Middle Aged , Aged , Endocarditis/microbiology , Endocarditis/complications , Adult , Case-Control Studies , Bartonella/isolation & purification , Endocarditis, Bacterial/microbiology
6.
Clin Infect Dis ; 79(2): 405-411, 2024 Aug 16.
Article in English | MEDLINE | ID: mdl-38465901

ABSTRACT

BACKGROUND: The purpose of this study was to evaluate whether the 2023-2024 formulation of the coronavirus disease 2019 (COVID-19) messenger RNA vaccine protects against COVID-19. METHODS: Cleveland Clinic employees in employment when the 2023-2024 formulation of the COVID-19 messenger RNA vaccine became available were included. Cumulative incidence of COVID-19 over the following 17 weeks was examined prospectively. Protection provided by vaccination (analyzed as a time-dependent covariate) was evaluated using Cox proportional hazards regression, with time-dependent coefficients used to separate effects before and after the JN.1 lineage became dominant. The analysis was adjusted for the propensity to get tested, age, sex, pandemic phase when the last prior COVID-19 episode occurred, and the number of prior vaccine doses. RESULTS: Among 48 210 employees, COVID-19 occurred in 2462 (5.1%) during the 17 weeks of observation. In multivariable analysis, the 2023-2024 formula vaccinated state was associated with a significantly lower risk of COVID-19 before the JN.1 lineage became dominant (hazard ratio = .58; 95% confidence interval [CI] = .49-.68; P < .001), and a lower risk that did not reach statistical significance after (hazard ratio = .81; 95% CI = .65-1.01; P = .06). Estimated vaccine effectiveness was 42% (95% CI = 32-51) before the JN.1 lineage became dominant, and 19% (95% CI = -1-35) after. Risk of COVID-19 was lower among those previously infected with an XBB or more recent lineage and increased with the number of vaccine doses previously received. CONCLUSIONS: The 2023-2024 formula COVID-19 vaccine given to working-aged adults afforded modest protection overall against COVID-19 before the JN.1 lineage became dominant, and less protection after.
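The key methodological step above is treating vaccination as a time-dependent covariate. A minimal sketch in start-stop (counting-process) format, with a synthetic cohort and an assumed "true" hazard ratio built into the simulation, might look like the following; vaccine effectiveness is then read off as (1 - HR) x 100. Splitting each person's follow-up at the dose date keeps person-time before vaccination attributed to the unvaccinated state, avoiding immortal-time bias.

```python
# Minimal sketch: Cox regression with vaccination as a time-dependent covariate.
# The cohort, dose dates, and hazards below are synthetic assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(0)
follow_up = 119.0                      # ~17 weeks of follow-up, in days
rows = []
for i in range(1500):
    # ~60% eventually receive the vaccine, at an assumed dose day
    vax_day = float(rng.integers(7, 90)) if rng.random() < 0.6 else None
    day, vaccinated = 0.0, 0
    while day < follow_up:
        pre_dose = vax_day is not None and vaccinated == 0
        seg_end = vax_day if pre_dose else follow_up
        lam = 0.002 * (0.55 if vaccinated else 1.0)    # built-in "true" HR of 0.55
        t_event = day + rng.exponential(1.0 / lam)
        if t_event < seg_end:
            rows.append((i, day, t_event, vaccinated, 1))   # infection ends follow-up
            break
        rows.append((i, day, seg_end, vaccinated, 0))       # interval ends without event
        day, vaccinated = seg_end, 1                        # next interval (if any) is post-dose

df = pd.DataFrame(rows, columns=["id", "start", "stop", "vaccinated", "event"])
ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", start_col="start", stop_col="stop", event_col="event")

hr = float(np.exp(ctv.params_["vaccinated"]))
print(f"Hazard ratio, vaccinated vs. not: {hr:.2f}")
print(f"Implied vaccine effectiveness: {(1 - hr) * 100:.0f}%")
```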


Subject(s)
COVID-19 Vaccines , COVID-19 , SARS-CoV-2 , Humans , COVID-19/prevention & control , COVID-19/epidemiology , COVID-19 Vaccines/immunology , COVID-19 Vaccines/administration & dosage , Female , Male , Adult , SARS-CoV-2/immunology , SARS-CoV-2/genetics , Middle Aged , mRNA Vaccines , Vaccine Efficacy , Prospective Studies , Vaccination , Proportional Hazards Models
7.
J Breath Res ; 18(2)2024 03 28.
Article in English | MEDLINE | ID: mdl-38502958

ABSTRACT

Clostridioides difficile infection (CDI) is the leading cause of hospital-acquired infective diarrhea. Current methods for diagnosing CDI have limitations; enzyme immunoassays for toxin have low sensitivity and Clostridioides difficile polymerase chain reaction cannot differentiate infection from colonization. An ideal diagnostic test that incorporates microbial factors, host factors, and host-microbe interaction might characterize true infection. Assessing volatile organic compounds (VOCs) in exhaled breath may be a useful test for identifying CDI. To identify a wide selection of VOCs in exhaled breath, we used thermal desorption-gas chromatography-mass spectrometry to study breath samples from 17 patients with CDI. Age- and sex-matched patients with diarrhea and negative C. difficile testing (no CDI) were used as controls. Of the 65 VOCs tested, 9 were used to build a quadratic discriminant model that showed a final cross-validated accuracy of 74%, a sensitivity of 71%, a specificity of 76%, and a receiver operating characteristic area under the curve of 0.72. If these findings are proven by larger studies, breath VOC analysis may be a helpful adjunctive diagnostic test for CDI.
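A quadratic discriminant model with cross-validated accuracy and ROC AUC, as described above, could be sketched as follows; the nine VOC features and the synthetic case-control data are assumptions, not the study's breath measurements.

```python
# Minimal sketch: quadratic discriminant analysis on breath VOC features
# with cross-validated accuracy and ROC AUC. Data are synthetic assumptions.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import cross_val_predict, StratifiedKFold
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)
n, n_voc = 34, 9                       # 17 CDI cases + 17 matched controls, 9 selected VOCs
y = np.array([1] * 17 + [0] * 17)      # 1 = CDI, 0 = diarrhea without CDI
X = rng.normal(0, 1, (n, n_voc)) + 0.8 * y[:, None]   # cases shifted, for illustration

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
qda = QuadraticDiscriminantAnalysis(reg_param=0.1)     # regularized; few samples per class
proba = cross_val_predict(qda, X, y, cv=cv, method="predict_proba")[:, 1]
pred = (proba >= 0.5).astype(int)

print(f"Cross-validated accuracy: {accuracy_score(y, pred):.2f}")
print(f"ROC AUC: {roc_auc_score(y, proba):.2f}")
```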


Subject(s)
Volatile Organic Compounds , Humans , Volatile Organic Compounds/analysis , Breath Tests/methods , Gas Chromatography-Mass Spectrometry , ROC Curve , Diarrhea
8.
J Heart Lung Transplant ; 43(1): 134-147, 2024 01.
Article in English | MEDLINE | ID: mdl-37643656

ABSTRACT

BACKGROUND: The study objective was to assess disparities in outcomes in the waitlist and post-heart transplantation (HT) according to socioeconomic status (SES) in the old and new U.S. HT allocation systems. METHODS: Adult HT candidates in the United Network for Organ Sharing database from 2014 through 2021 were included. Old or new system classification was according to listing before or after October 18, 2018. SES was stratified by patient ZIP code and median household income via U.S. Census Bureau and classified into terciles. Competing waitlist outcomes and post-transplantation survival were compared between systems. RESULTS: In total, 26,450 patients were included. Waitlisted candidates with low SES were more frequently younger, female, African American, and with higher body mass index. Reduced cumulative incidence (CI) of HT in the old system occurred in low SES (53.5%) compared to middle (55.7%, p = 0.046), and high (57.9%, p < 0.001). In the new system, the CI of HT was 65.3% in the low SES vs middle (67.6%, p = 0.002) and high (70.2%, p < 0.001), and SES remained significant in the adjusted analysis. In the old system, CI of death/delisting was similar across SES. In the new system, low SES had increased CI of death/delisting (7.4%) vs middle (6%, p = 0.012) and high (5.4%, p = 0.002). The old system showed similar 1-year survival across SES. In the new system, recipients with low SES had decreased 1-year survival (p = 0.041). CONCLUSIONS: SES affects waitlist and post-transplant outcomes. In the new system, all SES had increased access to HT; however, low SES had increased death/delisting due to worsening clinical status and decreased post-transplant survival.
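The waitlist analysis above compares competing outcomes (transplant versus death/delisting). One standard way to estimate the cumulative incidence of transplant in the presence of the competing event is an Aalen-Johansen estimator; the event codes and data below are illustrative assumptions, not the UNOS records.

```python
# Minimal sketch: cumulative incidence of transplant with death/delisting
# as a competing risk, via an Aalen-Johansen estimator. Synthetic data only.
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(0)
n = 1000
durations = rng.exponential(180, n)          # days on the waitlist
# 0 = censored, 1 = transplanted, 2 = died or delisted (competing event)
events = rng.choice([0, 1, 2], size=n, p=[0.25, 0.6, 0.15])

ajf = AalenJohansenFitter()
ajf.fit(durations, events, event_of_interest=1)
print(ajf.cumulative_density_.tail(3))       # cumulative incidence of transplant over time
```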


Subject(s)
Healthcare Disparities , Heart Failure , Heart Transplantation , Social Class , Waiting Lists , Adult , Female , Humans , Black or African American , Incidence , Retrospective Studies , Male
9.
ASAIO J ; 70(1): 22-30, 2024 Jan 01.
Article in English | MEDLINE | ID: mdl-37913499

ABSTRACT

HeartMate 3 is the only durable left ventricular assist device (LVAD) currently implanted in the United States. The purpose of this study was to develop a predictive model for 1-year mortality of patients implanted with a HeartMate 3, comparing standard statistical techniques and machine learning algorithms. Adult patients registered in the Society of Thoracic Surgeons, Interagency Registry for Mechanically Assisted Circulatory Support (STS-INTERMACS) database, who received a primary implant with a HeartMate 3 between January 1, 2017, and December 31, 2019, were included. Epidemiological, clinical, hemodynamic, and echocardiographic characteristics were analyzed. Standard logistic regression and machine learning (elastic net and neural network) were used to predict 1-year survival. A total of 3,853 patients were included. Of these, 493 (12.8%) died within 1 year after implantation. Standard logistic regression identified age, Model for End-Stage Liver Disease (MELD)-XI score, right atrial (RA) pressure, INTERMACS profile, heart rate, and etiology of heart failure (HF) as important predictor factors for 1-year mortality, with an area under the curve (AUC) of 0.72 (0.66-0.77). This predictive model was noninferior to the ones developed using the elastic net or neural network. Standard statistical techniques were noninferior to neural networks and elastic net in predicting 1-year survival after HeartMate 3 implantation. The benefit of using machine-learning algorithms in the prediction of outcomes may depend on the type of dataset used for analysis.
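A hedged sketch of the model comparison described above: standard logistic regression, an elastic-net penalized logistic model, and a small neural network scored by cross-validated AUC. The synthetic features stand in for the registry variables (age, MELD-XI score, RA pressure, INTERMACS profile, and so on) and are not the STS-INTERMACS data.

```python
# Minimal sketch: comparing a standard logistic model, elastic-net logistic
# regression, and a small neural network by cross-validated ROC AUC.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# ~13% 1-year mortality, as in the cohort above; features are synthetic stand-ins
X, y = make_classification(n_samples=3853, n_features=12, n_informative=6,
                           weights=[0.87, 0.13], random_state=0)

models = {
    "logistic": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "elastic net": make_pipeline(StandardScaler(),
                                 LogisticRegression(penalty="elasticnet", solver="saga",
                                                    l1_ratio=0.5, C=1.0, max_iter=5000)),
    "neural net": make_pipeline(StandardScaler(),
                                MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                              random_state=0)),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name:12s} AUC = {auc:.3f}")
```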


Subject(s)
Heart Failure , Heart-Assist Devices , Adult , Humans , United States , Retrospective Studies , Heart Failure/surgery , Heart-Assist Devices/adverse effects , Registries , Treatment Outcome
10.
J Thorac Cardiovasc Surg ; 167(1): 127-140.e15, 2024 01.
Article in English | MEDLINE | ID: mdl-35927083

ABSTRACT

OBJECTIVE: The objectives of this study were to investigate patient characteristics, valve pathology, bacteriology, and surgical techniques related to outcome of patients who underwent surgery for isolated native (NVE) or prosthetic (PVE) mitral valve endocarditis. METHODS: From January 2002 to January 2020, 447 isolated mitral endocarditis operations were performed, 326 for NVE and 121 for PVE. Multivariable analysis of time-related outcomes used random forest machine learning. RESULTS: Staphylococcus aureus was the most common causative organism. Of 326 patients with NVE, 88 (27%) underwent standard mitral valve repair, 43 (13%) extended repair, and 195 (60%) valve replacement. Compared with NVE with standard repair, patients who underwent all other operations were older, had more comorbidities, worse cardiac function, and more invasive disease. Hospital mortality was 3.8% (n = 17); 0 (0%) after standard valve repair, 3 (7.0%) after extended repair, 8 (4.1%) after NVE replacement, and 6 (5.0%) after PVE re-replacement. Survival at 1, 5, and 10 years was 91%, 75%, and 62% after any repair and 86%, 62%, and 44% after replacement, respectively. The most important risk factor for mortality was renal failure. Risk-adjusted outcomes, including survival, were similar in all groups. Unadjusted extended repair outcomes, particularly early, were similar or worse than replacement in terms of reinfection, reintervention, regurgitation, gradient, and survival. CONCLUSIONS: A patient- and pathology-tailored approach to surgery for isolated mitral valve endocarditis has low mortality and excellent results. Apparent superiority of standard valve repair is related to patient characteristics and pathology. Renal failure is the most powerful risk factor. In case of extensive destruction, extended repair shows no benefit over replacement.
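The study above used random forest machine learning for time-related outcomes; as a simplified stand-in, the sketch below fits a random-forest classifier for hospital mortality and ranks variable importances, which is how a dominant risk factor such as renal failure would surface. Variables and data are synthetic assumptions, not the institutional series.

```python
# Minimal sketch: random-forest classification of hospital mortality with
# variable importances. A simplified proxy for the time-related random-forest
# analysis above; all data below are synthetic.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 447
df = pd.DataFrame({
    "renal_failure": rng.binomial(1, 0.15, n),
    "age": rng.normal(60, 13, n),
    "prosthetic_valve": rng.binomial(1, 0.27, n),
    "staph_aureus": rng.binomial(1, 0.3, n),
    "extended_repair": rng.binomial(1, 0.1, n),
})
lin = -4 + 2.0 * df["renal_failure"] + 0.02 * df["age"]
df["hospital_death"] = rng.binomial(1, 1 / (1 + np.exp(-lin)))

X = df.drop(columns="hospital_death")
rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X, df["hospital_death"])
importances = pd.Series(rf.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False).round(3))
```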


Subject(s)
Endocarditis, Bacterial , Endocarditis , Prosthesis-Related Infections , Renal Insufficiency , Humans , Endocarditis, Bacterial/diagnosis , Endocarditis, Bacterial/surgery , Endocarditis, Bacterial/microbiology , Mitral Valve/diagnostic imaging , Mitral Valve/surgery , Mitral Valve/microbiology , Aortic Valve/surgery , Prosthesis-Related Infections/diagnosis , Prosthesis-Related Infections/surgery , Prosthesis-Related Infections/microbiology , Endocarditis/pathology , Treatment Outcome
11.
PLoS One ; 18(11): e0293449, 2023.
Article in English | MEDLINE | ID: mdl-37939032

ABSTRACT

BACKGROUND: The CDC recently defined being "up-to-date" on COVID-19 vaccination as having received at least one dose of a COVID-19 bivalent vaccine. The purpose of this study was to compare the risk of COVID-19 among those "up-to-date" and "not up-to-date". METHODS: Employees of Cleveland Clinic in Ohio, USA, in employment when the COVID-19 bivalent vaccine first became available, and still employed when the XBB lineages became dominant, were included. Cumulative incidence of COVID-19 since the XBB lineages became dominant was compared across the "up-to-date" and "not up-to-date" states, by treating COVID-19 bivalent vaccination as a time-dependent covariate whose value changed on receipt of the vaccine. Risk of COVID-19 by vaccination status was also evaluated using multivariable Cox proportional hazards regression adjusting for propensity to get tested for COVID-19, age, sex, most recent prior SARS-CoV-2 infection, and number of prior vaccine doses. RESULTS: COVID-19 occurred in 1475 (3%) of 48 344 employees during the 100-day study period. The cumulative incidence of COVID-19 was lower in the "not up-to-date" than the "up-to-date" state. On multivariable analysis, being "up-to-date" was not associated with lower risk of COVID-19 (HR, 1.05; 95% C.I., 0.88-1.25; P-value, 0.58). Results were very similar when those 65 years and older were only considered "up-to-date" after 2 doses of the bivalent vaccine. CONCLUSIONS: Since the XBB lineages became dominant, adults "up-to-date" on COVID-19 vaccination by the CDC definition do not have a lower risk of COVID-19 than those "not up-to-date", bringing into question the value of this risk classification definition.


Subject(s)
COVID-19 , Adult , Humans , United States/epidemiology , COVID-19/epidemiology , COVID-19/prevention & control , COVID-19 Vaccines , SARS-CoV-2 , Vaccination , Vaccines, Combined , Centers for Disease Control and Prevention, U.S.
12.
Blood Purif ; 52(7-8): 631-641, 2023.
Article in English | MEDLINE | ID: mdl-37586332

ABSTRACT

INTRODUCTION: Acute kidney injury (AKI) in patients treated with veno-arterial extracorporeal membrane oxygenation (VA-ECMO) is associated with high mortality. The objective of this study was to investigate whether cytokine levels before the initiation of ECMO treatment could predict AKI. We also aimed to investigate the impact of AKI on 30-day and 1-year mortality. METHODS: Serum cytokine levels were analyzed in 100 consecutive VA-ECMO-treated patients at pre-cannulation, at 48 h post-cannulation, and at 8 days. Clinical data to establish the incidence and outcome of AKI after the start of ECMO were retrieved from the local ECMO registry. SETTING: The study was conducted at a tertiary care university hospital. Participants included 100 patients treated with VA-ECMO. INTERVENTIONS: The blood samples for cytokine analysis were collected before VA-ECMO treatment, at 48 h after VA-ECMO treatment was started, and at 8 days. RESULTS: Pre-cannulation serum IL-10 levels were significantly higher in patients who developed AKI (212 [38.9, 620.7]) versus those who did not (49.0 [11.9, 102.2]; p = 0.007), and the development of AKI can be predicted by pre-cannulation IL-10 levels (p = 0.025, OR = 1.2 [1.02-1.32]). The development of AKI during ECMO treatment is associated with increased 30-day mortality (p = 0.049) compared to patients who did not develop AKI and had a pre-cannulation estimated glomerular filtration rate ≥ 45 mL/min. The 1-year survival rate for patients with AKI who survived the first 30 days of ECMO treatment is comparable to that of patients without AKI. CONCLUSION: Increased pre-cannulation IL-10 levels are associated with the development of AKI during VA-ECMO support. AKI is associated with increased 30-day mortality compared to patients with no AKI and better renal function. However, patients with AKI who survive the first 30 days have a 1-year survival rate similar to those without AKI.


Subject(s)
Acute Kidney Injury , Extracorporeal Membrane Oxygenation , Humans , Extracorporeal Membrane Oxygenation/adverse effects , Interleukin-10 , Acute Kidney Injury/etiology , Acute Kidney Injury/therapy , Survival Rate , Catheterization , Retrospective Studies
13.
Open Forum Infect Dis ; 10(6): ofad209, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37274183

ABSTRACT

Background: The purpose of this study was to evaluate whether a bivalent coronavirus disease 2019 (COVID-19) vaccine protects against COVID-19. Methods: The study included employees of Cleveland Clinic in employment when the bivalent COVID-19 vaccine first became available. Cumulative incidence of COVID-19 over the following 26 weeks was examined. Protection provided by vaccination (analyzed as a time-dependent covariate) was evaluated using Cox proportional hazards regression, with change in dominant circulating lineages over time accounted for by time-dependent coefficients. The analysis was adjusted for the pandemic phase when the last prior COVID-19 episode occurred and the number of prior vaccine doses. Results: Among 51 017 employees, COVID-19 occurred in 4424 (8.7%) during the study. In multivariable analysis, the bivalent-vaccinated state was associated with lower risk of COVID-19 during the BA.4/5-dominant (hazard ratio, 0.71 [95% confidence interval, .63-.79]) and the BQ-dominant (0.80 [.69-.94]) phases, but decreased risk was not found during the XBB-dominant phase (0.96 [.82-1.12]). The estimated vaccine effectiveness was 29% (95% confidence interval, 21%-37%), 20% (6%-31%), and 4% (-12% to 18%), during the BA.4/5-, BQ-, and XBB-dominant phases, respectively. The risk of COVID-19 also increased with time since the most recent prior COVID-19 episode and with the number of vaccine doses previously received. Conclusions: The bivalent COVID-19 vaccine given to working-aged adults afforded modest protection overall against COVID-19 while the BA.4/5 lineages were the dominant circulating strains, afforded less protection when the BQ lineages were dominant, and effectiveness was not demonstrated when the XBB lineages were dominant.

14.
Perfusion ; : 2676591231182247, 2023 Jun 07.
Article in English | MEDLINE | ID: mdl-37283140

ABSTRACT

PURPOSE: There is limited research on the use and outcomes of veno-arterial extracorporeal membrane oxygenation (VA-ECMO) treatment for massive pulmonary embolism (PE). This study compared VA-ECMO treatment for massive PE versus patients treated medically. MATERIALS AND METHODS: Patients diagnosed with massive PE at one hospital system were reviewed. VA-ECMO and non-ECMO groups were compared by t test and chi-square test. Mortality risk factors were identified by logistic regression. Survival was assessed by Kaplan-Meier analysis and propensity matching of groups. RESULTS: Ninety-two patients were included (22 VA-ECMO and 70 non-ECMO). Age (OR 1.08, 95% CI 1.03-1.13), arterial SBP (OR 0.97, 95% CI 0.94-0.99), albumin (OR 0.3, 95% CI 0.1-0.8), and phosphorus (OR 2.0, 95% CI 1.4-3.17) were independently associated with 30-day mortality. Alkaline phosphatase (OR 1.03, 95% CI 1.01-1.05) and SOFA score (OR 1.3, 95% CI 1.06-1.51) were associated with 1-year mortality. Propensity matching showed no difference in 30-day (59% VA-ECMO versus 72% non-ECMO, p = 0.363) or 1-year survival (50% VA-ECMO versus 64% non-ECMO, p = 0.355). CONCLUSIONS: Patients treated with VA-ECMO for massive PE and medically treated patients have similar short- and long-term survival. Further research is needed to define clinical recommendations and benefits of intensive therapy such as VA-ECMO in this critically ill population.
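A minimal sketch of the propensity-matched survival comparison described above: estimate the propensity for VA-ECMO, match each ECMO patient to the nearest medically treated control on the propensity score, and compare survival with a log-rank test. The covariates and data are synthetic assumptions, and the 1:1 nearest-neighbour matching here is with replacement for brevity.

```python
# Minimal sketch: propensity-score matching followed by a log-rank comparison.
# Covariates, outcomes, and data are synthetic assumptions.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
n = 92
df = pd.DataFrame({
    "ecmo": np.r_[np.ones(22), np.zeros(70)].astype(int),
    "age": rng.normal(58, 14, n),
    "sbp": rng.normal(95, 20, n),
    "sofa": rng.integers(2, 15, n),
})
df["time_days"] = rng.exponential(200, n).clip(max=365)
df["died"] = rng.binomial(1, 0.4, n)

# Propensity score for receiving VA-ECMO
ps_model = LogisticRegression(max_iter=1000).fit(df[["age", "sbp", "sofa"]], df["ecmo"])
df["ps"] = ps_model.predict_proba(df[["age", "sbp", "sofa"]])[:, 1]

treated = df[df["ecmo"] == 1]
controls = df[df["ecmo"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(controls[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_controls = controls.iloc[idx.ravel()]   # matching with replacement

res = logrank_test(treated["time_days"], matched_controls["time_days"],
                   event_observed_A=treated["died"],
                   event_observed_B=matched_controls["died"])
print(f"Matched pairs: {len(treated)}, log-rank p = {res.p_value:.3f}")
```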

15.
J Heart Lung Transplant ; 42(8): 1059-1071, 2023 08.
Article in English | MEDLINE | ID: mdl-36964083

ABSTRACT

BACKGROUND: Venoarterial extracorporeal membrane oxygenation (VA-ECMO) is a key support modality for cardiogenic shock. The 2018 United Network for Organ Sharing (UNOS) heart transplant allocation algorithm prioritizes VA-ECMO patients. OBJECTIVE: To evaluate the role of VA-ECMO in bridging to advanced heart failure therapies. METHODS: We analyzed adult patients from the multicenter Extracorporeal Life Support Organization registry receiving VA-ECMO for cardiac support or resuscitation between 2016 and 2021 in the United States, comparing bridge-to-transplant (BTT) and non-BTT intent patients, as well as pre- vs post-2018 patients, on a wide range of demographic and clinical outcome predictors. RESULTS: Of 17,087 patients, 797 received left ventricular assist device (LVAD)/heart transplant, 7,931 died or had poor prognosis, and 8,359 had expected recovery at ECMO discontinuation. Patients supported with BTT intent had lower clinical acuity than non-BTT candidates and were more likely to receive LVAD/transplant. The proportion of patients who received VA-ECMO as BTT and received LVAD/transplant increased after 2018. Post-2018 BTT patients had significantly lower clinical acuity and higher likelihood of transplant than both post-2018 non-BTT patients and pre-2018 BTT patients. ECMO complications were associated with lower likelihood of transplant but were significantly less common post-2018 than pre-2018. CONCLUSIONS: After implementation of the 2018 UNOS allocation system, ECMO utilization as a bridge to transplant or LVAD has increased, and the acuity of BTT intent patients cannulated for ECMO has decreased. There has not yet been an increase in more acute ECMO patients receiving transplants. This may partially explain the post-transplant outcomes of ECMO patients in the current era reported in UNOS.


Subject(s)
Extracorporeal Membrane Oxygenation , Heart Failure , Heart Transplantation , Heart-Assist Devices , Adult , Humans , Heart Failure/therapy , Shock, Cardiogenic/therapy , Retrospective Studies
16.
J Infect Dis ; 227(6): 800-805, 2023 03 28.
Article in English | MEDLINE | ID: mdl-36625675

ABSTRACT

BACKGROUND: Severe acute respiratory syndrome coronavirus 2 immunity has declined with subsequent waves and accrual of viral mutations. In vitro studies raise concern for immune escape by BA.4/BA.5, and a study in Qatar showed moderate protection, but these findings have yet to be reproduced. METHODS: This retrospective cohort study included individuals tested for coronavirus disease 2019 by polymerase chain reaction during Delta or BA.1/BA.2 and retested during BA.4/BA.5. The preventable fraction (PF) was calculated as the ratio of the infection to the hospitalization rate for initially positive patients divided by the ratio for initially negative patients, stratified by age and adjusted for age, sex, comorbid conditions, and vaccination using logistic regression. RESULTS: A total of 20 987 patients met inclusion criteria. Prior Delta infection provided no protection against BA.4/BA.5 infection (adjusted PF, 11.9% [95% confidence interval, .8%-21.8%]; P = .04) and minimal protection against hospitalization (10.7% [4.9%-21.7%]; P = .003). In adjusted models, prior BA.1/BA.2 infection provided 45.9% (95% confidence interval, 36.2%-54.1%; P < .001) protection against BA.4/BA.5 reinfection and 18.8% (10.3%-28.3%; P < .001) protection against hospitalization. Up-to-date vaccination provided modest protection against reinfection with BA.4/BA.5 and hospitalization. CONCLUSIONS: Prior infection with BA.1/BA.2 and up-to-date vaccination provided modest protection against infection with BA.4/BA.5 and hospitalization, while prior Delta infection provided minimal protection against hospitalization and none against infection.


Subject(s)
COVID-19 , Hepatitis D , Humans , Reinfection , Retrospective Studies , COVID-19/prevention & control , Hospitalization
17.
J Thorac Cardiovasc Surg ; 165(4): 1303-1315.e9, 2023 04.
Article in English | MEDLINE | ID: mdl-34366128

ABSTRACT

OBJECTIVE: Intrinsic risk of infection of cryopreserved allograft aortic root replacements remains poorly understood despite their long history of use. The objective of this study was to determine this intrinsic risk of allograft infection and its risk factors when allografts are implanted for both nonendocarditis indications and infective endocarditis. METHODS: From January 1987 to January 2017, 2042 patients received 2110 allograft aortic valves at a quaternary medical center, 1124 (53%) for nonendocarditis indications and 986 (47%) for endocarditis indications (670 [68%] prosthetic valve endocarditis). Staphylococcus aureus caused 193 of 949 cases of endocarditis (20%), 71 (7.3%) in persons who injected drugs. Periodic surveillance and cross-sectional follow-up achieved 85% of possible follow-up time. The primary end point was allograft infection in patients with nonendocarditis and endocarditis indications. Risk factors were identified by hazard function decomposition and machine learning. RESULTS: During follow-up, 30 allografts (26 explanted) became infected in patients in the nonendocarditis group and 49 (41 explanted) in patients with endocarditis. At 20 years, the probability of allograft infection was 5.6% in patients in the nonendocarditis group and 14% in patients with endocarditis. Risk factors for allograft infection in patients in the nonendocarditis group were younger patient age and older donor age. Risk factors for allograft infection in patients with endocarditis were earlier implant year, injection drug use, and younger age. In patients with endocarditis, 18% of allograft infections were caused by the original organism. CONCLUSIONS: The low infection rates, both in patients without and with endocarditis, support continued use of allografts in the modern era, in particular for the treatment of invasive endocarditis of the aortic root.


Subject(s)
Endocarditis, Bacterial , Endocarditis , Heart Valve Prosthesis , Humans , Endocarditis, Bacterial/etiology , Cross-Sectional Studies , Heart Valve Prosthesis/adverse effects , Endocarditis/etiology , Allografts
18.
Clin Infect Dis ; 76(3): e142-e147, 2023 02 08.
Article in English | MEDLINE | ID: mdl-35867678

ABSTRACT

BACKGROUND: Previous infection with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) provides strong protection against future infection. There is limited evidence on whether such protection extends to the Omicron variant. METHODS: This retrospective cohort study included 635 341 patients tested for SARS-CoV-2 via polymerase chain reaction from 9 March 2020 to 1 March 2022. Patients were analyzed according to the wave in which they were initially infected. The primary outcome was reinfection during the Omicron period (20 December 2021-1 March 2022). We used a multivariable model to assess the effects of prior infection and vaccination on hospitalization. RESULTS: Among the patients tested during the Omicron wave, 30.6% tested positive. Protection of prior infection against reinfection with Omicron ranged from 18.0% (95% confidence interval [CI], 13.0-22.7) for patients infected in wave 1 to 69.2% (95% CI, 63.4-74.1) for those infected in the Delta wave. In adjusted models, previous infection reduced hospitalization by 28.5% (95% CI, 19.1-36.7), whereas full vaccination plus a booster reduced it by 59.2% (95% CI, 54.8-63.1). CONCLUSIONS: Previous infection offered less protection against Omicron than was observed in past waves. Immunity against future waves will likely depend on the degree of similarity between variants.
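Adjusted protection estimates like those above are commonly derived as (1 - adjusted odds ratio) x 100 from a multivariable logistic model; whether that exact estimator was used here is an assumption, and the variables and data below are synthetic.

```python
# Minimal sketch: protection estimated as (1 - adjusted OR) x 100 from a
# multivariable logistic model. Variables and data are synthetic assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 20000
df = pd.DataFrame({
    "prior_infection": rng.binomial(1, 0.3, n),
    "boosted": rng.binomial(1, 0.4, n),
    "age": rng.normal(50, 16, n),
})
lin = -3.0 - 0.35 * df["prior_infection"] - 0.9 * df["boosted"] + 0.02 * df["age"]
df["hospitalized"] = rng.binomial(1, 1 / (1 + np.exp(-lin)))

fit = smf.logit("hospitalized ~ prior_infection + boosted + age", data=df).fit(disp=False)
for term in ["prior_infection", "boosted"]:
    or_ = np.exp(fit.params[term])
    lo, hi = np.exp(fit.conf_int().loc[term])
    print(f"{term}: protection = {(1 - or_) * 100:.1f}% "
          f"(95% CI {(1 - hi) * 100:.1f} to {(1 - lo) * 100:.1f}%)")
```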


Subject(s)
COVID-19 , Humans , COVID-19/prevention & control , SARS-CoV-2 , Reinfection , Retrospective Studies
19.
Open Forum Infect Dis ; 9(12): ofac601, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36540389

ABSTRACT

Background: Best practice guidelines recommend that patients at risk for sexually transmitted infections (STIs), such as gonorrhea (GC) and chlamydia, should also be tested for human immunodeficiency virus (HIV) and syphilis. This prospective quality assurance study aimed to increase HIV and syphilis testing rates in emergency departments (EDs) across the Cleveland Clinic Health System from January 1, 2020 through January 1, 2022. Methods: A multidisciplinary team of emergency medicine, infectious diseases, pharmacy, and microbiology personnel convened to identify barriers to HIV and syphilis testing during ED encounters at which GC/chlamydia were tested. The following interventions were implemented in response: rapid HIV testing with a new workflow for results follow-up, a standardized STI-screening order panel, and feedback to clinicians about ordering patterns. Results: There were 57 797 ED visits with GC/chlamydia testing completed during the study period. Human immunodeficiency virus testing was ordered at 5% of these encounters before the interventions were implemented and increased to 8%, 23%, and 36% after each successive intervention. Syphilis testing increased from 9% before the interventions to 12%, 28%, and 39% after each successive intervention. In multivariable analyses adjusted for age, gender, and location, the odds ratio for HIV and syphilis testing after all interventions was 11.72 (95% confidence interval [CI], 10.82-12.71; P ≤ .001) and 6.79 (95% CI, 6.34-7.27; P ≤ .001), respectively. Conclusions: The multidisciplinary intervention resulted in improved testing rates for HIV and syphilis.

20.
Clin Infect Dis ; 75(12): 2169-2177, 2022 12 19.
Article in English | MEDLINE | ID: mdl-35476018

ABSTRACT

BACKGROUND: The purpose of this study was to determine whether boosting previously infected or vaccinated individuals with a vaccine developed for an earlier variant of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) protects against the Omicron variant. METHODS: Employees of Cleveland Clinic, previously infected with or vaccinated against coronavirus disease 2019 (COVID-19) and working the day the Omicron variant was declared a variant of concern, were included. The cumulative incidence of COVID-19 was examined over 2 months during an Omicron variant surge. Protection provided by boosting was evaluated using Cox proportional hazards regression. Analyses were adjusted for time since proximate SARS-CoV-2 exposure. RESULTS: Among 39 766 employees, 8037 (20%) previously infected and the remaining previously vaccinated, COVID-19 occurred in 6230 (16%) during the study. Risk of COVID-19 increased with time since proximate SARS-CoV-2 exposure, and boosting protected those >6 months since prior infection or vaccination. In multivariable analysis, boosting was independently associated with lower risk of COVID-19 among those vaccinated but not previously infected (hazard ratio [HR], .43; 95% confidence interval [CI], .41-.46) as well as those previously infected (HR, .66; 95% CI, .58-.76). Among those previously infected, receipt of 2 compared with 1 dose of vaccine was associated with higher risk of COVID-19 (HR, 1.54; 95% CI, 1.21-1.97). CONCLUSIONS: Administering a COVID-19 vaccine not designed for the Omicron variant >6 months after prior infection or vaccination protects against Omicron variant infection. There is no advantage to administering more than 1 dose of vaccine to previously infected persons.


Subject(s)
COVID-19 Vaccines , COVID-19 , Humans , SARS-CoV-2 , Ambulatory Care Facilities