Results 1 - 20 of 87
1.
Perfusion ; 39(3): 525-535, 2024 Apr.
Article in English | MEDLINE | ID: mdl-36595340

ABSTRACT

INTRODUCTION: There are no randomized controlled trials comparing low and high activated partial thromboplastin time (aPTT) targets in heparinized adult veno-arterial (VA) extracorporeal membrane oxygenation (ECMO) patients. Our systematic review and meta-analysis summarized complication rates in adult VA ECMO patients treated with low and high aPTT targets. METHODS: Studies published from January 2000 to May 2022 were identified using PubMed, Embase, Cochrane Library, and LILACS (Latin American and Caribbean Health Sciences Literature). Studies were included if aPTT was primarily used to guide heparin anticoagulation. For the low aPTT group, we included studies where the aPTT goal was ≤60 seconds; for the high aPTT group, we included studies where the aPTT goal was ≥60 seconds. Proportional meta-analysis with a random-effects model was used to calculate pooled complication rates for patients in the two aPTT groups. RESULTS: Twelve studies met inclusion criteria (5 in the low aPTT group and 7 in the high aPTT group). The pooled bleeding complication incidence for low aPTT studies was 53.6% (95% CI = 37.4%-69.4%, I2 = 60.8%) and for high aPTT studies was 43.8% (95% CI = 21.7%-67.1%, I2 = 91.8%). No studies in the low aPTT group reported overall thrombosis incidence, while three studies in the high aPTT group did; the pooled thrombosis incidence for high aPTT studies was 16.1% (95% CI = 9.0%-24.5%, I2 = 13.1%). CONCLUSIONS: Adult ECMO patients managed with low and high aPTT goals appeared to have similar bleeding and other complication rates, further highlighting the need for a randomized controlled trial.
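The pooled incidences above come from a proportional meta-analysis with a random-effects model. A minimal sketch of that computation (DerSimonian-Laird pooling on the logit scale, a common approach for pooling proportions; the event counts below are hypothetical, not the study's data):

```python
import math

def pooled_proportion_re(events, totals):
    """DerSimonian-Laird random-effects pooling of proportions on the
    logit scale; returns the pooled proportion and I^2 (%). Assumes
    0 < events < total per study (else a continuity correction is needed)."""
    # Logit-transform each study proportion; variance = 1/e + 1/(n - e)
    y = [math.log(e / (n - e)) for e, n in zip(events, totals)]
    v = [1.0 / e + 1.0 / (n - e) for e, n in zip(events, totals)]
    w = [1.0 / vi for vi in v]
    # Fixed-effect estimate and Cochran's Q for heterogeneity
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    df = len(y) - 1
    # DerSimonian-Laird between-study variance tau^2
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight by total (within + between) variance and pool
    w_re = [1.0 / (vi + tau2) for vi in v]
    pooled_logit = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    # Back-transform the pooled logit to a proportion
    return 1.0 / (1.0 + math.exp(-pooled_logit)), i2
```

The pooled estimate always falls within the range of the individual study proportions; I^2 quantifies the share of variability attributable to between-study heterogeneity.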


Subject(s)
Extracorporeal Membrane Oxygenation , Thrombosis , Adult , Humans , Partial Thromboplastin Time , Anticoagulants/therapeutic use , Extracorporeal Membrane Oxygenation/adverse effects , Heparin/adverse effects , Thrombosis/etiology , Retrospective Studies
2.
J Thorac Cardiovasc Surg ; 167(5): 1866-1877.e1, 2024 May.
Article in English | MEDLINE | ID: mdl-37156364

ABSTRACT

OBJECTIVE: The influence of Extracorporeal Life Support Organization (ELSO) center of excellence (CoE) recognition on failure to rescue after cardiac surgery is unknown. We hypothesized that ELSO CoE recognition would be associated with improved failure to rescue. METHODS: Patients undergoing a Society of Thoracic Surgeons index operation in a regional collaborative (2011-2021) were included. Patients were stratified by whether their operation was performed at an ELSO CoE. Hierarchical logistic regression analyzed the association between ELSO CoE recognition and failure to rescue. RESULTS: A total of 43,641 patients were included across 17 centers. In total, 807 developed cardiac arrest, with 444 (55%) experiencing failure to rescue after cardiac arrest. Three centers held ELSO CoE recognition and accounted for 4238 patients (9.71%). Before adjustment, operative mortality was equivalent between ELSO CoE and non-ELSO CoE centers (2.08% vs 2.36%; P = .25), as was the rate of any complication (34.5% vs 33.8%; P = .35) and cardiac arrest (1.49% vs 1.89%; P = .07). After adjustment, patients undergoing surgery at an ELSO CoE facility were observed to have 44% decreased odds of failure to rescue after cardiac arrest, relative to patients at non-ELSO CoE facilities (odds ratio, 0.56; 95% CI, 0.316-0.993; P = .047). CONCLUSIONS: ELSO CoE status is associated with improved failure to rescue following cardiac arrest for patients undergoing cardiac surgery. These findings highlight the important role that comprehensive quality programs serve in improving perioperative outcomes in cardiac surgery.


Subject(s)
Extracorporeal Membrane Oxygenation , Heart Arrest , Humans , Extracorporeal Membrane Oxygenation/adverse effects , Heart Arrest/diagnosis , Heart Arrest/etiology , Heart Arrest/therapy , Heart , Retrospective Studies
3.
Article in English | MEDLINE | ID: mdl-38135000

ABSTRACT

OBJECTIVE: Renal failure after cardiac surgery is associated with increased morbidity and mortality. There is a lack of data examining the rate of renal recovery after patients have started dialysis following cardiac surgery. We aimed to determine the frequency of and time to renal recovery in patients requiring dialysis after cardiac surgery. METHODS: All patients who developed new-onset renal failure requiring dialysis following cardiac surgery at our institution from 2011 to 2022 were included. Renal recovery, time to renal recovery, and mortality at 1 year were merged with patients' Society of Thoracic Surgeons Adult Cardiac Surgery Database files. Kaplan-Meier analysis was used to estimate time to renal recovery; we censored patients who died or were lost to follow-up. Cox regression was used for risk adjustment. RESULTS: A total of 312 patients were included in the final analysis. Mortality during index hospital admission was 33% (n = 105), and mortality at 1 year was 45% (n = 141). Of those surviving at 1 year, 69% (n = 118) remained recovered. Median renal recovery time was 56 (37-74) days. Accounting for mortality as a competing risk, 51% of patients were predicted to achieve renal recovery. Increasing age (hazard ratio, 0.98; 0.514-0.94, P < .026) and increasing total packed red blood cells received (hazard ratio, 0.958; 0.937-0.987, P < .001) were found to be significant negative predictors of renal recovery in the Fine-Gray model for subhazard distribution. CONCLUSIONS: More than two-thirds of patients with renal failure who survived the perioperative period had renal recovery within 1 year after surgery. Recovery was driven primarily by postoperative complications rather than comorbidities and intraoperative factors, suggesting renal failure in the postoperative cardiac surgery patient surviving to discharge is unlikely to be permanent.
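The time-to-recovery estimate above rests on the Kaplan-Meier product-limit method with censoring of deaths and losses to follow-up. A minimal sketch (the times and event flags below are illustrative, not study data):

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the 'survival' (here: not-yet-recovered)
    curve; events[i] = 1 if the endpoint (e.g. renal recovery) occurred
    at times[i], 0 if the patient was censored at that time."""
    data = sorted(zip(times, events))  # order observations by time
    n = len(data)
    s, curve, at_risk, i = 1.0, [], n, 0
    while i < n:
        t = data[i][0]
        # Events and total subjects leaving the risk set at time t
        d = sum(e for tt, e in data if tt == t and e)
        removed = sum(1 for tt, _ in data if tt == t)
        if d:
            s *= 1 - d / at_risk
            curve.append((t, s))
        at_risk -= removed
        i += removed  # safe: equal times are contiguous after sorting
    return curve
```

Note that, as the abstract's competing-risk analysis implies, treating death as simple censoring would overestimate recovery; a Fine-Gray or Aalen-Johansen estimator handles that properly.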

4.
J Am Heart Assoc ; 12(17): e029406, 2023 09 05.
Article in English | MEDLINE | ID: mdl-37589123

ABSTRACT

Background Adults undergoing heart surgery are particularly vulnerable to respiratory complications, including COVID-19. Immunization can significantly reduce this risk; however, the effect of cardiopulmonary bypass (CPB) on immunization status is unknown. We sought to evaluate the effect of CPB on COVID-19 vaccination antibody concentration after cardiac surgery. Methods and Results This prospective observational clinical trial evaluated adult participants undergoing cardiac surgery requiring CPB at a single institution. All participants received a full primary COVID-19 vaccination series before CPB. SARS-CoV-2 spike protein-specific antibody concentrations were measured before CPB (pre-CPB measurement), 24 hours following CPB (postoperative day 1 measurement), and approximately 1 month following the procedure. Relationships between demographic or surgical variables and change in antibody concentration were assessed via linear regression. A total of 77 participants were enrolled in the study and underwent surgery. Among all participants, mean antibody concentration was significantly decreased on postoperative day 1, relative to pre-CPB levels (-2091 AU/mL, P<0.001). Antibody concentration increased between postoperative day 1 and the 1-month post-CPB measurement (2465 AU/mL, P=0.015). Importantly, no significant difference was observed between pre-CPB and 1-month post-CPB concentrations (P=0.983). Two participants (2.63%) developed symptomatic COVID-19 pneumonia postoperatively; 1 case of postoperative COVID-19 pneumonia resulted in mortality (1.3%). Conclusions COVID-19 vaccine antibody concentrations were significantly reduced in the short term following CPB but returned to pre-CPB levels within 1 month. One case of postoperative COVID-19 pneumonia-specific mortality was observed. These findings suggest the need for heightened precautions in the perioperative period for cardiac surgery patients.


Subject(s)
COVID-19 Vaccines , COVID-19 , Adult , Humans , COVID-19 Vaccines/adverse effects , SARS-CoV-2 , Cardiopulmonary Bypass/adverse effects , COVID-19/prevention & control , Vaccination , Antibodies
5.
J Surg Res ; 291: 67-72, 2023 11.
Article in English | MEDLINE | ID: mdl-37352738

ABSTRACT

INTRODUCTION: Deep sternal wound infection (DSWI) is a rare complication associated with high mortality. Seasonal variability in surgical site infections has been demonstrated; however, these patterns have not been examined for DSWI. The purpose of this study was to assess temporal clustering of DSWIs. METHODS: All cardiac surgery patients who underwent sternotomy were queried from a regional Society of Thoracic Surgeons database of 17 centers from 2001 to 2019. All patients with the diagnosis of DSWI were then identified. Cluster analysis was performed at varying time intervals (monthly, quarterly, and yearly) at the hospital and regional level. DSWI rates were calculated by year and month, and compared using mixed-effects negative binomial regression. RESULTS: A total of 134,959 patients underwent a sternotomy for cardiac surgery, of whom 469 (0.35%) developed a DSWI. Rates of DSWI per hospital across all years ranged from 0.12% to 0.69%. Collaborative-level rates of DSWIs were the greatest in September (0.44%) and the lowest in January (0.30%). Temporal clustering was not seen across seasonal quarters (a high rate in the preceding quarter was not associated with a high rate in the next quarter) (P = .39). There were yearly differences across all institutions in the DSWI rates. A downward trend in DSWI rates was seen from 2001 to 2019 (P < .001). A difference among hospitals in the cohort was observed (P < .001). CONCLUSIONS: DSWIs are rare events within our region. Unlike other surgical site infections, there does not appear to be a seasonal pattern associated with DSWI.


Subject(s)
Cardiac Surgical Procedures , Humans , Risk Factors , Cardiac Surgical Procedures/adverse effects , Sternum/surgery , Surgical Wound Infection/epidemiology , Surgical Wound Infection/etiology , Cluster Analysis , Retrospective Studies
6.
Article in English | MEDLINE | ID: mdl-37211243

ABSTRACT

OBJECTIVE: Our understanding of the impact of a center's case volume on failure to rescue (FTR) after cardiac surgery is incomplete. We hypothesized that increasing center case volume would be associated with lower FTR. METHODS: Patients undergoing a Society of Thoracic Surgeons index operation in a regional collaborative (2011-2021) were included. After we excluded patients with missing Society of Thoracic Surgeons Predicted Risk of Mortality scores, patients were stratified by mean annual center case volume. The lowest quartile of case volume was compared with all other patients. Logistic regression analyzed the association between center case volume and FTR, adjusting for patient demographics, race, insurance, comorbidities, procedure type, and year. RESULTS: A total of 43,641 patients were included across 17 centers during the study period. Of these, 5315 (12.2%) developed an FTR complication, and 735 of them (13.8%) experienced FTR. Median annual case volume was 226, with 25th and 75th percentile cutoffs of 136 and 284 cases, respectively. Increasing center-level case volume was associated with significantly greater center-level major complication rates but lower mortality and FTR rates (all P values < .01). Observed-to-expected FTR was significantly associated with case volume (P = .040). Increasing case volume was independently associated with decreasing FTR rate in the final multivariable model (odds ratio, 0.87 per quartile; 95% CI, 0.799-0.946; P = .001). CONCLUSIONS: Increasing center case volume is significantly associated with improved FTR rates. Assessment of low-volume centers' FTR performance represents an opportunity for quality improvement.

7.
JTCVS Open ; 13: 218-231, 2023 Mar.
Article in English | MEDLINE | ID: mdl-37063148

ABSTRACT

Objectives: The 2018 change in the heart transplant allocation system resulted in greater use of temporary mechanical circulatory support. We hypothesized that the allocation change has increased hospital resource utilization, including length of stay and cost. Methods: All heart transplant patients within a regional Society of Thoracic Surgeons database were included (2012-2020). Patients were stratified relative to the transplant allocation change into early (January 2012-September 2018) and late (November 2018-June 2020) eras. Costs were adjusted for inflation and presented in 2020 dollars. Results: Of 535 heart transplants, there were 410 early-era and 125 late-era patients. Baseline characteristics were similar, except for a higher prevalence of lung and valvular disease in the late era. Fewer patients in the late era were bridged with durable left ventricular assist devices (69% vs 31%; P < .0001) or biventricular devices (5% vs 1%; P = .047), and more with temporary mechanical circulatory support (4% vs 46%; P < .0001). There was no difference in early mortality (6% vs 4%; P = .33) or major morbidity (57% vs 61%; P = .40). Preoperative length of stay was longer in the late era (1 vs 9 days; P < .0001), but postoperative length of stay did not differ. There was no difference in median total hospital cost ($132,465 vs $128,996; P = .15), although there was high variability. On multivariable regression, preoperative extracorporeal membrane oxygenation utilization was the main driver of resource utilization. Conclusions: The new heart transplant allocation system has resulted in different bridging techniques, with greater reliance on temporary mechanical circulatory support. Although this was associated with an increase in preoperative length of stay, it did not translate into increased hospital cost.

8.
J Surg Res ; 286: 49-56, 2023 06.
Article in English | MEDLINE | ID: mdl-36753949

ABSTRACT

INTRODUCTION: Pulmonary hypertension (PHT) is a known risk factor for adverse outcomes after coronary artery bypass grafting (CABG), though its effect is less well understood in valve operations. We hypothesized PHT is associated with lower risk during mitral valve operations compared with CABG. METHODS: Patients undergoing isolated mitral valve or CABG operations (2011-2019) in a regional Society of Thoracic Surgeons (STS) database were stratified by pulmonary artery systolic pressure (PASP). The association of PASP by procedure type was assessed by hierarchical regression modeling, adjusting for STS predicted risk scores. RESULTS: Of the 2542 mitral and 11,059 CABG patients, the mitral population had a higher mean STS risk of mortality (3.6% versus 2.4%, P < 0.0001) and median PASP (42 mmHg versus 32 mmHg, P < 0.0001). PASP was independently associated with operative mortality and major morbidity in both mitral and CABG patients. However, for mitral patients a 10-mmHg increase in PASP was associated with a smaller increase in the odds of morbidity (odds ratio: 1.06 versus 1.13) and mortality (odds ratio: 1.11 versus 1.18) and a smaller increase in intensive care unit time (4.3 versus 7.6 h) compared with CABG patients (interaction terms P < 0.0001). Among mitral patients, median PASP was higher in stenotic versus regurgitant disease (57 mmHg versus 40 mmHg, P < 0.0001). However, there was no differential association of PASP with morbidity or mortality (interaction terms P > 0.05). CONCLUSIONS: Although mitral surgery patients tend to have higher preoperative pulmonary artery pressures, PHT was associated with a lower risk for mitral outcomes compared with CABG. Further research on the management and optimization of patients with PHT perioperatively is needed to improve care for these patients.


Subject(s)
Cardiac Surgical Procedures , Heart Valve Prosthesis Implantation , Hypertension, Pulmonary , Mitral Valve Insufficiency , Humans , Mitral Valve/surgery , Hypertension, Pulmonary/epidemiology , Hypertension, Pulmonary/etiology , Coronary Artery Bypass/adverse effects , Coronary Artery Bypass/methods , Risk Factors , Treatment Outcome , Heart Valve Prosthesis Implantation/methods , Mitral Valve Insufficiency/surgery
9.
J Heart Lung Transplant ; 42(7): 880-887, 2023 07.
Article in English | MEDLINE | ID: mdl-36669942

ABSTRACT

BACKGROUND: Employment is an important metric of post-transplant functional status and quality of life, yet it remains poorly described after heart transplant. We sought to characterize the prevalence of employment following heart transplantation and identify patients at risk for post-transplant unemployment. METHODS: Adults undergoing single-organ heart transplantation (2007-2016) were evaluated using the UNOS database. Univariable analysis was performed after stratifying by employment status at 1 year post-transplant. Fine-Gray competing risk regression was used for risk adjustment. Cox regression evaluated the association of employment status at 1 year with mortality. RESULTS: Of 10,132 heart transplant recipients who survived to 1 year and had follow-up, 22.0% were employed 1 year post-transplant. The employment rate of survivors increased to 32.9% by year 2. Employed individuals were more likely to be white (70.8% vs 60.4%, p < 0.01), male (79.6% vs 70.7%, p < 0.01), employed at listing/transplant (37.6% vs 7.6%, p < 0.01), and privately insured (79.1% vs 49.5%, p < 0.01). Several characteristics were independently associated with employment, including age, employment status at time of listing or transplant, race and ethnicity, gender, insurance status, education, and postoperative complications. Of 1,657 (14.0%) patients employed pretransplant, 58% were working at 1 year. Employment at 1 year was independently associated with mortality, with employed individuals having a 26% decreased risk of mortality. CONCLUSION: Over 20% of heart transplant patients were employed at 1 year and over 30% at 2 years, while 58% of those working pretransplant had returned to work by 1 year. While the major predictor of post-transplant employment is preoperative employment status, our study highlights the impact of social determinants of health.


Subject(s)
Heart Transplantation , Kidney Transplantation , Adult , Humans , Male , United States/epidemiology , Quality of Life , Employment , Unemployment
10.
Ann Thorac Surg ; 115(6): 1511-1518, 2023 06.
Article in English | MEDLINE | ID: mdl-36696937

ABSTRACT

BACKGROUND: Increasing socioeconomic distress has been associated with worse cardiac surgery outcomes. The extent to which the pandemic affected cardiac surgical access and outcomes remains unknown. We sought to examine the relationship between the COVID-19 pandemic and outcomes after cardiac surgery by socioeconomic status. METHODS: All patients undergoing a Society of Thoracic Surgeons (STS) index operation in a regional collaborative, the Virginia Cardiac Services Quality Initiative (2011-2022), were analyzed. Patients were stratified by timing of surgery before vs during the COVID-19 pandemic (March 13, 2020). Hierarchical logistic regression assessed the relationship between the pandemic and operative mortality, major morbidity, and cost, adjusting for the Distressed Communities Index (DCI), STS predicted risk of mortality, intraoperative characteristics, and hospital random effect. RESULTS: A total of 37,769 patients across 17 centers were included. Of these, 7269 patients (19.7%) underwent surgery during the pandemic. On average, patients during the pandemic were less socioeconomically distressed (DCI 37.4 vs DCI 41.9; P < .001) and had a lower STS predicted risk of mortality (2.16% vs 2.53%, P < .001). After risk adjustment, the pandemic was significantly associated with increased mortality (odds ratio 1.398; 95% CI, 1.179-1.657; P < .001), cost (+$4823, P < .001), and STS failure to rescue (odds ratio 1.37; 95% CI, 1.10-1.70; P = .005). The negative impact of the pandemic on mortality and cost was similar regardless of DCI. CONCLUSIONS: Across all socioeconomic statuses, the pandemic is associated with higher cost and greater risk-adjusted mortality, perhaps related to a resource-constrained health care system. More patients during the pandemic were from less distressed communities, raising concern for access to care in distressed communities.


Subject(s)
COVID-19 , Cardiac Surgical Procedures , Humans , Pandemics , Retrospective Studies , COVID-19/epidemiology , Social Class , Postoperative Complications/epidemiology
11.
Ann Thorac Surg ; 115(4): 922-928, 2023 04.
Article in English | MEDLINE | ID: mdl-35093386

ABSTRACT

BACKGROUND: Racial disparities in outcomes after cardiac surgery are well reported. We sought to determine whether variation by race exists in controllable practices during coronary artery bypass graft surgery (CABG). We hypothesized that racial disparities exist in CABG quality metrics but have improved over time. METHODS: All patients undergoing isolated CABG (2000 to 2019) in a multistate database were stratified into three eras by race. Analysis included propensity-matched White Americans and Black Americans. Primary outcomes included left internal mammary artery use, multiple arterial grafting, revascularization completeness, and guideline-directed medication prescription. RESULTS: Of 72 248 patients undergoing CABG, Black American patients (n = 10 270, 15%) had higher rates of diabetes mellitus, hypertension, prior stroke, and myocardial infarction. After matching, 19 806 patients (n = 9903 per group) were well balanced. Left internal mammary artery use was significantly different early (era 1, Black Americans 84.7% vs White Americans 86.6%; P = .03) but equalized over time. Importantly, multiarterial grafting differed between Black Americans and White Americans over the entire study (9.1% vs 11.5%, P < .001) and within each era. Black Americans had more incomplete revascularization during the study period (14% vs 12.8%, P = .02), driven by a large disparity in era 1 (9.5% vs 7.2%, P < .001). Despite similar rates of preoperative use, Black Americans were more often discharged on a regimen of β-blockers (91.8% vs 89.6%, P < .001). CONCLUSIONS: Coronary artery bypass graft surgery metrics of left internal mammary artery use and optimal medical therapy have improved over time and are similar across patient race. Black Americans undergo multiarterial grafting less frequently and receive discharge β-blocker prescriptions more often. Identifying changes in controllable CABG quality practices across races supports a continued focus on standardizing such efforts.


Subject(s)
Coronary Artery Bypass , Coronary Artery Disease , Humans , Black or African American , Coronary Artery Bypass/methods , Coronary Artery Disease/surgery , Retrospective Studies , Treatment Outcome , White
12.
Ann Thorac Surg ; 115(4): 914-921, 2023 04.
Article in English | MEDLINE | ID: mdl-35868555

ABSTRACT

BACKGROUND: The influence of socioeconomic determinants of health on choice of percutaneous coronary intervention (PCI) vs coronary artery bypass grafting (CABG) for coronary artery disease is unknown. We hypothesized that higher Distressed Communities Index (DCI) scores, a comprehensive socioeconomic ranking by zip code, would be associated with more frequent PCI. METHODS: All patients undergoing isolated CABG or PCI in a regional American College of Cardiology CathPCI registry and The Society of Thoracic Surgeons database (2018-2021) were assigned DCI scores (0 = no distress, 100 = severe distress) based on education level, poverty, unemployment, housing vacancies, median income, and business growth. Patients who presented with ST-segment elevation myocardial infarction or emergent procedures were excluded. The most distressed quintile (DCI ≥80) was compared with all other patients. Multivariable logistic regression analyzed the association between DCI and procedure type. RESULTS: A total of 23 223 patients underwent either PCI (n = 16 079) or CABG (n = 7144) for coronary artery disease across 28 centers during the study period. Before adjustment, high socioeconomic distress occurred more frequently among CABG patients (DCI ≥80, 12.4% vs 8.42%; P < .001). After multivariable adjustment, high socioeconomic distress was associated with greater odds of receiving PCI, relative to CABG (odds ratio 1.26; 95% CI, 1.07-1.49; P = .007). High socioeconomic distress was significantly associated with postprocedural mortality (odds ratio 1.52; 95% CI, 1.02-2.26; P = .039). CONCLUSIONS: High socioeconomic distress is associated with greater risk-adjusted odds of receiving PCI, relative to CABG, as well as higher postprocedural mortality. Targeted resource allocation in high DCI areas may help eliminate barriers to CABG.


Subject(s)
Coronary Artery Disease , Percutaneous Coronary Intervention , Humans , Coronary Artery Disease/surgery , Percutaneous Coronary Intervention/adverse effects , Risk Factors , Coronary Artery Bypass/adverse effects , Socioeconomic Factors , Treatment Outcome
13.
J Heart Lung Transplant ; 42(1): 33-39, 2023 01.
Article in English | MEDLINE | ID: mdl-36347767

ABSTRACT

BACKGROUND: Continuous flow left ventricular assist devices have improved outcomes in patients with end-stage heart failure who require mechanical circulatory support. Current devices have an adverse event profile that has hindered widespread application. The EVAHEART®2 left ventricular assist device (EVA2) has design features such as large blood gaps, lower pump speeds, and an inflow cannula that does not protrude into the left ventricle, which may mitigate the adverse events currently seen with other continuous flow devices. METHODS: A prospective, multi-center, randomized non-inferiority study, the COMPETENCE Trial, is underway to assess non-inferiority of the EVA2 to the HeartMate 3 LVAS when used for the treatment of refractory advanced heart failure. The primary end-point is a composite of the individual primary outcomes: survival to cardiac transplant or device explant for recovery; freedom from disabling stroke; and freedom from severe right heart failure after implantation of the original device. Randomization is in a 2:1 (EVA2:HM3) ratio. RESULTS: The first patient was enrolled into the COMPETENCE Trial in December 2020, and 25 subjects (16 EVA2 and 9 HM3) are currently enrolled. Enrollment of a safety cohort is projected to be completed by the third quarter of 2022, at which time an interim analysis will be performed. The short-term cohort (92 EVA2 subjects) and the long-term cohort are expected to be completed by the end of 2023 and 2024, respectively. CONCLUSIONS: The design features of the EVA2, such as a novel inflow cannula and large blood gaps, may improve clinical outcomes but require further study. The ongoing COMPETENCE trial is designed to determine if the EVA2 is non-inferior to the HM3.


Subject(s)
Heart Failure , Heart Transplantation , Heart-Assist Devices , Humans , Heart-Assist Devices/adverse effects , Prospective Studies , Heart Failure/surgery , Heart Ventricles , Treatment Outcome
14.
Perfusion ; 38(8): 1714-1721, 2023 11.
Article in English | MEDLINE | ID: mdl-36167522

ABSTRACT

OBJECTIVES: The optimal method for monitoring of anticoagulation in patients on extracorporeal life support (ECLS) is unknown. The objective of this study was to assess the relationship between anti-factor Xa level (anti-Xa; IU/mL) and activated partial thromboplastin time (aPTT; seconds) for monitoring intravenous unfractionated heparin anticoagulation in adult ECLS patients. METHODS: Charts of all adult patients cannulated for ECLS from 2015 through 2017 were reviewed, and laboratory and heparin infusion data were extracted for analysis. Time-matched pairs of anti-Xa and aPTT were considered concordant if both laboratory values were within the same clinically utilized range. A hierarchical logistic regression model was used to determine factors associated with discordance while accounting for patient-level effects. RESULTS: A total of 1016 paired anti-Xa and aPTT values from 65 patients were evaluated. Of these, 500 (49.2%) paired samples were discordant, with considerable scatter on linear regression (r2 = 0.315). The aPTT fell into a higher therapeutic range than the anti-Xa in 31.6% and a lower range in 17.3%. Logistic regression demonstrated that discordance was independently associated with time from initiation of ECLS (OR 1.17 per day, p < 0.001), average heparin infusion rate (OR 1.25 per U/kg/hr, p < 0.001), and INR (OR 3.22, p < 0.001). CONCLUSIONS: Nearly half of all aPTT and anti-Xa values were in discordant ranges, and discordance is more likely as the time on ECLS and the INR level increase. The use of either assay in isolation to guide heparin anticoagulation may lead to misestimation of the degree of anticoagulation in complex ECLS patients.
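The concordance check described above amounts to mapping each time-matched pair onto clinical ranges and counting mismatches. A minimal sketch, where the range cutoffs are illustrative assumptions rather than the study's institutional thresholds:

```python
def classify(value, low, high):
    """Map a lab value onto a clinical range: -1 sub-therapeutic,
    0 within range, +1 supra-therapeutic."""
    return -1 if value < low else (1 if value > high else 0)

def discordance_rate(pairs, xa_range=(0.3, 0.7), aptt_range=(60, 80)):
    """Fraction of time-matched (anti-Xa, aPTT) pairs falling in
    different clinical ranges. Default cutoffs are hypothetical."""
    discordant = sum(
        classify(xa, *xa_range) != classify(aptt, *aptt_range)
        for xa, aptt in pairs
    )
    return discordant / len(pairs)
```

For example, a pair with a therapeutic anti-Xa but a supra-therapeutic aPTT counts as discordant, the pattern the study found in 31.6% of samples.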


Subject(s)
Extracorporeal Membrane Oxygenation , Heparin , Adult , Humans , Heparin/therapeutic use , Heparin/pharmacology , Anticoagulants/therapeutic use , Anticoagulants/pharmacology , Extracorporeal Membrane Oxygenation/methods , Partial Thromboplastin Time , Blood Coagulation , Retrospective Studies
15.
Article in English | MEDLINE | ID: mdl-36460133

ABSTRACT

Time-directed extubation (fast-track) protocols may decrease length of stay and cost, but data on operating room (OR) extubation are limited. The objective of this study was to compare the outcomes of extubation in the OR versus fast-track extubation within 6 hours of leaving the OR. Patients undergoing nonemergent STS index cases (2011-2021) who were extubated within 6 hours were identified from a regional STS quality collaborative. Patients were stratified by extubation in the OR versus fast track. Propensity score matching (1:n) was performed to balance baseline differences. Of the 24,962 patients, 498 were extubated in the OR. After matching, 487 OR extubation cases and 899 fast-track cases were well balanced. The rate of reintubation was higher for patients extubated in the OR [21/487 (4.3%) vs 16/899 (1.8%), P = 0.008], as was the incidence of reoperation for bleeding [12/487 (2.5%) vs 8/899 (0.9%), P = 0.03]. There was no significant difference in the rate of any reoperation [16/487 (3.3%) vs 15/899 (1.6%), P = 0.06] or operative mortality [4/487 (0.8%) vs 6/899 (0.6%), P = 0.7]. OR extubation was associated with shorter hospital length of stay (5.6 vs 6.2 days, P < 0.001) and lower total cost of admission ($29,602 vs $31,565, P < 0.001). OR extubation is associated with a higher postoperative risk of reintubation and reoperation due to bleeding, but lower resource utilization. Future research exploring predictors of extubation readiness may be required prior to widespread adoption of this practice.
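The propensity score matching above can be illustrated with a simplified greedy 1:1 nearest-neighbor matcher on the propensity score (the study used 1:n matching; the caliper and scores below are hypothetical):

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.
    Returns (treated_index, control_index) pairs; treated patients
    with no control within the caliper go unmatched."""
    used, pairs = set(), []
    # Process treated patients in score order for reproducibility
    for i, score in sorted(enumerate(treated), key=lambda x: x[1]):
        best, best_d = None, caliper
        for j, c_score in enumerate(controls):
            d = abs(score - c_score)
            if j not in used and d <= best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)  # each control may be matched only once
            pairs.append((i, best))
    return pairs
```

Production analyses typically use optimal rather than greedy matching and check covariate balance (e.g. standardized mean differences) after matching, as the study's "well balanced" statement implies.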

16.
Article in English | MEDLINE | ID: mdl-36031426

ABSTRACT

OBJECTIVE: The influence of socioeconomic determinants of health on failure to rescue (mortality after a postoperative complication) after cardiac surgery is unknown. We hypothesized that increasing Distressed Communities Index, a comprehensive socioeconomic ranking by ZIP code, would be associated with higher failure to rescue. METHODS: Patients undergoing Society of Thoracic Surgeons index operation in a regional collaborative (2011-2021) who developed a failure to rescue complication were included. After excluding patients with missing ZIP code or Society of Thoracic Surgeons predicted risk of mortality, patients were stratified by Distressed Communities Index scores (0-no distress, 100-severe distress) based on education level, poverty, unemployment, housing vacancies, median income, and business growth. The upper 2 quintiles of distress (Distressed Communities Index >60) were compared to all other patients. Hierarchical logistic regression analyzed the association between Distressed Communities Index and failure to rescue. RESULTS: A total of 4004 patients developed 1 or more of the defined complications across 17 centers. Of these, 582 (14.5%) experienced failure to rescue. High socioeconomic distress (Distressed Communities Index >60) was identified among 1272 patients (31.8%). Before adjustment, failure to rescue occurred more frequently among those from socioeconomically distressed communities (Distressed Communities Index >60; 16.9% vs 13.4%, P = .004). After adjustment, residing in a socioeconomically distressed community was associated with 24% increased odds of failure to rescue (odds ratio, 1.24; confidence interval, 1.003-1.54; P = .044). CONCLUSIONS: Increasing Distressed Communities Index, a measure of poor socioeconomic status, is associated with greater risk-adjusted likelihood of failure to rescue after cardiac surgery. 
These findings highlight that current quality metrics do not account for socioeconomic status, and as such underrepresent procedural risk for these vulnerable patients.
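The adjusted odds ratio above is the exponential of a logistic-regression coefficient, and its confidence interval is formed on the log-odds scale. The sketch below shows that back-conversion; the coefficient and standard error are illustrative values, not taken from the study.

```python
import math

# Convert a logistic-regression coefficient (log-odds) and its standard
# error into an odds ratio with a 95% confidence interval.

def odds_ratio_ci(beta, se, z=1.96):
    """Return (OR, lower, upper) for a log-odds coefficient and its SE."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))
```

An interval whose lower bound sits just above 1 (here, 1.003) corresponds to a P value just under .05, consistent with the reported P = .044.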

17.
Ann Thorac Surg ; 2022 Aug 07.
Article in English | MEDLINE | ID: mdl-35948120

ABSTRACT

BACKGROUND: The adoption of transcatheter aortic valve replacement led to the development of appropriate use criteria (AUC) for transcatheter and surgical aortic valve replacement (SAVR) for aortic stenosis in 2017. This study hypothesized that appropriateness of SAVR improved after publication of the AUC. METHODS: All patients undergoing isolated SAVR for severe aortic stenosis in a regional cardiac surgical quality collaborative were evaluated using data from the Society of Thoracic Surgeons Adult Cardiac Surgery Database (2011-2021). After excluding endocarditis and emergency cases, appropriateness of SAVR (rarely appropriate, may be appropriate, or appropriate) was assigned to patients by using established criteria. The relationship of appropriateness with publication of the AUC was assessed, as was variation in appropriateness over time and by center. RESULTS: Of 3035 patients across 17 centers, 106 (3.5%) underwent SAVR for an indication identified as rarely appropriate or may be appropriate. Patients who underwent SAVR for rarely or may be appropriate indications were significantly more likely to experience operative mortality (5.7% vs 1.6%, P = .001) as well as major morbidity (21.7% vs 10.5%, P < .001). Performance of rarely or may be appropriate SAVR decreased significantly over time (slope, -0.51%/year; P trend < .001) and after release of the AUC (3.83% before vs 2.06% after; P = .036). Substantial interhospital variation in appropriateness was observed (range of may be or rarely appropriate SAVR, 0%-10%). CONCLUSIONS: The majority of isolated SAVR for aortic stenosis was appropriate according to the 2017 AUC. Appropriateness improved after publication of the AUC, and this improvement was associated with a significant reduction of major morbidity and mortality.

18.
J Card Surg ; 37(5): 1224-1229, 2022 May.
Article in English | MEDLINE | ID: mdl-35245397

ABSTRACT

BACKGROUND: Recent reports suggest an increased rate of early structural valve degeneration (SVD) in the Trifecta bioprosthesis (Abbott Cardiovascular). We sought to compare the intermediate-term outcomes of the Magna (Edwards Life Sciences) and Trifecta valves. METHODS: All surgical aortic valve replacements (SAVRs) with Trifecta or Magna/Magna Ease bioprostheses at an academic medical center were extracted from an institutional database. Patients who survived until after discharge (2011-2019) were included. The primary outcome was valve failure for any reason requiring reintervention or contributing to death, identified by reintervention or review of cause of death. Time to failure was estimated with Kaplan-Meier analysis and Cox proportional hazards modeling. RESULTS: Of 1444 patients, 521 (36%) underwent Trifecta and 923 (64%) underwent Magna implantation, with a median follow-up of 27.6 months. Trifecta patients had larger median valve size (25 vs. 23 mm, p < .001) and lower median gradient (8.0 vs. 10.9 mmHg, p < .001). Trifecta patients had higher 48-month estimated failure rates (20.2 ± 7.6% vs. 2.6 ± 0.7%, p < .0001), with failure rates of 21.4 versus 9.2 failures per 1000 person-years (p < .001). After risk adjustment, Trifecta patients had a 5.3-fold higher hazard of failure (95% confidence interval: 2.78-12.34, p < .001) compared with Magna patients. Sudden aortic regurgitation as a failure mode occurred only in Trifecta valves (8 of 521, 1.5%). CONCLUSION: Despite lower postoperative mean gradients, the Trifecta bioprosthesis may have an increased risk of intermediate-term SVD. Further research is warranted to confirm the potential for sudden valve failure.
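The time-to-failure analysis above rests on the Kaplan-Meier product-limit estimator, which steps the survival curve down at each observed failure while dropping censored patients from the risk set. A minimal sketch follows; the follow-up times and event flags used in any example would be made up, not the study's data, and production analyses would use an established survival library.

```python
# Minimal Kaplan-Meier (product-limit) survival estimator, as a sketch
# of the time-to-failure analysis described above.

def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = failure observed, 0 = censored.
    Returns [(t, S(t))] with the survival estimate after each event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # pool all subjects sharing this time (failures and censorings)
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk  # product-limit step
            curve.append((t, surv))
        at_risk -= removed
    return curve
```

The Cox model then compares the two valves' hazard functions while adjusting for covariates, yielding the hazard ratio reported in the abstract.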


Subject(s)
Aortic Valve Stenosis , Biological Products , Bioprosthesis , Heart Valve Prosthesis Implantation , Heart Valve Prosthesis , Aortic Valve/surgery , Aortic Valve Stenosis/surgery , Heart Valve Prosthesis Implantation/adverse effects , Hemodynamics , Humans , Prosthesis Design , Retrospective Studies
19.
Ann Thorac Surg ; 114(4): e273-e274, 2022 10.
Article in English | MEDLINE | ID: mdl-34998734

ABSTRACT

The Lotus (Boston Scientific) valve stood out from alternative transcatheter aortic valve replacement (TAVR) valves because of its ability to be recaptured and redeployed if initial placement was unsatisfactory. With the exponential rise in TAVR, there have been numerous reports of TAVR valve explantation. A 79-year-old man with nonrheumatic aortic valve stenosis who had undergone TAVR with a Lotus valve 12 months earlier presented for surgical excision for early valve failure. The Food and Drug Administration has stated that already implanted Lotus valves do not pose a safety issue; however, cases of valve degeneration will occur, necessitating future valve explantation.


Subject(s)
Aortic Valve Stenosis , Heart Valve Prosthesis , Transcatheter Aortic Valve Replacement , Aged , Aortic Valve/surgery , Aortic Valve Stenosis/surgery , Humans , Male , Prosthesis Design , Risk Factors , Treatment Outcome
20.
Ann Thorac Surg ; 114(2): 484-491, 2022 08.
Article in English | MEDLINE | ID: mdl-34843696

ABSTRACT

BACKGROUND: Refractory right ventricular failure at the time of left ventricular assist device implantation requires treatment with supplemental mechanical circulatory support. However, the optimal strategy for support remains unknown. METHODS: All patients undergoing first-time durable left ventricular assist device implantation with a contemporary device were selected from The Society of Thoracic Surgeons National Database (2011 to 2019). Patients requiring right ventricular assist device (RVAD) or venoarterial extracorporeal membrane oxygenation (VA-ECMO) were included in the analysis. Patients were stratified by RVAD or VA-ECMO and by timing of placement (intraoperative vs postoperative). RESULTS: In all, 18,423 left ventricular assist device implants were identified, of which 940 (5.1%) required RVAD (n = 750) or VA-ECMO (n = 190) support. Patients receiving an RVAD more frequently had preoperative inotrope requirement (76% vs 62%, P < .01) and severe tricuspid regurgitation (20% vs 13%, P < .01). The RVAD patients had lower rates of postoperative renal failure (40% vs 51%, P = .02) and limb ischemia (4% vs 13%, P < .01), as well as significantly less operative mortality (41% vs 54%, P < .01). After risk adjustment with propensity score analysis, support with VA-ECMO was associated with a higher risk of mortality (risk ratio, 1.46; 95% confidence interval, 1.21 to 1.77; P < .01) compared with patients receiving an RVAD. Importantly, institution of right ventricular support postoperatively was associated with higher mortality (risk ratio, 1.43; P < .01) compared with intraoperative initiation. CONCLUSIONS: Patients with severe right ventricular failure in the setting of durable left ventricular assist device implantation may benefit from the use of RVAD over VA-ECMO. Regardless of the type of support, initiation at the index operation was associated with improved outcomes.
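The unadjusted group comparisons above reduce to a risk ratio with a large-sample confidence interval computed on the log scale. The sketch below shows that calculation; the event counts in any worked example are invented for illustration and are not the registry's data.

```python
import math

# Risk ratio for a/n1 events in group 1 vs b/n2 events in group 2,
# with a 95% confidence interval formed on the log scale.

def risk_ratio(a, n1, b, n2, z=1.96):
    """Return (RR, lower, upper) using the standard log-scale variance
    1/a - 1/n1 + 1/b - 1/n2."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper
```

The abstract's adjusted estimates additionally fold in propensity-score weighting, so the reported interval (1.21 to 1.77) reflects covariate adjustment on top of this raw contrast.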


Subject(s)
Extracorporeal Membrane Oxygenation , Heart Failure , Heart-Assist Devices , Ventricular Dysfunction, Right , Heart Failure/surgery , Humans , Retrospective Studies , Treatment Outcome