ABSTRACT
SARS-CoV-2, the causative agent of coronavirus disease 2019 (COVID-19), is responsible for the largest pandemic facing humanity since the Spanish flu pandemic in the early twentieth century. Since there is no specific antiviral treatment, optimized supportive care is the most relevant factor in the patient's prognosis. In the hospital setting, identifying patients at high risk of clinical deterioration is essential to ensure timely access to intensive treatment of severe disease. The initial management of hypoxemia includes conventional oxygen therapy, high-flow nasal cannula oxygen, and non-invasive ventilation. For patients requiring invasive mechanical ventilation, lung-protective ventilation with low tidal volumes and limited plateau pressures is recommended. Cardiovascular complications are frequent and include myocardial injury, thrombotic events, myocarditis, and cardiogenic shock. Acute renal failure is a common complication and a marker of poor prognosis, with a significant impact on costs and resource allocation. Among proposed therapies for COVID-19, the most promising drugs to date are remdesivir and corticosteroids, although further studies may be needed to confirm their effectiveness. Other therapies, such as tocilizumab, anakinra, other anti-cytokine drugs, and heparin, are being tested in clinical trials. Thousands of physicians are facing a scenario that none of us have ever seen: demand for hospital care exceeding capacity in most countries. For now, the only certainty we have is that we should try to decrease the number of infected patients and that optimized critical care support is the best strategy to improve patients' survival.
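The low-tidal-volume recommendation above is usually operationalized per kilogram of predicted body weight. Below is a minimal sketch assuming the widely used ARDSNet predicted-body-weight formula and a 6 mL/kg starting target; it is an illustration of the calculation, not a protocol taken from this review.

```python
def predicted_body_weight_kg(height_cm: float, male: bool) -> float:
    """ARDSNet predicted body weight (kg): 50 kg (male) or 45.5 kg (female)
    plus 0.91 kg per cm of height above 152.4 cm."""
    base = 50.0 if male else 45.5
    return base + 0.91 * (height_cm - 152.4)

def target_tidal_volume_ml(height_cm: float, male: bool, ml_per_kg: float = 6.0) -> float:
    """Low-tidal-volume target in mL; 6 mL/kg of predicted body weight is a common starting point."""
    return ml_per_kg * predicted_body_weight_kg(height_cm, male)

# Example: a 175 cm male -> PBW ~70.6 kg -> ~423 mL at 6 mL/kg.
print(round(target_tidal_volume_ml(175, male=True)))
```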
ABSTRACT
BACKGROUND: The detrimental effects of inotropes are well known, and in many fields they are used only within a goal-directed therapy approach. Nevertheless, standard management in many centers includes administering inotropes to all patients undergoing cardiac surgery to prevent low cardiac output syndrome and its implications. Randomized evidence in favor of a patient-tailored, inotrope-sparing approach is still lacking. We designed a randomized controlled noninferiority trial in patients undergoing cardiac surgery with normal ejection fraction to assess whether a dobutamine-sparing strategy (in which dobutamine was given only when there was hemodynamic evidence of low cardiac output with signs of inadequate tissue perfusion) was noninferior to a dobutamine-to-all strategy (in which all patients received dobutamine). RESULTS: A total of 160 patients were randomized to the dobutamine-sparing strategy (80 patients) or the dobutamine-to-all approach (80 patients). The primary composite endpoint of 30-day mortality or major cardiovascular complications (arrhythmias, acute myocardial infarction, low cardiac output syndrome, and stroke or transient ischemic attack) occurred in 25/80 (31%) patients in the dobutamine-sparing group and 27/80 (34%) in the dobutamine-to-all group (p = 0.74). There were no significant differences between groups in the incidence of acute kidney injury, prolonged mechanical ventilation, or intensive care unit or hospital length of stay. DISCUSSION: Although it is common practice in many centers to administer inotropes to all patients undergoing cardiac surgery, a dobutamine-sparing strategy did not increase mortality or the occurrence of major cardiovascular events compared with a dobutamine-to-all strategy. Further research is needed to assess whether reducing the administration of inotropes can improve outcomes in cardiac surgery. Trial registration: ClinicalTrials.gov, NCT02361801. Registered February 2, 2015. https://clinicaltrials.gov/ct2/show/NCT02361801.
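As an illustration of the comparison underlying the noninferiority question, the sketch below computes the risk difference and a normal-approximation confidence interval from the event counts reported above (25/80 vs 27/80); the trial's own prespecified margin and analysis are not detailed in the abstract and may have differed.

```python
import math

def risk_difference_ci(x1: int, n1: int, x2: int, n2: int, z: float = 1.96):
    """Risk difference (group 1 minus group 2) with a normal-approximation 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    diff = p1 - p2
    return diff, diff - z * se, diff + z * se

# Primary composite endpoint reported above: 25/80 (dobutamine-sparing) vs 27/80 (dobutamine-to-all).
diff, low, high = risk_difference_ci(25, 80, 27, 80)
print(f"risk difference {diff:+.3f}, 95% CI ({low:+.3f}, {high:+.3f})")
# Noninferiority would be judged by comparing the upper CI bound with the trial's
# prespecified margin, which is not stated in this abstract.
```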
ABSTRACT
OBJECTIVES: Previous trials suggest that vasopressin may improve outcomes in patients with vasodilatory shock. The aim of this study was to evaluate whether vasopressin is superior to norepinephrine in improving outcomes in cancer patients with septic shock. DESIGN: Single-center, randomized, double-blind clinical trial, and meta-analysis of randomized trials. SETTING: ICU of a tertiary care hospital. PATIENTS: Two hundred fifty patients 18 years old or older with cancer and septic shock. INTERVENTIONS: Patients were assigned to either vasopressin or norepinephrine as first-line vasopressor therapy. An updated meta-analysis was also conducted, including randomized trials published until October 2018. MEASUREMENTS AND MAIN RESULTS: The primary outcome was all-cause mortality at 28 days after randomization. Prespecified secondary outcomes included 90-day all-cause mortality; number of days alive and free of advanced organ support at day 28; and Sequential Organ Failure Assessment score 24 hours and 96 hours after randomization. We also measured the incidence of adverse events within 28 days. A total of 250 patients were randomized. The primary outcome was observed in 71 patients (56.8%) in the vasopressin group and 66 patients (52.8%) in the norepinephrine group (p = 0.52). There were no significant differences in 90-day mortality (90 patients [72.0%] and 94 patients [75.2%], respectively; p = 0.56), number of days alive and free of advanced organ support, adverse events, or Sequential Organ Failure Assessment score. CONCLUSIONS: In cancer patients with septic shock, vasopressin as first-line vasopressor therapy was not superior to norepinephrine in reducing 28-day mortality.
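A minimal sketch of how the primary 28-day mortality comparison above could be reproduced with a two-proportion z-test; the group sizes of 125 are implied by the reported percentages, and the trial's actual prespecified analysis may have differed.

```python
import math

def two_proportion_z_test(x1: int, n1: int, x2: int, n2: int) -> tuple:
    """Two-sided z-test for the difference between two proportions (pooled standard error)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 28-day deaths reported above: 71 (56.8%) vasopressin vs 66 (52.8%) norepinephrine;
# group sizes of 125 per arm are inferred from the reported percentages.
z, p = two_proportion_z_test(71, 125, 66, 125)
print(f"z = {z:.2f}, p = {p:.2f}")  # p comes out near the reported 0.52
```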
Subject(s)
Neoplasms/complications , Norepinephrine/therapeutic use , Shock, Septic/complications , Shock, Septic/drug therapy , Vasopressins/therapeutic use , Double-Blind Method , Female , Humans , Male , Middle Aged , Shock, Septic/mortality , Vasoconstrictor Agents/therapeutic use
ABSTRACT
BACKGROUND: Appropriate use of antimicrobials is essential to improve outcomes in sepsis. The aim of this study was to determine whether the use of a rapid molecular blood test, SeptiFast (SF), reduces antibiotic consumption through early de-escalation in patients with nosocomial sepsis compared with conventional blood cultures (BCs). METHODS: This was a prospective, randomized, superiority, controlled trial conducted at the Sao Paulo Heart Institute between October 2012 and May 2016. Adult patients admitted to the hospital for at least 48 h with a diagnosis of nosocomial sepsis underwent microorganism identification by both the SF test and BCs. Patients randomized into the intervention group received antibiotic therapy adjustment according to the results of SF. Patients randomized into the control group received standard antibiotic adjustment according to the results of BCs. The primary endpoint was antimicrobial consumption during the first 14 days after randomization. RESULTS: A total of 200 patients were included (100 in each group). The intention-to-treat analysis found no significant difference in median antibiotic consumption. In the subgroup of patients with positive SF and blood cultures (19 and 25 patients, respectively), we found statistically significant reductions in median antimicrobial consumption (1429 [1071-2000] days of therapy [DOT]/1000 patient-days in the intervention group versus 1889 [1357-2563] DOT/1000 patient-days in the control group; p = 0.017), in the median time to antimicrobial de-escalation (8 versus 54 h; p < 0.001), in the duration of antimicrobial therapy (p = 0.039), and in anti-gram-positive antimicrobial costs (p = 0.002). Microorganism identification was possible in 24.5% of patients (45/184) by SF and 21.2% (39/184) by BC (p = 0.45). CONCLUSION: This randomized clinical trial showed that the use of a rapid molecular-based pathogen identification test does not reduce median antibiotic consumption in nosocomial sepsis. However, in patients with positive microbiological tests, the use of SeptiFast reduced antimicrobial consumption through early de-escalation compared with conventional blood cultures. These results were driven by a reduction in the consumption of antimicrobials used for Gram-positive bacteria. TRIAL REGISTRATION: The trial was registered at ClinicalTrials.gov (NCT01450358) on 12 October 2011.
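Antimicrobial consumption above is expressed in days of therapy (DOT) per 1000 patient-days. The sketch below assumes the usual definition of DOT (one day counted per antimicrobial agent per calendar day on which it is given); the trial's exact accounting rules are not detailed in the abstract.

```python
def dot_per_1000_patient_days(days_of_therapy: float, patient_days: float) -> float:
    """Days of therapy (DOT) normalized per 1000 patient-days.

    DOT counts one day for each antimicrobial agent given on a calendar day,
    so a patient on two antibiotics for 5 days contributes 10 DOT.
    """
    return 1000.0 * days_of_therapy / patient_days

# Hypothetical example: 25 DOT accrued over a 14-day observation window.
print(round(dot_per_1000_patient_days(25, 14)))  # ~1786
```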
ABSTRACT
OBJECTIVE: To investigate the effects of 4% albumin in lactated Ringer's solution, compared with lactated Ringer's alone, in the early phase of sepsis in cancer patients. DESIGN: Single-center, randomized, double-blind, parallel-group controlled trial. SETTING: A tertiary care university cancer hospital. PATIENTS: Cancer patients with severe sepsis or septic shock. INTERVENTIONS: Between October 2014 and December 2016, patients were randomly assigned to receive either boluses of albumin in lactated Ringer's solution or lactated Ringer's solution alone during the first 6 hours of fluid resuscitation after intensive care unit (ICU) admission. The primary outcome was death from any cause at 7 days. Secondary outcomes were death from any cause within 28 days, change in Sequential Organ Failure Assessment score from baseline to day 7, days alive and free of mechanical ventilation, days alive and free of vasopressors, renal replacement therapy during the ICU stay, and length of ICU and hospital stay. MEASUREMENTS AND MAIN RESULTS: A total of 360 patients were enrolled in the trial. At 7 days, 46 of 180 patients (26%) had died in the albumin group and 40 of 180 (22%) in the lactated Ringer's group (p = 0.5). At 28 days, 96 of 180 patients (53%) had died in the albumin group and 83 of 180 (46%) in the lactated Ringer's group (p = 0.2). No significant differences in secondary outcomes were observed. CONCLUSIONS: Adding albumin to early standard resuscitation with lactated Ringer's in cancer patients with sepsis did not improve 7-day survival.
Subject(s)
Albumins/administration & dosage , Fluid Therapy , Ringer's Lactate/administration & dosage , Sepsis/therapy , Aged , Double-Blind Method , Drug Therapy, Combination , Female , Humans , Male , Middle Aged , Neoplasms/complications , Pilot Projects , Secondary Prevention , Sepsis/complications
ABSTRACT
OBJECTIVES: The aim of this study was to evaluate the efficacy of perioperative intra-aortic balloon pump use in high-risk cardiac surgery patients. DESIGN: A single-center randomized controlled trial and a meta-analysis of randomized controlled trials. SETTING: Heart Institute of São Paulo University. PATIENTS: High-risk patients undergoing elective coronary artery bypass surgery. INTERVENTION: Patients were randomized to intra-aortic balloon pump insertion after anesthesia induction and before skin incision versus no intra-aortic balloon pump use. MEASUREMENTS AND MAIN RESULTS: The primary outcome was a composite endpoint of 30-day mortality and major morbidity (cardiogenic shock, stroke, acute renal failure, mediastinitis, prolonged mechanical ventilation, and need for reoperation). A total of 181 patients (mean [SD] age 65.4 [9.4] yr; 32% female) were randomized. The primary outcome was observed in 43 patients (47.8%) in the intra-aortic balloon pump group and 42 patients (46.2%) in the control group (p = 0.46). The median duration of inotrope use (51 hr [interquartile range, 32-94 hr] vs 39 hr [interquartile range, 25-66 hr]; p = 0.007) and the ICU length of stay (5 d [interquartile range, 3-8 d] vs 4 d [interquartile range, 3-6 d]; p = 0.035) were longer in the intra-aortic balloon pump group than in the control group. A meta-analysis of 11 randomized controlled trials confirmed the lack of survival benefit of perioperative intra-aortic balloon pump use in high-risk cardiac surgery patients. CONCLUSIONS: In high-risk patients undergoing cardiac surgery, perioperative use of an intra-aortic balloon pump did not reduce the occurrence of a composite outcome of 30-day mortality and major complications compared with usual care alone.
Subject(s)
Cardiac Surgical Procedures/mortality , Cardiac Surgical Procedures/methods , Intra-Aortic Balloon Pumping/methods , Postoperative Complications/epidemiology , Aged , Cardiotonic Agents/administration & dosage , Female , Humans , Intensive Care Units , Length of Stay , Male , Middle Aged , Postoperative Complications/mortality , Risk Factors , Single-Blind Method
ABSTRACT
BACKGROUND: Vasoplegic syndrome is a common complication after cardiac surgery and impacts negatively on patient outcomes. The objective of this study was to evaluate whether vasopressin is superior to norepinephrine in reducing postoperative complications in patients with vasoplegic syndrome. METHODS: This prospective, randomized, double-blind trial was conducted at the Heart Institute, University of Sao Paulo, Sao Paulo, Brazil, between January 2012 and March 2014. Patients with vasoplegic shock (defined as mean arterial pressure less than 65 mmHg resistant to fluid challenge and cardiac index greater than 2.2 l · min⁻¹ · m⁻²) after cardiac surgery were randomized to receive vasopressin (0.01 to 0.06 U/min) or norepinephrine (10 to 60 µg/min) to maintain arterial pressure. The primary endpoint was a composite of mortality or severe complications (stroke, requirement for mechanical ventilation for longer than 48 h, deep sternal wound infection, reoperation, or acute renal failure) within 30 days. RESULTS: A total of 330 patients were randomized, and 300 were infused with one of the study drugs (vasopressin, 149; norepinephrine, 151). The primary outcome occurred in 32% of the vasopressin patients and in 49% of the norepinephrine patients (unadjusted hazard ratio, 0.55; 95% CI, 0.38 to 0.80; P = 0.0014). Regarding adverse events, the authors found a lower occurrence of atrial fibrillation in the vasopressin group (63.8% vs. 82.1%; P = 0.0004) and no difference between groups in the rates of digital ischemia, mesenteric ischemia, hyponatremia, and myocardial infarction. CONCLUSIONS: The authors' results suggest that vasopressin can be used as a first-line vasopressor agent in post-cardiac surgery vasoplegic shock and improves clinical outcomes.
Subject(s)
Cardiac Surgical Procedures , Norepinephrine/pharmacology , Postoperative Complications/drug therapy , Shock/drug therapy , Vasoplegia/drug therapy , Vasopressins/pharmacology , Brazil , Double-Blind Method , Female , Humans , Male , Middle Aged , Prospective Studies , Shock/complications , Treatment Outcome , Vasoconstrictor Agents/pharmacology , Vasoplegia/complications
ABSTRACT
OBJECTIVE: The aim of this study was to compare outcomes after implementation of a restrictive or a liberal transfusion strategy in patients undergoing cardiac surgery who were aged 60 years or more versus less than 60 years. METHODS: This is a substudy of the Transfusion Requirements After Cardiac Surgery (TRACS) randomized controlled trial. In this subgroup analysis, we separated patients randomized to a restrictive or a liberal strategy of red blood cell transfusion into those aged 60 years or more (elderly) and those aged less than 60 years. The primary outcome was a composite of 30-day all-cause mortality and severe morbidity. RESULTS: Of the 502 patients included in the Transfusion Requirements After Cardiac Surgery study, 260 (51.8%) were aged 60 years or more and 242 (48.2%) were aged less than 60 years and were included in this study. Among patients aged 60 years or more, the primary end point occurred in 11.9% of the liberal-strategy group and 16.8% of the restrictive-strategy group (P = .254); among those aged less than 60 years, it occurred in 6.8% and 5.6%, respectively (P = .714). However, in the older patients, cardiogenic shock was more frequent in the restrictive transfusion group (12.8% vs 5.2%, P = .031). Thirty-day mortality, acute respiratory distress syndrome, and acute renal injury were similar in the restrictive and liberal transfusion groups in both age groups. CONCLUSIONS: Although there was no difference between groups regarding the primary outcome, a restrictive transfusion strategy may result in an increased rate of cardiogenic shock in elderly patients undergoing cardiac surgery compared with a more liberal strategy. The cardiovascular risk of anemia may be more harmful than the risk of blood transfusion in older patients.
Subject(s)
Anemia/therapy , Cardiac Surgical Procedures/adverse effects , Erythrocyte Transfusion/methods , Shock, Cardiogenic/prevention & control , Adult , Age Factors , Aged , Anemia/blood , Anemia/diagnosis , Anemia/mortality , Biomarkers/blood , Brazil , Cardiac Surgical Procedures/mortality , Erythrocyte Transfusion/adverse effects , Erythrocyte Transfusion/mortality , Hematocrit , Hemoglobins/metabolism , Humans , Middle Aged , Prospective Studies , Risk Factors , Shock, Cardiogenic/diagnosis , Shock, Cardiogenic/etiology , Shock, Cardiogenic/mortality , Time Factors , Treatment Outcome
ABSTRACT
BACKGROUND: Several studies have indicated that a restrictive erythrocyte transfusion strategy is as safe as a liberal one in critically ill patients, but there is no clear evidence to support the superiority of any perioperative transfusion strategy in patients with cancer. METHODS: In a randomized, controlled, parallel-group, double-blind (patients and outcome assessors) superiority trial in the intensive care unit of a tertiary oncology hospital, the authors evaluated whether a restrictive strategy of erythrocyte transfusion (transfusion when hemoglobin concentration <7 g/dl) was superior to a liberal one (transfusion when hemoglobin concentration <9 g/dl) for reducing mortality and severe clinical complications among patients having major cancer surgery. All adult patients with cancer having major abdominal surgery who required postoperative intensive care were included and randomly allocated to treatment with the liberal or the restrictive erythrocyte transfusion strategy. The primary outcome was a composite endpoint of mortality and morbidity. RESULTS: A total of 198 patients were included as follows: 101 in the restrictive group and 97 in the liberal group. The primary composite endpoint occurred in 19.6% (95% CI, 12.9 to 28.6%) of patients in the liberal-strategy group and in 35.6% (27.0 to 45.4%) of patients in the restrictive-strategy group (P = 0.012). Compared with the restrictive strategy, the liberal transfusion strategy was associated with an absolute risk reduction for the composite outcome of 16% (3.8 to 28.2%) and a number needed to treat of 6.2 (3.5 to 26.5). CONCLUSION: A liberal erythrocyte transfusion strategy with a hemoglobin trigger of 9 g/dl was associated with fewer major postoperative complications in patients having major cancer surgery compared with a restrictive strategy.
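The absolute risk reduction and number needed to treat quoted above follow arithmetically from the two composite-event rates; below is a minimal worked sketch of the point estimates (the confidence intervals reported in the abstract come from the trial's own analysis).

```python
def arr_and_nnt(risk_control: float, risk_treatment: float):
    """Absolute risk reduction and number needed to treat (point estimates only)."""
    arr = risk_control - risk_treatment
    return arr, 1.0 / arr

# Composite endpoint rates reported above: restrictive 35.6%, liberal 19.6%.
arr, nnt = arr_and_nnt(0.356, 0.196)
print(f"ARR ~ {arr:.3f}, NNT ~ {nnt:.2f}")  # roughly 0.16 and 6.2, matching the abstract
```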
Subject(s)
Abdominal Neoplasms/surgery , Erythrocyte Transfusion/methods , Erythrocyte Transfusion/statistics & numerical data , Brazil/epidemiology , Double-Blind Method , Female , Follow-Up Studies , Hemoglobins/analysis , Humans , Intensive Care Units , Male , Middle Aged , Outcome Assessment, Health Care/statistics & numerical data , Postoperative Complications/epidemiology , Prospective Studies , Risk
ABSTRACT
OBJECTIVE: Failure to wean from mechanical ventilation is related to worse outcomes after cardiac surgery. The aim of this study was to evaluate whether the serum level of B-type natriuretic peptide is a predictor of weaning failure from mechanical ventilation after cardiac surgery. METHODS: We conducted a prospective, observational cohort study of 101 patients who underwent on-pump coronary artery bypass grafting. B-type natriuretic peptide was measured postoperatively at intensive care unit admission and at the end of a 60-min spontaneous breathing trial. The demographic data, hemodynamic and respiratory parameters, fluid balance, need for vasopressor or inotropic support, and lengths of the intensive care unit and hospital stays were recorded. Weaning failure was defined as either the inability to sustain spontaneous breathing for 60 min or the need for reintubation within 48 h. RESULTS: Of the 101 patients studied, 12 failed the weaning trial. There were no differences between the groups in the baseline or intraoperative characteristics, including left ventricular function, EuroSCORE, and durations of the cardiac procedure and cardiopulmonary bypass. The B-type natriuretic peptide levels were significantly higher at intensive care unit admission and at the end of the breathing trial in the patients with weaning failure than in the patients who were successfully weaned. In a multivariate model, a high B-type natriuretic peptide level at the end of the spontaneous breathing trial was the only independent predictor of weaning failure from mechanical ventilation. CONCLUSIONS: A high B-type natriuretic peptide level is a predictive factor for failure to wean from mechanical ventilation after cardiac surgery. These findings suggest that optimizing ventricular function should be a goal during the perioperative period.
Subject(s)
Cardiac Surgical Procedures , Natriuretic Peptide, Brain/blood , Ventilator Weaning , Adult , Age Factors , Aged , Biomarkers/blood , Epidemiologic Methods , Female , Hemodynamics , Humans , Male , Middle Aged , Postoperative Period , Predictive Value of Tests , Respiratory Function Tests , Risk Assessment , Sex Factors , Time Factors , Treatment Failure , Ventricular Dysfunction/physiopathology
ABSTRACT
OBJECTIVE: The standard therapy for patients with high-level spinal cord injury is long-term mechanical ventilation through a tracheostomy. However, in some cases, this approach results in death or disability. The aim of this study is to highlight the anesthetic and perioperative aspects of patients undergoing insertion of a diaphragmatic pacemaker. METHODS: Five patients with quadriplegia following high cervical traumatic spinal cord injury and ventilator-dependent chronic respiratory failure underwent laparoscopic implantation of a diaphragmatic pacemaker after preoperative assessment of their phrenic nerve function and diaphragm contractility through transcutaneous nerve stimulation. ClinicalTrials.gov: NCT01385384. RESULTS: Diaphragmatic pacemaker placement was successful in all of the patients. Two patients presented with capnothorax during the perioperative period, which resolved without consequences. After six months, three patients achieved continuous use of the diaphragm pacing system, and one patient could be removed from mechanical ventilation for more than 4 hours per day. CONCLUSIONS: The implantation of a diaphragmatic phrenic pacing system is a new and safe technique with the potential to improve the quality of life of patients who are dependent on mechanical ventilation because of spinal cord injuries. Appropriate indication and adequate perioperative care are fundamental to achieving better results.
Subject(s)
Anesthesia/methods , Diaphragm , Electric Stimulation Therapy/methods , Pacemaker, Artificial , Prosthesis Implantation/methods , Respiration, Artificial/methods , Spinal Cord Injuries , Adolescent , Adult , Female , Humans , Laparoscopy/methods , Male , Perioperative Care/methods , Perioperative Period , Quadriplegia/therapy , Time Factors , Treatment Outcome , Young Adult
ABSTRACT
OBJECTIVE: Cancer patients frequently require admission to the intensive care unit. However, there are few data regarding predictive factors for mortality in this group of patients. The aim of this study was to evaluate whether arterial lactate or standard base deficit on admission and after 24 hours can predict mortality in patients with cancer. METHODS: We evaluated 1,129 patients admitted with severe sepsis or septic shock, or postoperatively after high-risk surgery. Lactate and standard base deficit collected at admission and after 24 hours were compared between survivors and non-survivors. We evaluated whether these perfusion markers are independent predictors of mortality. RESULTS: There were 854 hospital survivors (76.5%). A 24-hour lactate > 1.9 mmol/L and a 24-hour standard base deficit < -2.3 mmol/L were independent predictors of both intensive care unit mortality and hospital death. CONCLUSION: Our findings suggest that lactate and standard base deficit measurement should be included in the routine assessment of patients with cancer admitted to the intensive care unit with sepsis or septic shock or after high-risk surgery. These markers may be useful for the appropriate allocation of resources in this population.
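As a bedside illustration of the cutoffs identified above, the sketch below flags patients exceeding either 24-hour threshold; the OR combination and variable names are illustrative assumptions, not the study's multivariable model.

```python
def flags_high_risk(lactate_24h_mmol_l: float, base_deficit_24h_mmol_l: float) -> bool:
    """Flag a patient as higher risk using the cutoffs reported in the study:
    24-hour lactate > 1.9 mmol/L or 24-hour standard base deficit < -2.3 mmol/L.
    Combining the two markers with OR is an illustrative choice, not the study's model."""
    return lactate_24h_mmol_l > 1.9 or base_deficit_24h_mmol_l < -2.3

# Hypothetical patient: lactate 2.4 mmol/L and standard base deficit -4.0 mmol/L at 24 hours.
print(flags_high_risk(2.4, -4.0))  # True
```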