ABSTRACT
BACKGROUND AND OBJECTIVE: Infective endocarditis (IE) is an infection with a poor prognosis and an associated in-hospital mortality of at least 25%. Optimal therapy of IE requires long-term effective antibiotic therapy and, in many cases, valve surgery. The aim of this study was to review the demographics, bacteriology, and outcomes of patients with IE admitted to a tertiary referral center in Mexico City over a 10-year period. METHODS: Retrospective cohort study of patients admitted to Instituto Nacional de Ciencias Médicas y Nutrición Salvador Zubirán with a new diagnosis of IE over a 10-year period, from January 2009 to January 2019. Patients who met the definition for a definitive diagnosis of infective endocarditis according to the modified Duke criteria were included in the study. RESULTS: There were 62 patients (mean age 50.85 ± 17.46 years; 40.3% female) with IE. The culprit microorganism was identified in all cases, with Staphylococcus aureus being the most frequent (34%). Valve surgery was performed in 58.1%, while 41.9% received only medical treatment. The mortality rate was 25.8% at 30 days and 41.9% at 12 months. Comparing the medical and surgical treatment groups, we found that 50% and 36%, respectively, had died within 12 months of admission. CONCLUSIONS: Our center has a high prevalence of health care-associated endocarditis, mostly related to the presence of intravascular access devices. Most of the patients had a surgical indication. Patients with type 2 diabetes mellitus and decreased right ventricular systolic function had an increased mortality rate at 12 months.
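The modified Duke criteria mentioned in the Methods combine major criteria (e.g., typical blood-culture findings, echocardiographic evidence of endocardial involvement) and minor criteria into a diagnostic classification. As a minimal sketch, assuming the commonly cited thresholds of the modified (Li) criteria, which the abstract itself does not spell out, the clinical classification rule can be expressed as:

```python
def duke_classification(major: int, minor: int) -> str:
    """Classify a suspected infective endocarditis case by the
    modified Duke clinical criteria (counts of criteria met).

    Definite IE: 2 major, or 1 major + 3 minor, or 5 minor criteria.
    Possible IE: 1 major + 1 minor, or 3 minor criteria.
    """
    if major >= 2 or (major >= 1 and minor >= 3) or minor >= 5:
        return "definite"
    if (major >= 1 and minor >= 1) or minor >= 3:
        return "possible"
    return "rejected"

# e.g., persistently positive blood cultures with a typical organism
# plus an echocardiographic vegetation -> 2 major criteria
print(duke_classification(major=2, minor=0))  # definite
```

Only cases falling in the "definite" branch were eligible for this cohort.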
Subjects
Diabetes Mellitus, Type 2; Endocarditis, Bacterial; Endocarditis; Staphylococcal Infections; Endocarditis/diagnosis; Endocarditis/epidemiology; Endocarditis/therapy; Endocarditis, Bacterial/diagnosis; Endocarditis, Bacterial/epidemiology; Endocarditis, Bacterial/therapy; Female; Hospital Mortality; Humans; Male; Retrospective Studies
ABSTRACT
Background: Antimalarial drugs were widely used as experimental therapies against COVID-19 in the initial stages of the pandemic. Despite multiple randomized controlled trials demonstrating unfavorable outcomes in both efficacy and adverse effects, antimalarial drugs are still prescribed in developing countries, especially in those experiencing recurrent COVID-19 crises (e.g., India and Brazil). Therefore, real-life experience and pharmacovigilance studies describing the use and side effects of antimalarials for COVID-19 in developing countries are still relevant. Objective: To describe the adverse effects associated with the use of antimalarial drugs in hospitalized patients with COVID-19 pneumonia at a reference center in Mexico City. Methods: We assembled a retrospective cohort of all adult patients hospitalized for COVID-19 pneumonia from March 13th, 2020, to May 17th, 2020. We compared the baseline characteristics (demographic and clinical) and the adverse effects between the groups of patients treated with and without antimalarial drugs. The mortality analysis was performed in 491 patients who received optimal care and were not transferred to other institutions (210 from the antimalarial group and 281 from the other group). Results: We included 626 patients, of whom 38% (n = 235) received an antimalarial drug. The mean age was 51.2 ± 13.6 years, and 64% were male. At baseline, compared with the group treated with antimalarials, the group that did not receive antimalarials had more dyspnea (82 vs. 73%, p = 0.017) and cyanosis (5.3 vs. 0.9%, p = 0.009), a higher respiratory rate (median 28 vs. 24 breaths/min, p < 0.001), and lower oxygen saturation (median 83 vs. 87%, p < 0.001). In the group treated with antimalarials, 120 patients had two EKG evaluations, of whom 12% (n = 16) had a QTc prolongation of more than 50 ms from baseline, and six developed a ventricular arrhythmia.
Regarding the trajectories of the liver function tests over time, no significant differences were found in the change in mean value per day between the two groups. Among patients who received optimal care, mortality was 16% (33/210) in those treated with antimalarials and 15% (41/281) in those not receiving antimalarials (RR 1.08, 95% CI 0.75-1.64; adjusted RR 1.12, 95% CI 0.69-1.82). Conclusion: Adverse events in patients with COVID-19 treated with antimalarials were similar to those in patients who did not receive antimalarials at institutions with rigorous pharmacological surveillance. However, antimalarials did not improve survival in patients who received optimal medical care.
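The unadjusted relative risk reported above can be reproduced from the raw counts (33/210 deaths with antimalarials vs. 41/281 without). The sketch below uses a standard log-normal approximation for the confidence interval, which is an assumption; the study's exact CI method is not stated and may differ slightly.

```python
import math

def relative_risk(a, n1, b, n2, z=1.96):
    """Relative risk of an event in group 1 vs. group 2, with a
    log-normal approximate 95% confidence interval.
    a, b: event counts; n1, n2: group sizes."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)  # SE of log(RR)
    lo, hi = (math.exp(math.log(rr) + s * z * se) for s in (-1, 1))
    return rr, lo, hi

# Mortality with optimal care: 33/210 (antimalarials) vs. 41/281 (none)
rr, lo, hi = relative_risk(33, 210, 41, 281)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # RR 1.08 (95% CI 0.71-1.64)
```

The point estimate and upper bound match the abstract; the lower bound computed this way (≈0.71) differs slightly from the reported 0.75, presumably because a different interval method was used.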
ABSTRACT
Abstract: Acute kidney injury (AKI) is one of the most common entities in intensive care, occurring in up to 50% of patients admitted to the intensive care unit (ICU). New tools that estimate AKI risk from highly accessible routine information have been developed (AKI predictor). The goal was to determine the performance of the AKI predictor tool in detecting renal damage and renal replacement therapy (RRT) requirement in adult patients admitted to the ICU. Material and methods: Demographic, biochemical, and clinical variables and the AKI predictor value at admission and at 24 hours were retrospectively collected for every patient admitted over an 8-month period. Renal damage was defined as a requirement for RRT and/or an increase in creatinine ≥ 0.3 mg/dL in 24 hours and/or urine output < 0.5 mL/kg/h at 48 hours. Receiver operating characteristic curves were constructed to determine the performance of the AKI predictor in detecting renal damage and RRT requirement separately. Results: 95 patients were included in the analysis; those with renal damage showed higher illness severity by Sequential Organ Failure Assessment score, and a higher proportion of them had sepsis, vasopressor use, mortality, and longer ICU stay. The AKI predictor calculated at admission showed a significant area under the curve (AUC) of 0.76 for the detection of renal damage and 0.85 for RRT requirement; calculated at 24 hours, it showed a significant AUC of 0.91 for RRT requirement. Conclusions: The AKI predictor tool is a viable option in daily practice for the dynamic evaluation of patients who will show progression of renal damage, up to its final consequence, the need for RRT.
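The renal-damage definition and the ROC analysis described in the Methods can be sketched as follows. The patient records below are hypothetical, and the pure-Python pairwise AUC (the Mann-Whitney interpretation of the ROC area) stands in for whatever statistical software the authors actually used.

```python
def renal_damage(needs_rrt: bool, delta_creatinine_24h: float,
                 urine_output_48h: float) -> bool:
    """Renal damage per the study definition: RRT requirement, and/or
    creatinine rise >= 0.3 mg/dL in 24 h, and/or
    urine output < 0.5 mL/kg/h at 48 h."""
    return needs_rrt or delta_creatinine_24h >= 0.3 or urine_output_48h < 0.5

def auc(scores, labels):
    """Area under the ROC curve via pairwise comparison: the fraction of
    (positive, negative) pairs ranked correctly (ties count 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical patients: (AKI-predictor score, RRT, delta-creat, urine output)
patients = [(0.82, True, 0.6, 0.3), (0.74, False, 0.4, 0.6),
            (0.35, False, 0.1, 0.9), (0.51, False, 0.5, 0.4),
            (0.20, False, 0.0, 1.2), (0.12, False, 0.2, 0.8)]
scores = [p[0] for p in patients]
labels = [renal_damage(*p[1:]) for p in patients]
print(auc(scores, labels))  # 1.0 on this toy data (perfect separation)
```

In the study itself the analogous computation over 95 real patients yielded AUCs of 0.76-0.91 depending on the outcome and timing.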