1.
Enferm. intensiva (Ed. impr.) ; 29(3): 121-127, jul.-sept. 2018. tab, graf
Article in Spanish | IBECS | ID: ibc-182123

ABSTRACT

Blood culture contamination can occur at any point from extraction to processing, and its rate should not exceed 3%. Objective: To evaluate the impact of a training programme on the rate of contaminated blood cultures after implementing sample-collection recommendations based on the best available evidence. Method: Prospective before-after study in an 18-bed polyvalent intensive care unit. Two phases were established (January-June 2012 and October 2012-October 2015) with a training period between them. Main recommendations: sterile technique, surgical mask, double skin disinfection (70° alcohol and 2% alcoholic chlorhexidine), disinfection of culture flask caps with 70° alcohol, and injection of samples without changing the needle. All blood cultures from patients with a physician's request for extraction were included. Variables: demographics, severity, pathology, reason for admission, length of stay and blood culture results (negative, positive or contaminated). Basic descriptive statistics: mean (standard deviation), median (interquartile range) or percentage (95% confidence interval). Contamination rates were calculated per 100 blood cultures drawn. Bivariate analysis between periods. Results: 458 patients were included and 841 blood cultures were drawn, 33 of which were contaminated. No differences were observed between patients with contaminated and uncontaminated samples in demographic variables, severity, diagnosis or length of stay. Pre-training vs post-training contamination rates: 14 vs 5.6 per 100 blood cultures drawn (p = 0.00003). Conclusion: An evidence-based training programme reduced sample contamination. Further work is needed on planning activities and care to improve the detection of contaminants and to prevent sample contamination.


Blood culture contamination can occur from extraction to processing; its rate should not exceed 3%. Objective: To evaluate the impact of a training programme on the rate of contaminated blood cultures after the implementation of sample extraction recommendations based on the best evidence. Method: Prospective before-after study in a polyvalent intensive care unit with 18 beds. Two phases were established (January-June 2012, October 2012-October 2015) with a training period between them. Main recommendations: sterile technique, surgical mask, double skin disinfection (70° alcohol and 2% alcoholic chlorhexidine), 70° alcohol disinfection of culture flask caps and injection of samples without changing needles. All blood cultures from patients with a physician's request for extraction were included. Variables: demographic, severity, pathology, reason for admission, stay and results of blood cultures (negative, positive and contaminated). Basic descriptive statistics: mean (standard deviation), median (interquartile range) and percentage (95% confidence interval). Contamination rates were calculated per 100 blood cultures extracted. Bivariate analysis between periods. Results: Four hundred and eight patients were included. Eight hundred and forty-one blood cultures were taken, 33 of which were contaminated. In the demographic variables, severity, diagnosis and stay of patients with contaminated samples, no differences were observed from those with uncontaminated samples. Pre-training vs post-training contamination rates: 14 vs 5.6 per 100 blood cultures extracted (P = .00003). Conclusion: An evidence-based training programme reduced the contamination of samples. It is necessary to continue working on the planning of activities and care to improve the detection of contaminants and prevent contamination of samples.


Subject(s)
Humans , Male , Female , Middle Aged , Aged , Blood/microbiology , Blood Culture/standards , Blood Specimen Collection/standards , Critical Care , Critical Care Nursing/education , Intensive Care Units , Program Evaluation , Prospective Studies
2.
Enferm Intensiva (Engl Ed) ; 29(3): 121-127, 2018.
Article in English, Spanish | MEDLINE | ID: mdl-29609850

ABSTRACT

Blood culture contamination can occur from extraction to processing; its rate should not exceed 3%. OBJECTIVE: To evaluate the impact of a training programme on the rate of contaminated blood cultures after the implementation of sample extraction recommendations based on the best evidence. METHOD: Prospective before-after study in a polyvalent intensive care unit with 18 beds. Two phases were established (January-June 2012, October 2012-October 2015) with a training period between them. Main recommendations: sterile technique, surgical mask, double skin disinfection (70° alcohol and 2% alcoholic chlorhexidine), 70° alcohol disinfection of culture flask caps and injection of samples without changing needles. All blood cultures from patients with a physician's request for extraction were included. VARIABLES: demographic, severity, pathology, reason for admission, stay and results of blood cultures (negative, positive and contaminated). Basic descriptive statistics: mean (standard deviation), median (interquartile range) and percentage (95% confidence interval). Contamination rates were calculated per 100 blood cultures extracted. Bivariate analysis between periods. RESULTS: Four hundred and eight patients were included. Eight hundred and forty-one blood cultures were taken, 33 of which were contaminated. In the demographic variables, severity, diagnosis and stay of patients with contaminated samples, no differences were observed from those with uncontaminated samples. Pre-training vs post-training contamination rates: 14 vs 5.6 per 100 blood cultures extracted (P=.00003). CONCLUSION: An evidence-based training programme reduced the contamination of samples. It is necessary to continue working on the planning of activities and care to improve the detection of contaminants and prevent contamination of samples.
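The headline comparison in this abstract reduces to a contamination rate per 100 blood cultures drawn in each period and a bivariate test between the two periods. The sketch below is a minimal, hypothetical illustration of that arithmetic in Python, assuming SciPy is available; the function names and the per-period counts are placeholders, since the abstract reports only the overall figures (841 cultures, 33 contaminated) and the two rates, not the split of cultures by period.

# Illustrative sketch only: contamination rate per 100 cultures and a
# chi-squared comparison between two periods. Counts below are hypothetical.
from scipy.stats import chi2_contingency

def contamination_rate_per_100(contaminated: int, total: int) -> float:
    """Contaminated blood cultures per 100 cultures drawn."""
    return 100.0 * contaminated / total

def compare_periods(cont_pre: int, total_pre: int,
                    cont_post: int, total_post: int) -> tuple[float, float]:
    """Chi-squared test of contamination proportions between two periods."""
    table = [
        [cont_pre, total_pre - cont_pre],     # pre-training: contaminated / clean
        [cont_post, total_post - cont_post],  # post-training: contaminated / clean
    ]
    chi2, p, _, _ = chi2_contingency(table)
    return chi2, p

if __name__ == "__main__":
    pre_cont, pre_total = 14, 100      # hypothetical counts (14 per 100)
    post_cont, post_total = 28, 500    # hypothetical counts (5.6 per 100)
    print(f"Pre-training rate:  {contamination_rate_per_100(pre_cont, pre_total):.1f} per 100")
    print(f"Post-training rate: {contamination_rate_per_100(post_cont, post_total):.1f} per 100")
    chi2, p = compare_periods(pre_cont, pre_total, post_cont, post_total)
    print(f"chi2 = {chi2:.2f}, p = {p:.5f}")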


Subject(s)
Blood Culture/standards , Blood Specimen Collection/standards , Blood/microbiology , Critical Care Nursing/education , Critical Care , Female , Humans , Intensive Care Units , Male , Middle Aged , Program Evaluation , Prospective Studies
8.
Med. intensiva (Madr., Ed. impr.) ; 36(9): 626-633, dic. 2012. ilus
Article in Spanish | IBECS | ID: ibc-110100

ABSTRACT

Introduction: Acute renal damage (ARD) is a frequent syndrome in hospitalized patients. The classically accepted risk factors associated with its development and course relate to the environment or to the patient's underlying disease. In recent years, however, the influence of genetic factors has been recognized. Objective: To analyze the influence of genetic polymorphisms on the risk of developing ARD and on its course. Data source: electronic MEDLINE search. Selection of studies: manuscripts written in English or Spanish, published between 1/1/1995 and 31/5/2011, analyzing the association between genetic polymorphisms and: (a) susceptibility to ARD in patients versus healthy controls or between different patient groups; (b) severity of ARD. Exclusion criteria: studies published only in abstract form, case reports, or studies including patients under 16 years of age, on chronic dialysis or with a renal transplant. Data extraction: at least one investigator analyzed each article using a predefined form. Results: Twelve studies including 4,835 patients were found. Eleven genes contain polymorphisms associated with ARD susceptibility or severity. We classified these genes according to their function into those involved in the hemodynamic response (ACE, eNOS, FNMT and COMT), the inflammatory response (TNFα, IL10, IL6, HIP-1A, EPO), oxidative stress (NADPH oxidase) and lipid metabolism (APOE). Only the APOE, ACE and AT1 receptor genes have been analyzed in more than one study. Conclusion: ARD susceptibility and severity are related to genetic factors involved in different pathophysiological mechanisms.


Introduction: Acute renal damage (ARD) is a frequent syndrome in hospitalized patients. It is well accepted that ARD susceptibility and outcome are related to environmental risk factors and to the patient's premorbid status. Recently, host factors have also been recognized as important in ARD predisposition and evolution. Objective: To analyze genetic influences related to the risk and severity of ARD. Data source: MEDLINE search. Selection of studies: articles published in English or Spanish between 1/1/1995 and 31/5/2011, analyzing the association between gene polymorphisms and (a) ARD susceptibility in patients versus healthy controls or within groups of patients; or (b) ARD severity. Exclusion criteria: studies published only in abstract form, case reports, or studies including patients less than 16 years of age, on chronic dialysis or having received a renal transplant. Data extraction: at least one investigator analyzed each manuscript and collected the information using a predefined form. Results: We identified 12 relevant studies that included 4835 patients. Eleven genes showed polymorphisms related to ARD susceptibility or severity. They were related to cardiovascular regulation (ACE I/D, eNOS, FNMT and COMT), inflammatory response (TNFα, IL10, IL6, HIP-1α, EPO), oxidative stress (NADPH oxidase) and lipid metabolism (APO E). Only APO E, ACE and the AT1 receptor have been analyzed in more than one study. Conclusion: ARD susceptibility and severity are influenced by genetic factors, which are multiple and involve different physiopathological mechanisms.


Subject(s)
Humans , Acute Kidney Injury/genetics , Renal Insufficiency/genetics , Polymorphism, Genetic , Genetic Techniques , Genetic Predisposition to Disease/genetics , Genetic Markers
9.
Med Intensiva ; 36(9): 626-33, 2012 Dec.
Article in Spanish | MEDLINE | ID: mdl-22436318

ABSTRACT

INTRODUCTION: Acute renal damage (ARD) is a frequent syndrome in hospitalized patients. It is well accepted that ARD susceptibility and outcome are related to environmental risk factors and to the patient's premorbid status. Recently, host factors have also been recognized as important in ARD predisposition and evolution. OBJECTIVE: To analyze genetic influences related to the risk and severity of ARD. DATA SOURCE: MEDLINE search. SELECTION OF STUDIES: articles published in English or Spanish between 1/1/1995 and 31/5/2011, analyzing the association between gene polymorphisms and (a) ARD susceptibility in patients versus healthy controls or within groups of patients; or (b) ARD severity. EXCLUSION CRITERIA: studies published only in abstract form, case reports, or studies including patients less than 16 years of age, on chronic dialysis or having received a renal transplant. DATA EXTRACTION: at least one investigator analyzed each manuscript and collected the information using a predefined form. RESULTS: We identified 12 relevant studies that included 4835 patients. Eleven genes showed polymorphisms related to ARD susceptibility or severity. They were related to cardiovascular regulation (ACE I/D, eNOS, FNMT and COMT), inflammatory response (TNFα, IL10, IL6, HIP-1α, EPO), oxidative stress (NADPH oxidase) and lipid metabolism (APO E). Only APO E, ACE and the AT1 receptor have been analyzed in more than one study. CONCLUSION: ARD susceptibility and severity are influenced by genetic factors, which are multiple and involve different physiopathological mechanisms.
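As a quick reference, the gene-by-mechanism grouping reported in the Results can be captured as a small lookup table. The Python sketch below is purely illustrative: the constant names are arbitrary, and the gene symbols are reproduced as printed in the abstract rather than normalized to current nomenclature.

# Illustrative only: functional grouping of the polymorphism-bearing genes
# reported in this review, kept as a simple mapping.
ARD_GENE_GROUPS: dict[str, list[str]] = {
    "cardiovascular regulation": ["ACE I/D", "eNOS", "FNMT", "COMT"],
    "inflammatory response": ["TNFα", "IL10", "IL6", "HIP-1α", "EPO"],
    "oxidative stress": ["NADPH oxidase"],
    "lipid metabolism": ["APO E"],
}  # 4 + 5 + 1 + 1 = the 11 genes reported

# Genes analyzed in more than one study, per the review.
REPLICATED = {"APO E", "ACE", "AT1 receptor"}

if __name__ == "__main__":
    for mechanism, genes in ARD_GENE_GROUPS.items():
        print(f"{mechanism}: {', '.join(genes)}")
    print("Analyzed in more than one study:", ", ".join(sorted(REPLICATED)))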


Subject(s)
Acute Kidney Injury/genetics , Genetic Predisposition to Disease , Humans , Polymorphism, Genetic , Prognosis , Risk Factors