Results 1 - 20 of 44
1.
Clin Microbiol Infect ; 26(5): 563-571, 2020 May.
Article in English | MEDLINE | ID: mdl-31586658

ABSTRACT

OBJECTIVES: The prevention of catheter-related bloodstream infection (CRBSI) has been an area of intense research, but the heterogeneity of endpoints used to define catheter infection makes the interpretation of randomized controlled trials (RCTs) problematic. The aim of this study was to determine the validity of different endpoints for central venous catheter infections. DATA SOURCES: (a) Individual-catheter data were collected from 9428 catheters from four large RCTs; (b) study-level data from 70 RCTs were identified with a systematic search. Eligible studies were RCTs published between January 1987 and October 2018 investigating various interventions to reduce infections from short-term central venous catheters or short-term dialysis catheters. For each RCT the prevalence rates of CRBSI, quantitative catheter tip colonization, catheter-associated infection (CAI) and central line-associated bloodstream infection (CLABSI) were extracted for each randomized study arm. METHODS: CRBSI was used as the gold-standard endpoint, for which colonization, CAI and CLABSI were evaluated as surrogate endpoints. Surrogate validity was assessed as (1) the individual partial coefficient of determination (individual-pR2) using individual catheter data; (2) the coefficient of determination (study-R2) from mixed-effect models regressing the therapeutic effect size of the surrogates on the effect size of CRBSI, using study-level data. RESULTS: Colonization showed poor agreement with CRBSI at the individual-patient level (pR2 = 0.33, 95% CI 0.28-0.38) and poor capture at the study level (R2 = 0.42, 95% CI 0.21-0.58). CAI showed good agreement with CRBSI at the individual-patient level (pR2 = 0.80, 95% CI 0.76-0.83) and moderate capture at the study level (R2 = 0.71, 95% CI 0.51-0.85). CLABSI showed poor agreement with CRBSI at the individual-patient level (pR2 = 0.34, 95% CI 0.23-0.46) and poor capture at the study level (R2 = 0.28, 95% CI 0.07-0.76).
CONCLUSIONS: CAI is a moderate to good surrogate endpoint for CRBSI. Colonization and CLABSI do not reliably reflect treatment effects on CRBSI and are consequently more suitable for surveillance than for clinical effectiveness research.
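The study-level validation described above regresses each surrogate's per-study effect size on the effect size for CRBSI and reports the coefficient of determination. A minimal sketch of that idea, using simple least squares on invented log odds ratios (the paper used mixed-effect models; the numbers below are illustrative only, not data from the review):

```python
# Hypothetical per-study effect sizes (log odds ratios) for the gold-standard
# endpoint (CRBSI) and a candidate surrogate (e.g. CAI). Illustrative values.
crbsi = [-0.9, -0.4, 0.1, -1.2, -0.6, 0.3, -0.8]
surrogate = [-0.8, -0.5, 0.2, -1.0, -0.4, 0.4, -0.7]

def r_squared(x, y):
    """Coefficient of determination from a least-squares fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

print(round(r_squared(surrogate, crbsi), 2))
```

A surrogate whose treatment effects track CRBSI closely yields an R2 near 1 (as reported for CAI); one that does not (colonization, CLABSI) yields a low R2.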


Subject(s)
Bacteremia/diagnosis , Biomarkers/analysis , Catheter-Related Infections/diagnosis , Central Venous Catheters/adverse effects , Bacteremia/therapy , Catheter-Related Infections/therapy , Central Venous Catheters/microbiology , Humans , Network Meta-Analysis , Patient Outcome Assessment , Practice Guidelines as Topic , Reproducibility of Results
2.
Clin Nutr ; 38(6): 2623-2631, 2019 12.
Article in English | MEDLINE | ID: mdl-30595377

ABSTRACT

BACKGROUND & AIMS: High protein delivery during early critical illness is associated with lower mortality, while energy overfeeding is associated with higher mortality. Protein-to-energy ratios of traditional enteral formulae are sometimes too low to reach protein targets without energy overfeeding. This prospective feasibility study aimed to evaluate the ability of a new enteral formula with a high protein-to-energy ratio to achieve the desired protein target while avoiding energy overfeeding. METHODS: Mechanically ventilated non-septic patients received the high protein-to-energy ratio nutrition during the first 4 days of ICU stay (n = 20). The nutritional prescription was 90% of measured energy expenditure. The primary endpoint was the percentage of patients reaching a protein target of ≥1.2 g/kg ideal body weight on day 4. Other endpoints included a comparison of nutritional intake with matched historic controls and the response of plasma amino acid concentrations. Safety endpoints were gastro-intestinal tolerance and plasma urea concentrations. RESULTS: Nineteen (95%) patients reached the protein intake target of ≥1.2 g/kg ideal body weight on day 4, compared with 65% of historic controls (p = 0.024). Mean plasma concentrations of all essential amino acids increased significantly from baseline to day 4. Predefined gastro-intestinal tolerance was good, but unexplained foul-smelling diarrhoea occurred in two patients. In one patient, plasma urea increased, unrelated to acute kidney injury. CONCLUSIONS: In selected non-septic patients tolerating enteral nutrition, recommended protein targets can be achieved without energy overfeeding using a new high protein-to-energy ratio enteral nutrition.


Subject(s)
Critical Care/methods , Critical Illness/therapy , Energy Intake/physiology , Nutritional Status/physiology , Adult , Aged , Amino Acids/blood , Dietary Proteins/administration & dosage , Feasibility Studies , Female , Humans , Male , Middle Aged , Overnutrition/prevention & control , Prospective Studies
3.
BJA Educ ; 19(9): 290-296, 2019 Sep.
Article in English | MEDLINE | ID: mdl-33456905
4.
PLoS One ; 12(8): e0182637, 2017.
Article in English | MEDLINE | ID: mdl-28796814

ABSTRACT

Hospitalized patients often receive oxygen supplementation, which can lead to a supraphysiological oxygen tension (hyperoxia). Hyperoxia can have hemodynamic effects, including an increase in systemic vascular resistance. This increase suggests hyperoxia-induced vasoconstriction, yet reported direct effects of hyperoxia on vessel tone have been inconsistent. Furthermore, hyperoxia-induced changes in vessel diameter have not been studied in mice, currently the most widely used mammalian model of disease. In this study we set out to develop a pressure-myograph model using isolated vessels from mice for investigation of pathways involved in hyperoxic vasoconstriction. Isolated conduit and resistance arteries (femoral artery and gracilis arteriole, respectively) from C57BL/6 mice were exposed to normoxia (PO2 of 80 mmHg) and three levels of hyperoxia (PO2 of 215, 375 and 665 mmHg) in a no-flow pressure myograph setup. Under the different PO2 levels, dose-response curves were obtained for agonist-induced endothelium-dependent vasodilation (acetylcholine, arachidonic acid), endothelium-independent vasodilation (sodium nitroprusside) and vasoconstriction (norepinephrine, prostaglandin F2α). The investigated arteries did not respond to oxygen with a change in vascular tone. In the dose-response studies, maximal responses and EC50 values for any of the aforementioned agonists were not affected by hyperoxia either. We conclude that arteries and arterioles from healthy mice are not intrinsically sensitive to hyperoxic conditions. The present ex-vivo model is therefore not suitable for further research into mechanisms of hyperoxic vasoconstriction.


Subject(s)
Femoral Artery/physiopathology , Hyperoxia/physiopathology , Acetylcholine/pharmacology , Animals , Arachidonic Acid/pharmacology , Drug Evaluation, Preclinical , Femoral Artery/drug effects , Male , Mice, Inbred C57BL , Muscle, Smooth, Vascular/drug effects , Muscle, Smooth, Vascular/physiopathology , Nitroprusside/pharmacology , Norepinephrine/pharmacology , Oxygen/pharmacology , Vasoconstriction , Vasoconstrictor Agents/pharmacology , Vasodilation , Vasodilator Agents/pharmacokinetics
5.
Intensive Care Med ; 43(6): 730-749, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28577069

ABSTRACT

BACKGROUND: Acute kidney injury (AKI) in the intensive care unit is associated with significant mortality and morbidity. OBJECTIVES: To determine and update previous recommendations for the prevention of AKI, specifically the role of fluids, diuretics, inotropes, vasopressors/vasodilators, hormonal and nutritional interventions, sedatives, statins, remote ischaemic preconditioning and care bundles. METHODS: A systematic search of the literature was performed for studies published between 1966 and March 2017 using these potential protective strategies in adult patients at risk of AKI. The following clinical conditions were considered: major surgery, critical illness, sepsis, shock, and exposure to potentially nephrotoxic drugs and radiocontrast. Clinical endpoints included incidence or grade of AKI, the need for renal replacement therapy and mortality. Studies were graded according to the international GRADE system. RESULTS: We formulated 12 recommendations, 13 suggestions and seven best practice statements. The few strong recommendations with high-level evidence are mostly against the intervention in question (starches, low-dose dopamine, statins in cardiac surgery). Strong recommendations with lower-level evidence include controlled fluid resuscitation with crystalloids, avoiding fluid overload, titration of norepinephrine to a target MAP of 65-70 mmHg (unless chronic hypertension is present) and not using diuretics or levosimendan solely for kidney protection. CONCLUSION: The results of recent randomised controlled trials have allowed the formulation of new recommendations and/or increased the strength of previous ones. On the other hand, in many domains the available evidence remains insufficient, owing to the limited quality of the clinical trials and the poor reporting of kidney outcomes.


Subject(s)
Acute Kidney Injury/prevention & control , Acute Kidney Injury/therapy , Critical Care/standards , Practice Guidelines as Topic , Adult , Aged , Aged, 80 and over , Europe , Female , Humans , Male , Middle Aged
6.
Intensive Care Med ; 43(6): 855-866, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28466146

ABSTRACT

Acute kidney injury (AKI) is a frequent complication of critical illness and carries a significant risk of short- and long-term mortality, chronic kidney disease (CKD) and cardiovascular events. The degree of renal recovery from AKI may substantially affect these long-term endpoints. Therefore, maximising recovery of renal function should be the goal of any AKI prevention and treatment strategy. Defining renal recovery is far from straightforward, due in part to the limitations of the tests available to assess renal function. Here, we discuss common pitfalls in the evaluation of renal recovery and provide suggestions for improved assessment in the future. We review the epidemiology of renal recovery and of the association between AKI and the development of CKD. Finally, we stress the importance of post-discharge follow-up of AKI patients and make suggestions for its incorporation into clinical practice. The key points are, first, that risk factors for non-recovery of AKI are age, CKD, comorbidity, higher severity of AKI and acute disease scores. Second, AKI and CKD are mutually related and seem to have a common denominator. Third, despite its limitations, full recovery of AKI may best be defined as the absence of AKI criteria, and partial recovery as a fall in AKI stage. Fourth, after an episode of AKI, serial follow-up measurements of serum creatinine and proteinuria are warranted to diagnose renal impairment and prevent further progression. Measures to promote recovery are similar to those preventing renal harm. Specific interventions promoting repair are still experimental.


Subject(s)
Acute Kidney Injury/therapy , Creatinine/blood , Critical Illness/therapy , Kidney/physiopathology , Recovery of Function , Renal Insufficiency, Chronic/therapy , Humans , Kidney Function Tests
7.
J Crit Care ; 39: 199-204, 2017 06.
Article in English | MEDLINE | ID: mdl-28279497

ABSTRACT

BACKGROUND: Concerns have been expressed regarding a possible association between arterial hyperoxia and adverse outcomes in critically ill patients. Oxygen status is commonly monitored noninvasively by peripheral saturation monitoring (SpO2). However, the risk of hyperoxia above specific SpO2 levels in critically ill patients is unknown. The purpose of this study was to determine a threshold value of SpO2 above which the prevalence of arterial hyperoxia distinctly increases. METHODS: This is a cross-sectional study in adult mechanically ventilated intensive care patients in a tertiary referral center. In 100 patients, we collected 200 arterial blood gases (ABG) and simultaneously registered SpO2 levels, as well as hemodynamic and ventilation parameters and vasoactive medication. Patients under therapeutic hypothermia were excluded. RESULTS: The risk of arterial hyperoxia, defined as PaO2 > 100 mmHg or > 125 mmHg, was negligible when SpO2 was ≤95% or ≤96%, respectively. The majority (89% and 54%, respectively, for PaO2 > 100 mmHg and > 125 mmHg) of ICU patients with an SpO2 of 100% had arterial hyperoxia. The relation between SpO2 and PaO2 was not clearly affected by hemodynamic or other clinical variables (pH, pCO2, body temperature, recent blood transfusion). CONCLUSION: In critically ill patients, the prevalence of arterial hyperoxia increases when SpO2 is >95%. Above this saturation level, supplemental oxygen should be administered with caution in patients potentially susceptible to adverse effects of hyperoxia.


Subject(s)
Hyperoxia/diagnosis , Hyperoxia/prevention & control , Oximetry/methods , Oxygen/blood , Respiration, Artificial/adverse effects , Adult , Aged , Blood Gas Analysis , Critical Care , Critical Illness , Cross-Sectional Studies , Female , Humans , Intensive Care Units , Male , Middle Aged , Monitoring, Physiologic , Patient Admission , Prospective Studies , Treatment Outcome
8.
Intensive Care Med ; 43(3), 2017 Mar.
Article in English | BIGG - GRADE guidelines | ID: biblio-948580

ABSTRACT

PURPOSE: To provide evidence-based guidelines for early enteral nutrition (EEN) during critical illness. METHODS: We aimed to compare EEN vs. early parenteral nutrition (PN) and vs. delayed EN. We defined "early" EN as EN started within 48 h, independent of type or amount. We listed, a priori, conditions in which EN is often delayed, and performed systematic reviews in 24 such subtopics. If sufficient evidence was available, we performed meta-analyses; if not, we qualitatively summarized the evidence and based our recommendations on expert opinion. We used the GRADE approach for guideline development. The final recommendations were compiled via Delphi rounds. RESULTS: We formulated 17 recommendations favouring initiation of EEN and seven recommendations favouring delaying EN. We performed five meta-analyses: in unselected critically ill patients, and specifically in traumatic brain injury, severe acute pancreatitis, gastrointestinal (GI) surgery and abdominal trauma. EEN reduced infectious complications in unselected critically ill patients, in patients with severe acute pancreatitis, and after GI surgery. We did not detect any evidence of superiority for early PN or delayed EN over EEN. All recommendations are weak because of the low quality of evidence, with several based only on expert opinion. CONCLUSIONS: We suggest using EEN in the majority of critically ill patients, under certain precautions. In the absence of evidence, we suggest delaying EN in critically ill patients with uncontrolled shock, uncontrolled hypoxaemia and acidosis, uncontrolled upper GI bleeding, gastric aspirate >500 ml/6 h, bowel ischaemia, bowel obstruction, abdominal compartment syndrome, and high-output fistula without distal feeding access.


Subject(s)
Humans , Catastrophic Illness/therapy , Critical Illness/therapy , Enteral Nutrition/standards , Time Factors , GRADE Approach
9.
Anaesthesia ; 70(11): 1307-19, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26348878

ABSTRACT

During and after cardiac surgery with cardiopulmonary bypass, high concentrations of oxygen are routinely administered, with the intention of preventing cellular hypoxia. We systematically reviewed the literature addressing the effects of arterial hyperoxia. Extensive evidence from pre-clinical experiments and clinical studies in other patient groups suggests predominant harm, caused by oxidative stress, vasoconstriction, perfusion heterogeneity and myocardial injury. Whether these alterations are temporary and benign, or actually affect clinical outcome, remains to be demonstrated. In nine clinical cardiac surgical studies in low-risk patients, higher oxygen targets tended to compromise cardiovascular function, but did not affect clinical outcome. No data about potential beneficial effects of hyperoxia, such as reduction of gas micro-emboli or post-cardiac surgery infections, were reported. Current evidence is insufficient to specify optimal oxygen targets. Nevertheless, the safety of supraphysiological oxygen supplementation is unproven. Randomised studies with a variety of oxygen targets and inclusion of high-risk patients are needed to identify optimal oxygen targets during and after cardiac surgery.


Subject(s)
Cardiac Surgical Procedures/adverse effects , Heart/physiopathology , Hyperoxia/chemically induced , Oxygen/adverse effects , Cardiopulmonary Bypass , Humans , Hyperoxia/physiopathology , Inflammation/etiology , Inflammation/physiopathology , Oxidative Stress/physiology , Postoperative Period , Vasoconstriction/physiology
10.
J Crit Care ; 30(1): 167-72, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25446372

ABSTRACT

PURPOSE: The Behavioral Pain Scale (BPS) and Critical-Care Pain Observation Tool (CPOT) are behavioral pain assessment tools for uncommunicative and sedated intensive care unit (ICU) patients. This study compares the discriminant validation and reliability of the CPOT and the BPS, simultaneously, in mechanically ventilated patients on a mixed-adult ICU. MATERIALS AND METHODS: This is a prospective observational cohort study in 68 mechanically ventilated medical ICU patients who were unable to report pain. RESULTS: The BPS and CPOT scores showed a significant increase of 2 points between rest and the painful procedure (turning). The median BPS scores between rest and the nonpainful procedure (oral care) showed a significant increase of 1 point, whereas the median CPOT score remained unchanged. The interrater reliability of the BPS and CPOT scores showed a fair to good agreement (0.74 and 0.75, respectively). CONCLUSIONS: This study showed that the BPS and the CPOT are reliable and valid for use in a daily clinical setting. Although both scores increased with a presumed painful stimulus, the discriminant validation of the BPS use was less supported because it increased during a nonpainful stimulus. The CPOT appears preferable in this particular group of patients, especially with regard to its discriminant validation.


Subject(s)
Critical Illness , Pain Measurement/methods , Respiration, Artificial , Adult , Behavior , Critical Care/methods , Discriminant Analysis , Female , Humans , Male , Middle Aged , Observer Variation , Prospective Studies , Reproducibility of Results
11.
Crit Care ; 18(2): R12, 2014 Jan 13.
Article in English | MEDLINE | ID: mdl-24410863

ABSTRACT

INTRODUCTION: Higher body mass index (BMI) is associated with lower mortality in mechanically ventilated critically ill patients. However, it is as yet unclear which body component is responsible for this relationship. METHODS: This retrospective analysis included 240 adult mechanically ventilated critically ill patients in whom a computed tomography (CT) scan of the abdomen was made on clinical indication between 1 day before and 4 days after admission to the intensive care unit. CT scans were analyzed at the L3 level for skeletal muscle area, expressed in square centimeters. Cutoff values were defined by receiver operating characteristic (ROC) curve analysis: 110 cm2 for females and 170 cm2 for males. Backward stepwise regression analysis was used to evaluate low-muscle area in relation to hospital mortality, with low-muscle area, sex, BMI, Acute Physiology and Chronic Health Evaluation (APACHE) II score, and diagnosis category as independent variables. RESULTS: This study included 240 patients, 94 female and 146 male. Mean age was 57 years; mean BMI, 25.6 kg/m2. Muscle area was significantly lower in females than in males (102 ± 23 cm2 versus 158 ± 33 cm2; P < 0.001). Low-muscle area was observed in 63% of patients, in both females and males. Mortality was 29%, significantly higher in females than in males (37% versus 23%; P = 0.028). Low-muscle area was associated with higher mortality compared with normal-muscle area in females (47.5% versus 20%; P = 0.008) and in males (32.3% versus 7.5%; P < 0.001). Independent predictive factors for mortality were low-muscle area, sex, and APACHE II score, whereas BMI and admission diagnosis were not. The odds ratio for low-muscle area was 4.3 (95% confidence interval, 2.0 to 9.0; P < 0.001). When sex-specific cutoffs were applied to all patients, muscle mass, not sex, appeared to be the primary predictor.
CONCLUSIONS: Low skeletal muscle area, as assessed by CT scan during the early stage of critical illness, is a risk factor for mortality in mechanically ventilated critically ill patients, independent of sex and APACHE II score. Further analysis suggests muscle mass as primary predictor, not sex. BMI is not an independent predictor of mortality when muscle area is accounted for.
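The sex-specific ROC cutoffs reported above (110 cm2 for females, 170 cm2 for males) amount to a simple classification rule; a sketch, with the function name and sex coding our own rather than from the paper:

```python
# Sex-specific cutoffs for low skeletal muscle area at L3, as reported in the
# abstract. "F"/"M" coding and the helper name are illustrative choices.
LOW_MUSCLE_CUTOFF_CM2 = {"F": 110.0, "M": 170.0}

def low_muscle_area(sex: str, l3_muscle_area_cm2: float) -> bool:
    """Flag low skeletal muscle area at L3 using the sex-specific cutoff."""
    return l3_muscle_area_cm2 < LOW_MUSCLE_CUTOFF_CM2[sex]

# A female patient at the cohort's mean female muscle area (102 cm2) falls
# below the 110 cm2 cutoff; a male at 172 cm2 is above the 170 cm2 cutoff.
print(low_muscle_area("F", 102.0))  # True
print(low_muscle_area("M", 172.0))  # False
```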


Subject(s)
Critical Illness/mortality , Critical Illness/therapy , Muscle, Skeletal/diagnostic imaging , Respiration, Artificial/mortality , Adult , Aged , Female , Humans , Male , Middle Aged , Respiration, Artificial/trends , Retrospective Studies , Risk Factors , Survival Rate/trends , Tomography, X-Ray Computed/trends
12.
Br J Surg ; 100(12): 1579-88, 2013 Nov.
Article in English | MEDLINE | ID: mdl-24264779

ABSTRACT

BACKGROUND: Studies on selective decontamination of the digestive tract (SDD) in elective gastrointestinal surgery have shown decreased rates of postoperative infection and anastomotic leakage. However, the prophylactic use of perioperative SDD in elective gastrointestinal surgery is not generally accepted. METHODS: A systematic review of randomized clinical trials (RCTs) was conducted to compare the effect of perioperative SDD with systemic antibiotics (SDD group) with systemic antibiotic prophylaxis alone (control group), using MEDLINE, Embase and the Cochrane Central Register of Controlled Trials. Endpoints included postoperative infection, anastomotic leakage, and in-hospital or 30-day mortality. RESULTS: Eight RCTs published between 1988 and 2011, with a total of 1668 patients (828 in the SDD group and 840 in the control group), were included in the meta-analysis. The total number of patients with infection (reported in 5 trials) was 77 (19.2 per cent) of 401 in the SDD group, compared with 118 (28.2 per cent) of 418 in the control group (odds ratio 0.58, 95 per cent confidence interval 0.42 to 0.82; P = 0.002). The incidence of anastomotic leakage was significantly lower in the SDD group: 19 (3.3 per cent) of 582 patients versus 44 (7.4 per cent) of 595 patients in the control group (odds ratio 0.42, 0.24 to 0.73; P = 0.002). CONCLUSION: This systematic review and meta-analysis suggests that a combination of perioperative SDD and perioperative intravenous antibiotics in elective gastrointestinal surgery reduces the rate of postoperative infection including anastomotic leakage compared with use of intravenous antibiotics alone.
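The pooled infection counts quoted above (77/401 with SDD vs 118/418 in controls) can be turned into a crude odds ratio with a Woolf confidence interval. This naive collapsed-table estimate only approximates the review's pooled OR of 0.58 (0.42 to 0.82), which weights each trial individually:

```python
import math

# 2x2 counts from the pooled infection data quoted in the abstract.
a, b = 77, 401 - 77      # SDD group: infected, not infected
c, d = 118, 418 - 118    # control group: infected, not infected

odds_ratio = (a * d) / (b * c)
# Woolf (log) method for the 95% confidence interval.
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR {odds_ratio:.2f} (95% CI {lo:.2f} to {hi:.2f})")  # OR 0.60 (95% CI 0.44 to 0.84)
```

The crude estimate (0.60, 0.44 to 0.84) lands close to, but not exactly at, the study-weighted meta-analytic result.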


Subject(s)
Anti-Bacterial Agents/administration & dosage , Antibiotic Prophylaxis/methods , Cross Infection/prevention & control , Digestive System Surgical Procedures/methods , Postoperative Complications/prevention & control , Administration, Oral , Anastomotic Leak/prevention & control , Decontamination/methods , Elective Surgical Procedures/methods , Humans , Infusions, Intravenous , Randomized Controlled Trials as Topic , Surgical Wound Infection/prevention & control , Treatment Outcome
14.
Dig Surg ; 28(5-6): 338-44, 2011.
Article in English | MEDLINE | ID: mdl-22005707

ABSTRACT

OBJECTIVE: To study the current application of selective decontamination of the digestive tract (SDD), the use of preoperative antibiotics and mechanical bowel preparation (MBP) in elective gastrointestinal (GI) surgery in surgical departments in the Netherlands. METHODS: A point prevalence survey was carried out and an online questionnaire was sent to GI surgeons of 86 different hospitals. RESULTS: The response rate was 74%. Only 4/64 (6.3%) of the Dutch surgical wards currently use perioperative SDD as a prophylactic strategy to prevent postoperative infectious complications. The 4 hospitals using SDD on their surgical wards also use it on their ICUs. All hospitals make use of perioperative intravenous antibiotic prophylaxis in elective GI surgery. In most hospitals, a cephalosporin and metronidazole are applied (81.3% and 76.6%, respectively). MBP was used in 58 hospitals (90.6%), mainly in left colonic surgery. CONCLUSIONS: Perioperative SDD is rarely used in elective GI surgery in the Netherlands. Perioperative intravenous antibiotic prophylaxis is given in all Dutch hospitals, conforming to guidelines. Although the recent literature does not recommend MBP before surgery, it is still selectively used in 90.6% of the Dutch surgical departments, mainly in open or laparoscopic left colonic surgery (including sigmoid resections).


Subject(s)
Antibiotic Prophylaxis/statistics & numerical data , Cephalosporins/therapeutic use , Gastrointestinal Tract/surgery , Metronidazole/therapeutic use , Preoperative Care , Surgical Wound Infection/prevention & control , Anti-Bacterial Agents/therapeutic use , Anti-Infective Agents/therapeutic use , Cathartics/therapeutic use , Critical Care , Decontamination , Hospitals/statistics & numerical data , Humans , Laxatives/therapeutic use , Netherlands , Surveys and Questionnaires
15.
Br J Surg ; 98(10): 1365-72, 2011 Oct.
Article in English | MEDLINE | ID: mdl-21751181

ABSTRACT

BACKGROUND: This randomized clinical trial analysed the effect of perioperative selective decontamination of the digestive tract (SDD) in elective gastrointestinal surgery on postoperative infectious complications and anastomotic leakage. METHODS: All patients undergoing elective gastrointestinal surgery during a 5-year period were evaluated for inclusion. Randomized patients received either SDD (polymyxin B sulphate, tobramycin and amphotericin) or placebo in addition to standard antibiotic prophylaxis. The primary endpoint was postoperative infectious complications and anastomotic leakage during the hospital stay or within 30 days after surgery. RESULTS: A total of 289 patients were randomized to either SDD (143) or placebo (146). Most patients (190, 65·7 per cent) underwent colonic surgery. There were 28 patients (19·6 per cent) with infectious complications in the SDD group compared with 45 (30·8 per cent) in the placebo group (P = 0·028). The incidence of anastomotic leakage in the SDD group was 6·3 per cent versus 15·1 per cent in the placebo group (P = 0·016). Hospital stay and mortality did not differ between groups. CONCLUSION: Perioperative SDD in elective gastrointestinal surgery combined with standard intravenous antibiotics reduced the rate of postoperative infectious complications and anastomotic leakage compared with standard intravenous antibiotics alone. Perioperative SDD should be considered for patients undergoing gastrointestinal surgery. REGISTRATION NUMBER: P02.1187L (Dutch Central Committee on Research Involving Human Subjects).
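The trial's primary comparison (28/143 infectious complications with SDD vs 45/146 with placebo, P = 0.028) can be reproduced approximately with a two-proportion z-test. A normal-approximation sketch (the trial's own analysis method is not specified in the abstract):

```python
import math

# Infectious-complication counts from the abstract.
x1, n1 = 28, 143   # SDD group
x2, n2 = 45, 146   # placebo group

p1, p2 = x1 / n1, x2 / n2
p_pool = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se
p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
print(f"z = {z:.2f}, p = {p_value:.3f}")    # z = 2.20, p = 0.028
```

The approximation lands on the reported P = 0.028, supporting the abstract's figures.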


Subject(s)
Anti-Bacterial Agents/administration & dosage , Intraoperative Care/methods , Surgical Wound Dehiscence/prevention & control , Surgical Wound Infection/prevention & control , Aged , Aged, 80 and over , Amphotericin B/administration & dosage , Anastomotic Leak/prevention & control , Antibiotic Prophylaxis , Double-Blind Method , Drug Therapy, Combination , Elective Surgical Procedures , Female , Humans , Length of Stay , Male , Middle Aged , Polymyxin B/administration & dosage , Tobramycin/administration & dosage , Treatment Outcome
16.
Int J Artif Organs ; 30(4): 281-92, 2007 Apr.
Article in English | MEDLINE | ID: mdl-17520564

ABSTRACT

Using a large, international cohort, we sought to determine the effect of the initial technique of renal replacement therapy (RRT) on the outcome of acute renal failure (ARF) in the intensive care unit (ICU). We enrolled 1218 patients treated with continuous RRT (CRRT) or intermittent RRT (IRRT) for ARF in 54 ICUs in 23 countries. We obtained demographic, biochemical and clinical data and followed patients to either death or hospital discharge. Information was analyzed to assess the independent impact of treatment choice on survival and renal recovery. Patients treated first with CRRT (N=1006, 82.6%) required vasopressor drugs and mechanical ventilation more frequently than those receiving IRRT (N=212, 17.4%) (p<0.0001). Unadjusted hospital survival was lower after CRRT (35.8% vs. 51.9%, p<0.0001). However, unadjusted dialysis independence at hospital discharge was higher after CRRT (85.5% vs. 66.2%, p<0.0001). Multivariable logistic regression showed that the choice of CRRT was not an independent predictor of hospital survival or dialysis-free hospital survival. However, the choice of CRRT was a predictor of dialysis independence at hospital discharge among survivors (OR: 3.333, 95% CI: 1.845 to 6.024, p<0.0001). Further adjustment using a propensity score did not significantly change these results. We conclude that, worldwide, the choice of CRRT as initial therapy is not a predictor of hospital survival or dialysis-free hospital survival but is an independent predictor of renal recovery among survivors.


Subject(s)
Acute Kidney Injury/therapy , Critical Illness , Renal Dialysis/methods , Acute Kidney Injury/physiopathology , Aged , Cause of Death , Cohort Studies , Critical Care , Female , Follow-Up Studies , Forecasting , Humans , Kidney/physiopathology , Male , Middle Aged , Patient Discharge , Prospective Studies , Recovery of Function/physiology , Respiration, Artificial , Survival Rate , Treatment Outcome , Vasoconstrictor Agents/therapeutic use
17.
Neth J Med ; 65(3): 101-8, 2007 Mar.
Article in English | MEDLINE | ID: mdl-17387236

ABSTRACT

BACKGROUND: In critically ill patients, heparin-induced thrombocytopenia (HIT) is estimated to account for approximately 1 to 10% of all causes of thrombocytopenia. HIT induces a strong procoagulant state. In case of suspected HIT, it is an important clinical decision to stop heparin and start treatment with alternative nonheparin anticoagulation while awaiting the results of laboratory testing for the final diagnosis of HIT (bridging therapy). Fondaparinux acts by factor Xa inhibition and shows no cross-reactivity with HIT antibodies. Excretion of fondaparinux is mainly renal. We describe our early experience with fixed low-dose fondaparinux bridging therapy and monitoring of anticoagulant activity for safety reasons. METHODS: This retrospective cohort study was conducted in a closed-format general intensive care unit in a teaching hospital. Consecutive critically ill patients suspected of HIT were treated with fondaparinux after discontinuation of unfractionated heparin or nadroparin. Anti-Xa levels were determined afterwards. RESULTS: Seven patients were treated with fondaparinux 2.5 mg/day for 1.8 to 6.5 days. Anti-Xa levels varied from 0.1 to 0.6 U/ml. A negative correlation was found between creatinine clearance and mean and maximum anti-Xa levels. No thromboembolic complications occurred. Bleeding complications were only minor during fondaparinux treatment. Transfusion requirements did not differ significantly between treatment episodes with fondaparinux and with heparin anticoagulants. CONCLUSION: In this small sample of critically ill patients suspected of HIT, bridging therapy with fixed low-dose fondaparinux resulted in prophylactic and therapeutic anti-Xa levels. Monitoring of anticoagulant activity is advised in patients with renal insufficiency.


Subject(s)
Anticoagulants/administration & dosage , Critical Care/methods , Heparin/adverse effects , Polysaccharides/administration & dosage , Thrombocytopenia/chemically induced , Aged , Aged, 80 and over , Anticoagulants/adverse effects , Anticoagulants/pharmacology , Chemoprevention , Critical Illness , Dose-Response Relationship, Drug , Drug Monitoring , Female , Fondaparinux , Hospitals, Teaching , Humans , Intensive Care Units , Male , Middle Aged , Polysaccharides/adverse effects , Polysaccharides/pharmacology , Retrospective Studies , Thrombocytopenia/blood
19.
Intensive Care Med ; 32(2): 188-202, 2006 Feb.
Article in English | MEDLINE | ID: mdl-16453140

ABSTRACT

OBJECTIVES: Critical illness increases the tendency to both coagulation and bleeding, complicating anticoagulation for continuous renal replacement therapy (CRRT). We analyzed strategies for anticoagulation in CRRT with respect to implementation, efficacy and safety, to provide evidence-based recommendations for clinical practice. METHODS: We carried out a systematic review of the literature published before June 2005. Studies were rated at five levels to create recommendation grades from A to E, A being the highest. Grades are labeled with a minus if the study design was limited by size or comparability of groups. Data extracted were those on implementation, efficacy (circuit survival), safety (bleeding) and monitoring of anticoagulation. RESULTS: Due to the quality of the studies, recommendation grades are low. If bleeding risk is not increased, unfractionated heparin (activated partial thromboplastin time, APTT, 1-1.4 times normal) or low molecular weight heparin (anti-Xa 0.25-0.35 IU/l) is recommended (grade E). If facilities are adequate, regional anticoagulation with citrate may be preferred (grade C). If bleeding risk is increased, anticoagulation with citrate is recommended (grade D(-)). CRRT without anticoagulation can be considered when coagulopathy is present (grade D(-)). If clotting tendency is increased, predilution or the addition of prostaglandins to heparin may be helpful (grade C(-)). CONCLUSION: Anticoagulation for CRRT must be tailored to patient characteristics and local facilities. The implementation of regional anticoagulation with citrate is worthwhile to reduce bleeding risk. Future trials should be randomized and should have sufficient power and well-defined endpoints to compensate for the complexity of critical illness-related pro- and anticoagulant forces. An international consensus to define clinical endpoints is advocated.


Subject(s)
Anticoagulants/administration & dosage, Blood Coagulation Disorders/prevention & control, Renal Replacement Therapy, Anticoagulants/adverse effects, Blood Coagulation Disorders/etiology, Blood Coagulation Tests, Evidence-Based Medicine, Humans
20.
Minerva Cardioangiol ; 53(5): 445-63, 2005 Oct.
Article in English | MEDLINE | ID: mdl-16179886

ABSTRACT

Contrast nephropathy (CN) is a common cause of iatrogenic acute renal failure. Its incidence rises with the growing use of intra-arterial contrast in older patients for diagnostic and interventional procedures. The aim of the present review is to discuss the mechanisms and risk factors of CN, to summarize the controlled studies evaluating measures for prevention, and to conclude with evidence-based strategies for prevention. Pathophysiological mechanisms of CN are intrarenal vasoconstriction leading to medullary ischemia, direct cytotoxicity, oxidative tissue damage and apoptosis. Nephrotoxicity is related to osmolality, dose and route of the contrast agent and only occurs in synergy with patient factors, such as previous renal impairment, cardiovascular disease, oxidant stress and the use of certain drugs. CN has an impact on morbidity and mortality. In patients at risk, the following measures are recommended: discontinuation of potentially nephrotoxic drugs, treatment of intravascular volume depletion, hydration with sodium bicarbonate (which seems superior to sodium chloride), limitation of contrast volume and the use of low-osmolar contrast. Furthermore, if starting the day before is feasible, oral N-acetylcysteine may be administered; for urgent interventions, theophylline 200 mg i.v. (once before the intervention) or high-dose ascorbic acid may be used. In patients with combined severe cardiac and renal insufficiency, periprocedural hemofiltration may be considered; this is the only intervention with proven clinical improvement. Large randomised controlled trials are necessary to show whether pharmacological interventions can improve clinical outcomes.


Subject(s)
Acute Kidney Injury/chemically induced, Acute Kidney Injury/prevention & control, Contrast Media/adverse effects, Acetylcysteine/therapeutic use, Ascorbic Acid/therapeutic use, Humans, Phosphodiesterase Inhibitors/therapeutic use, Prostaglandins/therapeutic use, Theophylline/therapeutic use