Results 1 - 15 of 15
1.
Kidney Int ; 88(4): 897-904, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26154928

ABSTRACT

In patients with severe acute kidney injury (AKI) but no urgent indication for renal replacement therapy (RRT), the optimal time to initiate RRT remains controversial. While starting RRT preemptively may have benefits, it may also expose patients to unnecessary RRT. To study this, we conducted a 12-center open-label pilot trial of critically ill adults with volume-replete severe AKI. Patients were randomized to accelerated (12 h or less from eligibility) or standard RRT initiation. Outcomes were adherence to protocol-defined time windows for RRT initiation (primary), proportion of eligible patients enrolled, follow-up to 90 days, and safety. The 101 fully eligible patients (57 with sepsis) had a mean age of 63 years; median serum creatinine and urine output at enrollment were 268 µmol/L and 356 mL per 24 h, respectively. In the accelerated arm, all patients commenced RRT, and 45/48 did so within 12 h of eligibility (median 7.4 h). In the standard arm, 33 patients started RRT at a median of 31.6 h from eligibility, while 19 never received RRT (6 died and 13 recovered kidney function). Clinical outcomes were available for all patients at 90 days following enrollment; mortality was 38% in the accelerated arm and 37% in the standard arm. Two surviving patients, both randomized to standard RRT initiation, were still RRT dependent at day 90. No safety signal was evident in either arm. Our findings can inform the design of a large-scale effectiveness randomized controlled trial.


Subject(s)
Acute Kidney Injury/therapy , Kidney/physiopathology , Renal Replacement Therapy , Time-to-Treatment , Watchful Waiting , Acute Kidney Injury/diagnosis , Acute Kidney Injury/mortality , Acute Kidney Injury/physiopathology , Aged , Canada , Critical Illness , Feasibility Studies , Female , Hospital Mortality , Humans , Length of Stay , Male , Middle Aged , Pilot Projects , Renal Replacement Therapy/adverse effects , Renal Replacement Therapy/mortality , Risk Factors , Severity of Illness Index , Time Factors , Treatment Outcome
2.
Can J Anaesth ; 59(9): 861-70, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22752716

ABSTRACT

PURPOSE: The optimal timing for starting renal replacement therapy (RRT) in patients with acute kidney injury (AKI) is unknown. Defining current practice is necessary to design interventional trials. We describe the current Canadian practice regarding the timing of RRT initiation for AKI. METHODS: An observational study of patients undergoing RRT for AKI was undertaken at 11 intensive care units (ICUs) across Canada. Data were captured on demographics, clinical and laboratory findings, indications for RRT, and timing of RRT initiation. RESULTS: Among 119 consecutive patients, the most common ICU admission diagnosis was sepsis/septic shock, occurring in 54%. At the time of RRT initiation, the median and interquartile range (IQR) serum creatinine level was 322 (221-432) µmol·L(-1). The mean (SD) values for other parameters were as follows: Sequential Organ Failure Assessment (SOFA) score 13.4 (4.1), pH 7.25 (0.15), potassium 4.6 (1.0) mmol·L(-1). Also, 64% fulfilled the serum creatinine-based criterion for Acute Kidney Injury Network (AKIN) stage 3. Severity of illness, measured using Acute Physiology and Chronic Health Evaluation (APACHE II) and SOFA scores, did not correlate with AKI severity as defined by the serum creatinine-based AKIN criteria. Median (IQR) times from hospital and ICU admission to the start of RRT were 2.0 (1.0-7.0) days and 1.0 (0-2.0) days, respectively. CONCLUSION: Patients admitted to an ICU who were started on RRT generally had advanced AKI, high illness severity, and multiorgan dysfunction. Also, they were started on RRT shortly after hospital presentation. We describe the current state of practice in Canada regarding the initiation of RRT for AKI in critically ill patients, which can inform the design of future interventional trials.


Subject(s)
Acute Kidney Injury/therapy , Intensive Care Units/statistics & numerical data , Renal Replacement Therapy/methods , Acute Kidney Injury/physiopathology , Adult , Aged , Canada , Creatinine/blood , Critical Illness , Female , Humans , Male , Middle Aged , Multiple Organ Failure/etiology , Prospective Studies , Retrospective Studies , Severity of Illness Index , Time Factors
3.
Semin Dial ; 25(4): 419-22, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22694662

ABSTRACT

Tuberculosis is an important issue for nephrologists caring for dialysis patients. Because dialysis patients are immunocompromised, they are at higher risk for reactivation of latent tuberculosis, and they frequently present atypically. Furthermore, hemodialysis units may foster rapid spread of active pulmonary tuberculosis. The diagnosis of active pulmonary tuberculosis still depends on detection of organisms by smear and culture; newer nucleic acid detection techniques are more sensitive and specific. Nephrologists should remember that nonspecific presentations of tuberculosis, including fever, weight loss, and adenopathy, are more common in dialysis patients than in the general population, and diagnosis may require biopsy of extrapulmonary tissue. Detection of latent tuberculosis in dialysis patients should only be undertaken if treatment is planned; generally, this applies only to potential transplant candidates and younger dialysis patients with longer life expectancy. The tuberculin skin test is very insensitive in dialysis patients, and false positives occur in patients born in countries where Bacillus Calmette-Guérin vaccine has been used. Blood tests based on stimulated gamma-interferon release have been shown to be more sensitive tests of latent tuberculosis and may be used in conjunction with tuberculin skin tests.


Subject(s)
Tuberculosis/diagnosis , DNA, Bacterial/genetics , Humans , Immunocompromised Host , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/therapy , Mycobacterium tuberculosis/genetics , Radiography, Thoracic , Renal Dialysis , Sputum/microbiology
4.
Nephrol Dial Transplant ; 22(2): 477-83, 2007 Feb.
Article in English | MEDLINE | ID: mdl-17018541

ABSTRACT

BACKGROUND: Central venous catheters (CVCs) continue to be used at a high rate for dialysis access and are frequently complicated by thrombus-related malfunction. Prophylactic locking with an anticoagulant, such as heparin, has become standard practice despite its associated risks. Trisodium citrate (citrate) 4% is an alternative catheter-locking anticoagulant. METHODS: The objective was to prospectively study the clinical effectiveness, safety and cost of citrate 4% vs heparin locking by comparing rates of CVC exchanges, thrombolytic (tissue plasminogen activator, TPA) use and access-associated hospitalizations during two study periods: heparin period (HP) (1 June 2003-15 February 2004) and citrate period (CP) (15 March-15 November 2004). Incident catheters evaluated did not overlap the two periods. RESULTS: There were 176 CVCs in 121 patients (HP) and 177 CVCs in 129 patients (CP). The event rates in incident CVCs were: CVC exchange 2.98/1000 days (HP) vs 1.65/1000 days (CP) (P = 0.01); TPA use 5.49/1000 days (HP) vs 3.3/1000 days (CP) (P = 0.002); hospitalizations 0.59/1000 days (HP) vs 0.28/1000 days (CP) (P = 0.49). The time from catheter insertion to requiring CVC exchange (P = 0.04) and TPA (P = 0.006) was longer in the citrate group than in the heparin lock group. Citrate locking costs less than heparin locking, but a formal economic analysis including indirect costs was not done. CONCLUSION: Citrate 4% has equivalent or better outcomes with regard to catheter exchange, TPA use and access-related hospitalizations compared with heparin locking. It is a safe and less expensive alternative. Randomized trials comparing these anticoagulants with a control group would definitively determine the optimal haemodialysis catheter locking solution.
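The event rates reported above are exposure-adjusted rates: events divided by total catheter-days at risk, scaled to 1000 days. A minimal sketch of the calculation, using hypothetical event counts and catheter-day totals (the abstract reports only the resulting rates):

```python
def rate_per_1000_days(events: int, catheter_days: int) -> float:
    """Exposure-adjusted event rate per 1000 catheter-days."""
    if catheter_days <= 0:
        raise ValueError("catheter_days must be positive")
    return events / catheter_days * 1000

# Hypothetical counts chosen so the resulting rates resemble the
# reported CVC-exchange rates (2.98 vs 1.65 per 1000 days).
hp_rate = rate_per_1000_days(events=149, catheter_days=50_000)
cp_rate = rate_per_1000_days(events=83, catheter_days=50_300)
print(f"HP: {hp_rate:.2f}/1000 days, CP: {cp_rate:.2f}/1000 days")
```

Rates normalized this way are comparable across periods even when the number of catheters and the follow-up duration differ, which is why they are the standard denominator for catheter complications.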


Subject(s)
Anticoagulants/pharmacology , Catheters, Indwelling , Citrates/pharmacology , Heparin/pharmacology , Renal Replacement Therapy/instrumentation , Venous Thrombosis/prevention & control , Adult , Aged , Aged, 80 and over , Anticoagulants/economics , Citrates/economics , Cost-Benefit Analysis , Female , Follow-Up Studies , Heparin/economics , Humans , Incidence , Male , Middle Aged , Ontario/epidemiology , Prospective Studies , Venous Thrombosis/epidemiology
5.
Crit Care Med ; 34(10): 2666-73, 2006 Oct.
Article in English | MEDLINE | ID: mdl-16915116

ABSTRACT

OBJECTIVE: To describe the physiologic consequences of dialysis against distilled water and to provide recommendations by which other institutions may avoid similar errors in dialysate preparation. DATA SOURCE: Four cases of dialysis against distilled water are described, occurring at three teaching hospitals within a 2-yr period. In addition, an in vitro experiment of banked whole blood exposure to distilled water dialysate was performed. DATA EXTRACTION: Because all four cases occurred within a critical care setting, intensive monitoring of clinical, biochemical, and hematologic abnormalities was possible. DATA SYNTHESIS: Serum sodium decreased by an average of 22 mmol/L, followed by a decrease in hemoglobin averaging 32 g/L. Additional investigations and the in vitro experiment provided evidence that hemolysis occurred primarily via clearance of damaged erythrocytes within the patient's reticuloendothelial system. Physiologic derangements secondary to dialysis against distilled water likely contributed to a stroke suffered by one patient and the death of at least one other patient. CONCLUSIONS: Accidental dialysis against distilled water is a potentially serious but preventable complication of bedside dialysate preparation.


Subject(s)
Dialysis Solutions/adverse effects , Hemolysis , Medical Errors , Renal Dialysis/adverse effects , Adult , Aged , Bicarbonates/therapeutic use , Drug Compounding , Female , Humans , Hyponatremia/etiology , Magnesium/therapeutic use , Male , Sodium Chloride/therapeutic use
6.
ASAIO J ; 52(4): 479-81, 2006.
Article in English | MEDLINE | ID: mdl-16883131

ABSTRACT

Limited data are available on the use of dialysis to treat cyanide or thiocyanate intoxication. This report describes the case of a 65-year-old woman with renal insufficiency who developed thiocyanate toxicity secondary to a nitroprusside infusion. A rapid decline in her blood thiocyanate level was observed after initiation of continuous venovenous hemodiafiltration.


Subject(s)
Nitroprusside/adverse effects , Renal Dialysis , Thiocyanates/blood , Thiocyanates/toxicity , Aged , Female , Hemodiafiltration/methods , Humans , Infusions, Intravenous , Nitroprusside/administration & dosage , Nitroprusside/metabolism , Nitroprusside/therapeutic use , Renal Insufficiency/complications , Renal Insufficiency/etiology , Treatment Outcome
7.
Nephrol Dial Transplant ; 21(5): 1407-12, 2006 May.
Article in English | MEDLINE | ID: mdl-16504981

ABSTRACT

BACKGROUND: End-stage renal disease (ESRD) is associated with a markedly increased cardiac calcification burden, as reflected by computed tomography scans of the heart. Nocturnal haemodialysis (NHD) is a novel form of renal replacement therapy with multiple physiologic effects that may influence vascular calcification, including improvements in phosphate and uraemia control. The objectives of the present study were to determine the natural history of coronary calcification progression in patients converted to NHD and to examine the relationships between calcification risk factors and calcification progression in these patients. METHODS: Thirty-eight ESRD patients converted to NHD were included in our observational cohort study. Coronary artery calcification scores (CACS) were documented at baseline and post-conversion (mean interscan duration 16+/-1 months). Other variables of interest included age, dialysis vintage, Framingham risk profile, phosphate binder and vitamin D usage, and plasma levels of calcium, phosphate and parathyroid hormone. RESULTS: Our cohort was stratified according to baseline calcification burden (minimal calcification: CACS < or = 10 vs significant calcification: CACS > 10). Twenty-four patients had baseline CACS < or = 10 and demonstrated no significant change in coronary calcification after 1 year of NHD (from 0.7+/-0.5 to 6+/-3, P = 0.1). Fourteen patients had significant calcification at baseline (CACS 1874+/-696) and demonstrated a non-significant 9% increase over 1 year to 2038+/-740 (P = 0.1). Plasma phosphate and calcium x phosphate product were significantly reduced, as were calcium-based phosphate binder and antihypertensive usage. CONCLUSIONS: Our study is the first to document CACS progression in a cohort of NHD patients. Further analysis of the effect of NHD on the physiology of cardiovascular calcification is required.
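The "9% increase" in the results is the relative change in the group-mean CACS over the interscan interval. A one-line check against the reported means (a sketch, not part of the study's analysis):

```python
def percent_change(baseline: float, followup: float) -> float:
    """Relative change from baseline, in percent."""
    if baseline == 0:
        raise ValueError("baseline must be nonzero")
    return (followup - baseline) / baseline * 100

# Reported group means in the significant-calcification stratum:
# CACS 1874 at baseline, 2038 after ~1 year of NHD.
delta = percent_change(1874, 2038)
print(f"CACS change: {delta:.1f}%")  # close to the ~9% quoted in the abstract
```

Note that relative change in a mean score is a coarse summary; per-patient progression rates can differ substantially from the group figure.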


Subject(s)
Calcinosis/etiology , Coronary Disease/diagnostic imaging , Coronary Disease/etiology , Kidney Failure, Chronic/therapy , Renal Dialysis/adverse effects , Renal Dialysis/methods , Adult , Calcinosis/diagnostic imaging , Cohort Studies , Coronary Angiography , Female , Follow-Up Studies , Humans , Kidney Failure, Chronic/diagnosis , Male , Middle Aged , Probability , Prospective Studies , Risk Assessment , Severity of Illness Index , Statistics, Nonparametric , Time Factors , Tomography, X-Ray Computed
8.
ASAIO J ; 51(3): 236-41, 2005.
Article in English | MEDLINE | ID: mdl-15968953

ABSTRACT

Inflammation is implicated in the pathogenesis of erythropoietin (EPO) resistance in patients with end-stage renal disease. Interleukin (IL)-6 and tumor necrosis factor (TNF)-alpha are suggested to suppress erythropoiesis in uremia, whereas insulin-like growth factor (IGF)-1 has been proposed to stimulate erythropoiesis. Nocturnal hemodialysis (NHD) has been demonstrated to improve anemia management with enhanced EPO responsiveness without altering survival of red blood cells. We tested the hypothesis that augmentation of uremia clearance by NHD results in a reduction of proinflammatory cytokine levels, thereby enhancing EPO responsiveness. Using a cross-sectional study design, we studied 14 prevalent patients on NHD and 14 patients on conventional hemodialysis (CHD) matched for age and comorbidities and controlled for hemoglobin concentrations and iron status. Outcome variables included EPO requirement and plasma levels of EPO, parathyroid hormone, C-reactive protein, IL-6, TNF-alpha, and IGF-1. The primary outcome was to determine the between-group differences in (1) cytokine profile and (2) EPO requirement. The secondary outcome was to examine the potential correlation between cytokine levels and EPO requirement. There were no significant differences in patient characteristics, comorbidities, hemoglobin, iron indices, and parathyroid hormone levels between the two cohorts. EPO requirement was significantly lower in the NHD cohort [90.5 +/- 22.1 U/kg/week (NHD) vs. 167.2 +/- 25.4 U/kg/week (CHD), p = 0.04]. Plasma IL-6 levels were lower in the NHD cohort [3.9 +/- 0.7 pg/ml (NHD) vs. 6.5 +/- 0.8 pg/ml (CHD), p = 0.04]. C-reactive protein tended to be lower [4.59 +/- 1.34 (NHD) vs. 8.43 +/- 1.83 mg/L (CHD), p = 0.14]. TNF-alpha and IGF-1 levels did not differ between the two groups. Direct associations were found between EPO requirement and C-reactive protein levels (R = 0.62, p = 0.001) and IL-6 levels (R = 0.57, p = 0.002). Augmentation of uremic clearance by NHD improves EPO responsiveness in end-stage renal disease. A possible mechanism for this improvement is through better control of inflammation, as manifested by lowering of plasma IL-6 levels. Further studies are required to clarify the mechanisms by which NHD decreases inflammation.


Subject(s)
Cytokines/biosynthesis , Erythropoietin/therapeutic use , Renal Dialysis , Adult , Anemia/drug therapy , Anemia/immunology , Drug Resistance , Erythropoietin/blood , Female , Hemoglobins/analysis , Humans , Inflammation/prevention & control , Kidney Failure, Chronic/complications , Male , Middle Aged , Time Factors
9.
Nephrol Dial Transplant ; 18(6): 1174-80, 2003 Jun.
Article in English | MEDLINE | ID: mdl-12748352

ABSTRACT

BACKGROUND: Thrombosis is the primary cause of access failure in polytetrafluoroethylene grafts and arteriovenous fistulas. It can lead to significant patient and access morbidity and mortality, and is difficult to prevent medically. Intervention is largely limited to maximizing access patency by detecting culprit lesions early and intervening with angioplasty or surgical revision. The most efficacious monitoring strategy is undetermined. METHODS: This 3 year prospective study took advantage of a change in monitoring strategy used in a large dialysis centre to compare the efficacy of two methods used to monitor grafts and fistulas in order to prevent access thrombosis. Accesses were monitored using Duplex ultrasonography in year 1, while the saline ultrasound dilution technique (Transonic) became the primary monitoring strategy in year 3 (year 2 was a transition year). Risk factors for thrombosis were determined using multivariate survival analysis, and the performance of Duplex ultrasonography and Transonic monitoring was assessed. RESULTS: A total of 303 656 access days at risk were assessed, with 344, 385 and 425 accesses in years 1, 2 and 3, respectively. The total thrombosis rate was 1.01/1000 access days in year 1 compared with 0.66/1000 access days in year 3. This was accomplished despite a reduction in procedure rates of 55% for angiograms, 13% for angioplasties and 31% for thrombolysis. CONCLUSION: Low flow rates detected using Transonic monitoring were associated with increased thrombosis, while stenosis detected using Duplex ultrasonography was not a strong predictor of incipient thrombosis; however, these different access characteristics were compared using monitoring techniques that may be ideal in different clinical situations.


Subject(s)
Arteriovenous Shunt, Surgical/adverse effects , Graft Occlusion, Vascular/diagnostic imaging , Thrombosis/prevention & control , Adult , Aged , Aged, 80 and over , Female , Humans , Kidney Failure, Chronic/therapy , Male , Middle Aged , Monitoring, Physiologic , Proportional Hazards Models , Prospective Studies , Renal Dialysis , Risk Factors , Sensitivity and Specificity , Survival Analysis , Thrombosis/diagnostic imaging , Thrombosis/etiology , Ultrasonography/methods
10.
ASAIO J ; 49(1): 70-3, 2003.
Article in English | MEDLINE | ID: mdl-12558310

ABSTRACT

In the past 15 years, there has been a trend to decrease dialysate calcium concentrations to prevent hypercalcemia. However, low dialysate calcium can provoke hyperparathyroidism. The time course of the effect of increasing dialysate calcium is not well characterized, and the effect on calcium-phosphate product is unclear. Therefore, we studied the effect of increasing dialysate calcium from 1.5 to 1.75 mM in 21 stable patients on hemodialysis who had serum phosphate of less than 2 mM and serum calcium of less than 2.4 mM. Over 10 months, parathyroid hormone levels fell from 39.6 to 16.6 pM (p < 0.0001), whereas serum calcium increased from 2.27 to 2.41 mM. There were no significant changes in serum phosphate or the calcium-phosphate product. Three patients became hypercalcemic when their parathyroid hormone levels were suppressed to less than 10 pM. We conclude that in carefully selected patients, increasing dialysate calcium can safely treat hyperparathyroidism with minimal risk of complications. This treatment has the advantage over the use of vitamin D therapy of being less expensive, independent of patient compliance, and less likely to cause increases in serum phosphate or calcium-phosphate product.


Subject(s)
Calcium/administration & dosage , Hemodialysis Solutions/administration & dosage , Hyperparathyroidism/drug therapy , Kidney Failure, Chronic/therapy , Renal Dialysis/methods , Adult , Aged , Aged, 80 and over , Calcium/blood , Humans , Hyperparathyroidism/etiology , Kidney Failure, Chronic/complications , Middle Aged , Parathyroid Hormone/metabolism , Phosphates/blood , Prospective Studies
11.
Am J Kidney Dis ; 41(1): 225-9, 2003 Jan.
Article in English | MEDLINE | ID: mdl-12500241

ABSTRACT

A 42-year-old man with end-stage renal disease (ESRD) was referred for conversion to nocturnal hemodialysis (NHD) therapy from conventional hemodialysis (CHD) therapy because of refractory intermittent claudication secondary to peripheral arterial disease (PAD). The patient was initiated on CHD therapy in 1976 and subsequently had undergone two unsuccessful renal transplantations. While on CHD therapy, his clinical course was complicated by worsening vascular and soft-tissue calcification. Extensive dystrophic soft-tissue calcification was noted bilaterally in his hands, lower extremities, and sacral region, requiring surgical excision. Lower-extremity arterial Doppler scans documented vascular calcification and a pronounced decrease in peripheral arterial flow bilaterally. After conversion to NHD therapy (7.5 h/session five times weekly), the patient became symptom free and had significant clinical improvements in (1) hemodynamics, measured by clinic blood pressure and two-dimensional echocardiography, (2) biochemical profile, and (3) a sustained improvement in arterial Doppler flow measured by duplex Doppler ultrasound. We conclude that NHD was able to improve lower-extremity PAD in our patient. Further observational and interventional studies are required to investigate the therapeutic potential of NHD for the treatment of PAD in patients with ESRD.


Subject(s)
Lower Extremity/blood supply , Night Care/methods , Peripheral Vascular Diseases/therapy , Renal Dialysis/methods , Adult , Calcinosis/etiology , Calcinosis/surgery , Hemodynamics , Humans , Intermittent Claudication/etiology , Intermittent Claudication/physiopathology , Intermittent Claudication/therapy , Kidney Failure, Chronic/etiology , Kidney Failure, Chronic/physiopathology , Kidney Failure, Chronic/therapy , Lower Extremity/pathology , Lower Extremity/physiopathology , Lower Extremity/surgery , Male , Peripheral Vascular Diseases/etiology , Peripheral Vascular Diseases/physiopathology , Peripheral Vascular Diseases/surgery
12.
Kidney Int ; 61(6): 2235-9, 2002 Jun.
Article in English | MEDLINE | ID: mdl-12028465

ABSTRACT

BACKGROUND: Left ventricular hypertrophy (LVH) is an independent risk factor for mortality in the dialysis population. LVH has been attributed to several factors, including hypertension, excess extracellular fluid (ECF) volume, anemia and uremia. Nocturnal hemodialysis is a novel renal replacement therapy that appears to improve blood pressure control. METHODS: This observational cohort study assessed the impact on LVH of conversion from conventional hemodialysis (CHD) to nocturnal hemodialysis (NHD). In 28 patients (mean age 44 +/- 7 years) receiving NHD for at least two years (mean duration 3.4 +/- 1.2 years), blood pressure (BP), hemoglobin (Hb), ECF volume (single-frequency bioelectrical impedance) and left ventricular mass index (LVMI) were determined before and after conversion. For comparison, 13 control patients (mean age 52 +/- 15 years) who remained on self-care home CHD for one year or more (mean duration 2.8 +/- 1.8 years) were also studied, with serial measurements of BP, Hb and LVMI. RESULTS: There were no significant differences between the two cohorts with respect to age, use of antihypertensive medications, Hb, BP or LVMI at baseline. After transfer from CHD to NHD, there were significant reductions in systolic, diastolic and pulse pressure (from 145 +/- 20 to 122 +/- 13 mm Hg, P < 0.001; from 84 +/- 15 to 74 +/- 12 mm Hg, P = 0.02; and from 61 +/- 12 to 49 +/- 12 mm Hg, P = 0.002, respectively) and in LVMI (from 147 +/- 42 to 114 +/- 40 g/m2, P = 0.004). There was also a significant reduction in the number of prescribed antihypertensive medications (from 1.8 to 0.3, P < 0.001) and an increase in Hb in the NHD cohort. Post-dialysis ECF volume did not change. LVMI correlated with systolic blood pressure (r = 0.6, P = 0.001) during nocturnal hemodialysis. There was no relationship between changes in LVMI and changes in BP or Hb. In contrast, there were no changes in BP, Hb or LVMI in the CHD cohort over the same time period. CONCLUSIONS: Reductions in BP with NHD are accompanied by regression of LVH.


Subject(s)
Circadian Rhythm , Echocardiography , Hypertrophy, Left Ventricular/diagnostic imaging , Hypertrophy, Left Ventricular/therapy , Kidney Failure, Chronic/therapy , Renal Dialysis/methods , Aged , Blood Pressure , Cohort Studies , Home Care Services , Humans , Hypertrophy, Left Ventricular/physiopathology , Middle Aged , Self Care
13.
Article in English | MEDLINE | ID: mdl-11987444

ABSTRACT

OBJECTIVES: To evaluate the cost-effectiveness of reusing hemodialyzers sterilized with either heated citric acid or formaldehyde for patients with kidney failure on dialysis, in comparison with the standard practice of single-use dialyzers. METHODS: A meta-analysis of all relevant studies was performed to determine whether hemodialyzer reuse was associated with an increased relative risk of mortality or hospitalization. A decision tree was constructed to model the effect of three different dialysis strategies (single-use dialysis, heated citric acid reuse, and formaldehyde reuse) on the costs and quality-adjusted life expectancy of "typical" hemodialysis patients. The cost of heated citric acid reuse was estimated from a center experienced with the technique. The cost of end-stage renal disease (ESRD) care, survival data, and patient utilities were estimated from published sources. RESULTS: There was evidence of a higher relative risk of hospitalization (but not mortality) for hemodialyzer reuse compared with single-use dialysis. Depending on the assumptions used, the cost savings that could be expected by switching from single-use dialysis to heated citric acid reuse were small, ranging from CAN$0 to CAN$739 per patient per year. CONCLUSIONS: ESRD programs can incorporate the results of this study based on their individual situations to determine whether hemodialyzer reuse is appropriate in their setting.
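A decision tree of the kind described in the methods reduces, for each strategy, to a probability-weighted sum of branch costs and quality-adjusted life expectancies; strategies are then compared on expected cost and expected QALYs. A minimal sketch with entirely hypothetical branch probabilities, costs, and utilities (the published model's inputs are not reproduced in the abstract):

```python
from dataclasses import dataclass

@dataclass
class Branch:
    probability: float  # chance of this outcome under the strategy
    cost: float         # annual cost of ESRD care on this branch (CAN$)
    qalys: float        # quality-adjusted life-years accrued on this branch

def expected_values(branches: list[Branch]) -> tuple[float, float]:
    """Expected annual cost and expected QALYs for one strategy."""
    assert abs(sum(b.probability for b in branches) - 1.0) < 1e-9
    cost = sum(b.probability * b.cost for b in branches)
    qalys = sum(b.probability * b.qalys for b in branches)
    return cost, qalys

# Hypothetical two-branch trees: reuse is slightly cheaper per branch but
# carries a slightly higher probability of the costly hospitalization branch.
single_use = [Branch(0.30, 72_000, 0.70), Branch(0.70, 60_000, 0.80)]
citrate_reuse = [Branch(0.33, 71_500, 0.70), Branch(0.67, 59_500, 0.80)]

cost_su, qaly_su = expected_values(single_use)
cost_cr, qaly_cr = expected_values(citrate_reuse)
print(f"Incremental saving with reuse: CAN${cost_su - cost_cr:.0f}/patient/year")
```

This structure makes the abstract's sensitivity to assumptions concrete: small shifts in the hospitalization probability or branch costs move the incremental saving across the reported CAN$0-739 range.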


Subject(s)
Equipment Reuse/economics , Kidney Failure, Chronic/therapy , Renal Dialysis/instrumentation , Canada/epidemiology , Citric Acid , Decision Support Techniques , Female , Formaldehyde , Health Care Costs , Health Care Rationing , Hospitalization , Humans , Kidney Failure, Chronic/economics , Kidney Failure, Chronic/mortality , Male , Quality of Life , Quality-Adjusted Life Years , Renal Dialysis/economics , Risk Factors , Sterilization/methods , Technology Assessment, Biomedical
14.
ASAIO J ; 48(1): 45-56, 2002.
Article in English | MEDLINE | ID: mdl-11820220

ABSTRACT

The surface features, morphology, and tensile properties of fibers obtained from pristine, reprocessed, and reused Fresenius Polysulfone High-Flux (Hemoflow F80A) hemodialyzers have been studied. Scanning electron microscopy of the dialyzer fibers revealed a dense skin layer on the inner surface of the membrane and a relatively thick porous layer on the outer surface. Transmission electron microscopy and atomic force microscopy showed an alteration in membrane morphology due to reprocessing and reuse, or to a deposition of blood-borne material on the membrane that is not removed with reprocessing. Fluorescent microscopy images also showed that a fluorescent material not removed by heat/citric acid reprocessing builds up with continued use of the dialyzers. The tensile properties of the dialyzer fibers were not affected by the heat/citric acid reprocessing procedure. The protein layers formed on pristine and reused hemodialyzer membranes during clinical use were also studied using sodium dodecyl sulfate polyacrylamide gel electrophoresis and immunoblotting. A considerable amount of protein was found on the blood side of single and multiple use dialyzers. Proteins adsorbed on the dialysate side of the membrane were predominantly in the molecular weight region below 30 kDa. Little protein was detected on the membranes of reprocessed hemodialyzers.


Subject(s)
Anticoagulants , Biocompatible Materials/analysis , Citric Acid , Polymers/analysis , Renal Dialysis/instrumentation , Sulfones/analysis , Blood Proteins/analysis , Electrophoresis, Polyacrylamide Gel , Equipment Reuse , Hot Temperature , Humans , Immunoblotting , Kidney Failure, Chronic/therapy , Membranes, Artificial , Microscopy, Atomic Force , Microscopy, Electron , Sodium Dodecyl Sulfate , Surface-Active Agents , Tensile Strength
15.
Can J Urol ; 6(6): 901-905, 1999 Dec.
Article in English | MEDLINE | ID: mdl-11180794

ABSTRACT

PURPOSE: To assess the effect of donor nephrectomy on blood pressure, 24-hour urine protein excretion, and renal function. MATERIALS AND METHODS: Of the 198 individuals who donated a kidney between 1991 and 1996, 101 had their blood pressure, 24-hour urine protein excretion, and serum creatinine concentration measured. The mean duration of follow-up was 3.2 +/- 1.6 years (range: 8.5 months to 6.5 years). RESULTS: Serum creatinine concentration was significantly higher (p < 0.001) at follow-up (107 +/- 20 µmol/L) compared with before donation (86 +/- 18 µmol/L). When follow-up serum creatinine concentrations were expressed as percentages of their preoperative values, a gradual decline was observed with time (r = -0.380). Diastolic blood pressures (p < 0.05) and 24-hour urine protein levels (p < 0.001) were significantly higher at follow-up; however, neither increased with time. The prevalence of hypertension and proteinuria in our donors was no different from that of the general population. CONCLUSIONS: Donor nephrectomy does not impair renal function or result in a progressive rise in blood pressure or urine protein excretion up to 6.5 years after nephrectomy.
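The trend analysis above normalizes each follow-up creatinine to its preoperative value; at the group level the same calculation can be sketched from the reported means (a rough illustration only, since the study's regression used per-patient values):

```python
def percent_of_baseline(followup: float, baseline: float) -> float:
    """Follow-up value expressed as a percentage of the baseline value."""
    if baseline == 0:
        raise ValueError("baseline must be nonzero")
    return followup / baseline * 100

# Reported group means: 107 µmol/L at follow-up vs 86 µmol/L pre-donation.
print(f"{percent_of_baseline(107, 86):.0f}% of baseline")
```

That is roughly a one-quarter rise in the mean, consistent with the expected loss of about half the nephron mass being partly offset by compensatory hyperfiltration of the remaining kidney.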
