2.
Hemodial Int ; 22(S2): S29-S64, 2018 Oct.
Article in English | MEDLINE | ID: mdl-30457224

ABSTRACT

Hemodialysis for chronic renal failure was introduced and developed in Seattle, WA, in the 1960s. Using Kiil dialyzers, weekly dialysis time and frequency were established at about 30 hours on a thrice-weekly schedule. This dialysis time and frequency were associated with 10% yearly mortality in the United States in the 1970s. Later in the 1970s, newer and more efficient dialyzers were developed and it was felt that dialysis time could be shortened; lower cost and greater convenience were additional incentives. Further support for shortening dialysis time was provided by a randomized prospective trial performed by the National Cooperative Dialysis Study (NCDS). This study committed a Type II statistical error by rejecting the time of dialysis as an important factor in determining the quality of dialysis. It also provided the basis for the establishment of the Kt/Vurea index as a measure of dialysis adequacy. This index, having been established in a sacrosanct randomized controlled trial (RCT), was readily accepted by the HD community and led to shorter dialysis and higher mortality in the United States. Kt/Vurea is a poor measure of dialysis quality because it combines three unrelated variables into a single formula; these variables influence the clinical status of the patient independently of each other. It is impossible to compensate for short dialysis duration (t) with increased urea clearance (K), because tolerance of ultrafiltration depends on the plasma-refilling rate, which has nothing in common with urea clearance. Later, another RCT (the HEMO study) committed a Type III statistical error by asking the wrong research question, thus not yielding any valuable results; fortunately, it did not lead to deterioration of dialysis outcomes in the United States. The third RCT in this field ("in-center hemodialysis 6 times per week versus 3 times per week") did not bring forth any valuable results, but at least confirmed what was already known. The fourth such trial ("The effects of frequent nocturnal home hemodialysis") also did not show any positive results, primarily due to significant subject recruitment problems leading to inappropriate selection of patients. A comparison of peritoneal dialysis and HD in RCTs could not be completed because of recruitment problems. Randomized controlled trials have therefore failed to yield any meaningful information in the area of dose and/or frequency of hemodialysis.
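
Not part of the original abstract: a minimal Python sketch of how the Kt/Vurea index is computed, illustrating the abstract's point that the index multiplies urea clearance (K) by session time (t) and scales by the urea distribution volume (V, roughly total body water), so a shorter session with a higher-clearance dialyzer can yield the same number. The Daugirdas second-generation formula shown for comparison is a widely used bedside estimate, not a method taken from this article, and all patient values are hypothetical.

```python
import math

def kt_v_simple(clearance_ml_min: float, time_min: float, volume_l: float) -> float:
    """Naive Kt/V: urea clearance (mL/min) x time (min) / urea distribution volume (mL)."""
    return (clearance_ml_min * time_min) / (volume_l * 1000.0)

def kt_v_daugirdas(pre_bun: float, post_bun: float, time_h: float,
                   uf_l: float, post_weight_kg: float) -> float:
    """Second-generation Daugirdas single-pool Kt/V estimate (a common bedside formula)."""
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * time_h) + (4.0 - 3.5 * r) * uf_l / post_weight_kg

# Hypothetical example: the same Kt/V of about 1.3 can be reached with a long session and
# modest clearance, or a short session and high clearance -- yet tolerance of the required
# ultrafiltration depends on session length, which the index ignores.
print(kt_v_simple(clearance_ml_min=190, time_min=240, volume_l=35))  # ~1.30, 4-hour session
print(kt_v_simple(clearance_ml_min=304, time_min=150, volume_l=35))  # ~1.30, 2.5-hour session
print(kt_v_daugirdas(pre_bun=70, post_bun=25, time_h=4.0, uf_l=2.5, post_weight_kg=70))
```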


Subject(s)
Renal Dialysis/methods; Sodium/isolation & purification; Urea/metabolism; Blood Pressure; Hemodialysis, Home; Humans; Kidney Failure, Chronic/therapy; Prospective Studies; Randomized Controlled Trials as Topic; Regional Blood Flow; Renal Dialysis/mortality; Renal Dialysis/standards; Time Factors; Urea/toxicity
4.
J Vasc Access ; 16 Suppl 9: S54-60, 2015.
Article in English | MEDLINE | ID: mdl-25751552

ABSTRACT

There are two methods of fistula cannulation for hemodialysis. The first, the different-site or rope-ladder cannulation method, established by the originators of the arteriovenous fistula as a blood access for hemodialysis in 1966, relies on changing the puncture sites for each dialysis. The second, the constant-site or buttonhole method, developed several years later, uses the same puncture sites for consecutive dialyses. The first method prevails at present, but the second is becoming more and more popular. The major advantages of the buttonhole method are lower cannulation pain and fewer fistula complications, with the exception of fistula infection, which is more common in some studies. This method is more difficult and requires an experienced single cannulator to establish good puncture sites. Home hemodialysis patients using a single cannulator, either the patient or a helper, have better results with this method. Busy dialysis centers with a high rotation of cannulators do not achieve results as good and prefer the rope-ladder method.


Subject(s)
Arteriovenous Shunt, Surgical; Catheterization/methods; Renal Dialysis; Catheterization/adverse effects; Catheterization/instrumentation; Equipment Design; Humans; Needles; Punctures; Time Factors; Treatment Outcome
6.
Nephrol Dial Transplant ; 28(4): 826-32; discussion 832, 2013 Apr.
Article in English | MEDLINE | ID: mdl-23543723

ABSTRACT

All progress in dialysis methods has come from research presented in case reports, case-control studies, and other observational studies. By contrast, randomized controlled trials (RCTs) have not brought any valuable results. A comparison of peritoneal dialysis and hemodialysis (HD) in RCTs was not completed because of recruitment problems. Four RCTs in HD did not provide any useful data. The worst example was the National Cooperative Dialysis Study, which committed a Type II statistical error by rejecting the time of dialysis as an important factor determining the quality of dialysis. This study also provided the basis for the establishment of the Kt/V index as a measure of dialysis adequacy. This index, having been established in a sacrosanct RCT, was accepted by the HD community and led to short dialysis and possibly higher mortality in the USA. The second trial (the HEMO study) committed a Type III statistical error by asking the wrong question and did not bring any valuable results, but at least it did not lead to deterioration of dialysis outcomes in the USA. The third, by the Frequent Hemodialysis Network Trial Group, did not bring forth any valuable results, but at least confirmed what was already known. The fourth, the Frequent Hemodialysis Network Nocturnal Trial, committed a Type II statistical error because of tremendous recruitment problems leading to an inadequate number of subjects; moreover, the study methodology was absolutely unreliable.
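
Not from the article: a small Python illustration of the Type II error argument above. With too few enrolled subjects, the power to detect even a moderate between-group difference collapses, so a "negative" trial is uninformative. The effect size and sample sizes below are hypothetical, and the normal-approximation power formula for a two-sample comparison is a standard textbook approximation, not the analysis method of any of the cited trials.

```python
from math import sqrt
from statistics import NormalDist

def two_sample_power(n_per_group: int, effect_size: float, alpha: float = 0.05) -> float:
    """Approximate power of a two-sided, two-sample comparison (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    noncentrality = effect_size * sqrt(n_per_group / 2)
    return 1 - NormalDist().cdf(z_alpha - noncentrality)

# Hypothetical: a standardized effect of 0.3 needs roughly 175 subjects per arm for 80% power;
# a trial that recruits only 40 per arm has about 27% power, i.e. a high Type II error risk.
for n in (40, 90, 175):
    print(n, round(two_sample_power(n, effect_size=0.3), 2))
```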


Subject(s)
Kidney Failure, Chronic/therapy; Randomized Controlled Trials as Topic; Renal Dialysis; Humans
7.
Kidney Int ; 82(1): 114-5; author reply 115, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22699382
11.
Nephrol Nurs J ; 37(6): 641-6; quiz 647, 2010.
Article in English | MEDLINE | ID: mdl-21290918

ABSTRACT

This study compares patient and technique survival on continuous ambulatory peritoneal dialysis (CAPD) and other peritoneal dialysis (PD) modalities in relation to body size indicators, race, sex, and peritoneal transport characteristics. Data were abstracted from a PD adequacy database, and 354 patients were included in the analysis. Transfers between PD modalities were almost exclusively from CAPD to various offshoots of PD, mostly because of inadequate dialysis or inadequate ultrafiltration. Survival analysis showed better technique survival for other PD modalities compared to CAPD when body mass index was less than 25 kg/m2, body surface area (BSA) was less than 1.9 m2, total body water was less than 39 L, and the dialysate-to-plasma ratio of creatinine at four hours was less than 0.65 on the peritoneal equilibration test (PET). No differences were found in relation to gender, race, or the PET ratio of dialysate glucose at four hours to dialysate glucose at time zero. In other PD modalities, no differences in technique and patient survival were found with regard to the same parameters, with the exception of better technique survival in males with a BSA over 1.9 m2. In conclusion, CAPD technique survival is better in the small patient with below-average peritoneal transport characteristics. In other PD modalities, survival is not related to anthropometric indices or peritoneal transport characteristics.
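
Not part of the abstract: a minimal sketch of the kind of technique-survival comparison described above, using the Python lifelines package as an assumption (the article does not state which software was used). Durations are months on the modality, events are technique failure, and the grouping by a 1.9 m2 BSA threshold mirrors the cutoff reported in the study; the records below are invented for illustration.

```python
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical records: (months on modality, 1 = technique failure, BSA in m2)
records = [(6, 1, 2.1), (14, 1, 2.0), (30, 0, 1.7), (48, 1, 1.6),
           (9, 1, 2.2), (60, 0, 1.8), (22, 1, 1.95), (40, 0, 1.75)]

small = [(t, e) for t, e, bsa in records if bsa < 1.9]
large = [(t, e) for t, e, bsa in records if bsa >= 1.9]

km_small, km_large = KaplanMeierFitter(), KaplanMeierFitter()
km_small.fit([t for t, _ in small], [e for _, e in small], label="BSA < 1.9 m2")
km_large.fit([t for t, _ in large], [e for _, e in large], label="BSA >= 1.9 m2")

# Log-rank test for a difference in technique survival between the two size groups
result = logrank_test([t for t, _ in small], [t for t, _ in large],
                      event_observed_A=[e for _, e in small],
                      event_observed_B=[e for _, e in large])
print(km_small.median_survival_time_, km_large.median_survival_time_, result.p_value)
```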


Subject(s)
Body Size; Peritoneal Dialysis, Continuous Ambulatory; Education, Continuing; Female; Humans; Male; Survival Analysis
12.
Adv Perit Dial ; 25: 155-64, 2009.
Article in English | MEDLINE | ID: mdl-19886338

ABSTRACT

Technique survival in continuous ambulatory peritoneal dialysis (CAPD) depends mostly on clearances in relation to body size and residual renal function (RRF). Our clinical impression has been that when RRF fails, larger patients leave CAPD sooner than smaller patients do. Peritoneal equilibration tests (PETs) and 24-hour adequacy evaluations performed in 277 patients in a single center from 1986 through 2009 were abstracted from the existing peritoneal dialysis adequacy database. A PET (using 2 L of 2.5% dextrose dialysis solution) was performed in 272 patients during the first 4 months of dialysis. Every 3 months, the patients brought their 24-hour urine and dialysate collections for adequacy evaluations and had height and weight recorded. Body surface area (BSA), body mass index (BMI), and total body water (TBW) were calculated. There were 1372 adequacy evaluations abstracted. The number of patients gradually declined over time because of death (28%) or transfer to other peritoneal regimens (25%) or to hemodialysis (23%). A small number of patients received a kidney graft (6%) or left CAPD for other reasons (12%); only 6% of patients remained on CAPD after 80 months of treatment. The mean (+/- standard deviation) PET 4-hour values were 0.652 +/- 0.128 for dialysate-to-plasma (D/P) ratio of creatinine (Cr), 0.403 +/- 0.0969 for 4-hour dialysate-to-initial dialysate (D/D0) glucose concentration ratio, and 2336 +/- 211 mL for the drain volume. There was no correlation between PET D/P Cr and BSA (r = 0.0051, p = 0.934), PET D/D0 glucose and BSA (r = 0.0042, p = 0.945), or PET drain volume and TBW. The correlations with other size indicators were very poor. None of the large patients (BSA > 1.9 m2, weight > 75 kg, BMI > 25 kg/m2) remained on CAPD for more than 80 months once they lost RRF. These results confirm our impression that, with declining RRF, larger patients do not continue CAPD as long as smaller patients do.
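
Not from the article: a short sketch of how the three body-size indicators mentioned above are commonly computed. The abstract does not state which formulas were used; the DuBois BSA formula and the Watson total-body-water equations below are conventional choices shown only as an assumption, with a made-up patient.

```python
def bmi(weight_kg: float, height_cm: float) -> float:
    """Body mass index, kg/m2."""
    h_m = height_cm / 100.0
    return weight_kg / (h_m * h_m)

def bsa_dubois(weight_kg: float, height_cm: float) -> float:
    """Body surface area (m2) by the DuBois formula."""
    return 0.007184 * (height_cm ** 0.725) * (weight_kg ** 0.425)

def tbw_watson(weight_kg: float, height_cm: float, age_yr: float, male: bool) -> float:
    """Total body water (L) by the Watson equations."""
    if male:
        return 2.447 - 0.09156 * age_yr + 0.1074 * height_cm + 0.3362 * weight_kg
    return -2.097 + 0.1069 * height_cm + 0.2466 * weight_kg

# Hypothetical patient: 80 kg, 175 cm, 55-year-old man
print(round(bmi(80, 175), 1), round(bsa_dubois(80, 175), 2), round(tbw_watson(80, 175, 55, True), 1))
```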


Subject(s)
Body Size; Kidney/physiopathology; Peritoneal Dialysis, Continuous Ambulatory; Peritoneum/metabolism; Biological Transport; Body Surface Area; Body Water; Body Weight; Creatinine/metabolism; Female; Glucose/metabolism; Humans; Male; Middle Aged
15.
Hemodial Int ; 12(4): 412-25, 2008 Oct.
Article in English | MEDLINE | ID: mdl-19090863

ABSTRACT

Sodium balance is precisely regulated by intake and output. The kidneys are responsible for adjusting sodium excretion to maintain balance at varying intakes. Our distant ancestors were herbivores. Their diet contained little sodium, so they developed powerful mechanisms for conserving sodium and achieving low urinary excretion. About 10,000 years ago, early humans became villagers and discovered that food could be preserved in brine, which led to increased consumption of salt. High salt intake increases extracellular volume (ECV), blood volume, and cardiac output, resulting in elevation of blood pressure. High ECV induces release of a digitalis-like immunoreactive substance and other inhibitors of Na(+)-K(+)-ATPase. As a consequence, intracellular sodium and calcium concentrations increase in vascular smooth muscle, predisposing it to contraction. Moreover, high ECV increases synthesis and decreases clearance of asymmetrical dimethyl-l-arginine, leading to inhibition of nitric oxide (NO) synthase. High concentrations of sodium and calcium in vascular smooth muscle and decreased synthesis of NO lead to an increase in total peripheral resistance. Restoration of normal ECV and blood pressure is attained by increased glomerular filtration and decreased sodium reabsorption. In some individuals, the kidneys have difficulty excreting sodium, so equilibrium is achieved at the expense of elevated blood pressure. There is some lag time between reduction of ECV and normalization of blood pressure because normal levels of Na(+)-K(+)-ATPase inhibitors and asymmetrical dimethyl-l-arginine are restored slowly. In dialysis patients, all mechanisms intended to increase renal sodium removal are futile, but they still operate and elevate blood pressure. Sodium balance must therefore be achieved via dialysis and ultrafiltration. Blood pressure is normalized a few weeks after ECV is returned to normal, i.e., when the patient reaches dry body weight; this is called the "lag phenomenon."


Subject(s)
Hypertension, Renal/metabolism; Kidney Failure, Chronic/metabolism; Kidney/metabolism; Renal Dialysis; Sodium Chloride, Dietary/metabolism; Animals; Homeostasis/physiology; Humans; Kidney Failure, Chronic/therapy
16.
Hemodial Int ; 12(2): 173-210, 2008 Apr.
Article in English | MEDLINE | ID: mdl-18394051

ABSTRACT

Accumulation of the knowledge requisite for the development of hemodialysis started in antiquity and continued through the Middle Ages into the 20th century. First, it had to be determined that the kidneys produce urine containing toxic substances that accumulate in the body if the kidneys fail to function properly; second, the processes of diffusion and dialysis had to be discovered; third, a safe method to prevent clotting in the extracorporeal circulation had to be developed; and fourth, biocompatible dialyzing membranes were needed. Most of the essential knowledge was acquired by the end of the 19th century. Hemodialysis as a practical means of replacing kidney function started and developed in the 20th century. The original hemodialyzers, using celloidin as a dialyzing membrane and hirudin as an anticoagulant, were used in animal experiments at the beginning of the 20th century, followed by a few attempts in humans in the 1920s. Rapid progress started with the application of cellophane membranes and heparin as an anticoagulant in the late 1930s and 1940s. The explosion of new dialyzer designs continued in the 1950s and 1960s and ended with the development of capillary dialyzers. Cellophane was replaced by other dialyzing membranes in the 1960s, 1970s, and 1980s. Dialysis solution was originally prepared in a tank from water, electrolytes, and glucose; this solution was recirculated through the dialyzer and back to the tank. In the 1960s, a single-pass dialysis solution preparation and delivery system was designed, which used a large quantity of dialysis solution for a single dialysis. Sorbent systems, using a small volume of regenerated dialysis solution, were developed in the mid-1960s and continue to be used for home hemodialysis and acute renal failure. At the end of the 20th century, a new closed system was developed that prepared and delivered ultrapure dialysis solution, automatically reused lines and dialyzers, and prepared the machine for the next dialysis; it was specifically designed for quotidian home hemodialysis. Another system for frequent home hemodialysis or acute renal failure was developed at the turn of the 21st century. This system uses premanufactured dialysis solution delivered to the home or dialysis unit, as is done for peritoneal dialysis.


Subject(s)
Kidneys, Artificial; Renal Dialysis/instrumentation; Equipment Design/history; Hemodialysis Solutions/history; History, 19th Century; History, 20th Century; History, 21st Century; History, Ancient; History, Medieval; Humans; Kidneys, Artificial/history; Renal Dialysis/history; Renal Insufficiency/history; Renal Insufficiency/physiopathology; Renal Insufficiency/therapy
18.
Blood Purif ; 25(1): 90-8, 2007.
Article in English | MEDLINE | ID: mdl-17170543

ABSTRACT

Chronic hemodialysis sessions, as developed in Seattle in the 1960s, were long procedures with minimal intra- and interdialytic symptoms. Over the next three decades, dialysis duration was shortened to 4, 3, and even 2 hours on thrice-weekly schedules. This approach spread rapidly, particularly in the United States, after the National Cooperative Dialysis Study suggested that the time of dialysis is of minor importance as long as urea clearance multiplied by dialysis time and scaled to total body water (Kt/V(urea)) equals 0.95-1.0. This number was later increased to 1.3, but the assumption that hemodialysis time is of minimal importance remained unchanged. However, Kt/V(urea) measures only the removal of low molecular weight substances and does not consider the removal of larger molecules. Nor does it correlate with the other important function of hemodialysis, namely ultrafiltration. Rapid ultrafiltration is associated with cramps, nausea, vomiting, headache, fatigue, hypotensive episodes during dialysis, and hangover after dialysis; patients remain fluid overloaded, with subsequent poor blood pressure control leading to left ventricular hypertrophy, diastolic dysfunction, and high cardiovascular mortality. Kt/V(urea) should be abandoned as a measure of dialysis quality. The formula suggests that it is possible to decrease t as long as K is proportionately increased, but this is not true. The time of dialysis should be adjusted so that patients do not suffer from symptoms related to rapid ultrafiltration, do not have other uremic symptoms, and, in most cases, have blood pressure controlled without antihypertensive drugs.
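
Not part of the abstract: a minimal Python sketch of the arithmetic behind the argument that t cannot simply be traded for K. For the same interdialytic weight gain, halving the session time doubles the required ultrafiltration rate regardless of dialyzer clearance. The roughly 13 mL/kg/h tolerance threshold cited in the comments comes from later observational work, not this article, and is used here only as an assumed reference point.

```python
def uf_rate_ml_kg_h(interdialytic_gain_l: float, session_h: float, weight_kg: float) -> float:
    """Ultrafiltration rate needed to remove the interdialytic fluid gain in one session."""
    return interdialytic_gain_l * 1000.0 / (session_h * weight_kg)

# Hypothetical 80 kg patient with a 3 L interdialytic weight gain
for hours in (8, 4, 2.5):
    rate = uf_rate_ml_kg_h(3.0, hours, 80.0)
    print(f"{hours} h session -> {rate:.1f} mL/kg/h")
# 8 h   -> ~4.7 mL/kg/h (easily tolerated)
# 4 h   -> ~9.4 mL/kg/h
# 2.5 h -> ~15.0 mL/kg/h (above the ~13 mL/kg/h level often associated with symptoms)
```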


Subject(s)
Renal Dialysis/adverse effects; Renal Dialysis/methods; Ultrafiltration/methods; Urea/metabolism; Body Composition; Humans; Hypertension/etiology; Hypertension/prevention & control; Hypotension/etiology; Hypotension/prevention & control; Metabolic Clearance Rate; Renal Dialysis/mortality; Time Factors; United States
19.
Adv Perit Dial ; 22: 147-52, 2006.
Article in English | MEDLINE | ID: mdl-16983959

ABSTRACT

The Tenckhoff catheter was developed in 1968 and has since been widely used for chronic peritoneal dialysis (PD) patients. Variations of the Tenckhoff catheter have been designed over the years in a search for the ideal PD catheter--an access that can provide reliable dialysate flow rates with few complications. Currently, data derived from randomized, controlled, multicenter trials dedicated to testing how catheter design and placement technique influence long-term catheter survival and function are scarce. As a result, no firm guidelines exist at the national or international level on the optimal PD catheter type or implantation technique. Also, no current statistics on the use of PD catheters are available. The last survey was carried out using an audience response system at the Annual Peritoneal Dialysis Conference in Orlando, Florida, in January 1994. The present analysis is based on a new survey done at the 2005 Annual Dialysis Conference in Tampa, Florida. It is a snapshot of preferences in catheter design and implantation technique in 2004 from an international sample of 65 respondent chronic PD centers. The Tenckhoff catheter remains the most widely used catheter, followed closely by the swan-neck catheter in both adult and pediatric respondent centers. Double-cuff catheters continue to be preferred over single-cuff catheters, and coiled intraperitoneal segments are generally preferred over straight intraperitoneal segments. Surgical implantation remains the prevailing placement method in both pediatric and adult respondent centers.


Subject(s)
Catheters, Indwelling/statistics & numerical data; Peritoneal Dialysis/instrumentation; Adult; Child; Humans
20.
Contrib Nephrol ; 150: 13-19, 2006.
Article in English | MEDLINE | ID: mdl-16720986

ABSTRACT

The peritoneal membrane has a surface area similar to the body surface area. It consists of mesothelial cells, interstitium, connective tissue fibers, blood vessels, and lymphatics. Solutes of various sizes traverse the peritoneal membrane through at least three types of pores: 'large' pores located in the venular interendothelial gaps, small 'paracellular' pores, and ultrasmall 'transcellular' pores or aquaporins localized in peritoneal capillaries and mesothelial cells. High molecular weight solutes are mass-transfer limited; thus, their clearances do not increase significantly with high dialysate flow. Clearances of small molecular weight solutes are dialysate flow limited. Ultrafiltration is proportional to the hydrostatic and osmotic transmembrane pressures. The peritoneum offers greater resistance to accompanying solutes than to water (solute sieving), so the concentration of solutes in the ultrafiltrate is lower than in plasma water. Sodium sieving leads to hypertension, which is frequently observed in patients treated with short-dwell or continuous-flow peritoneal dialysis. The peritoneal equilibration test is the most commonly used test to characterize peritoneal function and to select the most suitable dialysis technique for a patient. Long-term peritoneal dialysis is associated with progressive loss of ultrafiltration capability due to structural and functional alterations in the membrane, mostly as a consequence of exposure to glucose degradation products or advanced glycation end products generated during the sterilization process.
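
Not from the chapter: a small Python sketch of the membrane-flux relationship summarized above, in which ultrafiltration is proportional to the transmembrane pressure gradients and solute sieving is captured by a coefficient below one. The Starling-type form and every coefficient value below are generic textbook assumptions for illustration, not parameters reported in this article.

```python
def ultrafiltration_rate(lp_s: float, delta_p: float, sigma: float, delta_pi: float) -> float:
    """Schematic Starling-type volume flux into the peritoneal cavity (mL/min).

    lp_s     : hydraulic conductance x effective membrane area (mL/min/mmHg), assumed
    delta_p  : hydrostatic capillary-to-cavity pressure gradient (mmHg), assumed
    sigma    : reflection coefficient of the osmotic agent (glucose), assumed
    delta_pi : osmotic pressure of the dialysate glucose gradient (mmHg), assumed
    Both gradients are taken as positive when they drive fluid toward the dialysate.
    """
    return lp_s * (delta_p + sigma * delta_pi)

def sieved_concentration(plasma_conc: float, sieving_coefficient: float) -> float:
    """Solute sieving: the ultrafiltrate is more dilute than plasma water (coefficient < 1)."""
    return plasma_conc * sieving_coefficient

# Hypothetical numbers only, to show the proportionality the abstract describes
print(ultrafiltration_rate(lp_s=0.07, delta_p=10.0, sigma=0.03, delta_pi=2500.0))  # ~6 mL/min
print(sieved_concentration(plasma_conc=140.0, sieving_coefficient=0.7))            # sodium sieving
```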


Subject(s)
Peritoneal Dialysis; Peritoneum/metabolism; Biological Transport; Body Water/metabolism; Dialysis Solutions; Diffusion; Humans; Peritoneum/anatomy & histology