2.
Am J Transplant ; 16(3): 877-85, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26474298

ABSTRACT

From 5,000 to 10,000 kidney patients die prematurely in the United States each year, and about 100,000 more suffer the debilitating effects of dialysis, because of a shortage of transplant kidneys. To reduce this shortage, many advocate having the government compensate kidney donors. This paper presents a comprehensive cost-benefit analysis of such a change. It considers not only the substantial savings to society because kidney recipients would no longer need expensive dialysis treatments ($1.45 million per kidney recipient) but also estimates the monetary value of the longer and healthier lives that kidney recipients enjoy (about $1.3 million per recipient). These numbers dwarf the proposed $45,000-per-kidney compensation that might be needed to end the kidney shortage and eliminate the kidney transplant waiting list. From the viewpoint of society, the net benefit from saving thousands of lives each year and reducing the suffering of 100,000 more receiving dialysis would be about $46 billion per year, with the benefits exceeding the costs by a factor of 3. In addition, it would save taxpayers about $12 billion each year.
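The per-recipient arithmetic above can be sketched directly. The figures below are the paper's own estimates; the computation is only the back-of-the-envelope net benefit per additional transplant (the $46 billion aggregate also reflects costs, such as surgery, not itemized in this abstract).

```python
# Per-recipient figures quoted in the abstract (all in US dollars).
DIALYSIS_SAVINGS = 1_450_000   # avoided dialysis costs per kidney recipient
WELFARE_GAIN = 1_300_000       # value of the longer, healthier life gained
COMPENSATION = 45_000          # proposed government payment per kidney

# Net societal benefit per additional transplant, before other
# transplant-related costs (surgery, follow-up) not given in the abstract.
net_benefit_per_recipient = DIALYSIS_SAVINGS + WELFARE_GAIN - COMPENSATION
print(net_benefit_per_recipient)  # 2705000, i.e. about $2.7 million
```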


Subject(s)
Compensation and Redress , Financing, Organized/legislation & jurisprudence , Health Policy/economics , Kidney Failure, Chronic/surgery , Kidney Transplantation/economics , Living Donors/legislation & jurisprudence , Tissue and Organ Procurement/economics , Cost-Benefit Analysis , Female , Financing, Organized/organization & administration , Follow-Up Studies , Government Regulation , Health Care Costs , Health Services Needs and Demand/economics , Health Services Needs and Demand/legislation & jurisprudence , Humans , Kidney Transplantation/legislation & jurisprudence , Living Donors/supply & distribution , Male , Middle Aged , Tissue and Organ Procurement/legislation & jurisprudence , Tissue and Organ Procurement/organization & administration , United States
5.
Semin Dial ; 14(3): 157-9, 2001.
Article in English | MEDLINE | ID: mdl-11422917

ABSTRACT

Outcomes among dialysis patients vary considerably internationally and across regions within the United States. The Dialysis Outcomes and Practice Patterns Study (DOPPS) is a large, prospective, observational study of representative samples of hemodialysis patients in France, Germany, Italy, Japan, Spain, the United Kingdom, and the United States. The DOPPS collects a wealth of data regarding the patients' demographic characteristics, medical histories, laboratory values, prescriptions, dialysis unit practices, and outcomes. The study seeks to clarify which dialysis practices contribute to improved mortality rates, hospitalization rates, health-related quality of life, and vascular access outcomes, after adjusting for the effects of comorbid disease and demographic variables. Over 18,000 patients have been enrolled to date. This paper describes the initial findings and outlines plans to expand the study.


Subject(s)
Kidney Failure, Chronic/therapy , Outcome Assessment, Health Care , Practice Patterns, Physicians'/standards , Quality of Life , Renal Dialysis/standards , Female , Humans , Kidney Failure, Chronic/mortality , Male , Program Evaluation , Prospective Studies , Renal Dialysis/methods , Survival Analysis , Treatment Outcome , United States
6.
Am J Kidney Dis ; 37(2): 276-86, 2001 Feb.
Article in English | MEDLINE | ID: mdl-11157367

ABSTRACT

Hemodialyzer reuse is commonly practiced in the United States. Recent studies have raised concerns about the mortality risk associated with certain reuse practices. We evaluated adjusted mortality risk during 1- to 2-year follow-up in a representative sample of 12,791 chronic hemodialysis patients treated in 1,394 dialysis facilities from 1994 through 1995. Medical record abstraction provided data on reuse practice, use of bleach, dialyzer membrane, dialysis dose, and patient characteristics and comorbidity. Mortality risk was analyzed with bootstrapped Cox models according to (1) no reuse versus reuse, (2) reuse agent, and (3) dialyzer membrane with and without the use of bleach, while considering dialysis and patient factors. The relative risk (RR) for mortality did not differ for patients in reuse versus no-reuse units (RR = 0.96; 95% confidence interval [CI], 0.86 to 1.08; P > 0.50), and similar results were found with different levels of adjustment and subgroups (RR = 1.01 to 1.05; 95% CI, lower bound > 0.90, upper bound < 1.19 each; each P > 0.40). The RR for peracetic acid mixture versus formalin varied significantly by membrane type and use of bleach during reprocessing, achieving borderline significance for synthetic membranes. Among synthetic membranes, mortality was greater with low-flux than with high-flux membranes (RR = 1.24; 95% CI, 1.02 to 1.52; P = 0.04) and greater without than with bleach during reprocessing (RR = 1.24; 95% CI, 1.01 to 1.48; P = 0.04). Among all membranes, mortality was lowest for patients treated with high-flux synthetic membranes (RR = 0.82; 95% CI, 0.72 to 0.93; P = 0.002). Although mortality was not greater in reuse than in no-reuse units overall, differences may exist in mortality risk by reuse agent. Use of high-flux synthetic membrane dialyzers was associated with lower mortality risk, particularly when exposed to bleach. Clearance of larger molecules may have a role.
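Results like "RR = 0.96; 95% CI, 0.86 to 1.08" come from exponentiating a Cox model's log-hazard coefficient and its Wald confidence bounds. A minimal sketch of that conversion; the coefficient and standard error below are hypothetical, chosen only to give numbers of the same magnitude as the abstract's:

```python
import math

def rr_with_ci(beta: float, se: float, z: float = 1.96):
    """Exponentiate a Cox log-hazard coefficient and its Wald bounds
    into a relative risk with a 95% confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical coefficient and standard error for reuse vs. no reuse:
rr, lo, hi = rr_with_ci(beta=-0.04, se=0.058)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

A CI that straddles 1.0, as here, corresponds to the non-significant differences reported in the abstract.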


Subject(s)
Membranes, Artificial , Renal Dialysis/instrumentation , Renal Dialysis/mortality , Ambulatory Care Facilities , Comorbidity , Equipment Design , Equipment Reuse , Hospitals , Proportional Hazards Models , Renal Dialysis/statistics & numerical data , Risk , Sodium Hypochlorite , Sterilization/methods , United States/epidemiology
7.
Nephrologie ; 22(8): 379-84, 2001.
Article in French | MEDLINE | ID: mdl-11810992

ABSTRACT

Central venous catheters are widely used for vascular access in chronic haemodialysis. Different factors may lead to catheter use, whether clinical, such as emergency dialysis, or related to practices specific to each dialysis unit or country. The Dialysis Outcomes and Practice Patterns Study (DOPPS) is an observational study of more than 10,000 representative patients treated by haemodialysis, followed over a two-year period in the United States, Japan, and five European countries (France, Germany, Italy, Spain, United Kingdom). DOPPS data from the United States and Europe concerning catheters are reported in this paper. Catheter use is less frequent in Europe than in the US, both in incident and prevalent patients, and in patients who have been seen by a nephrologist in the pre-dialysis period. Tunneled and untunneled catheters are each associated with a significantly higher frequency of access infection compared with native arteriovenous fistulae and grafts. Patients with important comorbidities such as diabetes, cardiovascular disease, malnutrition, or dementia are more likely to be dialysed with tunneled catheters. Furthermore, patients initiating haemodialysis with a tunneled catheter display a higher mortality risk than patients starting haemodialysis with a permanent access. In summary, DOPPS data indicate that central venous catheters are used for chronic haemodialysis in patients with a high level of morbidity, and that their use is associated with an additional risk, particularly of infection, and with lower survival for tunneled catheters. Appropriate care should limit the use of central venous catheters to clinically indisputable indications.


Subject(s)
Catheterization, Central Venous , Kidney Failure, Chronic/therapy , Renal Dialysis , Treatment Outcome , Catheterization, Central Venous/adverse effects , Catheterization, Central Venous/statistics & numerical data , Europe , Humans , Infections , Japan , Kidney Failure, Chronic/mortality , Risk Factors , Survival Rate , United States
8.
Am J Kidney Dis ; 36(5): 991-9, 2000 Nov.
Article in English | MEDLINE | ID: mdl-11054356

ABSTRACT

Dialyzer reuse is practiced in more than 75% of the patients and dialysis units in the United States. However, a small fraction of patients treated in reuse units (RUUs) do not reuse dialyzers. This study evaluates both patient and facility characteristics associated with nonreuse in RUUs. The data source is the Dialysis Mortality and Morbidity Study, Waves 1, 3, and 4, of the US Renal Data System. Only facilities that practiced dialyzer reuse were included in the analysis. A total of 12,094 patients from 1,095 reuse facilities were studied. Patients undergoing hemodialysis as of December 31, 1993, were selected. Of all patients treated in RUUs, 8% did not reuse dialyzers. Nonreuse was significantly (P < 0.02) more common, based on adjusted odds ratios (ORs), among patients who were younger (OR = 1.16 per 10 years younger), had primary glomerulonephritis (OR = 1.26 versus diabetes), had a lower serum albumin level (OR = 1.72 per 1 g/dL lower), had more years on dialysis, and had a higher level of education. Nonreuse patients were more likely to be treated with low-flux dialyzers (OR = 7.35; P < 0.0001) and to receive a lower dialysis dose. Nonreuse was also more likely in larger units and in not-for-profit and hospital-based units. Patient refusal accounted for one fourth of nonreuse in RUUs and was associated with the same factors, as well as with fewer comorbid conditions and non-Hispanic ethnicity. Significant geographic variations (up to eightfold) were documented. Nonreuse patients are treated with smaller, low-flux dialyzers and, on average, receive a lower Kt/V than reuse patients in the same units.
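The adjusted odds ratios above are each scaled per unit of a covariate (e.g., OR = 1.16 per 10 years younger). For a linear term in a logistic model, an OR can be rescaled to a different change size by exponentiation; a small sketch, using the abstract's ORs purely as inputs:

```python
def rescale_or(or_per_unit: float, k: float) -> float:
    """Rescale a logistic-regression odds ratio from a one-unit change
    in the covariate to a k-unit change (valid for a linear logit term)."""
    return or_per_unit ** k

# OR = 1.16 per 10 years younger -> per 20 years younger:
or_20y = rescale_or(1.16, 2)      # ~1.35
# OR = 1.72 per 1 g/dL lower albumin -> per 0.5 g/dL lower:
or_half = rescale_or(1.72, 0.5)   # ~1.31
print(round(or_20y, 2), round(or_half, 2))
```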


Subject(s)
Durable Medical Equipment/statistics & numerical data , Kidney Failure, Chronic/therapy , Renal Dialysis/instrumentation , Ambulatory Care Facilities/statistics & numerical data , Educational Status , Ethnicity , Female , Hemodialysis Units, Hospital/statistics & numerical data , Humans , Logistic Models , Male , Membranes, Artificial , Middle Aged , Multivariate Analysis , Odds Ratio , Prospective Studies , Renal Dialysis/statistics & numerical data , Sex Factors , Treatment Refusal , United States
9.
Am J Kidney Dis ; 36(5): 1025-33, 2000 Nov.
Article in English | MEDLINE | ID: mdl-11054361

ABSTRACT

This national study compares waitlisting and transplantation rates by gender, race, and diabetes and evaluates physiologic factors (panel-reactive antibodies [PRA], blood type, HLA matchability) and related practices (early and multiple waitlisting) as explanatory factors. This longitudinal study of the time to transplant waitlisting among 228,552 incident end-stage renal disease (ESRD) dialysis patients and to cadaveric transplantation among 46,164 waitlist dialysis patients (n = 23,275 first cadaveric transplants) used US data for 1991 to 1997. Relative rates of waitlisting (RRWL) after ESRD onset and of cadaveric transplantation (RRTx) after waitlist (Cox proportional hazards models) were adjusted for age, race, sex, ESRD cause, region, and incidence/waitlist year. We found that women have an RRWL = 0.84 (P < 0.0001) and RRTx = 0.86 (P < 0.0001). PRA levels can explain the difference in the transplantation rate, because accounting for PRA gives an adjusted RRTx = 0.98 (NS) for women. For blacks versus whites, the RRWL = 0.59 (P < 0.0001) and RRTx = 0.55 (P < 0.0001). However, the transplantation rate can only partly be explained by ABO types, rare HLA types, and early and multiple waitlisting (adjusted RRTx = 0.67 [P < 0.0001]). For diabetes versus glomerulonephritis, the RRWL = 0.52 (P < 0.0001) and RRTx = 0.98 (NS). Older patients (40 to 59 years of age) are less likely to be waitlisted and to receive a transplant after waitlisting (RRWL = 0.57 [P < 0.0001], RRTx = 0.88 [P < 0.0001]) versus younger patients (ages 18 to 39 years). These results indicate substantial differences by age, sex, race, and diabetes in rates of waitlisting for transplantation and by age and race for transplantation after waitlisting. These differences by race were not explained by referral practices or the physiologic factors studied here.
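The pattern in which an unadjusted difference (RRTx = 0.86 for women) disappears after accounting for a confounder (PRA) can be illustrated with a stratified person-time analysis. A minimal sketch using a Mantel-Haenszel rate ratio on invented data, in which a sensitization-like stratum fully explains a crude deficit:

```python
# Invented person-time data stratified by a confounder (e.g., PRA level).
# Each stratum: (events_f, persontime_f, events_m, persontime_m), with
# identical within-stratum rates for the two groups.
strata = [
    (45, 500.0, 90, 1000.0),  # low-PRA stratum: 0.09 events/person-year each
    (15, 500.0, 6, 200.0),    # high-PRA stratum: 0.03 events/person-year each
]

# Crude (unstratified) rate ratio, women vs. men:
crude_rr = (sum(s[0] for s in strata) / sum(s[1] for s in strata)) / (
    sum(s[2] for s in strata) / sum(s[3] for s in strata))

# Mantel-Haenszel rate ratio, adjusting for the stratifying factor:
num = sum(a * t0 / (t1 + t0) for a, t1, c, t0 in strata)
den = sum(c * t1 / (t1 + t0) for a, t1, c, t0 in strata)
mh_rr = num / den

print(round(crude_rr, 2), round(mh_rr, 2))  # crude deficit vanishes
```

Because women in this toy dataset are over-represented in the low-rate stratum, the crude ratio is 0.75 while the stratum-adjusted ratio is 1.0, mirroring how PRA adjustment moved RRTx from 0.86 to 0.98.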


Subject(s)
Kidney Failure, Chronic/surgery , Kidney Transplantation/statistics & numerical data , Adolescent , Adult , Age Distribution , Aged , Cadaver , Child , Child, Preschool , Diabetic Nephropathies/surgery , Ethnicity , Female , Humans , Infant , Kidney Failure, Chronic/ethnology , Longitudinal Studies , Male , Middle Aged , Odds Ratio , Proportional Hazards Models , Sex Distribution , Waiting Lists
11.
N Engl J Med ; 341(23): 1725-30, 1999 Dec 02.
Article in English | MEDLINE | ID: mdl-10580071

ABSTRACT

BACKGROUND AND METHODS: The extent to which renal allotransplantation, as compared with long-term dialysis, improves survival among patients with end-stage renal disease is controversial, because those selected for transplantation may have a lower base-line risk of death. In an attempt to distinguish the effects of patient selection from those of transplantation itself, we conducted a longitudinal study of mortality in 228,552 patients who were receiving long-term dialysis for end-stage renal disease. Of these patients, 46,164 were placed on a waiting list for transplantation, 23,275 of whom received a first cadaveric transplant between 1991 and 1997. The relative risk of death and survival were assessed with time-dependent nonproportional-hazards analysis, with adjustment for age, race, sex, cause of end-stage renal disease, geographic region, time from first treatment for end-stage renal disease to placement on the waiting list, and year of initial placement on the list. RESULTS: Among the various subgroups, the standardized mortality ratio for the patients on dialysis who were awaiting transplantation (annual death rate, 6.3 per 100 patient-years) was 38 to 58 percent lower than that for all patients on dialysis (annual death rate, 16.1 per 100 patient-years). The relative risk of death during the first 2 weeks after transplantation was 2.8 times as high as that for patients on dialysis who had equal lengths of follow-up since placement on the waiting list, but at 18 months the risk was much lower (relative risk, 0.32; 95 percent confidence interval, 0.30 to 0.35; P<0.001). The likelihood of survival became equal in the two groups within 5 to 673 days after transplantation in all the subgroups of patients we examined. The long-term mortality rate was 48 to 82 percent lower among transplant recipients (annual death rate, 3.8 per 100 patient-years) than among patients on the waiting list, with relatively larger benefits among patients who were 20 to 39 years old, white patients, and younger patients with diabetes. CONCLUSIONS: Among patients with end-stage renal disease, healthier patients are placed on the waiting list for transplantation, and long-term survival is better among those on the waiting list who eventually undergo transplantation.
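The early-harm/late-benefit pattern (relative risk 2.8 shortly after transplantation, 0.32 at 18 months, survival curves crossing between 5 and 673 days) can be illustrated with a piecewise-constant hazard sketch. The baseline rate and relative risks below are the abstract's figures; the 14-day high-risk window and the resulting crossover day are simplifying assumptions, not the paper's estimates:

```python
# Waitlist daily hazard from the abstract's 6.3 deaths per 100 patient-years.
BASE = 0.063 / 365.0
EARLY_RR, LATE_RR, EARLY_DAYS = 2.8, 0.32, 14  # 14-day window is assumed

def cum_hazard_transplant(t_days: int) -> float:
    """Cumulative hazard t_days after transplantation under a
    piecewise-constant hazard: elevated early, reduced afterwards."""
    early = min(t_days, EARLY_DAYS) * EARLY_RR * BASE
    late = max(0, t_days - EARLY_DAYS) * LATE_RR * BASE
    return early + late

# First day on which the transplant group's cumulative mortality no longer
# exceeds that of comparable waitlisted patients:
crossover_day = next(t for t in range(1, 2000)
                     if cum_hazard_transplant(t) <= t * BASE)
print(crossover_day)
```

Under these assumptions the curves cross in under two months, comfortably inside the 5-to-673-day range the study observed across subgroups.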


Subject(s)
Kidney Failure, Chronic/mortality , Kidney Transplantation/mortality , Renal Dialysis/mortality , Waiting Lists , Adolescent , Adult , Age Factors , Aged , Cadaver , Child , Child, Preschool , Diabetes Complications , Female , Humans , Infant , Kidney Failure, Chronic/ethnology , Kidney Failure, Chronic/surgery , Kidney Failure, Chronic/therapy , Longitudinal Studies , Male , Middle Aged , Patient Selection , Risk , Survival Analysis , United States/epidemiology
12.
Kidney Int ; 55(6): 2467-76, 1999 Jun.
Article in English | MEDLINE | ID: mdl-10354296

ABSTRACT

BACKGROUND: Daily hemodialysis has been proposed to improve outcomes for patients with end-stage renal disease, and there is increasing evidence that it may have advantages over intermittent dialysis. Despite these potential advantages, however, daily hemodialysis is infrequently used in the United States, and published accounts of the technique are few. METHODS: We describe outcomes in a cohort of 72 patients treated at nine centers from 1972 to 1996 after their hemodialysis frequency was increased from three to six times per week. Analyses of predialysis blood pressure and laboratory parameters from 6 months before until 12 months after starting frequent hemodialysis used a repeated-measures statistical technique. RESULTS: Predialysis systolic and diastolic blood pressures fell by 7 and 4 mm Hg, respectively, after starting frequent hemodialysis (P = 0.02). Reductions were greatest among patients being treated with antihypertensive medications, despite a reduction in their dosage of those medications. Postdialysis weight fell by 1.0% within one month of starting frequent hemodialysis, accompanied by improved control of hypertension. After the initial drop, postdialysis weight increased at a rate of 0.85 kg per six months. Serum albumin rose by 0.29 g/dL (P < 0.001) between months 1 and 12 of treatment with daily hemodialysis. Hematocrit rose by 3.0 percentage points (P = 0.02) among patients (N = 56) not treated with erythropoietin during this period. Two years after the start of daily hemodialysis, Kaplan-Meier analyses showed a patient survival of 93%, a technique survival of 77%, and an arteriovenous fistula patency of 92%. Vascular access patency was excellent despite more frequent use of the access. CONCLUSIONS: These results suggest that in certain patients, daily hemodialysis might have advantages over three-times-per-week hemodialysis.
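The 93% patient survival and related figures above come from Kaplan-Meier analysis. A from-scratch sketch of the estimator, run on a tiny invented follow-up dataset (times in months; 1 = death, 0 = censored):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up time per patient; events: 1 = death, 0 = censored.
    Returns a list of (time, S(time)) at each observed event time."""
    data = sorted(zip(times, events))
    n = len(data)
    s, curve, i = 1.0, [], 0
    while i < n:
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)    # events at time t
        at_risk = sum(1 for tt, _ in data if tt >= t)   # still under follow-up
        if deaths:
            s *= 1.0 - deaths / at_risk
            curve.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)        # skip past ties
    return curve

# Tiny invented follow-up data, not the study's cohort:
curve = kaplan_meier([2, 3, 3, 5, 8], [1, 0, 1, 1, 0])
print(curve)  # survival drops only at observed event times
```

Censored patients (e.g., still alive at last contact) leave the risk set without forcing the curve down, which is the estimator's key feature.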


Subject(s)
Renal Dialysis/methods , Adolescent , Adult , Aged , Arteriovenous Shunt, Surgical/adverse effects , Blood Pressure , Body Weight , Calcium Phosphates/blood , Cholesterol/blood , Cohort Studies , Europe , Female , Hematocrit , Humans , Kidney Failure, Chronic/pathology , Kidney Failure, Chronic/physiopathology , Kidney Failure, Chronic/therapy , Male , Middle Aged , Serum Albumin/metabolism , Time Factors , United States
13.
Health Serv Res ; 33(6): 1567-92, 1999 Feb.
Article in English | MEDLINE | ID: mdl-10029498

ABSTRACT

OBJECTIVE: To evaluate the effects of case mix, practice patterns, features of the payment system, and facility characteristics on the cost of dialysis. DATA SOURCES/STUDY SETTING: The nationally representative sample of dialysis units in the 1991 U.S. Renal Data System's Case Mix Adequacy (CMA) Study. The CMA data were merged with data from Medicare Cost Reports, HCFA facility surveys, and HCFA's end-stage renal disease patient registry. STUDY DESIGN: We estimated a statistical cost function to examine the determinants of costs at the dialysis unit level. PRINCIPAL FINDINGS: The relationship between case mix and costs was generally weak. However, dialysis practices (type of dialysis membrane, membrane reuse policy, and treatment duration) did have a significant effect on costs. Further, facilities whose payment was constrained by HCFA's ceiling on the adjustment for area wage rates incurred higher costs than unconstrained facilities. The costs of hospital-based units were considerably higher than those of freestanding units. Among chain units, only members of one of the largest national chains exhibited significant cost savings relative to independent facilities. CONCLUSIONS: We found little evidence that adjusting dialysis payment to account for differences in case mix across facilities would be necessary to ensure access to care for high-cost patients or to reimburse facilities equitably for their costs. However, current efforts to increase the dose of dialysis may require higher payments. Longer treatments appear to be the most economical method of increasing the dose of dialysis. Switching to more expensive types of dialysis membranes was a more costly means of increasing dose and hence must be justified by benefits beyond those of a higher dose. Reusing membranes saved money, but the savings were insufficient to offset the costs associated with using more expensive membranes.
Most, but not all, of the higher costs observed in hospital-based units appear to reflect overhead cost allocation rather than a difference in real resources devoted to treatment. The economies experienced by the largest chains may provide an explanation for their recent growth in market share. The heterogeneity of results by chain size implies that characterizing units using a simple chain status indicator variable is inadequate. Cost differences by facility type and the effects of the ongoing growth of large chains are worthy of continued monitoring to inform both payment policy and antitrust enforcement.
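The "statistical cost function" referred to above is a regression of facility costs on practice and facility characteristics. A bare-bones illustration, fitting cost per session against treatment duration by ordinary least squares on invented facility data (the figures and the single-covariate linear form are hypothetical, not the study's model):

```python
# Invented facility-level data: (mean treatment duration in hours,
# average cost per dialysis session in dollars).
facilities = [(3.0, 120.0), (3.5, 124.0), (4.0, 131.0), (4.5, 133.0)]

xs = [x for x, _ in facilities]
ys = [y for _, y in facilities]
n = len(facilities)
xbar, ybar = sum(xs) / n, sum(ys) / n

# Ordinary least squares for: cost = intercept + slope * duration
slope = (sum((x - xbar) * (y - ybar) for x, y in facilities)
         / sum((x - xbar) ** 2 for x in xs))
intercept = ybar - slope * xbar

print(round(slope, 2), round(intercept, 2))  # slope = marginal cost per hour
```

In this framing, a small slope relative to the cost of switching membranes is what makes longer treatments the cheaper route to a higher dialysis dose.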


Subject(s)
Ambulatory Care Facilities/economics , Diagnosis-Related Groups/economics , Hemodialysis Units, Hospital/economics , Hospital Costs/statistics & numerical data , Medicare/economics , Practice Patterns, Physicians'/economics , Reimbursement Mechanisms/economics , Renal Dialysis/economics , Centers for Medicare and Medicaid Services, U.S. , Cost Savings , Diagnosis-Related Groups/classification , Female , Health Services Research , Humans , Male , Middle Aged , Organizational Policy , Registries , Renal Dialysis/instrumentation , Renal Dialysis/methods , United States
14.
Transplantation ; 67(2): 291-5, 1999 Jan 27.
Article in English | MEDLINE | ID: mdl-10075596

ABSTRACT

BACKGROUND: The role of renal transplantation as treatment for end-stage sickle cell nephropathy (SCN) has not been well established. METHODS: We performed a comparative investigation of patient and allograft outcomes among age-matched African-American kidney transplant recipients with ESRD as a result of SCN (n=82) and all other causes (Other-ESRD, n=22,565). RESULTS: The incidence of delayed graft function and predischarge acute rejection in the SCN group (24% and 26%) was similar to that observed in the Other-ESRD group (29% and 27%). The mean discharge serum creatinine (SCr) was 2.7 (+/-2.5) mg/dl in the SCN recipients compared with 3.0 (+/-2.5) mg/dl in the Other-ESRD recipients (P=0.42). There was no difference in the 1-year cadaveric graft survival (SCN: 78% vs. Other-ESRD: 77%), and the multivariable adjusted 1-year risk of graft loss indicated no significant effect of SCN (relative risk [RR]=1.39, P=0.149). However, the 3-year cadaveric graft survival tended to be lower in the SCN group (48% vs. 60%, P=0.055), and their adjusted 3-year risk of graft loss was significantly greater (RR=1.60, P=0.003). There was a trend toward improved survival in the SCN transplant recipients compared with their dialysis-treated, wait-listed counterparts (RR=0.14, P=0.056). In comparison with the Other-ESRD group (RR=1.00), the adjusted mortality risk in the SCN group was higher both at 1 year (RR=2.95, P=0.001) and at 3 years (RR=2.82, P=0.0001) after renal transplantation. CONCLUSIONS: Short-term renal allograft results in recipients with end-stage SCN were similar to those obtained in other causes of ESRD, but long-term outcomes were comparatively diminished. There was a trend toward better patient survival with renal transplantation relative to dialysis in end-stage SCN.


Subject(s)
Anemia, Sickle Cell/complications , Graft Survival , Kidney Failure, Chronic/etiology , Kidney Failure, Chronic/surgery , Kidney Transplantation/physiology , Adolescent , Adult , Black or African American , Black People , Child , Cohort Studies , Female , Histocompatibility Testing , Humans , Kidney Transplantation/mortality , Male , Middle Aged , Risk Factors , Survival Analysis , Time Factors , Tissue Donors , Treatment Outcome , United States
15.
Am J Kidney Dis ; 33(1): 1-10, 1999 Jan.
Article in English | MEDLINE | ID: mdl-9915261

ABSTRACT

A number of studies have suggested that type of dialysis membrane is associated with differences in long-term outcome of patients undergoing hemodialysis, both in terms of morbidity and mortality. The purpose of this study was to determine the relationship of membrane type and specific causes of death. Data from the United States Renal Data System Case Mix Adequacy Study, a national random sample of hemodialysis patients who were alive on December 31, 1990, were used. Our study was limited to patients in this data set who were undergoing dialysis for at least 1 year (n = 4,055). For the main analytic models, membrane type was classified into two categories: unmodified cellulose or MC/SYN (which combines modified cellulose [MC] and synthetic membranes [SYN]). The relationships of membrane type and major causes of mortality were analyzed using Cox proportional hazards models, which adjusted for multiple (21) covariates, including demographics, comorbidity, Kt/V, and other parameters. Patients were censored at transplantation or 60 days after a switch to peritoneal dialysis. Compared with patients dialyzed with unmodified cellulose membranes, the adjusted relative mortality risk (RR) from infection was 31% lower (RR = 0.69; P = 0.03) and from coronary artery disease was 26% lower (RR = 0.74; P = 0.07) for patients dialyzed with MC/SYN membranes. No statistically significant difference (all P > 0.1) was found in mortality risk from cerebrovascular disease (RR = 1.08), other cardiac causes (RR = 0.86), malignancy (RR = 0.90), or other known causes (RR = 0.82) between patients dialyzed with MC/SYN compared with unmodified cellulose membranes. These results offer support to reported experimental and observational clinical studies that have found that unmodified cellulose membranes may increase the risk for both infection and atherogenesis. 
Further studies are necessary to evaluate the possibility of confounding factors, compare more specific membrane types, and determine the pathophysiology linking membrane type to cause-specific mortality.


Subject(s)
Kidney Failure, Chronic/mortality , Membranes, Artificial , Renal Dialysis/instrumentation , Cause of Death , Comorbidity , Databases, Factual/statistics & numerical data , Female , Humans , Kidney Failure, Chronic/therapy , Male , Middle Aged , Patient Selection , Proportional Hazards Models , Random Allocation , Renal Dialysis/mortality , Renal Dialysis/statistics & numerical data , Risk , United States/epidemiology
16.
Am J Kidney Dis ; 32(1): 139-45, 1998 Jul.
Article in English | MEDLINE | ID: mdl-9669435

ABSTRACT

Noncompliance with hemodialysis (HD), depending on the definition, occurs in 2% to more than 50% of patients. To better understand predictors and outcomes of noncompliance, we evaluated patient characteristics associated with noncompliance and the impact of noncompliance on survival. Using data from two USRDS special studies, we identified 6,251 patients who were on dialysis for more than 1 year for inclusion in this study. Noncompliance was defined in four ways: skipping one or more HD sessions in a month, shortening one or more HD sessions in a month by 10 or more minutes, an interdialytic weight gain (IWG) of more than 5.7% of dry weight, or a serum phosphate (PO4) of greater than 7.5 mg/dL. Sociodemographic predictors of noncompliance were identified using logistic regression. Survival analysis was done using Cox proportional hazards models with adjustments for sociodemographics, comorbid conditions, and dose of HD. Overall, 8.5% of patients skipped HD, 20% shortened HD (7% three or more times), 10% had more than a 5.7% IWG, and 22% had a PO4 greater than 7.5 mg/dL. There was a significant correlation among the measures of noncompliance. Blacks (adjusted odds ratio [AOR] = 2.10), patients aged 20 to 39 years (AOR = 1.62), and smokers (AOR = 1.34) were significantly more likely to skip HD than whites, patients aged 40 to 59 years, and nonsmokers, respectively (P < 0.01 for each). Similar results were seen for the other measures of noncompliance, except for PO4, for which blacks were significantly less likely to be noncompliant (AOR = 0.85, P < 0.05). Compared with compliant patients, those who skipped one or more HD sessions in a month had a 25% higher risk of death (P < 0.01). Those who had greater than a 5.7% IWG had a 35% higher risk of death (P < 0.001), whereas those with a PO4 > 7.5 mg/dL had a 13% higher risk of death (P < 0.05). Overall, patients who shortened HD sessions did not have a higher risk of death, but those who shortened three or more sessions in 1 month had a 20% higher risk of death (P < 0.05). Compliance with a medical regimen is a complex issue. Noncompliance in HD often, but not always, is associated with a higher risk of an adverse outcome.
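The study's four operational definitions of noncompliance can be expressed as a simple screen. The thresholds are the ones quoted above; the function name and its inputs (a patient's worst-case values for one month) are hypothetical conveniences:

```python
# Thresholds from the study's four noncompliance definitions.
def noncompliance_flags(sessions_skipped: int, minutes_shortened: int,
                        iwg_pct_dry_weight: float,
                        phosphate_mg_dl: float) -> dict:
    """Flag each of the four noncompliance criteria for one month."""
    return {
        "skipped_session": sessions_skipped >= 1,        # >= 1 session/month
        "shortened_session": minutes_shortened >= 10,    # >= 10 min shorter
        "excess_weight_gain": iwg_pct_dry_weight > 5.7,  # IWG > 5.7% dry wt
        "high_phosphate": phosphate_mg_dl > 7.5,         # PO4 > 7.5 mg/dL
    }

flags = noncompliance_flags(0, 12, 6.1, 7.0)
print(flags)  # shortened_session and excess_weight_gain are flagged
```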


Subject(s)
Kidney Failure, Chronic/therapy , Patient Compliance , Renal Dialysis , Adult , Black or African American/psychology , Black or African American/statistics & numerical data , Black People , Cohort Studies , Female , Humans , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/psychology , Logistic Models , Male , Middle Aged , Proportional Hazards Models , Renal Dialysis/mortality , Renal Dialysis/psychology , Renal Dialysis/statistics & numerical data , Risk Factors , Smoking/epidemiology , Socioeconomic Factors , Survival Analysis , White People/psychology , White People/statistics & numerical data
17.
Am J Kidney Dis ; 31(6): 997-1006, 1998 Jun.
Article in English | MEDLINE | ID: mdl-9631845

ABSTRACT

A strong association exists between nutritional status and morbidity and mortality in patients with end-stage renal disease who are treated with hemodialysis. We describe here the predictive value for mortality, over 5 years of follow-up, of a number of risk factors recorded at baseline in a national sample of 3,607 hemodialysis patients. Among the variables studied were case-mix covariates, caregiver classifications of nutritional status, serum albumin concentration, and body mass index (BMI). The Case Mix Adequacy special study of the United States Renal Data System (USRDS) provided these measurements as of December 31, 1990. The USRDS patient standard analysis file provided follow-up data on mortality for all patients through December 31, 1995, by which time 64.7% of the patients had died. BMI is a simple anthropometric measurement that has received little attention in dialysis practice. Caregiver classification refers to documentation in a patient's dialysis facility medical records stating an individual to be "undernourished/cachectic," "obese/overweight," or "well-nourished." The mean serum albumin was 3.7 +/- 0.45 (SD) g/dL, and the mean BMI was 24.4 +/- 5.3 (SD) kg/m2. By caregiver classification, 20.1% of patients were undernourished and 24.9% were obese. In hazard regression models, including but not limited to the Cox proportional hazards model, low BMI, low serum albumin, and the caregiver classification "undernourished" were independently and significantly predictive of increased mortality. In analyses allowing for time-varying relative mortality risks (ie, nonproportional hazards), the greatest predictive value of all three variables occurred early during follow-up, but the independent predictive value of baseline serum albumin and BMI measurements on mortality risk persisted even 5 years later. No evidence of increasing mortality risk was found for higher values of BMI. In summary, serum albumin was confirmed to be a useful predictor of mortality risk in hemodialysis patients; BMI was established as an independently important predictor of mortality; both serum albumin and BMI, measured at baseline, continued to possess predictive value 5 years later; and the subjective caregiver classification "undernourished" has independent value in predicting mortality risk beyond the information gained from the two other markers of nutritional status, BMI and serum albumin.
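Both baseline markers are trivial to compute and to situate against the cohort. The means and SDs below are the study's reported values; expressing a single patient's albumin and BMI as z-scores against them is an illustration, not part of the paper's models:

```python
# Study-reported cohort means and SDs (albumin in g/dL, BMI in kg/m2).
ALB_MEAN, ALB_SD = 3.7, 0.45
BMI_MEAN, BMI_SD = 24.4, 5.3

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index in kg/m2."""
    return weight_kg / height_m ** 2

def z_score(value: float, mean: float, sd: float) -> float:
    """Standardized position of a value relative to the cohort."""
    return (value - mean) / sd

# Hypothetical patient: 58 kg, 1.70 m, albumin 3.2 g/dL.
patient_bmi = bmi(58.0, 1.70)
print(round(z_score(patient_bmi, BMI_MEAN, BMI_SD), 2))  # below cohort mean
print(round(z_score(3.2, ALB_MEAN, ALB_SD), 2))          # below cohort mean
```

Negative z-scores on both markers would place this hypothetical patient in the low-BMI, low-albumin region that the study found predictive of higher mortality.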


Subject(s)
Nutritional Status , Renal Dialysis/mortality , Adult , Aged , Aged, 80 and over , Body Mass Index , Female , Humans , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/therapy , Male , Middle Aged , Nutrition Disorders/complications , Risk Factors , Serum Albumin/analysis , Survival Rate
18.
Am J Kidney Dis ; 32(6 Suppl 4): S9-15, 1998 Dec.
Article in English | MEDLINE | ID: mdl-9892361

ABSTRACT

The past decade has seen substantial improvements in end-stage renal disease (ESRD) outcomes, especially mortality, in the United States. Incidence rates for treated ESRD have doubled for most age groups, probably because of improved survival among high-risk populations, such as patients with diabetes and hypertension. The ESRD patient population is becoming older and has a greater incidence of diabetes because of changes in the types of patients starting treatment. The number of patients added to the waiting list each year for transplants has increased dramatically, whereas the number of transplantations performed annually has remained relatively constant. Although transplantation is consequently less available than before, transplant survival, both of the patient and the graft, has improved dramatically. Length of stay for hospitalizations has decreased. Both dialysis mortality and all ESRD mortality have decreased. It is important to monitor such statistics to try to modify adverse trends in outcomes for patients with ESRD. The ability to monitor patient outcomes through national databases has improved greatly during the last decade. Large-scale population-based studies of practices and outcomes for patients with ESRD offer a potent addition to the available arsenal of research tools, which was previously dominated by studies from one or a few institutions and by more expensive randomized clinical trials.


Subject(s)
Kidney Failure, Chronic/mortality , Adolescent , Adult , Aged , Aged, 80 and over , Female , Hospitalization/statistics & numerical data , Humans , Incidence , Kidney Failure, Chronic/epidemiology , Kidney Failure, Chronic/therapy , Kidney Transplantation/statistics & numerical data , Male , Middle Aged , Outcome Assessment, Health Care , Peritoneal Dialysis/statistics & numerical data , Prevalence , Renal Dialysis/statistics & numerical data , Survival Rate , United States/epidemiology
19.
Am J Kidney Dis ; 32(6 Suppl 4): S34-8, 1998 Dec.
Article in English | MEDLINE | ID: mdl-9892363

ABSTRACT

Details regarding dialysis therapy have been studied by the US Renal Data System (USRDS) in four random samples of US hemodialysis patients during the years 1986 to 1997. During this decade, the delivered dose of hemodialysis therapy increased by at least 0.2 Kt/V. The frequency of twice-weekly dialysis prescriptions decreased, whereas the duration of each treatment showed only minor changes. A large shift to more biocompatible membranes, particularly synthetic membranes, was observed. The use of acetate dialysate almost disappeared. Outcomes research by the USRDS showed significantly lower mortality risk independently associated with higher delivered Kt/V, substituted cellulose or synthetic membranes, and bicarbonate dialysate. The projected reduction in mortality risk from these changes in hemodialysis therapy was of a similar magnitude to the observed 14% to 17% reduction in mortality rate during the years 1990 to 1996. National observational studies of dialysis patients may influence the practice of dialysis and lead to improved survival.
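The dialysis dose discussed above, Kt/V, is urea clearance (K) times treatment time (t) divided by the urea distribution volume (V). The USRDS studies derived it from their own urea kinetic modeling; as an illustrative sketch only, one widely used bedside estimate is the second-generation single-pool Daugirdas formula, shown here with hypothetical input values:

```python
import math

def sp_ktv(pre_bun, post_bun, hours, uf_liters, post_weight_kg):
    """Single-pool Kt/V via the second-generation Daugirdas formula:

        spKt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF / W

    where R is the post/pre dialysis BUN ratio, t the session length
    in hours, UF the ultrafiltration volume in liters, and W the
    post-dialysis weight in kilograms.
    """
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_liters / post_weight_kg
```

For example, a session with a pre-dialysis BUN of 100 mg/dL falling to 30 mg/dL over 4 hours, with 2 L of ultrafiltration and a 70 kg post-dialysis weight, yields a single-pool Kt/V of roughly 1.4, so the decade's gain of 0.2 Kt/V reported above is a meaningful fraction of a typical delivered dose.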


Subject(s)
Renal Dialysis/mortality , Renal Dialysis/trends , Humans , Membranes, Artificial , Renal Dialysis/methods , Risk Factors , Survival Rate , United States/epidemiology
20.
Transplantation ; 66(12): 1651-9, 1998 Dec 27.
Article in English | MEDLINE | ID: mdl-9884254

ABSTRACT

BACKGROUND: Survival of transplant recipients after primary renal allograft failure has not been well studied. METHODS: A cohort of 19,208 renal transplant recipients with primary allograft failure between 1985 and 1995 was followed from the date of allograft loss until death, repeat transplantation, or December 31, 1996. The mortality, wait-listing, and repeat transplantation rates were assessed. The mortality risks associated with repeat transplantation were estimated with a time-dependent survival model. RESULTS: In total, 34.5% (n=6,631) of patients died during follow-up. Of these deaths, 82.9% (n=5,498) occurred in patients not wait-listed for repeat transplantation, 11.9% (n=789) occurred in wait-listed patients, and 5.2% (n=344) occurred in second transplant recipients. Before repeat transplantation, the adjusted 5-year patient survival was 36%, 49%, and 65% for type I diabetes mellitus (DM), type II DM, and nondiabetic end-stage renal disease, respectively (P<0.001; DM vs. nondiabetics). The adjusted 5-year patient survival was lower in Caucasians (57%, P<0.001) compared with African-Americans (67%) and other races (64%). The 5-year repeat transplantation rate was 29%, 15%, and 19%, whereas the median waiting time for a second transplant was 32, 90, and 81 months for Caucasians, African-Americans, and other races, respectively (P<0.0001 each). Repeat transplantation was associated with 45% and 23% reductions in 5-year mortality for type I DM and nondiabetic end-stage renal disease, respectively, when compared with their wait-listed dialysis counterparts with prior transplant failure. CONCLUSIONS: The loss of a primary renal allograft was associated with significant mortality, especially in recipients with type I DM. Repeat transplantation was associated with a substantial improvement in 5-year patient survival. Recipients with type I DM achieved the greatest proportional benefit from repeat transplantation.
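The 45% and 23% figures above come from the paper's time-dependent survival model; the underlying arithmetic relating two survival probabilities at the same time point to a relative reduction in cumulative mortality is simple. A hedged sketch, with hypothetical survival values rather than the study's adjusted estimates:

```python
def mortality_reduction(surv_treated, surv_control):
    """Relative reduction in cumulative mortality implied by two
    survival probabilities at the same follow-up time point.

    reduction = 1 - (1 - S_treated) / (1 - S_control)
    """
    return 1 - (1 - surv_treated) / (1 - surv_control)
```

For instance, hypothetical 5-year survivals of 0.78 with repeat transplantation versus 0.60 on the waiting list imply a relative mortality reduction of 1 - 0.22/0.40 = 45%, matching the scale of benefit the abstract reports for type I DM.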


Subject(s)
Kidney Transplantation/mortality , Adolescent , Adult , Aged , Female , Humans , Kidney Failure, Chronic/etiology , Kidney Failure, Chronic/mortality , Male , Middle Aged , Multivariate Analysis , Prognosis , Reoperation , Survival Rate , Transplantation, Homologous