Results 1 - 20 of 23
1.
Kidney Med ; 5(2): 100580, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36712314

ABSTRACT

Rationale & Objective: Compared with the original nursing home status variable (any nursing home stay in the previous calendar year), new nursing home status variables were developed to improve the risk adjustment of the Standardized Mortality/Hospitalization Ratio (SMR/SHR) models used in public reporting of dialysis quality of care, such as the Annual Dialysis Facility Report. Study Design: Retrospective observational study. Setting & Participants: 625,040 US maintenance dialysis patients with >90 days of kidney failure in 2019. Predictors: Nursing home status variables; patient characteristics; comorbid conditions. Outcomes: Mortality/hospitalization. Analytical Approach: We assigned patients and patient times (SMR/SHR model) to one of 3 mutually exclusive categories: long-term care (≥90 days), short-term care (1-89 days), or non-nursing home, based on nursing home stays during the 365 days preceding the first day of the time period at risk. Nursing home status was derived from the Nursing Home Minimum Data Set. We compared hazard ratios from adjusted models, facility SMR/SHR performance, and model C-statistics between the original and new models. Results: The SMR hazard ratio for the original nursing home status (2.09) was lower than the ratios for both short-term care (2.38) and long-term care (2.43), whereas the SHR hazard ratio for the original nursing home status (1.10) fell between the ratios for long-term care (1.01) and short-term care (1.20). Hazard ratios differed between short-term care and long-term care for both measures. Small percentages of facilities changed performance categories: 0.7% for SMR and 0.4% for SHR. The SMR C-statistic improved, whereas the SHR C-statistic was relatively unchanged. Limitations: Limited capture of subacute rehabilitation stays in the nursing home because of the 90-day cutoff separating short-term from long-term care; inability to draw causal inferences about nursing home care. Conclusions: Use of a nursing home metric that effectively separates short-term from long-term nursing home utilization results in more meaningful risk adjustment that generally comports with Medicare payment policy, potentially yielding more interpretable results for dialysis stakeholders.
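
A minimal sketch (not the study's code) of the 365-day look-back classification described in the Analytical Approach; the function name and data layout are illustrative assumptions:

    from datetime import date, timedelta

    def nursing_home_status(nh_days, period_start):
        """Classify nursing home status from stays in the 365 days before the
        start of the at-risk period: >=90 days -> long-term care, 1-89 days ->
        short-term care, 0 days -> non-nursing home."""
        window_start = period_start - timedelta(days=365)
        days_in_nh = sum(1 for d in nh_days if window_start <= d < period_start)
        if days_in_nh >= 90:
            return "long-term care"
        if days_in_nh >= 1:
            return "short-term care"
        return "non-nursing home"

    # Hypothetical example: 120 nursing home days in the look-back window.
    stays = [date(2018, 6, 1) + timedelta(days=i) for i in range(120)]
    print(nursing_home_status(stays, date(2019, 1, 1)))  # -> long-term care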

2.
Kidney Int Rep ; 7(6): 1278-1288, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35685310

ABSTRACT

Introduction: Rather than generating 1 transplant by directly donating to a candidate on the waitlist, deceased donors (DDs) could achieve additional transplants by donating to a candidate in a kidney paired donation (KPD) pool, thereby initiating a chain that ends with a living donor (LD) donating to a candidate on the waitlist. We model outcomes arising from various strategies that allow DDs to initiate KPD chains. Methods: We base simulations on actual 2016 to 2017 US DD and waitlist data and use simulated KPD pools to model DD-initiated KPD chains. We also consider methods to assess and overcome the primary criticism of this approach, namely the potential to disadvantage waitlisted candidates with blood type O. Results: Compared with shorter DD-initiated KPD chains, longer chains increase the number of KPD transplants by up to 5% and reduce the number of DDs allocated to the KPD pool by 25%. These strategies increase the overall number of blood type O transplants and make LDs available to candidates on the waitlist. Restricting the allocation of blood type O DDs by requiring that KPD chains end with blood type O LD donations to the waitlist markedly reduces the number of KPD transplants achieved. Conclusion: Allocating fewer than 3% of DDs to initiate KPD chains could increase the number of kidney transplants by up to 290 annually. Such use of DDs allows additional transplantation of highly sensitized and blood type O KPD candidates. Collectively, patients of each blood type, including blood type O, would benefit from the proposed strategies.

3.
Am J Transplant ; 21(1): 103-113, 2021 01.
Article in English | MEDLINE | ID: mdl-32803856

ABSTRACT

As proof of concept, we simulate a revised kidney allocation system that includes deceased donor (DD) kidneys as chain-initiating kidneys (DD-CIK) in a kidney paired donation pool (KPDP), and estimate potential increases in the number of transplants. We consider chains of length 2 in which the DD-CIK gives to a candidate in the KPDP, and that candidate's incompatible donor donates to the DD waitlist. In simulations, we vary initial pool size, arrival rates of candidate/donor pairs and (living) nondirected donors (NDDs), and delay time from entry to the KPDP until a candidate is eligible to receive a DD-CIK. Using data on candidate/donor pairs and NDDs from the Alliance for Paired Kidney Donation, and actual DDs from Scientific Registry of Transplant Recipients (SRTR) data, simulations extend over 2 years. With an initial pool of 400, respective candidate and NDD arrival rates of 2 per day and 3 per month, and delay times for access to a DD-CIK of 6 months or less, including DD-CIKs increases the number of transplants by at least 447 over 2 years and greatly reduces waiting times of KPDP candidates. Potential effects on waitlist candidates are discussed, as are policy and ethical issues.


Subject(s)
Kidney Transplantation , Tissue and Organ Procurement , Donor Selection , Humans , Kidney , Living Donors
4.
Comput Biol Med ; 108: 345-353, 2019 05.
Article in English | MEDLINE | ID: mdl-31054501

ABSTRACT

BACKGROUND AND OBJECTIVES: The aim in kidney paired donation (KPD) is typically to maximize the number of transplants achieved through the exchange of donors in a pool comprising incompatible donor-candidate pairs and non-directed (or altruistic) donors. With many possible options in a KPD pool at any given time, the most appropriate set of exchanges cannot be determined by simple inspection. In practice, computer algorithms are used to determine the optimal set of exchanges to pursue. Here, we present our software application, KPDGUI (Kidney Paired Donation Graphical User Interface), for management and optimization of KPD programs. METHODS: While proprietary software platforms for managing KPD programs exist to provide solutions to the standard KPD problem, our application implements newly investigated optimization criteria that account for uncertainty regarding the viability of selected transplants and arrange for fallback options in cases where potential exchanges cannot proceed, with intuitive resources for visualizing alternative optimization solutions. RESULTS: We illustrate the advantage of accounting for uncertainty and arranging for fallback options in KPD using our application through a case study involving real data from a paired donation program, comparing solutions produced under different optimization criteria and algorithmic priorities. CONCLUSIONS: KPDGUI is a flexible and powerful tool for offering decision support to clinicians and researchers on possible KPD transplant options to pursue under different user-specified optimization schemes.
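
To make the standard KPD optimization problem described above concrete, the sketch below enumerates 2- and 3-way exchange cycles in a toy compatibility graph and brute-forces the vertex-disjoint set of cycles that transplants the most pairs. This is an illustrative simplification with hypothetical data; production solvers such as KPDGUI formulate the selection step as an integer program and apply richer criteria (fallback options, transplant viability):

    from itertools import combinations, permutations

    # Toy pool of incompatible donor-candidate pairs. An edge (i, j) means the
    # donor of pair i is compatible with the candidate of pair j (hypothetical).
    compatible = {(0, 1), (1, 0), (0, 2), (2, 0), (1, 2), (2, 3), (3, 1)}
    pairs = range(4)

    def exchange_cycles(max_len=3):
        """Enumerate 2- and 3-way donation cycles in the toy pool."""
        found = set()
        for k in range(2, max_len + 1):
            for perm in permutations(pairs, k):
                edges = zip(perm, perm[1:] + (perm[0],))
                if all(e in compatible for e in edges):
                    found.add(frozenset(perm))  # record the cycle's participants
        return list(found)

    def most_transplants(cycles):
        """Brute-force the disjoint set of cycles covering the most pairs."""
        best, best_n = (), 0
        for r in range(1, len(cycles) + 1):
            for subset in combinations(cycles, r):
                used = [p for c in subset for p in c]
                if len(used) == len(set(used)) and len(used) > best_n:
                    best, best_n = subset, len(used)
        return best, best_n

    chosen, n = most_transplants(exchange_cycles())
    print(n, [sorted(c) for c in chosen])  # 3 pairs matched via one 3-way cycle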


Subject(s)
Algorithms , Kidney Transplantation , Kidney , Software , Humans
5.
Stat Med ; 38(11): 1957-1967, 2019 05 20.
Article in English | MEDLINE | ID: mdl-30609113

ABSTRACT

Center-specific survival outcomes of kidney transplant recipients are an important quality measure but present several methodological challenges. Existing methods based on restricted mean lifetime tend to focus on short- and medium-term clinical outcomes and may fail to capture long-term effects associated with the quality of follow-up care. In this report, we propose methods that combine a lognormal frailty model and piecewise exponential baseline rates to compare mean survival time across centers. The proposed methods allow for consistent estimation of mean survival time, as opposed to restricted mean lifetime, and, within this context, permit more accurate profiling of long-term center-specific outcomes. Asymptotic properties of the proposed estimators are derived, and finite-sample properties are examined through simulation. The proposed methods are then applied to national kidney transplant data. The novelty of the proposed techniques arises from several angles: we utilize mean survival, in contrast to most previous work, which considered the restricted mean; few previous studies have used the integrated survival function as a basis for center effects; and few provider-profiling methods use a random-effects model to estimate fixed center effects.
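
For reference, the distinction drawn above between mean survival time and restricted mean lifetime can be written as follows (standard definitions, not notation from the paper), where S(t) is the survival function and \tau the restriction time:

    \mu = E[T] = \int_0^\infty S(t)\,dt,
    \qquad
    \mu_\tau = E[\min(T, \tau)] = \int_0^\tau S(t)\,dt.

With a piecewise exponential hazard \lambda_k on intervals (t_{k-1}, t_k], S(t) = \exp\{-\sum_k \lambda_k \Delta_k(t)\}, where \Delta_k(t) is the time spent in interval k before t, so the unrestricted integral above can be evaluated in closed form.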


Subject(s)
Kidney Transplantation/mortality , Outcome Assessment, Health Care , Algorithms , Databases, Factual , Female , Humans , Male , Models, Statistical , Survival Analysis , Survival Rate , Time Factors , United States/epidemiology
6.
Transplantation ; 103(8): 1714-1721, 2019 08.
Article in English | MEDLINE | ID: mdl-30451742

ABSTRACT

BACKGROUND: The Kidney Donor Risk Index (KDRI) is a score for deceased kidney donors that reflects the relative risk of graft failure associated with deceased donor characteristics. The KDRI is widely used in kidney transplant outcomes research. Moreover, an abbreviated version of the KDRI is the basis, for allocation purposes, of the "top 20%" designation for deceased donor kidneys. The data on which the KDRI model was based came from kidney transplants performed between 1995 and 2005. Our purpose in this report was to evaluate the need to update the coefficients in the KDRI formula, with the objective of either (a) proposing new coefficients or (b) endorsing continued use of the existing formula. METHODS: Using data obtained from the Scientific Registry of Transplant Recipients, we analyzed n = 156,069 deceased donor adult kidney transplants occurring from 2000 to 2016. Cox regression was used to model the risk of graft failure. We then tested for differences between the original and updated regression coefficients and compared the performance of the original and updated KDRI formulas with respect to discrimination and predictive accuracy. RESULTS: In testing for equality between the original and updated KDRIs, few coefficients were significantly different. Moreover, the original and updated KDRI yielded very similar risk discrimination and predictive accuracy. CONCLUSIONS: Overall, our results indicate that the original KDRI is robust and is not meaningfully improved by an update derived through modeling analogous to that originally employed.
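
A minimal sketch of the refit-and-compare workflow described in the Methods, using synthetic data and the lifelines package; the covariates, sample size, and simulated outcomes are stand-ins, not SRTR data or the actual KDRI covariate set:

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    # Synthetic stand-in for registry data (hypothetical donor covariates).
    rng = np.random.default_rng(0)
    n = 2000
    df = pd.DataFrame({
        "donor_age": rng.normal(45, 15, n),
        "donor_creatinine": rng.lognormal(0.2, 0.3, n),
        "donor_hypertension": rng.integers(0, 2, n),
    })
    # Simulate graft-failure times whose hazard depends on the covariates.
    lin = 0.02 * df.donor_age + 0.3 * df.donor_creatinine + 0.25 * df.donor_hypertension
    failure = rng.exponential(1.0 / np.exp(lin - lin.mean()))
    censor = rng.exponential(failure.mean(), n)
    df["years"] = np.minimum(failure, censor)
    df["graft_failure"] = (failure <= censor).astype(int)

    # Refit the Cox model, then compare coefficients and discrimination with
    # the original formula (loosely mirroring the update-vs-original comparison).
    cph = CoxPHFitter().fit(df, duration_col="years", event_col="graft_failure")
    print(cph.params_)             # "updated" coefficients
    print(cph.concordance_index_)  # discrimination (C-statistic)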


Subject(s)
Graft Rejection/epidemiology , Kidney Transplantation/statistics & numerical data , Registries , Risk Assessment/methods , Tissue Donors/statistics & numerical data , Transplant Recipients/statistics & numerical data , Waiting Lists , Adult , Graft Survival , Humans , Incidence , Middle Aged , Retrospective Studies , United States/epidemiology
7.
Clin J Am Soc Nephrol ; 12(7): 1148-1160, 2017 Jul 07.
Article in English | MEDLINE | ID: mdl-28596416

ABSTRACT

BACKGROUND AND OBJECTIVES: Outcomes of transplants from living unrelated donors are of particular interest in kidney paired donation (KPD) programs, where exchanges can be arranged between incompatible donor-recipient pairs or chains can be created from nondirected/altruistic donors. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: Using Scientific Registry of Transplant Recipients data, we analyzed 232,705 recipients of kidney-alone transplants from 1998 to 2012. Graft failure rates were estimated using Cox models for recipients of kidney transplants from living unrelated, living related, and deceased donors. Models were adjusted for year of transplant and donor and recipient characteristics, with particular attention to mismatches in age, sex, human leukocyte antigens (HLA), body size, and weight. RESULTS: The dependence of graft failure on increasing donor age was less pronounced for living-donor than for deceased-donor transplants. Male-donor-to-male-recipient transplants had lower graft failure rates, notably lower than female-to-male transplants (5%-13% lower risk). HLA mismatch was important for all donor types. Obesity of both the recipient (8%-18% higher risk) and the donor (5%-11% higher risk) was associated with greater graft loss, as were donor-recipient weight ratios of <75% compared with transplants in which both parties were of similar weight (9%-12% higher risk). These models are used to create a calculator of estimated graft survival for living donors. CONCLUSIONS: This calculator provides donors, candidates, and physicians with useful information on estimated outcomes and may help candidates choose among several living donors. It may also help inform candidates with compatible donors on the advisability of joining a KPD program.


Subject(s)
Body Size , Decision Support Techniques , Donor Selection , Graft Survival , HLA Antigens/immunology , Histocompatibility , Kidney Transplantation , Living Donors , Adolescent , Adult , Age Factors , Child , Female , Histocompatibility Testing , Humans , Kidney Transplantation/adverse effects , Male , Middle Aged , Predictive Value of Tests , Registries , Risk Assessment , Risk Factors , Sex Factors , Time Factors , Treatment Outcome , United States , Young Adult
8.
J Am Soc Nephrol ; 26(11): 2641-5, 2015 Nov.
Article in English | MEDLINE | ID: mdl-25882829

ABSTRACT

Standardized mortality ratios (SMRs) reported by Medicare compare mortality at individual dialysis facilities with the national average, and are currently adjusted for race. However, whether the adjustment for race obscures or clarifies disparities in quality of care for minority groups is unknown. Cox model-based SMRs were computed with and without adjustment for patient race for 5920 facilities in the United States during 2010. The study population included virtually all patients treated with dialysis during this period. Without race adjustment, facilities with higher proportions of black patients had better survival outcomes; facilities with the highest percentage of black patients (top 10%) had overall mortality rates approximately 7% lower than expected. After adjusting for within-facility racial differences, facilities with higher proportions of black patients had poorer survival outcomes among black and non-black patients; facilities with the highest percentage of black patients (top 10%) had mortality rates approximately 6% worse than expected. In conclusion, accounting for within-facility racial differences in the computation of SMR helps to clarify disparities in quality of health care among patients with ESRD. The adjustment that accommodates within-facility comparisons is key, because it could also clarify relationships between patient characteristics and health care provider outcomes in other settings.
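
For context, a facility-level standardized mortality ratio is computed as observed deaths divided by the deaths expected under a national risk-adjustment model; one standard Cox-model formulation (illustrative, not necessarily the exact CMS specification) is

    \mathrm{SMR}_f = \frac{O_f}{E_f},
    \qquad
    E_f = \sum_{i \in f} \hat{\Lambda}_0(T_i)\,\exp\big(\hat{\beta}^\top x_i\big),

where O_f is the number of observed deaths at facility f, T_i is patient i's follow-up time, x_i the adjustment covariates, and \hat{\Lambda}_0 the estimated baseline cumulative hazard. Whether race enters x_i, and whether it is centered within facility, is precisely the adjustment choice examined above.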


Subject(s)
Ethnicity , Healthcare Disparities/statistics & numerical data , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Renal Dialysis/statistics & numerical data , Adolescent , Adult , Black or African American , Aged , Algorithms , Black People , Female , Health Status Disparities , Humans , Kidney Failure, Chronic/ethnology , Male , Medicare , Middle Aged , Proportional Hazards Models , Quality of Health Care , Risk Assessment , Risk Factors , Treatment Outcome , United States , White People , Young Adult
9.
Health Serv Res ; 50(2): 330-50, 2015 Apr.
Article in English | MEDLINE | ID: mdl-24838079

ABSTRACT

OBJECTIVE: To evaluate evidence of practice changes affecting kidney transplant program volumes, and donor, recipient and candidate selection in the era surrounding the introduction of Centers for Medicare and Medicaid Services (CMS) conditions of participation (CoPs) for organ transplant programs. DATA: Scientific Registry of Transplant Recipients; CMS ESRD and Medicare claims databases. DESIGN: Retrospective analysis of national registry data. METHODS: A Cox proportional hazards model of 1-year graft survival was used to derive risks associated with deceased-donor kidney transplants performed from 2001 to 2010. FINDINGS: Among programs with ongoing noncompliance with the CoPs, kidney transplant volumes declined by 38 percent (n = 766) from 2006 to 2011, including a 55 percent drop in expanded criteria donor transplants. Volume increased by 6 percent (n = 638) among programs remaining in compliance. Aggregate risk of 1-year graft failure increased over time due to increasing recipient age and obesity, and longer ESRD duration. CONCLUSIONS: Although trends in aggregate risk of 1-year kidney graft loss do not indicate that the introduction of the CoPs has systematically reduced opportunities for marginal candidates or that there has been a systematic shift away from utilization of higher risk deceased donor kidneys, total volume and expanded criteria donor utilization decreased overall among programs with ongoing noncompliance.


Subject(s)
Centers for Medicare and Medicaid Services, U.S./standards , Kidney Transplantation/standards , Patient Selection , Black or African American , Age Factors , Body Weights and Measures , Comorbidity , Creatinine/blood , Graft Survival , Humans , Registries , Retrospective Studies , Risk Factors , Sex Factors , Tissue and Organ Procurement/standards , United States
10.
Transplantation ; 98(1): 94-9, 2014 Jul 15.
Article in English | MEDLINE | ID: mdl-24646768

ABSTRACT

BACKGROUND: We sought to compare liver transplant waiting list access by demographics and geography relative to the pool of potential liver transplant candidates across the United States, using a novel metric of access to care termed the liver wait-listing ratio (LWR). METHODS: We calculated LWRs from national liver transplant registration data and liver mortality data from the Scientific Registry of Transplant Recipients and the National Center for Health Statistics from 1999 to 2006 to identify variation by diagnosis, demographics, geography, and era. RESULTS: Among patients with acute liver failure (ALF) and chronic liver failure (CLF), African Americans had significantly lower access to the waiting list than whites (acute: 0.201 versus 0.280; pre-MELD: 0.201 versus 0.290; MELD era: 0.201 versus 0.274; all P<0.0001) (chronic: 0.084 versus 0.163; pre-MELD: 0.085 versus 0.179; MELD: 0.084 versus 0.154; all P<0.0001). Hispanics and whites had similar LWRs in both eras (both P>0.05). In the MELD era, female subjects had greater access to the waiting list than male subjects (acute: 0.428 versus 0.154; chronic: 0.158 versus 0.140; all P<0.0001). LWRs varied approximately three-fold by state (pre-MELD acute: 0.122-0.418, chronic: 0.092-0.247; MELD acute: 0.121-0.428, chronic: 0.092-0.243). CONCLUSIONS: The marked inequity in early access to liver transplantation underscores the need for local and national policy initiatives to address this disparity.


Subject(s)
End Stage Liver Disease/surgery , Health Services Accessibility/trends , Healthcare Disparities/trends , Liver Failure, Acute/surgery , Liver Transplantation/trends , Tissue and Organ Procurement/trends , Waiting Lists , Adult , Black or African American , Aged , End Stage Liver Disease/diagnosis , End Stage Liver Disease/ethnology , End Stage Liver Disease/mortality , Female , Health Care Rationing/trends , Health Services Needs and Demand/trends , Healthcare Disparities/ethnology , Hispanic or Latino , Humans , Liver Failure, Acute/diagnosis , Liver Failure, Acute/ethnology , Liver Failure, Acute/mortality , Male , Middle Aged , Registries , Residence Characteristics , Risk Factors , Sex Factors , Time Factors , United States/epidemiology , Waiting Lists/mortality , White People , Young Adult
11.
Am J Kidney Dis ; 53(4): 647-57, 2009 Apr.
Article in English | MEDLINE | ID: mdl-19150157

ABSTRACT

BACKGROUND: The Hispanic ethnic group is heterogeneous, with distinct genetic, cultural, and socioeconomic characteristics, but most prior studies of patients with end-stage renal disease focus on the overall Hispanic ethnic group without further granularity. We examined survival differences among Mexican-American, Puerto Rican, and Cuban-American dialysis patients in the United States. STUDY DESIGN: Prospective observational study. SETTING & PARTICIPANTS: Data from individuals randomly selected for the End-Stage Renal Disease Clinical Performance Measures Project (2001 to 2005) were examined. Mexican-American (n = 2,742), Puerto Rican (n = 838), Cuban-American (n = 145), and Hispanic-other dialysis patients (n = 942) were compared with each other and with non-Hispanic (n = 33,076) dialysis patients in the United States. PREDICTORS: Patient characteristics of interest included ethnicity/race, comorbidities, and specific available laboratory values. OUTCOMES: The major outcome of interest was mortality. RESULTS: In the fully adjusted multivariable model, 2-year mortality risk was significantly lower for the Mexican-American and Hispanic-other groups compared with non-Hispanics (adjusted hazard ratio, 0.79; 95% confidence interval, 0.73 to 0.85; adjusted hazard ratio, 0.81; 95% confidence interval, 0.71 to 0.92, respectively). Differences in 2-year mortality rates within the Hispanic ethnic groups were statistically significant (P = 0.004) and ranged from 21% lower mortality in Mexican Americans to 3% higher mortality in Puerto Ricans compared with non-Hispanics. LIMITATIONS: Include those inherent to an observational study, potential ethnic group misclassification, and small sample sizes for some Hispanic subgroups. CONCLUSION: Mexican-American and Hispanic-other dialysis patients have a survival advantage compared with non-Hispanics. Furthermore, Mexican Americans, Cuban Americans, and Hispanic others had a survival advantage compared with their Puerto Rican counterparts. Future research should continue to examine subgroups within Hispanic ethnicity to understand underlying reasons for observed differences that may be masked by examining the Hispanic ethnic group as only a single entity.


Subject(s)
Hispanic or Latino/ethnology , Kidney Failure, Chronic/ethnology , Kidney Failure, Chronic/mortality , Mexican Americans/ethnology , Adult , Aged , Female , Hispanic or Latino/statistics & numerical data , Humans , Kaplan-Meier Estimate , Kidney Failure, Chronic/therapy , Male , Mexican Americans/statistics & numerical data , Middle Aged , Proportional Hazards Models , Prospective Studies , Renal Dialysis , Retrospective Studies , United States/epidemiology
12.
Clin J Am Soc Nephrol ; 3(2): 463-70, 2008 Mar.
Article in English | MEDLINE | ID: mdl-18199847

ABSTRACT

BACKGROUND AND OBJECTIVES: Disparities in time to placement on the waiting list on the basis of socioeconomic factors decrease access to deceased-donor renal transplantation for some groups of patients with end-stage renal disease. This study was undertaken to determine candidate factors that influence duration of dialysis before placement on the waiting list among candidates for deceased-donor renal transplantation in the United States from January 2001 to December 2004 and the impact of Medicare eligibility rules on access. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: Access to the waiting list was measured as the percentage of all wait-listed candidates in the Scientific Registry of Transplant Recipients database who were listed before dialysis and by the duration of dialysis before placement on the waiting list. Multivariate logistic and linear regressions were used to determine variables that were predictive of preemptive listing and the duration of dialysis before listing. RESULTS: The odds for preemptive placement on the waiting list improved during the course of the study period, whereas the median duration of prelisting dialysis did not. The candidate factors that were associated with low rates of preemptive listing and prolonged exposure to prelisting dialysis included Medicare insurance, minority race/ethnicity, and low educational attainment. In patients who were listed after the age of 64 yr, the adverse effect of Medicare insurance on access largely disappeared. CONCLUSIONS: The disparity in dialysis exposure could potentially be diminished by concerted efforts on the part of the nephrology and transplant communities to promote early referral and preemptive placement on the waiting list, by calculating waiting time from the date of initiation of dialysis for patients who are on dialysis at the time of referral, and by relaxing Medicare eligibility requirements.


Subject(s)
Insurance, Health/statistics & numerical data , Kidney Failure, Chronic/surgery , Kidney Transplantation/statistics & numerical data , Minority Groups/statistics & numerical data , Waiting Lists , Adolescent , Adult , Aged , Humans , Middle Aged , Time Factors , United States
13.
Transplantation ; 83(8): 1069-74, 2007 Apr 27.
Article in English | MEDLINE | ID: mdl-17452897

ABSTRACT

BACKGROUND: Elderly patients (ages 70 yr and older) are among the fastest-growing group starting renal-replacement therapy in the United States. The outcomes of elderly patients who receive a kidney transplant have not been well studied compared with those of their peers on the waiting list. METHODS: Using the Scientific Registry of Transplant Recipients, we analyzed data from 5667 elderly renal transplant candidates who were initially wait-listed from January 1, 1990, to December 31, 2004. Of these candidates, 2078 received a deceased donor transplant, and 360 received a living donor transplant by December 31, 2005. Time-to-death was studied using Cox regression models with transplant as a time-dependent covariate. Mortality hazard ratios (RRs) for transplant versus the waiting list were adjusted for recipient age, sex, race, ethnicity, blood type, panel reactive antibody, year of placement on the waiting list, dialysis modality, comorbidities, donation service area, and time from first dialysis to first placement on the waiting list. RESULTS: Elderly transplant recipients had a 41% lower overall risk of death compared with wait-listed candidates (RR=0.59; P<0.0001). Recipients of nonstandard, that is, expanded criteria donor, kidneys also had a significantly lower mortality risk (RR=0.75; P<0.0001). Elderly patients with diabetes and those with hypertension as a cause of end-stage renal disease also experienced a large benefit. CONCLUSIONS: Transplantation offers a significant reduction in mortality compared with dialysis in the wait-listed elderly population with end-stage renal disease.
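
The time-dependent Cox formulation described in the Methods can be written generically (standard notation, not taken from the paper) as

    \lambda_i(t) = \lambda_0(t)\,\exp\big(\beta^\top Z_i + \gamma X_i(t)\big),
    \qquad
    X_i(t) = 1 \text{ if candidate } i \text{ has been transplanted by time } t, \text{ else } 0,

so the reported RR of 0.59 corresponds to \exp(\hat{\gamma}), the adjusted hazard of death after transplant relative to remaining on the waiting list.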


Subject(s)
Kidney Transplantation/statistics & numerical data , Patients/statistics & numerical data , Registries , Age Distribution , Aged , Female , Graft Rejection/mortality , Graft Rejection/pathology , Graft Survival , Humans , Kidney Transplantation/adverse effects , Male , Risk Factors , Survival Rate , Time Factors
14.
JAMA ; 294(21): 2726-33, 2005 Dec 07.
Article in English | MEDLINE | ID: mdl-16333008

ABSTRACT

CONTEXT: Transplantation using kidneys from deceased donors who meet the expanded criteria donor (ECD) definition (age ≥60 years, or 50 to 59 years with at least 2 of the following: history of hypertension, serum creatinine level >1.5 mg/dL [132.6 micromol/L], and cerebrovascular cause of death) is associated with a 70% higher risk of graft failure compared with non-ECD transplants. However, if ECD transplants offer improved overall patient survival, inferior graft outcome may represent an acceptable trade-off. OBJECTIVE: To compare mortality after ECD kidney transplantation vs that in a combined standard-therapy group of non-ECD recipients and those still receiving dialysis. DESIGN, SETTING, AND PATIENTS: Retrospective cohort study using data from a US national registry of mortality and graft outcomes among kidney transplant candidates and recipients. The cohort included 109,127 patients receiving dialysis and added to the kidney waiting list between January 1, 1995, and December 31, 2002, and followed up through July 31, 2004. MAIN OUTCOME MEASURE: Long-term (3-year) relative risk of mortality for ECD kidney recipients vs those receiving standard therapy, estimated using time-dependent Cox regression models. RESULTS: By the end of follow-up, 7790 ECD kidney transplants had been performed. Because of excess ECD recipient mortality in the perioperative period, cumulative survival did not equal that of standard-therapy patients until 3.5 years posttransplantation. Long-term relative mortality risk was 17% lower for ECD recipients (relative risk, 0.83; 95% confidence interval, 0.77-0.90; P<.001). Subgroups with a significant ECD survival benefit included patients older than 40 years, both sexes, non-Hispanics, all races, unsensitized patients, and those with diabetes or hypertension. In organ procurement organizations (OPOs) with long median waiting times (>1350 days), ECD recipients had a 27% lower risk of death (relative risk, 0.73; 95% confidence interval, 0.64-0.83; P<.001). In areas with shorter waiting times, only recipients with diabetes demonstrated an ECD survival benefit. CONCLUSIONS: ECD kidney transplants should be offered principally to candidates older than 40 years in OPOs with long waiting times. In OPOs with shorter waiting times, in which non-ECD kidney transplant availability is higher, candidates should be counseled that an ECD survival benefit is observed only for patients with diabetes.
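
Because the ECD definition quoted above is an explicit rule, it can be expressed directly in code. The sketch below implements only that rule (with hypothetical field names), not the study's analysis:

    def is_expanded_criteria_donor(age, hypertension, creatinine_mg_dl, cerebrovascular_death):
        """ECD per the definition above: age >=60 years, or age 50-59 with at
        least 2 of: history of hypertension, serum creatinine >1.5 mg/dL,
        cerebrovascular cause of death."""
        if age >= 60:
            return True
        if 50 <= age <= 59:
            criteria_met = sum([bool(hypertension),
                                creatinine_mg_dl > 1.5,
                                bool(cerebrovascular_death)])
            return criteria_met >= 2
        return False

    # Hypothetical donor: 55 years old, hypertensive, creatinine 1.7 mg/dL.
    print(is_expanded_criteria_donor(55, True, 1.7, False))  # -> True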


Subject(s)
Donor Selection/standards , Kidney Transplantation/mortality , Adolescent , Adult , Aged , Algorithms , Child , Child, Preschool , Cohort Studies , Female , Humans , Infant , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Male , Middle Aged , Proportional Hazards Models , Renal Dialysis , Retrospective Studies , Survival Analysis , Waiting Lists
15.
Am J Transplant ; 5(4 Pt 2): 950-7, 2005 Apr.
Article in English | MEDLINE | ID: mdl-15760420

ABSTRACT

This article provides detailed explanations of the methods frequently employed in outcomes analyses performed by the Scientific Registry of Transplant Recipients (SRTR). All aspects of the analytical process are discussed, including cohort selection, post-transplant follow-up analysis, outcome definition, ascertainment of events, censoring, and adjustments. The methods employed for descriptive analyses are described, such as unadjusted mortality rates and survival probabilities, as is the estimation of covariate effects through regression modeling. A section on transplant waiting time focuses on the kidney and liver waiting lists, pointing out the different considerations each list requires and the larger questions that such analyses raise. Additionally, this article describes specialized modeling strategies recently designed by the SRTR and aimed at specific organ allocation issues. The article concludes with a description of simulated allocation modeling (SAM), which has been developed by the SRTR for three organ systems: liver, thoracic organs, and kidney-pancreas. SAMs are particularly useful for comparing outcomes under proposed national allocation policies. The use of SAMs has already helped in the development and implementation of a new policy under which organs are offered regionally to liver candidates with high MELD scores before being offered locally to candidates with low MELD scores.


Subject(s)
Kidney Transplantation/statistics & numerical data , Liver Transplantation/statistics & numerical data , Research , Data Interpretation, Statistical , Graft Survival , Humans , Patient Selection , Waiting Lists
16.
Am J Kidney Dis ; 45(1): 127-35, 2005 Jan.
Article in English | MEDLINE | ID: mdl-15696452

ABSTRACT

BACKGROUND: Benefits in terms of reductions in mortality corresponding to improvements in Kidney Disease Outcomes Quality Initiative (K/DOQI) compliance for adequacy of dialysis dose and anemia control have not been documented in the literature. We studied changes in achieving K/DOQI guidelines at the facility level to determine whether those changes are associated with corresponding changes in mortality. METHODS: Adjusted mortality and the fractions of patients achieving K/DOQI guidelines for urea reduction ratios (URRs; ≥65%) and hematocrit levels (≥33%) were computed for 2,858 dialysis facilities from 1999 to 2002 using national data for patients with end-stage renal disease. Linear and Poisson regression were used to study the relationship between K/DOQI compliance and mortality and between changes in compliance and changes in mortality. RESULTS: In 2002, facilities in the lowest quintile of K/DOQI compliance for the URR and hematocrit guidelines had 22% and 14% greater mortality rates (P < 0.0001), respectively, than facilities in the highest quintile. A 10-percentage-point increase in the fraction of patients with a URR of 65% or greater was associated with a 2.2% decrease in mortality (P = 0.0006), and a 10-percentage-point increase in the percentage of patients with a hematocrit of 33% or greater was associated with a 1.5% decrease in mortality (P = 0.003). Facilities in the highest tertiles of improvement for URR and hematocrit had a change in mortality rates that was 15% better than that observed for facilities in the lowest tertiles (P < 0.0001). CONCLUSION: Both current practice and changes in practice with regard to achieving anemia and dialysis-dose guidelines are significantly associated with mortality outcomes at the dialysis-facility level.
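
For reference, the urea reduction ratio used in the K/DOQI adequacy guideline cited above is computed from pre- and post-dialysis blood urea nitrogen as

    \mathrm{URR} = \frac{\mathrm{BUN}_{\mathrm{pre}} - \mathrm{BUN}_{\mathrm{post}}}{\mathrm{BUN}_{\mathrm{pre}}} \times 100\%,

with the guideline thresholds being URR ≥65% and hematocrit ≥33%.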


Subject(s)
Anemia/prevention & control , Renal Dialysis/mortality , Urea/blood , Guideline Adherence , Hematocrit/standards , Hematocrit/statistics & numerical data , Hemodialysis Units, Hospital/trends , Humans , Kidney Failure, Chronic/blood , Kidney Failure, Chronic/therapy , Practice Guidelines as Topic/standards , Proportional Hazards Models , Quality Assurance, Health Care/methods , Quality Assurance, Health Care/statistics & numerical data , Renal Dialysis/standards , Retrospective Studies , Survival Analysis , United States/epidemiology
17.
Clin Transpl ; : 37-55, 2005.
Article in English | MEDLINE | ID: mdl-17424724

ABSTRACT

The worsening shortage of donor kidneys for transplant and the aging of both the donor and candidate populations have contributed to the increasing importance of ECD kidney transplantation. While ECD transplants carry an increased risk of graft failure, for most candidates patient survival is still better than remaining on dialysis. Because of this risk, however, ECD kidneys have a high likelihood of discard, and significant geographic variation in discard and transplant rates impedes maximum utilization of these kidneys. The ECD allocation system was implemented to help facilitate expeditious placement of ECD kidneys to pre-consented candidates through a simplified allocation algorithm. Under this system, recovery and transplantation of ECD kidneys have increased at rates not seen with non-ECD kidneys and not predicted by preexisting trends. More disappointing has been the lack of effect on the percentage of discards and on delayed graft function (DGF), despite significant reductions in cold ischemia time (CIT). The disadvantage in graft survival for ECD kidneys extends equally across the spectrum of recipient characteristics, such that no one group of candidates has a proportionately smaller increase in risk. However, benefit analyses comparing the risk of accepting an ECD kidney versus waiting for a non-ECD kidney demonstrate a significant ECD benefit for older and diabetic candidates in regions with prolonged waiting times. The potential value of an ECD kidney to an individual candidate hinges upon the ability to receive it substantially earlier than a non-ECD kidney. Thus, future allocation efforts may focus on ensuring that this is the case. In allocation driven by net benefit, ECD kidneys may become an alternative for those who might not otherwise receive a kidney transplant.


Subject(s)
Kidney Transplantation/physiology , Kidney Transplantation/statistics & numerical data , Living Donors/statistics & numerical data , Tissue Donors/statistics & numerical data , Age Distribution , Cadaver , Cause of Death , Female , Graft Survival , Humans , Kidney Transplantation/mortality , Male , Odds Ratio , Patient Selection , Resource Allocation/methods , Survival Analysis , Treatment Failure
18.
Am J Kidney Dis ; 43(6): 1014-23, 2004 Jun.
Article in English | MEDLINE | ID: mdl-15168381

ABSTRACT

BACKGROUND: Several observational studies have reported lower mortality risk among hemodialysis patients treated with doses greater than the standard dose. The present study evaluates, with observational data, the secondary finding of the randomized Hemodialysis (HEMO) Study that a greater dialysis dose may benefit women, but not men. METHODS: Data from 74,120 US hemodialysis patients starting end-stage renal disease therapy were analyzed. Patients were classified into 1 of 5 categories of hemodialysis dose according to their average urea reduction ratio (URR), and their relative risk (RR) for mortality was evaluated using Cox proportional hazards models. Similar analyses using equilibrated Kt/V were completed for 10,816 hemodialysis patients in the Dialysis Outcomes and Practice Patterns Study (DOPPS) in 7 countries. RESULTS: For both men and women, the RR was substantially lower in the URR 70%-to-75% category than in the URR 65%-to-70% category. Among women, the RR in the URR greater-than-75% category was significantly lower than in the URR 70%-to-75% group (P < 0.0001); however, no further association with mortality risk was observed for the greater-than-75% category among men (P = 0.22). The RR associated with doses greater than the Kidney Disease Outcomes Quality Initiative guideline (URR ≥65%) differed significantly between men and women (P < 0.01). Similar differences by sex were observed in the DOPPS analyses. CONCLUSION: The agreement of these observational studies with the HEMO Study supports the existence of a survival benefit from greater dialysis doses for women, but not for men. Responses to greater dialysis dose by sex deserve additional study to explain these differences.


Subject(s)
Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Renal Dialysis/methods , Aged , Female , Humans , Male , Middle Aged , Outcome and Process Assessment, Health Care/methods , Outcome and Process Assessment, Health Care/statistics & numerical data , Proportional Hazards Models , Sex Distribution , Survival Rate
19.
Am J Transplant ; 4 Suppl 9: 72-80, 2004.
Article in English | MEDLINE | ID: mdl-15113356

ABSTRACT

Data from the Scientific Registry of Transplant Recipients offer a unique and comprehensive view of US trends in kidney and pancreas waiting list characteristics and outcomes, transplant recipient and donor characteristics, and patient and allograft survival. Important findings from our review of developments during 2002 and the decade's transplantation trends appear below. The kidney waiting list has continued to grow, increasing from 47,830 in 2001 to 50,855 in 2002. This growth has occurred despite the increasing importance of living donor transplantation, which rose from 28% of total kidney transplants in 1993 to 43% in 2002. Policies and procedures to expedite the allocation of expanded criteria donor (ECD) kidneys were developed and implemented during 2002, when 15% of deceased donor transplants were performed with ECD kidneys. Unadjusted 1- and 5-year deceased donor kidney allograft survivals were 81% and 51% for ECD kidney recipients, and 90% and 68% for non-ECD kidney recipients, respectively. Although more patients have been placed on the simultaneous kidney-pancreas waiting list, the number of these transplants dropped from a peak of 970 in 1998 to 905 in 2002. This decline may be due to competition for organs from increasing numbers of isolated pancreas and islet transplants.


Subject(s)
Kidney Transplantation/statistics & numerical data , Pancreas Transplantation/statistics & numerical data , Age Distribution , Aged , Diabetes Mellitus, Type 1/surgery , Diabetic Nephropathies/surgery , Humans , Kidney Transplantation/trends , Middle Aged , Pancreas Transplantation/trends , Registries , Treatment Outcome , United States , Waiting Lists
20.
Am J Transplant ; 4 Suppl 9: 106-13, 2004.
Article in English | MEDLINE | ID: mdl-15113359

ABSTRACT

It is highly desirable to base decisions designed to improve medical practice or organ allocation policies on the analyses of the most recent data available. Yet there is often a need to balance this desire with the added value of evaluating long-term outcomes (e.g. 5-year mortality rates), which requires the use of data from earlier years. This article explains the methods used by the Scientific Registry of Transplant Recipients in order to achieve these goals simultaneously. The analysis of waiting list and transplant outcomes depends strongly on statistical methods that can combine data from different cohorts of patients that have been followed for different lengths of time. A variety of statistical methods have been designed to address these goals, including the Kaplan-Meier estimator, Cox regression models, and Poisson regression. An in-depth description of the statistical methods used for calculating waiting times associated with the various types of organ transplants is provided. Risk of mortality and graft failure, adjusted analyses, cohort selection, and the many complicating factors surrounding the calculation of follow-up time for various outcomes analyses are also examined.
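
Of the estimators listed above, the Kaplan-Meier estimator of the survival function is the most basic and takes the standard form

    \hat{S}(t) = \prod_{t_{(j)} \le t} \left(1 - \frac{d_j}{n_j}\right),

where d_j is the number of events (e.g., deaths or graft failures) at the ordered event time t_{(j)} and n_j is the number still at risk just before t_{(j)}; Cox and Poisson regression then provide covariate adjustment and allow cohorts followed for different lengths of time to be combined.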


Subject(s)
Research/trends , Transplantation/methods , Cohort Studies , Humans , Patient Selection , Research Design , Transplantation/mortality , Transplantation/statistics & numerical data , Treatment Failure , Treatment Outcome , Waiting Lists