Results 1 - 18 of 18
1.
Clin Transplant ; 36(7): e14676, 2022 07.
Article in English | MEDLINE | ID: mdl-35437836

ABSTRACT

INTRODUCTION: Time-zero biopsies can detect donor-derived lesions at the time of kidney transplantation, but their utility in predicting long-term outcomes is unclear under the updated Kidney Allocation System. METHODS: We conducted a single-center retrospective cohort study of 272 consecutive post-reperfusion time-zero biopsies. We tested the hypothesis that abnormal time-zero histology is a strong indicator of donor quality that increases the precision of the kidney donor profile index (KDPI) score in predicting long-term outcomes. RESULTS: Abnormal biopsies were detected in 42% of the cohort and were independently associated with a 1.2-fold increased hazard for a composite of acute rejection, allograft failure, and death after adjusting for clinical characteristics including KDPI. By Kaplan-Meier analysis, the relationship between abnormal time-zero histology and the composite endpoint was significant only in the subgroup of deceased donor kidney transplants with KDPI scores >35. Abnormal time-zero histology, particularly vascular intimal fibrosis and arteriolar hyalinosis scores, was independently associated with lower 12-month estimated GFR. CONCLUSION: Abnormal time-zero histology is relatively common and identifies a group of kidney recipients at increased risk for worse long-term outcomes. Further studies are needed to determine the optimal patient population in which to deploy time-zero biopsies as an additional surveillance tool.
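The adjusted association above (a 1.2-fold hazard for the composite endpoint after accounting for KDPI) is the kind of estimate a multivariable Cox proportional hazards model produces. A minimal sketch of such a model with the lifelines library is shown below; the file name and column names (abnormal_tz, kdpi, etc.) are hypothetical placeholders, not the study's actual variables.

```python
# Hedged sketch only: a Cox model adjusting the time-zero histology effect
# for KDPI and other covariates. All names below are assumptions.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("time_zero_cohort.csv")  # hypothetical analysis file

cph = CoxPHFitter()
cph.fit(
    df[["time_to_event", "event", "abnormal_tz", "kdpi", "recipient_age"]],
    duration_col="time_to_event",  # time to rejection/graft failure/death or censoring
    event_col="event",             # 1 = composite endpoint reached, 0 = censored
)
cph.print_summary()  # exp(coef) for abnormal_tz is the adjusted hazard ratio
```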


Subject(s)
Kidney Transplantation , Transplants , Graft Survival , Humans , Kidney/pathology , Kidney Transplantation/adverse effects , Retrospective Studies , Tissue Donors
2.
Ann Surg ; 271(1): 177-183, 2020 01.
Article in English | MEDLINE | ID: mdl-29781845

ABSTRACT

OBJECTIVE: To examine the largest single-center experience with simultaneous kidney/pancreas (SPK) transplantation among African-Americans (AAs). BACKGROUND: Current dogma suggests that AAs have worse survival following SPK than white recipients. We hypothesized that this national trend may not be ubiquitous. METHODS: From August 30, 1999, through October 1, 2014, 188 SPK transplants were performed at the University of Alabama at Birmingham (UAB) and 5523 were performed at other US centers. Using Kaplan-Meier survival estimates and Cox proportional hazards regression, we examined the influence of recipient ethnicity on survival. RESULTS: AAs comprised 36.2% of the UAB cohort compared with only 19.1% nationally (P < 0.01); yet overall 3-year graft survival was higher in the UAB cohort than in the US cohort (kidney: 91.5% vs 87.9%, P = 0.11; pancreas: 87.4% vs 81.3%, P = 0.04), and this advantage persisted on adjusted analyses [kidney adjusted hazard ratio (aHR): 0.58, 95% confidence interval (95% CI) 0.35-0.97, P = 0.04; pancreas aHR: 0.54, 95% CI 0.34-0.85, P = 0.01]. Within the UAB cohort, graft survival did not differ between AA and white recipients; in contrast, the US cohort experienced significantly lower graft survival rates among AA than white recipients (kidney 5-year: 76.5% vs 82.3%, P < 0.01; pancreas 5-year: 72.2% vs 76.3%, P = 0.01). CONCLUSION: In a single-center cohort of SPK transplants in which AAs were overrepresented, we demonstrated similar outcomes among AA and white recipients and better outcomes than the national experience. These data suggest that current dogma may be incorrect. Identifying best practices for SPK transplantation is imperative to mitigate the racial disparities in outcomes observed at the national level.
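The unadjusted comparison of graft survival between recipient groups rests on Kaplan-Meier estimates with a log-rank test, with the Cox models supplying covariate adjustment. A hedged sketch of the unadjusted step using lifelines is below; the data frame and column names are assumptions for illustration.

```python
# Sketch under assumed column names: Kaplan-Meier curves and a log-rank test
# comparing graft survival between AA and white recipients.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("spk_cohort.csv")          # hypothetical
aa = df[df["race"] == "AA"]
white = df[df["race"] == "White"]

kmf = KaplanMeierFitter()
kmf.fit(aa["graft_years"], aa["graft_loss"], label="AA")
ax = kmf.plot_survival_function()
kmf.fit(white["graft_years"], white["graft_loss"], label="White")
kmf.plot_survival_function(ax=ax)

result = logrank_test(aa["graft_years"], white["graft_years"],
                      aa["graft_loss"], white["graft_loss"])
print(result.p_value)  # unadjusted comparison; adjusted HRs come from a Cox model
```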


Subject(s)
Black or African American , Forecasting , Graft Rejection/ethnology , Kidney Transplantation , Pancreas Transplantation , Registries , Adolescent , Adult , Female , Follow-Up Studies , Humans , Incidence , Male , Middle Aged , Retrospective Studies , Survival Rate/trends , United States/epidemiology , Young Adult
3.
Ann Surg ; 270(4): 639-646, 2019 10.
Article in English | MEDLINE | ID: mdl-31348035

ABSTRACT

OBJECTIVE: In this study, we sought to assess likelihood of living donor kidney transplantation (LDKT) within a single-center kidney transplant waitlist, by race and sex, after implementation of an incompatible program. SUMMARY BACKGROUND DATA: Disparities in access to LDKT exist among minority women and may be partially explained by antigen sensitization secondary to prior pregnancies, transplants, or blood transfusions, creating difficulty finding compatible matches. To address these and other obstacles, an incompatible LDKT program, incorporating desensitization and kidney paired donation, was created at our institution. METHODS: A retrospective cohort study was performed among our kidney transplant waitlist candidates (n = 8895). Multivariable Cox regression was utilized, comparing likelihood of LDKT before (era 1: 01/2007-01/2013) and after (era 2: 01/2013-11/2018) implementation of the incompatible program. Candidates were stratified by race [white vs minority (nonwhite)], sex, and breadth of sensitization. RESULTS: Program implementation resulted in the nation's longest single-center kidney chain, and likelihood of LDKT increased by 70% for whites [adjusted hazard ratio (aHR) 1.70; 95% confidence interval (CI), 1.46-1.99] and more than 100% for minorities (aHR 2.05; 95% CI, 1.60-2.62). Improvement in access to LDKT was greatest among sensitized minority women [calculated panel reactive antibody (cPRA) 11%-49%: aHR 4.79; 95% CI, 2.27-10.11; cPRA 50%-100%: aHR 4.09; 95% CI, 1.89-8.82]. CONCLUSIONS: Implementation of an incompatible program, and the resulting nation's longest single-center kidney chain, mitigated disparities in access to LDKT among minorities, specifically sensitized women. Extrapolation of this success on a national level may further serve these vulnerable populations.


Subject(s)
Donor Selection/organization & administration , Health Services Accessibility/organization & administration , Healthcare Disparities/statistics & numerical data , Kidney Transplantation/statistics & numerical data , Living Donors/statistics & numerical data , Racism/statistics & numerical data , Sexism/statistics & numerical data , Adult , Alabama , Donor Selection/statistics & numerical data , Ethnicity/statistics & numerical data , Female , Health Services Accessibility/statistics & numerical data , Humans , Male , Middle Aged , Minority Groups/statistics & numerical data , Retrospective Studies , Waiting Lists
4.
J Am Coll Surg ; 226(4): 615-621, 2018 04.
Article in English | MEDLINE | ID: mdl-29309944

ABSTRACT

BACKGROUND: Widespread implementation of ABO-incompatible (ABOi) living donor kidney transplantation (LDKT) has been proposed as a means to partially ameliorate the national shortage of deceased donor kidneys. Acceptance of this practice has been encouraged by reports from experienced centers demonstrating acute rejection (AR) rates similar to those obtained with ABO-compatible (ABOc) LDKT. Acute rejection rates and graft survival after ABOi LDKT at the national level have yet to be fully determined. STUDY DESIGN: We studied adult (>18 years) LDKT recipients reported to the Scientific Registry of Transplant Recipients from 2000 to 2015. Acute rejection rates in the first post-transplant year (modified Poisson regression) and graft survival (Cox proportional hazards) were assessed by ABO compatibility status (ABOi: 930; ABOc: 89,713). RESULTS: Patients undergoing ABOi LDKT had an AR rate of 19.4% compared with 10.5% for ABOc recipients (p < 0.0001). After adjusting for recipient- and donor-related risk factors, patients undergoing ABOi LDKT had a 1.76-fold greater risk of AR within 1 year of transplantation compared with ABOc LDKT recipients (adjusted relative risk [aRR] 1.76; 95% CI 1.54 to 2.01). Moreover, there was a 2.34-fold greater risk of death-censored graft loss at 1 year post-transplant among ABOi vs ABOc LDKT recipients (adjusted hazard ratio [aHR] 2.34; 95% CI 1.85 to 2.96). CONCLUSIONS: Based on these findings, the low rates of AR and excellent short-term graft survival reported in single-center series may not be sustainable at the national level. These findings highlight the potential value of identifying centers of excellence and regionalizing ABOi LDKT.
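The adjusted relative risk quoted above comes from a modified Poisson regression, i.e., a Poisson GLM with a robust (sandwich) variance estimator applied to a binary outcome so that exponentiated coefficients are relative risks. A hedged sketch with statsmodels follows; the file and covariate names are assumptions, not the registry's schema.

```python
# Sketch of a modified Poisson model (Poisson GLM + robust SEs) for the
# relative risk of acute rejection within the first post-transplant year.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("srtr_ldkt.csv")   # hypothetical extract
y = df["acute_rejection_1yr"]       # 1 = AR within first year, 0 = none
X = sm.add_constant(df[["aboi", "recipient_age", "hla_mismatch"]])

fit = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC1")
print(np.exp(fit.params["aboi"]))            # relative risk, ABOi vs ABOc
print(np.exp(fit.conf_int().loc["aboi"]))    # robust 95% CI
```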


Subject(s)
ABO Blood-Group System , Blood Group Incompatibility , Graft Rejection/epidemiology , Graft Survival , Kidney Transplantation , Living Donors , Donor Selection , Female , Humans , Male , Middle Aged , Retrospective Studies , United States/epidemiology
5.
Am J Kidney Dis ; 66(1): 84-90, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25700554

ABSTRACT

BACKGROUND: Arteriovenous fistulas (AVFs) often fail to mature, but the mechanism of AVF nonmaturation is poorly understood. Arterial microcalcification is common in patients with chronic kidney disease (CKD) and may limit vascular dilatation, thereby contributing to early postoperative juxta-anastomotic AVF stenosis and impaired AVF maturation. This study evaluated whether preexisting arterial microcalcification adversely affects AVF outcomes. STUDY DESIGN: Prospective study. SETTING & PARTICIPANTS: 127 patients with CKD undergoing AVF surgery at a large academic medical center. PREDICTORS: Preexisting arterial microcalcification (≥1% of media area) assessed independently by von Kossa stains of arterial specimens obtained during AVF surgery and by preoperative ultrasound. OUTCOMES: Juxta-anastomotic AVF stenosis (ascertained by ultrasound obtained 4-6 weeks postoperatively), AVF nonmaturation (inability to cannulate with 2 needles with dialysis blood flow ≥300 mL/min for ≥6 sessions in 1 month within 6 months of AVF creation), and duration of primary unassisted AVF survival after successful use (time to first intervention). RESULTS: Arterial microcalcification was present by histologic evaluation in 40% of patients undergoing AVF surgery. The frequency of postoperative juxta-anastomotic AVF stenosis was similar in patients with or without preexisting arterial microcalcification (32% vs 42%; OR, 0.65; 95% CI, 0.28-1.52; P=0.3). AVF nonmaturation was observed in 29%, 33%, 33%, and 33% of patients with <1%, 1% to 4.9%, 5% to 9.9%, and ≥10% arterial microcalcification, respectively (P=0.9). Sonographic arterial microcalcification was found in 39% of patients and was associated with histologic calcification (P=0.001), but did not predict AVF nonmaturation. Finally, among AVFs that matured, unassisted AVF survival (time to first intervention) was similar for patients with and without preexisting arterial microcalcification (HR, 0.64; 95% CI, 0.35-1.21; P=0.2). LIMITATIONS: Single-center study. CONCLUSIONS: Arterial microcalcification is common in patients with advanced CKD, but does not explain postoperative AVF stenosis, AVF nonmaturation, or AVF failure after successful cannulation.
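The unadjusted odds ratio for stenosis (OR 0.65 for calcified vs non-calcified arteries) can be reproduced from a 2×2 table. The sketch below uses statsmodels with hypothetical cell counts chosen only to match the reported 32% and 42% proportions; they are not the study's actual counts.

```python
# Illustrative 2x2 odds-ratio calculation; counts below are assumed, not real data.
import numpy as np
from statsmodels.stats.contingency_tables import Table2x2

# rows: arterial microcalcification present / absent
# cols: juxta-anastomotic stenosis present / absent
table = np.array([[16, 34],    # ~32% of an assumed 50 calcified arteries
                  [32, 45]])   # ~42% of an assumed 77 non-calcified arteries

t22 = Table2x2(table)
print(t22.oddsratio)            # ~0.66, close to the reported OR of 0.65
print(t22.oddsratio_confint())  # 95% confidence interval
```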


Subject(s)
Arterial Occlusive Diseases/complications , Arteriovenous Shunt, Surgical , Brachial Artery/pathology , Calcinosis/complications , Renal Dialysis , Renal Insufficiency, Chronic/complications , Aged , Aged, 80 and over , Arterial Occlusive Diseases/diagnostic imaging , Brachial Artery/diagnostic imaging , Calcinosis/diagnostic imaging , Diabetic Angiopathies/complications , Diabetic Nephropathies/complications , Diabetic Nephropathies/therapy , Female , Humans , Male , Middle Aged , Prospective Studies , Renal Insufficiency, Chronic/therapy , Treatment Outcome , Ultrasonography
6.
Clin J Am Soc Nephrol ; 8(10): 1750-5, 2013 Oct.
Article in English | MEDLINE | ID: mdl-23813559

ABSTRACT

BACKGROUND AND OBJECTIVES: Arteriovenous fistulas often fail to mature, and nonmaturation has been attributed to postoperative stenosis caused by aggressive neointimal hyperplasia. Preexisting intimal hyperplasia in the native veins of uremic patients may predispose to postoperative arteriovenous fistula stenosis and arteriovenous fistula nonmaturation. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: This work explored the relationship between preexisting venous intimal hyperplasia, postoperative arteriovenous fistula stenosis, and clinical arteriovenous fistula outcomes in 145 patients. Venous specimens obtained during arteriovenous fistula creation were quantified for maximal intimal thickness (median thickness=22.3 µm). Postoperative ultrasounds at 4-6 weeks were evaluated for arteriovenous fistula stenosis. Arteriovenous fistula maturation within 6 months of creation was determined clinically. RESULTS: Postoperative arteriovenous fistula stenosis was equally frequent in patients with preexisting venous intimal hyperplasia (thickness>22.3 µm) and patients without hyperplasia (46% versus 53%; P=0.49). Arteriovenous fistula nonmaturation occurred in 30% of patients with postoperative stenosis versus 7% of those patients without stenosis (hazard ratio, 4.33; 95% confidence interval, 1.55 to 12.06; P=0.001). The annual frequency of interventions to maintain arteriovenous fistula patency for dialysis after maturation was higher in patients with postoperative stenosis than patients without stenosis (0.83 [95% confidence interval, 0.58 to 1.14] versus 0.42 [95% confidence interval, 0.28 to 0.62]; P=0.008). CONCLUSIONS: Preexisting venous intimal hyperplasia does not predispose to postoperative arteriovenous fistula stenosis. Postoperative arteriovenous fistula stenosis is associated with a higher arteriovenous fistula nonmaturation rate. Arteriovenous fistulas with hemodynamically significant stenosis frequently mature without an intervention. Postoperative arteriovenous fistula stenosis is associated with an increased frequency of interventions to maintain long-term arteriovenous fistula patency after maturation.


Subject(s)
Arteriovenous Shunt, Surgical/adverse effects , Renal Insufficiency, Chronic/surgery , Tunica Intima/pathology , Adult , Aged , Constriction, Pathologic/etiology , Female , Humans , Hyperplasia , Male , Middle Aged , Postoperative Complications , Prospective Studies , Renal Insufficiency, Chronic/pathology , Treatment Outcome
7.
Am J Kidney Dis ; 62(6): 1122-9, 2013 Dec.
Article in English | MEDLINE | ID: mdl-23746379

ABSTRACT

BACKGROUND: Arteriovenous grafts (AVGs) are prone to neointimal hyperplasia leading to AVG failure. We hypothesized that pre-existing pathologic abnormalities of the vessels used to create AVGs (including venous intimal hyperplasia, arterial intimal hyperplasia, arterial medial fibrosis, and arterial calcification) are associated with inferior AVG survival. STUDY DESIGN: Prospective observational study. SETTING & PARTICIPANTS: Patients with chronic kidney disease undergoing placement of a new AVG at a large medical center who had vascular specimens obtained at the time of surgery (n = 76). PREDICTOR: Maximal thickness of the arterial and venous intima, arterial medial fibrosis, and arterial medial calcification. OUTCOME & MEASUREMENTS: Unassisted primary AVG survival (time to first intervention) and frequency of AVG interventions. RESULTS: 55 patients (72%) underwent interventions and 148 graft interventions occurred during 89.9 graft-years of follow-up (1.65 interventions per graft-year). Unassisted primary AVG survival was not associated significantly with arterial intimal thickness (HR, 0.72; 95% CI, 0.40-1.27; P = 0.3), venous intimal thickness (HR, 0.64; 95% CI, 0.37-1.10; P = 0.1), severe arterial medial fibrosis (HR, 0.58; 95% CI, 0.32-1.06; P = 0.6), or severe arterial calcification (HR, 0.68; 95% CI, 0.37-1.31; P = 0.3). The frequency of AVG interventions per year was associated inversely with arterial intimal thickness (relative risk [RR], 1.99; 95% CI, 1.16-3.42; P < 0.001 for thickness <10 vs. >25 µm), venous intimal thickness (RR, 2.11; 95% CI, 1.39-3.20; P < 0.001 for thickness <5 vs. >10 µm), arterial medial fibrosis (RR, 3.17; 95% CI, 1.96-5.13; P < 0.001 for fibrosis <70% vs. ≥70%), and arterial calcification (RR, 2.12; 95% CI, 1.31-3.43; P = 0.001 for <10% vs. ≥10% calcification). LIMITATIONS: Single-center study. Study may be underpowered to demonstrate differences in unassisted primary AVG survival. CONCLUSIONS: Pre-existing vascular pathologic abnormalities in patients with chronic kidney disease may not be associated significantly with unassisted primary AVG survival. However, vascular intimal hyperplasia, arterial medial fibrosis, and arterial calcification may be associated with a decreased frequency of AVG interventions.


Subject(s)
Arteriovenous Shunt, Surgical , Neointima/pathology , Postoperative Complications/pathology , Renal Dialysis , Arm/blood supply , Calcinosis/pathology , Female , Fibrosis , Follow-Up Studies , Graft Survival/physiology , Humans , Male , Middle Aged , Prospective Studies , Risk Factors , Thigh/blood supply , Tunica Intima/pathology , Tunica Media/pathology , Ultrasonography
8.
Am J Kidney Dis ; 58(3): 437-43, 2011 Sep.
Article in English | MEDLINE | ID: mdl-21719173

ABSTRACT

BACKGROUND: Arteriovenous fistulas (AVFs) for hemodialysis frequently fail to mature because of inadequate dilation or early stenosis. The pathogenesis of AVF nonmaturation may be related to pre-existing vascular pathologic states: medial fibrosis or microcalcification may limit arterial dilation, and intimal hyperplasia may cause stenosis. STUDY DESIGN: Observational study. SETTING & PARTICIPANTS: Patients with chronic kidney disease (N = 50) undergoing AVF placement. PREDICTORS: Medial fibrosis, microcalcification, and intimal hyperplasia in arteries and veins obtained during AVF creation. OUTCOME & MEASUREMENTS: AVF nonmaturation. RESULTS: AVF nonmaturation occurred in 38% of patients despite attempted salvage procedures. Preoperative arterial diameter was associated with upper-arm AVF maturation (P = 0.007). Medial fibrosis was similar in patients with nonmaturing and mature AVFs (60% ± 14% vs 66% ± 13%; P = 0.2). AVF nonmaturation was not associated with patient age or diabetes, although both variables were associated significantly with severe medial fibrosis. Conversely, AVF nonmaturation was higher in women than men despite similar medial fibrosis in both sexes. Arterial microcalcification (assessed semiquantitatively) tended to be associated with AVF nonmaturation (1.3 ± 0.8 vs 0.9 ± 0.8; P = 0.08). None of the arteries or veins obtained at AVF creation had intimal hyperplasia. However, repeated venous samples obtained in 6 patients during surgical revision of an immature AVF showed venous neointimal hyperplasia. LIMITATIONS: Single-center study. CONCLUSION: Medial fibrosis and microcalcification are frequent in arteries used to create AVFs, but do not explain AVF nonmaturation. Unlike previous studies, intimal hyperplasia was not present at baseline, but developed de novo in nonmaturing AVFs.


Subject(s)
Arteries/pathology , Arteriovenous Shunt, Surgical , Renal Insufficiency, Chronic/therapy , Tunica Media/pathology , Adult , Aged , Capsules , Elasticity , Female , Fibrosis , Humans , Hyperplasia , Male , Middle Aged , Tunica Intima/pathology
9.
J Am Soc Nephrol ; 19(6): 1191-6, 2008 Jun.
Article in English | MEDLINE | ID: mdl-18369087

ABSTRACT

Individuals waiting for a renal transplant experience excessive cardiovascular mortality, which is not fully explained by the prevalence of ischemic heart disease in this population. Overt heart failure is known to increase the mortality of patients with ESRD, but the impact of lesser degrees of ventricular systolic dysfunction is unknown. To examine the association between left ventricular ejection fraction (LVEF) and mortality of renal transplant candidates, the records of 2718 patients evaluated for transplantation at one institution were reviewed. During 6355 patient-years (median 27 mo) of follow-up, 681 deaths occurred. Patients with systolic dysfunction (LVEF

Subject(s)
Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/physiopathology , Kidney Transplantation , Systole , Female , Follow-Up Studies , Humans , Male , Middle Aged , Prognosis , Waiting Lists
10.
Am J Cardiol ; 100(6): 1020-5, 2007 Sep 15.
Article in English | MEDLINE | ID: mdl-17826390

ABSTRACT

Cardiovascular disease is the major cause of mortality in patients with end-stage renal disease (ESRD). This study examined all-cause mortality in 3,698 patients with ESRD evaluated for kidney transplantation at our institution from 2001 to 2004. Mean age for the cohort was 48 ± 12 years, and 42% were women. Stress myocardial perfusion imaging was performed in 2,207 patients (60%) and coronary angiography in 260 patients (7%). There were 622 deaths (17%) during a mean follow-up period of 30 ± 15 months. The presence and severity of coronary disease on angiography were not predictive of survival. Coronary revascularization did not impact survival (p = 0.6) except in patients with 3-vessel disease (p = 0.05). The best predictor of death was left ventricular ejection fraction, measured by gated myocardial perfusion imaging, with a 2.7% increase in mortality for each 1% decrease in ejection fraction. In conclusion, left ventricular ejection fraction is a strong predictor of survival in patients with ESRD awaiting renal transplantation. Strategies to improve cardiac function, or earlier renal transplantation, deserve further study.
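Under a log-linear (Cox-type) model, a 2.7% increase in hazard per 1% drop in ejection fraction compounds multiplicatively over larger EF differences. The short illustration below is ours rather than the paper's calculation, and it assumes the per-point effect applies uniformly across the EF range.

```python
# Back-of-the-envelope compounding of a 2.7%-per-point hazard increase.
hr_per_point = 1.027          # assumed hazard ratio per 1% decrease in LVEF

for drop in (5, 10, 20):      # EF deficits in percentage points
    print(drop, round(hr_per_point ** drop, 2))
# 5  -> ~1.14x hazard
# 10 -> ~1.31x hazard
# 20 -> ~1.70x hazard, under this uniform log-linear assumption
```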


Subject(s)
Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/physiopathology , Ventricular Function, Left , Adult , Coronary Angiography , Diabetic Nephropathies/mortality , Electrocardiography , Female , Humans , Kidney Failure, Chronic/surgery , Kidney Transplantation , Male , Middle Aged , Multivariate Analysis , Myocardial Revascularization , Prognosis , Stroke Volume , Survival Analysis , Tomography, Emission-Computed, Single-Photon , Ventricular Dysfunction, Left/diagnostic imaging , Ventricular Dysfunction, Left/mortality , Ventricular Dysfunction, Left/therapy
11.
J Am Coll Surg ; 204(5): 894-902; discussion 902-3, 2007 May.
Article in English | MEDLINE | ID: mdl-17481506

ABSTRACT

BACKGROUND: Racial disparities in renal transplantation outcomes have been documented, with inferior allograft survival among African Americans compared with non-African Americans. These differences have been attributed to a variety of factors, including immunologic hyperresponsiveness, socioeconomic status, compliance, HLA matching, and access to care. The purpose of this study was to examine both immunologic and nonimmunologic risk factors for allograft loss with a goal of defining targeted strategies to improve outcomes among African Americans. STUDY DESIGN: We retrospectively analyzed all primary deceased-donor adult renal transplants (n = 2,453) at our center between May 1987 and December 2004. Analysis included the impact of recipient and donor characteristics, HLA typing, and immunosuppressive regimen on graft outcomes. Data were analyzed using standard Kaplan-Meier actuarial techniques and were explored with nonparametric and parametric methods. Multivariable analyses in the hazard-function domain were done to identify specific risk factors associated with graft loss. RESULTS: One-year allograft survival improved substantially throughout the study period, and 3-year allograft survival also improved. Risk factor analyses are shown by type of allograft and according to specific time periods. Risk of immunologic graft loss (acute rejection) was most prominent during the early phase. During the late phase, immunologic risk persists (chronic rejection), but recurrent disease, graft quality, and recipient comorbidities play an increasingly important role. CONCLUSIONS: Advances in immunosuppression regimens have contributed to allograft survival in both early and late (constant) phases throughout all eras, but improvement in long-term outcomes for African Americans continues to lag behind that for non-African Americans. The disparity in renal allograft loss between African Americans and non-African Americans over time indicates that beyond immunologic risk, the impact of nonimmunologic variables, such as time on dialysis pretransplantation, diabetes, and access to medical care, can be key issues.


Subject(s)
Black or African American/statistics & numerical data , Graft Survival , Kidney Transplantation , Age Factors , Diabetes Mellitus, Type 2/complications , Female , Graft Survival/immunology , Health Services Accessibility , Humans , Immunosuppression Therapy/methods , Male , Renal Dialysis , Retrospective Studies , Risk Factors , Socioeconomic Factors , Survival Analysis , Time Factors
12.
J Heart Lung Transplant ; 24(11): 1828-33, 2005 Nov.
Article in English | MEDLINE | ID: mdl-16297789

ABSTRACT

BACKGROUND: Heart-lung transplantation (Tx) is known to offer a protective effect against acute cardiac rejection. This study was undertaken to evaluate acute and chronic heart and/or lung rejection in the setting of multiple-transplanted organs from the same donor compared with single-organ transplantation. METHODS: Acute (treated rejection episodes of heart or lungs) and chronic (allograft vasculopathy in hearts and bronchiolitis obliterans syndrome [BOS] in lungs) rejection events were analyzed in 348 heart transplant (H) recipients, 24 heart-lung (HL) recipients, 82 double-lung (L) recipients and 8 heart-kidney (HK) recipients >18 years of age, who were transplanted between 1990 and 2002. RESULTS: Survival at 3 years differed among groups as follows: HK, 100%; H, 82%; HL, 74%; and L, 70%. The probability of acute rejection within the first 3 months was higher in H recipients than in HL (81% vs 22%; p < 0.0001) or HK (81% vs 12%; p = 0.00009) recipients. Acute cardiac rejection occurred more frequently during the first 2 years in isolated H recipients compared with HL (2.8 vs 0.27 episodes; p < 0.0001) and HK (2.8 vs 0.54; p < 0.001) recipients. Acute lung rejection occurred more frequently in the first 2 years in L than HL (2.4 vs 1.0 episodes; p = 0.02) recipients. Chronic cardiac rejection (allograft vasculopathy) was more likely within 3 years after H compared with HL (32% vs 16%; p = 0.04) or HK (32% vs 0%; p = 0.14). The onset of chronic lung rejection (BOS) within 3 years was similar in HL and L recipients (39% vs 40%; p = 0.9). CONCLUSIONS: Recipients of multiple organs from a single donor undergo less acute rejection of the heart or lungs compared with isolated heart or lung transplant recipients. Cardiac allograft vasculopathy is decreased significantly when cardiac transplantation is combined with a lung allograft. A lower incidence of cardiac allograft vasculopathy is observed when cardiac transplantation is combined with a renal allograft, and may prove statistically significant when more cases have been accumulated. These phenomena may result from immune modulation of the recipient by simultaneous transplant of disparate tissues or introduction of immune-modulating hematopoietic elements.


Subject(s)
Coronary Disease/epidemiology , Graft Rejection/epidemiology , Heart Transplantation/immunology , Heart-Lung Transplantation/immunology , Kidney Transplantation/immunology , Acute Disease , Bronchiolitis Obliterans/epidemiology , Chronic Disease , Female , Humans , Male , Middle Aged , Retrospective Studies
13.
J Urol ; 171(1): 40-3, 2004 Jan.
Article in English | MEDLINE | ID: mdl-14665839

ABSTRACT

PURPOSE: Laparoscopic donor nephrectomy (LAP) has been gaining popularity among kidney donors and transplant surgeons. There have been some concerns about the function of kidney grafts harvested by laparoscopic procedures. We report our results with LAP. MATERIALS AND METHODS: Prospective data were collected for our donor nephrectomy operations. A telephone survey was done by an independent investigator on the impact of surgery on quality of life. Graft function was also evaluated by serial serum creatinine and mercaptoacetyltriglycine renal nuclear scans. RESULTS: A total of 100 patients were included in the study, of whom 55 underwent open donor nephrectomy (OD), 28 underwent LAP and 17 underwent hand-assisted donor nephrectomy (HAL). Mean patient age was 39 ± 12 years and was similar in all groups. Mean operative time was 306 ± 40 minutes for LAP, 294 ± 42 minutes for HAL and 163 ± 24 minutes for OD (p = 0.001). Laparoscopic operative time decreased to 180 ± 56 minutes for LAP and 155 ± 40 minutes for HAL in the last 10 patients. Mean estimated blood loss was 200 ± 107 cc for LAP, 167 ± 70 cc for HAL and 320 ± 99 cc for OD (p = 0.0001). Mean warm ischemia time was 3 ± 2 minutes for LAP, 2 ± 2 minutes for HAL and 2 ± 1 minutes for OD (p = 0.002). Postoperative hospitalization was 2 ± 2 days for LAP and 3 ± 2 days for OD (p = 0.01). LAP required 30% less narcotic medicine than OD postoperatively (p = 0.04). There were no major complications in LAP cases and no complete or partial graft loss was noted. Mean follow-up was 7 months. Recipient creatinine was not significantly different for kidneys harvested by LAP or OD (p = 0.5). Diuretic mercaptoacetyltriglycine renograms were performed in all recipients 1 to 3 days after surgery and mean effective renal plasma flow was similar for the 3 groups (p = 0.9). According to telephone survey results 85% of LAP, 71% of HAL and 43% of OD patients reported a return to normal physical activity within 4 weeks after surgery. Similarly 74% of LAP, 62% of HAL and 26% of OD patients were able to return to work within 4 weeks after surgery. CONCLUSIONS: Our data show no significant difference in graft function between LAP and OD. LAP and HAL were safe and complications were minimal. The main difference was that patients treated with LAP and HAL returned to normal physical activity and work significantly earlier than those who underwent OD.
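The three-way comparison of operative times (p = 0.001) is the kind of result a one-way ANOVA produces, with unpaired t-tests for pairwise contrasts. The sketch below is illustrative only: it simulates groups with the reported means, standard deviations, and sizes rather than using the study's data.

```python
# Illustrative only: simulated operative-time data with the reported summary
# statistics, compared with one-way ANOVA and an unpaired t-test (SciPy).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lap = rng.normal(306, 40, 28)   # minutes, LAP (n = 28)
hal = rng.normal(294, 42, 17)   # HAL (n = 17)
od = rng.normal(163, 24, 55)    # OD (n = 55)

f_stat, p_anova = stats.f_oneway(lap, hal, od)
print(f_stat, p_anova)          # large F, tiny p, consistent with the reported p = 0.001

t_stat, p_pair = stats.ttest_ind(lap, od, equal_var=False)  # pairwise LAP vs OD
print(t_stat, p_pair)
```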


Subject(s)
Laparoscopy/methods , Nephrectomy/methods , Adult , Follow-Up Studies , Humans , Kidney Transplantation , Living Donors , Prospective Studies
14.
Clin Transpl ; : 121-6, 2004.
Article in English | MEDLINE | ID: mdl-16704144

ABSTRACT

Characterization of renal transplant recipients as "high-risk" originated in the 1970s, although attributes that defined this category, such as diabetes mellitus, are no longer applicable. The changing paradigm of risk after renal transplantation reflects the impact of nonspecific advances in clinical care, specific interventions that address previously defined problems, changing demographics, and new issues that have arisen as a consequence of changes in clinical practice. In the current era, diabetes, retransplantation, and presensitization are no longer considered risk factors for poor outcomes after kidney transplantation. Significant risk factors influencing intermediate-term graft survival now include donor and recipient age over 60 years, DR mismatching (only in kidneys from deceased donors), time awaiting transplantation, and African-American race. Current clinical approaches to care of the renal transplant candidate/recipient focus on minimizing the impact of identified risk factors rather than avoiding transplantation altogether. For most ESRD patients, the greatest risk lies in not receiving a transplant.


Subject(s)
Kidney Transplantation , Adult , Female , Graft Survival , Humans , Kidney Failure, Chronic/surgery , Kidney Transplantation/statistics & numerical data , Los Angeles , Male , Middle Aged , Risk Factors , Time Factors
15.
Am J Cardiol ; 92(2): 146-51, 2003 Jul 15.
Article in English | MEDLINE | ID: mdl-12860215

ABSTRACT

Cardiovascular disease is a significant cause of morbidity and mortality after renal transplantation. Pretransplant screening in a subset of these patients for occult coronary artery disease (CAD) may improve outcome. The objective of this study was to examine the outcome of 600 patients after renal transplantation for end-stage renal disease. Prospective outcome data were collected on 600 consecutive patients who had renal transplantation at our institution between 1996 and 1998, at 42 ± 12 months after surgery. Stress single-photon emission computed tomographic (SPECT) myocardial perfusion imaging was performed in 174 patients before surgery, 136 (78%) of whom had diabetes mellitus. There were a total of 59 events: 17 cardiac deaths, 14 nonfatal myocardial infarctions, and 28 noncardiac deaths. There were 12 cardiac events and 11 noncardiac deaths among those who had SPECT myocardial perfusion imaging. In a multivariate analysis that included important risk factors, age (p = 0.03 and 0.003, respectively) and diabetes (p = 0.02 and 0.005, respectively) were the predictors of total events and cardiac events in patients who did not undergo stress SPECT perfusion imaging. In the subgroup who had stress perfusion imaging, an abnormal perfusion SPECT study was the only predictor of cardiac events (p = 0.006). The 42-month cardiac event-free survival rate was 97% in patients with normal SPECT images and 85% in patients with abnormal SPECT images (RR 5.04, 95% confidence interval 1.4 to 17.6, p = 0.006). Thus, there is a 2.8% event rate per year after renal transplantation, and approximately 50% of these events are noncardiac. In high-risk patients (most of whom had diabetes) with preoperative stress perfusion imaging, those with normal images had significantly fewer cardiac events than those with abnormal images. These results have important implications for patient screening and postoperative management.


Subject(s)
Coronary Artery Disease/diagnostic imaging , Coronary Artery Disease/etiology , Coronary Circulation/physiology , Kidney Failure, Chronic/diagnostic imaging , Kidney Failure, Chronic/surgery , Kidney Transplantation/adverse effects , Outcome Assessment, Health Care , Postoperative Complications , Preoperative Care , Tomography, Emission-Computed, Single-Photon , Adult , Aged , Coronary Artery Disease/physiopathology , Exercise Test , Female , Follow-Up Studies , Humans , Kidney Failure, Chronic/physiopathology , Male , Middle Aged , Patient Selection , Predictive Value of Tests , Prognosis , Reproducibility of Results , Time Factors
16.
Am J Transplant ; 3(7): 775-85, 2003 Jul.
Article in English | MEDLINE | ID: mdl-12814469

ABSTRACT

In March 2002, over 100 members of the transplant community assembled in Philadelphia for a meeting designed to address problems associated with the growing number of patients seeking kidney transplantation and added to the waiting list each year. The meeting included representatives of nine US organizations with interests in these issues. Participants divided into work groups addressing access to the waiting list, assigning priority on the list, list management, and identifying appropriate candidates for expanded criteria donor kidneys. Each work group outlined problems and potential remedies in its area. This report summarizes the issues and recommendations regarding the waiting list for kidney transplantation addressed at the Philadelphia meeting.


Subject(s)
Congresses as Topic , Kidney Transplantation , Tissue and Organ Procurement , Humans , Philadelphia
17.
Clin Transplant ; 17(2): 77-88, 2003 Apr.
Article in English | MEDLINE | ID: mdl-12709071

ABSTRACT

Each year, 55 000 organ transplants are performed worldwide. Cumulatively, the number of living organ recipients is now estimated to be over 300 000. Most of these transplant recipients will remain on immunosuppressive drugs for the remainder of their lives to prevent rejection episodes. Controlled doses of these drugs are required to prevent over-medication, which may leave the patient susceptible to opportunistic infection and drug toxicity effects, or under-dosing, which may lead to shortened graft survival because of rejection episodes. This paper describes the results of a multicenter study conducted at the Universities of Pittsburgh, Alabama, and Maryland to evaluate an in vitro assay (Cylex™ Immune Cell Function Assay) for the measurement of global immune response in transplant patients receiving immunosuppressive therapy. The assay uses a whole blood sample to maintain the presence of the drug during incubation. Following overnight incubation of blood with phytohemagglutinin (PHA), CD4 cells are selected using paramagnetic particles coated with a monoclonal antibody to the CD4 epitope. The CD4-positive cells are targeted because major immunosuppressive drugs are designed to specifically inhibit T-cell activation, which has been implicated in rejection. The data generated at these three sites were submitted in support of a Food and Drug Administration (FDA) application for the use of this assay in the detection of cell-mediated immunity in an immunosuppressed population. The assay was cleared by the FDA on April 2, 2002. This cross-sectional study was designed to establish ranges for reactivity of this bioassay in the assessment of functional immunity for an individual solid organ recipient at any point in time.


Subject(s)
Drug Monitoring , Immunity, Cellular , Immunoassay/methods , Transplantation Immunology , Adult , CD4 Lymphocyte Count , Case-Control Studies , Cross-Sectional Studies , Cyclosporine/blood , Female , Flow Cytometry , Humans , Immunosuppressive Agents/blood , Lymphocyte Activation/drug effects , Male , Middle Aged , Phytohemagglutinins/pharmacology , T-Lymphocytes/drug effects , Tacrolimus/blood
18.
Radiology ; 225(1): 59-64, 2002 Oct.
Article in English | MEDLINE | ID: mdl-12354984

ABSTRACT

PURPOSE: To compare various objective ultrasonographic (US) criteria for native arteriovenous fistula (AVF) maturation with subsequent fistula outcomes and clinical evaluation by experienced dialysis nurses. MATERIALS AND METHODS: US fistula evaluation results were analyzed retrospectively in 69 patients within 4 months after AVF placement; adequacy for dialysis was known in 54. Measurements included minimum venous diameter and blood flow rate. Experienced dialysis nurses examined 30 fistulas clinically. Predictors of fistula adequacy were analyzed with univariate and multivariate logistic regression. Mean fistula diameters and blood flow rates were compared by using analysis of variance or unpaired Student t tests. RESULTS: Fistula adequacy for dialysis doubled if the minimum venous diameter was 0.4 cm or greater (89% [24 of 27]) versus less than 0.4 cm (44% [12 of 27]; P <.001). Fistula adequacy for dialysis was nearly doubled if flow volume was 500 mL/min or greater (84% [26 of 31]) versus less than 500 mL/min (43% [nine of 21]; P =.002). Combining venous diameter and flow volume increased fistula adequacy predictive value: minimum venous diameter of 0.4 cm or greater and flow volume of 500 mL/min or greater (95% [19 of 20]) versus neither criterion met (33% [five of 15]; P =.002). Women were less likely to have an adequate fistula diameter of 0.4 cm or greater: 40% (12 of 30) of women versus 69% (27 of 39; P =.015) of men. No significant differences in blood flow or minimum venous diameter were found during 2-4 postoperative months. Experienced dialysis nurses' accuracy in predicting eventual fistula maturity was 80% (24 of 30). CONCLUSION: US measurements of AVF at 2-4 months in patients undergoing hemodialysis are highly predictive of fistula maturation and adequacy for dialysis.
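The predictive thresholds above (minimum venous diameter ≥0.4 cm, flow volume ≥500 mL/min) map naturally onto a logistic regression for fistula adequacy. A hedged sketch with statsmodels follows; the file and variable names are assumed for illustration and are not the study's dataset.

```python
# Sketch of univariate/multivariable logistic models for fistula adequacy
# based on ultrasound criteria; all names below are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("avf_ultrasound.csv")  # hypothetical
df["diam_ge_04"] = (df["min_vein_diam_cm"] >= 0.4).astype(int)
df["flow_ge_500"] = (df["flow_ml_min"] >= 500).astype(int)

X = sm.add_constant(df[["diam_ge_04", "flow_ge_500"]])
fit = sm.Logit(df["adequate_for_dialysis"], X).fit()
print(np.exp(fit.params))      # odds ratios for each ultrasound criterion
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```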


Subject(s)
Arm/blood supply , Arteriovenous Shunt, Surgical , Blood Flow Velocity , Renal Dialysis , Ultrasonography, Doppler , Adult , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Sensitivity and Specificity , Veins/diagnostic imaging