1.
Ann Surg ; 276(4): 597-604, 2022 10 01.
Article in English | MEDLINE | ID: mdl-35837899

ABSTRACT

BACKGROUND: The burden of end-stage kidney disease (ESKD) and kidney transplant rates vary significantly across the United States. This study aimed to examine the mismatch between ESKD burden and kidney transplant rates from a spatial epidemiology perspective. METHODS: US Renal Data System data from 2015 to 2017 on incident ESKD and kidney transplants per 1000 incident ESKD cases were analyzed. Clustering of ESKD burden and kidney transplant rates at the county level was determined using local Moran's I and correlated with county health scores; higher percentile county health scores indicated worse overall community health. RESULTS: Significant clusters of high ESKD burden tended to coincide with clusters of low kidney transplant rates, and vice versa. The most common cluster type had high incident ESKD with low transplant rates (377 counties). Counties in these clusters had the lowest overall mean transplant rate (61.1 per 1000 incident cases), the highest overall mean ESKD incidence (61.3), and the highest mean county health score percentile (80.9%; P < 0.001 vs all other cluster types). By comparison, counties in clusters with low ESKD incidence and high transplant rates (n = 359) had the highest mean transplant rate (110.6), the lowest mean ESKD incidence (28.9), and the lowest mean county health score percentile (20.2%). All comparisons with high-ESKD/low-transplant clusters were significant at P < 0.001. CONCLUSION: There was a significant mismatch between kidney transplant rates and ESKD burden: areas with the greatest need had the lowest transplant rates. This pattern exacerbates pre-existing disparities, as disadvantaged high-ESKD regions already suffer from worse access to care and worse overall community health, as evidenced by the highest county health scores in the study.
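The county-level clustering described above can be sketched with standard spatial-statistics tooling. The fragment below is illustrative only, not the authors' code; the file and column names (us_counties_esrd.gpkg, esrd_incidence) are hypothetical, and it uses the libpysal/esda implementation of local Moran's I.

import geopandas as gpd
from libpysal.weights import Queen
from esda.moran import Moran_Local

# Hypothetical input: one row per county with geometry and an ESKD incidence column.
counties = gpd.read_file("us_counties_esrd.gpkg")
w = Queen.from_dataframe(counties)   # contiguity-based spatial weights
w.transform = "r"                    # row-standardize

lisa = Moran_Local(counties["esrd_incidence"], w, permutations=999)
# Quadrant codes: 1 = High-High, 2 = Low-High, 3 = Low-Low, 4 = High-Low
counties["esrd_cluster"] = lisa.q
counties["esrd_sig"] = lisa.p_sim < 0.05
# Repeating the same steps for transplant rates and cross-tabulating the two cluster
# labels identifies high-ESKD/low-transplant mismatch counties like those reported above.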


Subject(s)
Kidney Failure, Chronic , Kidney Transplantation , Cluster Analysis , Humans , Incidence , Kidney Failure, Chronic/epidemiology , Kidney Failure, Chronic/surgery , Kidney Transplantation/adverse effects , United States/epidemiology
2.
Prog Transplant ; 31(4): 305-313, 2021 12.
Article in English | MEDLINE | ID: mdl-34713750

ABSTRACT

INTRODUCTION: Transplant candidate participation in the Living Donor Navigator Program is associated with an increased likelihood of achieving living donor kidney transplantation, yet not every transplant candidate participates in navigator programming. RESEARCH QUESTION: We sought to assess interest and ability to participate in the Living Donor Navigator Program by degree of social vulnerability. DESIGN: Eighty-two adult kidney-only candidates initiating evaluation at our center provided Likert-scaled responses to survey questions on interest and ability to participate in the Living Donor Navigator Program. Surveys were linked at the participant level to the Centers for Disease Control and Prevention Social Vulnerability Index and to county health rankings. Overall social vulnerability and its subthemes, individual barriers, telehealth capabilities/knowledge, and interest and ability to participate were assessed using nonparametric Wilcoxon rank-sum, chi-square, and Fisher's exact tests. RESULTS: Participants indicating distance as a barrier to participation in navigator programming lived approximately 82 miles farther from our center. Disinterested participants lived in areas with the highest social vulnerability, higher physical inactivity rates, lower college education rates, and higher rates of uninsurance and unemployment. Similarly, participants without a computer, who had never heard of telehealth, or who were not encouraged to participate in telehealth resided in areas of highest social vulnerability. CONCLUSION: These data suggest that geography, combined with residence in under-resourced areas of high social vulnerability, was negatively associated with health care engagement. Geography and poverty may be surrogates for lower health literacy and fewer health care interactions.
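For readers unfamiliar with the tests named in the design, the fragment below shows how such group comparisons are typically run in Python; it is a minimal sketch with hypothetical file and column names, not the study's analysis code.

import pandas as pd
from scipy.stats import ranksums, chi2_contingency, fisher_exact

# Hypothetical linked file: one row per survey participant with SVI and survey responses.
df = pd.read_csv("ldn_survey_linked_svi.csv")

interested = df.loc[df["interested"] == 1, "svi_overall"]
disinterested = df.loc[df["interested"] == 0, "svi_overall"]
stat, p = ranksums(interested, disinterested)      # Wilcoxon rank-sum test

table = pd.crosstab(df["interested"], df["has_computer"])
chi2, p_chi, dof, _ = chi2_contingency(table)      # chi-square test of independence
odds_ratio, p_fisher = fisher_exact(table)         # Fisher's exact test (2x2 table)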


Subject(s)
Kidney Transplantation , Social Vulnerability , Adult , Educational Status , Humans , Kidney , Living Donors
3.
Obesity (Silver Spring) ; 29(9): 1538-1546, 2021 09.
Article in English | MEDLINE | ID: mdl-34338423

ABSTRACT

OBJECTIVE: The aim of this study was to characterize end-stage renal disease (ESRD) patients with obesity as their only contraindication to listing and to quantify wait-list and transplant access. METHODS: Using the US Renal Data System, a retrospective cohort study of incident dialysis cases (2012 to 2014) was performed. The primary outcomes were time to wait-listing and time to transplantation. RESULTS: Of 157,572 dialysis patients not already listed, 39,844 had BMI as their only demonstrable transplant contraindication. They tended to be younger, female, and Black. Compared with patients with BMI < 35, those with BMI 35 to 39.9, 40 to 44.9, and ≥45 were, respectively, 15% (adjusted hazard ratio [aHR] 0.85; 95% CI: 0.83-0.88; p < 0.001), 45% (aHR 0.55; 95% CI: 0.52-0.57; p < 0.001), and 71% (aHR 0.29; 95% CI: 0.27-0.31; p < 0.001) less likely to be wait-listed. Wait-listed patients with BMI 35 to 39.9 were 24% less likely to achieve transplant (aHR 0.76; 95% CI: 0.72-0.80; p < 0.0001), those with BMI 40 to 44.9 were 21% less likely (aHR 0.79; 95% CI: 0.72-0.86; p < 0.0001), and those with BMI ≥ 45 were 15% less likely (aHR 0.85; 95% CI: 0.75-0.95; p = 0.004) compared with patients with BMI < 35. CONCLUSIONS: Obesity was the sole contraindication to wait-listing for nearly 40,000 dialysis patients. They were less likely to be wait-listed, and those who were wait-listed had a lower likelihood of transplant. Aggressive weight-loss interventions may help this population achieve wait-listing and transplant.
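As one way to picture the adjusted hazard ratios quoted above, the following sketch fits a Cox proportional hazards model for time to wait-listing with categorical BMI (reference BMI < 35) using the lifelines library. The file and column names are hypothetical, and this is not the authors' code.

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical USRDS-style extract: one row per incident dialysis patient.
df = pd.read_csv("incident_dialysis_cohort.csv")
df["bmi_cat"] = pd.cut(df["bmi"], bins=[0, 35, 40, 45, 200],
                       labels=["lt35", "35to39.9", "40to44.9", "ge45"])
model_df = pd.get_dummies(
    df[["time_to_waitlist", "waitlisted", "age", "female", "bmi_cat"]],
    columns=["bmi_cat"], drop_first=True)          # BMI < 35 is the reference group

cph = CoxPHFitter()
cph.fit(model_df, duration_col="time_to_waitlist", event_col="waitlisted")
cph.print_summary()   # exp(coef) for each BMI dummy is the adjusted hazard ratio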


Subject(s)
Kidney Failure, Chronic , Kidney Transplantation , Cohort Studies , Contraindications , Female , Humans , Kidney Failure, Chronic/surgery , Obesity/complications , Obesity/surgery , Retrospective Studies
5.
Am J Surg ; 222(1): 36-41, 2021 07.
Article in English | MEDLINE | ID: mdl-33413873

ABSTRACT

BACKGROUND: The Living Donor Navigator (LDN) Program pairs kidney transplant candidates (TCs) with a friend or family member for advocacy training to help identify donors and achieve living donor kidney transplantation (LDKT). However, some TCs participate alone as self-advocates. METHODS: In this retrospective cohort study of TCs in the LDN program (04/2017-06/2019), we evaluated the likelihood of LDKT using Cox proportional hazards regression and the rate of donor screenings using ordered-events conditional models, by advocate type. RESULTS: Self-advocates (25/127) had a lower likelihood of LDKT compared with patients with an advocate (adjusted hazard ratio (aHR): 0.22, 95% confidence interval (CI): 0.03-1.66, p = 0.14). After LDN enrollment, the rate of donor screenings increased 2.5-fold for self-advocates (aHR: 2.48, 95% CI: 1.26-4.90, p = 0.009) and 3.4-fold for TCs with an advocate (aHR: 3.39, 95% CI: 2.20-5.24, p < 0.0001). CONCLUSIONS: Advocacy training was beneficial for self-advocates, but having an independent advocate may increase the likelihood of LDKT.


Subject(s)
Donor Selection/statistics & numerical data , Healthcare Disparities/statistics & numerical data , Kidney Failure, Chronic/surgery , Kidney Transplantation/statistics & numerical data , Patient Advocacy/statistics & numerical data , Black or African American/statistics & numerical data , Donor Selection/standards , Female , Health Services Accessibility/standards , Health Services Accessibility/statistics & numerical data , Humans , Kidney Transplantation/standards , Living Donors/statistics & numerical data , Male , Marital Status/statistics & numerical data , Middle Aged , Retrospective Studies , Sex Factors , White People/statistics & numerical data
8.
Transplantation ; 104(1): 122-129, 2020 01.
Article in English | MEDLINE | ID: mdl-30946213

ABSTRACT

BACKGROUND: To date, no living donation program has simultaneously addressed the needs of both transplant candidates and living donors by separating the advocacy role from the candidate and improving potential donor comfort with the evaluation process. We hypothesized that the development of a novel program designed to promote both advocacy and systems training among transplant candidates and their potential living kidney donors would result in sustained increases in living-donor kidney transplantation (LDKT). To this end, we developed and implemented a Living Donor Navigator (LDN) Program at the University of Alabama at Birmingham. METHODS: We included adult patients awaiting kidney-only transplant in a retrospective cohort analysis. Using time-varying Cox proportional hazards regression, we explored the likelihood of living donor screening and approval by participation in the LDN program. RESULTS: There were 56 LDN participants and 1948 nonparticipants (standard of care). LDN participation was associated with a 9-fold increased likelihood of living donor screenings (adjusted hazard ratio, 9.27; 95% confidence interval, 5.97-14.41; P < 0.001) and a 7-fold increased likelihood of having an approved living donor (adjusted hazard ratio, 7.74; 95% confidence interval, 3.54-16.93; P < 0.001) compared with the standard of care. Analyses by participant race demonstrated a higher likelihood of having screened donors and a similar likelihood of having an approved donor among African Americans compared with Caucasians. CONCLUSIONS: These data suggest that both advocacy and systems training are needed to increase actual LDKT rates and that LDN programs may mitigate existing racial disparities in access to LDKT.
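Because candidates enroll in the LDN program at different times after listing, the exposure lends itself to being modeled as a time-varying covariate. The sketch below shows one common way to do this with lifelines; the long-format column names are hypothetical, and this is not the program's analysis code.

import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical long-format data: one row per patient-interval (start, stop], with
# ldn_enrolled switching from 0 to 1 at the date of LDN enrollment, plus covariates
# such as age and race, and donor_screened as the event indicator.
long_df = pd.read_csv("candidates_long_format.csv")

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="patient_id", event_col="donor_screened",
        start_col="start", stop_col="stop")
ctv.print_summary()   # exp(coef) for ldn_enrolled is the adjusted hazard ratio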


Subject(s)
Donor Selection/organization & administration , Health Services Accessibility/organization & administration , Healthcare Disparities/statistics & numerical data , Kidney Transplantation , Patient Advocacy , Patient Navigation , Black or African American/statistics & numerical data , Alabama , Donor Selection/statistics & numerical data , Female , Humans , Living Donors , Male , Middle Aged , Program Evaluation , Retrospective Studies , White People/statistics & numerical data
9.
J Am Soc Nephrol ; 31(1): 12-21, 2020 01.
Article in English | MEDLINE | ID: mdl-31792154

ABSTRACT

Patients with ESKD who would benefit from a kidney transplant face a critical and continuing shortage of kidneys from deceased human donors. As a result, such patients wait a median of 3.9 years to receive a donor kidney, by which time approximately 35% of transplant candidates have died while waiting or have been removed from the waiting list. Those of blood group B or O may experience a significantly longer waiting period. This problem could be resolved if kidneys from genetically engineered pigs offered an alternative with an acceptable clinical outcome. Attempts to accomplish this have followed two major paths: deletion of pig xenoantigens and insertion of "protective" human transgenes to counter the human immune response. Pigs with up to nine genetic manipulations are now available. In nonhuman primates, administering novel agents that block the CD40/CD154 costimulation pathway, such as an anti-CD40 mAb, suppresses the adaptive immune response, leading to pig kidney graft survival of many months without features of rejection (experiments were terminated for infectious complications). In the absence of innate and adaptive immune responses, the transplanted pig kidneys have generally displayed excellent function. A clinical trial is anticipated within 2 years. We suggest that it would be ethical to offer a pig kidney transplant to selected patients whose life expectancy is shorter than the time it would take for them to obtain a kidney from a deceased human donor. In the future, pigs will also be genetically engineered to control the adaptive immune response, enabling exogenous immunosuppressive therapy to be significantly reduced or eliminated.


Subject(s)
Kidney Transplantation , Swine/genetics , Tissue and Organ Procurement/methods , Transplantation, Heterologous , Animals , Animals, Genetically Modified , Clinical Trials as Topic , Models, Animal , Patient Selection , Primates
10.
Ann Surg ; 271(1): 177-183, 2020 01.
Article in English | MEDLINE | ID: mdl-29781845

ABSTRACT

OBJECTIVE: To examine the largest single-center experience of simultaneous kidney-pancreas (SPK) transplantation among African-Americans (AAs). BACKGROUND: Current dogma suggests that AAs have worse survival following SPK than white recipients. We hypothesized that this national trend may not be ubiquitous. METHODS: From August 30, 1999, through October 1, 2014, 188 SPK transplants were performed at the University of Alabama at Birmingham (UAB) and 5523 were performed at other US centers. Using Kaplan-Meier survival estimates and Cox proportional hazards regression, we examined the influence of recipient ethnicity on survival. RESULTS: AAs comprised 36.2% of the UAB cohort compared with only 19.1% nationally (P < 0.01); yet overall 3-year graft survival was higher in the UAB cohort than in the US cohort (kidney: 91.5% vs 87.9%, P = 0.11; pancreas: 87.4% vs 81.3%, P = 0.04), and this advantage persisted on adjusted analyses [kidney adjusted hazard ratio (aHR): 0.58, 95% confidence interval (95% CI) 0.35-0.97, P = 0.04; pancreas aHR: 0.54, 95% CI 0.34-0.85, P = 0.01]. Within the UAB cohort, graft survival did not differ between AA and white recipients; in contrast, the US cohort experienced significantly lower graft survival among AA than white recipients (kidney at 5 years: 76.5% vs 82.3%, P < 0.01; pancreas at 5 years: 72.2% vs 76.3%, P = 0.01). CONCLUSION: In a single-center cohort of SPK transplants in which AAs were overrepresented, we demonstrated similar outcomes among AA and white recipients and better outcomes than the national experience. These data suggest that current dogma may be incorrect. Identifying best practices for SPK transplantation is imperative to mitigate the racial disparities in outcomes observed at the national level.
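The survival comparison described here rests on Kaplan-Meier estimates with group comparisons such as the log-rank test. The sketch below illustrates that workflow with hypothetical column names; it is not the study's code.

import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical cohort file: one row per SPK recipient, follow-up time in years.
df = pd.read_csv("spk_cohort.csv")
aa, white = df[df["race_aa"] == 1], df[df["race_aa"] == 0]

kmf = KaplanMeierFitter()
kmf.fit(aa["graft_years"], event_observed=aa["graft_failed"], label="AA recipients")
print(kmf.survival_function_at_times([3, 5]))   # 3- and 5-year graft survival estimates

result = logrank_test(aa["graft_years"], white["graft_years"],
                      event_observed_A=aa["graft_failed"],
                      event_observed_B=white["graft_failed"])
print(result.p_value)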


Subject(s)
Black or African American , Forecasting , Graft Rejection/ethnology , Kidney Transplantation , Pancreas Transplantation , Registries , Adolescent , Adult , Female , Follow-Up Studies , Humans , Incidence , Male , Middle Aged , Retrospective Studies , Survival Rate/trends , United States/epidemiology , Young Adult
11.
N Engl J Med ; 364(20): 1909-19, 2011 May 19.
Article in English | MEDLINE | ID: mdl-21591943

ABSTRACT

BACKGROUND: There are few comparisons of antibody induction therapy allowing early glucocorticoid withdrawal in renal-transplant recipients. The purpose of the present study was to compare induction therapy involving alemtuzumab with the most commonly used induction regimens in patient populations at either high immunologic risk or low immunologic risk. METHODS: In this prospective study, we randomly assigned patients to receive alemtuzumab or conventional induction therapy (basiliximab or rabbit antithymocyte globulin). Patients were stratified according to acute rejection risk, with a high risk defined by a repeat transplant, a peak or current value of panel-reactive antibodies of 20% or more, or black race. The 139 high-risk patients received alemtuzumab (one dose of 30 mg, in 70 patients) or rabbit antithymocyte globulin (a total of 6 mg per kilogram of body weight given over 4 days, in 69 patients). The 335 low-risk patients received alemtuzumab (one dose of 30 mg, in 164 patients) or basiliximab (a total of 40 mg over 4 days, in 171 patients). All patients received tacrolimus and mycophenolate mofetil and underwent a 5-day glucocorticoid taper in a regimen of early steroid withdrawal. The primary end point was biopsy-confirmed acute rejection at 6 months and 12 months. Patients were followed for 3 years for safety and efficacy end points. RESULTS: The rate of biopsy-confirmed acute rejection was significantly lower in the alemtuzumab group than in the conventional-therapy group at both 6 months (3% vs. 15%, P<0.001) and 12 months (5% vs. 17%, P<0.001). At 3 years, the rate of biopsy-confirmed acute rejection in low-risk patients was lower with alemtuzumab than with basiliximab (10% vs. 22%, P=0.003), but among high-risk patients, no significant difference was seen between alemtuzumab and rabbit antithymocyte globulin (18% vs. 15%, P=0.63). Adverse-event rates were similar among all four treatment groups. CONCLUSIONS: By the first year after transplantation, biopsy-confirmed acute rejection was less frequent with alemtuzumab than with conventional therapy. The apparent superiority of alemtuzumab with respect to early biopsy-confirmed acute rejection was restricted to patients at low risk for transplant rejection; among high-risk patients, alemtuzumab and rabbit antithymocyte globulin had similar efficacy. (Funded by Astellas Pharma Global Development; INTAC ClinicalTrials.gov number, NCT00113269.).


Subject(s)
Antibodies, Monoclonal/therapeutic use , Antibodies, Neoplasm/therapeutic use , Graft Rejection/prevention & control , Immunosuppressive Agents/therapeutic use , Kidney Transplantation , Acute Disease , Adolescent , Adult , Aged , Alemtuzumab , Animals , Antibodies, Monoclonal/adverse effects , Antibodies, Monoclonal, Humanized , Antibodies, Neoplasm/adverse effects , Antilymphocyte Serum/adverse effects , Antilymphocyte Serum/therapeutic use , Basiliximab , Biopsy , Drug Therapy, Combination , Female , Glucocorticoids/therapeutic use , Graft Rejection/pathology , Humans , Immunosuppressive Agents/adverse effects , Kaplan-Meier Estimate , Kidney/pathology , Kidney Transplantation/immunology , Kidney Transplantation/mortality , Lymphocyte Count , Male , Middle Aged , Prospective Studies , Rabbits , Recombinant Fusion Proteins/adverse effects , Recombinant Fusion Proteins/therapeutic use , Young Adult
12.
Liver Transpl ; 13(2): 258-65, 2007 Feb.
Article in English | MEDLINE | ID: mdl-17256756

ABSTRACT

Primary fascial closure is often difficult after adult orthotopic liver transplantation (OLT), complicated by donor-to-recipient graft size mismatch, post-reperfusion hepatic edema, coagulopathy, or intestinal edema. Attempts at closing the abdomen under these circumstances can increase intra-abdominal pressure, resulting in significant complications, including graft loss. Temporary closure with silastic mesh has been used as a viable option in children receiving transplants, but there is no recorded experience with its use in adults. A retrospective review was conducted of 200 consecutive liver transplantations performed over 42 months (October 2002 to February 2006). Records were evaluated for patient and donor demographics, perioperative factors including Model for End-Stage Liver Disease and Child-Turcotte-Pugh scores, indications for OLT, ischemic times, blood product administration, and use of temporary silastic mesh closure. Patients requiring silastic mesh were further evaluated for indication, time to primary fascial closure, duration of intubation, length of stay, graft function, and complications (infectious, vascular, biliary, and hernia development). Comparisons were made with a cohort of patients undergoing OLT over the same time period who were closed primarily, without the use of temporary silastic mesh. Fifty-one (25.5%) of the 200 liver transplantations used silastic mesh closure. Comparison of the cohorts (primary closure vs. temporary mesh) revealed no differences, except that blood product requirements were significantly greater in the silastic mesh group (P < 0.001). Bowel edema (47.1%) and coagulopathy (37.3%) were the most common indications for mesh closure; less frequent reasons included donor-to-recipient size mismatch (11.8%), hemodynamic instability, and a large preexisting fascial defect (2.0% each). The average time from transplant to final fascial closure was 3.4 days (range 2-9 days). In the silastic cohort, 41 transplants were closed primarily, 3 required the addition of synthetic mesh, and 6 had component separation and flap closure. After fascial closure, the mean time to extubation was 1 day. The median length of follow-up was 1.3 years for the silastic closure group. Long-term wound complications in the silastic closure group included 1 colonic fistula, 2 incisional hernias, and 2 wound infections. The 30-day and 1-year patient survival rates for this group were 93.6% and 82.4%, respectively, and graft survival rates for the same periods were 90.2% and 77.7%. Wound complications, rates of hepatic artery thrombosis or stricture, portal vein thrombosis or stricture, biliary complications, and allograft and patient survival were no different from those in patients undergoing initial primary closure. In adult liver transplantation with a difficult (or potentially difficult) abdomen, temporary closure with silastic mesh allowed uncomplicated fascial closure in a short period of time, with rapid extubation, excellent graft function, and minimal infectious or wound complications. In circumstances where large amounts of blood products are required, where a size mismatch exists, or where bowel edema is present during adult liver transplantation, temporary closure with silastic mesh is an ideal strategy.


Subject(s)
Abdominal Cavity/surgery , Liver Transplantation/mortality , Surgical Mesh , Suture Techniques , Adult , Female , Humans , Male , Retrospective Studies , Treatment Outcome
13.
Clin Transplant ; 19(6): 711-6, 2005 Dec.
Article in English | MEDLINE | ID: mdl-16313314

ABSTRACT

BACKGROUND: Liver transplant recipients are at high risk for multi-drug resistant infections because of broad-spectrum antibiotic use and immunosuppression. This study evaluates the clinical and financial impact of vancomycin-resistant Enterococcus (VRE) in liver transplant recipients. METHODS: Liver transplant recipients with VRE from 1995 to 2002 were identified and matched (age, gender, UNOS status, liver disease, and transplant date) to controls. Demographics, clinical factors, co-infections, antibiotic use, length of stay, abdominal surgeries, biliary complications, survival, and resource utilization were compared with matched controls. RESULTS: Review of microbiologic culture results for all liver transplant patients in the transplant registry identified 19 patients with 28 VRE infections. Thirty-eight non-VRE patients served as matched controls. The most common culture sites were blood (35%), peritoneal fluid (35%), bile (20%), and urine (12%). Median time from transplant to infection was 48 d (range 4-348). No significant differences in demographics were observed. The VRE group had a higher incidence of prior antibiotic use than the non-VRE group (95% vs. 34%; p < 0.05). The VRE group also experienced more abdominal surgery (20/19 vs. 3/38; p = 0.029), more biliary complications (9/19 vs. 9/38; p = 0.018), and a longer length of stay (42.5 vs. 21.7 d; p = 0.005). Survival in the VRE group was lower (52% vs. 82%; p = 0.048). Six of the 19 VRE patients were treated with linezolid for eight infection episodes, and four of the six survived. Eight patients were treated with quinupristin/dalfopristin for nine infections, and two of the eight survived. Increased cost of care was observed in the VRE group; laboratory costs were also higher ($6,500 vs. $1,750; p = 0.02). CONCLUSION: VRE was associated with prior antibiotic use, multiple abdominal surgeries, and biliary complications, and resulted in decreased survival compared with non-VRE control patients. VRE patients also utilized more hospital resources. Linezolid showed a trend toward improved survival.


Subject(s)
Gram-Positive Bacterial Infections/epidemiology , Liver Transplantation , Liver/microbiology , Enterococcus/drug effects , Female , Humans , Incidence , Length of Stay , Liver Transplantation/immunology , Male , Matched-Pair Analysis , Middle Aged , Retrospective Studies , Risk Factors , Vancomycin Resistance
14.
Am J Transplant ; 5(4 Pt 1): 775-80, 2005 Apr.
Article in English | MEDLINE | ID: mdl-15760401

ABSTRACT

Significant mortality is associated with post-transplant lymphoproliferative disorder (PTLD) in kidney transplant recipients (KTX). Univariate and multivariate risk factor survival analysis of US KTX with PTLD reported to the Israel Penn International Transplant Tumor Registry from November 1968 to January 2000 was performed. PTLD presented at a median of 18 months (range 1-310) post-transplant in 402 KTX. Death rates were greater for those diagnosed within 6 months (64%) than beyond 6 months (54%, p = 0.04). No differences in death risk were identified for gender, race, immunosuppression, EBV status, or B- or T-cell positivity. Death risk increased for multiple versus single sites (73% vs. 53%, hazard ratio (HR) 1.4). Each 1-year increase in age increased the HR for death by 2%. Surgery was associated with increased survival (55% vs. 0% without surgery) (p < 0.0001). Patients with allograft involvement treated with transplant nephrectomy alone (n = 20) had 80% survival versus 53% without allograft removal (n = 15) (p < 0.001). Overall survival was 69% for allograft involvement alone versus 36% for other organ involvement plus allograft (n = 19 alive) (p < 0.0001). Death risk was greater for multiple-site PTLD and increasing age, and these risks were additive. Univariate analysis identified increased death risk for those not receiving surgery, particularly with allograft involvement alone.


Subject(s)
Kidney Transplantation , Lymphoproliferative Disorders/mortality , Registries , Survival , Graft Rejection/prevention & control , Immunosuppression Therapy , Lymphoproliferative Disorders/physiopathology , Lymphoproliferative Disorders/therapy , Survival Analysis
16.
Am J Transplant ; 5(2): 356-65, 2005 Feb.
Article in English | MEDLINE | ID: mdl-15643996

ABSTRACT

African-Americans (AAs) have historically been considered high-risk renal transplant recipients due to increased rejection rates and reduced long-term graft survival. As a result, AAs are often excluded from corticosteroid withdrawal (CSWD) protocols. Modern immunosuppression has reduced rejections and improved graft survival in AAs and may allow successful CSWD. Outcomes in 56 AAs were compared to 56 non-AAs. All patients were enrolled in one of four early CSWD protocols. Results are reported as AA versus non-AA. Acute rejection at 1 year was 23% versus 18% (p = NS); creatinine clearance at 1 year was 75 versus 80 mL/min (p = NS); patient and graft survival were 96% versus 98% and 91% versus 91% (p = NS). AAs benefited from early CSWD, with significantly improved blood pressure, LDL < 130 mg/dL and HDL > 45 mg/dL at 1 year, post-transplant diabetes in 8.7%, and a mean 1-year weight change of 4.8 +/- 7.2 kg. In conclusion, early CSWD in AAs is associated with acceptable rejection rates, excellent patient and graft survival, and improved cardiovascular risk, indicating that the risks and benefits of early CSWD are similar between AAs and non-AAs. Additional follow-up is needed to determine long-term renal function, graft survival, and cardiovascular risk in AAs with early CSWD.


Subject(s)
Adrenal Cortex Hormones/pharmacology , Kidney Transplantation , Black or African American , Graft Rejection/drug therapy , Graft Rejection/epidemiology , Graft Survival/drug effects , Humans , Immunosuppression Therapy , Survival , Time Factors
17.
Clin Transplant ; 19(1): 102-9, 2005 Feb.
Article in English | MEDLINE | ID: mdl-15659142

ABSTRACT

BACKGROUND: Few studies have compared the quality of life (QoL) and functional recuperation of laparoscopic donor nephrectomy (LDN) versus open donor nephrectomy (ODN) donors. This study utilized the SF-36 health survey, a single-item health-related quality of life (HRQOL) score, and a functional assessment questionnaire ('Donor Survey'). METHODS: Questionnaires were sent to 100 LDN and 50 ODN donors. These donors were patients whose procedures were performed at The University Hospital and The Christ Hospital in Cincinnati, Ohio. RESULTS: A total of 46 (46%) LDN and 21 (42%) ODN donors returned completed surveys. The demographics of the two groups were similar. LDN patients reported a more rapid return to 100% normal health (69 vs. 116 d; p = 0.24) and to part-time work (21.9 vs. 23.2 d; p = 0.09), and required fewer postoperative physician office visits (2.8 vs. 4.4; p = 0.01). ODN patients reported a shorter duration of oral pain medication use (13.4 vs. 7.2 d; p = 0.02). However, a greater number of ODN patients reported post-surgical chronic pain (3 vs. 6; p < 0.05) and hernia (0 vs. 2; p = 0.19). The overall QoL for both groups was comparable to that of the general US population. CONCLUSIONS: The results of this study support the decisions of many kidney transplant centers to adopt LDN programs as the standard of care.


Subject(s)
Living Donors , Nephrectomy/methods , Adult , Female , Humans , Laparoscopy , Male , Middle Aged , Quality of Life , Recovery of Function , Treatment Outcome
18.
Transplantation ; 78(11): 1676-82, 2004 Dec 15.
Article in English | MEDLINE | ID: mdl-15591959

ABSTRACT

BACKGROUND: Posttransplant lymphoproliferative disorder (PTLD) is a life-threatening complication that occurs in a small but significant minority of solid organ transplant recipients. Published experiences with PTLD in cardiac transplant recipients are limited to relatively small single-center reports. METHODS: This report presents experience with 274 cases of PTLD in cardiac transplant recipients reported to the Israel Penn International Transplant Tumor Registry (IPITTR). RESULTS: PTLD carried an ominous prognosis: Kaplan-Meier survival after PTLD diagnosis was 45%, 33%, 30%, and 13% at 1, 3, 5, and 10 years, respectively. Common causes of death included PTLD, cardiovascular collapse, and infection; all occurred at a median of less than 6 months. Risk of death from cardiovascular collapse secondary to immunosuppression withdrawal was substantial (28%), indicating that a fine balance exists between death from PTLD and sudden cardiac death due to acute rejection. Combination therapy was the most common PTLD treatment (49%). Survival was 32% in patients receiving immunosuppression minimization (ISM) alone, 27% with ISM plus other therapy, and 11% with therapies not containing ISM (P < 0.01). CONCLUSION: PTLD in cardiac transplant recipients is associated with low long-term survival rates. Analysis of PTLD therapies and outcomes suggests that immunosuppression minimization, when applied, improves survival. However, the risk of sudden death may offset the positive effect of ISM. This observation has important implications for ISM in PTLD therapy in cardiac transplant recipients. Carefully designed prospective studies are needed to evaluate the positive and negative effects of ISM in cardiac transplant recipients with PTLD.


Subject(s)
Heart Transplantation/adverse effects , Lymphoproliferative Disorders/etiology , Adult , Aged , Female , Heart Transplantation/mortality , Humans , Immunosuppression Therapy , Lymphoproliferative Disorders/therapy , Male , Middle Aged , Registries
19.
Dis Colon Rectum ; 47(11): 1898-903, 2004 Nov.
Article in English | MEDLINE | ID: mdl-15622583

ABSTRACT

PURPOSE: Immunosuppression used in transplantation is associated with an increased incidence of various cancers. Although the incidence of colorectal cancer in transplant patients appears equal to that of the nontransplant population, the effects of immunosuppression on patients who develop colorectal cancer are not well defined. The purpose of this study was to define the characteristics and survival patterns of transplant patients developing de novo colorectal cancer. METHODS: The Israel Penn International Transplant Tumor Registry was queried for patients with colorectal cancer. Analysis included patient demographics, age at transplantation and at colorectal cancer diagnosis, tumor stage, and survival. Age and survival rates were compared with United States population-based colorectal cancer statistics using the National Cancer Institute Surveillance, Epidemiology, and End Results database. RESULTS: A total of 150 transplant patients with de novo colorectal cancer were identified: 93 kidney, 29 heart, 27 liver, and 1 lung. Mean age at transplantation was 53 years. Age at transplantation and at colorectal cancer diagnosis did not differ significantly by gender, race, or stage of disease. Compared with the Surveillance, Epidemiology, and End Results database, transplant patients had a younger mean age at colorectal cancer diagnosis (58 vs. 70 years; P < 0.001) and worse five-year survival (overall, 44 vs. 62 percent, P < 0.001; Dukes A and B, 74 vs. 90 percent, P < 0.001; Dukes C, 20 vs. 66 percent, P < 0.001; and Dukes D, 0 vs. 9 percent, P = 0.08). CONCLUSIONS: Transplant patients develop colorectal cancer at a younger age and exhibit worse five-year survival rates than the general population. These data suggest that chronic immunosuppression results in a more aggressive tumor biology. Frequent posttransplantation colorectal cancer screening may be warranted.


Subject(s)
Colorectal Neoplasms/epidemiology , Immunosuppression Therapy/adverse effects , Organ Transplantation , Aged , Chi-Square Distribution , Colorectal Neoplasms/immunology , Female , Humans , Incidence , Male , Middle Aged , Ohio/epidemiology , Registries , Risk Factors , Survival Analysis
20.
Prog Transplant ; 14(3): 193-200, 2004 Sep.
Article in English | MEDLINE | ID: mdl-15495778

ABSTRACT

In the past few decades, great advances have been made in the field of solid-organ transplantation. A greater understanding of immune system function, the development of modern immunosuppression, and advancements in surgical technique have led to marked improvements in both recipient and graft survival, as well as in recipients' quality of life. However, improved survival rates have also led to prolonged exposure to chronic immunosuppression, which increases the risk for the development of posttransplant malignancies. In addition, older transplant candidates are being considered, carrying with them the increased likelihood of preexisting malignancy. Consequently, the potential risk of posttransplant malignancy must be considered. Moreover, as long-term transplant survivors continue to age, posttransplant malignancies will be seen more frequently. This review presents the more commonly encountered posttransplant malignancies and the measures that are currently being utilized to prevent and treat them.


Subject(s)
Neoplasms/etiology , Neoplasms/prevention & control , Organ Transplantation/adverse effects , Adult , Child , Graft Survival , Humans , Immunosuppression Therapy/adverse effects , Immunosuppressive Agents/adverse effects , Mass Screening , Neoplasm Recurrence, Local/epidemiology , Neoplasm Recurrence, Local/etiology , Neoplasm Recurrence, Local/prevention & control , Neoplasms/epidemiology , Primary Prevention , Risk Factors , Transplantation Immunology