1.
Ann Intern Med ; 153(4): 222-30, 2010 Aug 17.
Article in English | MEDLINE | ID: mdl-20713790

ABSTRACT

BACKGROUND: Posttransplantation acute renal failure (ARF) occurs in roughly 25% of recipients of organs from deceased donors. Inflammation in the donor organ is associated with risk for ARF. OBJECTIVE: To determine whether administering corticosteroids to deceased organ donors reduces the incidence and duration of ARF in organ recipients more than placebo. DESIGN: Parallel, blocked randomized trial, performed between February 2006 and November 2008, with computer-generated randomization and centralized allocation. Investigators were masked to group assignment. (Controlled-trials.com registration number: ISRCTN78828338). SETTING: 3 renal transplantation centers in Austria and Hungary. PATIENTS: 306 deceased heart-beating donors and 455 renal transplant recipients. INTERVENTIONS: Organ donors were administered an intravenous infusion of either 1000 mg of methylprednisolone (136 donors) or placebo (0.9% saline; 133 donors) at least 3 hours before organ harvesting. MEASUREMENTS: Incidence of ARF, defined as more than 1 dialysis session in the first week after transplantation, was the primary end point. Secondary and other end points included duration of ARF and trajectories of serum creatinine level. The suppression of immune response and inflammation by the intervention was assessed in the donor organ on a genome-wide basis. RESULTS: 52 of 238 recipients (22%) of kidneys from steroid-treated donors and 54 of 217 recipients (25%) of kidneys from placebo-treated donors had ARF (difference, -3 percentage points [95% CI, -11 to 5 percentage points]). One graft was lost on day 1 in each group, and 1 recipient in the placebo group died of cardiac arrest on day 2. The median duration of ARF was 5 days (interquartile range, 2 days) in the steroid group and 4 days (interquartile range, 2 days) in the placebo group (P = 0.31). The groups had similar trajectories of serum creatinine level in the first week (P = 0.72). Genomic analysis showed suppressed inflammation and immune response in kidney biopsies from deceased donors who received corticosteroids. LIMITATION: Donors and recipients were mainly white, and all were from 3 transplantation centers in central Europe, which may limit generalizability. CONCLUSION: Systemic suppression of inflammation in deceased donors by corticosteroids did not reduce the incidence or duration of posttransplantation ARF in allograft recipients. PRIMARY FUNDING SOURCE: Austrian Science Fund and Austrian Academy of Science.
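As a quick arithmetic check, the reported risk difference and confidence interval can be reproduced from the raw counts. A minimal sketch, assuming a standard Wald interval for the difference of two proportions (the abstract does not state which interval method was used):

```python
from math import sqrt

def risk_difference_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Wald 95% CI for the difference of two proportions (assumed method)."""
    p_a, p_b = events_a / n_a, events_b / n_b
    diff = p_a - p_b
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return diff, diff - z * se, diff + z * se

# Counts from the abstract: 52/238 with ARF in the steroid group, 54/217 with placebo.
diff, lo, hi = risk_difference_ci(52, 238, 54, 217)
print(f"difference {diff:+.1%}, 95% CI {lo:+.1%} to {hi:+.1%}")
# roughly -3 percentage points with a CI of about -11 to +5, matching the report
```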


Subject(s)
Acute Kidney Injury/prevention & control , Anti-Inflammatory Agents/administration & dosage , Immunosuppressive Agents/administration & dosage , Ischemia/prevention & control , Kidney Transplantation/adverse effects , Kidney/blood supply , Methylprednisolone/administration & dosage , Tissue Donors , Acute Kidney Injury/etiology , Adult , Creatinine/blood , Double-Blind Method , Female , Gene Expression Profiling , Graft Survival/drug effects , Humans , Infusions, Intravenous , Ischemia/etiology , Kidney/physiology , Kidney Transplantation/immunology , Male , Middle Aged , Time Factors , Transplantation, Homologous
2.
Transplantation ; 87(12): 1821-9, 2009 Jun 27.
Article in English | MEDLINE | ID: mdl-19543059

ABSTRACT

BACKGROUND: It is unclear whether the choice of maintenance immunosuppression modulates the negative effect of advanced donor age on outcome after renal transplantation. METHODS: All 1829 patients who received their first transplant between 1990 and 2003 at the Vienna Medical Centre and had a functioning graft after 90 days were studied. At this time point, 1587 received calcineurin inhibitors (CNI+) and 242 did not (CNI-). Actual and functional graft survival were analyzed in subgroups based on donor age (<36, 36-49, 50-64, and >64 years) and immunosuppressive therapy. RESULTS: The median follow-up time was 7 years. In total, we observed 312 deaths and 275 graft losses. After adjusting for several variables considered potential confounders, actual graft survival was better in CNI+ than in CNI- patients only if donor age was less than 36 years (adjusted hazard ratio 0.25, 95% confidence interval 0.17-0.38) or 36 to 49 years (0.43, 95% confidence interval 0.29-0.62). Similar results were obtained for functional graft survival. Patient survival was significantly better in CNI+ subjects irrespective of donor age (0.41, 95% confidence interval 0.30-0.57). DISCUSSION: Use of CNI 90 days after transplantation is associated with improved patient survival even after adjustment for confounders, but its beneficial association with actual and functional graft survival is lost or at least reduced if kidneys from donors older than 50 years are used.


Subject(s)
Calcineurin Inhibitors , Graft Survival/physiology , Immunosuppressive Agents/therapeutic use , Kidney Transplantation/immunology , Tissue Donors/statistics & numerical data , Adult , Age Factors , Aged , Austria , Cadaver , Follow-Up Studies , Graft Survival/immunology , Humans , Kidney Failure, Chronic/surgery , Kidney Failure, Chronic/therapy , Kidney Transplantation/mortality , Kidney Transplantation/physiology , Middle Aged , Proportional Hazards Models , Registries , Renal Replacement Therapy , Retrospective Studies , Survival Rate , Survivors , Treatment Outcome
3.
J Am Soc Nephrol ; 19(11): 2211-8, 2008 Nov.
Article in English | MEDLINE | ID: mdl-18650477

ABSTRACT

The efficacy of statins for the prevention of cardiovascular events is well established in the general population but remains unknown in renal transplant recipients. In this study, the association of statin use with patient and graft survival was investigated in a cohort of 2041 first-time recipients of renal allografts between 1990 and 2003. Multivariable Cox regression demonstrated that statin use was independently associated with lower mortality rates. Twelve-year survival rates were 73% for statin users and 64% for nonusers (P = 0.055). The adjusted hazard ratio for all-cause mortality associated with statin use was 0.64 (95% confidence interval 0.48 to 0.86). Graft survival rates during the same time period were 76% for statin users and 70% for nonusers (P = 0.055). The adjusted hazard ratio for graft survival associated with statin use was 0.76 (95% confidence interval 0.55 to 1.04). Results from marginal structural models were virtually identical. In summary, statin use was associated with prolonged patient survival, but no difference in graft survival was detected. Although these results are encouraging, a definitive causal relationship can be determined only from randomized clinical trials.
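For readers unfamiliar with how such adjusted hazard ratios are obtained, here is a minimal Cox proportional hazards sketch. The lifelines library, the toy data, and the column names are assumptions for illustration only; the study's actual covariate set and its marginal structural models are not reproduced.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical per-patient data: follow-up time, death indicator,
# statin use at baseline, and one illustrative confounder (age).
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "years":  rng.exponential(8.0, n).clip(0.1, 12.0),
    "death":  rng.integers(0, 2, n),
    "statin": rng.integers(0, 2, n),
    "age":    rng.normal(50, 12, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="death")
cph.print_summary()                          # coefficients, hazard ratios, CIs
hr_statin = np.exp(cph.params_["statin"])    # adjusted hazard ratio for statin use
# With random toy data this will be near 1; the study's adjusted HR was 0.64.
print(f"adjusted HR for statin use: {hr_statin:.2f}")
```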


Subject(s)
Hydroxymethylglutaryl-CoA Reductase Inhibitors/therapeutic use , Kidney Transplantation , Adult , Cardiovascular Diseases/etiology , Cardiovascular Diseases/mortality , Cardiovascular Diseases/prevention & control , Cohort Studies , Female , Graft Survival/drug effects , Humans , Kaplan-Meier Estimate , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Kidney Transplantation/mortality , Male , Middle Aged , Proportional Hazards Models
4.
Aging Cell ; 7(4): 491-7, 2008 Aug.
Article in English | MEDLINE | ID: mdl-18462273

ABSTRACT

Although chronological donor age is the most potent predictor of long-term outcome after renal transplantation, it does not incorporate individual differences of the aging process itself. We therefore hypothesized that an estimate of biological organ age, as derived from markers of cellular senescence in zero-hour biopsies, would be of higher predictive value. Telomere length and mRNA expression levels of the cell cycle inhibitors CDKN2A (p16INK4a) and CDKN1A (p21WAF1) were assessed in pre-implantation biopsies of 54 patients, and the association of these and various other clinical parameters with serum creatinine after 1 year was determined. In a linear regression analysis, CDKN2A turned out to be the best single predictor, followed by donor age and telomere length. A multiple linear regression analysis revealed that the combination of CDKN2A values and donor age yielded even higher predictive values for serum creatinine 1 year after transplantation. We conclude that the molecular aging marker CDKN2A in combination with chronological donor age predicts renal allograft function after 1 year significantly better than chronological donor age alone.
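A rough sketch of the modelling approach described: ordinary least squares regression of 1-year serum creatinine on donor age alone versus donor age plus CDKN2A expression. The simulated data, variable names, and use of statsmodels are assumptions; the published coefficients are not reproduced.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical pre-implantation biopsy data: relative CDKN2A mRNA level,
# chronological donor age, and serum creatinine (mg/dL) one year later.
rng = np.random.default_rng(1)
n = 54
donor_age = rng.uniform(20, 75, n)
cdkn2a = 0.02 * donor_age + rng.normal(0, 0.3, n)       # senescence marker rises with age
creat_1y = 0.8 + 0.6 * cdkn2a + 0.005 * donor_age + rng.normal(0, 0.2, n)
df = pd.DataFrame({"cdkn2a": cdkn2a, "donor_age": donor_age, "creat_1y": creat_1y})

# Donor age alone versus donor age plus the molecular marker.
m_age  = smf.ols("creat_1y ~ donor_age", data=df).fit()
m_both = smf.ols("creat_1y ~ donor_age + cdkn2a", data=df).fit()
print(f"R² age only: {m_age.rsquared:.2f}, R² age + CDKN2A: {m_both.rsquared:.2f}")
```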


Subject(s)
Cellular Senescence , Kidney Transplantation , Kidney/pathology , Adult , Aging/metabolism , Biomarkers/metabolism , Biopsy , Creatinine/blood , Cyclin-Dependent Kinase Inhibitor p16/genetics , Cyclin-Dependent Kinase Inhibitor p16/metabolism , Cyclin-Dependent Kinase Inhibitor p21/genetics , Cyclin-Dependent Kinase Inhibitor p21/metabolism , Demography , Female , Humans , Male , Middle Aged , Postoperative Period , Regression Analysis , Telomere/metabolism , Time Factors , Tissue Donors , Transplantation, Homologous , Treatment Outcome
5.
Transpl Int ; 21(7): 615-24, 2008 Jul.
Article in English | MEDLINE | ID: mdl-18346011

ABSTRACT

Post-transplant renal osteopathy (ROP) remains a serious problem that contributes to substantial long-term morbidity in graft recipients. Bone loss is most pronounced during the first months after engraftment; data on bone density development in long-term transplant recipients remain controversial. The clinical impact of ROP is a marked increase in fracture rate after kidney transplantation compared with both the general population and patients on dialysis treatment. This review focuses on post-transplant ROP and discusses its epidemiology and clinical features, the factors contributing to the pathogenesis of this complication, and the evaluation, prevention and treatment options available for kidney allograft recipients.


Subject(s)
Bone Diseases, Metabolic/etiology , Kidney Transplantation/adverse effects , Bone Diseases, Metabolic/drug therapy , Bone Diseases, Metabolic/prevention & control , Fractures, Spontaneous/etiology , Humans , Risk Factors
6.
Nephrol Dial Transplant ; 23(5): 1742-6, 2008 May.
Article in English | MEDLINE | ID: mdl-18234845

ABSTRACT

BACKGROUND: Angiotensin-converting enzyme inhibitors (ACEI) or angiotensin II type 1 receptor blockers (ARB) are frequently prescribed to renal transplant recipients with a reduced glomerular filtration rate (GFR). The aim of this study was to investigate the association of ACEI/ARB use with serum potassium levels in renal graft recipients. METHODS: We conducted an open cohort study of 2041 first renal allograft recipients transplanted at the Medical University of Vienna between 1990 and 2003. Serum potassium levels were compared between subjects with versus without ACEI/ARB therapy over an observation period of up to 10 years using a mixed-effects general linear model. The analysis was adjusted for several covariables known to influence serum potassium, such as the use of diuretics, beta blockers, calcineurin inhibitor (CNI)-based immunosuppression, estimated GFR, time since renal transplantation, diabetes, years on dialysis and recipient age. RESULTS: The overall adjusted estimated serum potassium difference between recipients with versus without ACEI/ARB therapy was 0.08 mmol/l (P < 0.001). The use of diuretics was associated with a 0.11 mmol/l lower potassium concentration (P < 0.001), whereas each 10 ml/min decrease in GFR was associated with an increase of 0.04 mmol/l (P < 0.001). CNI intake increased serum potassium by 0.06 mmol/l (P = 0.002). Furthermore, serum potassium increased by 0.17 mmol/l within the first decade after transplantation (P < 0.001) while holding the other covariables constant. No effect modification between ACEI/ARB and time since transplantation was observed. Nineteen subjects (2.4%) discontinued ACEI/ARB therapy because of hyperkalaemia. CONCLUSIONS: In summary, relevant hyperkalaemia associated with ACEI/ARB therapy is negligible in renal transplant recipients during long-term follow-up; the hyperkalaemic effect of ACEI/ARB is balanced by the use of diuretics.
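A minimal sketch of a mixed-effects analysis of repeated serum potassium measurements with a random intercept per patient, in the spirit of the model described. The simulated data, the reduced covariate set, and the use of statsmodels are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: repeated potassium values per patient,
# with indicators for ACEI/ARB and diuretic use and the concurrent eGFR.
rng = np.random.default_rng(2)
rows = []
for pid in range(300):
    acei_arb = rng.integers(0, 2)
    for visit in range(rng.integers(3, 10)):
        egfr = rng.normal(50, 15)
        diuretic = rng.integers(0, 2)
        k = (4.2 + 0.08 * acei_arb - 0.11 * diuretic
             - 0.004 * (egfr - 50) + rng.normal(0, 0.35))
        rows.append((pid, k, acei_arb, diuretic, egfr))
df = pd.DataFrame(rows, columns=["patient", "potassium", "acei_arb", "diuretic", "egfr"])

# Random intercept per patient; the acei_arb coefficient estimates the adjusted
# between-group potassium difference (about 0.08 mmol/l in the study).
model = smf.mixedlm("potassium ~ acei_arb + diuretic + egfr",
                    data=df, groups=df["patient"]).fit()
print(model.summary())
```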


Subject(s)
Angiotensin II Type 1 Receptor Blockers/adverse effects , Angiotensin-Converting Enzyme Inhibitors/adverse effects , Kidney Transplantation/adverse effects , Potassium/blood , Adult , Cohort Studies , Databases, Factual , Diuretics/therapeutic use , Female , Glomerular Filtration Rate , Humans , Hyperkalemia/blood , Hyperkalemia/etiology , Hyperkalemia/physiopathology , Kidney Transplantation/physiology , Linear Models , Male , Middle Aged , Multivariate Analysis , Renin-Angiotensin System/drug effects , Renin-Angiotensin System/physiology , Retrospective Studies
7.
Transplantation ; 83(8): 1048-54, 2007 Apr 27.
Article in English | MEDLINE | ID: mdl-17452894

ABSTRACT

BACKGROUND: Donor factors such as age profoundly influence long-term graft function after cadaveric renal transplantation, but the molecular signature of these aspects in the allograft remains unknown. METHODS: We analyzed the genome-wide gene expression signature of donor kidney biopsies of different ages obtained before transplantation. Subsequent analysis compared expression profiles from allografts with excellent function versus impaired function at 1 yr after engraftment. Differential expression profiles were analyzed on the level of molecular function and biologic role, as well as by analysis of co-regulation through transcription factors, regulatory networks, and protein-protein interaction data utilizing extended bioinformatics. RESULTS: The 15 subjects with excellent transplant function, defined as calculated GFR ≥45 mL/min/1.73 m² at 1 yr, exhibited a distinctly different gene expression profile than the matched 16 subjects with impaired function, defined as calculated GFR <45 mL/min/1.73 m². Donor kidneys from recipients with impaired allograft function showed activation of genes mainly belonging to the functional classes of immunity, signal transduction, and oxidative stress response. Two-thirds of these genes exhibited at least one protein interacting partner, suggesting choreographed intracellular events differentiating the two recipient groups. However, donor age may have confounded some of the associations found between gene profiles and graft function. CONCLUSION: In summary, a distinctive gene expression profile in the donor kidney at transplantation together with donor age predicts medium-term allograft function in recipients of cadaveric allografts.


Subject(s)
Aging/genetics , Aging/physiology , Gene Expression Profiling , Graft Survival/genetics , Graft Survival/physiology , Kidney Transplantation , Tissue Donors , Adult , Biopsy , Female , Gene Expression Regulation , Humans , Male , Middle Aged , Time Factors
8.
Bone ; 40(2): 516-21, 2007 Feb.
Article in English | MEDLINE | ID: mdl-17070128

ABSTRACT

BACKGROUND: The incidence of fractures averages 20 per 1000 hemodialysis patient-years at risk. This study sought to design and evaluate the utility of a simple prediction rule for fractures in dialysis patients using only standard demographic and biochemical information. METHODS: 1777 prevalent hemodialysis patients of the Austrian dialysis and transplant database who had an evaluation of fractures in 2004 and 2005 were included in the analysis. Validation of the prediction rule by a test set was performed using three different resampling techniques: the split-sample approach, 100-fold cross-validation and a 100× bootstrap. Calibration of the model was assessed visually by comparing the observed with the expected number of outcomes in each category and by calculating the Hosmer-Lemeshow goodness-of-fit statistic. RESULTS: A multivariable logistic regression model built on clinical expertise yielded a discrimination of c = 0.73 (area under the ROC curve). Further reduction of the covariables to age and sex as the only predictive variables did not result in loss of discrimination (c = 0.71) and at the same time provided adequate calibration (p = 0.69). The probability of a fracture (PF) occurring within the next year of hemodialysis can be calculated from our prediction model as PF = exp(LP) / (1 + exp(LP)), where LP = -6.25 + 0.4 × age (in decades) - 0.93 (if male); for example, a 70-year-old man would have a fracture probability of about 0.01 or 1%, a woman of the same age about 3%. The optimism derived by all resampling techniques was between 1% and 2%, suggesting adequate generalizability of the prediction rule. CONCLUSION: A sufficient and parsimonious prediction rule for fractures in hemodialysis patients consists of the independent variables age and sex.
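The published rule is simple enough to implement directly; a minimal sketch reproducing the worked example above:

```python
from math import exp

def fracture_probability(age_years: float, male: bool) -> float:
    """One-year fracture probability for a hemodialysis patient, from the
    logistic prediction rule PF = exp(LP) / (1 + exp(LP))
    with LP = -6.25 + 0.4 * age_in_decades - 0.93 * (1 if male else 0)."""
    lp = -6.25 + 0.4 * (age_years / 10.0) - 0.93 * (1 if male else 0)
    return exp(lp) / (1.0 + exp(lp))

# Worked example from the abstract: a 70-year-old man versus a 70-year-old woman.
print(f"male, 70 y:   {fracture_probability(70, True):.3f}")   # ~0.01, i.e. 1%
print(f"female, 70 y: {fracture_probability(70, False):.3f}")  # ~0.03, i.e. 3%
```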


Subject(s)
Aging , Fractures, Bone/physiopathology , Kidney Failure, Chronic/physiopathology , Aged , Female , Humans , Kidney Failure, Chronic/therapy , Male , Middle Aged , Renal Dialysis , Sex Factors
9.
Nephrol Dial Transplant ; 21(8): 2275-81, 2006 Aug.
Article in English | MEDLINE | ID: mdl-16574684

ABSTRACT

BACKGROUND: Bone loss remains a serious problem after kidney transplantation and is most pronounced during the first months after engraftment. Bisphosphonates are frequently used to treat post-transplant osteodystrophy, but data from large randomized controlled trials (RCTs) are lacking. METHODS: We therefore conducted this systematic review of the literature, searching electronic databases, reference lists and abstracts from scientific meetings to identify RCTs in all languages. The primary outcome assessed was the change in bone mineral density (BMD) during the early post-transplantation period. Based on the mean BMD change presented in the identified publications, the authors were asked for the individual BMD results of all randomized patients, determined at the lumbar spine and femoral neck before and after bisphosphonate therapy. Data were pooled for summary estimates by using weighted mean differences of the absolute change in BMD. An analysis of covariance was performed, adjusted for individual baseline values, treatment arm and individual trial. RESULTS: Five studies involving 180 participants were included in our meta-analysis. Treatment with bisphosphonates showed a substantial effect in preventing post-transplant osteodystrophy. The BMD decline at the lumbar spine within 6-12 months after transplantation was significantly reduced by 0.06 g/cm² in patients treated with bisphosphonates (95% CI 0.05-0.08 g/cm²). At the femoral neck, the loss of BMD was reduced by 0.05 g/cm² during this period (95% CI 0.0-0.11 g/cm²), just failing to reach statistical significance. This prevention of bone loss was achieved without major side effects. CONCLUSION: Bisphosphonates are effective in preventing bone loss in the early post-transplant period.
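The review pooled individual patient data with an analysis of covariance; as a simpler, generic illustration of how weighted mean differences are combined across trials, here is a fixed-effect inverse-variance pooling sketch. The trial-level numbers below are invented and the pooling model is an assumption, not the review's exact method.

```python
import numpy as np

def pool_fixed_effect(mean_diffs, std_errs, z=1.96):
    """Fixed-effect, inverse-variance pooling of per-trial mean differences."""
    md = np.asarray(mean_diffs, dtype=float)
    se = np.asarray(std_errs, dtype=float)
    w = 1.0 / se**2                          # inverse-variance weights
    pooled = np.sum(w * md) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, pooled - z * pooled_se, pooled + z * pooled_se

# Hypothetical per-trial differences in lumbar spine BMD change (g/cm²)
# between bisphosphonate and control arms, with their standard errors.
diffs = [0.05, 0.07, 0.06, 0.08, 0.04]
ses   = [0.02, 0.03, 0.015, 0.025, 0.02]
pooled, lo, hi = pool_fixed_effect(diffs, ses)
print(f"pooled difference {pooled:.3f} g/cm², 95% CI {lo:.3f} to {hi:.3f}")
```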


Subject(s)
Bone Density/drug effects , Bone Diseases, Metabolic/prevention & control , Chronic Kidney Disease-Mineral and Bone Disorder/prevention & control , Diphosphonates/therapeutic use , Kidney Transplantation , Postoperative Complications/drug therapy , Bone Diseases, Metabolic/drug therapy , Calcium/therapeutic use , Chronic Kidney Disease-Mineral and Bone Disorder/drug therapy , Diphosphonates/pharmacology , Drug Therapy, Combination , Female , Femur/chemistry , Fractures, Spontaneous/epidemiology , Fractures, Spontaneous/etiology , Fractures, Spontaneous/prevention & control , Graft Rejection/epidemiology , Graft Rejection/prevention & control , Humans , Immunosuppressive Agents/therapeutic use , Lumbar Vertebrae/chemistry , Male , Middle Aged , Minerals/analysis , Postoperative Care , Postoperative Complications/epidemiology , Postoperative Complications/prevention & control , Preoperative Care , Randomized Controlled Trials as Topic/statistics & numerical data , Treatment Outcome , Vitamin D/therapeutic use
10.
J Am Soc Nephrol ; 17(3): 889-99, 2006 Mar.
Article in English | MEDLINE | ID: mdl-16481415

ABSTRACT

Angiotensin-converting enzyme inhibitors (ACEI) or angiotensin II type 1 receptor blockers (ARB) reduce cardiovascular death in the general population, but data for renal transplant recipients remain elusive. Similarly, ACEI/ARB have been shown to reduce proteinuria, but data on graft survival are lacking. Therefore a retrospective open cohort study was conducted of 2031 patients who received their first renal allograft at the Medical University of Vienna between 1990 and 2003 and survived at least 3 mo. Patient and graft survival was compared between patients with versus without ACEI and/or ARB therapy. Data were analyzed with and without propensity score models for ACEI/ARB therapy. Medication and comorbidities were analyzed as time-dependent variables in the Cox regression analyses. Ten-year survival rates were 74% in the ACEI/ARB group but only 53% in the noACEI/ARB group (P<0.001). The hazard ratio (HR) of ACEI/ARB use for mortality was 0.57 (95% confidence interval [CI] 0.40 to 0.81) compared with nonuse. Ten-year actual graft survival rate was 59% in ACEI/ARB patients but only 41% in nonusers (P=0.002). The HR of actual graft failure for ACEI/ARB recipients was 0.55 (95% CI 0.43 to 0.70) compared with nonusers; the HR of functional graft survival was 0.56 (95% CI 0.40 to 0.78). Ten-year unadjusted functional graft survival rates were 76% among ACEI/ARB patients and 71% in noACEI/ARB recipients (P=0.57). In summary, the use of ACEI/ARB therapy was associated with longer patient and graft survival after renal transplantation. More frequent use of these medications may reduce the high incidence of death and renal allograft failure in these patients.
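The abstract notes that medication and comorbidities entered the Cox models as time-dependent variables. A minimal sketch of that general approach using a counting-process (start/stop) data layout; the lifelines library, the simulated data, and the column names are assumptions, and the propensity score modelling is not shown.

```python
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Build toy counting-process data: one row per interval of constant covariates;
# 'event' is 1 only on the interval in which death (or graft loss) occurs.
rng = np.random.default_rng(3)
rows = []
for pid in range(200):
    t, on_drug, age = 0.0, 0, rng.uniform(25, 70)
    while t < 10.0:
        seg = min(rng.exponential(2.0), 10.0 - t)      # interval length
        event = int(rng.random() < 0.10)               # outcome on this interval
        rows.append((pid, t, t + seg, on_drug, age, event))
        if event:
            break
        t += seg
        on_drug = int(rng.random() < 0.5)              # therapy may start or stop over time
long_df = pd.DataFrame(rows, columns=["id", "start", "stop", "acei_arb", "age", "event"])

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # exp(coef) for acei_arb: hazard ratio for current ACEI/ARB exposure
```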


Subject(s)
Angiotensin II Type 1 Receptor Blockers/therapeutic use , Angiotensin-Converting Enzyme Inhibitors/therapeutic use , Graft Survival/drug effects , Kidney Failure, Chronic/mortality , Postoperative Complications/prevention & control , Adult , Age Factors , Aged , Cohort Studies , Confidence Intervals , Dose-Response Relationship, Drug , Drug Administration Schedule , Female , Follow-Up Studies , Graft Rejection/prevention & control , Humans , Kidney Failure, Chronic/diagnosis , Kidney Failure, Chronic/surgery , Kidney Transplantation/methods , Kidney Transplantation/mortality , Logistic Models , Male , Middle Aged , Multivariate Analysis , Postoperative Care , Probability , Retrospective Studies , Risk Assessment , Sex Factors , Survival Rate
11.
Kidney Int ; 68(6): 2497-507, 2005 Dec.
Article in English | MEDLINE | ID: mdl-16316326

ABSTRACT

BACKGROUND: Unilateral loss of kidney function is followed by compensatory contralateral growth. The early, genome-wide transcriptional response of the untouched kidney to unilateral ureteral obstruction (UUO) or unilateral nephrectomy is unknown. METHODS: Twelve adult male Sprague-Dawley rats were subjected to UUO and twelve rats to unilateral nephrectomy. At 12, 24, and 72 hours after the insult, four rats of each group were sacrificed and the contralateral kidney was harvested for genome-wide gene expression analysis, transcription factor analysis, and histomorphology. RESULTS: Microarray studies revealed that the majority of differentially expressed transcripts were suppressed in UUO and unilateral nephrectomy compared with control kidneys. The function of these suppressed genes is predominantly growth inhibition and apoptosis, suggesting a net pro-hypertrophic response. Insulin-like growth factor-2 (IGF-2)-binding protein was one of the few activated genes. We observed a distinctly different molecular signature between UUO and unilateral nephrectomy at the three time points investigated. The early response in UUO rats suggests a counterbalance to the nonfiltering kidney by activation of transport pathways such as the aquaporins. Unilateral nephrectomy kidneys, on the other hand, respond immediately to contralateral nephrectomy by activation of cell cycle regulators such as the cyclin family. Several genes with weakly defined function were found to be associated with either UUO or unilateral nephrectomy. Transcription factor analysis of the identified transcripts suggests common regulation of at least some of these genes. All kidneys showed normal histology. CONCLUSION: Release of growth inhibition by nephrectomy leads to immediate cell cycle activation after unilateral nephrectomy, whereas UUO kidneys counterbalance filtration failure by activation of several transporters.


Subject(s)
Hydronephrosis/genetics , Hydronephrosis/physiopathology , Kidney/physiology , Oligonucleotide Array Sequence Analysis , Transcriptional Activation/physiology , Animals , Carrier Proteins/genetics , Insulin-Like Growth Factor II/genetics , Kidney/surgery , Male , Nephrectomy , Rats , Rats, Sprague-Dawley , Transcription Factors/genetics , Ureteral Obstruction/genetics , Ureteral Obstruction/physiopathology
12.
Am J Transplant ; 4(10): 1595-604, 2004 Oct.
Article in English | MEDLINE | ID: mdl-15367214

ABSTRACT

Recipients of live donor transplant kidneys (LIV) exhibit a significantly longer allograft half-life than recipients of cadaveric donor organs (CAD). The reasons are incompletely understood. This study therefore sought to elucidate the genome-wide gene expression profiles in microdissected transplant kidney biopsies obtained from five cadaveric and five matched live donors before transplantation. cDNA microarrays were used to determine the transcripts in isolated glomeruli (G) and the tubulointerstitial (TI) compartment. Data were subjected to hierarchical clustering, maxT adjustment and a jackknife procedure to ensure robustness of the reported findings; validation was performed by independent analysis of split biopsies and TaqMan PCR. One hundred and thirteen sequences, representing 62 unique genes (17 redundant features) and 34 ESTs, separated G from TI. No difference in gene expression was found in G between LIV and CAD kidneys, but nine genes (two represented twice) and three ESTs were more abundantly expressed in the CAD TI compartment than in LIV. The main biological function of these genes is counter-regulation of oxidative stress. Promoter analysis of significant features suggested coregulated gene groups. These data suggest that CAD kidneys exhibit a distinctly different set of transcripts in the TI compartment, but not in the G compartment, when compared with LIV kidneys.


Subject(s)
Gene Expression/physiology , Kidney Transplantation , Kidney/physiology , Oxidative Stress , Transplants , Cadaver , Gene Expression Profiling , Humans , Kidney Glomerulus/physiology , Living Donors , Oligonucleotide Array Sequence Analysis , Promoter Regions, Genetic
13.
Lab Invest ; 84(3): 353-61, 2004 Mar.
Article in English | MEDLINE | ID: mdl-14704720

ABSTRACT

Roughly 25% of recipients of cadaveric renal transplants, but rarely recipients of living donor kidneys, develop postischemic acute renal failure (ARF), which is a main risk factor for reduced long-term allograft survival. An accurate prediction of recipients at risk for ARF is not possible on the basis of donor kidney morphology or donor/recipient demographics. We determined the genome-wide gene-expression pattern using cDNA microarrays in three groups of 36 donor kidney wedge biopsies: living donor kidneys with primary function, cadaveric donor kidneys with primary function, and cadaveric donor kidneys with biopsy-proven acute renal failure. The descriptive genes were characterized in gene ontology terms to determine their functional role. The validation of the microarray experiments was performed by real-time PCR. We retrieved 132 genes after maxT adjustment for multiple testing that significantly separated living from cadaveric kidneys, and 48 genes that classified the donor kidneys according to their post-transplant course. The main functional roles of these genes are cell communication, apoptosis and inflammation. In particular, members of the complement cascade were activated in cadaveric, but not in living, donor kidneys. Thus, suppression of inflammation in the cadaveric donor might be a cheap and promising intervention for postischemic acute renal failure.


Subject(s)
Kidney Transplantation/physiology , Acute Kidney Injury/prevention & control , Adult , Aged , Biopsy , Cadaver , Gene Expression Profiling , Graft Survival/genetics , Humans , Inflammation/prevention & control , Living Donors , Middle Aged , Oligonucleotide Array Sequence Analysis , Tissue Donors
14.
Health Qual Life Outcomes ; 2: 2, 2004 Jan 08.
Article in English | MEDLINE | ID: mdl-14713316

ABSTRACT

With the improvements in short- and long-term graft and patient survival after renal transplantation over the last two decades, Health-Related Quality of Life (HRQL) is becoming an important additional outcome parameter. Global and disease-specific instruments are available to evaluate objective and subjective QOL. Among the most popular global tools is the SF-36; examples of disease-specific instruments are the Kidney Transplant Questionnaire (KTQ), the Kidney Disease Questionnaire (KDQ) and the Kidney Disease Quality of Life (KDQOL) instrument. It is generally accepted that HRQL improves dramatically after successful renal transplantation compared with patients maintained on dialysis treatment but listed for a transplant. It is less clear, however, which immunosuppressive regimen confers the best QOL. Only a few studies have compared the different regimens in terms of QOL outcomes. Although limited in number, these studies seem to favour non-cyclosporine-based protocols. The main differences that could be observed between patients on cyclosporine versus tacrolimus or sirolimus therapy concern the domains of appearance and fatigue. This may be explained by two common adverse effects of cyclosporine therapy, gingival hyperplasia and hair growth. Another more frequently occurring side effect of calcineurin inhibitor therapy is tremor, which may favour CNI-free protocols. This hypothesis, however, has not been formally evaluated in a randomised trial using HRQL measurements. In summary, HRQL is becoming more of an issue after renal transplantation. Whether a specific immunosuppressive protocol is superior to others in terms of HRQL remains to be determined.


Subject(s)
Kidney Failure, Chronic/surgery , Kidney Transplantation/immunology , Outcome Assessment, Health Care/methods , Quality of Life , Sickness Impact Profile , Humans , Immunosuppressive Agents/therapeutic use , Kidney Failure, Chronic/therapy , Psychometrics/instrumentation , Renal Dialysis , Sirolimus/therapeutic use , Surveys and Questionnaires , Tacrolimus/therapeutic use
15.
Kidney Int ; 65(1): 304-9, 2004 Jan.
Article in English | MEDLINE | ID: mdl-14675064

ABSTRACT

BACKGROUND: We recently showed that two doses of 4 mg of zoledronic acid (ZOL) ameliorated the bone loss and improved bone histology within the first six months after kidney transplantation. The aim of the present study was to evaluate whether this early short-term intervention exhibited a sustained bone-sparing effect. METHODS: A homogeneous group of 20 de novo renal transplant recipients was randomized in equal numbers to two infusions of 4 mg of ZOL or placebo at two weeks and three months after engraftment. Patients were followed up for three years by sequential determination of bone densitometry and specific biochemical markers. RESULTS: From month six to three years after transplantation, both treatment groups exhibited an improvement of bone mineralization. Femoral neck bone mineral density z-scores increased statistically significantly from -1.3 (2.6) to -0.2 (3.6) in the placebo group and from -1.6 (2.9) to -1.2 (1.9) in the ZOL group (median, range). Biochemical parameters of osteoblast activity such as osteocalcin and bone-specific alkaline phosphatase did not increase significantly in either group. Osteoprotegerin, a marker of osteoclast inhibition, was significantly elevated over the first six months in the ZOL group but decreased to levels similar to those in the placebo group over the next two and a half years. Other markers of osteoclast activity such as C-telopeptide of type 1 collagen, calcitonin, and intact parathyroid hormone were not different between six months and three years in either group. CONCLUSION: The early bone-sparing effect of short-term ZOL therapy confers no sustained benefit versus placebo at three years post-transplantation.


Subject(s)
Chronic Kidney Disease-Mineral and Bone Disorder/drug therapy , Chronic Kidney Disease-Mineral and Bone Disorder/metabolism , Diphosphonates/administration & dosage , Femur/metabolism , Imidazoles/administration & dosage , Kidney Transplantation , Acute Disease , Biomarkers , Bone Density/drug effects , Calcification, Physiologic/drug effects , Graft Rejection/metabolism , Humans , Osteoblasts/metabolism , Osteoclasts/metabolism , Zoledronic Acid
16.
Am J Kidney Dis ; 42(3): 539-45, 2003 Sep.
Article in English | MEDLINE | ID: mdl-12955682

ABSTRACT

BACKGROUND: Measurement of access blood flow is the preferred noninvasive screening test for hemodialysis arteriovenous (AV) fistula stenosis. However, the performance characteristics of the 2 most frequently used ultrasound techniques compared with fistulography remain elusive. METHODS: We evaluated 59 hemodialysis patients with native forearm AV fistulae who prospectively underwent all 3 measurements in the following order: the ultrasound dilution technique (UDT), color Doppler ultrasonography (CDUS), and fistulography. Patients with angiographically diagnosed access stenosis underwent angioplasty and were followed up with monthly UDT measurements for restenosis within the first 6 months. RESULTS: Both ultrasound techniques predicted access stenosis (P < 0.01). Performance, evaluated by receiver operating characteristic curves, was similar for the two techniques: areas under the curve were 0.79 (95% confidence interval [CI], 0.66 to 0.91) for UDT and 0.80 (95% CI, 0.65 to 0.94) for CDUS. The correlation between measured UDT and CDUS blood flow rates was 0.37 (Spearman's rho; P = 0.004). The calculated optimal cutoff value for the prediction of stenosis was 465 mL/min for UDT and 390 mL/min for CDUS. Access stenosis was diagnosed in 41 patients who subsequently underwent percutaneous transluminal angioplasty (PTA), which was successful in 34 patients. Restenosis occurred in 13 patients within the first 6 months after PTA. UDT access blood flow after PTA was significantly lower in these 13 patients than in the other 21 patients. CONCLUSION: Our data suggest that blood flow monitoring of AV hemodialysis access by ultrasound provides a reasonable prediction of access stenosis and restenosis.
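A minimal sketch of the kind of ROC analysis described, using Youden's J statistic to pick an "optimal" flow cutoff (an assumption, since the abstract does not state which criterion was used); the data and the scikit-learn usage are likewise illustrative.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Hypothetical data: measured access blood flow (mL/min) and angiographic stenosis (1 = yes).
rng = np.random.default_rng(4)
stenosis = rng.integers(0, 2, 59)
flow = np.where(stenosis == 1,
                rng.normal(400, 120, 59),   # lower flows in stenosed fistulae
                rng.normal(800, 200, 59))

# Lower flow indicates stenosis, so score with the negated flow.
score = -flow
auc = roc_auc_score(stenosis, score)
fpr, tpr, thresholds = roc_curve(stenosis, score)

# Youden's J: the threshold maximizing sensitivity + specificity - 1.
best = np.argmax(tpr - fpr)
cutoff_flow = -thresholds[best]             # convert back to a flow cutoff in mL/min
print(f"AUC = {auc:.2f}, optimal flow cutoff ≈ {cutoff_flow:.0f} mL/min")
```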


Subject(s)
Arteriovenous Shunt, Surgical , Forearm/blood supply , Hemorheology/methods , Indicator Dilution Techniques , Ultrasonography, Doppler, Color , Ultrasonography/methods , Vascular Patency , Adult , Aged , Angioplasty, Balloon , Arterial Occlusive Diseases/diagnostic imaging , Constriction, Pathologic/diagnostic imaging , Constriction, Pathologic/therapy , Female , Forearm/diagnostic imaging , Humans , Male , Middle Aged , Phlebography , Predictive Value of Tests , Prospective Studies , Single-Blind Method , Sodium Chloride , Veins/diagnostic imaging
17.
Transplantation ; 76(4): 715-20, 2003 Aug 27.
Article in English | MEDLINE | ID: mdl-12973116

ABSTRACT

BACKGROUND: About 30% of cadaveric renal allografts, but almost never living-donor kidneys, develop postischemic acute renal transplant failure (ARF). We therefore quantified the expression of essential reperfusion regulators in different compartments of cadaveric and living-donor kidney biopsies. METHODS: Specimens were obtained from donor kidneys at the end of the cold ischemia time before implantation and categorized into three groups according to donor source and early posttransplant function. Ten living-donor biopsies (LIV) were compared with nine cadaveric kidney biopsies (CAD) with primary posttransplant function (CAD-PF) and with nine with ARF (CAD-ARF). Laser capture microdissection was used to isolate glomeruli from the tubulointerstitium. The gene expression of intercellular adhesion molecule (ICAM)-1, interleukin (IL)-1beta, endothelin (ET)-1, inducible nitric oxide synthase (iNOS), and endothelial nitric oxide synthase (eNOS) was quantified in glomeruli and tubulointerstitium by real-time polymerase chain reaction (TaqMan). RESULTS: Tubulointerstitial areas of all CAD kidneys revealed significantly lower mRNA levels of all investigated genes compared with LIV. Tubulointerstitial ET-1, iNOS, and eNOS expression in CAD-ARF averaged only half of that in CAD-PF kidneys. ICAM-1 and IL-1beta mRNA concentrations were equal in CAD-PF and CAD-ARF. Glomerular expression of the investigated genes was equal in CAD and LIV kidneys, with the exception of ICAM-1 and ET-1, which were two times higher in CAD-PF than in LIV and CAD-ARF. CONCLUSION: These data suggest that CAD kidneys, compared with LIV kidneys, have an impaired expression of immune and vasoregulatory genes in the tubulointerstitium, which may represent reduced cellular vitality and capacity for adaptation. The observed further reduction of ET-1, iNOS, and eNOS expression in CAD-ARF might contribute to reperfusion injury and delayed allograft function.


Subject(s)
Acute Kidney Injury/metabolism , Endothelin-1/genetics , Ischemia/metabolism , Kidney/blood supply , Nitric Oxide Synthase/genetics , Tissue Donors , Adult , Biopsy , Gene Expression , Humans , Intercellular Adhesion Molecule-1/genetics , Interleukin-1/genetics , Kidney/metabolism , Middle Aged , Nitric Oxide Synthase Type II , Nitric Oxide Synthase Type III , RNA, Messenger/analysis , Transplantation, Homologous
18.
Kidney Int ; 63(3): 1130-6, 2003 Mar.
Article in English | MEDLINE | ID: mdl-12631097

ABSTRACT

BACKGROUND: Bisphosphonates can prevent bone mineral density loss after renal transplantation, but their effect on trabecular mineralization and bone morphology, two key factors of bone stability, remains unknown. METHODS: In a 6-month, randomized, placebo-controlled study, 20 kidney transplant recipients received either 4 mg zoledronic acid or placebo twice within 3 months after engraftment. At transplantation and after 6 months, mean trabecular calcium concentration and trabecular morphometry were measured in bone biopsies. Bone mineral density (BMD) of the femoral neck and the lumbar spine were evaluated by dual-energy x-ray absorptiometry, and serum biochemical markers of bone metabolism were determined monthly. RESULTS: Trabecular calcium content increased significantly in the zoledronic acid group, but remained unchanged in the placebo group. BMD at femoral neck showed no change in the zoledronic acid group, but decreased in the placebo group. BMD of the lumbar spine was increased in the zoledronic acid group without change in the placebo group. High-turnover bone disease resolved similarly in both groups, as evidenced by a significant decrease of eroded bone surface, osteoclast and osteoblast surface. Serologic markers of bone formation and resorption were significantly lower in zoledronic acid-treated patients throughout the study. Kidney transplant function was stable after zoledronic acid therapy. CONCLUSIONS: Our results show that administration of zoledronic acid improves the calcium content of cancellous bone after kidney transplantation. The beneficial effect of bisphosphonate therapy is further evidenced by an increase of lumbar spine BMD, and stabilization of femur BMD.


Subject(s)
Chronic Kidney Disease-Mineral and Bone Disorder/drug therapy , Chronic Kidney Disease-Mineral and Bone Disorder/prevention & control , Diphosphonates/therapeutic use , Imidazoles/therapeutic use , Kidney Transplantation , Biomarkers , Bone Density/drug effects , Calcium/metabolism , Female , Femur Neck/metabolism , Humans , Lumbar Vertebrae/metabolism , Male , Middle Aged , Postoperative Complications/drug therapy , Postoperative Complications/prevention & control , Zoledronic Acid