Results 1 - 20 of 24
1.
Am J Transplant ; 16(10): 2943-2953, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27088545

ABSTRACT

The indication for antimicrobial treatment of asymptomatic bacteriuria (AB) after kidney transplantation (KT) remains controversial. Between January 2011 and December 2013, 112 KT recipients who developed one or more episodes of AB beyond the second month after transplantation were included in this open-label trial. Participants were randomized (1:1 ratio) to the treatment group (systematic antimicrobial therapy for all episodes of AB occurring ≤24 mo after transplantation [53 patients]) or the control group (no antimicrobial therapy [59 patients]). Systematic screening for AB was performed similarly in both groups. The primary outcome was the occurrence of acute pyelonephritis at 24-mo follow-up. Secondary outcomes included lower urinary tract infection, acute rejection, Clostridium difficile infection, colonization or infection by multidrug-resistant bacteria, graft function and all-cause mortality. There were no differences in the primary outcome in the intention-to-treat population (7.5% [4 of 53] in the treatment group vs. 8.4% [5 of 59] in the control group; odds ratio [OR] 0.88, 95% confidence interval [CI] 0.22-3.47) or the per-protocol population (3.8% [1 of 26] in the treatment group vs. 8.0% [4 of 50] in the control group; OR 0.46, 95% CI 0.05-4.34). Moreover, we found no differences in any of the secondary outcomes. In conclusion, systematic screening and treatment of AB beyond the second month after transplantation provided no apparent benefit among KT recipients (NCT02373085).
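The headline intention-to-treat estimate can be checked directly from the counts in the abstract. A minimal plain-Python sketch, assuming the reported figure is the unadjusted odds ratio with a Wald 95% confidence interval (which reproduces the published 0.88, 0.22-3.47):

```python
from math import exp, log, sqrt

# 2x2 table from the abstract (intention-to-treat population):
# acute pyelonephritis events / group size
a, n1 = 4, 53   # treatment group
b, n2 = 5, 59   # control group
c, d = n1 - a, n2 - b  # non-events in each group

or_ = (a * d) / (c * b)            # unadjusted odds ratio
se = sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR), Wald method
lo, hi = (exp(log(or_) + z * se) for z in (-1.96, 1.96))

print(f"OR {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # OR 0.88, 95% CI 0.22-3.47
```

The wide interval straddling 1.0 is what supports the "no differences in the primary outcome" conclusion.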


Subject(s)
Anti-Bacterial Agents/therapeutic use , Bacteriuria/drug therapy , Graft Rejection/drug therapy , Kidney Failure, Chronic/surgery , Kidney Transplantation/adverse effects , Pyelonephritis/prevention & control , Bacteria/drug effects , Bacteriuria/complications , Bacteriuria/microbiology , Cohort Studies , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Rejection/etiology , Graft Survival/drug effects , Humans , Kidney Function Tests , Male , Middle Aged , Prognosis , Pyelonephritis/etiology , Risk Factors
2.
Transplant Proc ; 47(1): 42-4, 2015.
Article in English | MEDLINE | ID: mdl-25645766

ABSTRACT

BACKGROUND: Different strategies have been initiated to shorten the waiting time to receive a kidney transplant. Donors with acute kidney injury (AKI) may be a new option. METHODS: Fifty-nine patients received a kidney transplant from an AKI donor, defined as having serum creatinine >2 mg/dL at the time of organ procurement. They were compared with a transplant group with normal kidney function, defined as creatinine <1.5 mg/dL at the time of organ procurement in the same time period, matched by donor and recipient age (control group). Outcomes were evaluated initially, at 1 year, and at the end of follow-up. RESULTS: The AKI donor group had a greater rate of delayed graft function (68% versus 36%, P < .01). Graft and recipient survival were similar in both groups at 1 year (92% versus 88%, P = NS; 97% versus 98%, P = NS) and at the end of follow-up (66% versus 66%, P = NS; 90% versus 88%, P = NS). Serum creatinine at 1 year and at the end of follow-up did not show any differences (1.4 ± 0.5 versus 1.4 ± 0.7 mg/dL, P = NS; 1.4 ± 0.5 versus 1.6 ± 0.9 mg/dL, P = NS). CONCLUSIONS: Transplants from donors with AKI showed a greater incidence of delayed graft function, but this did not affect the short- or long-term prognosis of the graft or recipient. This type of donor may be a source of acceptable kidneys.


Subject(s)
Acute Kidney Injury , Delayed Graft Function/epidemiology , Kidney Failure, Chronic/surgery , Kidney Transplantation , Tissue and Organ Procurement , Adult , Aged , Cadaver , Creatinine/blood , Delayed Graft Function/diagnosis , Delayed Graft Function/physiopathology , Female , Graft Survival , Humans , Incidence , Kidney/physiopathology , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/physiopathology , Male , Middle Aged , Retrospective Studies , Treatment Outcome
3.
Transplant Proc ; 47(1): 57-61, 2015.
Article in English | MEDLINE | ID: mdl-25645770

ABSTRACT

BACKGROUND: Mammalian target of rapamycin inhibitors (mTOR-i) have been proposed as possible immunosuppressants of choice in BK virus nephropathy (BKN) because of their antiviral capacity. On this basis, in 2007 our department adopted a protocol of conversion from calcineurin inhibitors to everolimus (EVE)-based, calcineurin inhibitor-free therapy in patients diagnosed with BKN. METHODS: A prospective, single-center case series study was performed. Fifteen cases of BKN were diagnosed from 2007 to the end of 2010. According to our protocol, immunosuppressant treatment was modified in 9 of these patients, with suspension of mycophenolate and conversion from tacrolimus to EVE. RESULTS: The renal function achieved by our patients after transplantation was excellent; mean serum creatinine (sCr) was 1.16 ± 0.2 mg/dL. Evolution of renal function after BKN diagnosis and conversion to mTOR-i was favorable in all patients: sCr at diagnosis was 1.85 ± 0.22 mg/dL, sCr at the time of conversion to EVE was 2 ± 0.21 mg/dL, and final sCr at the end of follow-up was 1.6 ± 0.39 mg/dL (P = .05). BK viremia became negative in 5 of our patients and decreased by more than 95% in the remaining 4. None of the patients had an acute rejection episode after the change of immunosuppressant. CONCLUSIONS: Conversion to mTOR-i-based therapy could provide an added benefit in BKN and could be an effective strategy to decrease viremia and increase graft survival in selected patients.


Subject(s)
BK Virus , Immunosuppressive Agents/therapeutic use , Kidney Diseases/therapy , Kidney Transplantation , Polyomavirus Infections/prevention & control , Sirolimus/analogs & derivatives , Adult , Calcineurin Inhibitors , Everolimus , Female , Graft Survival , Humans , Immunosuppression Therapy , Kidney Diseases/diagnosis , Kidney Diseases/epidemiology , Male , Middle Aged , Polyomavirus Infections/diagnosis , Polyomavirus Infections/epidemiology , Prospective Studies , Sirolimus/therapeutic use , Tacrolimus/therapeutic use , Viral Load , Viremia/diagnosis , Viremia/etiology , Viremia/prevention & control
4.
Transplant Proc ; 47(1): 70-2, 2015.
Article in English | MEDLINE | ID: mdl-25645773

ABSTRACT

INTRODUCTION: A significant number of patients with chronic kidney disease (CKD) have cardiac abnormalities, and left ventricular systolic dysfunction (LVSD) is a common manifestation. Our hypothesis is that a decreased left ventricular ejection fraction (LVEF) at the time of kidney transplantation is a poor prognostic factor associated with poor graft evolution. METHODS AND RESULTS: A total of 954 kidney transplantations were performed in our center between 2005 and 2012. Nineteen (2%) of these patients had been diagnosed with left ventricular dysfunction, defined by the presence of LVEF <50% on echocardiography. This group of patients was compared with a control group of recipients without LVSD who had received the contralateral kidney from the same donor. During a mean follow-up of 52 ± 14 months, patients with LVSD had a higher incidence of delayed graft function (DGF) as well as a significantly longer renal function recovery period before becoming dialysis free than the control group (19.8 [range, 0-90] vs 12 [range, 0-36] days; P = .01). Furthermore, graft function in the LVSD group was worse during follow-up (serum creatinine 2.3 ± 1.9 vs 1.4 ± 0.5 mg/dL; P = .01). Patients with LVSD showed worse kidney graft survival at the end of follow-up compared with the control group (79% vs 100%; P = .03). CONCLUSIONS: Systolic dysfunction in the renal transplant recipient is associated with delayed graft function, poorer renal function, and worse graft survival.


Subject(s)
Delayed Graft Function/epidemiology , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/surgery , Kidney Transplantation , Ventricular Dysfunction, Left/complications , Adult , Aged , Case-Control Studies , Delayed Graft Function/diagnosis , Delayed Graft Function/therapy , Donor Selection , Female , Graft Survival , Humans , Incidence , Kidney/physiopathology , Kidney Failure, Chronic/mortality , Male , Middle Aged , Prognosis , Renal Dialysis , Risk Factors , Stroke Volume , Ventricular Dysfunction, Left/diagnosis , Ventricular Dysfunction, Left/physiopathology
5.
An Sist Sanit Navar ; 37(2): 189-201, 2014.
Article in Spanish | MEDLINE | ID: mdl-25189977

ABSTRACT

BACKGROUND: In 2010 the Basque Government launched its "Strategy for tackling the challenge of chronicity in the Basque Country", in order to transform healthcare organisations into more integrated care models, with the aim of improving quality and efficiency in chronicity management. Four Integrated Healthcare Organisations (IHOs) were created to unify primary and specialised care into one single organisation. The aim of this study is to measure the degree of readiness of these IHOs to cope with chronicity, using the Chronic Care Model (CCM) as a reference. MATERIAL AND METHODS: Self-assessment processes using ARCHO (Assessment of Readiness for Chronicity in Health Care Organisations) were carried out in the four IHOs by their management teams. RESULTS: The average score was 16 out of 100 points, which signals that healthcare organisations are undertaking action plans in the realm of integrated care, but with limited deployment and without a systematic process for evaluating outcomes. The dimension that ranked best was "Organization of the health system", while "Community health" ranked lowest. CONCLUSIONS: IHOs are key to achieving integrated care for chronic illnesses. Integration of community resources and evaluation of results are two of the fields that need to be improved in order to deploy the set of interventions proposed in ARCHO. The organisational changes involved in the CCM require periods longer than two years.


Subject(s)
Chronic Disease/therapy , Delivery of Health Care, Integrated/standards , Quality Assurance, Health Care/organization & administration , Humans , Spain
6.
An. sist. sanit. Navar ; 37(2): 198-201, May-Aug. 2014. tab, graf, ilus
Article in Spanish | IBECS | ID: ibc-128696

ABSTRACT

Background: In 2010 the Basque Government launched its "Strategy for tackling the challenge of chronicity in the Basque Country", in order to transform healthcare organisations into more integrated care models, with the aim of improving quality and efficiency in chronicity management. Four Integrated Healthcare Organisations (IHOs) were created to unify primary and specialised care into one single organisation. The aim of this study is to measure the degree of readiness of these IHOs to cope with chronicity, using the Chronic Care Model (CCM) as a reference. Material and methods: Self-assessment processes using ARCHO (Assessment of Readiness for Chronicity in Health Care Organisations) were carried out in the four IHOs by their management teams. Results: The average score was 16 out of 100 points, which signals that healthcare organisations are undertaking action plans in the realm of integrated care, but with limited deployment and without a systematic process for evaluating outcomes. The dimension that ranked best was "Organization of the health system", while "Community health" ranked lowest. Conclusions: IHOs are key to achieving integrated care for chronic illnesses. Integration of community resources and evaluation of results are two of the fields that need to be improved in order to deploy the set of interventions proposed in ARCHO. The organisational changes involved in the CCM require periods longer than two years.


Subject(s)
Humans , Male , Female , Delivery of Health Care/organization & administration , Delivery of Health Care/standards , /standards , Self-Assessment , Quality of Health Care/standards , Quality of Health Care , Primary Health Care/organization & administration , Primary Health Care/standards , Delivery of Health Care, Integrated/organization & administration , Delivery of Health Care, Integrated/standards , Delivery of Health Care, Integrated
7.
Transpl Infect Dis ; 15(6): 600-11, 2013 Dec.
Article in English | MEDLINE | ID: mdl-24011120

ABSTRACT

BACKGROUND: The impact of iron metabolism on the risk of infectious complications has been demonstrated in various immunosuppressed populations. However, no previous studies have assessed this potential association in kidney transplant (KT) recipients. METHODS: We prospectively analyzed 228 patients undergoing KT at our institution from November 2008 to February 2011. Serum iron parameters (iron level, ferritin, total iron-binding capacity, unsaturated iron-binding capacity, transferrin, and transferrin saturation) were assessed within the first 2 weeks after transplantation (median interval, 3 days; interquartile [Q1-Q3] range, 1-6 days), and before the occurrence of the first infectious episode (median interval, 26 days; Q1-Q3 range, 11-76 days). The primary outcome was the occurrence of any episode of infection during the first year. Multivariate-adjusted hazard ratios (aHRs) were estimated by Cox regression models. RESULTS: Patients with ferritin level ≥ 500 ng/mL had higher incidence rates (per 1000 transplant-days) of overall infection (P = 0.017), bacterial infection (P = 0.002), and bloodstream infection (P = 0.011) during the first post-transplant year. The one-year infection-free survival rate was lower in these recipients (26% vs. 41%; P = 0.004). On multivariate analysis, after adjusting for potential confounders, ferritin emerged as an independent predictor of overall infection (aHR [per unitary increment], 1.001; P = 0.006) and bacterial infection (aHR [per unitary increment], 1.001; P = 0.020). CONCLUSION: Monitoring of serum iron parameters in the early post-transplant period may be useful in predicting the occurrence of infection in KT recipients, although further studies should be carried out to confirm this preliminary finding.


Subject(s)
Bacterial Infections/epidemiology , Ferritins/blood , Iron/blood , Kidney Transplantation/adverse effects , Sepsis/epidemiology , Transferrin/metabolism , Adult , Aged , Bacterial Infections/blood , Disease-Free Survival , Female , Follow-Up Studies , Humans , Incidence , Male , Middle Aged , Postoperative Period , Predictive Value of Tests , Prospective Studies , Risk Factors , Sepsis/blood
8.
Am J Transplant ; 13(3): 685-94, 2013 Mar.
Article in English | MEDLINE | ID: mdl-23311502

ABSTRACT

The usefulness of monitoring complement levels in predicting the occurrence of infection in kidney transplant (KT) recipients remains largely unknown. We prospectively assessed serum complement levels (C3 and C4) at baseline and at months 1 and 6 in 270 patients undergoing KT. Adjusted hazard ratios (aHRs) for infection in each posttransplant period were estimated by Cox regression. The prevalence of C3 hypocomplementemia progressively decreased from 21.5% at baseline to 11.6% at month 6 (p = 0.017), whereas the prevalence of C4 hypocomplementemia rose from 3.7% at baseline to 9.2% at month 1 (p = 0.004). Patients with C3 hypocomplementemia at month 1 had higher incidences of overall (p = 0.002), bacterial (p = 0.004) and fungal infection (p = 0.019) in the intermediate period (months 1-6). On multivariate analysis, C3 hypocomplementemia at month 1 emerged as a risk factor for overall (aHR 1.911; p = 0.009) and bacterial infection (aHR 2.130; p = 0.014) during the intermediate period, whereas C3 hypocomplementemia at month 6 predicted the occurrence of bacterial infection (aHR 3.347; p = 0.039) in the late period (>6 months). A simple monitoring strategy of serum C3 levels predicts the risk of posttransplant infectious complications in KT recipients.


Subject(s)
Complement C3/deficiency , Graft Rejection/etiology , Infections/etiology , Kidney Diseases/complications , Kidney Transplantation/adverse effects , Postoperative Complications , Female , Graft Rejection/mortality , Graft Survival , Humans , Infections/mortality , Kidney Diseases/surgery , Male , Middle Aged , Prognosis , Prospective Studies , Risk Factors , Survival Rate , Tertiary Care Centers
9.
Transplant Proc ; 42(8): 2837-8, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20970544

ABSTRACT

BACKGROUND: Living kidney donor transplantation, a treatment option for end-stage kidney failure, may achieve better results than cadaveric donor transplantation. Although its significant use in some countries is due to the scarcity of cadaveric donors, it is also useful because it reduces waiting time for young recipients and avoids dialysis when performed before starting renal replacement therapy. Due to the high rate of cadaveric donation in Spain, there has only been a limited increase in the number of living donor kidney transplantations. METHODS: In February 2004, we initiated a program to promote living kidney donation (LKD) through an information plan that was transmitted to the patients by dialysis nephrologists and chronic kidney failure outpatient clinics. RESULTS: From February 2004 to March 2010, we evaluated 109 donor and recipient pairs: parent to child (n=48 cases; 44%), spouses (n=32 cases; 29.3%), siblings (n=27; 24.7%), and uncle and nephew (n=2; 1.8%). The mean donor age (49±9 years) was significantly higher than that of the recipients (39±13 years) (P<.01). In 45 cases (41.3%), the procedure led to a living kidney donor transplantation, but in 58 (53.2%), a transplantation was not performed due to recipient problems (n=53) or donor problems (n=5). In 6 cases (5.5%), the evaluation is still pending. With the initiation of this project, it has been possible to significantly increase the rate of living kidney donor transplantation in our hospital from 0.8% (March to January 2004: 16/1964) to 4.2% (February 2004 to March 2010: 43/1022 transplants; P<.01). CONCLUSION: A policy of active information, together with adequate studies of the potential donor and recipient, significantly increased the number of living kidney donor transplantations. The yield of the evaluation procedure was 50%. The most frequent cause of noncompletion of the procedure was recipient-related problems.
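The reported jump from 0.8% to 4.2% follows from the counts given (16 of 1964 transplants before the program vs. 43 of 1022 after), and the P < .01 claim can be sanity-checked with a chi-square test on the two periods. A sketch assuming SciPy is available; the abstract does not state which test the authors used:

```python
from scipy.stats import chi2_contingency

# living-donor vs. other transplants in the two periods (counts from the abstract)
before = (16, 1964)   # LKD transplants, total transplants before February 2004
after = (43, 1022)    # LKD transplants, total transplants Feb 2004 - Mar 2010

table = [[before[0], before[1] - before[0]],
         [after[0], after[1] - after[0]]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"{before[0]/before[1]:.1%} -> {after[0]/after[1]:.1%}, p = {p:.2g}")
```

The rates reproduce the abstract's 0.8% and 4.2%, and the p-value is comfortably below .01.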


Subject(s)
Kidney Transplantation , Living Donors , Adult , Child , Family , Humans , Middle Aged
10.
Transplant Proc ; 42(8): 2899-901, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20970564

ABSTRACT

Experimental and clinical data strongly suggest that aldosterone may contribute to proteinuria and progressive renal disease. In fact, aldosterone antagonists seem to be effective for controlling proteinuria in native kidneys. However, there is little information about this approach in renal transplant patients, a population in whom the presence and amount of proteinuria represent risk factors for graft loss, cardiovascular disease, and death. The aim of our study was to evaluate whether addition of an aldosterone antagonist, spironolactone, provided an additional antiproteinuric effect to angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin II type 1 receptor blockers (ARBs). We evaluated the effects on severe proteinuria (4.4±1.4 g/d) at 6 months after prescription of spironolactone (25 mg/d) in 11 renal transplant patients with serum creatinine values less than 3 mg/dL who were under treatment with an ACEI plus an ARB. Patients were examined in the renal transplant outpatient clinic every week for the first month and twice a month thereafter. Nine patients showed a more than 50% (mean, 81.5%) reduction in proteinuria that was not only early but also sustained at 6 months (4.4±1.4 to 2.3±1.1 g/d), with a mild, nonsignificant deterioration in renal function (serum creatinine 1.6±0.32 to 1.7±0.54 mg/dL). This study showed that spironolactone decreased severe proteinuria in patients treated with an ACEI plus an ARB. This therapy is not recommended for patients with glomerular filtration rates below 40 mL/min. Therefore, triple blockade of the renin-angiotensin system (RAS) appears feasible for reducing proteinuria in selected renal transplant patients, although caution is required to avoid severe hyperkalemia.


Subject(s)
Angiotensin II Type 1 Receptor Blockers/therapeutic use , Angiotensin-Converting Enzyme Inhibitors/therapeutic use , Kidney Transplantation , Mineralocorticoid Receptor Antagonists/therapeutic use , Proteinuria/prevention & control , Renin-Angiotensin System/drug effects , Spironolactone/therapeutic use , Adult , Angiotensin II Type 1 Receptor Blockers/administration & dosage , Angiotensin II Type 1 Receptor Blockers/pharmacology , Angiotensin-Converting Enzyme Inhibitors/administration & dosage , Angiotensin-Converting Enzyme Inhibitors/pharmacology , Combined Modality Therapy , Female , Follow-Up Studies , Humans , Male , Middle Aged , Mineralocorticoid Receptor Antagonists/administration & dosage , Mineralocorticoid Receptor Antagonists/pharmacology , Pilot Projects , Spironolactone/administration & dosage , Spironolactone/pharmacology
11.
Transplant Proc ; 42(8): 3034-7, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20970602

ABSTRACT

BACKGROUND: Available data for extended-release tacrolimus (Tac) outside clinical trials are limited. OBJECTIVE: To describe our initial experience with once-daily Tac in combination with corticosteroids and mycophenolate mofetil in patients undergoing de novo renal transplantation. PATIENTS AND METHODS: In this retrospective, observational, single-center study, data were obtained for 49 adult recipients treated with extended-release Tac and 30 patients treated with standard-release Tac (control group). Mean (SD) follow-up in the 2 groups was 3.5 (2.5) months and 4.0 (2.6) months, respectively. Baseline characteristics were comparable between the groups. RESULTS: The acute rejection rate was 10% in the extended-release group and 13% in the standard-release group. Patient and graft survival rates were 98% and 96% vs 100% and 90%, respectively. Renal function in the 2 groups was comparable: serum creatinine concentration 1.3 (0.2) mg/dL vs 1.45 (0.4) mg/dL. At day 14 posttransplantation, Tac doses were 0.17 mg/kg/d vs 0.14 mg/kg/d, and blood concentrations were 9.0 ng/mL vs 14.0 ng/mL. In recipients older than 60 years, lower dosages of Tac resulted in blood concentrations similar to those in younger patients, with less variation in dosage. CONCLUSIONS: Short-term experience with extended-release Tac in de novo renal recipients confirms its efficacy and safety. Adjusting blood concentrations in the immediate posttransplantation period is less difficult with extended-release Tac than with the twice-daily formulation.


Subject(s)
Immunosuppressive Agents/administration & dosage , Kidney Transplantation , Tacrolimus/administration & dosage , Adrenal Cortex Hormones/administration & dosage , Adrenal Cortex Hormones/therapeutic use , Delayed-Action Preparations , Graft Survival , Humans , Immunosuppressive Agents/therapeutic use , Mycophenolic Acid/administration & dosage , Mycophenolic Acid/analogs & derivatives , Mycophenolic Acid/therapeutic use , Retrospective Studies , Survival Analysis , Tacrolimus/therapeutic use
12.
Transplant Proc ; 41(6): 2304-5, 2009.
Article in English | MEDLINE | ID: mdl-19715903

ABSTRACT

INTRODUCTION: Family refusal is an important factor that limits the number of organ donors. Cultural and religious factors, as well as the perception of brain death, are the principal reasons for these refusals. We examined whether the type of potential donor, brain-dead or non-heart-beating, had an influence on family refusal. In July 2005, we initiated a program of non-heart-beating donors who had died in the street or at home. MATERIALS AND METHODS: We compared family refusals among these potential donors with those among potential brain-dead donors from July 2005 to October 2008. RESULTS: The mean length of hospital stay was significantly greater for brain-dead donors than for non-heart-beating donors: 4 +/- 2 versus 0.23 +/- 0.01 days (P < .01). The rate of family refusal was significantly greater among the families of potential brain-dead donors (24%; 24/99) than among those of non-heart-beating donors (4%; 2/47; P < .01). Donor age was similar in both groups. CONCLUSION: The rate of family refusal among potential non-heart-beating donors was significantly lower than among families of brain-dead individuals. Greater understanding of death because the heart is not beating, less time of uncertainty about death, and a shorter hospital stay could explain this difference.
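The refusal-rate comparison can be reproduced from the raw counts (24 refusals among 99 brain-dead donor families vs. 2 among 47 non-heart-beating donor families). A quick check with Fisher's exact test, assuming SciPy and a two-sided comparison; the abstract does not state which test the authors applied:

```python
from scipy.stats import fisher_exact

# family refusals vs. consents, counts from the abstract
#        refused  consented
table = [[24, 99 - 24],   # potential brain-dead donors
         [2,  47 - 2]]    # potential non-heart-beating donors

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"refusal rates: {24/99:.0%} vs {2/47:.0%}, p = {p_value:.4f}")
```

Fisher's exact test is the natural choice here given the small cell count (2 refusals) in the non-heart-beating group, and it confirms P < .01.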


Subject(s)
Brain Death , Family , Refusal to Treat/statistics & numerical data , Tissue Donors/statistics & numerical data , Tissue and Organ Harvesting/statistics & numerical data , Adult , Attitude to Death , Attitude to Health , Female , Heart Rate , Humans , Interviews as Topic , Male , Middle Aged , Spain , Young Adult
13.
Transplant Proc ; 41(6): 2332-3, 2009.
Article in English | MEDLINE | ID: mdl-19715911

ABSTRACT

Renal transplantation provides the best quality of life for patients with chronic end-stage renal failure. However, the immunosuppression necessary for graft survival may give rise to infectious complications and an increased risk of cardiovascular and neoplastic diseases, all of which can shorten the patient's survival. The objective of this study was to evaluate the efficacy and safety of the proliferation signal inhibitor everolimus in patients who develop neoplasms after renal transplantation. This retrospective study included 25 patients (mean age 56.5 +/- 14.1 years) who were diagnosed with posttransplant neoplastic disease while immunosuppressed with calcineurin inhibitors (CNIs). Treatment was initiated with everolimus, with or without CNIs. During the follow-up, renal function (initial serum creatinine 1.4 mg/dL vs final serum creatinine 1.3 mg/dL) and proteinuria levels (initial 0.3 g/d vs final 0.4 g/d) remained stable. Few patients had a relapse of their tumor: one patient had a relapse of bladder cancer with tumor progression at 3 years, and another patient with melanoma developed lymph node invasion. There were neither acute rejection episodes nor cardiovascular complications. These results suggest that immunosuppression with everolimus, combined with low doses of CNIs or as single-drug therapy, is safe for patients who develop posttransplant malignant diseases, with a low rate of tumor relapse.


Subject(s)
Immunosuppressive Agents/therapeutic use , Kidney Transplantation/immunology , Neoplasms/complications , Postoperative Complications , Sirolimus/analogs & derivatives , Adult , Aged , Creatinine/blood , Cyclosporine/therapeutic use , Everolimus , Female , Humans , Kidney Transplantation/mortality , Male , Middle Aged , Mycophenolic Acid/analogs & derivatives , Mycophenolic Acid/therapeutic use , Neoplasms/pathology , Proteinuria , Recurrence , Retrospective Studies , Sirolimus/therapeutic use , Survival Rate , Tacrolimus/therapeutic use
14.
Transplant Proc ; 41(6): 2379-81, 2009.
Article in English | MEDLINE | ID: mdl-19715925

ABSTRACT

Although deceased donors older than 60 years of age (D > 60) are increasing in number, little information exists on the rate of discarded kidneys from these aged individuals. This study sought to analyze the causes of discard of kidneys from D > 60. Since 1997, we have transplanted kidneys from D > 60 into elderly recipients after assessing their functional and anatomical viability. Among 3444 renal offers for transplantation between 1997 and 2005, 1967 (57%) came from D > 60. Of these, 1145 offers were discarded, either because the donor was not adequate (n = 470) or because there was no elderly recipient on our waiting list (n = 675). We also examined 1745 kidneys, 822 (47%) of which came from D > 60. The percentage of kidneys discarded due to macroscopic or microscopic alterations was 46% in the D > 60 group compared with 14.7% in the group of donors younger than 60 years of age (D < 60; P < .01). We transplanted 443 kidneys from D > 60 (85 dual, 273 single) to 358 recipients of matching age and 900 kidneys from D < 60. The three-year death-censored actuarial graft survival rate was 83% for D > 60 compared with 89% for D < 60 transplants (P = not significant). In conclusion, kidneys from D > 60 were discarded mainly because there was no elderly recipient on the waiting list or because of macroscopic or microscopic alterations. Given the increasing offer of kidneys from D > 60 and the good results of transplantation with these aged kidneys in elderly recipients, the indications for kidney transplantation should be expanded to add more of the elderly dialysis population to the waiting list.


Subject(s)
Cadaver , Kidney Transplantation/statistics & numerical data , Patient Selection , Tissue Donors/statistics & numerical data , Aged , Graft Survival/physiology , Humans , Kidney/pathology , Kidney Transplantation/standards , Middle Aged , Retrospective Studies , Waiting Lists
15.
Nefrologia ; 29(3): 208-13, 2009.
Article in Spanish | MEDLINE | ID: mdl-19554053

ABSTRACT

INTRODUCTION: Fibrates are one of the drug classes used to treat patients with hyperlipemia. Deterioration in renal function is a little-known adverse effect of fibric acid derivatives. In the last 26 months we detected thirteen patients with acute renal failure associated with fibrates in our outpatient clinic. SUBJECTS AND METHODS: The aim of our study was to analyze our experience with deterioration in renal function associated with fibrate use. This is a retrospective chart review. RESULTS: Of the thirteen patients (8 males/5 females), with a mean age of 65.5 +/- 12.2 years, ten received fenofibrate (FN), one bezafibrate (BZ), and two gemfibrozil (GF). Six cases had previously normal renal function and the remaining seven had mild chronic renal failure (CRF). The increase in serum creatinine (sCr) was higher than 74%. Acute renal failure was reversible in 9 patients (group 1), but the other 4 did not recover their previous renal function (group 2). Mean sCr before fibrate treatment was 1.33 +/- 0.36 mg/dL (creatinine clearance 63.2 +/- 26.6 mL/min) and the highest mean sCr during treatment was 2.22 +/- 0.49 mg/dL (creatinine clearance 37.3 +/- 11.9 mL/min). The mean time to diagnosis of acute renal failure was 6.7 +/- 5.8 months, and recovery of renal function was delayed an average of 3.8 +/- 3.5 months after fibrate withdrawal. Group 2 patients had a higher sCr and a longer time on fibrates than group 1 patients. CPK values were normal in all cases. Renal biopsy was performed in two patients, and no significant lesions were detected. CONCLUSION: Fibrate treatment can induce acute renal failure. Four patients (30.8%) did not recover their baseline renal function. When fibrate treatment is started, renal function should be monitored, especially in patients with CRF.


Subject(s)
Acute Kidney Injury/chemically induced , Clofibric Acid/adverse effects , Hypolipidemic Agents/adverse effects , Kidney/drug effects , Kidney/physiopathology , Adult , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Retrospective Studies
16.
Nefrología (Madr.) ; 29(3): 208-213, May-Jun. 2009. illus, tab
Article in Spanish | IBECS | ID: ibc-104389

ABSTRACT

Introduction: Fibrates are one of the drug classes indicated for the treatment of hyperlipidemia. One of their still little-known adverse effects is acute deterioration of renal function. Over the last 26 months we have identified a total of 13 patients with deterioration of renal function associated with fibrate use in our Nephrology outpatient clinic. Material and methods: The aim of our study is to evaluate our experience with fibrate-induced increases in serum creatinine (Crs). This is a retrospective case-series review. Results: Of the 13 patients (8 men/5 women), with a mean age of 65.5 ± 12.2 years, ten were treated with fenofibrate, one with bezafibrate and two with gemfibrozil. Six patients had normal baseline renal function and the other seven had mild-to-moderate chronic renal failure (CRF) before starting treatment. The increase in creatinine over baseline, expressed as a percentage, exceeded 74%. In nine patients the deterioration of renal function was completely reversible (group 1), while in four of them recovery was partial (group 2). Mean creatinine before fibrate treatment was 1.33 ± 0.36 mg/dl (creatinine clearance 63.2 ± 26.6 ml/min) and mean peak creatinine during treatment was 2.22 ± 0.49 mg/dl (creatinine clearance 33.4 ± 8.1 ml/min). The mean time until the creatinine increase was detected was 6.7 ± 5.8 months, and renal function recovered 3.8 ± 3.5 months after fibrate withdrawal. Group 2 patients showed a greater increase in Crs and a longer duration of fibrate treatment. In the patients in whom CPK levels were obtained, these were normal. In two of our (..) (AU)


Introduction: Fibrates represent one of the medications used to treat patients with hyperlipemia. Deterioration in renal function is not a very known adverse effect of fibric acid derivates. In the last 26 months we have detected thirteen patients with acute renal failure associated to fibrates in our outpatients' clinic. Subjects and methods: The aim of our study is to analyze our experience in deterioration in renal function associated to fibrates use. This is a retrospective charts review. Results: From the thirteen patients (8 males/5 females) with mean age of 65.5 ± 12.2 years, ten received Fenofibrate (FN), one Bezafibrate (BZ) and two Gemfibrozil (GF). Six cases had previously normal renal function and the seven remaining had mild chronic renal failure (CRF). The increase of serum Creatinine (Crs) value was higher than 74%. (..) (AU)


Subject(s)
Humans , Fibric Acids/adverse effects , Renal Insufficiency/chemically induced , Hypolipidemic Agents/adverse effects , Retrospective Studies , Risk Factors , Creatinine/blood , Creatinine/urine
19.
Blood Purif ; 25(1): 69-76, 2007.
Article in English | MEDLINE | ID: mdl-17170541

ABSTRACT

Inflammation and infection seem to be important causes of morbidity and mortality in chronic kidney disease (CKD) patients; subclinical infections have been proposed as an important cause of inflammatory syndrome, but to date this hypothesis remains speculative. We developed a method for the molecular detection of bacterial DNA in a population of CKD patients in order to correlate the molecular data with the degree and level of inflammation and to evaluate its usefulness in the diagnosis of subclinical infection. The study was divided into two phases: (1) a population of 81 CKD patients was screened for the prevalence and level of inflammation and the presence of possible infection, and (2) a subgroup of 38 patients, without evident clinical causes of inflammation, underwent complete molecular evaluation for subclinical infection using bacterial DNA primers for sequencing. Additionally, complete analysis was carried out in the blood and dialysate compartments of the hemodialyzers used. The general population showed a certain degree of subclinical inflammation, and no difference was found between patients with and without evident causes of inflammation. Hemoculture-negative patients were positive for the presence of bacterial DNA when molecular methods were used. We found a correlation trend between the presence of bacterial DNA and increased hs-CRP, IL-6 and oxidative stress (advanced oxidation protein product) levels and a reduction in the mean fluorescence intensity for HLA-DR. Hemodialyzer membranes seem to bind bacteria/bacterial DNA and act as concentrators. In fact, patients with negative bacterial DNA in the circulating blood displayed positivity in the blood compartment of the dialyzer. The dialysate was negative for bacterial DNA, but the dialysate compartment of the used hemodialyzers was positive in a high percentage of cases. Moreover, our data suggest that bacterial DNA can traverse hemodialysis membranes.
Molecular methods were found to be far more sensitive than standard methods in detecting subclinical infection. The presence of bacterial DNA seems to influence the variation in some parameters of inflammation and immunity. Despite its limitations and pitfalls, the molecular method could be useful to screen for subclinical infection and to diagnose subclinical sepsis when the hemoculture is negative. However, the identification of the microorganism implicated must be done with species-specific primers.


Subject(s)
Bacterial Infections/diagnosis , DNA, Bacterial/analysis , Hemodialysis Solutions/analysis , Kidney Failure, Chronic/microbiology , RNA, Ribosomal, 16S/isolation & purification , Renal Dialysis , Bacteremia/etiology , Bacteremia/microbiology , Biomarkers/blood , Cross Infection/microbiology , Cross Infection/prevention & control , Humans , Inflammation/blood , Inflammation/microbiology , Kidney Failure, Chronic/complications , Renal Dialysis/adverse effects , Renal Dialysis/instrumentation