1.
Nephron Clin Pract ; 113(4): c270-80, 2009.
Article in English | MEDLINE | ID: mdl-19684412

ABSTRACT

OBJECTIVE: We examined the relationship between left ventricular mass index (LVMI) and various volume indicators, i.e. multifrequency bioelectric impedance analysis (BIA), predialysis serum N-terminal pro-brain natriuretic peptide (NT-pro-BNP) levels, and inferior vena cava diameter, at baseline and with rigorous volume management on thrice-weekly hemodialysis. METHODS: Twenty-two patients on chronic thrice-weekly hemodialysis were followed for 52 weeks. Left ventricular hypertrophy was present in 100% of the cohort at baseline. RESULTS: There were no significant correlations among volume indicators except for a correlation between the extracellular-volume-to-body-mass ratio and the collapsibility index (r = 0.476; p = 0.039) at 6 months. There were no correlations between blood pressure and volume indicators. The baseline (but not follow-up) collapsibility index correlated with LVMI (r = 0.506; p = 0.038). In 'lag-time' analyses, there were no correlations between volume indicators at baseline or 6 months and LVMI at subsequent time points. LVMI decreased from 243.6 ± 83.3 g/m² at baseline to 210.6 ± 62.9 g/m² at 6 months (p = 0.104) and further to 203.2 ± 49.0 g/m² at 12 months (p = 0.035). CONCLUSIONS: (1) Left ventricular hypertrophy was prevalent in hemodialysis patients; (2) BIA, inferior vena cava ultrasound and serum NT-pro-BNP levels yield discordant results for fluid volumes; (3) regression of LVMI could occur with rigorous fluid management, even with thrice-weekly dialysis.
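
As a hypothetical illustration of the statistics reported above (bivariate correlations among volume indicators and within-patient change in LVMI), the sketch below uses simulated data; all variable names and values are illustrative, not the study's, and the paired test is an assumed analysis choice.

```python
# Illustrative only: simulated data standing in for the 22-patient cohort.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 22  # cohort size reported in the abstract

# Two volume indicators (values are made up)
ecv_bm_ratio = rng.normal(0.25, 0.05, n)    # extracellular-volume-to-body-mass ratio
collapsibility = rng.normal(0.40, 0.10, n)  # inferior vena cava collapsibility index

r, p = stats.pearsonr(ecv_bm_ratio, collapsibility)
print(f"Pearson r = {r:.3f}, p = {p:.3f}")

# Within-patient comparison of LVMI at baseline vs. 12 months (paired test)
lvmi_baseline = rng.normal(243.6, 83.3, n)
lvmi_12mo = lvmi_baseline - rng.normal(40.0, 30.0, n)
t, p_paired = stats.ttest_rel(lvmi_baseline, lvmi_12mo)
print(f"paired t = {t:.2f}, p = {p_paired:.4f}")
```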


Subject(s)
Hypertrophy, Left Ventricular/diagnosis; Hypertrophy, Left Ventricular/etiology; Kidney Failure, Chronic/complications; Kidney Failure, Chronic/rehabilitation; Renal Dialysis/adverse effects; Renal Dialysis/methods; Adult; Aged; Female; Humans; Male; Middle Aged; Organ Size; Prognosis; Reproducibility of Results; Sensitivity and Specificity; Treatment Outcome
2.
Clin Transplant ; 22(3): 263-72, 2008.
Article in English | MEDLINE | ID: mdl-18482047

ABSTRACT

BACKGROUND: Factors associated with outcome in renal transplant recipients with lupus nephritis have not been studied. METHODS: Using data from the United States Renal Data System on patients transplanted between January 1, 1995 and December 31, 2002 (and followed through December 31, 2003) (n = 2882), we performed a retrospective analysis of factors associated with long-term death-censored graft survival and recipient survival. RESULTS: The number of pretransplant pregnancies incrementally increased the risk of graft failure [hazard ratio (HR) 1.54, p < 0.05] in the female subgroup as a whole and in recipients aged 25-35 yr. Recipient and donor age were associated with both the risk of graft failure (HR 0.96, p < 0.001; HR 1.01, p < 0.005) and recipient death (HR 1.04, p < 0.001; HR 1.01, p < 0.05). Greater graft-failure risk accompanied increased recipient weight (HR 1.01, p < 0.001); African American race compared with white (HR 1.55, p < 0.001); greater Charlson comorbidity index (HR 1.17, p < 0.05); and greater panel reactive antibody (PRA) levels (HR 1.06, p < 0.001). Compared with hemodialysis (HD), pretransplant peritoneal dialysis as the predominant modality was associated with decreased risk of graft failure (HR 0.49, p < 0.001), while prior transplantation was associated with greater risk of graft failure and recipient death (HR 2.29, p < 0.001; HR 3.59, p < 0.001, respectively). The number of matched human leukocyte antigen (HLA) antigens and living donation (HR 0.92, p < 0.05; HR 0.64, p < 0.001, respectively) were associated with decreased risk of graft failure. Increased risk of graft failure and recipient death was associated with nonuse of calcineurin inhibitors (HR 1.89, p < 0.005; HR 1.80, p < 0.005) and nonuse of mycophenolic acid (MPA, including mycophenolate mofetil) or azathioprine (HR 1.41, p < 0.05; HR 1.66, p < 0.01). Use of both cyclosporine and tacrolimus was associated with increased risk of graft failure (HR 2.09, p < 0.05). Use of MPA was associated with greater risk of recipient death compared with azathioprine (HR 1.47, p < 0.05). CONCLUSION: In renal transplant recipients with lupus nephritis, multiple pregnancies, multiple blood transfusions, greater comorbidity index, higher body weight, higher age and African American race of the donor or recipient, prior transplantation, greater PRA levels, lower level of HLA matching, deceased donors, and pretransplant HD were associated with increased risk of graft failure. Similarly, higher recipient and donor age, prior transplantation, and a higher rate of pretransplant transfusions were associated with greater risk of recipient mortality. Using neither cyclosporine nor tacrolimus, or using both (compared with tacrolimus alone), and using neither MPA nor azathioprine (compared with azathioprine) were associated with increased risk of graft failure and recipient death. Use of MPA was associated with greater risk of recipient death compared with azathioprine. Testing these results in a prospective study might provide important information for clinical practice.
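
A minimal sketch of the kind of death-censored graft-survival Cox model described above, using the lifelines library on simulated data; all column names and values are assumptions, not the USRDS schema.

```python
# Illustrative only: simulated recipients, not USRDS records.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "recipient_age": rng.uniform(18, 70, n),
    "donor_age":     rng.uniform(18, 70, n),
    "pra_percent":   rng.uniform(0, 100, n),   # panel reactive antibody level
    "living_donor":  rng.integers(0, 2, n),
    "graft_years":   rng.exponential(8.0, n),  # follow-up time
    "graft_failed":  rng.integers(0, 2, n),    # 1 = death-censored graft failure
})

cph = CoxPHFitter()
cph.fit(df, duration_col="graft_years", event_col="graft_failed")
cph.print_summary()  # hazard ratios are exp(coef), with p-values
```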


Subject(s)
Kidney Transplantation; Lupus Nephritis/surgery; Adult; Age Factors; Antibodies/blood; Azathioprine/therapeutic use; Body Weight; Calcineurin Inhibitors; Cyclosporine/adverse effects; Female; Graft Survival; HLA Antigens/blood; Humans; Kidney Transplantation/mortality; Male; Mycophenolic Acid/therapeutic use; Parity; Peritoneal Dialysis; Pregnancy; Racial Groups; Retrospective Studies; Survival Rate; Tacrolimus/adverse effects; Tissue Donors; Treatment Outcome
3.
ASAIO J ; 53(5): 601-8, 2007.
Article in English | MEDLINE | ID: mdl-17885334

ABSTRACT

Cardiovascular disease (CVD) leads to increased mortality rates among renal transplant recipients; however, its effect on allograft survival has not been well studied. The records of the United States Renal Data System and the United Network for Organ Sharing from January 1, 1995, through December 31, 2002, were examined in this retrospective study. The outcome variables were allograft survival time and recipient survival time. The primary variable of interest was CVD, defined as the presence of at least one of the following: cardiac arrest, myocardial infarction, dysrhythmia, congestive heart failure, ischemic heart disease, peripheral vascular disease, and unstable angina. Cox models were adjusted for potential confounding factors. Of the 105,181 patients in the data set, 20,371 had a diagnosis of CVD. The presence of CVD had an adverse effect on allograft survival time [hazard ratio (HR) 1.12, p < 0.001] and recipient survival time (HR 1.41, p < 0.001). Among the subcategories, congestive heart failure (HR 1.14, p < 0.005) and dysrhythmia (HR 1.26, p < 0.05) had adverse effects on allograft survival time. In addition to increasing mortality rates, CVD at the time of end-stage renal disease onset is also a significant risk factor for renal allograft failure. Further research is needed to evaluate the role of specific forms of CVD in allograft and recipient outcomes.
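
Two data-preparation steps implied by the abstract lend themselves to a short sketch: defining the CVD exposure as the presence of at least one listed condition, and death-censoring graft survival so that death with a functioning graft counts as censoring rather than graft failure. The column names below are assumptions, not the actual registry schema.

```python
# Illustrative only: toy records with assumed column names.
import pandas as pd

df = pd.DataFrame({
    "cardiac_arrest": [0, 1, 0], "mi": [0, 0, 1], "chf": [1, 0, 0],
    "graft_failed":   [1, 0, 0], "died": [0, 1, 0],
})

# CVD = at least one of the listed conditions (a subset of the seven in the text)
cvd_flags = ["cardiac_arrest", "mi", "chf"]
df["cvd"] = df[cvd_flags].any(axis=1).astype(int)

# Death-censored graft failure: only graft loss is an event; death with a
# functioning graft is treated as censoring.
df["event_death_censored"] = ((df["graft_failed"] == 1) & (df["died"] == 0)).astype(int)
print(df)
```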


Subject(s)
Cardiovascular Diseases/complications; Kidney Transplantation/mortality; Kidney Transplantation/statistics & numerical data; Adult; Arrhythmias, Cardiac/complications; Case-Control Studies; Child; Databases, Factual; Female; Follow-Up Studies; Graft Survival; Heart Failure/complications; Humans; Living Donors; Male; Retrospective Studies; Risk Factors; Survival Analysis; Time Factors; Transplantation, Homologous; Treatment Outcome; United States
4.
Nephrol Dial Transplant ; 22(12): 3623-30, 2007 Dec.
Article in English | MEDLINE | ID: mdl-17640941

ABSTRACT

BACKGROUND: The clinical outcome of renal transplantation among systemic lupus erythematosus (SLE) patients remains a topic of controversy. Most previous reports were based upon small single-centre studies that were not always well designed. METHODS: We conducted a retrospective analysis using data from the USRDS and UNOS databases. Patients were divided into five groups based on the cause of end-stage renal disease (ESRD): diabetes mellitus (DM), SLE, glomerulonephritis, hypertension and other causes. Between 1990 and 1999, 2886 renal transplant recipients with ESRD due to SLE were identified from a total of 92,844 patients. RESULTS: The mean follow-up period of this study was 4.7 ± 2.4 years. While unadjusted analysis using Kaplan-Meier curves demonstrated an association between SLE and improved allograft survival compared with DM, in multivariate analysis the SLE group had worse allograft [hazard ratio (HR) 1.09, P < 0.05] and recipient (HR 1.19, P < 0.05) survival compared with the DM group. Subgroup analysis based on the type of donor showed that SLE patients who received a deceased donor allograft had worse allograft and recipient survival (HR 1.14, P = 0.002 and HR 1.30, P = 0.001, respectively) compared with non-SLE deceased donor allograft recipients. Among living allograft recipients, there were no significant differences in either allograft or recipient survival compared with non-SLE recipients. CONCLUSIONS: SLE as a cause of ESRD in renal transplant recipients is associated with worse allograft and recipient survival compared with DM; this association holds for the entire population and for recipients of deceased donor (but not living donor) transplants. Deceased donor allograft recipients have worse outcomes compared with living allograft recipients.
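
The unadjusted-versus-adjusted contrast described above (Kaplan-Meier analysis favoring SLE, multivariate Cox analysis reversing the direction) can be sketched as follows; the data and column names are simulated assumptions, not the USRDS/UNOS records.

```python
# Illustrative only: simulated data showing the two-step analysis pattern.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({
    "sle":            rng.integers(0, 2, n),   # 1 = ESRD due to SLE, 0 = DM
    "recipient_age":  rng.uniform(18, 70, n),
    "deceased_donor": rng.integers(0, 2, n),
    "years":          rng.exponential(6.0, n),
    "failed":         rng.integers(0, 2, n),
})

# Unadjusted comparison (log-rank test between the two survival curves)
sle, dm = df[df.sle == 1], df[df.sle == 0]
print(logrank_test(sle.years, dm.years, sle.failed, dm.failed).p_value)

# Adjusted comparison: the HR for `sle` can change direction once
# confounders such as age and donor type enter the model.
CoxPHFitter().fit(df, duration_col="years", event_col="failed").print_summary()
```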


Subject(s)
Kidney Failure, Chronic/etiology; Kidney Failure, Chronic/surgery; Kidney Transplantation; Lupus Erythematosus, Systemic/complications; Lupus Nephritis/complications; Adult; Female; Humans; Male; Middle Aged; Retrospective Studies
5.
Nephrol Dial Transplant ; 22(3): 891-8, 2007 Mar.
Article in English | MEDLINE | ID: mdl-17172252

ABSTRACT

BACKGROUND: The causative role of alcohol consumption in renal disease is controversial, and its effect on renal graft and recipient survival has not been previously studied. METHODS: We analysed the association between pre-transplant [at the time of end-stage renal disease (ESRD) onset] alcohol dependency and renal graft and recipient survival. The United States Renal Data System (USRDS) records of kidney transplant recipients 18 years or older transplanted between 1 January 1995 and 31 December 2002 were examined. We used Kaplan-Meier analysis and Cox regression models adjusted for covariates to analyse the association between pre-transplant alcohol dependency and graft and recipient survival. RESULTS: In the entire study cohort of 60,523 patients, we identified 425 with a history of alcohol dependency. Using Cox models, alcohol dependency was found to be associated with increased risk of death-censored graft failure [hazard ratio (HR) 1.38, P < 0.05] and increased risk of transplant recipient death (HR 1.56, P < 0.001). Subgroup analysis demonstrated an association of alcohol dependency with recipient survival and death-censored graft survival in males (but not in females), and in both white and non-white racial subgroups. CONCLUSIONS: We concluded that alcohol dependency at the time of ESRD onset is a risk factor for renal graft failure and recipient death.
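
The subgroup analysis described above (refitting the model within sex and race strata) follows a simple pattern; the sketch below uses simulated data and assumed column names.

```python
# Illustrative only: stratified refits on simulated data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 600
df = pd.DataFrame({
    "alcohol_dep": rng.integers(0, 2, n),
    "male":        rng.integers(0, 2, n),
    "age":         rng.uniform(18, 70, n),
    "years":       rng.exponential(5.0, n),
    "event":       rng.integers(0, 2, n),
})

for sex, sub in df.groupby("male"):
    cph = CoxPHFitter()
    cph.fit(sub[["years", "event", "alcohol_dep", "age"]],
            duration_col="years", event_col="event")
    hr = np.exp(cph.params_["alcohol_dep"])  # params_ holds log hazard ratios
    print(f"male={sex}: HR for alcohol dependency = {hr:.2f}")
```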


Subject(s)
Alcohol Drinking/adverse effects; Alcoholism/complications; Graft Survival; Kidney Failure, Chronic/surgery; Kidney Transplantation; Adult; Alcohol Drinking/mortality; Alcoholism/mortality; Female; Follow-Up Studies; Humans; Kidney Failure, Chronic/complications; Kidney Failure, Chronic/mortality; Male; Middle Aged; Prognosis; Proportional Hazards Models; Retrospective Studies; Risk Factors; Survival Rate/trends; Time Factors; United States/epidemiology
6.
Clin Transplant ; 20(2): 245-52, 2006.
Article in English | MEDLINE | ID: mdl-16640534

ABSTRACT

BACKGROUND: There has been a general trend towards shortened length of post-kidney transplant hospitalization (LOH). The decision regarding a patient's discharge from the hospital may theoretically be based on several factors, including, but not limited to, patient well-being, insurance status, family situation and other, mostly socio-economic, factors, as opposed to hard medical evidence. However, the appropriate LOH for kidney transplant recipients has not been well studied with respect to long-term outcomes. METHODS: This study retrospectively analysed the association between LOH and graft and recipient survival based on the United States Renal Data System dataset. In total, 100,762 patients who underwent transplantation during 1995-2002 were included. Kaplan-Meier survival analysis and Cox models were applied to the whole patient cohort and to sub-groups stratified by the presence of delayed graft function, patient comorbidity index and donor type (deceased or living). RESULTS: For recipient survival, both short (<4 d) and long (>5 d) LOH showed a significant adverse effect (p < 0.01) on survival times. In the analysis of graft survival, long LOH (≥2 wk) also showed significant adverse effects (p < 0.001) on survival times. Short LOH (<4 d) did not reach statistical significance, although it was still associated with adverse effects on graft survival. These observations were consistent across the whole patient cohort and sub-groups stratified by the presence of delayed graft function, patient comorbidity index and donor type. CONCLUSION: Clinical considerations should drive the decision regarding the appropriate time of post-kidney transplant recipient discharge. Based on this study, post-kidney transplant hospitalization shorter than 4 d may be harmful to long-term graft and recipient survival.
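
A small sketch of how a continuous LOH variable might be turned into the categories analysed above (<4 d, 4-5 d, >5 d) and dummy-coded for a regression model; the cut points follow the abstract, everything else is an assumption.

```python
# Illustrative only: binning length of hospitalization for use as a covariate.
import pandas as pd

loh_days = pd.Series([2, 3, 4, 5, 6, 9, 15], name="loh_days")
loh_cat = pd.cut(loh_days, bins=[0, 3, 5, float("inf")],
                 labels=["<4 d", "4-5 d", ">5 d"])

# Dummy-code with the middle category (4-5 d) as the reference
dummies = pd.get_dummies(loh_cat, prefix="loh").drop(columns="loh_4-5 d")
print(pd.concat([loh_days, loh_cat.rename("loh_cat"), dummies], axis=1))
```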


Subject(s)
Graft Survival/physiology; Kidney Transplantation/physiology; Length of Stay; Adult; Female; Humans; Kidney Transplantation/mortality; Male; Middle Aged; Patient Discharge; Survival Analysis
7.
Clin J Am Soc Nephrol ; 1(2): 313-22, 2006 Mar.
Article in English | MEDLINE | ID: mdl-17699222

ABSTRACT

There is controversy regarding the influence of genetic versus environmental factors on kidney transplant outcome in minority groups. The goal of this project was to evaluate the role of certain socioeconomic factors in allograft and recipient survival. Graft and recipient survival data from the United States Renal Data System were analyzed using Cox modeling, with primary variables of interest including recipient education level, citizenship, and primary source of pay for medical service. College (hazard ratio [HR] 0.93, P < 0.005) and postcollege education (HR 0.85, P < 0.005) improved graft outcome in the whole group and in patients of white race. Similar trends were observed for recipient survival (HR 0.9, P < 0.005 for college; HR 0.88, P = 0.09 for postcollege education) in the whole population and in white patients. Resident aliens had significantly better graft outcome in the entire patient population (HR 0.81, P < 0.001) and in white patients in subgroup analysis (HR 0.823, P < 0.001) compared with US citizens; a similar effect was observed for recipient survival. With Medicare as the reference group, there was a statistically significant benefit to graft survival from having private insurance in the whole group (HR 0.87, P < 0.001) and in the black (HR 0.8, P < 0.001) and white (HR 0.89, P < 0.001) subgroups; a similar effect of private insurance was observed on recipient survival in the entire group of patients and across racial groups. Recipients with a higher education level, resident aliens, and patients with private insurance have an advantage in graft and recipient outcomes independent of racial differences.
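
The "Medicare as the reference group" phrasing above corresponds to standard dummy coding, where the reference category is dropped so that each hazard ratio reads "versus Medicare". A minimal sketch with assumed category names:

```python
# Illustrative only: reference-group dummy coding for the payer variable.
import pandas as pd

payer = pd.Series(["medicare", "private", "medicaid", "private", "medicare"])
X = pd.get_dummies(payer, prefix="payer").drop(columns="payer_medicare")
print(X)  # remaining columns (payer_medicaid, payer_private) are read vs. Medicare
```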


Subject(s)
Kidney Transplantation/mortality; Adult; Female; Humans; Male; Socioeconomic Factors; Survival Rate; Treatment Outcome
8.
Semin Nephrol ; 25(2): 81-9, 2005 Mar.
Article in English | MEDLINE | ID: mdl-15791559

ABSTRACT

Hemodialysis membranes have undergone a gradual but substantial evolution over the past few decades. Classification of modern dialyzer membranes by chemical composition bears little relationship to their functional characteristics. The fundamental properties that determine the capacity of a membrane to remove solutes and fluids are its surface area, thickness, pore size, pore density, and potential to adsorb proteins. Dialyzer membrane performance is characterized clinically by its efficiency, defined as the potential to remove urea and expressed as the mass-transfer area coefficient (KoA), and by its ultrafiltration coefficient (Kuf), defined as the potential to remove water adjusted for the transmembrane pressure. The parameter Kuf usually, but not invariably, correlates with the membrane permeability, defined as the potential to remove middle molecules, with β2-microglobulin being the currently popular marker. The sieving coefficient reflects the membrane's potential to transport solutes by convection and is particularly useful for hemofiltration. Enhancing solute clearance is accomplished clinically by increasing blood and dialysate flow rates, strategies that are also applicable to middle molecules for highly permeable membranes. Novel dialyzer designs include optimization of the fluid flow path geometry and increasing the membrane pore selectivity for solutes by using nanotechnology.
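
The efficiency relation described above can be made concrete with the standard countercurrent clearance formula, which predicts solute clearance from KoA and the blood and dialysate flow rates. The sketch below is a worked example with illustrative flow values, not figures from the article.

```python
# Worked example: predicted clearance from KoA for a countercurrent dialyzer,
# K = Qb * (exp(N) - 1) / (exp(N) - Z), with N = (KoA/Qb) * (1 - Z), Z = Qb/Qd.
# Assumes Qb != Qd (the equal-flow case needs the limiting form).
import math

def clearance(koa: float, qb: float, qd: float) -> float:
    """Countercurrent dialyzer clearance in mL/min from KoA and flow rates."""
    z = qb / qd                  # blood-to-dialysate flow ratio
    n = (koa / qb) * (1 - z)
    return qb * (math.exp(n) - 1) / (math.exp(n) - z)

# Raising blood flow raises clearance, as the text notes (Qd fixed at 600 mL/min):
for qb in (200, 300, 400):
    print(f"Qb = {qb} mL/min -> K = {clearance(800, qb, 600):.0f} mL/min")
```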


Subject(s)
Membranes, Artificial; Renal Dialysis/instrumentation; Renal Dialysis/standards; Humans; Kidney Failure, Chronic/therapy; beta 2-Microglobulin/analysis