1.
Transplant Proc ; 50(10): 3913-3916, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30471832

ABSTRACT

Kidneys from donors with blood type A2 can be successfully transplanted into blood type B and O recipients without the need for desensitization if the recipient's starting anti-A hemagglutinin titer is within an acceptable range. National kidney allocation policy now offers priority for eligible B recipients to receive A2 or A2B deceased donor kidneys, and therefore, the frequency with which A2 or A2B to B transplants will occur is expected to increase. The precise mechanisms by which antibody-mediated rejection is averted in these cases despite the presence of both circulating anti-A antibody and expression of the A2 antigen on the graft endothelium are not known. Whether this process mirrors proposed mechanisms of accommodation, which can occur in recipients of ABO-incompatible transplants, is also not known. Repeated exposure to mismatched antigens after retransplantation could elicit memory responses resulting in antibody rebound and accelerated antibody-mediated rejection. Whether this would occur in the setting of repeated A2 donor exposure was uncertain. Here we report the case of a patient with a history of a prior A2 to B transplant that failed owing to nonimmunologic reasons; the patient successfully underwent a repeat A2 to B transplant. Neither rebound in anti-A2 antibody nor clinical evidence of antibody-mediated rejection was observed after the transplant. Current kidney allocation policy will likely enable more such transplants in the future, and this may provide a unique patient population in whom the molecular mechanisms of incompatible graft accommodation may be investigated.


Subject(s)
Blood Group Incompatibility/immunology , Kidney Transplantation/methods , Reoperation , ABO Blood-Group System/immunology , Aged , Antibodies , Blood Grouping and Crossmatching , Graft Survival/immunology , Humans , Male , Tissue Donors
2.
J Frailty Aging ; 5(3): 174-9, 2016.
Article in English | MEDLINE | ID: mdl-29240319

ABSTRACT

BACKGROUND: Frailty is associated with worse health-related quality of life (HRQOL) in older adults and worse clinical outcomes in adults of all ages with end stage renal disease (ESRD). It is unclear whether frail adults of all ages with ESRD are more likely to experience worse HRQOL. OBJECTIVE: The goal of this study was to identify factors associated with worsening HRQOL in this population. DESIGN, SETTING AND MEASUREMENTS: We studied 233 adults of all ages with ESRD enrolled (11/2009-11/2013) in a longitudinal cohort study. Frailty status was measured at enrollment and HRQOL was reported (Excellent, Very Good, Good, Fair or Poor) at the initial assessment and follow-up (median follow-up 9.4 months). We studied factors associated with Fair/Poor HRQOL at follow-up using logistic regression and factors associated with HRQOL change using multinomial regression. All models were adjusted for age, sex, race, education, BMI, diabetes status, history of a previous transplant, type of dialysis and time between assessments. RESULTS: Fair/Poor HRQOL was reported by 28% at initial assessment and 33% at follow-up. 47.2% of participants had stable HRQOL, 22.8% better HRQOL, and 30.0% worse HRQOL at follow-up (P<0.001). In adjusted models, only frailty was associated with Fair/Poor HRQOL at follow-up (OR: 2.79, 95% CI: 1.32-5.90) and worsening HRQOL at follow-up (RR: 2.91, 95%CI: 1.08-7.80). CONCLUSIONS: Frail adults of all ages with ESRD are more likely to experience fair/poor HRQOL and worsening HRQOL over time. Frailty represents a state of decreased physiologic reserve that impacts not only clinical outcomes but also the patient-centered outcome of HRQOL.
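A minimal sketch of the kind of adjusted models described in this abstract (logistic regression for Fair/Poor HRQOL at follow-up and a multinomial model for HRQOL change), using statsmodels on synthetic placeholder data. Column names such as frail, hrqol_fairpoor_fu, and hrqol_change are illustrative assumptions, not the study's actual variables, and the covariate list is abbreviated.

```python
# Illustrative sketch only: synthetic data and assumed column names,
# not the study's dataset or analysis code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 233
df = pd.DataFrame({
    "frail": rng.integers(0, 2, n),
    "age": rng.normal(55, 13, n),
    "female": rng.integers(0, 2, n),
    "diabetes": rng.integers(0, 2, n),
    "bmi": rng.normal(28, 5, n),
})
# Binary outcome: Fair/Poor HRQOL at follow-up (placeholder values)
df["hrqol_fairpoor_fu"] = rng.integers(0, 2, n)
# Three-level outcome for HRQOL change: 0 = stable, 1 = better, 2 = worse
df["hrqol_change"] = rng.integers(0, 3, n)

# Adjusted logistic regression: odds ratio for frailty
logit = smf.logit(
    "hrqol_fairpoor_fu ~ frail + age + female + diabetes + bmi", data=df
).fit(disp=False)
print(np.exp(logit.params["frail"]))

# Multinomial model for HRQOL change (relative risk ratios vs. the reference level)
mnlogit = smf.mnlogit(
    "hrqol_change ~ frail + age + female + diabetes + bmi", data=df
).fit(disp=False)
print(np.exp(mnlogit.params))
```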


Subject(s)
Frailty , Kidney Failure, Chronic/physiopathology , Quality of Life , Adult , Aged , Aged, 80 and over , Disease Progression , Female , Humans , Longitudinal Studies , Male , Middle Aged , Prospective Studies
3.
Am J Transplant ; 14(2): 459-65, 2014 Feb.
Article in English | MEDLINE | ID: mdl-24400968

ABSTRACT

Renal transplantation in patients with antiphospholipid antibodies has historically proven challenging due to increased risk for thrombosis and allograft failure. This is especially true for patients with antiphospholipid antibody syndrome (APS) and its rare subtype, the catastrophic antiphospholipid antibody syndrome (CAPS). Because a critical mechanism of thrombosis in APS/CAPS is complement activation, we hypothesized that preemptive treatment with the terminal complement inhibitor eculizumab would reduce the extent of vascular injury and thrombosis, enabling renal transplantation for patients in whom it would otherwise be contraindicated. Three patients with APS, two with a history of CAPS, were treated with continuous systemic anticoagulation together with eculizumab prior to and following live donor renal transplantation. Two patients were also sensitized to human leukocyte antigens (HLA) and required plasmapheresis for reduction of donor-specific antibodies. After follow-up ranging from 4 months to 4 years, all patients have functioning renal allografts. No systemic thrombotic events or early graft losses were observed. While the appropriate duration of treatment remains to be determined, this case series suggests that complement inhibitors such as eculizumab may prove effective in preventing the recurrence of APS after renal transplantation.


Subject(s)
Antibodies, Monoclonal, Humanized/therapeutic use , Antiphospholipid Syndrome/prevention & control , Complement Inactivating Agents/therapeutic use , Graft Rejection/prevention & control , Kidney Failure, Chronic/complications , Kidney Transplantation/adverse effects , Postoperative Complications/prevention & control , Adult , Antiphospholipid Syndrome/etiology , Follow-Up Studies , Graft Rejection/etiology , Humans , Kidney Failure, Chronic/surgery , Male , Middle Aged , Prognosis , Prospective Studies , Recurrence , Remission Induction
4.
Am J Transplant ; 13(4): 936-942, 2013 Apr.
Article in English | MEDLINE | ID: mdl-23414232

ABSTRACT

Organ shortage has led to increased utilization of higher risk liver allografts. In kidneys, aggressive center-level use of one type of higher risk graft clustered with aggressive use of other types. In this study, we explored center-level behavior in liver utilization. We aggregated national liver transplant recipient data between 2005 and 2009 to the center level, assigning each center an aggressiveness score based on relative utilization of higher risk livers. Aggressive centers had significantly more patients reaching high MELDs (RR 2.19, 2.33 and 2.28 for number of patients reaching MELD>20, MELD>25 and MELD>30, p<0.001), a higher organ shortage ratio (RR 1.51, 1.60 and 1.51 for number of patients reaching MELD>20, MELD>25 and MELD>30 divided by number of organs recovered at the OPO, p<0.04), and were clustered within various geographic regions, particularly regions 2, 3 and 9. Median MELD at transplant was similar between aggressive and nonaggressive centers, but average annual transplant volume was significantly higher at aggressive centers (RR 2.27, 95% CI 1.47-3.51, p<0.001). In cluster analysis, there were no obvious phenotypic patterns among centers with intermediate levels of aggressiveness. In conclusion, high waitlist disease severity, geographic differences in organ availability, and transplant volume are the main factors associated with the aggressive utilization of higher risk livers.
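A hedged sketch of the center-level aggregation this abstract describes: recipient-level records are rolled up to centers, each center gets a standardized share-of-higher-risk-grafts "aggressiveness" score, and centers are then clustered on that score. The column names (center_id, high_risk_liver) and the use of k-means are illustrative assumptions; the study's actual score construction and clustering method are not reproduced here.

```python
# Illustrative sketch: synthetic recipient-level records with assumed columns.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
recipients = pd.DataFrame({
    "center_id": rng.integers(0, 100, 5000),
    "high_risk_liver": rng.integers(0, 2, 5000),   # 1 = higher-risk allograft used
    "meld_at_listing": rng.normal(18, 8, 5000).clip(6, 40),
})

centers = recipients.groupby("center_id").agg(
    volume=("high_risk_liver", "size"),
    high_risk_share=("high_risk_liver", "mean"),
    n_meld_gt20=("meld_at_listing", lambda s: (s > 20).sum()),
)

# Simple aggressiveness score: center's higher-risk share, standardized across centers
centers["aggressiveness"] = (
    (centers["high_risk_share"] - centers["high_risk_share"].mean())
    / centers["high_risk_share"].std()
)

# Cluster centers on the score (k chosen arbitrarily for illustration)
centers["cluster"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    centers[["aggressiveness"]]
)
print(centers.sort_values("aggressiveness", ascending=False).head())
```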


Subject(s)
End Stage Liver Disease/surgery , Liver Transplantation/methods , Tissue and Organ Procurement , Transplants/supply & distribution , Adult , Aged , Cluster Analysis , End Stage Liver Disease/diagnosis , Graft Survival , Humans , Liver Function Tests , Middle Aged , Phenotype , Regression Analysis , Risk Factors , Severity of Illness Index , Tissue Donors , Transplantation, Homologous
5.
Am J Transplant ; 10(11): 2472-80, 2010 Nov.
Article in English | MEDLINE | ID: mdl-20977638

ABSTRACT

UNet(SM), the UNOS data collection and electronic organ allocation system, allows centers to specify organ offer acceptance criteria for patients on their kidney waiting list. We hypothesized that the system might not be fully utilized and that the criteria specified by most transplant centers would be much broader than the characteristics of organs actually transplanted by those centers. We analyzed the distribution of criteria values among waitlist patients (N = 304,385) between January 2000 and February 2009, mean criteria values among listed candidates on February 19, 2009, and differences between a center's specified criteria and the organs it accepted for transplant between July 2005 and April 2009. We found wide variation in use of criteria variables, with some variables mostly or entirely unused. Most centers specified very broad criteria, with little within-center variation by patient. An offer of a kidney with parameters more extreme than the maximum actually transplanted at that center was designated a 'surplus offer' and indicated a potentially avoidable delay in distribution. We found 7,373 surplus offers (7.1% of all offers), concentrated among a small number of centers. The organ acceptance criteria system is currently underutilized, leading to possibly avoidable inefficiencies in organ distribution.
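A minimal sketch of the "surplus offer" definition above: compare each offer's parameters against the most extreme values that center has actually transplanted, and flag offers beyond that envelope. The column names and the two example parameters (donor age and cold ischemia time) are assumptions for illustration, and the data are synthetic.

```python
# Illustrative sketch: synthetic offers and transplants with assumed columns.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
transplants = pd.DataFrame({
    "center_id": rng.integers(0, 50, 2000),
    "donor_age": rng.integers(18, 80, 2000),
    "cold_ischemia_h": rng.uniform(2, 36, 2000),
})
offers = pd.DataFrame({
    "center_id": rng.integers(0, 50, 10000),
    "donor_age": rng.integers(18, 90, 10000),
    "cold_ischemia_h": rng.uniform(2, 48, 10000),
})

# Per-center envelope: the most extreme values actually accepted and transplanted
envelope = transplants.groupby("center_id").agg(
    max_age=("donor_age", "max"),
    max_cit=("cold_ischemia_h", "max"),
)

merged = offers.merge(envelope, on="center_id", how="left")
# A surplus offer exceeds the center's own transplanted maximum on any parameter
merged["surplus_offer"] = (
    (merged["donor_age"] > merged["max_age"])
    | (merged["cold_ischemia_h"] > merged["max_cit"])
)
print(merged["surplus_offer"].mean())  # share of offers flagged as 'surplus'
```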


Subject(s)
Kidney Transplantation/statistics & numerical data , Tissue Donors/statistics & numerical data , Tissue and Organ Procurement , Waiting Lists , Adult , Body Mass Index , Child , Cold Ischemia , Creatinine/blood , Hepatitis C Antibodies/blood , Humans , Retrospective Studies , Warm Ischemia
6.
Am J Transplant ; 10(9): 2154-60, 2010 Sep.
Article in English | MEDLINE | ID: mdl-20636451

ABSTRACT

A 43-year-old patient with end-stage renal disease, a hypercoagulable condition and 100% panel reactive antibody was transferred to our institution with loss of hemodialysis access and thrombosis of the superior and inferior vena cava, bilateral iliac and femoral veins. A transhepatic catheter was placed but became infected. Access through a stented subclavian into a dilated azygos vein was established. Desensitization with two cycles of bortezomib was undertaken after anti-CD20 and IVIg were given. A flow-positive, cytotoxic-negative cross-match live-donor kidney at the end of an eight-way multi-institution domino chain became available, with a favorable genotype for this patient with impending total loss of a dialysis option. The patient received three pretransplant plasmapheresis treatments. Intraoperatively, the superior mesenteric vein was the only identifiable patent target for venous drainage. Eculizumab was administered postoperatively in the setting of antibody-mediated rejection and an inability to perform additional plasmapheresis. Creatinine remains normal at 6 months posttransplant and flow cross-match is negative. In this report, we describe the combined use of new agents (bortezomib and eculizumab) and modalities (nontraditional vascular access, splanchnic drainage of graft and domino paired donation) in a patient who would have died without transplantation.


Subject(s)
Antibodies, Monoclonal/therapeutic use , Boronic Acids/therapeutic use , Kidney Failure, Chronic/therapy , Kidney Transplantation , Living Donors , Protease Inhibitors/therapeutic use , Pyrazines/therapeutic use , Tissue and Organ Procurement/methods , Adult , Antibodies/blood , Antibodies/therapeutic use , Antibodies, Monoclonal, Humanized , Antigens, CD20/immunology , Bortezomib , Catheters, Indwelling , Creatinine/blood , Desensitization, Immunologic/methods , Drainage , Drug Therapy, Combination , Female , Femoral Vein , Humans , Iliac Vein , Immunoglobulins, Intravenous/therapeutic use , Kidney Failure, Chronic/blood , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/immunology , Plasmapheresis , Splanchnic Circulation , Therapies, Investigational , Vena Cava, Inferior , Vena Cava, Superior , Venous Thrombosis/complications
7.
Am J Transplant ; 10(5): 1238-46, 2010 May.
Article in English | MEDLINE | ID: mdl-20353475

ABSTRACT

Hepatitis C-positive (HCV(+)) candidates likely derive survival benefit from transplantation with HCV(+) kidneys, yet evidence remains inconclusive. We hypothesized that lack of good survival benefit data has led to wide practice variation. Our goal was to characterize national utilization of HCV(+) kidneys for HCV(+) recipients, and to quantify the risks/benefits of this practice. Of 93,825 deceased donors between 1995 and 2009, HCV(+) kidneys were 2.60 times more likely to be discarded (p < 0.001). However, of 6,830 HCV(+) recipients, only 29% received HCV(+) kidneys. Patients over 60 (relative rate [RR] 0.86), women (RR 0.73) and highly sensitized patients (RR 0.42) were less likely to receive HCV(+) kidneys, while African Americans (RR 1.56), diabetics (RR 1.29) and those at centers with long waiting times (RR 1.19) were more likely to receive them. HCV(+) recipients of HCV(+) kidneys waited 310 days less than the average waiting time at their center, and 395 days less than their counterparts at the same center who waited for HCV(-) kidneys, likely offsetting the slightly higher patient death (HR 1.29) and graft loss (HR 1.18) associated with HCV(+) kidneys. A better understanding of the risks and benefits of transplanting HCV(+) recipients with HCV(+) kidneys will hopefully improve utilization of these kidneys in an evidence-based manner.
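A hedged sketch of the survival comparison implied above (an adjusted hazard ratio for graft loss with HCV(+) versus HCV(-) kidneys), using a Cox proportional hazards model from lifelines on synthetic data. All variable names and the covariate set are illustrative assumptions, not the registry fields used in the study.

```python
# Illustrative sketch only: synthetic data, not SRTR/UNOS records.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 6830
df = pd.DataFrame({
    "hcv_pos_kidney": rng.integers(0, 2, n),
    "recipient_age": rng.normal(52, 12, n),
    "diabetes": rng.integers(0, 2, n),
    "years_to_graft_loss_or_censor": rng.exponential(6, n),
    "graft_loss": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(
    df,
    duration_col="years_to_graft_loss_or_censor",
    event_col="graft_loss",
    formula="hcv_pos_kidney + recipient_age + diabetes",
)
# Hazard ratio for receiving an HCV(+) kidney, adjusted for the listed covariates
print(np.exp(cph.params_["hcv_pos_kidney"]))
```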


Subject(s)
Hepatitis C/transmission , Tissue Donors , Black or African American/statistics & numerical data , Female , Hepacivirus , Humans , Kidney , Risk Assessment
8.
Am J Transplant ; 9(3): 578-85, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19260837

ABSTRACT

Outcomes after heart and lung transplants have improved, and many recipients survive long enough to develop secondary renal failure, yet remain healthy enough to undergo kidney transplantation. We used national data reported to United Network for Organ Sharing (UNOS) to evaluate outcomes of 568 kidney after heart (KAH) and 210 kidney after lung (KAL) transplants performed between 1995 and 2008. Median time to kidney transplant was 100.3 months after heart, and 90.2 months after lung transplant. Renal failure was attributed to calcineurin inhibitor toxicity in most patients. Outcomes were compared with primary kidney recipients using matched controls (MC) to account for donor, recipient and graft characteristics. Although 5-year renal graft survival was lower than primary kidney recipients (61% KAH vs. 73.8% MC, p < 0.001; 62.6% KAL vs. 82.9% MC, p < 0.001), death-censored graft survival was comparable (84.9% KAH vs. 88.2% MC, p = 0.1; 87.6% KAL vs. 91.8% MC, p = 0.6). Furthermore, renal transplantation reduced the risk of death compared with dialysis by 43% for KAH and 54% for KAL recipients. Our findings that renal grafts function well and provide survival benefit in KAH and KAL recipients, but are limited in longevity by the general life expectancy of these recipients, might help inform clinical decision-making and allocation in this population.
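A small illustrative sketch of the "death-censored graft survival" concept used above: deaths with a functioning graft are treated as censoring events rather than graft failures before a Kaplan-Meier curve is fit. Uses lifelines on synthetic data; the variable names are assumptions and the cohort size is copied from the abstract for flavor only.

```python
# Illustrative sketch: synthetic follow-up data with assumed columns.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(4)
n = 778  # roughly the KAH + KAL cohort size from the abstract, for flavor only
df = pd.DataFrame({
    "years_followup": rng.exponential(5, n),
    "graft_failed": rng.integers(0, 2, n),
    "died_with_function": rng.integers(0, 2, n),
})
# Death with a functioning graft is NOT counted as a graft failure event here
df.loc[df["died_with_function"] == 1, "graft_failed"] = 0

kmf = KaplanMeierFitter()
kmf.fit(df["years_followup"], event_observed=df["graft_failed"],
        label="death-censored graft survival")
print(kmf.survival_function_.tail())
```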


Subject(s)
Heart Transplantation , Kidney Transplantation , Lung Transplantation , Follow-Up Studies , Graft Rejection/epidemiology , Graft Rejection/etiology , Graft Survival , Heart Transplantation/statistics & numerical data , Humans , Kidney Transplantation/statistics & numerical data , Lung Transplantation/statistics & numerical data , Time Factors , Transplantation, Homologous
9.
Am J Transplant ; 9(5): 1048-54, 2009 May.
Article in English | MEDLINE | ID: mdl-19298449

ABSTRACT

Single-center studies have reported equivalent outcomes of kidney allografts recovered with histidine-tryptophan-ketoglutarate (HTK) or University of Wisconsin (UW) solution. However, these studies were likely underpowered and often unadjusted, and multicenter studies have suggested HTK preservation might increase delayed graft function (DGF) and reduce graft survival of renal allografts. To further inform clinical practice, we analyzed the United Network for Organ Sharing (UNOS) database of deceased donor kidney transplants performed from July 2004 to February 2008 to determine if HTK (n = 5,728) versus UW (n = 15,898) preservation impacted DGF or death-censored graft survival. On adjusted analyses, HTK preservation had no effect on DGF (odds ratio [OR] 0.99, p = 0.7) but was associated with an increased risk of death-censored graft loss (hazard ratio [HR] 1.20, p = 0.008). The detrimental effect of HTK was a relatively late one, with a strong association between HTK and subsequent graft loss in those surviving beyond 12 months (HR 1.43, p = 0.007). Interestingly, a much stronger effect was seen in African-American recipients (HR 1.55, p = 0.024) than in Caucasian recipients (HR 1.18, p = 0.5). Given recent studies that also demonstrate that HTK preservation reduces liver and pancreas allograft survival, we suggest that the use of HTK for abdominal organ recovery should be reconsidered.
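A hedged sketch of the kind of conditional analysis described above (the association between HTK and graft loss among grafts surviving beyond 12 months): restrict to grafts still at risk at 12 months and fit an adjusted Cox model from that landmark. Synthetic data and assumed column names; the authors' exact modelling approach is not reproduced here.

```python
# Illustrative landmark-analysis sketch on synthetic data (assumed columns).
import pandas as pd
import numpy as np
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 21626  # HTK + UW kidney transplants in the study period, for flavor only
df = pd.DataFrame({
    "htk": rng.integers(0, 2, n),
    "donor_age": rng.normal(40, 15, n),
    "months_to_graft_loss_or_censor": rng.exponential(36, n),
    "death_censored_graft_loss": rng.integers(0, 2, n),
})

# Landmark at 12 months: keep only grafts still at risk, restart the clock there
landmark = df[df["months_to_graft_loss_or_censor"] > 12].copy()
landmark["months_after_landmark"] = landmark["months_to_graft_loss_or_censor"] - 12

cph = CoxPHFitter()
cph.fit(
    landmark,
    duration_col="months_after_landmark",
    event_col="death_censored_graft_loss",
    formula="htk + donor_age",
)
print(cph.hazard_ratios_["htk"])  # adjusted HR for HTK beyond 12 months
```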


Subject(s)
Graft Survival/drug effects , Kidney Transplantation/immunology , Organ Preservation Solutions/pharmacology , Adenosine , Adult , Allopurinol , Black People/statistics & numerical data , Cadaver , Cause of Death , Ethnicity , Female , Glucose/pharmacology , Glutathione , Humans , Insulin , Male , Mannitol/pharmacology , Middle Aged , Nephrectomy/methods , Potassium Chloride/pharmacology , Procaine/pharmacology , Racial Groups , Raffinose , Retrospective Studies , Tissue Donors , Tissue and Organ Harvesting/methods , Transplantation, Homologous/immunology , Treatment Outcome , White People/statistics & numerical data
10.
Am J Transplant ; 9(1): 217-21, 2009 Jan.
Article in English | MEDLINE | ID: mdl-18986383

ABSTRACT

Prior single-center studies have reported that pancreas allograft survival is not affected by preservation in histidine-tryptophan-ketoglutarate (HTK) versus University of Wisconsin (UW) solution. To expand on these studies, we analyzed the United Network for Organ Sharing (UNOS) database of pancreas transplants from July 2004 through February 2008 to determine if preservation with HTK (N = 1,081) versus UW (N = 3,311) impacted graft survival. HTK preservation of pancreas allografts increased significantly in this time frame, from 15.4% in 2004 to 25.4% in 2008. After adjusting for other recipient, donor, graft and transplant center factors that impact graft survival, HTK preservation was independently associated with an increased risk of pancreas graft loss (hazard ratio [HR] 1.30, p = 0.014), especially in pancreas allografts with cold ischemia time (CIT) ≥12 h (HR 1.42, p = 0.017). This reduced survival with HTK preservation as compared to UW preservation was seen in both simultaneous pancreas-kidney (SPK) transplants and pancreas alone (PA) transplants. Furthermore, HTK preservation was also associated with 1.54-fold higher odds of early (<30 days) pancreas graft loss as compared to UW (OR 1.54, p = 0.008). These results suggest that the increasing use of HTK for abdominal organ preservation should be re-examined.
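A minimal sketch of the early-loss comparison above (odds of pancreas graft loss within 30 days for HTK versus UW), with an optional restriction to long cold-ischemia cases. Synthetic data, assumed column names, and a statsmodels logistic regression stand in for the study's adjusted models.

```python
# Illustrative sketch on synthetic data; column names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 4392  # HTK + UW pancreas transplants in the study period, for flavor only
df = pd.DataFrame({
    "htk": rng.integers(0, 2, n),
    "cit_hours": rng.uniform(4, 24, n),
    "donor_age": rng.normal(30, 12, n),
    "early_graft_loss": rng.integers(0, 2, n),  # loss within 30 days
})

# Adjusted odds of early (<30 day) graft loss for HTK vs. UW preservation
overall = smf.logit("early_graft_loss ~ htk + donor_age", data=df).fit(disp=False)
print(np.exp(overall.params["htk"]))

# Same model restricted to cold ischemia time >= 12 hours
long_cit = df[df["cit_hours"] >= 12]
subgroup = smf.logit("early_graft_loss ~ htk + donor_age", data=long_cit).fit(disp=False)
print(np.exp(subgroup.params["htk"]))
```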


Subject(s)
Graft Survival , Organ Preservation Solutions , Pancreas Transplantation , Adult , Female , Glucose , Graft Rejection , Humans , Male , Mannitol , Potassium Chloride , Procaine
11.
Biol Blood Marrow Transplant ; 7(11): 589-95, 2001.
Article in English | MEDLINE | ID: mdl-11760146

ABSTRACT

Donor regulatory T cells (CD3+ αβ T-cell receptor [TCR]+) derived from the repopulating host thymus have been shown to be primarily responsible for suppression of GVHD following DLI therapy in murine BMT models. However, natural killer (NK) T cells also have regulatory properties, and a role for NK T cells in suppression of GVH reactivity has not been completely excluded. NK cells may also contribute to the graft-versus-leukemia (GVL) effect associated with DLI therapy. In this study, we used a murine BMT model (C57BL/6 into AKR) to study whether depletion of donor NK cells had any impact on the suppression of GVH reactivity after DLI or on the DLI-induced GVL effect against acute T-cell leukemia. Depletion of donor NK cells was accomplished in vivo by giving DLI-treated bone marrow chimeras multiple injections of anti-NK1.1 monoclonal antibody (MoAb). The chimeras treated with anti-NK1.1 MoAb had significantly fewer splenic NK1.1 cells than nontreated chimeras, and splenocytes from anti-NK1.1-treated mice were deficient in the ability to generate lymphokine-activated lytic activity. Results presented here showed that NK-cell depletion had no effect on the suppression of GVH reactivity after DLI. When DLI-treated chimeras were challenged with an acute T-cell leukemia, NK-cell depletion had no discernible effect on GVL reactivity. These preclinical data suggest that donor NK cells do not have a significant role in the suppression of GVHD after DLI or in the mediation of GVL reactivity induced by DLI.


Subject(s)
Graft vs Host Disease/prevention & control , Graft vs Leukemia Effect/immunology , Killer Cells, Natural/immunology , Leukocyte Transfusion , Animals , Antibodies, Monoclonal/administration & dosage , Bone Marrow Transplantation/methods , Bone Marrow Transplantation/mortality , Graft Survival , Killer Cells, Natural/transplantation , Leukemia, Experimental/therapy , Mice , Mice, Inbred Strains , Models, Animal , Survival Rate , Time Factors , Transplantation Chimera
13.
J Med Liban ; 48(5): 283-7, 2000.
Article in French | MEDLINE | ID: mdl-12492082

ABSTRACT

OBJECTIVES: To study the feasibility of intra-arterial digital subtraction angiography (IA-DSA) for diagnostic or interventional purposes using gadolinium (Gd) as a contrast medium (CM) in patients with poor renal function. MATERIALS AND METHODS: Fifteen patients with renal insufficiency (creatinine > 133 micromol/l) who needed IA-DSA for diagnostic or interventional purposes were studied. Gd was used as a CM at a dose ≤ 0.4 micromol/kg and a concentration ≥ 75%. Serum creatinine was measured before injection of Gd and at 24 h and 48 h afterwards; an increase of ≥ 44 micromol/l was considered significant. Image quality was evaluated by two specialists. RESULTS: In 14/15 (93%) patients no significant elevation of the serum creatinine level was noted. Image quality was good to medium, but the diagnostic or therapeutic purpose was nonetheless achieved. Image quality depends on Gd concentration; the feasibility of the examination is limited by the quantity of Gd used and its dilution. The procedure was well tolerated, without side effects. CONCLUSION: At the doses used to date (0.4 micromol/kg), Gd allows arteriograms of adequate quality for diagnostic or therapeutic purposes without renal complications. The procedure is easy to perform, but only one region or organ can be studied per procedure, given the relatively small volume of contrast allowed.


Subject(s)
Angiography, Digital Subtraction , Contrast Media , Gadolinium , Renal Insufficiency/diagnostic imaging , Adolescent , Adult , Aged , Creatinine/blood , Feasibility Studies , Female , Humans , Male , Middle Aged , Prospective Studies
14.
Tiers Monde (1960) ; 26(102): 335-50, 1985.
Article in French | MEDLINE | ID: mdl-12340320

ABSTRACT

PIP: Changes in female participation in Egypt's monetary economy in response to the political, economic, and social transformations underway in the country over the past few decades are traced. Official statistics are difficult to interpret because of changing definitions of activities from one census or other statistical source to another and because such statistics consistently underestimate true female activity rates by a wide margin. Because much of the work done by Egyptian women is clandestine and sporadic and is not even viewed by them as "work", it would be very difficult to supply an estimate of the number of women economically active, but trends over the past several years can be discerned. Economic participation of women was uncommon when Nasser assumed power in 1952. Several measures taken by his administration were intended to promote female participation, and the system guaranteeing a public sector job to every person earning the baccalaureate was responsible for a dramatic increase in the number and percentage of women in public administration. The number of women working in industry increased during the years of Nasser's rule, the average age of working women increased significantly as women retained their jobs after marriage, and public approval of working women increased, but uneducated women from the poorer classes had greater difficulty in finding employment through regular channels. The new policies of Sadat, despite their fundamental opposition to Nasser's orientations, accentuated the trends already underway. Economic growth was accompanied by serious inflation and economic pressure on households, the "Opening to the West" offered new models of consumption and created new needs, the massive emigration to rich neighboring Arab countries created labor shortages, and the encouragement of emigration and free enterprise spawned a vast movement of social restructuring. The Sadat years also saw increasing difficulty in finding gainful employment for the poorest women because of increased competition and higher standards for existing jobs, the decline in opportunities for employment as domestics because of increasing economic pressure on some parts of the population, increasing mechanization and replacement of female workers by machines, and lack of time for gainful employment because of the increasing temporal demands just for completion of basic household tasks in congested urban areas. Despite these difficulties, sociocultural factors still account in large part for the weak labor force participation of women and the high female unemployment rate. Modes of participation of women in economic life are still subject to strict social control.


Subject(s)
Economics , Educational Status , Employment , Health Workforce , Human Rights , Occupations , Politics , Poverty , Social Class , Social Planning , Socioeconomic Factors , Unemployment , Women's Rights , Africa , Africa, Northern , Developing Countries , Egypt , Middle East , Urban Population