Results 1 - 20 of 51
1.
Actas urol. esp ; 47(6): 382-389, jul.- ago. 2023. tab
Article in Spanish | IBECS | ID: ibc-223186

ABSTRACT



Introduction Kidney transplantation (KT) is the gold standard treatment for patients with end-stage renal disease (ESRD). Hospital readmissions after transplant are a common complication and can be considered an indicator of avoidable morbidity and hospital quality, and there is a significant correlation between early hospital readmission (EHR) and adverse patient outcomes. This study aimed to assess the readmission rate following kidney transplantation, the underlying causes, and possible ways to prevent it. Material and methods We retrospectively reviewed the medical records of recipients transplanted from January 2016 to December 2021 at a single center. The primary objective of this study was to determine the readmission rate after kidney transplantation and the variables that contribute to readmission. Post-transplant complications that resulted in readmission were categorized into surgical complications, graft-related complications, infections, deep vein thrombosis (DVT), and other medical complications. Results Four hundred seventy-four renal allograft recipients met our inclusion criteria and were included in the study. Of these, 248 (52.3%) had at least one readmission during the first 90 days after transplantation, and 89 (18.8%) had more than one readmission episode in the first 90 days post-transplant. Perinephric fluid collection was the most common surgical complication (52.4%), and urinary tract infection (UTI) was the most common infection (50%) causing readmission in the first 90 days post-transplant. The odds ratio for readmission was significantly higher in patients above 60 years old, in kidneys with KDPI ≥85, and in recipients with DGF. Conclusion EHR following a kidney transplant is a common complication. Identifying its causes not only helps transplant centers take steps to prevent some incidents and reduce patient morbidity and mortality, but can also reduce the unnecessary costs of readmissions (AU)


Subject(s)
Humans , Male , Female , Adult , Middle Aged , Aged , Renal Insufficiency, Chronic/surgery , Kidney Transplantation/adverse effects , Patient Readmission , Retrospective Studies
2.
Actas Urol Esp (Engl Ed) ; 47(6): 382-389, 2023.
Article in English, Spanish | MEDLINE | ID: mdl-36871623

ABSTRACT

INTRODUCTION: Kidney transplantation (KT) is the gold standard treatment for patients with end-stage renal disease (ESRD). Hospital readmissions after transplant are a common complication and can be considered an indicator of avoidable morbidity and hospital quality, and there is a significant correlation between early hospital readmission (EHR) and adverse patient outcomes. This study aimed to assess the readmission rate following kidney transplantation, the underlying causes, and possible ways to prevent it. MATERIAL AND METHODS: We retrospectively reviewed the medical records of recipients transplanted from January 2016 to December 2021 at a single center. The primary objective of this study was to determine the readmission rate after kidney transplantation and the variables that contribute to readmission. Post-transplant complications that resulted in readmission were categorized into surgical complications, graft-related complications, infections, deep vein thrombosis (DVT), and other medical complications. RESULTS: Four hundred seventy-four renal allograft recipients met our inclusion criteria and were included in the study. Of these, 248 (52.3%) had at least one readmission during the first 90 days after transplantation, and 89 (18.8%) had more than one readmission episode in the first 90 days post-transplant. Perinephric fluid collection was the most common surgical complication (52.4%), and urinary tract infection (UTI) was the most common infection (50%) causing readmission in the first 90 days post-transplant. The odds ratio for readmission was significantly higher in patients above 60 years old, in kidneys with KDPI ≥ 85, and in recipients with DGF. CONCLUSION: EHR following a kidney transplant is a common complication. Identifying its causes not only helps transplant centers take steps to prevent some incidents and reduce patient morbidity and mortality, but can also reduce the unnecessary costs of readmissions.
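The odds ratios reported in the Results can be made concrete with a minimal sketch of how an odds ratio and its 95% confidence interval are computed from a 2x2 table. All counts below are hypothetical, chosen only for illustration; they are not the study's data.

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table with a Wald 95% CI via the log-OR standard error.
    a, b: readmitted / not readmitted in the exposed group (e.g. age > 60);
    c, d: readmitted / not readmitted in the unexposed group."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: 70/100 older recipients readmitted vs 178/374 younger ones.
or_, (lo, hi) = odds_ratio(70, 30, 178, 196)
# A CI whose lower bound exceeds 1 corresponds to significantly elevated odds of readmission.
```

The same calculation applies to the other exposures in the abstract (KDPI ≥ 85, DGF) given their own 2x2 tables.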


Subject(s)
Kidney Failure, Chronic , Kidney Transplantation , Humans , Middle Aged , Patient Readmission , Retrospective Studies , Risk Factors , Kidney Failure, Chronic/etiology
3.
Sci Rep ; 13(1): 2640, 2023 02 14.
Article in English | MEDLINE | ID: mdl-36788315

ABSTRACT

Fusarium oxysporum (Fo) is ubiquitous in soil and forms a species complex of pathogenic and putatively non-pathogenic strains. Pathogenic strains cause disease in over 150 plant species. Fusarium oxysporum f. sp. ciceris (Foc) is a major fungal pathogen causing Fusarium wilt in chickpea (Cicer arietinum). In some countries, such as Australia, Foc is a high-priority pest of biosecurity concern. Specific, sensitive, robust and rapid diagnostic assays are essential for effective disease management on the farm and serve as an effective biosecurity control measure. We developed and validated a novel and highly specific PCR assay and a LAMP assay for detecting the Indian Foc race 1, based on a putative effector gene uniquely present in its genome. These assays were assessed against 39 Fo formae speciales and found to be specific, amplifying only the target, on both a portable real-time fluorometer (Genie III) and a qPCR machine in under 13 min, with an anneal derivative temperature ranging from 87.7 to 88.3 °C. The LAMP assay is sensitive to low levels of target DNA (> 0.009 ng/µl), and the expected PCR product size is 143 bp. The LAMP assay developed in this study was simple, fast, sensitive and specific, and the approach could be explored for other Foc races given the uniqueness of this marker to the Foc genome.


Subject(s)
Cicer , Fusarium , Fusarium/genetics , Cicer/genetics , Polymerase Chain Reaction , Plant Diseases/microbiology
4.
BMC Genomics ; 22(1): 734, 2021 Oct 09.
Article in English | MEDLINE | ID: mdl-34627148

ABSTRACT

BACKGROUND: The fungal pathogen Fusarium oxysporum f. sp. pisi (Fop) causes Fusarium wilt in peas. Four races are recognized globally (1, 2, 5 and 6), and all of them are present in Australia. Molecular infection mechanisms have been studied in a few other F. oxysporum formae speciales; however, there has been no transcriptomic study of the Fop-pea pathosystem. RESULTS: A transcriptomic study was carried out to understand the molecular basis of pathogenicity differences between the races. Transcriptome analysis at 20 days post-inoculation revealed differences in the differentially expressed genes (DEGs) of the Fop races that are potentially involved in the variation in fungal pathogenicity. Most of the DEGs in all the races were engaged in transport, metabolism, oxidation-reduction, translation, biosynthetic processes, signal transduction and proteolysis, among other processes. Race 5 expressed the most virulence-associated genes. Most genes encoding plant cell wall-degrading enzymes, CAZymes and effector-like proteins were expressed in race 2. Race 6 expressed the fewest genes at this time point. CONCLUSION: Fop races deploy various factors and complex strategies to mitigate host defences and facilitate colonisation. This investigation provides an overview of the putative pathogenicity genes in different Fop races during the necrotrophic stage of infection. These genes need to be functionally characterised to confirm their roles in pathogenicity/virulence, and the race-specific genes can be further explored for molecular characterisation.


Subject(s)
Fusarium , Fusarium/genetics , Pisum sativum , Plant Diseases/genetics , Transcriptome , Virulence
5.
BMC Genomics ; 21(1): 248, 2020 Mar 20.
Article in English | MEDLINE | ID: mdl-32197583

ABSTRACT

BACKGROUND: The Fusarium oxysporum species complex (FOSC) is a ubiquitous group of fungal species readily isolated from agroecosystem and natural-ecosystem soils, which includes important plant and human pathogens. Genetic relatedness within the complex has been studied by sequencing either whole genes or the barcoding regions within those genes. Phylogenetic analyses have demonstrated a great deal of diversity, which is reflected in the differing numbers of clades identified: three, five and eight. Species limits within the complex have been studied through Genealogical Concordance Phylogenetic Species Recognition (GCPSR) analyses, with the number of phylogenetic 'species' identified ranging from two to 21. Such differing views have continued to confuse users of these taxonomies. RESULTS: The phylogenetic relationships between Australian F. oxysporum isolates from both natural and agricultural ecosystems were determined using three datasets: whole-genome, nuclear gene, and mitochondrial genome sequences. The phylogenies were concordant except for three isolates. There were three concordant clades in all the phylogenies, suggesting a similar evolutionary history for the mitochondrial genome and nuclear genes of the isolates in these three clades. Applying a multispecies coalescent (MSC) model to the eight single-copy nuclear protein-coding genes from the nuclear gene dataset indicated that the three concordant clades correspond to three phylogenetic species within the FOSC, with 100% posterior probability support for the formation of three species. This is the first report of using the MSC model to estimate species within the F. oxysporum species complex. The findings from this study were compared with previously published phylogenetic and species-delimitation studies. CONCLUSION: Phylogenetic analyses using three different gene datasets from Australian F. oxysporum isolates all supported the formation of three major clades, which delineated into three species. Species 2 (Clade 3) may be called F. oxysporum as it contains the neotype for F. oxysporum.


Subject(s)
Fusarium/classification , Whole Genome Sequencing/statistics & numerical data , Cell Nucleus/genetics , Evolution, Molecular , Fusarium/genetics , Fusarium/isolation & purification , Genome, Fungal , Mitochondria/genetics , Phylogeny
6.
Cancer Radiother ; 23(1): 58-61, 2019 Feb.
Article in French | MEDLINE | ID: mdl-30551930

ABSTRACT

For more than a decade, the majority of radiation oncology centres have been delivering intensity-modulated radiotherapy (and then volumetric-modulated arc therapy) with 6 MV photons as their standard of care. This "dogma" had been supported by the usual absence of dosimetric advantages with high-energy photons (15 to 18 MV or more), at least for the planning target volume and the dose received by the adjacent organs at risk, and by the neutron component that appears as soon as the photon energy exceeds 10 MV. Recent data could call such a dogma into question. First, in 2019, one cannot avoid taking into account the integral dose delivered outside the treated volume; most available data show that the integral dose is higher with low-energy photons (such as 6 MV) than with higher energies. Moreover, recent studies have shown that the neutron component at high energies may have been overestimated in the past; in fact, the neutron dose appears to be lower, and sometimes much lower, than the dose we accept for imaging. Finally, a few cohort studies did not show any increase in second-cancer incidence after high-energy photon radiotherapy. In this context, the American Association of Physicists in Medicine (AAPM) TG 158 report, released a few months ago, clearly states that there is a trade-off between high- and low-energy treatments: high-energy therapy is associated with neutron production, while low-energy therapy results in a higher stray photon dose. According to the AAPM, "the optimal energy is likely an intermediate such as 10 MV".


Subject(s)
Photons , Radiotherapy, Intensity-Modulated/methods , Humans , Radiotherapy Dosage , Radiotherapy Planning, Computer-Assisted
7.
Int J Organ Transplant Med ; 8(3): 125-131, 2017.
Article in English | MEDLINE | ID: mdl-28924460

ABSTRACT

There have been ample preclinical and animal studies showing the efficacy and safety of using various cells, such as stem cells or regulatory T cells, after transplantation for tissue repair, immunosuppression or tolerance induction. More recently, there has been significant progress in using cell therapy in solid organ transplantation in small clinical trials. Results have been promising, and using cell therapy in solid organ transplantation seems feasible and safe. However, there are more hurdles to overcome, such as the dose and timing of the infusions. Current studies have mainly focused on live-donor kidney transplantation; expanding current regimens to other organs and to deceased-donor transplantation will be crucial.

8.
Int J Organ Transplant Med ; 8(1): 49-51, 2017.
Article in English | MEDLINE | ID: mdl-28299028
9.
Int J Organ Transplant Med ; 7(2): 69-76, 2016.
Article in English | MEDLINE | ID: mdl-28435638

ABSTRACT

BACKGROUND: Splitting a liver for utilization in an adult and a pediatric recipient has been shown to decrease mortality on the wait list without increasing the overall risk of long-term graft failure compared to a whole graft. However, splitting a single donor organ between two adult recipients, i.e. full-right/full-left split liver transplantation (FRFLSLT), to overcome the organ shortage is still considered controversial. OBJECTIVE: This study assessed the outcome of FRFLSLT, comparing full-right (FR) and full-left (FL) with whole liver (WL) allografts in adults (1998-2010), using the UNOS standard transplant analysis and research (STAR) file. METHODS: Unadjusted allograft and patient survival were estimated using Kaplan-Meier survival curves. Adjusted analyses of survival were conducted controlling for the propensity for a WL allograft. RESULTS: There were 83,313 cases of WL, 651 of FR and 117 of FL. Significant differences were evident in the unadjusted cohort between recipients who received FR and FL, including donor characteristics, cold ischemic time, and days on the transplant waiting list. Use of an FL allograft resulted in a trend toward lower graft and patient survival compared to WL and FR, which was not statistically significant (p=0.07). In the matched cohort, the FL hemiliver allograft had no detrimental effect on allograft or patient survival after split liver transplantation when compared to FR and WL. CONCLUSION: After adjusting for donor and recipient characteristics, there was no difference in allograft or patient survival with the use of FL, FR, or WL after liver transplantation in adults. FRFLSLT is a valuable and safe option to expand the donor pool.
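The unadjusted survival estimates above rely on the Kaplan-Meier estimator, S(t) = product over event times t_i <= t of (1 - d_i / n_i), where d_i is the number of events at t_i and n_i the number still at risk. A minimal pure-Python sketch with hypothetical follow-up data (not the UNOS cohort) shows how the curve is built:

```python
def kaplan_meier(times, events):
    """times: follow-up in days; events: 1 = graft failure, 0 = censored.
    Returns a list of (event_time, S(t)) pairs in time order."""
    s, curve = 1.0, []
    for t in sorted(set(t for t, e in zip(times, events) if e == 1)):
        at_risk = sum(1 for ti in times if ti >= t)          # n_i: still followed at t
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)  # d_i
        s *= 1 - d / at_risk
        curve.append((t, s))
    return curve

# Hypothetical cohort of 8 recipients (days of follow-up, failure indicator).
times  = [100, 250, 250, 400, 500, 500, 800, 900]
events = [1,   1,   0,   1,   0,   0,   1,   0]
curve = kaplan_meier(times, events)
```

Censored subjects (events = 0) leave the risk set without stepping the curve down, which is what distinguishes this estimator from a naive survival fraction.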

10.
Int J Organ Transplant Med ; 6(4): 141-9, 2015.
Article in English | MEDLINE | ID: mdl-26576259

ABSTRACT

BACKGROUND: There are over 250 kidney transplant programs in the USA. OBJECTIVE: To determine whether highly competitive regions, defined as regions with a higher number of transplant centers, approve and wait-list more end-stage renal disease (ESRD) candidates for transplant despite the consistent incidence and prevalence of ESRD nationwide. METHODS: ESRD Network and OPTN data completed in 2011 were obtained from all transplant centers, including listing data, market saturation, market share, organs transplanted, and ESRD prevalence. The Herfindahl-Hirschman Index (HHI), which measures the size of firms in relation to their industry, was used to determine the amount of competition. RESULTS: States were separated into 3 groups (HHI<1000 considered competitive; HHI 1000-1800 considered moderate competition; and HHI>1800 considered highly concentrated). The percentages of ESRD patients listed in competitive, moderate, and highly concentrated regions were 19.73%, 17.02%, and 13.75%, respectively. The difference in ESRD listing between competitive and highly concentrated regions was significant (p<0.05). CONCLUSION: When there is strong competition without a dominant center, as defined by the HHI, the entire state tends to list more patients for transplant as centers drive up their own market share. Our analysis of the available national data suggests a discrepancy in access to transplantation for ESRD patients due to transplant center competition.
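The HHI bands above can be illustrated with a short sketch. The center volumes are hypothetical, not OPTN figures; with market shares expressed in percent, the HHI ranges from near 0 (many equal competitors) up to 10,000 (a single center).

```python
def hhi(volumes):
    """Herfindahl-Hirschman Index: sum of squared market shares (in percent)."""
    total = sum(volumes)
    return sum((100 * v / total) ** 2 for v in volumes)

def classify(h):
    """Competition bands as used in the study."""
    if h < 1000:
        return "competitive"
    if h <= 1800:
        return "moderate"
    return "highly concentrated"

# Twelve equally sized centers: HHI = 12 * (100/12)^2, about 833 -> competitive.
even_state = hhi([50] * 12)
# Two centers performing 70% and 30% of transplants: 70^2 + 30^2 = 5800.
duopoly_state = hhi([70, 30])
```

Note that an even split among n centers gives HHI = 10,000/n, so a state needs more than ten comparably sized programs to fall in the "competitive" band.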

11.
Prog Urol ; 25(12): 698-704, 2015 Oct.
Article in French | MEDLINE | ID: mdl-26341075

ABSTRACT

PURPOSE: To identify predictors for selecting patients who require analgesia during lithotripsy. METHODS: In this prospective study over a period of 13 months, 100 patients with kidney stones treated with an electromagnetic lithotripter (Siemens Lithoskop) were selected. To study the subjective pain caused by ESWL, a visual analog scale (VAS) was used at different times (T) of the session (T0 before shocks, T1 at 500 shocks, T2 at 1500 shocks, T end of treatment). A session was considered painless if VAS≤3. To identify predictors, we investigated the association between pain and the different characteristics of the patients, the kidney stones and the shock-wave settings. RESULTS: The analytical study showed that pain was correlated with female gender, anxiety score, skin-to-stone distance, parietal distance and the energy of the shock wave, whereas age, waist circumference, the circumstances of discovery, projection of the stone over the ribs and the number of shocks had no impact on the level of pain. CONCLUSION: Our study showed that even with a third-generation electromagnetic lithotripter, ESWL is still painful, leading to interruption of the session in 29% of cases. Four major predictors of pain leading to the use of sedo-analgesia early in the session were identified. LEVEL OF EVIDENCE: 3.


Subject(s)
Kidney Calculi/therapy , Lithotripsy/adverse effects , Pain/etiology , Anxiety/complications , Female , Humans , Lithotripsy/methods , Male , Middle Aged , Prospective Studies , Sex Factors , Visual Analog Scale
12.
Int J Organ Transplant Med ; 6(2): 55-60, 2015.
Article in English | MEDLINE | ID: mdl-26082829

ABSTRACT

BACKGROUND: Organ transplantation currently requires long-term immunosuppression, which is associated with multiple complications including infection, malignancy and other toxicities. Immunologic tolerance is considered the optimal solution to these limitations. OBJECTIVE: To develop a simple and non-toxic regimen to induce mixed chimerism and tolerance using mesenchymal stem cells (MSC) in a murine model. METHODS: Wild-type C57BL6 (H2D(k)) and Balb/C (H2D(d)) mice were used as donors and recipients, respectively. We aimed to achieve tolerance to skin grafts (SG) through mixed chimerism (MC) by simultaneous skin grafting and non-myeloablative donor bone marrow transplantation (DBMT) +/- MSC. All recipients received rapamycin and CTLA-4 Ig without radiation. RESULTS: DBMT+MSC combined with co-stimulation blockade and rapamycin led to stable mixed chimerism, expansion of the Treg population and donor-specific skin graft tolerance. Flow cytometry analysis revealed that recipient mice developed 15%-85% chimerism, and the skin allografts survived long-term. Omitting MSC failed to induce mixed chimerism and tolerance. CONCLUSION: Our results demonstrate that donor-specific immune tolerance can be effectively induced by a non-myeloablative DBMT-MSC combination without any additional cytoreductive treatment. This approach provides a promising and non-toxic allograft tolerance strategy.

13.
Int J Organ Transplant Med ; 5(4): 137-45, 2014.
Article in English | MEDLINE | ID: mdl-25426282

ABSTRACT

Organ transplantation is considered not only the last-resort therapy but also the treatment of choice for many patients with end-stage organ damage. Recipient-mediated acute or chronic immune response is the main challenge after transplant surgery. Nonspecific suppression of the host immune system is currently the only method used to prevent organ rejection. Lifelong immunosuppression causes significant side effects such as infections, malignancies, chronic kidney disease, hypertension and diabetes. This is more relevant in children, who have a longer life expectancy and so may receive immunosuppressive medications for a longer period. Efforts to minimize or completely withdraw immunosuppression would improve the quality of life and long-term outcomes of pediatric transplant recipients.

14.
Int J Organ Transplant Med ; 5(3): 87-96, 2014.
Article in English | MEDLINE | ID: mdl-25184029

ABSTRACT

Organ shortage is the greatest challenge facing the field of organ transplantation today. A variety of approaches have been implemented to expand the organ donor pool including live donation, a national effort to expand deceased donor donation, split organ donation, paired donor exchange, national sharing models and greater utilization of expanded criteria donors. Increased public awareness, improved efficiency of the donation process, greater expectations for transplantation, expansion of the living donor pool and the development of standardized donor management protocols have led to unprecedented rates of organ procurement and transplantation. Although live donors and donation after brain death account for the majority of organ donors, in the recent years there has been a growing interest in donors who have severe and irreversible brain injuries but do not meet the criteria for brain death. If the physician and family agree that the patient has no chance of recovery to a meaningful life, life support can be discontinued and the patient can be allowed to progress to circulatory arrest and then still donate organs (donation after circulatory death). Increasing utilization of marginal organs has been advocated to address the organ shortage.

15.
Int J Organ Transplant Med ; 5(2): 43-9, 2014.
Article in English | MEDLINE | ID: mdl-25013678

ABSTRACT

BACKGROUND: Live donor liver transplantation (LDLT) for patients with portal vein thrombosis (PVT) poses several technical challenges due to severe pre-operative condition and extensive collaterals. Although deceased donor liver transplantation in patients with PVT is now routinely performed at most centers, the impact of PVT on LDLT outcomes is still controversial. OBJECTIVE: To determine the outcome of patients with PVT who underwent LDLT. METHODS: We reviewed the outcomes of adult patients with PVT who underwent LDLT in the USA from 1998 to 2009. RESULTS: 68 (2.9%) of 2402 patients who underwent LDLT had PVT. Compared with patients without PVT, those with PVT were older (53 vs 50 yrs), more likely to be male, had a longer length of stay (25 vs 18 days) and a higher retransplantation rate (19% vs 10.7%). Allograft and patient survival were lower in patients with PVT. In Cox regression analysis, PVT was associated with worse allograft survival (HR=1.7, 95% CI: 1.1-2.5, p<0.001) and patient survival (HR=1.6, 95% CI: 1.2-2.4, p<0.001). CONCLUSIONS: Patients with PVT who underwent LDLT had a worse prognosis than those without PVT.

16.
Am J Transplant ; 13(10): 2739-42, 2013 Oct.
Article in English | MEDLINE | ID: mdl-23915277

ABSTRACT

Type 1 primary hyperoxaluria (PH1) causes renal failure, for which isolated kidney transplantation (KT) is usually unsuccessful due to early oxalate stone recurrence. Although hepatectomy and liver transplantation (LT) correct the PH1 enzymatic defect, simultaneous auxiliary partial liver transplantation (APLT) and KT have been suggested as an alternative approach. The advantages of APLT include preservation of the donor pool and retention of native liver function in the event of liver graft loss. However, the relative APLT mass may be inadequate to correct the defect. We here report the first case of native portal vein embolization (PVE) to increase the APLT-to-native-liver mass ratio (APLT/NLM-R). Following the initial combined APLT-KT, both allografts functioned well, but plasma oxalate levels did not normalize. We postulated that the inadequate APLT/NLM-R could be corrected by trans-hepatic native PVE. The resulting increase in APLT/NLM-R decreased serum oxalate to normal levels within 1 month following PVE. We conclude that persistently elevated oxalate levels after combined APLT-KT for PH1 result from inadequate relative functional capacity, which can be reversed by partial native PVE to decrease portal flow to the native liver. This approach might be applicable to other scenarios where partial grafts have been transplanted to replace native liver function.


Subject(s)
Embolization, Therapeutic , Hyperoxaluria, Primary/therapy , Kidney Failure, Chronic/therapy , Kidney Transplantation , Liver Transplantation , Portal Vein , Adult , Combined Modality Therapy , Humans , Male , Oxalates/metabolism , Prognosis , Transplantation, Homologous
17.
Transplant Proc ; 45(1): 279-80, 2013.
Article in English | MEDLINE | ID: mdl-23267801

ABSTRACT

One possibility to increase the organ pool is to use grafts from hepatitis B virus (HBV) surface antigen (HBsAg)-positive donors, but few data are currently available in this setting. Herein, we reviewed the outcomes of 92 liver transplantations using allografts from HBsAg-positive donors in the United States (1990-2009). The recipients had HBV-related (n = 68) or HBV-unrelated disease (n = 24). There was no difference between patients who received HBsAg-positive versus HBsAg-negative allografts in terms of age, Model for End-stage Liver Disease (MELD) score, length of stay, wait time, and donor risk index. HBsAg-positive allografts were more likely to be imported and to be used in MELD-exception cases. Allograft and patient survival were comparable between the two groups. HBsAg-positive allografts deserve consideration when no other organ is available within a suitable waiting time in the present era of highly effective antiviral therapy.


Subject(s)
Donor Selection , End Stage Liver Disease/surgery , Hepatitis B Surface Antigens/immunology , Hepatitis B/therapy , Liver Transplantation/methods , Tissue Donors , Databases, Factual , Graft Survival , Hepatitis B Antibodies/immunology , Hepatitis B virus , Humans , Middle Aged , Risk Factors , Transplantation, Homologous , United States
18.
Int J Organ Transplant Med ; 4(1): 27-9, 2013.
Article in English | MEDLINE | ID: mdl-25013650

ABSTRACT

BACKGROUND: Liver transplantation (LT) for polycystic liver disease (PLD) has evolved into an option for treating these patients. Patients with PLD suffer from incapacitating symptoms because of very large liver volumes, but liver function is preserved until a late stage. OBJECTIVE/METHODS: Herein, we reviewed the outcomes of adult patients with PLD who underwent LT in the US, comparing the pre-MELD (1990-2001) and MELD (2002-2009) eras. RESULTS: During this period, only 309 patients underwent LT for PLD, and the number remained very low in both eras. The percentage of patients who had combined liver and kidney transplantation (CLKT) for this disease did not change in the MELD era (42.8% vs 38.6%). The waiting times for LT (337 vs 272 days) and CLKT (289 vs 220 days) increased in the MELD era (p<0.001). In the MELD era, 53.4% of LT and 31.2% of CLKT were done as MELD-exception cases. Allograft and patient survival improved significantly in the MELD era. CONCLUSION: Patients with PLD had markedly improved outcomes after LT in the MELD era.

19.
Int J Organ Transplant Med ; 4(1): 35-7, 2013.
Article in English | MEDLINE | ID: mdl-25013652

ABSTRACT

Biliary hamartomata are rare benign lesions. Herein, we report on a 48-year-old man with a history of end-stage liver disease secondary to alcoholic liver disease. The patient received an orthotopic liver transplant from a brain-dead woman. At the time of organ recovery, there were multiple lesions in the transplanted liver measuring 7-10 mm. Pathology revealed multiple biliary hamartomata. The recipient's postoperative course was uncomplicated, and he was discharged home 10 days after the transplantation.

20.
Int J Organ Transplant Med ; 4(4): 137-43, 2013.
Article in English | MEDLINE | ID: mdl-25013666

ABSTRACT

BACKGROUND: Live-donor liver transplantation (LDLT) is a valuable option for patients with hepatocellular carcinoma (HCC): compared with deceased-donor liver transplantation (DDLT), the tumor can be eradicated earlier. METHODS: Herein, we reviewed the outcomes of adult patients with HCC who underwent LDLT from 1990 to 2009 in the USA, as reported to the United Network for Organ Sharing. RESULTS: Compared to DDLT recipients (n=5858), patients who underwent LDLT for HCC (n=170) were more likely to be female (43.8% vs 23.8%), younger (mean age 48.6 vs 54.9 years) and to have tumors outside the Milan criteria (30.7% vs 13.6%). However, recipients of LDLT for HCC had a significantly shorter mean wait time before transplantation (173 vs 219 days; p=0.04). Overall allograft and patient survival were not different, even though more patients in the LDLT group were outside the Milan criteria. Since implementation of the MELD exception for HCC, DDLT for HCC has increased from 337 cases (2.3%) in 2002 to 1142 (18.7%) in 2009 (p<0.001), whereas LDLT for HCC has remained stable, from 16 (5.7%) in 2002 to 14 (9.2%) in 2009 (p=0.1). Regions 1, 5 and 9 had the highest rates of LDLT for HCC compared to other regions. CONCLUSIONS: LDLT can achieve the same long-term outcomes as DDLT in patients with HCC. The current MELD prioritization for HCC reduces the necessity of LDLT for HCC except in areas with severe organ shortage.
