1.
Liver Transpl ; 2024 Aug 26.
Article in English | MEDLINE | ID: mdl-39177578

ABSTRACT

BACKGROUND: The impact of transjugular intrahepatic portosystemic shunt (TIPS) on waitlist mortality and liver transplantation (LT) urgency in Budd-Chiari syndrome (BCS) patients remains unclear. METHODS: We analyzed BCS patients listed for LT in the UNOS database (2002-2024) to assess the impact of TIPS on waitlist mortality and LT access via competing-risk analysis. We compared trends across two phases: Phase 1 (2002-2011) and Phase 2 (2012-2024). RESULTS: Of 815 BCS patients, 263 (32.3%) had received TIPS at listing. At listing, the TIPS group had lower MELD-Na scores (20 vs. 22, p<0.01), milder ascites (p=0.01), and fewer Status 1 patients (those at risk of imminent death while awaiting LT) (2.7% vs. 8.3%, p<0.01) than the non-TIPS group. TIPS patients had lower LT rates (43.3% vs. 56.5%, p<0.01) and longer waitlist times (350 vs. 113 days, p<0.01). TIPS use increased in Phase 2 (64.3% vs. 35.7%, p<0.01). Of 426 transplanted patients, 134 (31.5%) had received TIPS; at LT they showed lower MELD-Na scores (24 vs. 27, p<0.01) and better medical condition (intensive care unit: 14.9% vs. 21.9%, p<0.01). Status 1 patients were fewer (3.7% vs. 12.3%, p<0.01), and waiting time was longer (97 vs. 26 days, p<0.01) in the TIPS group. TIPS use at listing increased from Phase 1 (25.6%) to Phase 2 (37.7%). From Phase 1 to Phase 2, ascites severity improved, re-LT cases decreased (Phase 1: 9.8% vs. Phase 2: 2.2%, p<0.01), and cold ischemic time decreased slightly (Phase 1: 7.0 vs. Phase 2: 6.4 hours, p=0.14). Median donor body mass index increased significantly. No significant differences in patient or graft survival at 1, 5, or 10 years were identified between phases or between TIPS and non-TIPS patients. Although 90-day waitlist mortality showed no significant difference (p=0.11), TIPS trended toward lower mortality (subhazard ratio [sHR]: 0.70 [0.45-1.08]). Multivariable analysis indicated that TIPS was a significant factor in decreasing mortality (sHR: 0.45 [0.27-0.77], p<0.01). The TIPS group also showed significantly lower LT access (sHR: 0.65 [0.53-0.81], p<0.01), and multivariable analysis confirmed TIPS as a significant factor in decreasing access to LT (sHR: 0.60 [0.46-0.77], p<0.01). Subgroup analyses excluding Status 1 or HCC patients showed similar trends. CONCLUSION: In BCS patients listed for LT, TIPS reduces waitlist mortality and LT access, supporting its bridging role.
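For readers unfamiliar with the competing-risk framing, a minimal sketch is given below. It is not the authors' code: the study reports subdistribution hazard ratios (Fine-Gray style), whereas this sketch only shows the related cumulative-incidence view with the Aalen-Johansen estimator from the lifelines package. The toy data and column names (days_on_list, event, tips) are invented for illustration.

```python
import pandas as pd
from lifelines import AalenJohansenFitter

# Toy waitlist data; event coding: 0 = censored, 1 = death on waitlist, 2 = transplanted
df = pd.DataFrame({
    "days_on_list": [120, 350, 45, 600, 90, 210, 30, 480],
    "event":        [1,   2,   2,  0,   1,  2,   1,  1],
    "tips":         [0,   1,   0,  1,   0,  1,   0,  1],
})

for group, sub in df.groupby("tips"):
    ajf = AalenJohansenFitter()
    # Cumulative incidence of waitlist death, with transplantation as the competing event
    ajf.fit(sub["days_on_list"], sub["event"], event_of_interest=1)
    print(f"TIPS={group}")
    print(ajf.cumulative_density_.tail(1))
```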

2.
Liver Transpl ; 2024 Aug 23.
Article in English | MEDLINE | ID: mdl-39172007

ABSTRACT

BACKGROUND: Post-liver transplant (LT) immunosuppression is necessary to prevent rejection; however, a major consequence is tumor recurrence. Although recurrence is a concern after LT for patients with hepatocellular carcinoma (HCC), the oncologically optimal tacrolimus (FK) regimen is still unknown. METHODS: This retrospective study included 1,406 LT patients with HCC (2002-2019) at four U.S. institutions utilizing variable post-LT immunosuppression regimens. ROC analyses were performed to investigate the influence of post-LT time-weighted average FK (TWA-FK) level on HCC recurrence. A competing risk analysis was employed to evaluate the prognostic influence of TWA-FK while adjusting for patient and tumor characteristics. RESULTS: The area under the curve (AUC) for TWA-FK was greatest at 2 weeks (0.68), followed by 1 week (0.64) post-LT. Importantly, this was consistently observed across the institutions despite immunosuppression regimen variability. Additionally, the TWA-FK at 2 weeks was not associated with rejection within 6 months of LT. A competing risk regression analysis showed that TWA-FK at 2 weeks post-LT is significantly associated with recurrence (HR: 1.31, 95% CI: 1.21-1.41, p<0.001). The effect of TWA-FK on recurrence varied with the exposure level and the individual's risk of recurrence, including vascular invasion and tumor morphology. CONCLUSION: Although previous studies have explored the influence of FK levels at 1 to 3 months post-LT on HCC recurrence, the current study suggests that earlier time points and exposure levels must be evaluated. Each patient's oncological risk must also be considered in developing an individualized immunosuppression regimen.
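To make the TWA-FK idea concrete, here is a hedged sketch of one plausible way to compute a time-weighted average trough level over the first 14 days and relate it to recurrence with an ROC AUC. The trapezoidal averaging, the toy measurements, and the function name are assumptions for illustration, not the study's actual pipeline.

```python
from sklearn.metrics import roc_auc_score

def twa_level(days, levels, window=14):
    """Trapezoidal time-weighted average of trough levels measured within `window` days."""
    pts = [(d, l) for d, l in zip(days, levels) if d <= window]
    area = sum((pts[i][1] + pts[i + 1][1]) / 2 * (pts[i + 1][0] - pts[i][0])
               for i in range(len(pts) - 1))
    return area / (pts[-1][0] - pts[0][0])

# Toy cohort: (measurement days, tacrolimus troughs in ng/mL, HCC recurrence 0/1)
cohort = [
    ([1, 3, 7, 14], [6.0, 8.5, 9.0, 10.2], 0),
    ([1, 4, 8, 14], [9.5, 11.0, 12.3, 13.1], 1),
    ([2, 5, 9, 13], [5.5, 7.0, 7.8, 8.1], 0),
    ([1, 3, 6, 12], [10.0, 12.5, 13.0, 14.2], 1),
]
twa = [twa_level(d, l) for d, l, _ in cohort]
recurrence = [y for _, _, y in cohort]
print("AUC for TWA-FK at 2 weeks:", roc_auc_score(recurrence, twa))
```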

3.
Hepatol Res ; 2024 Aug 12.
Article in English | MEDLINE | ID: mdl-39134448

ABSTRACT

AIM: Liver fibrosis, heralding potential progression to cirrhosis and hepatocellular carcinoma (HCC), compromises patient survival and augments post-hepatectomy recurrence. This study examined the detrimental effects of liver fibrosis on the antitumor functions of liver natural killer (NK) cells and the interleukin-33 (IL-33) signaling pathway. METHODS: Our investigation combined human data, using living and deceased donor livers, with the carbon tetrachloride (CCl4)-induced mouse fibrosis model to characterize the interface between liver fibrosis and weakened hepatic immunity. RESULTS: The Fibrosis-4 (FIB-4) index emerged as a salient, non-invasive prognostic marker: its elevation correlated with reduced survival and heightened recurrence after HCC surgery even after propensity matching (n = 385). We established a strong correlation between liver fibrosis and liver NK cell dysfunction by developing a method for extracting liver NK cells from liver graft perfusate. Furthermore, liver fibrosis disrupted chemokine signaling and promoted IL-33 expression, impeding liver NK cell antitumor activity, as evidenced in mouse models. Intriguingly, our results implicated IL-33 in diminishing the antitumor responses of NK cells. This interrelation, consistent across both mouse and human studies, coincides with clinical data suggesting that liver fibrosis predisposes patients to an increased risk of HCC recurrence. CONCLUSION: Our study revealed a critical relationship between liver fibrosis and compromised tumor immunity, emphasizing the potential interference of IL-33 with NK cell function. These insights advocate for advanced immunostimulatory therapies targeting cytokines such as IL-33, aiming to bolster the hepatic immune response against HCC in the context of liver fibrosis.
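The FIB-4 index cited above is a standard non-invasive fibrosis score computed from age and routine laboratory values. A small helper is shown as an illustration only; the specific cut-off used in the study is not restated here.

```python
from math import sqrt

def fib4(age_years: float, ast_u_l: float, alt_u_l: float, platelets_1e9_l: float) -> float:
    """FIB-4 = (age [years] * AST [U/L]) / (platelets [10^9/L] * sqrt(ALT [U/L]))."""
    return (age_years * ast_u_l) / (platelets_1e9_l * sqrt(alt_u_l))

# Example: a 62-year-old with AST 48 U/L, ALT 35 U/L, platelets 150 x10^9/L -> FIB-4 ~ 3.4
print(round(fib4(62, 48, 35, 150), 2))
```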

4.
Clin Transplant ; 38(7): e15379, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38952196

ABSTRACT

BACKGROUND: Introducing new liver transplantation (LT) practices, such as unconventional donor use, incurs higher costs, making evaluation of their prognostic justification crucial. This study examines the spread pattern of new LT practices and the associated prognosis across the United States. METHODS: The study investigated the spread pattern of new practices using the UNOS database (2014-2023). Practices included LT for hepatitis B/C (HBV/HCV) nonviremic recipients with viremic donors, LT for COVID-19-positive recipients, and LT using onsite machine perfusion (OMP). One-year post-LT patient and graft survival were also evaluated. RESULTS: LTs using HBV/HCV donors were common in the East, while LTs for COVID-19 recipients and those using OMP started predominantly in California, Arizona, Texas, and the Northeast. K-means cluster analysis identified three adoption groups: facilities with rapid, slow, and minimal adoption rates. Rapid adoption occurred mainly in high-volume centers, followed by a gradual increase in middle-volume centers, with little increase in low-volume centers. The current spread patterns did not significantly affect patient survival. Specifically, for LTs with HCV donors or COVID-19 recipients, patient and graft survival in the rapid-increasing group were comparable to the other groups. In LTs involving OMP, the rapid- and slow-increasing groups tended to have better patient survival (p = 0.05) and significantly better graft survival (p = 0.02). Facilities adopting new practices often overlapped across different practices. DISCUSSION: Our analysis revealed three distinct adoption groups across all practices, correlating adoption aggressiveness with center LT volume. Aggressive adoption of new practices did not compromise patient or graft survival, supporting the current strategy. Understanding historical trends could help predict the rise in future LT cases with new practices, aiding resource distribution.
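A hedged sketch of the kind of K-means grouping described above: centers are clustered by their year-over-year adoption trajectory into three groups. The adoption-rate matrix is fabricated for illustration; the actual UNOS-derived features are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Rows = transplant centers; columns = share of LTs using the new practice in successive years
adoption = np.array([
    [0.02, 0.10, 0.25, 0.40],   # rapid adopter
    [0.05, 0.15, 0.30, 0.45],   # rapid adopter
    [0.00, 0.02, 0.08, 0.15],   # slow adopter
    [0.00, 0.01, 0.05, 0.12],   # slow adopter
    [0.00, 0.00, 0.01, 0.02],   # minimal adopter
    [0.00, 0.00, 0.00, 0.01],   # minimal adopter
])
X = StandardScaler().fit_transform(adoption)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)  # three adoption groups: rapid / slow / minimal
```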


Subject(s)
COVID-19 , Graft Survival , Liver Transplantation , SARS-CoV-2 , Humans , Liver Transplantation/statistics & numerical data , United States/epidemiology , COVID-19/epidemiology , Female , Male , Middle Aged , Tissue and Organ Procurement/statistics & numerical data , Tissue Donors/supply & distribution , Tissue Donors/statistics & numerical data , Adult , Survival Rate , Prognosis , Practice Patterns, Physicians'/statistics & numerical data
5.
Transplantation ; 2024 Jun 04.
Article in English | MEDLINE | ID: mdl-38831488

ABSTRACT

BACKGROUND: This study compares selection criteria for liver transplant (LT) for hepatocellular carcinoma (HCC) in terms of inclusivity and predictive ability, to identify the most permissive criteria that maintain patient outcomes. METHODS: The Scientific Registry of Transplant Recipients (SRTR) database was queried for deceased donor LTs for HCC (2003-2020) with 3-y follow-up; these data were compared with a 2-center experience. Milan, University of California, San Francisco (UCSF), 5-5-500, Up-to-seven (U7), HALT-HCC, and Metroticket 2.0 scores were calculated. RESULTS: Nationally, 26 409 patients were included, and 547 at the 2 institutions. Median SRTR follow-up was 6.8 y (interquartile range 3.9-10.1). Three criteria allowed expansion of candidacy versus Milan: UCSF (7.7%, n = 1898), Metroticket 2.0 (4.2%, n = 1037), and U7 (3.5%, n = 828). The absolute difference in 3-y overall survival (OS) between scores was 1.5%. HALT-HCC (area under the curve [AUC] = 0.559, 0.551-0.567) best predicted 3-y OS, although AUC was notably similar between criteria (0.506 < AUC < 0.527; Milan = 0.513, UCSF = 0.506, 5-5-500 = 0.522, U7 = 0.511, HALT-HCC = 0.559, and Metroticket 2.0 = 0.520), as was Harrell's c-statistic (0.507 < c-statistic < 0.532). All scores predicted survival (P < 0.001) on competing risk analysis. Median follow-up in our two-center cohort was 9.8 y (interquartile range 7.1-13.3). U7 (13.0%, n = 58), UCSF (11.1%, n = 50), HALT-HCC (6.4%, n = 29), and Metroticket 2.0 (6.3%, n = 28) allowed candidate expansion. HALT-HCC (AUC = 0.768, 0.713-0.823) and Metroticket 2.0 (AUC = 0.739, 0.677-0.801) were the most predictive of recurrence. All scores predicted recurrence and survival (P < 0.001) using competing risk analysis. CONCLUSIONS: Less restrictive criteria such as Metroticket 2.0, UCSF, or U7 allow broader application of transplantation for HCC without sacrificing outcomes. Thus, the criteria for Model for End-stage Liver Disease exception points for HCC should be expanded to allow more patients to receive life-saving transplantation.
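For orientation, the dichotomous criteria compared above are commonly operationalized as below. This is a simplified sketch under stated assumptions: vascular invasion and extrahepatic disease exclusions are omitted, and the continuous scores (HALT-HCC, Metroticket 2.0) are not reproduced because their coefficient formulas are not restated in the abstract.

```python
def milan(sizes_cm):
    """Single tumor <=5 cm, or up to 3 tumors each <=3 cm."""
    n, largest = len(sizes_cm), max(sizes_cm)
    return (n == 1 and largest <= 5.0) or (n <= 3 and largest <= 3.0)

def ucsf(sizes_cm):
    """Single tumor <=6.5 cm, or up to 3 tumors, largest <=4.5 cm, total diameter <=8 cm."""
    n, largest, total = len(sizes_cm), max(sizes_cm), sum(sizes_cm)
    return (n == 1 and largest <= 6.5) or (n <= 3 and largest <= 4.5 and total <= 8.0)

def up_to_seven(sizes_cm):
    """Tumor count plus largest diameter (cm) no more than seven."""
    return len(sizes_cm) + max(sizes_cm) <= 7.0

def five_five_500(sizes_cm, afp_ng_ml):
    """At most 5 nodules, largest <=5 cm, AFP <=500 ng/mL."""
    return len(sizes_cm) <= 5 and max(sizes_cm) <= 5.0 and afp_ng_ml <= 500

# Hypothetical patient: two tumors (4.2 cm and 2.1 cm), AFP 180 ng/mL
tumors, afp = [4.2, 2.1], 180
print(milan(tumors), ucsf(tumors), up_to_seven(tumors), five_five_500(tumors, afp))
# -> False True True True: outside Milan, inside the more permissive criteria
```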

6.
Article in English | MEDLINE | ID: mdl-38908731

ABSTRACT

BACKGROUND & AIMS: Continuous risk-stratification of candidates and urgency-based prioritization have been utilized for liver transplantation (LT) in patients with non-hepatocellular carcinoma (HCC) in the United States. Instead, for patients with HCC, a dichotomous criterion with exception points is still used. This study evaluated the utility of the hazard associated with LT for HCC (HALT-HCC), an oncological continuous risk score, to stratify waitlist dropout and post-LT outcomes. METHODS: A competing risk model was developed and validated using the UNOS database (2012-2021) through multiple policy changes. The primary outcome was to assess the discrimination ability of waitlist dropouts and LT outcomes. The study focused on the HALT-HCC score, compared with other HCC risk scores. RESULTS: Among 23,858 candidates, 14,646 (59.9%) underwent LT and 5196 (21.8%) dropped out of the waitlist. Higher HALT-HCC scores correlated with increased dropout incidence and lower predicted 5-year overall survival after LT. HALT-HCC demonstrated the highest area under the curve (AUC) values for predicting dropout at various intervals post-listing (0.68 at 6 months, 0.66 at 1 year), with excellent calibration (R2 = 0.95 at 6 months, 0.88 at 1 year). Its accuracy remained stable across policy periods and locoregional therapy applications. CONCLUSIONS: This study highlights the predictive capability of the continuous oncological risk score to forecast waitlist dropout and post-LT outcomes in patients with HCC, independent of policy changes. The study advocates integrating continuous scoring systems like HALT-HCC in liver allocation decisions, balancing urgency, organ utility, and survival benefit.

7.
Transplant Direct ; 10(7): e1657, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38881743

ABSTRACT

Background: The role of donor age in liver transplantation (LT) outcomes for hepatocellular carcinoma (HCC) is controversial. Given the significant risk of HCC recurrence post-LT, optimizing donor/recipient matching is crucial. This study reassesses the impact of young donors on LT outcomes in patients with HCC. Methods: A retrospective review of 11 704 LT cases from the United Network for Organ Sharing database (2012-2021) was conducted. The study focused on the effect of donor age on recurrence-free survival, using hazard associated with LT for HCC (HALT-HCC) and Metroticket 2.0 scores to evaluate post-LT survival in patients with HCC. Results: Of 4706 cases with young donors, 11.0% had HCC recurrence or death within 2 y, and 18.3% within 5 y. These outcomes were comparable with those of non-young donors. A significant correlation between donor age and post-LT recurrence or mortality (P = 0.04) was observed, which became statistically insignificant after tumor-related adjustments (P = 0.32). The Kaplan-Meier curve showed that recipients with lower HALT-HCC scores (<9) and Metroticket 2.0 scores (<2.2) significantly benefited from young donors, unlike those exceeding these score thresholds. Cox regression analysis showed that donor age significantly influenced outcomes in recipients below certain score thresholds but was less impactful for higher scores. Conclusions: Young donors are particularly beneficial for LT recipients with less aggressive HCC, as indicated by their HALT-HCC and Metroticket 2.0 scores. These findings suggest strategically allocating young donors to recipients with less aggressive tumor profiles, which could foster more efficient use of the scarce donor supply and potentially enhance post-LT outcomes.
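As a hedged sketch of the stratified analysis described above, the snippet below fits a Cox model restricted to recipients under a score threshold and estimates the donor-age hazard ratio within that stratum. The toy data, column names, and the exact HALT-HCC threshold handling are placeholders, not UNOS fields or the study's code.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_to_event":      [1.2, 3.5, 0.8, 5.0, 2.4, 4.1, 0.5, 3.0, 2.2, 6.0, 1.8, 4.6],
    "recurrence_or_death": [1,   0,   1,   0,   1,   0,   1,   0,   1,   0,   1,   0],
    "donor_age":           [25,  60,  30,  45,  70,  22,  55,  38,  64,  28,  49,  33],
    "halt_hcc":            [4,   12,  6,   3,   15,  5,   11,  7,   8,   2,   6,   7],
})

# Restrict to the lower-risk stratum (HALT-HCC < 9, the threshold cited above)
low_risk = df[df["halt_hcc"] < 9][["years_to_event", "recurrence_or_death", "donor_age"]]
cph = CoxPHFitter().fit(low_risk, duration_col="years_to_event", event_col="recurrence_or_death")
cph.print_summary()  # hazard ratio for donor_age within the low-risk stratum
```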

8.
HPB (Oxford) ; 2024 Jun 04.
Article in English | MEDLINE | ID: mdl-38879433

ABSTRACT

BACKGROUND: Cause of death (COD) is a predictor of liver transplant (LT) outcomes independent of donor age, yet it has not been recently reappraised. METHODS: Analyzing the UNOS database (2013-2022), the study explored COD trends and their impact on one-year post-LT graft survival (GS) and hazard ratios (HRs) for graft failure. RESULTS: Of 80,282 brain-dead donors, 55,413 (69.0%) underwent initial LT. Anoxia became the predominant COD in 2015, increasing from 29.0% in 2013 to 45.1% in 2021, with notable increases in drug intoxication. Survival differences between anoxia and cerebrovascular accident (CVA) donors recently became insignificant (P=0.95). Further analysis showed that GS with intracranial hemorrhage/stroke donors, previously inferior (P<0.01), no longer differed significantly (P=0.70). HRs for post-1-year graft failure have recently lost significance for CVA (vs. anoxia) and intracranial hemorrhage/stroke (vs. any other COD). Donors with intracranial hemorrhage/stroke, showing improved survival and HRs, were allocated to recipients with lower MELD-Na, in contrast to the trend for drug-intoxication CODs. DISCUSSION: CVA, traditionally linked with poorer outcomes, now shows improved GS and HRs (vs. anoxia). This could be due to rising drug-intoxication cases and the allocation of donors with drug intoxication to recipients with higher MELD-Na, and of donors with CVA to recipients with lower scores. While COD remains crucial in donor selection, proper matching can mitigate differences among CODs.

10.
Am J Transplant ; 2024 Jun 10.
Article in English | MEDLINE | ID: mdl-38866110

ABSTRACT

Medical literature highlights differences in liver transplantation (LT) waitlist experiences among ABO blood types. Type AB candidates reportedly have higher LT rates and reduced mortality. Despite liver offering guidelines, ABO disparities persist. This study examines LT access discrepancies among blood types, focusing on type AB, and seeks equitable strategies. Using the United Network for Organ Sharing database (2003-2022), 170 276 waitlist candidates were retrospectively analyzed. Dual predictive analyses (LT opportunity and survival studies) evaluated 1-year recipient pool survival, considering waitlist and post-LT survival, alongside anticipated allocation value per recipient, under 6 scenarios. Of the cohort, 97 670 patients (57.2%) underwent LT. Type AB recipients had the highest LT rate (73.7% vs 55.2% for O), shortest median waiting time (90 vs 198 days for A), and lowest waitlist mortality (12.9% vs 23.9% for O), with the lowest median model for end-stage liver disease-sodium (MELD-Na) score (20 vs 25 for A/O). The LT opportunity study revealed that reallocating type A (or A and O) donors originally for AB recipients to A recipients yielded the greatest reduction in disparities in anticipated value per recipient, from 0.19 (before modification) to 0.08. Meanwhile, the survival study showed that ABO-identical LTs reduced disparity the most (3.5% to 2.8%). Sensitivity analysis confirmed these findings were specific to the MELD-Na score < 30 population, indicating current LT allocation may favor certain blood types. Prioritizing ABO-identical LTs for MELD-Na score < 30 recipients could ensure uniform survival outcomes and mitigate disparities.

11.
JMA J ; 7(2): 232-239, 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38721076

ABSTRACT

Introduction: Hepatocellular carcinoma (HCC) is a major global health challenge, being the fifth most prevalent neoplasm and the third leading cause of cancer-related deaths worldwide. Liver transplantation offers a potentially curative approach for HCC, yet the risk of recurrence after transplantation remains a significant concern. This study investigates the influence of a liver immune status index (LISI) on the prognosis of patients undergoing living-donor liver transplantation for HCC. Methods: In a single-center study spanning 2001 to 2020, 113 patients undergoing living-donor liver transplantation for HCC were analyzed. The LISI was calculated for each donor liver using body mass index, serum albumin level, and the fibrosis-4 index. This study assessed the impact of donor LISI on short-term recurrence rates and survival, with special attention to its correlation with the antitumor activity of liver natural killer (NK) cells. Results: The patients were divided into two groups (high donor LISI, >-1.23 [n = 43]; low donor LISI, ≤-1.23 [n = 70]). After propensity matching to adjust for recipient background factors, the 1- and 3-year survival rates were 92.6% and 88.9% in the low donor LISI group versus 81.5% and 70.4% in the high donor LISI group (p = 0.11). The 1- and 3-year recurrence-free survival rates were 88.9% and 85.2% versus 74.1% and 55.1%, respectively (p = 0.02). Conclusions: This study underscores the potential of the LISI as a noninvasive biomarker of liver NK cell antitumor capacity, with implications for living-donor liver transplantation for HCC. Donor LISI emerges as a significant predictor of early recurrence risk following living-donor liver transplantation for HCC, highlighting the role of liver NK cell antitumor activity in managing liver malignancies.

12.
Liver Transpl ; 30(9): 887-895, 2024 Sep 01.
Article in English | MEDLINE | ID: mdl-38727618

ABSTRACT

There is no recent update on the clinical course of retransplantation (re-LT) after living donor liver transplantation (LDLT) in the US using national data. The UNOS database (2002-2023) was used to explore patient characteristics at initial LT, comparing deceased donor liver transplantation (DDLT) and LDLT for graft survival (GS), reasons for graft failure, and GS after re-LT. Waitlist dropout and the likelihood of re-LT were also assessed, with the re-LT cohort categorized as acute or chronic based on time to re-listing (≤1 or >1 mo). Of 132,323 DDLT and 5955 LDLT initial transplants, 3848 DDLT and 302 LDLT recipients underwent re-LT. Of the 302 re-LTs following LDLT, 156 were acute and 146 chronic. Primary nonfunction (PNF) was more common in DDLT, although the difference was not statistically significant (17.4% vs. 14.8% for LDLT; p = 0.52). Vascular complications were significantly more frequent in LDLT (12.5% vs. 8.3% for DDLT; p < 0.01). Acute re-LT showed a larger difference in PNF between DDLT and LDLT (49.7% vs. 32.0%; p < 0.01). Status 1 patients were more common in DDLT (51.3% vs. 34.0% in LDLT; p < 0.01). In the acute cohort, Kaplan-Meier curves indicated superior GS after re-LT for initial LDLT recipients in both the short and long term (p = 0.02 and < 0.01, respectively), with no significant difference in the chronic cohort. No significant differences in waitlist dropout were observed, but the initial LDLT group had a higher likelihood of re-LT in the acute cohort (sHR 1.40, p < 0.01). A sensitivity analysis of the most recent 10-year cohort revealed trends consistent with the overall findings. LDLT recipients had better GS after re-LT than DDLT recipients. Despite a higher severity of illness, the DDLT cohort was less likely to undergo re-LT.
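A minimal sketch of the acute/chronic split and a Kaplan-Meier fit per cohort, along the lines of the categorization described above (re-listing within 1 month = acute). Everything here, including the column names and toy values, is illustrative rather than drawn from UNOS, and the actual study compared LDLT versus DDLT within each cohort.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

relist = pd.DataFrame({
    "days_to_relisting": [5, 400, 12, 20, 90, 3, 250, 15, 60, 8],
    "graft_years":       [2.0, 5.5, 0.5, 7.1, 3.3, 1.1, 6.0, 4.2, 2.8, 0.9],
    "graft_failed":      [1,   0,   1,   0,   1,   1,   0,   0,   1,   1],
})
# Acute = relisted within 30 days of the first transplant, chronic otherwise
relist["cohort"] = np.where(relist["days_to_relisting"] <= 30, "acute", "chronic")

for name, sub in relist.groupby("cohort"):
    km = KaplanMeierFitter().fit(sub["graft_years"], sub["graft_failed"], label=name)
    print(name, "median graft survival after re-LT:", km.median_survival_time_)
```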


Subject(s)
Databases, Factual , Graft Survival , Liver Transplantation , Living Donors , Reoperation , Waiting Lists , Humans , Liver Transplantation/statistics & numerical data , Liver Transplantation/adverse effects , Liver Transplantation/methods , Living Donors/statistics & numerical data , Female , Male , United States/epidemiology , Reoperation/statistics & numerical data , Middle Aged , Adult , Databases, Factual/statistics & numerical data , Waiting Lists/mortality , Treatment Outcome , Time Factors , Aged , Graft Rejection/epidemiology , Graft Rejection/etiology , Graft Rejection/prevention & control , Risk Factors
13.
Clin Transplant ; 38(4): e15316, 2024 04.
Article in English | MEDLINE | ID: mdl-38607291

ABSTRACT

BACKGROUND: Graft failure following liver transplantation (LTx) continues to occur at a consistent rate. While traditional risk scores for LTx have limited accuracy, the potential of machine learning (ML) in this area remains uncertain, despite its promise in other transplant domains. This study aims to determine ML's predictive limitations in LTx by replicating methods used in previous heart transplant research. METHODS: This study utilized the UNOS STAR database, selecting 64,384 adult patients who underwent LTx between 2010 and 2020. Gradient boosting models (XGBoost and LightGBM) were used to predict 14-, 30-, and 90-day graft failure and were compared with a conventional logistic regression model. Models were evaluated using both shuffled and rolling cross-validation (CV) methodologies. Model performance was assessed using the AUC across validation iterations. RESULTS: For 14-, 30-, and 90-day graft survival, LightGBM consistently outperformed the other models, achieving the highest AUCs of 0.740, 0.722, and 0.700 under shuffled CV. However, under rolling CV, accuracy declined for every ML algorithm. The analysis revealed influential factors for graft survival prediction across all models, including total bilirubin, medical condition, recipient age, and donor AST, among others. Several features, such as donor age and recipient diabetes history, were important in two of the three models. CONCLUSIONS: LightGBM enhances short-term graft survival predictions post-LTx. However, due to changing medical practices and selection criteria, continuous model evaluation is essential. Future studies should focus on temporal variations, clinical implications, and ensuring model transparency for broader medical utility.
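A minimal sketch of the two validation schemes contrasted above: shuffled K-fold versus rolling (time-ordered) splits for a gradient-boosted classifier. Synthetic features stand in for the UNOS variables, and the slight temporal drift is an assumption added to show why the two schemes can disagree; the AUC values reported in the abstract come from the study, not from this demo.

```python
import numpy as np
from lightgbm import LGBMClassifier
from sklearn.model_selection import KFold, TimeSeriesSplit
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 8))                      # stand-ins for bilirubin, ages, donor AST, ...
drift = np.linspace(0.0, 1.0, n)                 # practice changes over calendar time
y = (X[:, 0] + 0.5 * X[:, 1] + drift + rng.normal(scale=1.5, size=n) > 1.0).astype(int)

def cv_auc(splitter):
    aucs = []
    for train, test in splitter.split(X):
        model = LGBMClassifier(n_estimators=200, verbose=-1).fit(X[train], y[train])
        aucs.append(roc_auc_score(y[test], model.predict_proba(X[test])[:, 1]))
    return np.mean(aucs)

print("shuffled CV AUC:", round(cv_auc(KFold(n_splits=5, shuffle=True, random_state=0)), 3))
print("rolling  CV AUC:", round(cv_auc(TimeSeriesSplit(n_splits=5)), 3))
```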


Subject(s)
Liver Transplantation , Adult , Humans , Liver Transplantation/adverse effects , Research Design , Algorithms , Bilirubin , Machine Learning
15.
Liver Transpl ; 2024 Apr 17.
Article in English | MEDLINE | ID: mdl-38625836

ABSTRACT

The use of older donors after circulatory death (DCD) for liver transplantation (LT) has increased over the past decade. This study examined whether outcomes of LT using older DCD (≥50 y) have improved with advancements in surgical/perioperative care and normothermic machine perfusion (NMP) technology. A total of 7602 DCD LT cases from the United Network for Organ Sharing database (2003-2022) were reviewed. The impact of older DCD donors on graft survival was assessed using the Kaplan-Meier and HR analyses. In all, 1447 LT cases (19.0%) involved older DCD donors. Although there was a decrease in their use from 2003 to 2014, a resurgence was noted after 2015 and reached 21.9% of all LTs in the last 4 years (2019-2022). Initially, 90-day and 1-year graft survivals for older DCDs were worse than younger DCDs, but this difference decreased over time and there was no statistical difference after 2015. Similarly, HRs for graft loss in older DCD have recently become insignificant. In older DCD LT, NMP usage has increased recently, especially in cases with extended donor-recipient distances, while the median time from asystole to aortic cross-clamp has decreased. Multivariable Cox regression analyses revealed that in the early phase, asystole to cross-clamp time had the highest HR for graft loss in older DCD LT without NMP, while in the later phases, the cold ischemic time (>5.5 h) was a significant predictor. LT outcomes using older DCD donors have become comparable to those from young DCD donors, with recent HRs for graft loss becoming insignificant. The strategic approach in the recent period could mitigate risks, including managing cold ischemic time (≤5.5 h), reducing asystole to cross-clamp time, and adopting NMP for longer distances. Optimal use of older DCD donors may alleviate the donor shortage.

16.
Clin Transplant ; 38(1): e15155, 2024 01.
Article in English | MEDLINE | ID: mdl-37812571

ABSTRACT

BACKGROUND: Donors with hyperbilirubinemia are often not utilized for liver transplantation (LT) due to concerns about potential liver dysfunction and graft survival. The potential to mitigate organ shortages using such donors remains unclear. METHODS: This study analyzed adult deceased donor data from the United Network for Organ Sharing database (2002-2022). Hyperbilirubinemia was categorized as high total bilirubin (3.0-5.0 mg/dL) and very high bilirubin (≥5.0 mg/dL) in brain-dead donors. We assessed the impact of donor hyperbilirubinemia on 3-month and 3-year graft survival, comparing these outcomes to donors after circulatory death (DCD). RESULTS: Of 138 622 donors, 3452 (2.5%) had high bilirubin and 1999 (1.4%) had very high bilirubin levels. Utilization rates for normal, high, and very high bilirubin groups were 73.5%, 56.4%, and 29.2%, respectively. No significant differences were found in 3-month and 3-year graft survival between groups. Donors with high bilirubin had superior 3-year graft survival compared to DCD (hazard ratio .83, p = .02). Factors associated with inferior short-term graft survival included recipient medical condition in intensive care unit (ICU) and longer cold ischemic time; factors associated with inferior long-term graft survival included older donor age, recipient medical condition in ICU, older recipient age, and longer cold ischemic time. Donors with ≥10% macrosteatosis in the very high bilirubin group were also associated with worse 3-year graft survival (p = .04). DISCUSSION: The study suggests that despite many grafts with hyperbilirubinemia being non-utilized, acceptable post-LT outcomes can be achieved using donors with hyperbilirubinemia. Careful selection may increase utilization and expand the donor pool without negatively affecting graft outcome.


Subject(s)
Liver , Tissue and Organ Procurement , Adult , Humans , Prognosis , Tissue Donors , Graft Survival , Hyperbilirubinemia/etiology , Bilirubin , Retrospective Studies
18.
J Hepatobiliary Pancreat Sci ; 31(2): 67-68, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37877501

ABSTRACT

Tashiro and colleagues demonstrated for the first time that an artificial intelligence system can precisely identify intrahepatic vascular structures during laparoscopic liver resection in real time through color coding under bleeding and indocyanine green fluorescent imaging. The system supports real-time navigation and offers potentially safer laparoscopic or robotic liver surgery.


Subject(s)
Artificial Intelligence , Laparoscopy , Humans , Optical Imaging/methods , Laparoscopy/methods , Coloring Agents , Indocyanine Green , Hepatectomy/methods , Liver/diagnostic imaging , Liver/surgery
19.
Surgery ; 175(2): 513-521, 2024 02.
Article in English | MEDLINE | ID: mdl-37980203

ABSTRACT

BACKGROUND: Long-distance-traveling liver grafts in liver transplantation present challenges due to prolonged cold ischemic time and increased risk of ischemia-reperfusion injury. We identified donor and recipient characteristics of long-distance-traveling liver grafts and risk factors associated with their use. METHODS: We conducted a retrospective analysis of liver transplant recipients registered from 2014 to 2020 in the United Network for Organ Sharing registry database. Donor, recipient, and transplant factors of graft survival were compared between short-travel grafts and long-distance-traveling liver grafts (traveled >500 miles). RESULTS: During the study period, 28,265 patients received a donation after brainstem death liver transplantation and 3,250 a donation after circulatory death liver transplantation. The long-distance-traveling liver graft rate was 6.2% in donation after brainstem death liver transplantation and 7.1% in donation after circulatory death liver transplantation. The 90-day graft survival rates were significantly worse for long-distance-traveling liver grafts (donation after brainstem death: 95.7% vs 94.5%; donation after circulatory death: 94.5% vs 93.9%). The 3-year graft survival rates were similar for long-distance-traveling liver grafts (donation after brainstem death: 85.5% vs 85.1%; donation after circulatory death: 81.0% vs 80.4%). Cubic spline regression analyses revealed that travel distance did not linearly worsen 3-year graft survival. On the other hand, younger donor age, lower donor body mass index, and shorter cold ischemic time mitigated the negative impact on 90-day graft survival in long-distance-traveling liver grafts. CONCLUSION: The use of long-distance-traveling liver grafts negatively impacts 90-day graft survival but not 3-year graft survival. Moreover, long-distance-traveling liver grafts are more feasible when appropriate donor and recipient factors offset the extended cold ischemic time. Machine perfusion can improve long-distance-traveling liver graft use. Enhanced collaboration between organ procurement organizations and transplant centers and optimized transportation systems are essential for increasing long-distance-traveling liver graft use, ultimately expanding the donor pool.
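A hedged sketch of a spline check like the one described above: testing whether graft-loss risk rises nonlinearly with travel distance using a B-spline basis in a logistic model. The data are simulated (with risk driven by cold ischemic time rather than distance itself), not UNOS records, and the variable names are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 3000
distance = rng.uniform(0, 1500, n)                      # miles traveled
cold_ischemia = 4 + distance / 400 + rng.normal(0, 1, n)
p = 1 / (1 + np.exp(-(-2.5 + 0.15 * cold_ischemia)))    # risk driven by CIT, not distance per se
graft_loss_3y = rng.binomial(1, p)

df = pd.DataFrame({"distance": distance, "loss": graft_loss_3y})
fit = smf.logit("loss ~ bs(distance, df=4)", data=df).fit(disp=0)
print(fit.summary())   # spline terms describe any nonlinear distance effect
```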


Subject(s)
Liver Transplantation , Tissue and Organ Procurement , Humans , Liver Transplantation/adverse effects , Retrospective Studies , Living Donors , Tissue Donors , Liver , Risk Factors , Graft Survival