Results 1 - 20 of 36
1.
Updates Surg ; 76(3): 725-741, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38713396

ABSTRACT

Liver transplant oncology (TO) represents an area of increasing clinical and scientific interest, encompassing a heterogeneous group of clinical-pathological settings. Immunosuppressive management after LT is a key factor with a relevant impact on results. However, disease-related guidance is still lacking, and many open questions remain in the field. Given this substantial lack of solid evidence, the Italian Board of Experts in Liver Transplantation (I-BELT), a working group including representatives of all national transplant centers, promoted an unprecedented, methodologically sound consensus conference on the topic, based on the GRADE approach. The group's final recommendations are presented and discussed herein. The 18 PICOs and statements, with their levels of evidence and grades of recommendation, are reported and grouped into seven areas: (1) risk stratification by histopathological and bio-molecular parameters and the role of mTORi post-LT; (2) steroids and HCC recurrence; (3) management of immunosuppression when HCC recurs after LT; (4) mTORi monotherapy; (5) machine perfusion and HCC recurrence after LT; (6) physiopathology of tumor-infiltrating lymphocytes and immunosuppression, and the role of inflammation; (7) immunotherapy in liver transplant recipients. Interest in mammalian target of rapamycin inhibitors (mTORi), in steroid avoidance, and in the need to reduce CNI exposure emerged from the consensus process. A selected list of unmet needs prompting further investigation has also been developed. The heterogeneous and granular approach to immunosuppression in oncologic patients adopted so far deserves greater efforts toward a more standardized therapeutic response to the different clinical scenarios. This consensus process takes a first, unprecedented step in this direction, to be developed on a larger scale.


Subject(s)
Immunosuppression Therapy , Immunosuppressive Agents , Liver Neoplasms , Liver Transplantation , Humans , Liver Neoplasms/surgery , Immunosuppression Therapy/methods , Italy , Immunosuppressive Agents/therapeutic use , Carcinoma, Hepatocellular/surgery , Neoplasm Recurrence, Local
2.
J Laparoendosc Adv Surg Tech A ; 34(2): 99-105, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38294895

ABSTRACT

Background: Intraoperative blood loss has an unfavorable impact on the outcome of patients undergoing liver surgery, making devices that minimize this risk with high technical performance increasingly important. The CUSA® Clarity Ultrasonic Surgical Aspirator System fits into this scenario. This prospective survey involving five liver surgery centers investigated whether this innovative ultrasonic surgical aspirator is safe and effective for transection of the liver parenchyma. Materials and Methods: This was a prospective, multicenter, single-arm post-market clinical follow-up study of 100 subjects who underwent liver surgery using the CUSA Clarity Ultrasonic Surgical Aspirator System at five centers over a period of 1 year and 8 months. After all clinical information and instrument usage details were collected, surgeons completed a brief survey giving their opinions on the performance of the CUSA. Safety and efficacy outcomes were then evaluated. Results: Surgeons achieved a 95% rate of complete removal of the mass, with an average overall operative time of 4 hours and 34 minutes. Overall, there were no complications or device deficiencies. Conclusion: The CUSA Clarity Ultrasonic Surgical Aspirator System performs well during liver surgery, with a low complication rate. ClinicalTrials.gov Identifier: NCT04298268.


Subject(s)
Hepatectomy , Ultrasonics , Humans , Follow-Up Studies , Hepatectomy/adverse effects , Liver/surgery , Prospective Studies
3.
J Clin Med ; 12(14)2023 Jul 18.
Article in English | MEDLINE | ID: mdl-37510859

ABSTRACT

BACKGROUND: Growing interest has recently been reported in the potential detrimental role of the donor gamma-glutamyl transferase (GGT) peak at the time of organ procurement with respect to the risk of poor outcomes after liver transplantation (LT). However, the literature on this topic is scarce, and data on the mechanisms underlying such a correlation are controversial. This study aims to demonstrate the adverse effect of donor GGT on 90-day post-transplant graft loss in a large European LT cohort. METHODS: This retrospective international study investigated 1335 adult patients receiving a first LT from January 2004 to September 2018 at four collaborative European centers. RESULTS: Two multivariable logistic regression models were constructed to evaluate risk factors for 90-day post-transplant graft loss, introducing donor GGT as a continuous or a dichotomous variable. In both models, donor GGT was an independent predictor of graft loss. In detail, the log-transformed continuous donor GGT value showed an odds ratio of 1.46 (95% CI = 1.03-2.07; p = 0.03). When the donor GGT peak value was dichotomized using a cut-off of 160 IU/L, the odds ratio was 1.90 (95% CI = 1.20-3.02; p = 0.006). Significantly higher graft-loss rates were reported in LT cases with donor GGT ≥ 160 IU/L: 90-day graft-loss rates were 23.2% vs. 13.9% in patients with high vs. low donor GGT, respectively (log-rank p = 0.004). Donor GGT was also added to scores conventionally used to predict outcomes (i.e., MELD, D-MELD, DRI, and BAR scores). In all cases, combining the score with donor GGT improved model accuracy. CONCLUSIONS: Donor GGT could represent a valuable marker for evaluating graft quality at transplantation and should be implemented in scores aimed at predicting post-transplant clinical outcomes. The exact mechanisms correlating GGT with poor LT outcomes remain to be clarified and require prospective studies focused on this topic.
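As an illustration of the two encodings described above, the sketch below converts a donor GGT peak into the graft-loss odds multipliers implied by the reported odds ratios. The base of the "log-transformed" continuous value is not stated in the abstract and is assumed here to be log10; the function name and interface are hypothetical.

```python
import math

# Odds ratios as reported in the abstract; log base is an assumption (log10).
OR_CONTINUOUS_PER_LOG_GGT = 1.46
OR_HIGH_GGT = 1.90
GGT_CUTOFF_IU_L = 160

def ggt_odds_multipliers(donor_ggt_iu_l):
    """Graft-loss odds multipliers implied by the two models, relative to a
    reference donor (GGT = 1 IU/L for the continuous model, GGT < 160 IU/L
    for the dichotomous one)."""
    continuous = OR_CONTINUOUS_PER_LOG_GGT ** math.log10(donor_ggt_iu_l)
    dichotomous = OR_HIGH_GGT if donor_ggt_iu_l >= GGT_CUTOFF_IU_L else 1.0
    return continuous, dichotomous
```

For example, a donor GGT of 100 IU/L stays below the cut-off (dichotomous multiplier 1.0) while the continuous encoding compounds the per-log odds ratio twice.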

4.
Updates Surg ; 75(3): 531-539, 2023 Apr.
Article in English | MEDLINE | ID: mdl-35948742

ABSTRACT

Few data exist on the influence of holidays and weekdays on the number and results of liver transplantations (LT) in Italy. The study's main objective was to investigate the impact of holidays and of the different days of the week on LT numbers and early graft survival rates in a multicenter Italian series. We performed a retrospective analysis of 1,026 adult patients undergoing first deceased-donor transplantation between January 2004 and December 2018 at the three university centers in Rome. During the 4,504 workdays, 881 LTs were performed (85.9%; one every 5.1 days on average). In contrast, 145 LTs were performed during the 975 holidays (14.1%; one every 7.1 days on average). Fewer LTs were performed on holidays (P = 0.004). There were no substantial differences in donor-, recipient-, or transplant-related characteristics between LTs performed on weekdays and on holidays. Fewer transplants were performed on Mondays (vs. other weekdays: P < 0.0001; vs. Sunday: P = 0.03). At multivariable Cox regression analysis, LT performed during a holiday or on a particular day of the week was not an independent risk factor for 3- or 12-month graft loss. On three-month survival curves, no differences were observed between transplants performed during holidays and those performed on workdays (86.2% vs. 85.0%; P = 0.70). Graft survival rates by day of the week ranged from 81.6% to 86.9%, without significant differences (P = 0.57). In summary, fewer transplants are performed on holidays and Mondays, but survival is not affected by holidays or by the day on which the transplant is performed.


Subject(s)
Liver Transplantation , Adult , Humans , Retrospective Studies , Tissue Donors , Risk Factors , Italy , Graft Survival
5.
Updates Surg ; 74(2): 491-500, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35275380

ABSTRACT

Several studies have explored the risk of graft dysfunction after liver transplantation (LT) in recent years. Conversely, risk factors for graft discard before or at procurement have been poorly investigated. The study aimed to identify a score predicting the risk of liver-related graft discard before transplantation. Secondary aims were to test the score for the prediction of biopsy-related negative features and of post-LT early graft loss. A total of 4207 donors evaluated during the period January 2004-December 2018 were retrospectively analyzed. The group was split into a training set (n = 3,156; 75.0%) and a validation set (n = 1,051; 25.0%). The Donor Rejected Organ Pre-transplantation (DROP) Score was proposed: -2.68 + (2.14 if Regional Share) + (0.03*age) + (0.04*weight) - (0.03*height) + (0.29 if diabetes) + (1.65 if anti-HCV-positive) + (0.27 if HBV core) - (0.69 if hypotension) + (0.09*creatinine) + (0.38*log10AST) + (0.34*log10ALT) + (0.06*total bilirubin). At validation, the DROP Score showed the best AUCs for the prediction of liver-related graft discard (0.82; p < 0.001) and of macrovesicular steatosis ≥ 30% (0.71; p < 0.001). Patients exceeding the DROP 90th centile had the worst post-LT results (3-month graft loss: 82.8%; log-rank P = 0.024). The DROP score represents a valuable tool to predict the risk of liver function-related graft discard, steatosis, and early post-LT graft survival rates. Studies focused on the validation of this score in other geographical settings are required.
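The published DROP formula can be transcribed directly into code. This is a sketch: the coefficients are taken verbatim from the abstract, but the measurement units (years, kg, cm, mg/dL for creatinine and bilirubin, IU/L for AST/ALT) are assumptions that should be checked against the original paper.

```python
import math

def drop_score(age, weight_kg, height_cm, creatinine, ast, alt, total_bilirubin,
               regional_share=False, diabetes=False, anti_hcv_positive=False,
               hbv_core_positive=False, hypotension=False):
    """Donor Rejected Organ Pre-transplantation (DROP) Score, transcribed
    from the formula stated in the abstract; unit conventions are assumed."""
    score = -2.68
    score += 2.14 if regional_share else 0.0
    score += 0.03 * age
    score += 0.04 * weight_kg
    score -= 0.03 * height_cm
    score += 0.29 if diabetes else 0.0
    score += 1.65 if anti_hcv_positive else 0.0
    score += 0.27 if hbv_core_positive else 0.0
    score -= 0.69 if hypotension else 0.0
    score += 0.09 * creatinine
    score += 0.38 * math.log10(ast)
    score += 0.34 * math.log10(alt)
    score += 0.06 * total_bilirubin
    return score
```

In practice the score would be compared against the cohort's 90th-centile cut-off to flag donors at high risk of discard.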


Subject(s)
Liver Transplantation , Graft Survival , Humans , Liver , Liver Transplantation/methods , Retrospective Studies , Risk Factors , Tissue Donors
8.
Eur J Clin Invest ; 51(12): e13687, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34599600

ABSTRACT

BACKGROUND/OBJECTIVES: We investigated whether the behavioral precautions adopted during the coronavirus disease (COVID-19) pandemic also influenced the spread and multidrug resistance (MDR) of ESKAPEEc pathogens (Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii [AB], Pseudomonas aeruginosa, Enterobacter spp., and Escherichia coli [EC]) among intensive care unit (ICU) patients. SUBJECTS/METHODS: We performed a single-center retrospective study of adult patients admitted to our COVID-19-free surgical ICU. Only patients staying in the ICU for more than 48 hours were included. ESKAPEEc infections recorded during the COVID-19 period (June 1, 2020 - February 28, 2021) and in the corresponding pre-pandemic period (June 1, 2019 - February 28, 2020) were compared. An interrupted time series analysis was performed to rule out possible confounders. RESULTS: Overall, 173 patients in the COVID-19 period and 132 in the pre-COVID-19 period were investigated. ESKAPEEc infections were documented in 23 (13.3%) and 35 (26.5%) patients in the pandemic and pre-pandemic periods, respectively (p = 0.005). Demographics, diagnosis, comorbidities, type of surgery, Simplified Acute Physiology Score II, length of mechanical ventilation, hospital and ICU length of stay, ICU death rate, and 28-day hospital mortality were similar in the two groups. In comparison with the pre-pandemic period, no AB was recorded during the COVID-19 period (p = 0.017), while extended-spectrum beta-lactamase-producing EC infections significantly decreased (p = 0.017). Overall, ESKAPEEc isolates during the pandemic less frequently exhibited multidrug resistance (p = 0.014). CONCLUSIONS: These findings suggest that robust adherence to hygiene measures, together with human contact restrictions, in a COVID-19-free ICU might also restrain the transmission of ESKAPEEc pathogens.
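The headline comparison (23/173 vs. 35/132 infected) can be reproduced with a simple two-proportion test. The abstract does not name the exact test used, so the pooled z-test below is only an illustrative stand-in whose p-value is expected to land in the same range as the reported p = 0.005, not to match it exactly.

```python
from math import erf, sqrt

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with pooled variance and no
    continuity correction; returns both rates and the p-value."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Normal-tail p-value via the error function (Phi(z) = 0.5*(1+erf(z/sqrt(2))))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p1, p2, p_value

# ESKAPEEc infections: 23/173 (pandemic) vs. 35/132 (pre-pandemic)
rate_covid, rate_pre, p = two_proportion_z_test(23, 173, 35, 132)
```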


Subject(s)
COVID-19/prevention & control , Cross Infection/epidemiology , Gram-Negative Bacterial Infections/epidemiology , Gram-Positive Bacterial Infections/epidemiology , Infection Control , Acinetobacter Infections/epidemiology , Acinetobacter Infections/microbiology , Acinetobacter Infections/transmission , Acinetobacter baumannii , Aged , Cross Infection/microbiology , Cross Infection/transmission , Drug Resistance, Multiple, Bacterial , Enterobacter , Enterobacteriaceae Infections/epidemiology , Enterobacteriaceae Infections/microbiology , Enterobacteriaceae Infections/transmission , Enterococcus faecium , Escherichia coli Infections/epidemiology , Escherichia coli Infections/microbiology , Escherichia coli Infections/transmission , Female , Gram-Negative Bacterial Infections/microbiology , Gram-Negative Bacterial Infections/transmission , Gram-Positive Bacterial Infections/microbiology , Gram-Positive Bacterial Infections/transmission , Hand Disinfection , Humans , Intensive Care Units , Interrupted Time Series Analysis , Klebsiella Infections/epidemiology , Klebsiella Infections/microbiology , Klebsiella Infections/transmission , Klebsiella pneumoniae , Male , Methicillin-Resistant Staphylococcus aureus , Middle Aged , Organizational Policy , Personal Protective Equipment , Pseudomonas Infections/epidemiology , Pseudomonas Infections/microbiology , Pseudomonas Infections/transmission , Pseudomonas aeruginosa , Retrospective Studies , SARS-CoV-2 , Staphylococcal Infections/epidemiology , Staphylococcal Infections/microbiology , Staphylococcal Infections/transmission , Staphylococcus aureus , Visitors to Patients
9.
Transplant Direct ; 7(3): e669, 2021 Mar.
Article in English | MEDLINE | ID: mdl-34113712

ABSTRACT

Solid organ transplants (SOTs) are life-saving interventions, recently challenged by coronavirus disease 2019 (COVID-19). SOT requires a multistep process that can be affected by COVID-19 at several phases. METHODS: SOT specialists, COVID-19 specialists, and medical ethicists designed an international survey according to the CHERRIES guidelines. Personal opinions about continuing SOTs, the safe management of donors and recipients, and the equity of resource allocation were investigated. The survey was sent by e-mail, using multiple approaches (corresponding authors from Scopus, websites of scientific societies, COVID-19 webinars). RESULTS: There were 1819 complete answers from 71 countries, for a response rate of 49%. Data were stratified according to region, macrospecialty, and organ of interest, and answers were analyzed using univariate and multivariate ordinal regression analysis and thematic analysis. Overall, 20% of the responders thought SOTs should not stop (continue transplants without restriction); over 70% suggested SOTs should selectively stop, and almost 10% indicated they should completely stop. Furthermore, 82% agreed to temporarily shift resources from transplant to COVID-19. The main reason for not stopping was that if a transplant does not proceed, the organ is wasted. Regarding SOT from living donors, 61% stated that activity should be restricted to "urgent" cases only. At the multivariate analysis, factors in favor of continuing transplants were being from Italy, being an ethicist, partially disagreeing on the equity question, a high number of COVID-19-related deaths on the day of the answer, and a high-IHDI country. Factors predicting a preference to stop SOTs were being from Europe excluding Italy, a public university hospital, and strongly agreeing on the equity question. CONCLUSIONS: The majority of responders suggested that transplant activity should be continued through the implementation of isolation measures and the adoption of COVID-19-free pathways. Differences between professional categories are smaller than expected.

10.
Liver Int ; 41(7): 1629-1640, 2021 07.
Article in English | MEDLINE | ID: mdl-33793054

ABSTRACT

BACKGROUND & AIMS: Sarcopenia in cirrhotic candidates for liver transplantation (LT) has been associated with higher dropout and graft-loss rates after transplant. The study aims to create an 'urgency' model combining sarcopenia and the Model for End-stage Liver Disease Sodium (MELDNa) score to predict the risk of dropout and to identify an appropriate threshold of post-LT futility. METHODS: A total of 1087 adult cirrhotic patients were listed for a first LT from January 2012 to December 2018. The study population was split into a training set (n = 855) and a validation set (n = 232). RESULTS: Using a competing-risk analysis of cause-specific hazards, we created the Sarco-Model2. According to the model, one extra point of MELDNa is added for each 0.5 cm2/m2 reduction of total psoas area (TPA) below 6.0 cm2/m2. At external validation, the Sarco-Model2 showed the best diagnostic ability for predicting the risk of 3-month dropout in patients with MELDNa < 20 (area under the curve [AUC] = 0.93; P = .003). Using the net reclassification improvement, 14.3% of dropped-out patients were correctly reclassified using the Sarco-Model2. As for the futility threshold, transplanted patients with TPA < 6.0 cm2/m2 and MELDNa 35-40 (n = 16/833, 1.9%) had the worst results (6-month graft loss = 25.5%). CONCLUSIONS: In sarcopenic patients with MELDNa < 20, the 'urgency' Sarco-Model2 should be used to prioritize the waiting list, while the MELDNa value should be preferred in patients with MELDNa ≥ 20. The Sarco-Model2 played a role in more than 30% of the cases in the investigated allocation scenario. In sarcopenic patients with a MELDNa value of 35-40, 'futile' transplantation should be considered.
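The point-adjustment rule stated in the abstract can be sketched as follows. How partial 0.5-unit steps are handled is an assumption (floored here), and the function name is ours; the abstract also limits the model's use to candidates with MELDNa < 20, which a real implementation would enforce.

```python
def sarco_model2(meld_na, tpa_cm2_m2):
    """'Urgency' score sketch: one point is added to MELD-Na for each full
    0.5 cm2/m2 of total psoas area (TPA) below the 6.0 cm2/m2 threshold,
    per the rule stated in the abstract."""
    if tpa_cm2_m2 >= 6.0:
        return meld_na  # no sarcopenia adjustment above the threshold
    deficit_steps = int((6.0 - tpa_cm2_m2) / 0.5)
    return meld_na + deficit_steps
```

For example, a candidate with MELDNa 18 and a TPA of 5.0 cm2/m2 (a 1.0 cm2/m2 deficit, i.e. two 0.5-unit steps) would be prioritized as a 20.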


Subject(s)
End Stage Liver Disease , Liver Transplantation , Adult , End Stage Liver Disease/surgery , Humans , Liver Cirrhosis , Prognosis , Severity of Illness Index , Waiting Lists
12.
J Clin Anesth ; 69: 110154, 2021 May.
Article in English | MEDLINE | ID: mdl-33333373

ABSTRACT

STUDY OBJECTIVE: To compare total blood product requirements in liver transplantation (LT) assisted by thromboelastography (TEG) or conventional coagulation tests (CCTs). DESIGN: Retrospective observational study. SETTING: A tertiary care referral center for LT. PATIENTS: Adult patients undergoing LT from deceased donors. INTERVENTION: Hemostasis was monitored by TEG or CCTs, and the corresponding transfusion algorithms were adopted. MEASUREMENTS: Number and types of blood products (red blood cells, RBC; fresh-frozen plasma, FFP; platelets, PLT) transfused from the beginning of surgery until admission to the intensive care unit. METHODS: We compared data retrospectively collected from 226 LTs, grouped according to the type of hemostasis monitoring (90 with TEG and 136 with CCTs). Confounding variables affecting transfusion needs (recipient age, sex, previous hepatocellular carcinoma surgery, Model for End-Stage Liver Disease [MELD] score, baseline hemoglobin, fibrinogen, creatinine, veno-venous bypass, and trans-jugular intrahepatic portosystemic shunt) were managed by propensity score matching (PSM). MAIN RESULTS: The preliminary analysis showed that patients in the TEG group received fewer total blood products (RBC + FFP + PLT; p = 0.001), FFP (p = 0.001), and RBC (p = 0.001). After PSM, 89 CCT patients were selected and matched to the 90 TEG patients. Matched CCT and TEG patients received similar amounts of total blood products. In a subgroup of 39 patients in the top MELD quartile (MELD ≥ 25), the use of TEG resulted in lower consumption of FFP units and total blood products; nevertheless, owing to the low number of patients, no meaningful conclusion could be drawn in this subgroup. CONCLUSIONS: In our experience, TEG-guided transfusion in LT does not reduce intraoperative blood product consumption. Further studies are warranted to assess an advantage for TEG in either the entire LT population or the high-MELD subgroup of patients.
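Propensity score matching, as used above, pairs each TEG patient with the CCT patient closest in estimated propensity score. The sketch below is a generic greedy 1:1 nearest-neighbor match with a caliper; the authors' exact matching specification (algorithm, caliper width) is not detailed in the abstract, so everything here is a textbook illustration rather than their procedure.

```python
def greedy_match(treated_ps, control_ps, caliper=0.1):
    """Greedy 1:1 nearest-neighbor propensity-score matching with a caliper.
    treated_ps / control_ps are lists of estimated propensity scores;
    returns (treated_index, control_index) pairs, each control used once."""
    available = dict(enumerate(control_ps))
    pairs = []
    for t_idx, t_ps in enumerate(treated_ps):
        if not available:
            break
        # nearest remaining control by absolute propensity difference
        c_idx = min(available, key=lambda i: abs(available[i] - t_ps))
        if abs(available[c_idx] - t_ps) <= caliper:
            pairs.append((t_idx, c_idx))
            del available[c_idx]  # sample without replacement
    return pairs
```

A match that falls outside the caliper is discarded, which is why the study ends up with 89 matched CCT patients rather than 90.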


Subject(s)
End Stage Liver Disease , Liver Transplantation , Adult , End Stage Liver Disease/surgery , Humans , Liver Transplantation/adverse effects , Propensity Score , Retrospective Studies , Severity of Illness Index , Thrombelastography
13.
JAMA Surg ; 155(12): e204095, 2020 12 01.
Article in English | MEDLINE | ID: mdl-33112390

ABSTRACT

Importance: Expansion of donor acceptance criteria for liver transplant increased the risk for early allograft failure (EAF), and although EAF prediction is pivotal to optimize transplant outcomes, there is no consensus on specific EAF indicators or timing to evaluate EAF. Recently, the Liver Graft Assessment Following Transplantation (L-GrAFT) algorithm, based on aspartate transaminase, bilirubin, platelet, and international normalized ratio kinetics, was developed from a single-center database gathered from 2002 to 2015. Objective: To develop and validate a simplified comprehensive model estimating at day 10 after liver transplant the EAF risk at day 90 (the Early Allograft Failure Simplified Estimation [EASE] score) and, secondarily, to identify early those patients with unsustainable EAF risk who are suitable for retransplant. Design, Setting, and Participants: This multicenter cohort study was designed to develop a score capturing a continuum from normal graft function to nonfunction after transplant. Both parenchymal and vascular factors, which provide an indication to list for retransplant, were included among the EAF determinants. The L-GrAFT kinetic approach was adopted and modified with fewer data entries and novel variables. The population included 1609 patients in Italy for the derivation set and 538 patients in the UK for the validation set; all were patients who underwent transplant in 2016 and 2017. Main Outcomes and Measures: Early allograft failure was defined as graft failure (codified by retransplant or death) for any reason within 90 days after transplant. Results: At day 90 after transplant, the incidence of EAF was 110 of 1609 patients (6.8%) in the derivation set and 41 of 538 patients (7.6%) in the external validation set. Median (interquartile range) ages were 57 (51-62) years in the derivation data set and 56 (49-62) years in the validation data set. 
The EASE score was developed through 17 entries derived from 8 variables, including the Model for End-stage Liver Disease score, blood transfusion, early thrombosis of hepatic vessels, and kinetic parameters of transaminases, platelet count, and bilirubin. Donor parameters (age, donation after cardiac death, and machine perfusion) were not associated with EAF risk. Results were adjusted for transplant center volume. In receiver operating characteristic curve analyses, the EASE score outperformed the L-GrAFT, Model for Early Allograft Function, Early Allograft Dysfunction, Eurotransplant Donor Risk Index, donor age × Model for End-stage Liver Disease, and Donor Risk Index scores, estimating day 90 EAF in 87% (95% CI, 83%-91%) of cases in both the derivation and internal validation data sets. Patients could be stratified into 5 classes, with those in the highest class exhibiting unsustainable EAF risk. Conclusions and Relevance: This study found that the developed EASE score reliably estimated EAF risk. Knowledge of contributing factors may help clinicians mitigate risk factors and guide them through the challenging clinical decision of allocating patients to early liver retransplant. The EASE score may be used in translational research across transplant centers.


Subject(s)
Liver Failure/surgery , Liver Transplantation/adverse effects , Primary Graft Dysfunction/diagnosis , Primary Graft Dysfunction/etiology , Aged , Aged, 80 and over , Algorithms , Female , Graft Survival , Humans , Liver Failure/diagnosis , Liver Failure/etiology , Logistic Models , Male , Middle Aged , ROC Curve , Reproducibility of Results , Retrospective Studies , Risk Factors , Time Factors
14.
Respir Care ; 65(1): 21-28, 2020 Jan.
Article in English | MEDLINE | ID: mdl-31270177

ABSTRACT

BACKGROUND: High-flow nasal cannula (HFNC) is a key component of oxygen therapy and has largely been used in patients with acute respiratory failure. We conducted a matched controlled study to compare the preventive use of oxygen therapy delivered by HFNC versus air-entrainment mask (standard O2) after extubation in adult liver transplant subjects, with the aim of reducing postextubation hypoxemia. METHODS: Twenty-nine liver transplant subjects who received HFNC after extubation (HFNC group) were matched 1:1 with 29 controls (standard O2 group) chosen from a historical group of 90 subjects admitted to the ICU during the previous 36 months. The primary outcome of the study was the incidence of hypoxemia at 1 h and 24 h after extubation. Secondary outcomes were the rate of weaning failure, ICU length of stay, and 28-d mortality. RESULTS: The incidence of hypoxemia was not significantly different between the HFNC and standard O2 groups at 1 h and 24 h after extubation. In the HFNC group, there was a trend toward a lower rate of weaning failure compared with the standard O2 group. ICU length of stay and 28-d mortality were similar in both groups. CONCLUSIONS: Early application of HFNC in liver transplant subjects did not reduce the incidence of hypoxemia after extubation compared with standard O2, and did not modify the incidence of weaning failure, ICU length of stay, or 28-d mortality in this high-risk population (ClinicalTrials.gov registration: NCT03441854).


Subject(s)
Airway Extubation , Cannula , Hypoxia/epidemiology , Liver Transplantation , Oxygen Inhalation Therapy , Adult , Aged , Case-Control Studies , Female , Humans , Length of Stay , Male , Middle Aged , Noninvasive Ventilation , Oxygen/administration & dosage , Respiratory Insufficiency/epidemiology , Ventilator Weaning
15.
Hepatology ; 71(2): 569-582, 2020 02.
Article in English | MEDLINE | ID: mdl-31243778

ABSTRACT

Prognosticating outcomes in liver transplant (LT) for hepatocellular carcinoma (HCC) continues to challenge the field. Although the Milan Criteria (MC) generalized the practice of LT for HCC and improved outcomes, their predictive character has degraded with increasing candidate and oncological heterogeneity. We sought to validate and recalibrate a previously developed, preoperatively calculated, continuous risk score, the Hazard Associated with Liver Transplantation for Hepatocellular Carcinoma (HALTHCC), in an international cohort. From 2002 to 2014, 4,089 patients (both MC-in and MC-out [25.2%]) across 16 centers in North America, Europe, and Asia were included. A continuous risk score using pre-LT levels of alpha-fetoprotein, Model for End-Stage Liver Disease Sodium score, and tumor burden score was recalibrated among a randomly selected cohort (n = 1,021) and validated in the remainder (n = 3,068). This study demonstrated significant heterogeneity by site and year, reflecting practice trends over the last decade. On explant pathology, both vascular invasion (VI) and poorly differentiated component (PDC) increased with increasing HALTHCC score. The lowest-risk patients (HALTHCC 0-5) had lower rates of VI and PDC than the highest-risk patients (HALTHCC > 35) (VI, 7.7% [1.2-14.2] vs. 70.6% [48.3-92.9]; PDC, 4.6% [0.1-9.8] vs. 47.1% [22.6-71.5]; P < 0.0001 for both). This trend was robust to MC status. This international study was used to adjust the coefficients in the HALTHCC score. Before recalibration, HALTHCC had the greatest discriminatory ability for overall survival (OS; C-index = 0.61) compared to all previously reported scores. Following recalibration, the prognostic utility increased for both recurrence (C-index = 0.71) and OS (C-index = 0.63). Conclusion: This large international trial validated and refined the role for the continuous risk metric, HALTHCC, in establishing pre-LT risk among candidates with HCC worldwide.
Prospective trials introducing HALTHCC into clinical practice are warranted.


Subject(s)
Carcinoma, Hepatocellular/surgery , Liver Neoplasms/surgery , Liver Transplantation , Risk Assessment , Female , Humans , International Cooperation , Male , Middle Aged , Prognosis , Retrospective Studies
16.
Dig Liver Dis ; 52(3): 301-307, 2020 03.
Article in English | MEDLINE | ID: mdl-31806469

ABSTRACT

BACKGROUND: An early increase of the hepatic artery resistive index (HARI) is frequently observed after liver transplantation (LTx). AIM: We aimed to investigate the contributing factors and prognostic relevance of a high HARI after LTx from deceased donors. METHODS: We conducted a retrospective analysis of prospectively collected data from January 2017 to February 2019. According to the duplex Doppler HARI values on the third postoperative day, patients were grouped into a normal (0.55-0.80) and a high (>0.80-1) HARI group. RESULTS: Among 81 LTx, 36 had a high HARI and 45 a normal HARI. Patients developing a high HARI were older and exhibited lower platelet count, hemoglobin, and platelet count/spleen diameter ratio, higher serum creatinine, and more pronounced spleen enlargement (median 170 versus 120 mm). At multivariate analysis, the PLT/spleen diameter ratio (OR 0.994, p < 0.001), creatinine level (OR 2.418, p = 0.029), and recipient age (OR 1.157, p = 0.004) significantly predicted the occurrence of a high HARI. Patients with high or normal HARI had similar vascular complications, rejection rates, and 90-day mortality. In most cases, HARI recovered to normal without any clinical effect. CONCLUSIONS: HARI rises in the presence of several surrogate markers of portal hypertension. The increase is mostly transitory and may result from hepatic artery spasm due to high portal blood flow.


Subject(s)
Hepatic Artery/diagnostic imaging , Hypertension, Portal/diagnostic imaging , Liver Transplantation/adverse effects , Ultrasonography, Doppler , Vascular Resistance , Blood Flow Velocity , Female , Hepatic Artery/physiopathology , Humans , Hypertension, Portal/etiology , Hypertension, Portal/physiopathology , Liver Transplantation/mortality , Male , Middle Aged , Multivariate Analysis , Retrospective Studies , Sensitivity and Specificity , Spleen/diagnostic imaging , Survival Analysis
17.
Liver Transpl ; 25(7): 1023-1033, 2019 07.
Article in English | MEDLINE | ID: mdl-31087772

ABSTRACT

In patients with hepatocellular carcinoma (HCC) meeting the Milan criteria (MC), the benefit of locoregional therapies (LRTs) in the context of liver transplantation (LT) is still debated, as initial biases in the selection between treated and untreated patients have yielded conflicting results. The study aimed to identify, using a competing risk analysis, risk factors for HCC-dependent LT failure, defined as pretransplant tumor-related delisting or posttransplant recurrence. The study was registered at www.clinicaltrials.gov (identification number NCT03723304). To offset the initial limitations of the investigated population, an inverse probability of treatment weighting (IPTW) analysis was used: 1083 MC-in patients (no LRT, n = 182; LRT, n = 901) were balanced using 8 variables: age, sex, Model for End-Stage Liver Disease (MELD) value, hepatitis C virus status, hepatitis B virus status, largest lesion diameter, number of nodules, and alpha-fetoprotein (AFP). All covariates were available at first referral. After IPTW, a pseudo-population of 2019 patients listed for LT was analyzed, comparing two homogeneous groups of untreated (n = 1077) and LRT-treated (n = 942) patients. Tumor progression after LRT was the most important independent risk factor for HCC-dependent failure (subhazard ratio [SHR], 5.62; P < 0.001). Other independent risk factors were major tumor diameter, AFP, MELD, patient age, male sex, and period of wait-list registration. A single LRT was protective compared with no treatment (SHR, 0.51; P < 0.001). The positive effect was still observed when 2-3 treatments were performed (SHR, 0.66; P = 0.02), but it was lost in the case of ≥4 LRTs (SHR, 0.80; P = 0.27). In conclusion, for MC-in patients, up to 3 LRTs are beneficial in the intention-to-treat LT population, with a 49% to 34% reduction in failure risk compared with untreated patients; this benefit is lost if more LRTs are required.
A poor response to LRT is associated with a higher risk for HCC-dependent transplant failure.
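The IPTW weighting described in this abstract can be sketched in a few lines. This is a minimal illustration, not the authors' analysis: the propensity scores below are invented stand-ins for a model fitted on the eight listed covariates, and each patient is weighted by the inverse probability of the treatment actually received, which inflates the cohort into the larger "pseudo-population" the abstract mentions.

```python
# Minimal IPTW sketch: given each patient's propensity score p
# (estimated probability of receiving LRT), the weight is 1/p for
# treated patients and 1/(1-p) for untreated patients. The sum of
# the weights gives the size of the balanced pseudo-population.

def iptw_weight(treated: bool, propensity: float) -> float:
    """Inverse probability of treatment weight for one patient."""
    return 1.0 / propensity if treated else 1.0 / (1.0 - propensity)

# Illustrative cohort: (received LRT?, propensity score) per patient.
cohort = [(True, 0.8), (True, 0.6), (False, 0.4), (False, 0.25)]

weights = [iptw_weight(t, p) for t, p in cohort]
pseudo_n = sum(weights)

print([round(w, 2) for w in weights])  # [1.25, 1.67, 1.67, 1.33]
print(round(pseudo_n, 2))              # 5.92
```

In the study this mechanism is how 1083 listed patients yield a pseudo-population of 2019 in which the treated and untreated groups are comparable on the eight baseline covariates.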


Subject(s)
Ablation Techniques/methods , Carcinoma, Hepatocellular/therapy , Graft Rejection/epidemiology , Liver Neoplasms/therapy , Liver Transplantation/adverse effects , Preoperative Care/methods , Age Factors , Carcinoma, Hepatocellular/mortality , Carcinoma, Hepatocellular/pathology , Disease Progression , Female , Follow-Up Studies , Graft Rejection/etiology , Humans , Intention to Treat Analysis , Kaplan-Meier Estimate , Liver Neoplasms/mortality , Liver Neoplasms/pathology , Male , Middle Aged , Risk Assessment , Risk Factors , Sex Factors , Time Factors , Treatment Outcome , Waiting Lists/mortality
18.
Dig Liver Dis ; 49(9): 957-966, 2017 Sep.
Article in English | MEDLINE | ID: mdl-28801180

ABSTRACT

BACKGROUND: Current American and European guidelines consider a pre-transplant BMI ≥40 kg/m2 a relative contraindication for liver transplantation, but this recommendation is graded as uncertain and requires further research. Moreover, conflicting results are reported on the predictive value of a BMI of 30-39.9 kg/m2 for post-transplant complication and mortality risk. AIM: This study analyzed the literature on the effect of all three BMI classes of obesity on postoperative outcomes in liver transplantation. MATERIALS AND METHODS: A PubMed and Cochrane Library search was conducted from inception to October 2015. RESULTS: Analysis of the literature demonstrates that discrepancies among studies are mainly due to limitations of BMI per se, the different BMI cut-offs used to define the obese and reference groups, and the different outcomes considered. Moreover, the evaluation of visceral adipose tissue and the detrimental effect of muscle mass reduction in the presence of obesity are never considered. CONCLUSIONS: BMI assessment should be used as a preliminary method to evaluate obesity; the assessment of visceral adipose tissue and muscle mass should then complete the preoperative evaluation of liver transplant candidates. This innovative approach could represent a new field of research in liver transplantation.
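The three BMI classes of obesity discussed in this abstract follow the standard WHO cut-offs. A small helper makes the thresholds explicit; the function names are illustrative, but the formula (weight in kg divided by height in m squared) and the class boundaries at 30, 35, and 40 kg/m2 are the standard classification the guidelines rely on.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def obesity_class(bmi_value: float) -> str:
    """WHO obesity classes; class III (>= 40 kg/m^2) is the
    relative contraindication cited in the guidelines above."""
    if bmi_value >= 40:
        return "class III"
    if bmi_value >= 35:
        return "class II"
    if bmi_value >= 30:
        return "class I"
    return "not obese"

print(round(bmi(120, 1.70), 1))       # 41.5
print(obesity_class(bmi(120, 1.70)))  # class III
```

The abstract's point is precisely that this single number misses body composition: two candidates in the same class can differ widely in visceral adipose tissue and muscle mass.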


Subject(s)
Liver Transplantation/adverse effects , Liver Transplantation/mortality , Obesity/epidemiology , Postoperative Complications/epidemiology , Body Mass Index , Humans , Intra-Abdominal Fat , Muscle, Skeletal , Risk Factors
19.
Hepatology ; 66(6): 1910-1919, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28653750

ABSTRACT

The debate about the best approach to select patients with hepatocellular cancer (HCC) waiting for liver transplantation (LT) is still ongoing. This study aims to identify the variables that best discriminate between "high-" and "low-benefit" patients. To do so, the concept of the intention-to-treat (ITT) survival benefit of LT was created. Data from 2,103 adult HCC patients consecutively enlisted during the period 1987-2015 were analyzed. Three rigorous statistical steps were used to create the ITT survival benefit of LT: the development of an ITT LT survival model, the development of a non-LT survival model, and the individual prediction of the ITT survival benefit of LT, defined as the difference between the median ITT survival with LT (based on the first model) and without LT (based on the second model), calculated for each enrolled patient. Four variables (Model for End-Stage Liver Disease, alpha-fetoprotein, Milan-criteria status, and radiological response) displayed a high effect in terms of delta benefit. According to these risk factors, four benefit groups were identified. Patients with three to four factors ("no-benefit group"; n = 405 of 2,103; 19.2%) had no benefit from LT compared with alternative treatments. Conversely, patients without any risk factor ("large-benefit group"; n = 108; 5.1%) gained the highest benefit from LT, reaching 60 months. CONCLUSION: The ITT transplant survival benefit presented here allows physicians to better select HCC patients waiting for LT. The resulting stratification may lead to an improved and more equitable method of organ allocation: patients without benefit should be de-listed, whereas patients with a large benefit should be prioritized for LT. (Hepatology 2017;66:1910-1919).
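The benefit definition in this abstract reduces to simple arithmetic once the two survival models have produced their predictions. The sketch below assumes hypothetical predicted medians and collapses the two intermediate groups into one label for brevity; it illustrates the delta-benefit idea, not the authors' fitted models.

```python
def itt_benefit(median_with_lt: float, median_without_lt: float) -> float:
    """ITT survival benefit of LT in months: the difference between
    the predicted median ITT survival with and without transplant."""
    return median_with_lt - median_without_lt

def benefit_group(n_risk_factors: int) -> str:
    """Grouping by count of the four risk factors (MELD, AFP,
    Milan-criteria status, radiological response). The abstract's
    two middle groups are merged here into one illustrative label."""
    if n_risk_factors == 0:
        return "large benefit"
    if n_risk_factors >= 3:
        return "no benefit"
    return "intermediate benefit"

# Hypothetical patient: predicted medians of 70 and 10 months.
print(itt_benefit(70, 10))  # 60
print(benefit_group(0))     # large benefit
```

This is the quantity the abstract proposes for allocation: a zero-risk-factor patient with a 60-month delta would be prioritized, while a patient whose delta is near zero would be de-listed.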


Subject(s)
Carcinoma, Hepatocellular/surgery , Liver Neoplasms/surgery , Liver Transplantation , Carcinoma, Hepatocellular/mortality , Europe , Female , Humans , Liver Neoplasms/mortality , Male , Middle Aged , Retrospective Studies
20.
Jpn J Radiol ; 35(3): 126-130, 2017 Mar.
Article in English | MEDLINE | ID: mdl-28074381

ABSTRACT

OBJECTIVE: A stress reaction involving increased adrenal hormone release occurs when adrenal venous sampling (AVS) is started. The purpose of the present study was to investigate the effect of single-shot venography on adrenal hormone production during AVS. SUBJECTS AND METHODS: This was a prospective self-controlled study. We enrolled 54 consecutive patients (21 men, 33 women; mean age 52 ± 11 years) with primary aldosteronism who underwent AVS from May 2014 to February 2015. Under non-stimulated conditions, blood samples were obtained from a common trunk of the left adrenal vein before and after single-shot venography. The initial plasma aldosterone and cortisol concentrations (PAC and PCC) were compared with those measured after venography for each patient. RESULTS: PAC and PCC decreased slightly but significantly from before to after venography (after log transformation, 2.12 ± 0.73 vs 2.07 ± 0.72, P = 0.00066, and 1.89 ± 0.52 vs 1.83 ± 0.53, P = 0.00031, respectively). CONCLUSIONS: During non-stimulated left AVS, adrenal hormone secretion decreased slightly but significantly after venography, similar to the normal time-related stress reaction. Venography did not increase adrenal hormone secretion.


Subject(s)
Adrenal Glands/blood supply , Aldosterone/blood , Blood Specimen Collection/methods , Hydrocortisone/blood , Hyperaldosteronism/blood , Catheterization, Peripheral/methods , Contrast Media/administration & dosage , Female , Humans , Male , Middle Aged , Prospective Studies , Stress, Physiological/physiology , Veins/diagnostic imaging