1.
Article in English | MEDLINE | ID: mdl-38908731

ABSTRACT

BACKGROUND & AIMS: Continuous risk stratification of candidates and urgency-based prioritization have been used for liver transplantation (LT) in patients without hepatocellular carcinoma (HCC) in the United States. For patients with HCC, by contrast, a dichotomous criterion with exception points is still used. This study evaluated the utility of the hazard associated with LT for HCC (HALT-HCC), a continuous oncological risk score, to stratify waitlist dropout and post-LT outcomes. METHODS: A competing risk model was developed and validated using the UNOS database (2012-2021), spanning multiple policy changes. The primary outcome was the ability to discriminate waitlist dropout and post-LT outcomes, with the HALT-HCC score compared against other HCC risk scores. RESULTS: Among 23,858 candidates, 14,646 (59.9%) underwent LT and 5196 (21.8%) dropped out of the waitlist. Higher HALT-HCC scores correlated with increased dropout incidence and lower predicted 5-year overall survival after LT. HALT-HCC demonstrated the highest area under the curve (AUC) for predicting dropout at various intervals post-listing (0.68 at 6 months, 0.66 at 1 year), with excellent calibration (R2 = 0.95 at 6 months, 0.88 at 1 year). Its accuracy remained stable across policy periods and locoregional therapy applications. CONCLUSIONS: This study highlights the capability of a continuous oncological risk score to forecast waitlist dropout and post-LT outcomes in patients with HCC, independent of policy changes. It supports integrating continuous scoring systems such as HALT-HCC into liver allocation decisions, balancing urgency, organ utility, and survival benefit.
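The competing-risk setup described in this abstract (waitlist dropout competing with transplantation) can be sketched with a nonparametric cumulative incidence estimator. This is a minimal illustration in plain Python, with toy event codes and times, not data from the UNOS cohort.

```python
# Nonparametric cumulative incidence in the presence of a competing risk
# (Aalen-Johansen type estimator).  Event codes: 0 = censored,
# 1 = waitlist dropout (event of interest), 2 = transplanted (competing).
def cumulative_incidence(times, events, cause):
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0            # probability of being event-free just before t
    cuminc = 0.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_all = censored = 0
        while i < len(data) and data[i][0] == t:
            if data[i][1] == cause:
                d_cause += 1
            if data[i][1] != 0:
                d_all += 1
            else:
                censored += 1
            i += 1
        cuminc += surv * d_cause / at_risk
        surv *= 1 - d_all / at_risk
        at_risk -= d_all + censored
        curve.append((t, cuminc))
    return curve

# Toy waitlist data: (months on list, event code)
times  = [2, 3, 3, 5, 8, 10, 12, 15]
events = [1, 2, 0, 1, 2, 1, 0, 2]
dropout_curve = cumulative_incidence(times, events, cause=1)
```

Unlike 1 minus Kaplan-Meier, this estimator does not treat transplantation as censoring, so the dropout probabilities it produces are not inflated by the competing event.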

2.
Nature; 622(7982): 393-401, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37821590

ABSTRACT

Recent human decedent model studies1,2 and compassionate xenograft use3 have explored the promise of porcine organs for human transplantation. To proceed to human studies, a clinically ready porcine donor must be engineered and its xenograft successfully tested in nonhuman primates. Here we describe the design, creation and long-term life-supporting function of kidney grafts from a genetically engineered porcine donor transplanted into a cynomolgus monkey model. The porcine donor was engineered to carry 69 genomic edits, eliminating glycan antigens, overexpressing human transgenes and inactivating porcine endogenous retroviruses. In vitro functional analyses showed that the edited kidney endothelial cells modulated inflammation to an extent that was indistinguishable from that of human endothelial cells, suggesting that these edited cells acquired a high level of human immune compatibility. When transplanted into cynomolgus monkeys, the kidneys with three glycan antigen knockouts alone experienced poor graft survival, whereas those with glycan antigen knockouts and human transgene expression demonstrated significantly longer survival time, suggesting the benefit of human transgene expression in vivo. These results show that preclinical studies of renal xenotransplantation could be successfully conducted in nonhuman primates and bring us closer to clinical trials of genetically engineered porcine renal grafts.


Subject(s)
Graft Rejection , Kidney Transplantation , Macaca fascicularis , Swine , Transplantation, Heterologous , Animals , Humans , Animals, Genetically Modified , Endothelial Cells/immunology , Endothelial Cells/metabolism , Graft Rejection/immunology , Graft Rejection/prevention & control , Kidney Transplantation/methods , Polysaccharides/deficiency , Swine/genetics , Transplantation, Heterologous/methods , Transgenes/genetics
3.
Nat Commun; 14(1): 3022, 2023 Jun 13.
Article in English | MEDLINE | ID: mdl-37311769

ABSTRACT

Porcine kidney xenotransplantation is accelerating towards clinical translation. However, despite the demonstrated ability of porcine kidneys to remove metabolic waste products, questions remain about their ability to faithfully recapitulate renal endocrine functions after transplantation. Here we analyze xenograft growth and the function of two kidney-dependent endocrine pathways in seventeen cynomolgus macaques after kidney xenotransplantation from gene-edited Yucatan minipigs. Xenograft growth, the renin-angiotensin-aldosterone system (RAAS), and the calcium-vitamin D-parathyroid hormone axis are assessed using clinical chemistry data, renin activity and beta-C-terminal telopeptide assays, kidney graft RNA sequencing, and serial ultrasonography. We demonstrate that xenografts transplanted from minipigs show only modest growth and do not substantially contribute to recipient RAAS pathway activity. However, parathyroid hormone-independent hypercalcemia and hypophosphatemia are observed, suggesting a need for close monitoring and timely intervention during human testing. Further study of these phenotypes is warranted in designing prospective clinical trials.


Subject(s)
Kidney , Renin , Humans , Animals , Swine , Transplantation, Heterologous , Swine, Miniature , Prospective Studies , Macaca fascicularis
5.
Am J Transplant; 22(6): 1527-1536, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35143091

ABSTRACT

Facile gene editing has accelerated progress in pig-to-nonhuman-primate (NHP) renal xenotransplantation; however, outcomes are considered inferior to NHP allotransplantation. This systematic review and outcomes analysis of life-sustaining NHP renal transplantation aimed to benchmark "preclinical success" and aggregated 1051 NHP-to-NHP or pig-to-NHP transplants across 88 articles. Although protocols varied, NHP-allotransplantation survival (1, 3, 12 months: 67.5%, 37.1%, 13.2%) was significantly greater than NHP-xenotransplantation survival (1, 3, 12 months: 38.8%, 14.0%, 4.4%; p < .001), a difference partially mitigated by gene-edited donors carrying at least a knockout of alpha-1,3-galactosyltransferase (1, 3, 12 months: 47.1%, 24.2%, 7.6%; p < .001). Pathological analysis demonstrated more cellular rejection in allotransplantation (62.8% vs. 3.1%, p < .001) and more antibody-mediated rejection in xenotransplantation (6.8% vs. 45.5%, p < .001). Nonrejection causes of graft loss also differed between allotransplants and xenotransplants: infection (1.7% vs. 11.2%) and animal welfare (3.9% vs. 17.0%) (p < .001 for both). Importantly, even among a subgroup of unsensitized rhesus macaques under long-term immunosuppression, NHP-allotransplant survival was significantly inferior to clinical allotransplantation (6 months, 36.1% vs. 94.0%; p < .001), which suggests clinical outcomes with renal xenografts may be better than predicted by current preclinical data.


Subject(s)
Kidney Transplantation , Transplants , Animals , Graft Rejection/etiology , Graft Survival , Heterografts , Humans , Kidney Transplantation/adverse effects , Kidney Transplantation/methods , Macaca mulatta , Swine , Transplantation, Heterologous/adverse effects , Transplantation, Heterologous/methods
6.
Transpl Int; 34(8): 1433-1443, 2021 Aug.
Article in English | MEDLINE | ID: mdl-33599045

ABSTRACT

The use of livers from donation after circulatory death (DCD) is historically characterized by increased rates of biliary complications and inferior short-term graft survival (GS) compared with donation after brain death (DBD) allografts. This study aimed to evaluate the dynamic prognostic impact of DCD livers and reveal whether they remain an adverse factor even after patients survive a certain period following liver transplant (LT). The study used 74,961 LT patients, including 4065 DCD LTs, from the Scientific Registry of Transplant Recipients (2002-2017). The actual and the 1- and 3-year conditional hazard ratios (HRs) of 1-year GS in DCD LT were calculated using a conditional version of the Cox regression model. The actual 1-, 3-, and 5-year GS of DCD LT recipients were 83.3%, 73.3%, and 66.3%, significantly worse than those of DBD (all P < 0.01). The actual and the 1- and 3-year conditional HRs of 1-year GS for DCD compared with DBD livers were 1.87, 1.49, and 1.39, respectively. Graft loss analyses showed that losses to biliary-related complications remained significantly higher in the DCD group even 3 years after LT. National registry data demonstrate the protracted higher risks inherent to DCD liver grafts compared with their DBD counterparts, despite survival through the early period after LT. These findings underscore the importance of judicious DCD graft selection at the individual center level to minimize the risk of long-term biliary complications.
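The conditional survival idea behind this abstract is simple: given survival to a landmark time t, the probability of surviving s more years is S(t+s)/S(t). A minimal sketch, using illustrative survival probabilities rather than the registry estimates:

```python
# Conditional survival: S(t+s | t) = S(t+s) / S(t), the probability of
# surviving s more years given survival to year t.  The survival values
# below are illustrative, not the registry estimates from the study.
def conditional_survival(surv, t, s):
    return surv[t + s] / surv[t]

# Hypothetical Kaplan-Meier graft survival at years 0..5 for a DCD cohort
surv_dcd = {0: 1.0, 1: 0.833, 2: 0.78, 3: 0.733, 4: 0.70, 5: 0.663}

s1_from_listing = conditional_survival(surv_dcd, 0, 1)  # unconditional 1-year GS
s1_after_year1  = conditional_survival(surv_dcd, 1, 1)  # 1-year GS given survival to year 1
```

Comparing such conditional estimates (or hazards) between DCD and DBD cohorts at successive landmark times is the intuition behind the conditional hazard ratios reported above: the risk conditional on surviving the early period, not the risk from day zero.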


Subject(s)
Liver Transplantation , Tissue and Organ Procurement , Brain Death , Death , Graft Survival , Humans , Proportional Hazards Models , Retrospective Studies , Tissue Donors
7.
Transplantation; 105(9): 1998-2006, 2021 Sep 1.
Article in English | MEDLINE | ID: mdl-32947583

ABSTRACT

BACKGROUND: Rates of withdrawal of life-sustaining treatment are higher among critically ill pediatric patients than among adults. Therefore, livers from pediatric donation after circulatory death (pDCD) could ease the graft organ shortage and shorten waiting time for listed patients. As knowledge on the utilization of pDCD is limited, this study used US national registry data (2002-2017) to estimate the prognostic impact of pDCD in both adult and pediatric liver transplant (LT). METHODS: In adult LT, short-term (1-year) and long-term (overall) graft survival (GS) was compared between pDCD and adult donation after circulatory death (aDCD) grafts. In pediatric LT, the short- and long-term prognostic outcomes of pDCD were compared with other types of grafts (brain dead, split, and living donor). RESULTS: Of 80,843 LTs in the study, 8967 (11.1%) were from pediatric donors. Among these, only 443 were pDCD, which were utilized mainly in adult recipients (91.9%). In adult recipients, short- and long-term GS did not differ significantly between pDCD and aDCD grafts (hazard ratio = 0.82 short term and 0.73 long term, both P > 0.05). Even "very young" (≤12 y) pDCD grafts had GS similar to aDCD grafts, although the rate of graft loss from vascular complications was higher in the former (14.0% versus 3.6%, P < 0.01). In pediatric recipients, pDCD grafts showed GS similar to other graft types, whereas waiting time for DCD livers was significantly shorter (36.5 d versus 53.0 d, P < 0.01). CONCLUSIONS: Given the survival comparable to aDCD grafts, these data show that there is still much scope to improve the utilization of pDCD liver grafts.


Subject(s)
Donor Selection , Graft Survival , Liver Transplantation , Tissue Donors/supply & distribution , Adolescent , Adult , Age Factors , Cause of Death , Child , Child, Preschool , Female , Humans , Incidence , Infant , Infant, Newborn , Liver Transplantation/adverse effects , Male , Middle Aged , Postoperative Complications/epidemiology , Registries , Retrospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome , United States/epidemiology , Waiting Lists , Young Adult
8.
Hepat Oncol; 7(3): HEP26, 2020 Jul 24.
Article in English | MEDLINE | ID: mdl-32774836

ABSTRACT

Liver transplantation for hepatocellular carcinoma has proved to be a highly effective cure if the right patient can be selected. The Milan criteria have traditionally guided physicians toward appropriate liver allocation, but changes in clinical practice, patient populations, and recent developments in biomarkers are decreasing their utility. At the same time, the literature has been flooded with a diversity of new criteria that demonstrate strong predictive power and are better suited to current clinical practice. In this article, the utility of newly proposed criteria is reviewed and important issues for improving future criteria are addressed, in the hope of opening a discussion on how key questions surrounding criteria for liver transplantation of hepatocellular carcinoma can be answered.

9.
Hepatology; 71(2): 569-582, 2020 Feb.
Article in English | MEDLINE | ID: mdl-31243778

ABSTRACT

Prognosticating outcomes in liver transplant (LT) for hepatocellular carcinoma (HCC) continues to challenge the field. Although the Milan Criteria (MC) generalized the practice of LT for HCC and improved outcomes, their predictive character has degraded with increasing candidate and oncological heterogeneity. We sought to validate and recalibrate a previously developed, preoperatively calculated, continuous risk score, the Hazard Associated with Liver Transplantation for Hepatocellular Carcinoma (HALTHCC), in an international cohort. From 2002 to 2014, 4,089 patients (both within and beyond MC [25.2%]) across 16 centers in North America, Europe, and Asia were included. A continuous risk score using pre-LT levels of alpha-fetoprotein, Model for End-Stage Liver Disease Sodium score, and tumor burden score was recalibrated in a randomly selected cohort (n = 1,021) and validated in the remainder (n = 3,068). This study demonstrated significant heterogeneity by site and year, reflecting practice trends over the last decade. On explant pathology, both vascular invasion (VI) and poorly differentiated component (PDC) increased with increasing HALTHCC score. The lowest-risk patients (HALTHCC 0-5) had lower rates of VI and PDC than the highest-risk patients (HALTHCC > 35) (VI: 7.7% [1.2-14.2] vs. 70.6% [48.3-92.9]; PDC: 4.6% [0.1-9.8] vs. 47.1% [22.6-71.5]; P < 0.0001 for both). This trend was robust to MC status. This international study was used to adjust the coefficients of the HALTHCC score. Before recalibration, HALTHCC had the greatest discriminatory ability for overall survival (OS; C-index = 0.61) compared with all previously reported scores. Following recalibration, its prognostic utility increased for both recurrence (C-index = 0.71) and OS (C-index = 0.63). Conclusion: This large international study validated and refined the role of the continuous risk metric HALTHCC in establishing pre-LT risk among candidates with HCC worldwide. Prospective trials introducing HALTHCC into clinical practice are warranted.
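The C-index values quoted in this abstract are Harrell's concordance index, which can be computed from first principles for right-censored data. A minimal sketch with made-up risk scores and outcomes:

```python
# Harrell's concordance index: among comparable pairs (an observed event
# occurring before another patient's event or censoring time), the fraction
# in which the higher-risk patient is the one who fails first.
def c_index(times, events, risks):
    conc = ties = comp = 0
    n = len(times)
    for i in range(n):
        if events[i] != 1:
            continue                     # pairs are anchored on observed events
        for j in range(n):
            if times[i] < times[j]:      # comparable pair
                comp += 1
                if risks[i] > risks[j]:
                    conc += 1
                elif risks[i] == risks[j]:
                    ties += 1
    return (conc + 0.5 * ties) / comp

# Hypothetical scores; event code 1 = death/recurrence observed, 0 = censored
times  = [5, 8, 8, 12, 20]
events = [1, 1, 0, 1, 0]
risks  = [0.9, 0.4, 0.5, 0.6, 0.2]
c = c_index(times, events, risks)
```

A value of 0.5 means the score discriminates no better than chance and 1.0 means perfect ranking, which puts the abstract's reported C-indices of 0.61-0.71 in context.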


Subject(s)
Carcinoma, Hepatocellular/surgery , Liver Neoplasms/surgery , Liver Transplantation , Risk Assessment , Female , Humans , International Cooperation , Male , Middle Aged , Prognosis , Retrospective Studies
10.
Clin Transplant; 33(12): e13743, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31655000

ABSTRACT

OBJECTIVE: This retrospective study aimed to characterize the neutrophil-to-lymphocyte ratio (NLR) on the waitlist and determine its prognostic utility in liver transplantation (LT) for hepatocellular carcinoma (HCC), with special focus on longitudinal data. Biomarkers such as preoperative NLR have been suggested to predict poor oncological outcomes for patients with HCC seeking LT, and NLR's utility is thought to reflect tumor biology. However, recent studies have demonstrated that a high NLR conveys worse outcomes in non-HCC cirrhotics. This study investigated the relationship between NLR, liver function, tumor factors, and patient prognosis. METHODS: Patients with HCC undergoing LT between 2002 and 2014 were identified (n = 422). Variables of interest were collected longitudinally from time of listing until LT. The prognostic utility of NLR was assessed using Kaplan-Meier and Cox proportional hazards regression analyses. Associations between NLR and MELD-Na, AFP, and tumor morphology were also assessed. RESULTS: NLR demonstrated a positive correlation with MELD-Na at LT (R2 = 0.125, P < 0.001), and the two had parallel trends over time. The lowest NLR quartile had a median MELD-Na of 9, while the highest had a median MELD-Na of 19. There were minimal differences in AFP, tumor morphology, and rates of vascular invasion between quartiles. NLR was a statistically significant predictor of OS (HR = 1.64, P = 0.017) and recurrence (HR = 1.59, P = 0.016) even after controlling for important tumor factors. However, NLR lost statistical significance when MELD-Na was added to the Cox regression model (OS: HR = 1.46, P = 0.098; recurrence: HR = 1.40, P = 0.115). CONCLUSIONS: NLR is a highly volatile marker on the waitlist that demonstrates significant correlation and collinearity with MELD-Na over time and at LT. These characteristics call into question its utility as a predictive marker in HCC patients.
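The correlation and R² statistics central to this abstract's collinearity argument are straightforward to compute. A minimal sketch with invented NLR and MELD-Na values, not the study's data:

```python
# Pearson correlation between two longitudinal markers; for a simple linear
# fit of y on x, R^2 equals the squared correlation.  The NLR and MELD-Na
# values below are made up for illustration.
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

nlr     = [1.8, 2.5, 3.1, 4.0, 6.2, 7.5]
meld_na = [9, 11, 14, 15, 18, 19]
r = pearson_r(nlr, meld_na)
r_squared = r ** 2   # collinear markers carry overlapping prognostic information
```

High collinearity between two covariates is exactly why, as the abstract reports, adding MELD-Na to the Cox model can strip NLR of its apparent significance.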


Subject(s)
Carcinoma, Hepatocellular/mortality , Liver Neoplasms/mortality , Liver Transplantation/mortality , Lymphocytes/pathology , Neutrophils/pathology , Waiting Lists/mortality , Aged , Carcinoma, Hepatocellular/pathology , Carcinoma, Hepatocellular/surgery , Female , Follow-Up Studies , Humans , Liver Neoplasms/pathology , Liver Neoplasms/surgery , Male , Middle Aged , Prognosis , Retrospective Studies , Risk Factors , Survival Rate
11.
Clin Transplant; 33(11): e13723, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31583762

ABSTRACT

OBJECTIVE: Portal vein thrombosis (PVT) does not preclude liver transplantation (LT), but poor portal vein (PV) flow after LT remains a predictor of poor outcomes. Given the physiologic tendency of the hepatic artery (HA) to compensate for low PV flow via vasodilation, we investigated whether adequate HA flow would have a favorable prognostic impact among patients with low PV flow following LT. METHODS: This study included 163 patients with PVT who underwent LT between 2004 and 2015. PV and HA flow were categorized into quartiles, and their association with 1-year graft survival (GS) and biliary complication rates was assessed. For both the HA and the PV, patients at the lowest two quartiles were categorized as having low flow and the remainder as having high flow. RESULTS: The median MELD score was 22 and 1-year GS was 87.3%. As expected, GS paralleled PV flow with patients at the lowest flow quartile faring the worst. In combination of PV and HA flows, high HA flow was associated with improved 1-year GS among patients with low PV flow (P = .03). Similar findings were observed with respect to biliary complication rates. CONCLUSIONS: Sufficient HA flow may compensate for poor PV flow. Consequently, meticulous HA reconstruction may be central to achieving optimal outcomes in PVT cases.


Subject(s)
Hepatic Artery/physiopathology , Liver Diseases/mortality , Liver Transplantation/mortality , Liver/blood supply , Portal Vein/pathology , Venous Thrombosis/mortality , Adult , Aged , Female , Follow-Up Studies , Graft Survival , Humans , Liver Circulation , Liver Diseases/surgery , Male , Middle Aged , Prognosis , Survival Rate , Venous Thrombosis/physiopathology
12.
Liver Transpl; 25(8): 1241-1250, 2019 Aug.
Article in English | MEDLINE | ID: mdl-31119826

ABSTRACT

This study estimated the utility of technical variant grafts (TVGs), such as split/reduced liver transplantation (SRLT) and living donor liver transplantation (LDLT), in pediatric acute liver failure (PALF). PALF is a devastating condition portending a poor prognosis without liver transplantation (LT). Pediatric candidates have fewer suitable deceased donor liver transplantation (DDLT) donor organs, and the efficacy of TVG in this setting remains incompletely investigated. PALF patients from 1995 to 2015 (age <18 years) were identified using the Scientific Registry of Transplant Recipients (n = 2419). Cox proportional hazards model and Kaplan-Meier curves were used to assess outcomes. Although wait-list mortality decreased (19.1% to 9.7%) and successful transplantations increased (53.7% to 62.2%), patients <1 year of age had persistently higher wait-list mortality rates (>20%) compared with other age groups (P < 0.001). TVGs accounted for only 25.7% of LT for PALF. In the adjusted model for wait-list mortality, among other factors, increased age (subhazard ratio [SHR], 0.97 per year; P = 0.020) and access to TVG were associated with decreased risk (SHR, 0.37; P < 0.0001). LDLT recipients had shorter median waiting times compared with DDLT (LDLT versus DDLT versus SRLT, 3 versus 4 versus 5 days, respectively; P = 0.017). In the adjusted model for post-LT survival, LDLT was superior to DDLT using whole grafts (SHR, 0.41; P = 0.004). However, patient survival after SRLT was not statistically different from DDLT (SHR, 0.75; P = 0.165). In conclusion, despite clear advantages to reduce wait-list mortality, TVGs have been underutilized in PALF. Early access to TVG, especially from LDLT, should be sought to further improve outcomes.


Subject(s)
Liver Failure, Acute/surgery , Liver Transplantation/methods , Living Donors , Waiting Lists/mortality , Adolescent , Age Factors , Allografts/statistics & numerical data , Allografts/supply & distribution , Child , Child, Preschool , Female , Follow-Up Studies , Graft Survival , Humans , Infant , Kaplan-Meier Estimate , Liver Failure, Acute/diagnosis , Liver Failure, Acute/mortality , Liver Transplantation/statistics & numerical data , Liver Transplantation/trends , Male , Prognosis , Registries/statistics & numerical data , Retrospective Studies , Risk Factors , Severity of Illness Index , Time Factors , Time-to-Treatment , Treatment Outcome
13.
Liver Transpl; 25(5): 741-751, 2019 May.
Article in English | MEDLINE | ID: mdl-30615254

ABSTRACT

A recent study using US national registry data reported, using Cox proportional hazards (PH) models, that split-liver transplantation (SLT) has improved over time and is no more hazardous than whole-liver transplantation (WLT). However, the study methods violated the PH assumption, the fundamental assumption of Cox modeling; as a result, the reported hazard ratios (HRs) are biased and unreliable. This study aimed to investigate, with attention to the PH assumption, whether the risk associated with SLT has really improved over time. It included 80,998 adult deceased donor liver transplantations (1998-2015) from the Scientific Registry of Transplant Recipients. The study period was divided into 3 eras: era 1 (January 1998 to February 2002), era 2 (March 2002 to December 2008), and era 3 (January 2009 to December 2015). The PH assumption was tested using Schoenfeld's test, and where the HR of SLT violated the assumption, changes in the risk of SLT over time from transplant were assessed. SLT was performed in 1098 (1.4%) patients, whereas WLT was used in 79,900 patients. In the Cox PH analysis, the P values of Schoenfeld's global tests were <0.05 in all eras, consistent with deviation from proportionality. To assess HRs of SLT with a time-varying effect, multiple Cox models were fitted over post-LT intervals. The HR curves plotted against time from transplant were higher in the early period, decreased at approximately 1 year, and continued to decrease in all eras. For 1-year graft survival (GS), the HRs of SLT were 1.92 in era 1, 1.52 in era 2, and 1.47 in era 3 (all P < 0.05). In conclusion, the risk of SLT has a time-varying effect and is highest in the early post-LT period; it is underestimated if evaluated by overall GS. SLT was still hazardous when the PH assumption was taken into account, although it became safer over time.
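The time-varying effect this abstract describes can be illustrated with a crude piecewise comparison, an informal analogue of what Schoenfeld's test detects formally. The counts below are hypothetical, chosen only to mimic a hazard concentrated in the first post-transplant year:

```python
# A crude piecewise look at proportional hazards: estimate the SLT-vs-WLT
# incidence-rate ratio separately in the first post-transplant year and
# beyond it.  A large early/late discrepancy signals a time-varying effect,
# i.e. a violated PH assumption.  Counts are hypothetical.
def rate_ratio(events_a, persontime_a, events_b, persontime_b):
    return (events_a / persontime_a) / (events_b / persontime_b)

# (graft losses, person-years at risk) for SLT vs WLT in each window
early_rr = rate_ratio(30, 900, 400, 60000)     # year 0-1 post-LT
late_rr  = rate_ratio(10, 2000, 900, 250000)   # beyond year 1
```

When the early ratio far exceeds the late one, a single overall HR averages the two and understates the early hazard, which is exactly the bias the abstract criticizes.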


Subject(s)
Graft Rejection/epidemiology , Graft Survival , Liver Transplantation/adverse effects , Tissue Donors/statistics & numerical data , Transplant Recipients/statistics & numerical data , Adolescent , Adult , Allografts/supply & distribution , Allografts/surgery , Data Interpretation, Statistical , Female , Follow-Up Studies , Graft Rejection/etiology , Humans , Liver/surgery , Liver Transplantation/methods , Male , Middle Aged , Registries/statistics & numerical data , Time Factors , Treatment Outcome , United States/epidemiology , Young Adult
15.
HPB (Oxford); 21(6): 702-710, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30501989

ABSTRACT

INTRODUCTION: Investigation into right- and left-sided primary colon liver metastasis (CLM) has revealed differences in tumor biology and prognosis, indicating that preoperative and operative factors may affect outcomes of right-sided primary CLM differently than left-sided. This retrospective analysis investigated the effect of resection margin, stratified by left- and right-sided primary CLM, on overall survival (OS) for patients undergoing hepatectomy. METHODS: A total of 732 patients undergoing hepatic resection for CLM at the Cleveland Clinic and Johns Hopkins between 2002 and 2016 were identified. Clinically significant variables were analyzed using Cox proportional hazards regression. The cohort was then divided into patients with right- and left-sided CLM and analyzed separately using Kaplan-Meier analysis and Cox proportional hazards regression. RESULTS: Cox proportional hazards regression showed that left-sided CLM with an R0 margin was a statistically significant predictor of OS even after controlling for other important factors (HR = 0.629, P = 0.024), whereas right-sided CLM with an R0 margin was not (HR = 0.788, P = 0.245). Kaplan-Meier analysis demonstrated that patients with left-sided CLM and an R0 margin had the best prognosis (P = 0.037). CONCLUSION: Surgical margin is an important prognostic factor for left-sided primary CLM, but tumor biology may override surgical technique for right-sided CLM.


Subject(s)
Colonic Neoplasms/pathology , Hepatectomy/methods , Liver Neoplasms/surgery , Margins of Excision , Neoplasm Staging , Aged , Colonic Neoplasms/mortality , Colonic Neoplasms/surgery , Female , Follow-Up Studies , Humans , Liver Neoplasms/diagnosis , Liver Neoplasms/secondary , Male , Middle Aged , Prognosis , Retrospective Studies , Survival Rate/trends , United States/epidemiology
16.
Elife; 7, 2018 Aug 1.
Article in English | MEDLINE | ID: mdl-30066671

ABSTRACT

Understanding cellular processes occurring in vivo on time scales of days to weeks requires repeatedly interrogating the same tissue without perturbing homeostasis. We describe a novel setup for longitudinal intravital imaging of murine peripheral lymph nodes (LNs). The formation and evolution of single germinal centers (GCs) were visualized over days to weeks. Naïve B cells encounter antigen and form primary foci, which subsequently seed GCs. These GCs experience widely varying rates of homogenizing selection, even within closely confined spatial proximity. The fluidity of GCs is greater than previously observed, with large shifts in clonality over short time scales, and loss of GCs is a rare but observable event. The observation of contemporaneous, congruent shifts in clonal composition between GCs within the same animal suggests inter-GC trafficking of memory B cells. This tool refines approaches to resolving immune dynamics in peripheral LNs with high temporospatial resolution and minimal perturbation of homeostasis.


Subject(s)
B-Lymphocytes/immunology , Clonal Evolution , Clonal Selection, Antigen-Mediated/immunology , Germinal Center/cytology , Lymph Nodes/cytology , Animals , B-Lymphocytes/cytology , B-Lymphocytes/physiology , Cell Movement , Cells, Cultured , Germinal Center/immunology , Germinal Center/physiology , Lymph Nodes/immunology , Lymph Nodes/physiology , Mice , Mice, Inbred C57BL , Time-Lapse Imaging
17.
Epilepsia; 59(9): 1667-1675, 2018 Sep.
Article in English | MEDLINE | ID: mdl-30142255

ABSTRACT

OBJECTIVE: Stereotactic electroencephalography (SEEG) is used for the evaluation and identification of the epileptogenic zone (EZ) in patients suffering from medically refractory seizures and relies upon the accurate implantation of depth electrodes. Accurate implantation is critical for identification of the EZ. Multiple electrodes and implantation systems exist, but these have not previously been systematically evaluated for implantation accuracy. This study compares the accuracy of two SEEG electrode implantation methods. METHODS: Thirteen "technique 1" electrodes (applying guiding bolts and external stylets) and 13 "technique 2" electrodes (without guiding bolts and external stylets) were implanted into four cadaver heads (52 total of each) according to each product's instructions for use using a stereotactic robot. Postimplantation computed tomography scans were compared to preimplantation computed tomography scans and to the previously defined targets. Electrode entry and final depth location were measured by Euclidean coordinates. The mean errors of each technique were compared using linear mixed effects models. RESULTS: Primary analysis revealed that the mean error difference of the technique 1 and 2 electrodes at entry and target favored the technique 1 electrode implantation accuracy (P < 0.001). Secondary analysis demonstrated that orthogonal implantation trajectories were more accurate than oblique trajectories at entry for technique 1 electrodes (P = 0.002). Furthermore, deep implantations were significantly less accurate than shallow implantations for technique 2 electrodes (P = 0.005), but not for technique 1 electrodes (P = 0.50). SIGNIFICANCE: Technique 1 displays greater accuracy following SEEG electrode implantation into human cadaver heads. Increased implantation accuracy may lead to increased success in identifying the EZ and increased seizure freedom rates following surgery.
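The entry and target errors in this study are Euclidean distances between planned and measured electrode coordinates. A minimal sketch; the coordinates below are illustrative, not from the cadaver experiments:

```python
# Euclidean localization error between a planned electrode target and the
# tip position measured on post-implantation CT (coordinates in mm; the
# points are illustrative, not from the study).
from math import dist  # Python 3.8+

planned  = (12.0, -34.5, 50.2)
measured = (13.1, -33.9, 51.0)
error_mm = dist(planned, measured)   # sqrt(dx^2 + dy^2 + dz^2)
```

Computing this error per electrode at both the entry point and the final depth, then comparing mean errors between implantation techniques, reproduces the study's accuracy metric.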


Subject(s)
Brain/physiology , Electrodes, Implanted , Stereotaxic Techniques , Brain/diagnostic imaging , Brain Mapping , Cadaver , Electroencephalography , Humans , Imaging, Three-Dimensional
19.
Hepatology; 68(4): 1448-1458, 2018 Oct.
Article in English | MEDLINE | ID: mdl-29604231

ABSTRACT

Patients with hepatocellular carcinoma (HCC) are screened at presentation for appropriateness of liver transplantation (LT) using morphometric criteria, which poorly specify risk. Morphology is also the crux of measuring tumor response to locoregional therapy (LRT) using the modified Response Evaluation Criteria in Solid Tumors (mRECIST). This study investigated the utility of following a continuous risk score (hazard associated with liver transplantation in hepatocellular carcinoma; HALTHCC) to longitudinally assess risk. This multicenter, retrospective study from 2002 to 2014 enrolled 419 patients listed for LT for HCC. One cohort had LRT while waiting (n = 351) and was compared with a control group (n = 68) without LRT. Imaging studies (n = 2,085) were collated with laboratory data to calculate the HALTHCC, MORAL, Metroticket 2.0, and alpha-fetoprotein (AFP) scores longitudinally. Cox proportional hazards models evaluated associations of HALTHCC and peri-LRT changes with intention-to-treat (ITT) survival (considering dropout or post-LT mortality), and utility was assessed with Harrell's C-index. HALTHCC better predicted ITT outcome (LT = 309; dropout = 110) when assessed closer to delisting (P < 0.0001), maximally just before delisting (C-index, 0.742 [0.643-0.790]). Delta-HALTHCC post-LRT was more sensitive to changes in risk than mRECIST. HALTHCC score and peri-LRT percentage change were independently associated with ITT mortality (hazard ratio = 1.105 [1.045-1.169] per point and 1.014 [1.004-1.024] per percent, respectively). CONCLUSIONS: HALTHCC is superior in assessing tumor risk in candidates awaiting LT, and its utility increases over time. Peri-LRT relative change in HALTHCC outperforms mRECIST in stratifying risk of dropout, mortality, and recurrence post-LT. With improving estimates of post-LT outcomes, it is reasonable to consider allocation using HALTHCC and not just waiting time. Furthermore, this study supports a shift in perspective, from listing to allocation, to better utilize precious donor organs. (Hepatology 2018).


Subject(s)
Carcinoma, Hepatocellular/pathology , Carcinoma, Hepatocellular/surgery , Liver Neoplasms/pathology , Liver Neoplasms/surgery , Liver Transplantation/methods , Waiting Lists , Adult , Biomarkers, Tumor/analysis , Biopsy, Needle , Carcinoma, Hepatocellular/mortality , Case-Control Studies , Female , Follow-Up Studies , Graft Survival , Humans , Immunohistochemistry , Kaplan-Meier Estimate , Liver Neoplasms/mortality , Liver Transplantation/adverse effects , Longitudinal Studies , Male , Middle Aged , Patient Selection , Proportional Hazards Models , Retrospective Studies , Risk Assessment , Survival Rate , Treatment Outcome , United States , alpha-Fetoproteins/metabolism
20.
Cell; 170(5): 913-926.e19, 2017 Aug 24.
Article in English | MEDLINE | ID: mdl-28841417

ABSTRACT

Germinal centers (GCs) are the primary sites of clonal B cell expansion and affinity maturation, directing the production of high-affinity antibodies. This response is a central driver of pathogenesis in autoimmune diseases, such as systemic lupus erythematosus (SLE), but the natural history of autoreactive GCs remains unclear. Here, we present a novel mouse model where the presence of a single autoreactive B cell clone drives the TLR7-dependent activation, expansion, and differentiation of other autoreactive B cells in spontaneous GCs. Once tolerance was broken for one self-antigen, autoreactive GCs generated B cells targeting other self-antigens. GCs became independent of the initial clone and evolved toward dominance of individual clonal lineages, indicating affinity maturation. This process produced serum autoantibodies to a breadth of self-antigens, leading to antibody deposition in the kidneys. Our data provide insight into the maturation of the self-reactive B cell response, contextualizing the epitope spreading observed in autoimmune disease.


Subject(s)
B-Lymphocytes/immunology , Clonal Evolution , Germinal Center/cytology , Germinal Center/immunology , Immune Tolerance , Animals , Autoantibodies/immunology , Autoantigens/immunology , Autoimmune Diseases/immunology , B-Lymphocytes/cytology , Chimera/immunology , Epitopes/immunology , Kidney/immunology , Mice , Mice, Inbred C57BL