Predicting rectal cancer prognosis from histopathological images and clinical information using multi-modal deep learning.
Xu, Yixin; Guo, Jiedong; Yang, Na; Zhu, Can; Zheng, Tianlei; Zhao, Weiguo; Liu, Jia; Song, Jun.
Affiliation
  • Xu Y; Department of General Surgery, The Affiliated Hospital of Xuzhou Medical University, Xuzhou, Jiangsu, China.
  • Guo J; Department of General Surgery, The Affiliated Hospital of Xuzhou Medical University, Xuzhou, Jiangsu, China.
  • Yang N; Artificial Intelligence Unit, Department of Medical Equipment Management, Affiliated Hospital of Xuzhou Medical University, Xuzhou, China.
  • Zhu C; Department of General Surgery, The Affiliated Hospital of Xuzhou Medical University, Xuzhou, Jiangsu, China.
  • Zheng T; Artificial Intelligence Unit, Department of Medical Equipment Management, Affiliated Hospital of Xuzhou Medical University, Xuzhou, China.
  • Zhao W; Artificial Intelligence Unit, Department of Medical Equipment Management, Affiliated Hospital of Xuzhou Medical University, Xuzhou, China.
  • Liu J; Department of General Surgery, The Affiliated Hospital of Xuzhou Medical University, Xuzhou, Jiangsu, China.
  • Song J; Department of General Surgery, The Affiliated Hospital of Xuzhou Medical University, Xuzhou, Jiangsu, China.
Front Oncol ; 14: 1353446, 2024.
Article in En | MEDLINE | ID: mdl-38690169
ABSTRACT

Objective:

The objective of this study was to develop a multi-modal deep learning framework for predicting the survival of rectal cancer patients using both digital pathological image data and non-imaging clinical data.

Materials and methods:

The study included patients with pathologically confirmed rectal cancer diagnosed between January 2015 and December 2016. Patients were randomly allocated to training and testing sets at a ratio of 4:1. Tissue microarrays (TMAs) and clinical indicators were obtained. The TMAs were scanned to convert them into digital pathology images, and the patients' clinical data were pre-processed. We then applied distinct deep learning models to perform survival prediction separately on the patients' pathological images and clinical data.
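The abstract does not specify how the split or the clinical pre-processing were implemented; the following is a minimal illustrative sketch of a 4:1 random split with standardization of continuous clinical indicators, where the file name and column names are assumptions, not details from the study.

```python
# Illustrative sketch only: file and column names below are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

clinical = pd.read_csv("clinical_indicators.csv")  # hypothetical table of clinical indicators

# 4:1 random allocation into training and testing sets, stratified by survival status
train_df, test_df = train_test_split(
    clinical, test_size=0.2, random_state=42, stratify=clinical["survival_status"]
)

# Example pre-processing: standardize continuous clinical indicators (assumed feature names)
continuous_cols = ["age", "cea_level"]
scaler = StandardScaler().fit(train_df[continuous_cols])
train_df[continuous_cols] = scaler.transform(train_df[continuous_cols])
test_df[continuous_cols] = scaler.transform(test_df[continuous_cols])
```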

Results:

A total of 292 patients with rectal cancer were randomly allocated into two groups: a training set of 234 cases and a testing set of 58 cases. We first predicted survival status directly from the pre-processed Hematoxylin and Eosin (H&E)-stained pathological images of rectal cancer, using the ResNest model to extract features from patients' histopathological images; this yielded a survival status prediction with an area under the curve (AUC) of 0.797. We then employed a multi-head attention fusion (MHAF) model to combine image features and clinical features for predicting the survival of rectal cancer patients. Our experiments show that the multi-modal architecture outperforms prediction from histopathological images alone, achieving an AUC of 0.837 for overall survival (OS).
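To make the fusion step concrete, the sketch below shows one common way to combine CNN-derived image features with clinical features through multi-head attention in PyTorch. The layer sizes, two-token layout, and classifier head are assumptions for illustration and are not the authors' exact MHAF architecture.

```python
# Minimal sketch of a multi-head attention fusion head (assumed design, not the paper's code).
import torch
import torch.nn as nn

class MHAFusion(nn.Module):
    def __init__(self, img_dim=2048, clin_dim=32, embed_dim=256, num_heads=4):
        super().__init__()
        self.img_proj = nn.Linear(img_dim, embed_dim)    # project image features (e.g. from ResNest)
        self.clin_proj = nn.Linear(clin_dim, embed_dim)  # project clinical features
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.classifier = nn.Linear(embed_dim, 1)        # survival status logit

    def forward(self, img_feat, clin_feat):
        # Treat the two modalities as a 2-token sequence and let self-attention mix them
        tokens = torch.stack([self.img_proj(img_feat),
                              self.clin_proj(clin_feat)], dim=1)  # (B, 2, embed_dim)
        fused, _ = self.attn(tokens, tokens, tokens)
        return self.classifier(fused.mean(dim=1))                 # pooled prediction

# Example usage with random tensors standing in for extracted features
model = MHAFusion()
logits = model(torch.randn(8, 2048), torch.randn(8, 32))  # shape (8, 1)
```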

Conclusions:

Our study highlights the potential of multi-modal deep learning models in predicting survival status from histopathological images and clinical information, thus offering valuable insights for clinical applications.

Full text: 1 Collection: 01-internacional Database: MEDLINE Language: En Journal: Front Oncol Year: 2024 Document type: Article Affiliation country: China Country of publication: Switzerland
