Feasibility of deep learning image-based segmentation algorithm in pathological section of gastric cancer / 第二军医大学学报
Article in Zh | WPRIM | ID: wpr-838165
Responsible library: WPRO
ABSTRACT
Objective: To recognize cancerous regions in pathological slices of gastric cancer using a deep learning-based segmentation algorithm.
Methods: Using the U-Net network as the basic structure, a deeper segmentation algorithm, deeper U-Net (DU-Net), was designed for gastric cancer pathological slices. The dataset was divided into small blocks by a region-overlapping segmentation method. The blocks were first segmented by the pre-trained DU-Net model, and new samples were re-synthesized with an image classifier to remove false-positive samples. The new samples were trained repeatedly by an iterative learning method, and the segmentation results were post-processed with a fully connected conditional random field (CRF). Finally, segmentation maps of gastric cancer were obtained and validated.
Results: After three rounds of repeated learning, the mean accuracy of the DU-Net model on pathological slices of gastric cancer was 91.5%, and the mean intersection-over-union (IoU) was 88.4%. Compared with the basic DU-Net model without repeated learning, the mean accuracy and mean IoU increased by 2.9% and 5.6%, respectively.
Conclusion: The deep learning-based segmentation algorithm for pathological slices of gastric cancer can accurately recognize cancerous regions, improves the generalization ability and robustness of the model, and can be used for computer-assisted diagnosis of gastric cancer.
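The region-overlapping segmentation step described in the Methods (tiling a large pathology slice into overlapping blocks, predicting per block, and merging back) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the block size, stride, and the averaging of overlapping predictions are assumptions, and the DU-Net model itself is replaced by a placeholder per-block probability map.

```python
import numpy as np

def tile_overlapping(image, block=256, stride=192):
    """Split a 2-D image into overlapping square blocks.
    Returns the blocks and their top-left (y, x) coordinates.
    Edge handling (partial tiles at the right/bottom border) is omitted
    for brevity."""
    h, w = image.shape[:2]
    blocks, coords = [], []
    for y in range(0, max(h - block, 0) + 1, stride):
        for x in range(0, max(w - block, 0) + 1, stride):
            blocks.append(image[y:y + block, x:x + block])
            coords.append((y, x))
    return blocks, coords

def stitch_predictions(preds, coords, shape, block=256):
    """Merge per-block probability maps back into one full-size map,
    averaging wherever blocks overlap."""
    acc = np.zeros(shape, dtype=np.float64)
    cnt = np.zeros(shape, dtype=np.float64)
    for p, (y, x) in zip(preds, coords):
        acc[y:y + block, x:x + block] += p
        cnt[y:y + block, x:x + block] += 1
    return acc / np.maximum(cnt, 1)  # avoid division by zero in uncovered areas

# Usage sketch: in the real pipeline, each block would be passed through the
# pre-trained DU-Net; here a constant probability map stands in for the model.
slide = np.zeros((512, 512))
blocks, coords = tile_overlapping(slide)
probs = [np.full((256, 256), 0.9) for _ in blocks]  # placeholder predictions
cancer_map = stitch_predictions(probs, coords, slide.shape)
```

Averaging overlapping predictions is one common way to suppress block-boundary artifacts in whole-slide segmentation; the CRF post-processing and false-positive filtering mentioned in the abstract would follow on `cancer_map`.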
Full text: 1 Index: WPRIM Study type: Prognostic_studies Language: Zh Journal: Academic Journal of Second Military Medical University Year of publication: 2018 Document type: Article