Diagnosis of nasopharyngeal carcinoma with convolutional neural network on narrowband imaging
Journal of Clinical Otorhinolaryngology Head and Neck Surgery (临床耳鼻咽喉头颈外科杂志); (12): 483-486, 2023.
Article in Chinese | WPRIM | ID: wpr-982772
Responsible library: WPRO
ABSTRACT
Objective: To evaluate the diagnostic accuracy of a convolutional neural network (CNN) in diagnosing nasopharyngeal carcinoma using endoscopic narrowband imaging. Methods: A total of 834 cases with nasopharyngeal lesions were collected from the People's Hospital of Guangxi Zhuang Autonomous Region between 2014 and 2016. We trained a DenseNet201 model to classify the endoscopic images, evaluated its performance on the test dataset, and compared the results with those of two independent endoscopic experts. Results: The area under the ROC curve of the CNN in diagnosing nasopharyngeal carcinoma was 0.98. The sensitivity and specificity of the CNN were 91.90% and 94.69%, respectively. The sensitivities of the two expert assessments were 92.08% and 91.06%, and their specificities were 95.58% and 92.79%, respectively. There was no significant difference between the diagnostic accuracy of the CNN and that of the expert assessments (P=0.282, P=0.085). Moreover, there was no significant difference in accuracy between discriminating early-stage and late-stage nasopharyngeal carcinoma (P=0.382). The CNN model could rapidly distinguish nasopharyngeal carcinoma from benign lesions, with an image recognition time of 0.1 s per image. Conclusion: The CNN model can quickly distinguish nasopharyngeal carcinoma from benign nasopharyngeal lesions, which can aid endoscopists in diagnosing nasopharyngeal lesions and reduce the rate of unnecessary nasopharyngeal biopsy.
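The sensitivity and specificity figures reported above follow the standard confusion-matrix definitions (true-positive rate and true-negative rate). A minimal sketch in Python; the counts below are hypothetical round numbers for illustration only, not the study's actual data:

```python
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# Hypothetical confusion-matrix counts (not from the paper)
tp, fn = 91, 9    # carcinoma cases: correctly / incorrectly classified
tn, fp = 94, 6    # benign cases: correctly / incorrectly classified

print(f"sensitivity = {sensitivity(tp, fn):.2%}")  # 91.00%
print(f"specificity = {specificity(tn, fp):.2%}")  # 94.00%
```

With these definitions, a model that rarely misses carcinomas has high sensitivity, while one that rarely flags benign lesions has high specificity; the study compares both quantities between the CNN and the expert raters.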
Keywords
Full text: 1
Database: WPRIM
Main subject: China / Nasopharyngeal Neoplasms / Neural Networks, Computer / Narrow Band Imaging / Nasopharyngeal Carcinoma
Limits: Humans
Country/Region as subject: Asia
Language: Chinese
Journal: Journal of Clinical Otorhinolaryngology Head and Neck Surgery
Publication year: 2023
Document type: Article