Improved method for COVID-19 Classification of Complex-Architecture CNN from Chest CT volumes using Orthogonal Ensemble Networks
Progress in Biomedical Optics and Imaging - Proceedings of SPIE ; 12465, 2023.
Article in English | Scopus | ID: covidwho-20243842
ABSTRACT
This paper introduces an improved method for COVID-19 classification from computed tomography (CT) volumes that combines a complex-architecture convolutional neural network (CNN) with orthogonal ensemble networks (OEN). The novel coronavirus disease reported in 2019 (COVID-19) is still spreading worldwide, so early and accurate diagnosis is required, and the CT scan is an essential examination. Various computer-aided diagnosis (CAD) methods have been developed to assist and accelerate doctors' diagnoses. Ensemble learning is one effective approach, but existing methods combine general-purpose models that are not specialized for COVID-19. In this study, we attempted to improve the performance of a CNN for COVID-19 classification from chest CT volumes. The CNN model specializes in feature extraction from anisotropic chest CT volumes, and we adopt the OEN, an ensemble learning method that accounts for inter-model diversity, to boost its feature extraction ability. For the experiments, we used chest CT volumes of 1283 cases acquired at multiple medical institutions in Japan. The classification results on 257 test cases indicated that the combination could improve classification performance. © 2023 SPIE.
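The two ingredients named in the abstract can be sketched in a few lines: an inter-model diversity term that rewards orthogonal (decorrelated) feature responses across ensemble members, and a prediction step that averages the members' softmax outputs. This is a minimal illustrative numpy sketch under assumed forms (a squared-cosine penalty on feature vectors and probability averaging); it is not the authors' implementation, and the function names and feature vectors are hypothetical.

```python
import numpy as np

def orthogonality_penalty(feats_a, feats_b):
    """Illustrative inter-model diversity term: squared cosine similarity
    between the feature responses of two ensemble members. The penalty is
    0 when the features are orthogonal (maximally diverse) and 1 when the
    members respond identically (assumed form, not the paper's exact loss)."""
    a = feats_a / np.linalg.norm(feats_a)
    b = feats_b / np.linalg.norm(feats_b)
    return float(np.dot(a, b) ** 2)

def ensemble_predict(logits_list):
    """Combine ensemble members by averaging their softmax probabilities."""
    probs = []
    for logits in logits_list:
        e = np.exp(logits - logits.max())  # stabilized softmax
        probs.append(e / e.sum())
    return np.mean(probs, axis=0)

# Hypothetical feature vectors from two CNN ensemble members
f1 = np.array([1.0, 0.0, 0.0])
f2 = np.array([0.0, 1.0, 0.0])
print(orthogonality_penalty(f1, f2))  # orthogonal features -> 0.0

# Hypothetical per-member class logits, averaged into one prediction
pred = ensemble_predict([np.array([2.0, 0.5]), np.array([1.0, 1.0])])
print(pred)  # a probability vector summing to 1
```

In training, a penalty like this would be added to the classification loss so that gradient descent pushes the members toward complementary features, which is the intuition behind "ensemble learning considering inter-model diversity".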
Keywords

Full text: Available Collection: International organizations databases Database: Scopus Language: English Journal: Progress in Biomedical Optics and Imaging - Proceedings of SPIE Year: 2023 Document type: Article

