Dual attention multiple instance learning with unsupervised complementary loss for COVID-19 screening.
Chikontwe, Philip; Luna, Miguel; Kang, Myeongkyun; Hong, Kyung Soo; Ahn, June Hong; Park, Sang Hyun.
  • Chikontwe P; Department of Robotics Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu, South Korea.
  • Luna M; Department of Robotics Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu, South Korea.
  • Kang M; Department of Robotics Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu, South Korea.
  • Hong KS; Division of Pulmonology and Allergy, Department of Internal Medicine, Regional Center for Respiratory Diseases, Yeungnam University Medical Center, College of Medicine, Yeungnam University, Daegu, Korea.
  • Ahn JH; Division of Pulmonology and Allergy, Department of Internal Medicine, Regional Center for Respiratory Diseases, Yeungnam University Medical Center, College of Medicine, Yeungnam University, Daegu, Korea. Electronic address: fireajh@yu.ac.kr.
  • Park SH; Department of Robotics Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu, South Korea. Electronic address: shpark13135@dgist.ac.kr.
Med Image Anal; 72: 102105, 2021 Aug.
Article in English | MEDLINE | ID: covidwho-1240507
Preprint
This journal article is likely based on a previously available preprint, identified by machine-based similarity matching. Human confirmation is still pending.
View preprint
ABSTRACT
Chest computed tomography (CT) based analysis and diagnosis of Coronavirus Disease 2019 (COVID-19) plays a key role in combating the outbreak of the pandemic that has rapidly spread worldwide. To date, the disease has infected more than 18 million people, with over 690k deaths reported. Reverse transcription polymerase chain reaction (RT-PCR) is the current gold standard for clinical diagnosis but may produce false negatives; thus, chest CT based diagnosis is considered more viable. However, accurate screening is challenging due to the difficulty of annotating infected areas, curating large datasets, and the subtle differences between COVID-19 and other viral pneumonia. In this study, we propose an attention-based end-to-end weakly supervised framework for the rapid diagnosis of COVID-19 and bacterial pneumonia based on multiple instance learning (MIL). We further incorporate unsupervised contrastive learning for improved accuracy, with attention applied in both spatial and latent contexts; we term this Dual Attention Contrastive based MIL (DA-CMIL). DA-CMIL takes as input several patient CT slices (considered as a bag of instances) and outputs a single label. Attention-based pooling is applied to implicitly select key slices in the latent space, whereas spatial attention learns slice spatial context for interpretable diagnosis. A contrastive loss is applied at the instance level to encode the similarity of features from the same patient against representative pooled patient features. Empirical results show that our algorithm achieves an overall accuracy of 98.6% and an AUC of 98.4%. Moreover, ablation studies show the benefit of contrastive learning with MIL.
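The two mechanisms named in the abstract can be sketched in a few lines: attention-based pooling scores each CT slice and forms a weighted bag feature, and an InfoNCE-style instance-level term pulls each slice embedding toward its own patient's pooled feature against other patients' pooled features. This is a minimal NumPy illustration under assumed toy dimensions and random parameters (`V`, `w`, `tau` are placeholders), not the paper's exact architecture or loss:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_mil_pool(feats, V, w):
    """Attention-based MIL pooling: score each slice, softmax-normalize,
    and return the attention-weighted bag feature plus the weights."""
    scores = np.tanh(feats @ V) @ w          # (n_slices,)
    alpha = softmax(scores)                  # weights over slices, sum to 1
    return alpha @ feats, alpha              # pooled feature (d,), weights

def instance_contrastive_loss(feats, own_bag, other_bags, tau=0.5):
    """InfoNCE-style instance-level term (hedged sketch): each slice embedding
    should be closer to its own patient's pooled feature than to others'."""
    norm = lambda x: x / np.linalg.norm(x, axis=-1, keepdims=True)
    f = norm(feats)                                        # (n, d)
    bags = norm(np.vstack([own_bag] + list(other_bags)))   # own bag is row 0
    sims = (f @ bags.T) / tau                              # (n, 1 + n_other)
    m = sims.max(axis=1)                                   # stable log-sum-exp
    log_prob = sims[:, 0] - (m + np.log(np.exp(sims - m[:, None]).sum(axis=1)))
    return float(-log_prob.mean())

rng = np.random.default_rng(0)
slices_a = rng.normal(size=(5, 8))   # toy: 5 CT slices, 8-dim embeddings
slices_b = rng.normal(size=(6, 8))   # a second "patient" supplies negatives
V, w = rng.normal(size=(8, 4)), rng.normal(size=4)
bag_a, alpha_a = attention_mil_pool(slices_a, V, w)
bag_b, _ = attention_mil_pool(slices_b, V, w)
loss = instance_contrastive_loss(slices_a, bag_a, [bag_b])
```

Because the attention weights are explicit, they can also be read out per slice for interpretability, which is the role spatial/latent attention plays in the described framework.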
Full text: Available | Collection: International databases | Database: MEDLINE | Main subject: Viral Pneumonia / COVID-19 | Study type: Diagnostic study / Prognostic study | Limit: Humans | Language: English | Journal: Med Image Anal | Journal subject: Diagnostic Imaging | Year: 2021 | Document type: Article | Country of affiliation: J.media.2021.102105