Worth a Thousand Words: Automated Diagnosis of COVID-19 from Chest CTs

Scientists develop new algorithm for rapid, computerized diagnosis of COVID-19, overcoming the limitations of reverse transcription polymerase chain reaction

Philip Chikontwe (left) and Prof. Sang Hyun Park from Daegu Gyeongbuk Institute of Science and Technology (DGIST), South Korea, have developed a new framework for accurate and interpretable automated analysis of chest CT scans.

Automated frameworks for the analysis of medical images, such as chest CT scans, greatly cut down the time to diagnosis for individual patients. They also reduce the burden on radiologists in countries with limited medical and human resources.

The current gold standard for COVID-19 diagnosis is a nasal swab followed by reverse transcription polymerase chain reaction (RT-PCR). But such tests are time-consuming, requiring days before results are available, which wastes crucial time in the treatment and prevention of the disease. Recently, scientists from Korea have developed a computerized framework that can swiftly and accurately interpret chest CT scans to provide a COVID-19 diagnosis in minutes, potentially changing how we tackle this disease.

 

In a little over 18 months, the novel coronavirus (SARS-CoV-2) has infected over 18 million people and caused more than 690,000 deaths. The current diagnostic standard, RT-PCR, is limited owing to its low sensitivity, high rate of false negatives, and long testing times. This makes it difficult to identify infected patients quickly and provide them with timely treatment. Furthermore, there is a risk that patients will continue to spread the disease while waiting for their test results.

 

Chest CT scans have emerged as a quick and effective way to diagnose the disease, but they require radiologist expertise to interpret, and the scans can sometimes resemble those of other lung infections, such as bacterial pneumonia. Now, a new paper in Medical Image Analysis by a team of scientists, including those from Daegu Gyeongbuk Institute of Science and Technology (DGIST), South Korea, details a technique for the automated and accurate interpretation of chest CT scans. “As academics who were equally affected by the COVID-19 pandemic, we were keen to use our expertise in medical image analysis to aid in faster diagnosis and improve clinical workflows,” say Prof. Sang Hyun Park and Mr. Philip Chikontwe of DGIST, who led the study.

 

To build their diagnostic framework, the research team used a machine learning technique called “multiple instance learning” (MIL). In MIL, the algorithm is “trained” using sets, or “bags,” of multiple examples called “instances,” and then uses these bags to learn how to label new examples or inputs. The research team trained their new framework, called dual attention contrastive-based MIL (DA-CMIL), to differentiate between COVID-19 and bacterial pneumonia, and found that its performance was on par with that of other state-of-the-art automated image analysis methods. Moreover, the DA-CMIL algorithm can leverage limited or incomplete information to train its AI system efficiently.
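To make the idea concrete, the following is a minimal, illustrative sketch of attention-based multiple instance learning in PyTorch. It is not the authors' DA-CMIL implementation; the feature dimensions, layer sizes, and the simple attention pooling are assumptions chosen only to show how a bag of CT-slice features can be pooled into a single patient-level prediction.

```python
# Minimal attention-based MIL sketch (illustrative only, NOT the authors' DA-CMIL code).
# A "bag" is a stack of per-slice feature vectors; the model predicts one label per bag.
import torch
import torch.nn as nn

class AttentionMIL(nn.Module):
    def __init__(self, feat_dim=512, hidden_dim=128, n_classes=2):
        super().__init__()
        # Scores how strongly each instance (CT slice) contributes to the bag decision
        self.attention = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, bag):                                   # bag: (n_instances, feat_dim)
        weights = torch.softmax(self.attention(bag), dim=0)   # (n_instances, 1), sums to 1
        bag_embedding = (weights * bag).sum(dim=0)            # attention-weighted pooling
        return self.classifier(bag_embedding), weights        # bag logits + instance weights

# Toy usage: a "bag" of 30 slice features, classified as COVID-19 vs. bacterial pneumonia
model = AttentionMIL()
bag = torch.randn(30, 512)            # stand-in for CNN features of 30 CT slices
logits, attn = model(bag)
print(logits.shape, attn.shape)       # torch.Size([2]) torch.Size([30, 1])
```

Because only the patient-level (bag) label is needed during training, such a model can in principle learn without slice-by-slice annotations, which is what lets MIL-style methods work with limited or incomplete labels.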

 

“Our study can be viewed from both a technical and clinical perspective. First, the algorithms introduced here can be extended to similar settings with other types of medical images. Second, the ‘dual attention,’ particularly the ‘spatial attention,’ used in the model improves the interpretability of the algorithm, which will help clinicians understand how automated solutions make decisions,” explain Prof. Park and Mr. Chikontwe.
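As a rough illustration of how such spatial attention can aid interpretability, the sketch below overlays a coarse attention map on a CT slice as a heat map, highlighting the regions that most influenced a model's prediction. This is not the authors' code; the 16×16 attention-map size, the random stand-in data, and the output file name are assumptions made purely for illustration.

```python
# Illustrative sketch: visualising a coarse spatial-attention map over a CT slice
# so a clinician can see which regions drove the model's decision.
# (NOT the authors' code; array sizes and file names are assumptions.)
import numpy as np
import matplotlib.pyplot as plt
from scipy.ndimage import zoom

ct_slice = np.random.rand(512, 512)      # stand-in for a preprocessed CT slice
attn_map = np.random.rand(16, 16)        # stand-in for the model's spatial attention

attn_up = zoom(attn_map, 512 / 16)                                             # upsample to image size
attn_up = (attn_up - attn_up.min()) / (attn_up.max() - attn_up.min() + 1e-8)   # normalise to [0, 1]

plt.imshow(ct_slice, cmap="gray")
plt.imshow(attn_up, cmap="jet", alpha=0.4)   # semi-transparent heat-map overlay
plt.axis("off")
plt.savefig("attention_overlay.png", bbox_inches="tight")
```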

 

The implications of this research extend far beyond the COVID-19 pandemic: it lays the foundation for more robust and affordable diagnostic systems, which will be of particular benefit to developing countries and those with otherwise limited medical and human resources.

 

Reference

Authors:

Philip Chikontwe¹, Miguel Luna¹, Myeongkyun Kang¹, Kyung Soo Hong², June Hong Ahn²,*, Sang Hyun Park¹,*

Title of original paper:

Dual attention multiple instance learning with unsupervised complementary loss for COVID-19 screening

Journal:

Medical Image Analysis

DOI:

10.1016/j.media.2021.102105

Affiliations:

¹Department of Robotics Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST), South Korea

²College of Medicine, Yeungnam University, South Korea

 

*Corresponding author’s email: [email protected]; [email protected]

 

 

About Daegu Gyeongbuk Institute of Science and Technology (DGIST)

Daegu Gyeongbuk Institute of Science and Technology (DGIST) is a well-known and respected research institute located in Daegu, Republic of Korea. Established in 2004 by the Korean government, DGIST aims to promote national science and technology and to boost the local economy.

With a vision of “Changing the world through convergence,” DGIST has undertaken a wide range of research in various fields of science and technology. DGIST has embraced a multidisciplinary approach, conducting intensive studies in some of today’s most vital fields. It also has state-of-the-art infrastructure that enables cutting-edge research in materials science, robotics, cognitive sciences, and communication engineering.

 

Website:   https://www.dgist.ac.kr/en/html/sub01/010204.html

 

 

About the author

Sang Hyun Park has been an Assistant Professor in the Department of Robotics Engineering at DGIST since 2017. Before that, he was a postdoctoral fellow for a year at SRI International and for two years at the University of North Carolina. He received his Ph.D. from Seoul National University, and his research interests include medical image analysis, computer vision, and machine learning.

 

Philip Chikontwe is a PhD candidate in the Medical Imaging and Signal Processing Lab (MISPL) in the Robotics Engineering division at DGIST. He holds an MSc from Chonbuk National University, and his research interests include medical image analysis with deep learning under limited or incomplete supervision.

Published: 22 Oct 2021

Contact details:

DGIST PR

333, Techno jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu, 42988

+82-53-785-1135