A Two-Step Radiologist-Like Approach for Covid-19 Computer-Aided Diagnosis from Chest X-Ray Images

Barbano Carlo Alberto Maria (First); Tartaglione E.; Berzovini C.; Calandri M.; Grangetto M.
2022-01-01

Abstract

Thanks to the rapid increase in computational capability in recent years, traditional and more explainable methods have gradually been replaced by more complex deep-learning-based approaches, which have reached new state-of-the-art results on a variety of tasks. However, for certain kinds of applications, performance alone is not enough. A prime example is the medical field, in which building trust between physicians and AI models is fundamental. Providing an explainable or trustworthy model, however, is not a trivial task, given the black-box nature of deep-learning-based methods. While some existing techniques, such as gradient or saliency maps, try to provide insight into the functioning of deep neural networks, they often provide limited information with regard to clinical needs. We propose a two-step diagnostic approach for the detection of Covid-19 infection from Chest X-Ray (CXR) images. Our approach is designed to mimic the diagnostic process of human radiologists: it first detects objective radiological findings in the lungs, which are then used to make the final Covid-19 diagnosis. We believe that this kind of structural explainability is preferable in this context. The proposed approach achieves promising performance in Covid-19 detection, comparable to that of expert human radiologists. Moreover, although this work focuses on Covid-19, we believe the approach could be employed for many other CXR-based diagnoses.
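The paper's implementation details are not reproduced in this record; the following is a minimal PyTorch sketch of such a two-step pipeline, where the backbone choice, the set of findings, and all layer sizes are illustrative assumptions rather than the authors' actual configuration:

```python
import torch
import torch.nn as nn
import torchvision.models as models

# Hypothetical set of radiological findings; the label set actually used
# in the paper is not given in this record.
FINDINGS = ["consolidation", "ground_glass_opacity",
            "pleural_effusion", "interstitial_pattern"]

class TwoStepCovidClassifier(nn.Module):
    """Sketch of a radiologist-like two-step pipeline:
    step 1 predicts objective radiological findings from the CXR,
    step 2 maps those findings to a Covid-19 diagnosis."""

    def __init__(self, num_findings: int = len(FINDINGS)):
        super().__init__()
        # Step 1: CNN backbone performing multi-label classification
        # over the radiological findings (backbone is an assumption).
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Linear(backbone.fc.in_features, num_findings)
        self.findings_net = backbone
        # Step 2: a small classifier that sees ONLY the findings vector,
        # which is what makes the decision structurally explainable.
        self.diagnosis_net = nn.Sequential(
            nn.Linear(num_findings, 16),
            nn.ReLU(),
            nn.Linear(16, 1),
        )

    def forward(self, cxr: torch.Tensor):
        findings_logits = self.findings_net(cxr)          # (B, num_findings)
        findings_probs = torch.sigmoid(findings_logits)   # interpretable intermediate
        covid_logit = self.diagnosis_net(findings_probs)  # (B, 1)
        return findings_probs, covid_logit

model = TwoStepCovidClassifier()
x = torch.randn(2, 3, 224, 224)  # batch of pre-processed chest X-rays
findings, covid = model(x)
```

The key design point in this sketch is that the diagnosis head only sees the intermediate findings vector, so every Covid-19 prediction can be traced back to human-readable radiological evidence, mirroring the structural explainability the abstract describes.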
Year: 2022
Conference: 21st International Conference on Image Analysis and Processing, ICIAP 2022 (Italy)
Series: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 13231
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 173-184
ISBN: 978-3-031-06426-5 (print); 978-3-031-06427-2 (online)
URL: https://link.springer.com/chapter/10.1007/978-3-031-06427-2_15
Keywords: Chest x-ray; Covid-19; Deep learning; Radiological findings
Authors: Barbano Carlo Alberto Maria; Tartaglione E.; Berzovini C.; Calandri M.; Grangetto M.
Files in this record:
output-3.pdf — Open Access since 16/05/2023
File type: POSTPRINT (author's final version)
Size: 1.09 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1863524
Citations
  • PMC: ND
  • Scopus: 2
  • Web of Science (ISI): 2