
Pooling critical datasets with Federated Learning

Yasir Arfat;Gianluca Mittone;Iacopo Colonnelli;Fabrizio D'Ascenzo;Roberto Esposito;Marco Aldinucci
2023-01-01

Abstract

Federated Learning (FL) is becoming popular in different industrial sectors where data access is critical for security, privacy and the economic value of data itself. Unlike traditional machine learning, where all the data must be gathered in one place for analysis, FL makes it possible to extract knowledge from data distributed across different organizations and can be coupled with different Machine Learning paradigms. In this work, we replicate, using Federated Learning, the analysis of a pooled dataset (with AdaBoost) that was used to define the PRAISE score, which is today among the most accurate scores for evaluating the risk of a second acute myocardial infarction. We show that, thanks to the extended-OpenFL framework, which implements AdaBoost.F, we can train a federated PRAISE model whose accuracy and recall are comparable to those of the centralised model. With a 16-party federation, we achieved F1 and F2 scores consistently comparable to those of the PRAISE score study, in an order of magnitude less time.
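The abstract refers to AdaBoost.F, the federated boosting scheme implemented in the extended-OpenFL framework. As a rough illustration of how one round of such a scheme proceeds (local weak learners, error aggregation, global selection, local weight updates), the following self-contained Python sketch uses scikit-learn decision stumps on synthetic party data. The federation setup, the per-party weight normalisation, and all names here are hypothetical simplifications; this is not the OpenFL-based implementation used in the paper.

```python
# Minimal single-process sketch of a federated AdaBoost-style round,
# loosely in the spirit of AdaBoost.F. Illustration only: it does NOT
# use or reproduce the extended-OpenFL API from the paper.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Hypothetical local datasets for a small federation (labels in {-1, +1}).
parties = []
for _ in range(3):
    X = rng.normal(size=(200, 5))
    y = np.sign(X[:, 0] + 0.5 * rng.normal(size=200))
    y[y == 0] = 1
    parties.append({"X": X, "y": y, "w": np.full(200, 1 / 200)})

ensemble, alphas = [], []

for _round in range(10):
    # 1) Each party trains a weak learner on its locally re-weighted data.
    candidates = []
    for p in parties:
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(p["X"], p["y"], sample_weight=p["w"])
        candidates.append(stump)

    # 2) Each party reports the weighted error of every candidate on its
    #    own data; the aggregator sums them and picks the global best.
    errors = np.zeros(len(candidates))
    total_weight = sum(p["w"].sum() for p in parties)
    for j, h in enumerate(candidates):
        for p in parties:
            miss = h.predict(p["X"]) != p["y"]
            errors[j] += np.sum(p["w"][miss])
    errors /= total_weight
    best = int(np.argmin(errors))
    eps = np.clip(errors[best], 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - eps) / eps)

    ensemble.append(candidates[best])
    alphas.append(alpha)

    # 3) Each party updates its local sample weights using the selected
    #    learner (normalisation per party is a simplification).
    for p in parties:
        margin = p["y"] * candidates[best].predict(p["X"])
        p["w"] *= np.exp(-alpha * margin)
        p["w"] /= p["w"].sum()

def predict(X):
    """Weighted vote of the selected weak learners."""
    score = sum(a * h.predict(X) for a, h in zip(alphas, ensemble))
    return np.sign(score)
```

In this sketch only the weak learners and per-candidate weighted errors cross party boundaries, never the raw local data, which is the property that makes boosting-style federation attractive for clinical datasets like the one behind the PRAISE score.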
2023
Euromicro Intl. Conference on Parallel, Distributed and Network-Based Processing (PDP)
Naples, Italy
1-3 March 2023
Proc. of the 31st Euromicro Intl. Conference on Parallel, Distributed and Network-Based Processing (PDP)
IEEE
pp. 329-337
979-8-3503-3763-1
Federated Learning, Machine Learning, Cardiology, Healthcare, Performance Analysis, Decentralized machine learning, Distributed machine learning, PRAISE score
Files in this product:
  • 23_pdp_fl.pdf (open access): Preprint (first draft), Adobe PDF, 1.17 MB
  • PDP2023_editorial.pdf (restricted access): Editorial PDF, Adobe PDF, 341.43 kB

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1890256
Citations
  • Scopus: 6