
A Performance Analysis for Confidential Federated Learning

Bruno Casella; Iacopo Colonnelli; Gianluca Mittone; Robert Birke; Marco Aldinucci
2024-01-01

Abstract

Federated Learning (FL) has emerged as a solution to preserve data privacy by keeping the data locally on each participant’s device. However, FL alone is still vulnerable to attacks that can cause privacy leaks. Therefore, additional security measures, at the cost of increased runtimes, become necessary. The Trusted Execution Environment (TEE) approach offers the highest degree of security during execution. However, TEEs suffer from memory limits that prevent safe end-to-end FL training of modern deep models. State-of-the-art approaches either limit secure training to selected layers, failing to avert the full spectrum of attacks, or adopt layer-wise training, which affects model performance. We benchmark the use of a library OS (LibOS) to run the full, unmodified end-to-end FL training inside the TEE. We extensively evaluate and model the overhead of the different security mechanisms needed to protect the data and model during computation (TEE), communication (TLS), and storage (disk encryption). The results across three datasets and two models demonstrate that LibOSes are a viable way to seamlessly inject security into FL with limited overhead (at most 2x), offering valuable guidance for researchers and developers aiming to apply FL in data-security-focused contexts.
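Purely as an illustration (not the paper's code), the Python sketch below shows the shape of the workload being secured: an unmodified local training step followed by a TLS-protected upload of the model update to the aggregator. The endpoint name, CA certificate path, and toy model are assumptions; in the benchmarked setting the same unchanged script would run inside the TEE under the LibOS, with disk encryption applied transparently to anything it writes.

# Minimal sketch of an FL client round (illustrative only, not the paper's code).
# Under the LibOS approach, this script runs unmodified inside the TEE.
import io
import socket
import ssl

import torch
import torch.nn as nn


def local_training_step(model: nn.Module, data, target) -> None:
    """One local SGD step; stands in for a full local epoch."""
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss = nn.functional.cross_entropy(model(data), target)
    opt.zero_grad()
    loss.backward()
    opt.step()


def send_update_over_tls(model: nn.Module, host: str, port: int, ca_file: str) -> None:
    """Serialize the model state dict and push it to the aggregator over TLS."""
    buffer = io.BytesIO()
    torch.save(model.state_dict(), buffer)
    payload = buffer.getvalue()

    # Verify the aggregator's certificate against the provided CA bundle.
    context = ssl.create_default_context(cafile=ca_file)
    with socket.create_connection((host, port)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            tls_sock.sendall(len(payload).to_bytes(8, "big"))  # length prefix
            tls_sock.sendall(payload)


if __name__ == "__main__":
    model = nn.Linear(784, 10)  # toy stand-in for the deep models evaluated in the paper
    data, target = torch.randn(32, 784), torch.randint(0, 10, (32,))
    local_training_step(model, data, target)
    # Hypothetical endpoint and certificate path, shown for illustration:
    # send_update_over_tls(model, "aggregator.example.org", 8443, "ca.pem")

The design point is that no TEE-specific calls appear in the script itself: the enclave and the encrypted storage are provided underneath it by the LibOS and the underlying TEE.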
Year: 2024
Language: English
Type: contribution
Conference type: 4 - Workshop
Conference: 7th Deep Learning Security and Privacy Workshop
Location: San Francisco
Date: 23/05/2024
Relevance: International
Volume title: 2024 IEEE Security and Privacy Workshops (SPW)
Refereeing: scientific committee
Publisher: IEEE Computer Society
Place of publication: Los Alamitos
Publisher country: UNITED STATES OF AMERICA
First page: 40
Last page: 47
Number of pages: 8
ISBN: 979-8-3503-5487-4
Keywords: federated learning, trusted execution environments, Intel SGX, SGX, confidential computing
Conference country: UNITED STATES OF AMERICA
   Future HPC & Big Data - funded by PNRR MUR funds - M4C2 - Investment 1.4 - Call "Centri Nazionali" - D.D. no. 3138 of 16/12/2021, amended by D.D. no. 3175 of 18/12/2021, MUR code CN00000013, CUP D13C22001340001
   CN-HPC
   Ministero dell'Università e della Ricerca
   ALDINUCCI M. - CN-HPC

   EPI SGA2 - European Processor Initiative
   EPI SGA2
   EUROPEAN COMMISSION
   H2020
   POLATO M. - H2020 RIA - G.A. 101036168
1 – product with file in Open Access version
8
info:eu-repo/semantics/conferenceObject
04-CONTRIBUTION IN CONFERENCE PROCEEDINGS::04A-Conference paper in volume
Bruno Casella, Iacopo Colonnelli, Gianluca Mittone, Robert Birke, Walter Riviera, Antonio Sciarappa, Carlo Cavazzoni, Marco Aldinucci
273
partially_open
Files in this product:

DLSP___CONFIDENTIAL_FL.pdf
  Access: open access
  File type: PREPRINT (FIRST DRAFT)
  Size: 344.13 kB
  Format: Adobe PDF

PDF_Editoriale.pdf
  Access: restricted
  Description: Editorial PDF
  File type: EDITORIAL PDF
  Size: 401.68 kB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1961156
Citations
  • PubMed Central: n/a
  • Scopus: 0
  • Web of Science: 0