Deep Learning at Scale

Authors: Paolo Viviani, Maurizio Drocco, Daniele Baccega, Iacopo Colonnelli, Marco Aldinucci
Date: 2019-01-01

Abstract

This work presents a novel approach to distributed training of deep neural networks (DNNs) that aims to overcome the issues related to mainstream approaches to data parallel training. Established techniques for data parallel training are discussed from both a parallel computing and deep learning perspective, then a different approach is presented that is meant to allow DNN training to scale while retaining good convergence properties. Moreover, an experimental implementation is presented as well as some preliminary results.
Year: 2019
Conference: Euromicro International Conference on Parallel, Distributed and Network-Based Processing (PDP)
Location: Pavia, Italy
Dates: 13-15 February 2019
Published in: Proc. of the 27th Euromicro Intl. Conference on Parallel, Distributed and Network-Based Processing (PDP)
Publisher: IEEE
Pages: 124-131
ISBN: 978-1-7281-1644-0
URL: https://ieeexplore.ieee.org/document/8671552
Keywords: deep learning, distributed computing, machine learning, large scale, C++
Files in this record:
  • 19_deeplearning_PDP.pdf: open access, author's final version (postprint), 404.27 kB, Adobe PDF
  • 19_deeplearning_PDP_editorial.pdf: restricted access, publisher's PDF (editorial version), 173.9 kB, Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1695211
Citations
  • Scopus: 7
  • Web of Science: 4