
Shannon Strikes Again! Entropy-based Pruning in Deep Neural Networks for Transfer Learning under Extreme Memory and Computation Budgets

Spadaro G.; Renzulli R.; Bragagnolo A.; Giraldo J.H.; Fiandrotti A.; Grangetto M.; Tartaglione E.
2023-01-01

Abstract

Deep neural networks have become the de-facto standard across various computer science domains. Nonetheless, effectively training these deep networks remains challenging and resource-intensive. This paper investigates the efficacy of pruned deep learning models in transfer learning scenarios under extremely low memory budgets, tailored for TinyML models. Our study reveals that the source task's model with the highest activation entropy outperforms others in the target task. Motivated by this, we propose an entropy-based Efficient Neural Transfer with Reduced Overhead via PrunIng (ENTROPI) algorithm. Through comprehensive experiments on diverse models (ResNet18 and MobileNet-v3) and target datasets (CIFAR-100, VLCS, and PACS), we substantiate the superior generalization achieved by transfer learning from the entropy-pruned model. Quantitative measures for entropy provide valuable insights into the reasons behind the observed performance improvements. The results underscore ENTROPI's potential as an efficient solution for enhancing generalization in data-limited transfer learning tasks.
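The abstract describes ranking candidate source-task models by their activation entropy and transferring from the entropy-pruned one. As a rough illustration only, the PyTorch sketch below estimates a per-channel binary (on/off) activation entropy on source-task data and zeroes out the lowest-entropy filters of a convolutional layer. The helper names (channel_activation_entropy, prune_low_entropy_channels), the on/off entropy estimate, and the masking strategy are assumptions made for this sketch, not the paper's ENTROPI implementation.

```python
# Illustrative sketch (not the paper's ENTROPI algorithm): rank the output
# channels of a convolutional layer by the entropy of their post-ReLU
# activations on source-task data, then mask the lowest-entropy filters.
import torch
import torch.nn.utils.prune as prune


def channel_activation_entropy(model, layer, loader, device="cpu"):
    """Estimate a binary (fires / does not fire) entropy per output channel
    of `layer`, averaged over the batches in `loader`. Assumes `layer`
    produces a 4-D (N, C, H, W) feature map."""
    acts = {}

    def hook(_module, _inputs, output):
        acts["out"] = output.detach()

    handle = layer.register_forward_hook(hook)
    firing_freq, num_batches = None, 0
    model.eval()
    with torch.no_grad():
        for x, _ in loader:
            model(x.to(device))
            a = torch.relu(acts["out"])                      # post-ReLU activations
            fired = (a > 0).float().mean(dim=(0, 2, 3))      # per-channel firing rate
            firing_freq = fired if firing_freq is None else firing_freq + fired
            num_batches += 1
    handle.remove()

    p = (firing_freq / num_batches).clamp(1e-6, 1 - 1e-6)    # P(channel fires)
    return -(p * p.log2() + (1 - p) * (1 - p).log2())        # binary entropy per channel


def prune_low_entropy_channels(layer, entropy, keep_ratio=0.5):
    """Keep the `keep_ratio` fraction of filters with the highest activation
    entropy and zero out the rest via a pruning mask."""
    k = max(1, int(keep_ratio * entropy.numel()))
    keep = torch.topk(entropy, k).indices
    mask = torch.zeros_like(layer.weight)
    mask[keep] = 1.0                                          # retain whole filters
    prune.custom_from_mask(layer, name="weight", mask=mask)
```

Under these assumptions, one would compute the entropies on a source-task loader and prune before fine-tuning on the target dataset; the actual criterion, granularity, and budget used by ENTROPI are specified in the paper itself.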
Year: 2023
Conference: 2023 IEEE/CVF International Conference on Computer Vision Workshops, ICCVW 2023
Published in: Proceedings - 2023 IEEE/CVF International Conference on Computer Vision Workshops, ICCVW 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1510-1514
Keywords: Deep learning; entropy; pruning; TinyML; transfer learning
Files in this record:
Spadaro_Shannon_Strikes_Again_Entropy-Based_Pruning_in_Deep_Neural_Networks_for_ICCVW_2023_paper.pdf — Publisher's PDF (Adobe PDF), 568.58 kB, open access

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/2037911
Citations
  • Scopus: 3