
Towards Efficient Capsule Networks

Renzulli, Riccardo; Grangetto, Marco
2022-01-01

Abstract

From the moment neural networks came to dominate image processing, the computational complexity required to solve the targeted tasks has skyrocketed: to counter this unsustainable trend, many strategies have been developed that aim to preserve performance. Promoting sparse topologies, for example, allows the deployment of deep neural network models on embedded, resource-constrained devices. Recently, Capsule Networks were introduced to enhance the explainability of a model, where each capsule is an explicit representation of an object or of its parts. These models show promising results on toy datasets, but their low scalability prevents deployment on more complex tasks. In this work, we explore sparsity alongside capsule representations to improve their computational efficiency by reducing the number of capsules. We show how pruning with Capsule Networks achieves high generalization with lower memory requirements, less computational effort, and shorter inference and training time.
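The abstract only summarizes the approach, so the following is a minimal, hedged sketch of what "reducing the number of capsules" by pruning could look like in practice. It assumes PyTorch and a simple magnitude-based criterion (keeping the capsules with the largest output-vector norms); the function and its parameters are hypothetical illustrations, not the authors' actual procedure, which is detailed in the paper linked below.

```python
# Hypothetical sketch (PyTorch assumed): reduce the number of capsules by
# keeping only those with the largest activation-vector norms. In capsule
# networks, the length of a capsule's output vector is commonly read as the
# probability that the represented entity is present, so low-norm capsules
# are natural pruning candidates. Names here are illustrative only.
import torch

def prune_capsules(poses: torch.Tensor, keep_ratio: float = 0.5):
    """Keep the top `keep_ratio` fraction of capsules, ranked by mean vector norm.

    poses: (batch, num_capsules, capsule_dim) capsule output vectors
    returns: (pruned poses, indices of the surviving capsules)
    """
    norms = poses.norm(dim=-1)                 # (batch, num_capsules)
    scores = norms.mean(dim=0)                 # average importance of each capsule
    num_keep = max(1, int(keep_ratio * scores.numel()))
    kept = scores.topk(num_keep).indices       # capsules that survive pruning
    return poses[:, kept, :], kept

# Toy usage: 32 primary capsules of dimension 8, batch of 4 images.
poses = torch.randn(4, 32, 8)
pruned, kept_idx = prune_capsules(poses, keep_ratio=0.25)
print(pruned.shape)  # torch.Size([4, 8, 8])
```

Keeping fewer capsules directly reduces the cost of routing, since routing scales with the number of capsule pairs between consecutive layers.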
Year: 2022
Conference: IEEE International Conference on Image Processing (ICIP)
Location: Bordeaux, France
Dates: October 16-19, 2022
Published in: 2022 IEEE International Conference on Image Processing (ICIP)
Publisher: IEEE
Pages: 2801-2805
ISBN: 978-1-6654-9620-9
Preprint: https://arxiv.org/abs/2208.09203
Keywords: capsule networks, routing, pruning
Authors: Renzulli, Riccardo; Grangetto, Marco
Files in this record:
File: output-4.pdf
Access: Open access
File type: Postprint (author's final version)
Size: 3.32 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1883410
Citations
  • PMC: n/a
  • Scopus: 2
  • Web of Science: n/a