Distributed Real-Time Shortest-Paths Computations with the Field Calculus
Audrito, Giorgio; Damiani, Ferruccio; Bini, Enrico
2018-01-01
Abstract
As the density of sensing, computation, and actuation nodes increases, it becomes increasingly feasible and useful to regard an entire network of physical devices as a single, continuous space-time computing machine. The emergent behaviour of the whole software system is then induced by the local computations deployed on each node and by the dynamics of information diffusion. A relevant example of this distribution model is aggregate computing and its companion language, the field calculus: a minimal set of purely functional constructs for manipulating distributed data structures that evolve over space and time, resulting in robustness to changes. In this paper, we study the convergence time of an archetypal and widely used component of distributed computations expressed in the field calculus, called the gradient: a fully distributed estimation of distances over a metric space via a spanning tree. We provide an analytic result linking the quality of a gradient's output to the amount of computing resources dedicated to it. The resulting error bounds are then exploited for network design, suggesting an optimal node density that takes broadcast interference into account. Finally, an empirical evaluation is performed, validating the theoretical results.
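The gradient studied in the paper is the classic field-calculus building block in which each device repeatedly takes the minimum, over its neighbours, of their distance estimates plus the estimated link distance (devices at a source hold 0). As a rough illustration only, and not the paper's own code, the following Python sketch simulates that update rule in synchronous rounds on a static weighted graph; the function name `gradient_rounds`, the `edges` encoding, and the example topology are all assumptions made here for illustration.

```python
# Minimal sketch (not the paper's implementation) of the field-calculus
# gradient update rule, run in synchronous rounds on a fixed graph.
import math

def gradient_rounds(nodes, edges, sources, rounds):
    """Iteratively estimate each node's distance to the nearest source.

    nodes:   iterable of node identifiers
    edges:   dict mapping each node to a list of (neighbour, distance) pairs
    sources: set of source nodes (distance 0 by definition)
    rounds:  number of synchronous update rounds to run
    """
    # Every node starts with an infinite estimate.
    dist = {n: math.inf for n in nodes}
    for _ in range(rounds):
        new = {}
        for n in nodes:
            if n in sources:
                new[n] = 0.0  # sources anchor the gradient at zero
            else:
                # Minimum over neighbours of their previous estimate
                # plus the length of the connecting edge.
                new[n] = min((dist[m] + w for m, w in edges[n]),
                             default=math.inf)
        dist = new
    return dist

# Example: a line of three nodes with unit-length edges; after 3 rounds
# the farthest node's estimate has converged to 2.
edges = {"a": [("b", 1.0)], "b": [("a", 1.0), ("c", 1.0)], "c": [("b", 1.0)]}
print(gradient_rounds(edges.keys(), edges, {"a"}, rounds=3))
```

On a static network, such estimates stabilise after a number of rounds roughly proportional to the hop diameter of the graph; bounding this convergence time is, informally, the question the paper addresses analytically.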
File | Access | File type | Size | Format
---|---|---|---|---
Bini_2018-RTSS.pdf | Restricted access (copy on request) | Publisher's PDF | 463.82 kB | Adobe PDF
Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.