
3D distance filter for the autonomous navigation of UAVs in agricultural scenarios

Institutional authors: Lorenzo Comba; Alessandro Biglia; Paolo Gay (co-last)
Publication date: 2022-01-01

Abstract

In precision agriculture, remote sensing is an essential step in assessing crop status and variability in both the spatial and the temporal dimensions. To this aim, the use of unmanned aerial vehicles (UAVs) is growing in popularity, allowing a variety of in-field tasks, not limited to scouting or monitoring, to be performed autonomously. Autonomous navigation, however, requires the vehicle to be accurately located within the surrounding environment. This task becomes challenging in agricultural scenarios, where the crops and/or the adopted trellis systems can degrade GPS signal reception and localisation reliability. A viable solution is to exploit high-accuracy 3D maps, which provide important data on crop morphology, as an additional input to the UAV's localisation system. However, managing such large datasets can be difficult in real-time applications. In this paper, an innovative 3D sensor fusion approach is proposed, which combines the data provided by onboard proprioceptive (i.e., GPS and IMU) and exteroceptive (i.e., ultrasound) sensors with the information provided by a georeferenced, low-complexity 3D map. In particular, the parallel-cuts ellipsoid method is used to merge the data from the distance sensors with the 3D map. The improved estimate of the UAV location is then fused with the GPS and IMU data using a Kalman-based filtering scheme. Simulation results demonstrate the efficacy of the proposed navigation approach when applied to a quadrotor autonomously navigating between vine rows.
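As a rough illustration of the set-membership step described in the abstract, the sketch below maintains a position-uncertainty ellipsoid E(c, P) = {x : (x − c)ᵀP⁻¹(x − c) ≤ 1} and fuses one distance-sensor/map constraint expressed as a slab lo ≤ aᵀx ≤ hi. Note the hedge: the paper uses the parallel-cuts ellipsoid method, whereas for simplicity this sketch applies the two half-spaces of the slab as sequential minimum-volume "deep" cuts — a valid outer approximation, though not the jointly optimal parallel-cut update. All names, values, and the slab formulation are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def deep_cut(c, P, a, b):
    """Minimum-volume ellipsoid containing E(c, P) ∩ {x : a·x <= b}.

    E(c, P) = {x : (x - c)ᵀ P⁻¹ (x - c) <= 1}.  Standard deep-cut
    update (requires dimension n >= 2); returns (c, P) unchanged
    when the cut is too shallow to reduce volume.
    """
    n = len(c)
    h = np.sqrt(a @ P @ a)           # half-width of E along direction a
    alpha = (a @ c - b) / h          # normalised cut depth in [-1, 1]
    if alpha >= 1.0:
        raise ValueError("empty intersection: half-space excludes E")
    if alpha <= -1.0 / n:
        return c, P                  # shallow cut: E itself is already minimal
    tau = (1 + n * alpha) / (n + 1)
    sigma = 2 * (1 + n * alpha) / ((n + 1) * (1 + alpha))
    delta = (n**2 / (n**2 - 1)) * (1 - alpha**2)
    Pa = P @ a
    c_new = c - tau * Pa / h
    P_new = delta * (P - sigma * np.outer(Pa, Pa) / h**2)
    return c_new, P_new

def slab_update(c, P, a, lo, hi):
    """Fuse one distance constraint lo <= a·x <= hi (sensor + 3D map)."""
    c, P = deep_cut(c, P, a, hi)     # upper half-space:  a·x <= hi
    c, P = deep_cut(c, P, -a, -lo)   # lower half-space:  a·x >= lo
    return c, P

# Hypothetical example: 1 m isotropic position uncertainty; an ultrasound
# reading matched against the map bounds the lateral coordinate to 0.4 m.
c0, P0 = np.zeros(3), np.eye(3)
c1, P1 = slab_update(c0, P0, np.array([0.0, 1.0, 0.0]), -0.1, 0.3)
```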
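The subsequent Kalman-based fusion stage can be sketched in the same spirit. Below, a minimal linear Kalman filter propagates a position–velocity state with the IMU acceleration as a known input, then treats the GPS fix and the ellipsoid centre returned by the distance filter as two position measurements with different covariances. The model, rates, and noise values are illustrative assumptions; the paper's actual filter design may differ.

```python
import numpy as np

dt = 0.02                            # assumed 50 Hz update rate
n = 3                                # position dimensions

# State x = [position, velocity]; IMU acceleration enters as a known input.
F = np.block([[np.eye(n), dt * np.eye(n)],
              [np.zeros((n, n)), np.eye(n)]])
B = np.vstack([0.5 * dt**2 * np.eye(n), dt * np.eye(n)])
H = np.hstack([np.eye(n), np.zeros((n, n))])  # both sensors observe position
Q = 1e-3 * np.eye(2 * n)             # assumed process-noise covariance

def predict(x, P, acc_imu):
    """Propagate the state using the IMU-measured acceleration."""
    x = F @ x + B @ acc_imu
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, R):
    """Standard KF measurement update with a position measurement z."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2 * n) - K @ H) @ P
    return x, P

# One fusion cycle with hypothetical values: IMU prediction, then the GPS
# fix, then the refined position from the ellipsoid-based distance filter.
x, P = np.zeros(2 * n), np.eye(2 * n)
x, P = predict(x, P, acc_imu=np.array([0.0, 0.1, 0.0]))
x, P = update(x, P, z=np.array([0.02, 0.01, 2.0]), R=2.0 * np.eye(n))  # GPS
x, P = update(x, P, z=np.array([0.00, 0.02, 2.0]), R=0.1 * np.eye(n))  # ellipsoid centre
```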
Year: 2022
Volume: 14
Pages: 1–18
URL: https://www.mdpi.com/2072-4292/14/6/1374
Keywords: Precision farming; 3D crop modelling; Sensor fusion; Autonomous vehicles in agriculture; Ellipsoid method; Kalman filter
Authors: Cesare Donati, Martina Mammarella, Lorenzo Comba, Alessandro Biglia, Paolo Gay, Fabrizio Dabbene
Files in this item:

File: remotesensing-14-01374.pdf (open access)
Description: publisher's PDF
File type: publisher's PDF
Size: 6.52 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/2318/1852120
Citations
  • PMC: ND
  • Scopus: 4
  • Web of Science: 4