LSVL: Large-scale season-invariant visual localization for UAVs

Renzulli R. (Co-first); Verdoja F. (Co-last)
2023-01-01

Abstract

Localization of autonomous unmanned aerial vehicles (UAVs) relies heavily on Global Navigation Satellite Systems (GNSS), which are susceptible to interference. Especially in security applications, robust localization algorithms independent of GNSS are needed to provide dependable operation of autonomous UAVs even in interfered conditions. Typical non-GNSS visual localization approaches rely on a known starting pose, work only on a small-sized map, or require known flight paths before a mission starts. We consider the problem of localization with no information on initial pose or planned flight path. We propose a solution for global visual localization on large maps, based on matching orthoprojected UAV images to satellite imagery using learned season-invariant descriptors, and test with environment sizes up to 100 km². We show that the method is able to determine the heading, latitude and longitude of the UAV at 12.6–18.7 m lateral translation error in as few as 23.2–44.4 updates from an uninformed initialization, even in situations of significant seasonal appearance difference (winter–summer) between the UAV image and the map. We evaluate the characteristics of multiple neural network architectures for generating the descriptors, and likelihood estimation methods that are able to provide fast convergence and low localization error. We also evaluate the operation of the algorithm using real UAV data and evaluate running time on a real-time embedded platform. We believe this is the first work able to recover the pose of a UAV at this scale and rate of convergence, while allowing significant seasonal difference between camera observations and map.
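The abstract only outlines the approach; as an illustration, the sketch below shows one way a descriptor-matching localization loop of this kind could look. It is not the paper's implementation: the map layout, the stand-in descriptor function, the particle-filter update, the similarity temperature and all numeric parameters are assumptions made purely for the example, and heading estimation and orthoprojection are omitted.

# Minimal sketch of season-invariant descriptor matching for global UAV localization.
# NOT the paper's implementation; everything below is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: a satellite map discretized into tiles, each with a precomputed
# D-dimensional season-invariant descriptor (here random stand-ins for a learned network).
MAP_SIZE_M = 10_000.0      # 10 km x 10 km map (assumption)
TILE_SIZE_M = 100.0        # tile edge length (assumption)
D = 16                     # descriptor dimensionality (assumption)
n_tiles = int(MAP_SIZE_M / TILE_SIZE_M)
map_descriptors = rng.normal(size=(n_tiles, n_tiles, D))
map_descriptors /= np.linalg.norm(map_descriptors, axis=-1, keepdims=True)

def tile_of(xy):
    """Map metric (x, y) positions to tile indices, clipped to the map extent."""
    ij = np.clip((xy // TILE_SIZE_M).astype(int), 0, n_tiles - 1)
    return ij[..., 0], ij[..., 1]

def observe_descriptor(true_xy, noise=0.2):
    """Stand-in for encoding an orthoprojected UAV image with the learned network."""
    i, j = tile_of(true_xy)
    d = map_descriptors[i, j] + noise * rng.normal(size=D)
    return d / np.linalg.norm(d)

# Uninformed initialization: particles spread uniformly over the whole map.
N = 5000
particles = rng.uniform(0.0, MAP_SIZE_M, size=(N, 2))
weights = np.full(N, 1.0 / N)

true_pose = np.array([6200.0, 3100.0])   # ground-truth position, used for simulation only

for step in range(30):
    # Motion update: hovering UAV with odometry noise only (assumption).
    particles += rng.normal(scale=5.0, size=particles.shape)
    np.clip(particles, 0.0, MAP_SIZE_M - 1e-6, out=particles)

    # Measurement update: cosine similarity between the observed descriptor and each
    # particle's map-tile descriptor, turned into a likelihood weight.
    z = observe_descriptor(true_pose)
    i, j = tile_of(particles)
    sim = map_descriptors[i, j] @ z          # cosine similarity in [-1, 1]
    weights *= np.exp(4.0 * sim)             # temperature 4.0 is an assumption
    weights /= weights.sum()

    # Systematic resampling when the effective sample size drops too low.
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles = particles[idx]
        weights = np.full(N, 1.0 / N)

estimate = weights @ particles
print("estimated position (m):", estimate, " error (m):", np.linalg.norm(estimate - true_pose))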
Year: 2023
Volume: 168
Pages: 1–15
URL: https://www.sciencedirect.com/science/article/pii/S0921889023001367
Keywords: UAV, Localization, Wake-up robot problem, Seasonal appearance change
Authors: Kinnari J.; Renzulli R.; Verdoja F.; Kyrki V.
Files in this product:
File: main.pdf (Open access)
File type: Publisher's PDF
Size: 2.88 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1945275
Citations
  • PMC: N/A
  • Scopus: 1
  • Web of Science (ISI): N/A