Online Path Planning Framework for UAV in Rural Areas

Detailed Description

Bibliographic Details
Published in: IEEE Access, 2022, Vol. 10, p. 37572-37585
Main authors: Airlangga, Gregorius; Liu, Alan
Format: Article
Language: English
Online access: Full text
Description
Abstract: A motion strategy plays an important role in supporting autonomous Unmanned Aerial Vehicle (UAV) movement. Many studies have sought to improve motion frameworks in terms of robustness, safety, and performance. Most of them assume a scenario with prior known maps, in which area information is collected via the Global Positioning System (GPS) and satellite cameras. Although this scheme can provide high-quality maps, the motion-planning computation remains dependent on the communication signal. In rural areas such as forests and mountains, where the communication signal performs poorly, unclear and noisy terrain maps can be generated, leading to mission failure. An alternative framework is therefore needed to enhance autonomous UAV motion performance under these conditions. Our work focuses on developing a high-performance path planner for autonomous UAV motion in rural areas where the communication signal is unreliable. As an evaluation, a search mission in forest terrain was implemented in 3D simulation. By running the simulation repeatedly with different test cases for positions, time constraints, flight speed (3-11 m/s), and flight range, our path planning framework achieves a completeness of 90-100% and better performance than competing approaches.
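The abstract describes online path planning for a UAV over 3D terrain but does not reproduce the paper's algorithm. As a minimal illustrative sketch only (not the authors' method), the following shows A* search on a 3D occupancy grid, a common baseline for this class of problem; the function name, the boolean-grid representation, and the 6-connected neighborhood are all assumptions for illustration:

```python
import heapq
import itertools

def astar_3d(grid, start, goal):
    """A* search on a 3D occupancy grid (True = obstacle, False = free).
    Returns a list of (x, y, z) cells from start to goal, or None if blocked.
    Illustrative baseline only; not the planner from the cited paper."""
    nx, ny, nz = len(grid), len(grid[0]), len(grid[0][0])

    def h(c):
        # Manhattan distance: admissible for unit-cost 6-connected moves.
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1]) + abs(c[2] - goal[2])

    tie = itertools.count()                 # tie-breaker so heap never compares cells
    heap = [(h(start), next(tie), start, None)]
    came_from = {}                          # cell -> parent, doubles as closed set
    g_cost = {start: 0}

    while heap:
        _, _, cell, parent = heapq.heappop(heap)
        if cell in came_from:
            continue                        # stale entry; cell already expanded
        came_from[cell] = parent
        if cell == goal:                    # reconstruct path by walking parents
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        g = g_cost[cell]
        x, y, z = cell
        for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            n = (x + dx, y + dy, z + dz)
            if (0 <= n[0] < nx and 0 <= n[1] < ny and 0 <= n[2] < nz
                    and not grid[n[0]][n[1]][n[2]] and n not in came_from):
                ng = g + 1
                if ng < g_cost.get(n, float("inf")):
                    g_cost[n] = ng
                    heapq.heappush(heap, (ng + h(n), next(tie), n, cell))
    return None
```

An online planner in the spirit of the abstract would rerun a search like this as the UAV senses new obstacles, rather than relying on a map downloaded over an unreliable link.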
ISSN: 2169-3536
DOI:10.1109/ACCESS.2022.3164505