Positioning Errors of Objects Measured by Convolution Neural Network in Unmanned Aerial Vehicle Images

Bibliographic Details
Published in: Sensors and Materials, 2022-01, Vol. 34 (7), p. 2637
Authors: Kang, Woosuk; Kim, Jisung; Yun, Hongsic; Lee, Pooreum; Kim, Heecheol
Format: Article
Language: English
Online access: Full text
Abstract: The convergence of unmanned aerial vehicles (UAVs, also called drones) and convolutional neural networks (CNNs) makes it possible to locate objects in real time using onboard sensors. In photogrammetry, the positional accuracy of objects is directly affected by the technology used to measure them, so improving the accuracy of object positioning is necessary to broaden the use of drones and CNNs. In this study, the error factors that impede accuracy, namely the global navigation satellite system (GNSS) error of the UAV, camera distortion error, and camera posture error, were analyzed, together with the effect of each error, to improve the accuracy of object positioning. The study proceeded in stages: establishing a method for positioning objects, specifying the errors, and analyzing the magnitude and effect of each error element. The magnitude of the positioning errors was determined by comparing the estimated positions with reference values measured by GNSS. Furthermore, the correlation between the errors and the factors that impede accuracy was analyzed, and the effect of each error factor on the overall error was identified. These results can play an important role in improving positioning accuracy and in developing future UAV and CNN technologies that employ sensors.
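The positioning pipeline the abstract describes can be illustrated with a minimal geometric sketch: a CNN-detected image point is projected through the camera model onto the ground, and each error source is perturbed in turn to see how it propagates into the ground coordinate. The Python sketch below is illustrative only; the omega-phi-kappa rotation convention, the flat-terrain assumption, and all numeric values are assumptions for demonstration, not the paper's actual method or data.

    import numpy as np

    def rotation_opk(omega, phi, kappa):
        # Camera-to-world rotation for one common omega-phi-kappa convention
        # (angles in radians; conventions differ between photogrammetry tools).
        co, so = np.cos(omega), np.sin(omega)
        cp, sp = np.cos(phi), np.sin(phi)
        ck, sk = np.cos(kappa), np.sin(kappa)
        Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
        return Rx @ Ry @ Rz

    def project_to_ground(xy_img, cam_pos, angles, focal, ground_z):
        # Intersect the ray through a (distortion-corrected) image point with
        # a horizontal ground plane at elevation ground_z. cam_pos is the
        # camera position (X0, Y0, Z0) from the UAV's GNSS; angles is the
        # camera posture (omega, phi, kappa).
        R = rotation_opk(*angles)
        ray = R @ np.array([xy_img[0], xy_img[1], -focal])  # ray in world frame
        s = (ground_z - cam_pos[2]) / ray[2]                # scale to the plane
        return cam_pos[:2] + s * ray[:2]

    # Illustrative numbers only (not from the paper): a near-nadir camera with
    # a 35 mm focal length flying 100 m above flat ground.
    cam = np.array([0.0, 0.0, 100.0])
    pt = (0.002, 0.001)  # detected object, image coordinates in metres
    true_xy = project_to_ground(pt, cam, (0.0, 0.0, 0.0), 0.035, 0.0)

    # Perturb one error source at a time, mirroring the factor-by-factor
    # analysis the abstract describes.
    gnss_xy = project_to_ground(pt, cam + np.array([0.05, 0.0, 0.0]),
                                (0.0, 0.0, 0.0), 0.035, 0.0)
    tilt_xy = project_to_ground(pt, cam, (np.deg2rad(0.1), 0.0, 0.0),
                                0.035, 0.0)
    print(np.linalg.norm(gnss_xy - true_xy))  # ~0.05 m: GNSS shift maps 1:1
    print(np.linalg.norm(tilt_xy - true_xy))  # ~0.17 m: 0.1 deg tilt, 100 m AGL

Under this flat-terrain model, a horizontal GNSS error moves the ground estimate by the same amount, while a posture error grows with flying height (roughly H * tan(delta)), so posture error can dominate at altitude; the paper's actual error budget should of course be taken from its results, not from this sketch.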
ISSN: 0914-4935, 2435-0869
DOI: 10.18494/SAM3939