Unmanned aerial vehicle landing localization technology based on visual dual-channel uniqueness coding
Published in: International Journal of Advanced Robotic Systems, 2024-09, Vol. 21
Main authors: , , ,
Format: Article
Language: English
Online access: Full text
Summary: Unmanned aerial vehicles (UAVs) mainly rely on GPS (Global Positioning System) for positioning, but positioning is difficult in areas where the GPS signal is missing or jammed. Vision-based relative positioning is therefore also widely used for UAV landing; however, under crosswind and motion of the landing surface, traditional visual landing markers and positioning methods are prone to losing the target in the final stage of landing. This article therefore proposes a landing marker based on a double-layer unique coding, which provides continuous positioning in the horizontal direction and accurate positioning across large scale changes in the vertical direction. First, the onboard camera acquires and processes images in real time. Depending on the UAV's flight altitude and the quality of code extraction, either the red or the blue channel component is extracted, after which the image undergoes preprocessing, edge detection, line detection, and coding-grid extraction. As long as any 5 × 5 block of small squares in the whole pattern can be extracted, the perspective-n-point (PnP) algorithm recovers the relative pose between the UAV and the landing marker. Experiments show that the proposed landing marker and recognition algorithm effectively improve UAV landing reliability.
ISSN: 1729-8814
DOI: 10.1177/17298806241279046
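The abstract outlines a concrete recognition pipeline: altitude-dependent red/blue channel selection, preprocessing, edge and line detection, coding-grid extraction, and PnP pose estimation. The Python/OpenCV sketch below illustrates one plausible reading of that pipeline; the altitude cutoff, the Canny and Hough thresholds, and the function names detect_grid_lines and pose_from_grid are assumptions rather than values from the paper, and the paper-specific decoding that turns a detected 5 × 5 sub-grid into 2D-3D correspondences is omitted.

```python
import cv2
import numpy as np

# Sketch only: the 10 m red/blue switch point and all detector
# thresholds below are assumptions, not values from the paper.
ALTITUDE_CUTOFF_M = 10.0

def detect_grid_lines(frame_bgr, altitude_m):
    """Select a color channel by flight altitude, then find grid-line candidates."""
    b, _, r = cv2.split(frame_bgr)                  # OpenCV stores channels as BGR
    gray = r if altitude_m > ALTITUDE_CUTOFF_M else b
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)     # suppress sensor noise
    edges = cv2.Canny(blurred, 50, 150)             # edge detection
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=40, maxLineGap=5)
    return edges, lines

def pose_from_grid(corners_2d, corners_3d, camera_matrix, dist_coeffs):
    """Recover the UAV pose relative to the landing marker via perspective-n-point.

    corners_2d: Nx2 float32 image points of a decoded 5x5 sub-grid.
    corners_3d: Nx3 float32 coordinates of the same points on the marker.
    """
    ok, rvec, tvec = cv2.solvePnP(corners_3d, corners_2d,
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```

In use, a decoder for the double-layer code would supply corners_2d and the matching corners_3d from the known marker geometry, while camera_matrix and dist_coeffs come from a standard camera calibration.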