Unmanned aerial vehicle relative navigation method and system based on vision and UWB fusion

The invention provides an unmanned aerial vehicle (UAV) relative navigation method and system based on the fusion of vision and UWB, in the technical field of unmanned aerial vehicles. The method comprises the following steps: first, a predicted value of the center coordinate of a target UAV is obtained from an image of the target UAV, and from it the azimuth angle and elevation angle of the target UAV are derived; then relative distance information is obtained and the actual relative distance is calculated; the relative coordinates of the target UAV are computed from the azimuth angle, elevation angle, and actual relative distance; finally, the following UAV is navigated according to the relative coordinates of the target UAV. With this method, continuous, high-precision relative positioning of the UAV can be achieved under the condi…
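As an illustrative sketch of the first step, converting the predicted image-center coordinate of the target UAV into azimuth and elevation angles, the snippet below assumes a simple pinhole-camera model. The intrinsics FX, FY, CX, CY and the helper angles_from_center are hypothetical placeholders; the patent record does not disclose the actual camera model or formulas.

import math

# Hypothetical pinhole-camera intrinsics (not given in the patent record):
# focal lengths in pixels (FX, FY) and principal point (CX, CY).
FX, FY = 800.0, 800.0
CX, CY = 640.0, 360.0

def angles_from_center(u, v):
    """Convert a predicted target-center pixel (u, v) into azimuth and
    elevation angles (radians) in the follower camera frame."""
    x = (u - CX) / FX              # right of the optical axis
    y = (CY - v) / FY              # up (image v axis points down)
    z = 1.0                        # forward along the optical axis
    azimuth = math.atan2(x, z)
    elevation = math.atan2(y, math.hypot(x, z))
    return azimuth, elevation

# Example: a detection centered at pixel (700, 300)
az, el = angles_from_center(700.0, 300.0)
print(round(math.degrees(az), 2), round(math.degrees(el), 2))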

Detailed description

Saved in:
Bibliographic details
Main authors: ZHANG YUNPENG, GUO YANTENG, ZHANG HE, ZHENG YUNHAI, LI YUFENG, YE WEN, ZHANG LINGHAO, MIAO CUNXIAO
Format: Patent
Language: chi; eng
Subjects:
Online access: Order full text
creator ZHANG YUNPENG
GUO YANTENG
ZHANG HE
ZHENG YUNHAI
LI YUFENG
YE WEN
ZHANG LINGHAO
MIAO CUNXIAO
description The invention provides an unmanned aerial vehicle relative navigation method and system based on vision and UWB fusion, relating to the technical field of unmanned aerial vehicles. The method comprises the following steps: first, a predicted value of the center coordinate of a target unmanned aerial vehicle is obtained from an image of the target unmanned aerial vehicle, and from it the azimuth angle and elevation angle of the target unmanned aerial vehicle are derived; then relative distance information is obtained and the actual relative distance is calculated; the relative coordinates of the target unmanned aerial vehicle are computed from the azimuth angle, elevation angle, and actual relative distance; finally, the following unmanned aerial vehicle is navigated according to the relative coordinates of the target unmanned aerial vehicle. With this method, continuous, high-precision relative positioning of the unmanned aerial vehicle can be achieved under the condi
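For illustration only, the final geometric step described above, combining the vision-derived azimuth and elevation angles with the UWB-derived relative distance into relative coordinates, can be sketched as a standard spherical-to-Cartesian conversion. The axis convention and the helper relative_coordinates are assumptions, not the patent's actual formulas.

import math

def relative_coordinates(azimuth, elevation, distance):
    """Fuse the vision-derived azimuth/elevation angles (radians) with the
    UWB-derived relative distance (meters) into Cartesian relative
    coordinates of the target UAV (standard spherical-to-Cartesian form)."""
    x = distance * math.cos(elevation) * math.sin(azimuth)   # right
    y = distance * math.sin(elevation)                        # up
    z = distance * math.cos(elevation) * math.cos(azimuth)    # forward
    return x, y, z

# Example: target 10 deg to the right, 5 deg above, 12.3 m away by UWB
print(relative_coordinates(math.radians(10.0), math.radians(5.0), 12.3))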
format Patent
language chi ; eng
recordid cdi_epo_espacenet_CN116753957A
source esp@cenet
subjects CALCULATING
COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
COMPUTING
COUNTING
ELECTRIC COMMUNICATION TECHNIQUE
ELECTRICITY
GYROSCOPIC INSTRUMENTS
IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
MEASURING
MEASURING DISTANCES, LEVELS OR BEARINGS
NAVIGATION
PHOTOGRAMMETRY OR VIDEOGRAMMETRY
PHYSICS
SURVEYING
TESTING
WIRELESS COMMUNICATIONS NETWORKS
title Unmanned aerial vehicle relative navigation method and system based on vision and UWB fusion
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-24T18%3A11%3A45IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=ZHANG%20YUNPENG&rft.date=2023-09-15&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3ECN116753957A%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true