Single unmanned aerial vehicle autonomous motion planning method based on imitation learning

The invention provides a single unmanned aerial vehicle autonomous motion planning method based on imitation learning. A simulation environment is constructed, and a corresponding virtual unmanned aerial vehicle model is built in the simulation environment from a real unmanned aerial vehicle. The method comprises the following steps: S1, obtaining training samples for trajectory planning of the virtual unmanned aerial vehicle from the simulation environment; S2, training a neural network for the real unmanned aerial vehicle with the training samples; S3, collecting real-time environment sensing data and state data of the real unmanned aerial vehicle through the sensors of the real unmanned aerial vehicle; S4, generating a real-time predicted trajectory point sequence from the real-time environment sensing data and the real unmanned aerial vehicle state data, using the neural network in combination with the target point data; S5, converting the real-time predicted trajectory point sequence into …
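The abstract describes a behaviour-cloning style pipeline: demonstrations are gathered in simulation (S1), a network is trained on them (S2), and the trained network is then run on the real vehicle from sensor data to produce trajectory points that are turned into commands (S3–S5). Below is a minimal, illustrative Python/PyTorch sketch of how such a pipeline could be organised. The environment, expert planner, sensor, and controller interfaces (sim_env, expert_planner.plan, sensors.read, controller.track) and the network architecture are assumptions made for illustration, not details taken from the patent.

```python
# Hypothetical sketch of the S1-S5 pipeline outlined in the abstract.
# All class/function names below are placeholders, not APIs from the patent.
import torch
import torch.nn as nn


class TrajectoryNet(nn.Module):
    """Maps perception data + UAV state + goal to a sequence of predicted trajectory points."""

    def __init__(self, obs_dim, state_dim, goal_dim, horizon=10, point_dim=3):
        super().__init__()
        self.horizon, self.point_dim = horizon, point_dim
        self.net = nn.Sequential(
            nn.Linear(obs_dim + state_dim + goal_dim, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, horizon * point_dim),
        )

    def forward(self, obs, state, goal):
        x = torch.cat([obs, state, goal], dim=-1)
        return self.net(x).view(-1, self.horizon, self.point_dim)


def collect_demonstrations(sim_env, expert_planner, episodes):
    """S1: roll out an expert planner on the virtual UAV and record (input, trajectory) pairs."""
    samples = []
    for _ in range(episodes):
        obs, state, goal = sim_env.reset()
        done = False
        while not done:
            expert_traj = expert_planner.plan(obs, state, goal)   # expert waypoints
            samples.append((obs, state, goal, expert_traj))
            obs, state, done = sim_env.step(expert_traj)
    return samples


def train_imitation(model, samples, epochs=50, lr=1e-3):
    """S2: behaviour cloning -- regress the network's waypoints onto the expert's."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for obs, state, goal, expert_traj in samples:
            loss = loss_fn(model(obs, state, goal), expert_traj)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model


def control_step(model, sensors, controller, goal):
    """S3-S5: read sensors, predict a trajectory point sequence, convert it to commands."""
    obs, state = sensors.read()              # S3: environment sensing + UAV state data
    with torch.no_grad():
        waypoints = model(obs, state, goal)  # S4: real-time predicted trajectory points
    return controller.track(waypoints)       # S5: trajectory points -> low-level control
```

The split mirrors the sim-to-real structure of the abstract: only `collect_demonstrations` touches the simulator, while `control_step` consumes real sensor data at run time.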

Bibliographic details

Main authors: HOU ZIYUE; ZHANG LONGYUAN; LI WEI; LIU ZI'ANG; WANG JI
Format: Patent
Publication number: CN116481532A
Publication date: 2023-07-25
Language: Chinese; English
Subjects: CALCULATING; COMPUTING; COUNTING; ELECTRIC DIGITAL DATA PROCESSING; GYROSCOPIC INSTRUMENTS; MEASURING; MEASURING DISTANCES, LEVELS OR BEARINGS; NAVIGATION; PHOTOGRAMMETRY OR VIDEOGRAMMETRY; PHYSICS; SURVEYING; TESTING
Record ID: cdi_epo_espacenet_CN116481532A
Source: esp@cenet
Online access: https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20230725&DB=EPODOC&CC=CN&NR=116481532A