Unmanned vehicle local autonomous control method, device and equipment based on depth map

The invention relates to an unmanned vehicle local autonomous control method, device, and equipment based on a depth map. The method comprises the following steps: acquiring a depth map shot within the field of view of an unmanned vehicle and extracting a depth feature vector from it; splicing and fusing the depth feature vectors corresponding to a plurality of depth images continuously shot at historical moments with the navigation target point position coordinates of the unmanned vehicle at the time each depth image was shot, to obtain a fused feature vector, and taking the fused feature vector as the input state of the unmanned vehicle's navigation neural network; designing a comprehensive reward function; training the navigation neural network in an obstacle simulation environment with a hyper-parameter segmentation training strategy, using the fused feature vector and the comprehensive reward function; and, in a real physical environment, processing a depth image by using the trained navig...
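The fusion step above (splicing per-frame depth feature vectors with the goal coordinates recorded at each frame) can be sketched as simple concatenation. This is a minimal illustration, not the patent's actual implementation; the function name, feature dimension, and 2-D goal representation are assumptions.

```python
import numpy as np

def fuse_features(depth_features, goal_coords):
    """Hypothetical sketch of the fused-feature construction: for each of
    the K consecutive depth images, concatenate its depth feature vector
    with the goal coordinates recorded when that image was shot, then
    splice all K pairs into one fused state vector for the policy network.

    depth_features: list of K 1-D arrays (one per depth image)
    goal_coords:    list of K (x, y) goal positions in the vehicle frame
    """
    parts = []
    for feat, goal in zip(depth_features, goal_coords):
        parts.append(np.asarray(feat, dtype=np.float32))
        parts.append(np.asarray(goal, dtype=np.float32))
    return np.concatenate(parts)

# e.g. 4 frames, 128-D features, 2-D goal -> a 4 * (128 + 2) = 520-D state
state = fuse_features([np.zeros(128)] * 4, [(1.0, 2.0)] * 4)
```

Stacking several consecutive frames this way gives the network short-term temporal context, which a single depth image cannot provide.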

Detailed Description
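The abstract's "comprehensive reward function" is not spelled out in this record. A common shaping for this kind of depth-map navigation task combines dense goal progress, an obstacle-proximity penalty, sparse terminal terms, and a per-step cost; the sketch below is a hedged illustration of that pattern, with all names, weights, and thresholds being assumptions rather than the patent's values.

```python
def comprehensive_reward(prev_dist, curr_dist, min_obstacle_dist,
                         reached_goal, collided,
                         w_progress=1.0, w_safety=0.2, step_cost=0.01):
    """Hypothetical shaped reward for local navigation:
    - sparse terminal terms for collision / goal arrival,
    - dense reward for reducing distance to the goal,
    - penalty that grows as the nearest obstacle gets closer,
    - small per-step cost to discourage dawdling."""
    if collided:
        return -10.0                                   # terminal failure
    if reached_goal:
        return 10.0                                    # terminal success
    reward = w_progress * (prev_dist - curr_dist)      # progress toward goal
    if min_obstacle_dist < 1.0:                        # within 1 m of obstacle
        reward -= w_safety * (1.0 - min_obstacle_dist) # linear safety penalty
    return reward - step_cost
```

For example, moving 1 m closer to the goal while staying clear of obstacles yields roughly `1.0 - 0.01 = 0.99` per step under these assumed weights.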

Saved in:
Bibliographic details
Main authors: LIANG ZHUANG, ZHANG QI, SUO XIANGBO, CHEN TINGZHENG, HU RUIJUN, ZHANG YULIN, LI CHUANXIANG, ZHENG YONGHUANG, ZHAO CHENG
Format: Patent
Language: Chinese; English
Subjects:
Online access: order full text
Record ID: cdi_epo_espacenet_CN113486871A
Source: esp@cenet
Subjects:
CALCULATING
COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
COMPUTING
CONTROLLING
COUNTING
HANDLING RECORD CARRIERS
IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
PHYSICS
PRESENTATION OF DATA
RECOGNITION OF DATA
RECORD CARRIERS
REGULATING
SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES