Active target recognition method based on sparse feature point cloud

The invention relates to the field of pattern recognition, in particular to an active target recognition method based on a sparse feature point cloud. The purpose of active target recognition is to change the pose (viewpoint) of the visual sensor through planning so as to obtain sufficient information, thereby improving recognition efficiency and accuracy. The method combines a target representation based on the sparse feature point cloud with a feature point discrimination measurement that combines a visual dictionary with Bayesian inference.

Detailed description

In the first stage, feature point detection and description are performed on a dense point cloud model of the target, and a sparse feature point cloud model of the target is constructed by combining each feature descriptor with the corresponding feature point's model coordinates.
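The abstract does not name a particular detector or descriptor, so the sketch below only illustrates this first stage under stated assumptions: the Open3D library is used, ISS keypoints and FPFH descriptors stand in for the patent's unspecified feature detection and description steps, and `model.ply` is a placeholder path.

```python
# Sketch: build a sparse feature point cloud model from a dense point cloud.
# Assumption: Open3D is available; ISS keypoints + FPFH descriptors are stand-ins
# for the patent's (unnamed) detector/descriptor, and "model.ply" is a placeholder.
import numpy as np
import open3d as o3d

def build_sparse_feature_model(path="model.ply", radius=0.05):
    dense = o3d.io.read_point_cloud(path)          # dense point cloud model of the target
    dense.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=radius, max_nn=30))

    # Feature point detection on the dense model (ISS keypoints as an example).
    keypoints = o3d.geometry.keypoint.compute_iss_keypoints(dense)

    # Feature description on the dense model (33-D FPFH), then look up the
    # descriptor belonging to each keypoint via nearest-neighbour search.
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        dense, o3d.geometry.KDTreeSearchParamHybrid(radius=5 * radius, max_nn=100))
    tree = o3d.geometry.KDTreeFlann(dense)

    model = []   # sparse feature point cloud: (model coordinate, descriptor) pairs
    for p in np.asarray(keypoints.points):
        _, idx, _ = tree.search_knn_vector_3d(p, 1)
        model.append((p, np.asarray(fpfh.data)[:, idx[0]]))
    return model
```

The result is a list of (model coordinate, descriptor) pairs, i.e. a sparse feature point cloud that is far smaller to store and match than the dense model, which is the point of the representation.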

In the second stage, for the observable feature points of each candidate target, the discrimination of every feature point is measured by the method combining the visual dictionary with Bayesian inference.
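The abstract is cut off before it gives the actual discrimination formula, so the following is only one plausible reading, not the patent's method: a k-means codebook serves as the visual dictionary, per-target word likelihoods are turned into a posterior with Bayes' rule, and a feature point counts as discriminative when that posterior is sharply peaked. The names `descriptors_per_target`, `train_dictionary`, and `discrimination` are illustrative only.

```python
# Sketch: feature-point discrimination via a visual dictionary + Bayes' rule.
# Assumption: the dictionary is a k-means codebook over descriptors, and the
# discrimination score is the peakedness (1 - normalised entropy) of the
# posterior over candidate targets given the feature point's visual word.
import numpy as np
from sklearn.cluster import KMeans

def train_dictionary(descriptors_per_target, n_words=64):
    """descriptors_per_target: {target_id: (Ni, D) array of feature descriptors}."""
    all_desc = np.vstack(list(descriptors_per_target.values()))
    dictionary = KMeans(n_clusters=n_words, n_init=10, random_state=0).fit(all_desc)

    # Word-frequency likelihoods P(word | target), with Laplace smoothing.
    targets = list(descriptors_per_target)
    likelihood = np.ones((len(targets), n_words))
    for t_idx, tid in enumerate(targets):
        words = dictionary.predict(descriptors_per_target[tid])
        likelihood[t_idx] += np.bincount(words, minlength=n_words)
    likelihood /= likelihood.sum(axis=1, keepdims=True)
    return dictionary, targets, likelihood

def discrimination(descriptor, dictionary, likelihood):
    """Posterior P(target | word) by Bayes' rule with a uniform prior;
    discrimination = 1 - normalised posterior entropy (1 = fully discriminative)."""
    word = dictionary.predict(descriptor.reshape(1, -1))[0]
    posterior = likelihood[:, word] / likelihood[:, word].sum()
    entropy = -np.sum(posterior * np.log(posterior + 1e-12))
    return 1.0 - entropy / np.log(len(posterior))
```

Under this reading, a viewpoint planner could prefer poses from which many high-scoring feature points are observable, which would fit the stated goal of planning viewpoints that provide sufficient information.
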
Bibliographic details
Inventors: SUN HAIBO, FU SHUANGFEI, ZHU FENG, KONG YANZI, HAO YINGMING
Format: Patent
Language: Chinese; English
Subjects: CALCULATING; COMPUTING; COUNTING; HANDLING RECORD CARRIERS; PHYSICS; PRESENTATION OF DATA; RECOGNITION OF DATA; RECORD CARRIERS
Publication number: CN112307809A
Publication date: 2021-02-02
Online access: full text available via esp@cenet