Hyperspectral wetland image classification method based on graph capsule neural network

The invention discloses a hyperspectral wetland image classification method based on a graph capsule neural network. The method comprises the following steps: S1, constructing an adversarial domain-adaptation framework, learning a feature transformation, and matching the features of source-domain and target-domain samples of a hyperspectral wetland image; S2, constructing a graph capsule neural domain-adaptation network structure, extracting domain-invariant and domain-related features, and discovering transferable features for cross-domain sharing; S3, designing two classifiers with a coupled structure, training both classifiers on source-domain samples, maximizing their classification discrepancy on target-domain samples, and achieving accurate classification of the hyperspectral wetland image by identifying the classification boundary. According to the method, transferable knowledge is discovered, cross-domain sharing is realized, and effective di…
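As described, step S3 resembles the general maximum-classifier-discrepancy recipe: two coupled classifiers are trained on labelled source samples, their disagreement on target samples is first maximized, and the shared feature extractor is then updated to minimize it. The sketch below illustrates only that general recipe; a plain MLP named FeatureExtractor stands in for the patent's graph capsule network, and every module name, layer size, class count, and the three-phase update schedule are assumptions for illustration, not the patented implementation.

```python
# Illustrative sketch only: an MLP stands in for the graph capsule extractor (S2),
# and the three-phase update follows the generic classifier-discrepancy recipe
# suggested by step S3. All names and sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureExtractor(nn.Module):
    """Stand-in for the graph capsule feature extractor (step S2)."""
    def __init__(self, in_dim=200, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, feat_dim), nn.ReLU(),
        )
    def forward(self, x):
        return self.net(x)

class Classifier(nn.Module):
    """One of the two coupled classifiers (step S3)."""
    def __init__(self, feat_dim=64, n_classes=9):
        super().__init__()
        self.fc = nn.Linear(feat_dim, n_classes)
    def forward(self, f):
        return self.fc(f)

def discrepancy(p1, p2):
    # Mean L1 distance between the two classifiers' softmax outputs.
    return (F.softmax(p1, dim=1) - F.softmax(p2, dim=1)).abs().mean()

def train_step(G, C1, C2, opt_g, opt_c, xs, ys, xt):
    # Phase A: train G, C1, C2 on labelled source samples.
    opt_g.zero_grad(); opt_c.zero_grad()
    f_s = G(xs)
    loss_src = F.cross_entropy(C1(f_s), ys) + F.cross_entropy(C2(f_s), ys)
    loss_src.backward()
    opt_g.step(); opt_c.step()

    # Phase B: with G fixed, train C1/C2 to stay accurate on the source while
    # *maximizing* their disagreement on target samples near the class boundary.
    opt_c.zero_grad()
    f_s, f_t = G(xs), G(xt)
    loss_c = (F.cross_entropy(C1(f_s), ys) + F.cross_entropy(C2(f_s), ys)
              - discrepancy(C1(f_t), C2(f_t)))
    loss_c.backward()
    opt_c.step()

    # Phase C: with C1/C2 fixed, train G to *minimize* the disagreement, pulling
    # target features away from the decision boundary (the adversarial game of S1).
    for _ in range(4):
        opt_g.zero_grad()
        loss_dis = discrepancy(C1(G(xt)), C2(G(xt)))
        loss_dis.backward()
        opt_g.step()
    return loss_src.item(), loss_dis.item()

# Toy usage with random tensors standing in for source/target wetland pixels.
G, C1, C2 = FeatureExtractor(), Classifier(), Classifier()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_c = torch.optim.Adam(list(C1.parameters()) + list(C2.parameters()), lr=1e-3)
xs, ys, xt = torch.randn(32, 200), torch.randint(0, 9, (32,)), torch.randn(32, 200)
print(train_step(G, C1, C2, opt_g, opt_c, xs, ys, xt))
```

Alternating the last two phases sets up the adversarial game of step S1: the classifiers expose target samples lying near the source decision boundary, and the feature extractor then learns to map them away from it.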

Detailed description

Saved in:
Bibliographic details
Main authors: LI HENGCHAO, NIU XUEMEI, HU WENSHUAI, GUO BENJUN, WANG WEIYE, DENG YANGJUN
Format: Patent
Language: Chinese; English
Subjects:
Online access: Order full text
creator LI HENGCHAO
NIU XUEMEI
HU WENSHUAI
GUO BENJUN
WANG WEIYE
DENG YANGJUN
format Patent
fullrecord https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20230825&DB=EPODOC&CC=CN&NR=116645552A (published 2023-08-25)
language chi ; eng
recordid cdi_epo_espacenet_CN116645552A
source esp@cenet
subjects CALCULATING
COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
COMPUTING
COUNTING
PHYSICS
title Hyperspectral wetland image classification method based on graph capsule neural network