Fast Cross-Modal Hashing With Global and Local Similarity Embedding
Recently, supervised cross-modal hashing has attracted much attention and achieved promising performance. To learn hash functions and binary codes, most methods globally exploit the supervised information, for example, preserving an at-least-one pairwise similarity into hash codes or reconstructing the label matrix with binary codes. However, due to the hardness of the discrete optimization problem, they are usually time consuming on large-scale datasets. In addition, they neglect the class correlation in supervised information. From another point of view, they only explore the global similarity of data but overlook the local similarity hidden in the data distribution. To address these issues, we present an efficient supervised cross-modal hashing method, that is, fast cross-modal hashing (FCMH). It leverages not only global similarity information but also the local similarity in a group. Specifically, training samples are partitioned into groups; thereafter, the local similarity in each group is extracted. Moreover, the class correlation in labels is also exploited and embedded into the learning of binary codes. In addition, to solve the discrete optimization problem, we further propose an efficient discrete optimization algorithm with a well-designed group updating scheme, making its computational complexity linear to the size of the training set. In light of this, it is more efficient and scalable to large-scale datasets. Extensive experiments on three benchmark datasets demonstrate that FCMH outperforms some state-of-the-art cross-modal hashing approaches in terms of both retrieval accuracy and learning efficiency.
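The abstract above outlines the core idea: partition the training samples into groups, extract the local (within-group) similarity, and embed it into binary codes through a discrete optimization with a group updating scheme. The record does not include the authors' algorithm, so the NumPy fragment below is only a minimal sketch of the group-wise local-similarity idea; the random grouping rule, the label-overlap similarity, the spectral relaxation followed by sign(), and every parameter value are illustrative assumptions rather than the FCMH method itself.

```python
# Minimal sketch (NOT the authors' FCMH implementation): partition training
# samples into groups and derive binary codes from the local, within-group
# similarity. Grouping rule, similarity definition, the spectral relaxation,
# and all sizes are illustrative assumptions.
import numpy as np


def group_local_codes(labels, n_groups=4, n_bits=16, seed=0):
    """Toy group-wise hashing: random group partition, label-overlap similarity
    inside each group, relaxed embedding, then sign() to get {-1, +1} codes."""
    rng = np.random.default_rng(seed)
    n = labels.shape[0]
    group_ids = rng.integers(0, n_groups, size=n)   # stand-in grouping rule
    codes = np.ones((n, n_bits), dtype=np.int8)     # unused bits stay +1
    for g in range(n_groups):
        idx = np.flatnonzero(group_ids == g)
        if idx.size == 0:
            continue
        L = labels[idx]                              # (m, c) multi-label block
        local_sim = (L @ L.T > 0).astype(float)     # 1 iff a label is shared
        # Relaxed surrogate for the discrete problem: top eigenvectors of the
        # local similarity, binarized by sign (FCMH solves it discretely).
        _, vecs = np.linalg.eigh(local_sim)          # eigenvalues ascending
        k = min(n_bits, idx.size)
        codes[idx, :k] = np.where(vecs[:, -k:] >= 0, 1, -1)
    return codes


if __name__ == "__main__":
    toy_labels = (np.random.rand(200, 5) > 0.7).astype(float)  # toy multi-labels
    B = group_local_codes(toy_labels)
    print(B.shape, np.unique(B))                     # (200, 16), values in {-1, 1}
```

In the actual FCMH formulation the codes are obtained by discrete optimization rather than this relaxation, which, per the abstract, is what keeps the training cost linear in the number of samples.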
Saved in:
Published in: | IEEE transactions on cybernetics, 2022-10, Vol.52 (10), p.10064-10077 |
Main authors: | Wang, Yongxin; Chen, Zhen-Duo; Luo, Xin; Li, Rui; Xu, Xin-Shun |
Format: | Article |
Language: | English |
Subjects: | Binary codes; Correlation; Cross-modal hashing; Datasets; discrete optimization; Embedding; Hash based algorithms; Hash functions; Learning; local similarity embedding; Optimization; scalable hashing; Semantics; Similarity; Symmetric matrices; Training |
Online access: | Order full text |
Field | Value |
---|---|
container_end_page | 10077 |
container_issue | 10 |
container_start_page | 10064 |
container_title | IEEE transactions on cybernetics |
container_volume | 52 |
creator | Wang, Yongxin; Chen, Zhen-Duo; Luo, Xin; Li, Rui; Xu, Xin-Shun |
description | Recently, supervised cross-modal hashing has attracted much attention and achieved promising performance. To learn hash functions and binary codes, most methods globally exploit the supervised information, for example, preserving an at-least-one pairwise similarity into hash codes or reconstructing the label matrix with binary codes. However, due to the hardness of the discrete optimization problem, they are usually time consuming on large-scale datasets. In addition, they neglect the class correlation in supervised information. From another point of view, they only explore the global similarity of data but overlook the local similarity hidden in the data distribution. To address these issues, we present an efficient supervised cross-modal hashing method, that is, fast cross-modal hashing (FCMH). It leverages not only global similarity information but also the local similarity in a group. Specifically, training samples are partitioned into groups; thereafter, the local similarity in each group is extracted. Moreover, the class correlation in labels is also exploited and embedded into the learning of binary codes. In addition, to solve the discrete optimization problem, we further propose an efficient discrete optimization algorithm with a well-designed group updating scheme, making its computational complexity linear to the size of the training set. In light of this, it is more efficient and scalable to large-scale datasets. Extensive experiments on three benchmark datasets demonstrate that FCMH outperforms some state-of-the-art cross-modal hashing approaches in terms of both retrieval accuracy and learning efficiency. |
doi_str_mv | 10.1109/TCYB.2021.3059886 |
format | Article |
publisher | United States: IEEE |
eissn | 2168-2275 |
pmid | 33750723 |
coden | ITCEB8 |
copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022 |
orcid | 0000-0001-9972-7370; 0000-0002-0172-9085; 0000-0002-3481-4892; 0000-0002-6901-5476 |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 2168-2267 |
ispartof | IEEE transactions on cybernetics, 2022-10, Vol.52 (10), p.10064-10077 |
issn | 2168-2267; 2168-2275 |
language | eng |
recordid | cdi_ieee_primary_9382960 |
source | IEEE Electronic Library (IEL) |
subjects | Binary codes; Correlation; Cross-modal hashing; Datasets; discrete optimization; Embedding; Hash based algorithms; Hash functions; Learning; local similarity embedding; Optimization; scalable hashing; Semantics; Similarity; Symmetric matrices; Training |
title | Fast Cross-Modal Hashing With Global and Local Similarity Embedding |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-27T23%3A54%3A56IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Fast%20Cross-Modal%20Hashing%20With%20Global%20and%20Local%20Similarity%20Embedding&rft.jtitle=IEEE%20transactions%20on%20cybernetics&rft.au=Wang,%20Yongxin&rft.date=2022-10-01&rft.volume=52&rft.issue=10&rft.spage=10064&rft.epage=10077&rft.pages=10064-10077&rft.issn=2168-2267&rft.eissn=2168-2275&rft.coden=ITCEB8&rft_id=info:doi/10.1109/TCYB.2021.3059886&rft_dat=%3Cproquest_RIE%3E2504352842%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2716347401&rft_id=info:pmid/33750723&rft_ieee_id=9382960&rfr_iscdi=true |