All is attention for multi-label text classification


Bibliographic Details
Published in: Knowledge and Information Systems, 2025-02, Vol. 67 (2), p. 1249-1270
Authors: Liu, Zhi; Huang, Yunjie; Xia, Xincheng; Zhang, Yihao
Format: Article
Language: English
Online access: Full text
Description: Multi-label text classification (MLTC) is a key task in natural language processing. Its challenge is to extract latent semantic features from text and effectively exploit label-associated features. This work proposes an MLTC model driven solely by attention mechanisms, which includes Graph Attention (GA), Class-Specific Attention (CSA), and Multi-Head Attention (MHA) modules. The GA module examines and records label dependencies by considering label semantic features as attributes of graph nodes. It uses graph embedding to maintain structural relationships within the label graph. Meanwhile, the CSA module produces distinctive features for each category by utilizing spatial attention scores, thereby improving classification accuracy. Then, the MHA module facilitates extensive feature interactions, enhancing the expressiveness of text features and supporting the handling of long-range dependencies. Experimental evaluations conducted on two MLTC datasets show that the proposed model outperforms existing MLTC algorithms, achieving state-of-the-art performance. These results highlight the effectiveness of the attention-based approach in tackling the complexity of MLTC tasks.
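The abstract names three attention modules (GA, CSA, MHA) but this record contains no implementation details. As a rough illustration of the kinds of mechanisms the abstract describes, the following NumPy sketch implements generic multi-head self-attention and a per-label attention pooling similar in spirit to "class-specific attention". All function names, tensor shapes, and the use of random stand-in weights are assumptions for illustration, not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, num_heads, rng):
    """Scaled dot-product multi-head self-attention over token features.

    X: (seq_len, d_model) token features; returns (seq_len, d_model).
    Random projection matrices stand in for learned parameters.
    """
    seq_len, d_model = X.shape
    assert d_model % num_heads == 0
    d_k = d_model // num_heads
    Wq = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
    Wk = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
    Wv = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Split into heads: (num_heads, seq_len, d_k).
    split = lambda M: M.reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)
    Qh, Kh, Vh = split(Q), split(K), split(V)
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_k)  # (heads, seq, seq)
    out = softmax(scores) @ Vh                           # (heads, seq, d_k)
    # Concatenate heads back to (seq_len, d_model).
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)

def class_specific_attention(H, label_emb):
    """Per-label attention pooling: each label attends over token features.

    H: (seq_len, d_model) token features, label_emb: (num_labels, d_model)
    label embeddings; returns one pooled vector per label, (num_labels, d_model).
    """
    weights = softmax(label_emb @ H.T)  # (num_labels, seq_len), rows sum to 1
    return weights @ H
```

In a full MLTC pipeline one would feed the pooled per-label vectors to per-label classifiers; that step is omitted here since the record gives no details.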
DOI: 10.1007/s10115-024-02253-w
ISSN: 0219-1377
EISSN: 0219-3116
Source: SpringerLink Journals
Subjects: Algorithms
Attention
Classification
Feature extraction
Labels
Modules
Natural language processing
Semantics
Task complexity
Text categorization