Multistream BertGCN for Sentiment Classification Based on Cross-Document Learning

Very recently, the BERT graph convolutional network (BertGCN) model has attracted much attention from researchers due to its good text classification performance. However, just using original documents in the corpus to construct the topology of graphs for GCN-based models may lose some effective information. In this paper, we focus on sentiment classification, an important branch of text classification, and propose the multistream BERT graph convolutional network (MS-BertGCN) for sentiment classification based on cross-document learning. In the proposed method, we first combine the documents in the training set based on within-class similarity. Then, each heterogeneous graph is constructed using a group of combinations of documents for the single-stream BertGCN model. Finally, we construct multistream-BertGCN (MS-BertGCN) based on multiple heterogeneous graphs constructed from different groups of combined documents. The experimental results show that our MS-BertGCN model outperforms state-of-the-art methods on sentiment classification tasks.
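The first step described in the abstract (combining training documents by within-class similarity before building each stream's heterogeneous graph) can be illustrated with a minimal sketch. This is not the authors' code: TF-IDF cosine similarity and greedy one-to-one pairing are assumptions standing in for whatever similarity measure and grouping scheme the paper actually uses.

```python
from collections import Counter, defaultdict
import math

def tfidf(docs):
    """Compute simple TF-IDF vectors for a list of token lists."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # document frequency per token
    vecs = []
    for doc in docs:
        tf = Counter(doc)
        vecs.append({t: tf[t] / len(doc) * math.log((1 + n) / (1 + df[t]))
                     for t in tf})
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse dict vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def pair_within_class(docs, labels):
    """Greedily pair each document with its most similar unused peer of
    the SAME class, returning (combined_tokens, label) tuples that could
    seed one stream's heterogeneous graph."""
    vecs = tfidf(docs)
    by_class = defaultdict(list)
    for i, y in enumerate(labels):
        by_class[y].append(i)
    combined = []
    for y, idxs in by_class.items():
        used = set()
        for i in idxs:
            if i in used:
                continue
            best, best_sim = None, -1.0
            for j in idxs:
                if j != i and j not in used:
                    s = cosine(vecs[i], vecs[j])
                    if s > best_sim:
                        best, best_sim = j, s
            if best is None:        # odd one out: keep as-is
                combined.append((docs[i], y))
            else:                   # merge the pair into one document
                used.update({i, best})
                combined.append((docs[i] + docs[best], y))
    return combined
```

Per the abstract, each such group of combined documents would then be used to build one heterogeneous graph for a single-stream BertGCN, and the multistream model aggregates over the streams; those later stages are outside the scope of this sketch.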

Detailed description

Saved in:
Bibliographic details
Published in: Quantum engineering, 2023-11, Vol. 2023, pp. 1-9
Authors: Li, Meng; Xie, Yujin; Yang, Weifeng; Chen, Shenyu
Format: Article
Language: English
Subjects: see below
Online access: Full text
DOI: 10.1155/2023/3668960
Publisher: Hindawi (Hoboken)
Publication date: 2023-11-13
ISSN: 2577-0470
EISSN: 2577-0470
Contributor: Dong, Shi Hai
Rights: Copyright © 2023 Meng Li et al. Open access under the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0).
ORCID: 0000-0001-6061-3915; 0000-0002-9527-1422; 0000-0003-3497-4391; 0000-0002-0462-6727
Source: Wiley-Blackwell Open Access Titles; Wiley Online Library All Journals; Alma/SFX Local Collection
Subjects:
Accuracy
Artificial neural networks
Classification
Datasets
Deep learning
Dictionaries
Documents
Experiments
Graph representations
Graphs
Information processing
Language
Learning
Machine learning
Natural language processing
Neural networks
Public opinion surveys
Sentiment analysis
Text categorization
Topology