A Hybrid Bidirectional Recurrent Convolutional Neural Network Attention-Based Model for Text Classification
Text classification is an important application of natural language processing. Deep learning models such as convolutional neural networks and recurrent neural networks have achieved good results on this task, but multi-class text classification and fine-grained sentiment analysis remain challenging. In this paper, we propose a hybrid bidirectional recurrent convolutional neural network attention-based model, named BRCAN, to address this issue. The model combines a bidirectional long short-term memory network and a convolutional neural network with an attention mechanism and word2vec to achieve fine-grained text classification. In our model, we apply word2vec to generate word vectors automatically and a bidirectional recurrent structure to capture the contextual information and long-term dependencies of sentences. We also employ a max-pooling layer of the convolutional neural network to judge which words play an essential role in text classification, and use the attention mechanism to give those words higher weights so as to capture the key components of texts. We conduct experiments on four datasets: Yahoo! Answers and Sogou News for topic classification, and Yelp Reviews and Douban Movies Top250 short reviews for sentiment analysis. The experimental results show that BRCAN outperforms state-of-the-art models.
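The pipeline summarized in the abstract (word2vec embeddings → bidirectional recurrent encoding → attention weighting → max pooling → classification) can be sketched at shape level as follows. This is an illustrative sketch only: the layer sizes, the random weights, and the plain tanh recurrence standing in for the LSTM cell are all assumptions, not the authors' exact BRCAN architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, embed_dim, hidden_dim, n_classes = 8, 16, 10, 5

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# 1. Word vectors for one sentence (standing in for word2vec output).
x = rng.normal(size=(seq_len, embed_dim))

# 2. Simplified bidirectional recurrence (tanh cell in place of an LSTM).
W_in = rng.normal(size=(embed_dim, hidden_dim)) * 0.1
W_h = rng.normal(size=(hidden_dim, hidden_dim)) * 0.1

def recur(seq):
    h = np.zeros(hidden_dim)
    out = []
    for t in range(seq.shape[0]):
        h = np.tanh(seq[t] @ W_in + h @ W_h)
        out.append(h)
    return np.stack(out)

fwd = recur(x)              # left-to-right pass
bwd = recur(x[::-1])[::-1]  # right-to-left pass, re-aligned
context = np.concatenate([fwd, bwd], axis=1)   # (seq_len, 2*hidden_dim)

# 3. Attention: score each position and weight its context vector,
#    so salient words contribute more to the final representation.
w_att = rng.normal(size=context.shape[1])
alpha = softmax(context @ w_att)               # (seq_len,), sums to 1
attended = context * alpha[:, None]

# 4. Max pooling over time keeps the strongest feature per dimension.
pooled = attended.max(axis=0)                  # (2*hidden_dim,)

# 5. Linear classifier over the pooled representation.
W_out = rng.normal(size=(context.shape[1], n_classes)) * 0.1
probs = softmax(pooled @ W_out)                # (n_classes,), sums to 1
```

Run forward once and the intermediate shapes trace the data flow: each word gets a forward and a backward state, attention produces one weight per position, and pooling collapses the weighted sequence into a single vector for classification.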
Saved in:
Published in: | IEEE Access 2019, Vol.7, p.106673-106685 |
---|---|
Main authors: | Zheng, Jin; Zheng, Limin |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
creator | Zheng, Jin; Zheng, Limin |
description | Text classification is an important application of natural language processing. Deep learning models such as convolutional neural networks and recurrent neural networks have achieved good results on this task, but multi-class text classification and fine-grained sentiment analysis remain challenging. In this paper, we propose a hybrid bidirectional recurrent convolutional neural network attention-based model, named BRCAN, to address this issue. The model combines a bidirectional long short-term memory network and a convolutional neural network with an attention mechanism and word2vec to achieve fine-grained text classification. In our model, we apply word2vec to generate word vectors automatically and a bidirectional recurrent structure to capture the contextual information and long-term dependencies of sentences. We also employ a max-pooling layer of the convolutional neural network to judge which words play an essential role in text classification, and use the attention mechanism to give those words higher weights so as to capture the key components of texts. We conduct experiments on four datasets: Yahoo! Answers and Sogou News for topic classification, and Yelp Reviews and Douban Movies Top250 short reviews for sentiment analysis. The experimental results show that BRCAN outperforms state-of-the-art models. |
doi_str_mv | 10.1109/ACCESS.2019.2932619 |
format | Article |
publisher | Piscataway: IEEE |
coden | IAECCG |
orcidid | 0000-0003-2725-5532; 0000-0002-4830-2142 |
fulltext | fulltext |
identifier | ISSN: 2169-3536 |
ispartof | IEEE access, 2019, Vol.7, p.106673-106685 |
issn | 2169-3536 2169-3536 |
language | eng |
recordid | cdi_ieee_primary_8784247 |
source | IEEE Open Access Journals; DOAJ Directory of Open Access Journals; EZB-FREE-00999 freely available EZB journals |
subjects | Artificial neural networks; Attention mechanism; bidirectional long short-term memory; Classification; convolutional neural network; Convolutional neural networks; Data mining; Feature extraction; fine-grained sentiment analysis; Machine learning; multi-class text classification; Natural language processing; Neural networks; Recurrent neural networks; Semantics; Sentences; Sentiment analysis; Task analysis; Text categorization |
title | A Hybrid Bidirectional Recurrent Convolutional Neural Network Attention-Based Model for Text Classification |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-08T11%3A17%3A31IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_ieee_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20Hybrid%20Bidirectional%20Recurrent%20Convolutional%20Neural%20Network%20Attention-Based%20Model%20for%20Text%20Classification&rft.jtitle=IEEE%20access&rft.au=Zheng,%20Jin&rft.date=2019&rft.volume=7&rft.spage=106673&rft.epage=106685&rft.pages=106673-106685&rft.issn=2169-3536&rft.eissn=2169-3536&rft.coden=IAECCG&rft_id=info:doi/10.1109/ACCESS.2019.2932619&rft_dat=%3Cproquest_ieee_%3E2455631702%3C/proquest_ieee_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2455631702&rft_id=info:pmid/&rft_ieee_id=8784247&rft_doaj_id=oai_doaj_org_article_2aa9af3f999d4ff89f7db54cd23f002a&rfr_iscdi=true |