Hybrid RNN Based Text Classification Model for Unstructured Data
The volume of social media posts is on the rise as the number of social media users expands, and it is imperative that these data be analyzed with cutting-edge algorithms. This goal is addressed by the many techniques used in text categorization, which range from machine learning to deep learning. Number crunching has become easier and faster since the emergence of high-end computing facilities, and this has led to sophisticated network architectures that can be trained to achieve higher precision and recall. The performance of neural network models, evaluated here by the F1 score, reflects the combined performance in precision and recall. The current study analyzes and compares the performance of the proposed neural network, a hybrid RNN model with two BiLSTM layers and two GRU layers, against that of previous hybrid models. GloVe word embeddings are used to train the models, and accuracy, precision, recall, and F1 score are used to assess performance. The RNN + BiLSTM + GRU model achieves a precision of 0.767, a recall of 0.759, and an F1-score of 0.7585; except for the RNN + GRU model, this hybrid model outperforms the others.
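For context on the abstract above: F1 is the usual harmonic mean of precision P and recall R, F1 = 2PR/(P + R), which is why it tracks their combined performance. The sketch below is a minimal Keras rendering of the kind of architecture the abstract describes, not the authors' code: a frozen GloVe-style embedding layer feeding two bidirectional LSTM layers, two GRU layers, and a softmax classifier, evaluated with precision, recall, and F1. The vocabulary size, sequence length, layer widths, class count, and macro-averaging choice are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of a hybrid BiLSTM + GRU
# text classifier as described in the abstract. All sizes are assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
from sklearn.metrics import precision_recall_fscore_support

VOCAB_SIZE = 20_000   # assumed vocabulary size
EMBED_DIM = 100       # GloVe vectors commonly come in 50/100/200/300 dims
MAX_LEN = 100         # assumed padded sequence length
NUM_CLASSES = 4       # assumed number of target classes

# Stand-in for a pretrained GloVe embedding matrix; in practice this would be
# built from the GloVe vectors referenced in the paper.
glove_matrix = np.random.normal(size=(VOCAB_SIZE, EMBED_DIM)).astype("float32")

model = models.Sequential([
    tf.keras.Input(shape=(MAX_LEN,), dtype="int32"),
    layers.Embedding(
        VOCAB_SIZE, EMBED_DIM,
        embeddings_initializer=tf.keras.initializers.Constant(glove_matrix),
        trainable=False),                                           # frozen GloVe-style embeddings
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),   # BiLSTM layer 1
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),   # BiLSTM layer 2
    layers.GRU(64, return_sequences=True),                          # GRU layer 1
    layers.GRU(64),                                                 # GRU layer 2
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Smoke test on random token ids; real use would feed tokenised social-media text.
x = np.random.randint(0, VOCAB_SIZE, size=(64, MAX_LEN))
y = np.random.randint(0, NUM_CLASSES, size=(64,))
model.fit(x, y, epochs=1, batch_size=16, verbose=0)

# Precision, recall, and F1 of the kind reported in the paper
# (macro averaging is an assumption here).
pred = model.predict(x, verbose=0).argmax(axis=1)
precision, recall, f1, _ = precision_recall_fscore_support(
    y, pred, average="macro", zero_division=0)
print(f"precision={precision:.3f} recall={recall:.3f} f1={f1:.3f}")
```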
Saved in:
Published in: | SN Computer Science, 2024-07, Vol. 5 (6), p. 726, Article 726 |
---|---|
Main authors: | Sunagar, Pramod; Sowmya, B. J.; Pruthviraja, Dayananda; Supreeth, S; Mathew, Jimpson; Rohith, S; Shruthi, G |
Format: | Article |
Language: | English |
Subjects: | Algorithms; Classification; Computer Imaging; Computer Science; Computer Systems Organization and Communication Networks; Data analysis; Data Structures and Information Theory; Datasets; Deep learning; Digital media; Information Systems and Communication Service; Machine learning; Medical research; Neural networks; Original Research; Pattern Recognition and Graphics; Performance evaluation; Recall; Recurrent neural networks; Research Advancements in Intelligent Computing; Semantics; Social networks; Software Engineering/Programming and Operating Systems; Text categorization; Unstructured data; Vision |
Online access: | Full text |
container_end_page | |
---|---|
container_issue | 6 |
container_start_page | 726 |
container_title | SN computer science |
container_volume | 5 |
creator | Sunagar, Pramod; Sowmya, B. J.; Pruthviraja, Dayananda; Supreeth, S; Mathew, Jimpson; Rohith, S; Shruthi, G |
description | The volume of social media posts is on the rise as the number of social media users expands, and it is imperative that these data be analyzed with cutting-edge algorithms. This goal is addressed by the many techniques used in text categorization, which range from machine learning to deep learning. Number crunching has become easier and faster since the emergence of high-end computing facilities, and this has led to sophisticated network architectures that can be trained to achieve higher precision and recall. The performance of neural network models, evaluated here by the F1 score, reflects the combined performance in precision and recall. The current study analyzes and compares the performance of the proposed neural network, a hybrid RNN model with two BiLSTM layers and two GRU layers, against that of previous hybrid models. GloVe word embeddings are used to train the models, and accuracy, precision, recall, and F1 score are used to assess performance. The RNN + BiLSTM + GRU model achieves a precision of 0.767, a recall of 0.759, and an F1-score of 0.7585; except for the RNN + GRU model, this hybrid model outperforms the others. |
doi_str_mv | 10.1007/s42979-024-03091-x |
format | Article |
publisher | Singapore: Springer Nature Singapore |
rights | The Author(s) 2024. This work is published open access under http://creativecommons.org/licenses/by/4.0/ |
orcidid | https://orcid.org/0000-0001-8445-3469 |
fulltext | fulltext |
identifier | ISSN: 2661-8907 |
ispartof | SN computer science, 2024-07, Vol.5 (6), p.726, Article 726 |
issn | 2661-8907; 2662-995X |
language | eng |
recordid | cdi_proquest_journals_3085041209 |
source | SpringerLink Journals |
subjects | Algorithms; Classification; Computer Imaging; Computer Science; Computer Systems Organization and Communication Networks; Data analysis; Data Structures and Information Theory; Datasets; Deep learning; Digital media; Information Systems and Communication Service; Machine learning; Medical research; Neural networks; Original Research; Pattern Recognition and Graphics; Performance evaluation; Recall; Recurrent neural networks; Research Advancements in Intelligent Computing; Semantics; Social networks; Software Engineering/Programming and Operating Systems; Text categorization; Unstructured data; Vision |
title | Hybrid RNN Based Text Classification Model for Unstructured Data |