Assessing the impact of minor modifications on the interior structure of GRU: GRU1 and GRU2

In this study, two GRU variants named GRU1 and GRU2 are proposed by employing simple changes to the internal structure of the standard GRU, which is one of the popular RNN variants. Comparative experiments are conducted on four problems: language modeling, question answering, addition task, and sentiment analysis. Moreover, in the addition task, curriculum learning and anti-curriculum learning strategies, which expand the training data with examples ordered from easy to hard or from hard to easy, are comparatively evaluated. Accordingly, the GRU1 and GRU2 variants outperformed the standard GRU. In addition, the curriculum learning approach, in which the training data is expanded from easy to difficult, improves the performance considerably.
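This record does not spell out the internal changes that define GRU1 and GRU2; for context, here is a minimal NumPy sketch of the standard GRU cell that the paper modifies. The gate equations follow the standard formulation; all weight matrices are random placeholders, not trained parameters.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, params):
    """One step of a standard GRU cell: update gate z, reset gate r,
    candidate state h_tilde, then a convex blend of old and candidate state."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(x @ Wz + h @ Uz + bz)               # update gate
    r = sigmoid(x @ Wr + h @ Ur + br)               # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh + bh)   # candidate hidden state
    return (1.0 - z) * h + z * h_tilde              # new hidden state

# Tiny usage example: input dim 3, hidden dim 4, random placeholder weights
rng = np.random.default_rng(0)
d_in, d_h = 3, 4
params = (rng.normal(size=(d_in, d_h)), rng.normal(size=(d_h, d_h)), np.zeros(d_h),
          rng.normal(size=(d_in, d_h)), rng.normal(size=(d_h, d_h)), np.zeros(d_h),
          rng.normal(size=(d_in, d_h)), rng.normal(size=(d_h, d_h)), np.zeros(d_h))
h = np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):  # run a 5-step input sequence
    h = gru_cell(x, h, params)
print(h.shape)  # (4,)
```

The variants studied in the paper alter this interior structure (the gates and their couplings); the exact modifications are given in the full text, not in this abstract.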

Detailed description

Saved in:
Bibliographic details
Published in: Concurrency and computation 2022-09, Vol.34 (20), p.n/a
Main authors: Yigit, Gulsum; Amasyali, Mehmet Fatih
Format: Article
Language: English
Keywords:
Online access: Full text
doi 10.1002/cpe.6775
format Article
publisher Hoboken: Wiley Subscription Services, Inc
date 2022-09-10
tpages 12
orcid 0000-0001-7010-169X
issn 1532-0626
eissn 1532-0634
language eng
source Wiley-Blackwell Journals
subjects Curricula
curriculum learning
Data mining
gated recurrent units
Learning
recurrent neural networks
Seq2seq
short‐term dependency
Training