A Hybrid Approach for Improved Low Resource Neural Machine Translation using Monolingual Data

Many language pairs are low-resource, meaning the amount and/or quality of parallel data available between them is insufficient to train a neural machine translation (NMT) model that reaches an acceptable standard of accuracy. Many works have explored using the readily available monolingual data in either or both languages to improve translation models in low-resource, and even high-resource, languages. One of the most successful of these approaches is back-translation, which uses translations of target-language monolingual data to enlarge the training set. The quality of the backward model, which is trained on the available (often low-resource) parallel data, has been shown to determine the performance of this approach. Even so, standard back-translation improves only the forward model using the available monolingual target data. A previous study proposed an iterative back-translation approach that, unlike traditional back-translation, relies on both target and source monolingual data to improve the two models over several iterations. In this work, however, we propose a novel approach that enables both the backward and the forward models to benefit from the monolingual target data, through self-learning and back-translation respectively. Experimental results show the superiority of the proposed approach over traditional back-translation on English-German low-resource neural machine translation. We also propose an iterative self-learning approach that outperforms iterative back-translation while relying only on the monolingual target data and requiring the training of fewer models.
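The abstract describes the hybrid scheme only at a high level. The sketch below, in plain Python, shows one plausible reading of it: the same target-side monolingual data first improves the backward model through self-learning and then supplies back-translated training pairs for the forward model. The helpers train and translate, the placeholder "model", and the re-decoding step are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of a self-learning + back-translation hybrid, assuming
    # hypothetical train/translate helpers in place of a real NMT toolkit.
    from typing import Callable, Dict, List, Tuple

    Pair = Tuple[str, str]          # (source sentence, target sentence)
    Model = Callable[[str], str]    # a trained model maps a sentence to a translation

    def train(pairs: List[Pair]) -> Model:
        """Stand-in for NMT training: memorise the given sentence pairs."""
        lookup: Dict[str, str] = {src: tgt for src, tgt in pairs}
        return lambda sentence: lookup.get(sentence, sentence)

    def translate(model: Model, sentences: List[str]) -> List[str]:
        """Stand-in for batch decoding with a trained model."""
        return [model(s) for s in sentences]

    def hybrid_training(parallel: List[Pair], mono_target: List[str]) -> Model:
        # 1. Train the backward (target -> source) model on the small authentic parallel data.
        reversed_pairs = [(tgt, src) for src, tgt in parallel]
        backward = train(reversed_pairs)

        # 2. Back-translate the target monolingual data into synthetic source sentences.
        synthetic_source = translate(backward, mono_target)

        # 3. Self-learning: retrain the backward model on its own synthetic output
        #    paired with the authentic target sentences, plus the real data.
        backward = train(reversed_pairs + list(zip(mono_target, synthetic_source)))

        # 4. Back-translation: re-decode with the improved backward model and train
        #    the forward (source -> target) model on real plus synthetic pairs.
        synthetic_source = translate(backward, mono_target)
        forward = train(parallel + list(zip(synthetic_source, mono_target)))
        return forward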

Bibliographic details
Published in: Engineering Letters, 2021-11, Vol. 29 (4), p. 1478
Authors: Abdulmumin, Idris; Galadanci, Bashir Shehu; Isa, Abubakar; Kakudi, Habeebah Adamu; Sinan, Ismaila Idris
Format: Article
Language: English
ISSN: 1816-093X; EISSN: 1816-0948
Publisher: International Association of Engineers (Hong Kong)
Subjects: Iterative methods; Languages; Learning; Machine translation; Performance enhancement; Training
Online access: Full text