A Hybrid Approach for Improved Low Resource Neural Machine Translation using Monolingual Data

Engineering Letters, vol. 29, no. 4, pp. 1478-1493, 2021

Detailed description

Bibliographic details
Main authors: Abdulmumin, Idris; Galadanci, Bashir Shehu; Isa, Abubakar; Kakudi, Habeebah Adamu; Sinan, Ismaila Idris
Format: Article
Language: eng
Subjects: Computer Science - Computation and Language; Computer Science - Learning
Online access: Order full text
container_end_page 1493
container_issue 4
container_start_page 1478
container_title Engineering Letters
container_volume 29
creator Abdulmumin, Idris
Galadanci, Bashir Shehu
Isa, Abubakar
Kakudi, Habeebah Adamu
Sinan, Ismaila Idris
description Engineering Letters, vol. 29, no. 4, pp. 1478-1493, 2021. Many language pairs are low resource, meaning the amount and/or quality of available parallel data is not sufficient to train a neural machine translation (NMT) model that can reach an acceptable standard of accuracy. Many works have explored using the readily available monolingual data in either or both of the languages to improve translation models in low, and even high, resource languages. One of the most successful such approaches is back-translation, which utilizes translations of the target-language monolingual data to increase the amount of training data. The quality of the backward model, which is trained on the available parallel data, has been shown to determine the performance of the back-translation approach. Despite this, only the forward model is improved on the monolingual target data in standard back-translation. A previous study proposed an iterative back-translation approach that improves both models over several iterations, but unlike traditional back-translation, it relied on both target and source monolingual data. This work therefore proposes a novel approach that enables both the backward and forward models to benefit from the monolingual target data, through self-learning and back-translation respectively. Experimental results show the superiority of the proposed approach over the traditional back-translation method on English-German low resource neural machine translation. We also propose an iterative self-learning approach that outperforms iterative back-translation while relying only on the monolingual target data and requiring the training of fewer models.
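The augmentation scheme the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: `backward_translate` is a hypothetical stub standing in for a trained target-to-source NMT model, and sentences are plain strings rather than tokenized corpora.

```python
def backward_translate(tgt_sentence):
    """Stub for the target->source (backward) model; a real system
    would decode with a trained NMT model here."""
    return "src<" + tgt_sentence + ">"

def hybrid_augment(parallel_pairs, tgt_monolingual):
    """Hybrid of back-translation and self-learning on the same
    target-language monolingual data.

    parallel_pairs: list of (source, target) sentence pairs.
    Returns (forward_data, backward_data) training sets.
    """
    # One backward-model pass over the target monolingual data.
    synthetic_src = [backward_translate(t) for t in tgt_monolingual]

    # Back-translation: (synthetic source, authentic target) pairs
    # extend the forward model's training data.
    forward_data = parallel_pairs + list(zip(synthetic_src, tgt_monolingual))

    # Self-learning: the backward model is retrained on its own
    # (authentic target, synthetic source) outputs plus the
    # reversed parallel data.
    backward_data = ([(t, s) for (s, t) in parallel_pairs]
                     + list(zip(tgt_monolingual, synthetic_src)))
    return forward_data, backward_data
```

Note that a single decoding pass of the backward model feeds both models, which is why the approach needs only target-side monolingual data, unlike iterative back-translation.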
doi_str_mv 10.48550/arxiv.2011.07403
format Article
creationdate 2020-11-14
rights http://creativecommons.org/licenses/by/4.0 (free to read)
fulltext fulltext_linktorsrc
identifier DOI: 10.48550/arxiv.2011.07403
ispartof Engineering Letters, vol. 29, no. 4
language eng
recordid cdi_arxiv_primary_2011_07403
source arXiv.org
subjects Computer Science - Computation and Language
Computer Science - Learning
title A Hybrid Approach for Improved Low Resource Neural Machine Translation using Monolingual Data
url https://arxiv.org/abs/2011.07403