A Hybrid Approach for Improved Low Resource Neural Machine Translation using Monolingual Data

Bibliographic Details
Published in: Engineering Letters, 2021-11, Vol. 29 (4), p. 1478
Authors: Abdulmumin, Idris; Galadanci, Bashir Shehu; Isa, Abubakar; Kakudi, Habeebah Adamu; Sinan, Ismaila Idris
Format: Article
Language: English
Online access: Full text
Description
Abstract: Many language pairs are low resourced, meaning the amount and/or quality of the parallel data available between them is not sufficient to train a neural machine translation (NMT) model that can reach an acceptable standard of accuracy. Many works have explored using the readily available monolingual data in either or both of the languages to improve the quality of translation models in low-, and even high-, resource languages. One of the most successful of these is back-translation, which uses the translations of target-language monolingual data to increase the amount of training data. The quality of the backward model, which is trained on the available (often low-resource) parallel data, has been shown to determine the performance of this approach. Despite this, standard back-translation uses the available monolingual target data only to improve the performance of the forward model. A previous study proposed an iterative back-translation approach that, unlike traditional back-translation, relies on both the target and source monolingual data to improve the two models over several iterations. In this work, however, we propose a novel approach that enables both the backward and forward models to benefit from the monolingual target data through a hybrid of self-learning and back-translation, respectively. Experimental results show the superiority of the proposed approach over traditional back-translation on English-German low-resource neural machine translation. We also propose an iterative self-learning approach that outperforms iterative back-translation while relying only on the monolingual target data and requiring the training of fewer models.
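
To make the data flow concrete, the following Python sketch walks through one plausible ordering of the hybrid procedure the abstract outlines: the backward (target-to-source) model back-translates monolingual target sentences, its own outputs are reused for self-learning on the backward model, and the synthetic pairs augment the forward model's training data. The toy train_nmt and translate helpers and the exact step ordering are illustrative assumptions, not the authors' implementation or any real toolkit API.

def train_nmt(pairs):
    """Toy 'training': memorise the (input, output) pairs as a lookup table."""
    return dict(pairs)

def translate(model, sentences):
    """Toy 'inference': look each sentence up, fall back to echoing it."""
    return [model.get(s, s) for s in sentences]

def hybrid_self_learning_back_translation(parallel, mono_target):
    """parallel: list of (src, tgt) pairs; mono_target: target-language sentences."""
    # 1. Train the backward model (target -> source) on the authentic parallel data.
    backward = train_nmt([(tgt, src) for src, tgt in parallel])

    # 2. Back-translate the monolingual target data into synthetic source sentences.
    synthetic_src = translate(backward, mono_target)

    # 3. Self-learning: retrain the backward model on the authentic pairs plus its
    #    own (target, synthetic source) outputs.
    backward = train_nmt([(tgt, src) for src, tgt in parallel]
                         + list(zip(mono_target, synthetic_src)))

    # 4. Back-translation: train the forward model (source -> target) on the
    #    authentic pairs plus the (synthetic source, authentic target) pairs.
    forward = train_nmt(parallel + list(zip(synthetic_src, mono_target)))
    return backward, forward

if __name__ == "__main__":
    parallel = [("the cat", "die Katze"), ("the dog", "der Hund")]
    mono_target = ["die Katze", "das Haus"]
    backward, forward = hybrid_self_learning_back_translation(parallel, mono_target)
    print(translate(forward, ["the cat"]))  # -> ['die Katze']

The point of the sketch is only which pairs feed which model: the same synthetic source sentences serve the backward model as self-learning targets and the forward model as back-translated inputs, so a single pass over the monolingual target data benefits both models.
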
ISSN: 1816-093X, 1816-0948