A Hybrid Approach for Improved Low Resource Neural Machine Translation using Monolingual Data
Engineering Letters, vol. 29, no. 4, pp. 1478-1493, 2021
Main authors: | , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | Many language pairs are low resource, meaning the amount and/or quality of available parallel data is not sufficient to train a neural machine translation (NMT) model that can reach an acceptable standard of accuracy. Many works have explored using the readily available monolingual data in either or both of the languages to improve the standard of translation models in low, and even high, resource languages. One of the most successful of these approaches is back-translation, which utilizes translations of the target-language monolingual data to increase the amount of training data. The quality of the backward model, which is trained on the available parallel data, has been shown to determine the performance of the back-translation approach. Despite this, only the forward model is improved on the monolingual target data in standard back-translation. A previous study proposed an iterative back-translation approach for improving both models over several iterations, but unlike traditional back-translation, it relied on both the target and the source monolingual data. This work therefore proposes a novel approach that enables both the backward and forward models to benefit from the monolingual target data, through a hybrid of self-learning and back-translation respectively. Experimental results have shown the superiority of the proposed approach over the traditional back-translation method on English-German low resource neural machine translation. We also propose an iterative self-learning approach that outperforms iterative back-translation while relying only on the monolingual target data and requiring the training of fewer models. |
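For orientation, the sketch below illustrates the hybrid scheme summarized in the abstract: the backward (target-to-source) model is first improved by self-learning on its own translations of the target monolingual data, and the improved backward model is then used for back-translation to train the forward model. The helper names (train_model, translate) and the exact training recipe are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of the hybrid self-learning /
# back-translation scheme summarized in the abstract.
# train_model() and translate() are hypothetical stand-ins for a real NMT toolkit.

from typing import List, Tuple

Corpus = List[Tuple[str, str]]  # (source sentence, target sentence) pairs


def train_model(pairs: Corpus, direction: str) -> dict:
    """Placeholder: train an NMT model on (input, output) sentence pairs."""
    return {"direction": direction, "train_size": len(pairs)}


def translate(model: dict, sentences: List[str]) -> List[str]:
    """Placeholder: decode a list of sentences with a trained model."""
    return [f"<{model['direction']} hypothesis for: {s}>" for s in sentences]


def hybrid_training(parallel: Corpus, target_mono: List[str]):
    """Both models benefit from the target-side monolingual data only."""
    flipped = [(t, s) for s, t in parallel]  # (target, source) pairs

    # 1. Backward model (target -> source) trained on the authentic parallel data.
    backward = train_model(flipped, direction="tgt->src")

    # 2. Self-learning: the backward model translates the target monolingual data,
    #    and its own synthetic outputs are added to its training set.
    synthetic_src = translate(backward, target_mono)
    backward = train_model(flipped + list(zip(target_mono, synthetic_src)),
                           direction="tgt->src")

    # 3. Back-translation: the improved backward model produces synthetic source
    #    sentences; the forward model (source -> target) is trained on the
    #    authentic plus synthetic parallel data.
    synthetic_src = translate(backward, target_mono)
    forward = train_model(parallel + list(zip(synthetic_src, target_mono)),
                          direction="src->tgt")
    return forward, backward


if __name__ == "__main__":
    toy_parallel = [("the house", "das Haus"), ("a book", "ein Buch")]
    toy_mono_de = ["die Katze schläft", "der Hund läuft"]
    fwd, bwd = hybrid_training(toy_parallel, toy_mono_de)
    print(fwd, bwd)
```

In practice each train_model call would invoke an NMT toolkit (e.g. fairseq or OpenNMT), and the synthetic pairs may be filtered before retraining; those details are outside the scope of this sketch.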
DOI: | 10.48550/arxiv.2011.07403 |