Focused Concatenation for Context-Aware Neural Machine Translation
Saved in:
Main authors: | |
---|---|
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | A straightforward approach to context-aware neural machine translation
consists in feeding the standard encoder-decoder architecture with a window of
consecutive sentences, formed by the current sentence and a number of sentences
from its context concatenated to it. In this work, we propose an improved
concatenation approach that encourages the model to focus on the translation of
the current sentence, discounting the loss generated by target context. We also
propose an additional improvement that strengthens the notion of sentence
boundaries and of relative sentence distance, facilitating model compliance with
the context-discounted objective. We evaluate our approach with both
average-translation quality metrics and contrastive test sets for the
translation of inter-sentential discourse phenomena, demonstrating its
superiority over both the vanilla concatenation approach and more sophisticated
context-aware systems. |
DOI: | 10.48550/arxiv.2210.13388 |
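The context-discounted objective described in the abstract can be illustrated with a minimal sketch: per-token losses for target-context tokens are scaled by a discount factor smaller than one, so the current sentence dominates the training signal. This is an assumption-laden illustration, not the authors' implementation; the function name, the boolean context mask, and the `lambda_ctx` parameter are all hypothetical.

```python
# Hypothetical sketch of a context-discounted loss for concatenation-based
# context-aware NMT. Per-token negative log-likelihoods (NLLs) of target-
# context tokens are scaled by lambda_ctx < 1, down-weighting the context
# so the model focuses on translating the current sentence. The weighting
# scheme and all names are illustrative assumptions.

def context_discounted_loss(token_nll, is_context, lambda_ctx=0.5):
    """Weighted average of per-token NLLs, discounting context tokens.

    token_nll  -- per-token negative log-likelihoods over the whole window
    is_context -- parallel booleans; True marks target-context tokens
    lambda_ctx -- discount factor for context-token losses (assumption)
    """
    weights = [lambda_ctx if ctx else 1.0 for ctx in is_context]
    weighted = [w * nll for w, nll in zip(weights, token_nll)]
    return sum(weighted) / sum(weights)

# Example window: three context tokens followed by two current-sentence
# tokens. With lambda_ctx = 1.0 this reduces to the plain average NLL.
nll = [2.0, 1.5, 1.0, 0.8, 0.6]
ctx = [True, True, True, False, False]
loss = context_discounted_loss(nll, ctx, lambda_ctx=0.5)
```

With `lambda_ctx = 0` the context would be encoded but contribute nothing to the loss; intermediate values keep a weak training signal on the context while prioritizing the current sentence.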