Ladder: A Model-Agnostic Framework Boosting LLM-based Machine Translation to the Next Level
Format: Article
Language: English
Abstract: General-purpose Large Language Models (LLMs) like GPT-4 have achieved
remarkable advancements in machine translation (MT) by leveraging extensive web
content. On the other hand, translation-specific LLMs are built by pre-training
on domain-specific monolingual corpora and fine-tuning with human-annotated
translation data. Despite their superior performance, these methods demand either
an unprecedented scale of computing and data or substantial human editing and
annotation efforts. In this paper, we develop MT-Ladder, a novel model-agnostic
and cost-effective tool to refine the performance of general LLMs for MT.
MT-Ladder is trained on pseudo-refinement triplets which can be easily obtained
from existing LLMs without additional human cost. During training, we propose a
hierarchical fine-tuning strategy with an easy-to-hard schema, improving
MT-Ladder's refining performance progressively. The trained MT-Ladder can be
seamlessly integrated with any general-purpose LLM to boost its translation
performance. By utilizing Gemma-2B/7B as the backbone, MT-Ladder-2B can elevate
raw translations to the level of top-tier open-source models (e.g., refining
BigTranslate-13B with +6.91 BLEU and +3.52 COMET for XX-En), and MT-Ladder-7B
can further enhance model performance to be on par with the state-of-the-art
GPT-4. Extensive ablation and analysis corroborate the effectiveness of
MT-Ladder in diverse settings. Our code is available at
https://github.com/fzp0424/MT-Ladder
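
The abstract's core idea, training a refiner on pseudo-refinement triplets and scheduling them from easy to hard, can be sketched as follows. This is a minimal illustration only: the names (`RefinementTriplet`, `refinement_prompt`, `easy_to_hard`), the prompt wording, and the length-gap difficulty score are hypothetical stand-ins, not MT-Ladder's actual code or its quality scoring.

```python
# Minimal sketch of the pseudo-refinement-triplet idea (hypothetical names).
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class RefinementTriplet:
    source: str  # source-language sentence
    draft: str   # raw translation sampled from an existing general LLM
    target: str  # higher-quality translation used as the refinement label

def refinement_prompt(t: RefinementTriplet) -> str:
    """Render a triplet as one instruction-tuning example: the refiner
    learns to map (source, draft) -> target."""
    return (
        "Improve the following translation.\n"
        f"Source: {t.source}\n"
        f"Draft: {t.draft}\n"
        "Refined:"
    )

def easy_to_hard(triplets: List[RefinementTriplet],
                 difficulty: Callable[[RefinementTriplet], float]
                 ) -> List[RefinementTriplet]:
    """Order training data so drafts needing only light edits come first,
    approximating the hierarchical easy-to-hard schedule the abstract
    mentions."""
    return sorted(triplets, key=difficulty)

# Toy usage: difficulty as the length gap between draft and target,
# a crude stand-in for a real translation-quality metric.
data = [
    RefinementTriplet("Bonjour le monde", "Hello the world", "Hello, world"),
    RefinementTriplet("Merci beaucoup", "Thanks much a lot", "Thank you very much"),
]
for t in easy_to_hard(data, lambda t: abs(len(t.draft) - len(t.target))):
    print(refinement_prompt(t))
```

At inference time, the same prompt shape would let the trained refiner take any general-purpose LLM's raw output as the draft and emit an improved translation, matching the model-agnostic integration the abstract describes.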
DOI: 10.48550/arxiv.2406.15741