SYSTEMS AND METHODS FOR CROSS-LINGUAL TRANSFER LEARNING


Bibliographic details
Authors: Xiong, Caiming; Zhou, Yingbo; Qu, Jin; Tu, Lifu
Format: Patent
Language: English
Abstract: Embodiments described herein provide a method of training a language model by tuning a prompt. The method comprises masking tokens of first and second conversational texts that have the same semantic meaning but are in different languages (e.g., a text and its translation). The masked texts are input to a language model with a prepended soft prompt. The language model generates respective predicted outputs. A loss objective is computed, including a masked language model loss. The prompt is updated based on the computed loss objective via backpropagation while the language model is kept frozen.
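The abstract describes soft-prompt tuning against a frozen model with a masked-language-model loss. The following is a minimal sketch of that training step, assuming a toy transformer stands in for the language model; all names, sizes, and the 25% masking rate are illustrative, not taken from the patent.

```python
# Sketch: soft-prompt tuning with an MLM loss against a frozen model.
# The ToyLM class and all hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn

VOCAB, DIM, PROMPT_LEN, SEQ_LEN = 50, 16, 4, 8
MASK_ID = 0  # token id reserved for [MASK]

class ToyLM(nn.Module):
    """Stand-in for a pretrained masked language model."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.encoder = nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True)
        self.head = nn.Linear(DIM, VOCAB)

    def forward(self, token_ids, soft_prompt):
        x = self.embed(token_ids)                       # (B, T, D)
        prompt = soft_prompt.expand(x.size(0), -1, -1)  # prepend soft prompt (B, P, D)
        h = self.encoder(torch.cat([prompt, x], dim=1))
        return self.head(h[:, PROMPT_LEN:])             # logits for text positions only

torch.manual_seed(0)
lm = ToyLM()
for p in lm.parameters():
    p.requires_grad_(False)                             # language model stays frozen

# The soft prompt is the only trainable parameter.
soft_prompt = nn.Parameter(torch.randn(1, PROMPT_LEN, DIM) * 0.02)
opt = torch.optim.Adam([soft_prompt], lr=1e-2)

# Two "parallel" conversational texts with the same meaning in different
# languages (random ids here in place of real tokenized sentences).
text_a = torch.randint(1, VOCAB, (1, SEQ_LEN))
text_b = torch.randint(1, VOCAB, (1, SEQ_LEN))
batch = torch.cat([text_a, text_b], dim=0)

mask = torch.rand(batch.shape) < 0.25                   # random token masking
mask[:, 0] = True                                       # ensure at least one mask
masked = batch.masked_fill(mask, MASK_ID)

logits = lm(masked, soft_prompt)
loss = nn.functional.cross_entropy(logits[mask], batch[mask])  # MLM loss
opt.zero_grad()
loss.backward()                                         # gradients flow to the prompt only
opt.step()
```

In a real setting the frozen backbone would be a pretrained multilingual model, and the cross-lingual signal comes from training the same prompt on both texts of each translation pair.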