HerBERT: Efficiently Pretrained Transformer-based Language Model for Polish
Main Authors: | |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Order full text |
Abstract: | BERT-based models are currently used for solving nearly all Natural Language
Processing (NLP) tasks and most often achieve state-of-the-art results.
Therefore, the NLP community conducts extensive research on understanding these
models, but above all on designing effective and efficient training procedures.
Several ablation studies investigating how to train BERT-like models have been
carried out, but the vast majority of them concerned only the English language.
A training procedure designed for English is not necessarily universal or
applicable to other, especially typologically different, languages. Therefore,
this paper presents the first ablation study focused on Polish, which, unlike
the isolating English language, is a fusional language. We design and
thoroughly evaluate a pretraining procedure of transferring knowledge from
multilingual to monolingual BERT-based models. In addition to multilingual
model initialization, other factors that may influence pretraining are
also explored, i.e. training objective, corpus size, BPE-Dropout, and
pretraining length. Based on the proposed procedure, a Polish BERT-based
language model -- HerBERT -- is trained. This model achieves state-of-the-art
results on multiple downstream tasks. |
DOI: | 10.48550/arxiv.2105.01735 |
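
The knowledge-transfer step described in the abstract, i.e. initializing a monolingual Polish model from a multilingual BERT checkpoint before continued pretraining, can be illustrated with a short sketch. The snippet below is a minimal illustration using the HuggingFace transformers API, not the paper's exact recipe: the Polish tokenizer path is a placeholder, and the heuristic of copying embeddings only for subwords shared between the two vocabularies is an assumption.

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer, BertConfig, BertForMaskedLM

# Multilingual checkpoint used as the source of weights.
mbert = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")
mbert_tok = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

# Placeholder path: a BPE tokenizer trained on a Polish corpus
# (possibly with BPE-Dropout, one of the factors studied in the paper).
pl_tok = AutoTokenizer.from_pretrained("path/to/polish-tokenizer")

# New monolingual model with the same architecture but the Polish vocabulary size.
config = BertConfig.from_pretrained("bert-base-multilingual-cased", vocab_size=len(pl_tok))
pl_model = BertForMaskedLM(config)

# Copy every tensor whose shape matches, i.e. everything except the
# vocabulary-dependent embedding and MLM-head matrices.
src = mbert.state_dict()
dst = pl_model.state_dict()
for name, tensor in src.items():
    if name in dst and dst[name].shape == tensor.shape:
        dst[name] = tensor.clone()
pl_model.load_state_dict(dst)

# For subwords present in both vocabularies, reuse the multilingual embedding row;
# the remaining rows keep their random initialization.
with torch.no_grad():
    pl_emb = pl_model.get_input_embeddings().weight
    m_emb = mbert.get_input_embeddings().weight
    for token, pl_id in pl_tok.get_vocab().items():
        m_id = mbert_tok.convert_tokens_to_ids(token)
        if m_id is not None and m_id != mbert_tok.unk_token_id:
            pl_emb[pl_id] = m_emb[m_id]

# The model is now ready for masked-language-model pretraining on the Polish corpus.
pl_model.save_pretrained("herbert-init")
```

The resulting checkpoint would then be pretrained on the Polish corpus with the chosen objective, corpus size, and schedule, which are the remaining factors varied in the ablation study.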