Kullback-Leibler Divergence-Based Regularized Normalization for Low-Resource Tasks
Large pretrained models, like BERT, GPT, and Wav2Vec, have demonstrated their ability to learn transferable representations for various downstream tasks. However, obtaining a substantial amount of supervised data remains a challenge due to resource and time limitations. As a solution, researchers ha...
Published in: IEEE Transactions on Artificial Intelligence, June 2024, Vol. 5, No. 6, pp. 2638-2650
Format: Article
Language: English