Perturbation-Based Self-Supervised Attention for Attention Bias in Text Classification

In text classification, traditional attention mechanisms tend to focus too heavily on frequent words and require extensive labeled data to learn. This article proposes a perturbation-based self-supervised attention approach that guides attention learning without any annotation overhead. Specifi...
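The truncated abstract sketches the core idea: perturb individual words and use the model's sensitivity to those perturbations as a free supervision signal for attention. Below is a minimal PyTorch sketch of that general idea; the function names, the Gaussian-noise perturbation scheme, the KL-based losses, and the `model(embeddings, mask)` interface are all illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def perturbation_importance(model, embeddings, mask, sigma=0.1, n_samples=4):
    """Estimate token importance by measuring how much Gaussian perturbations
    of each token's embedding shift the model's prediction.
    (Hypothetical helper; the paper's perturbation scheme may differ.)"""
    with torch.no_grad():
        base_logits = model(embeddings, mask)            # [B, C] clean predictions
        B, T, D = embeddings.shape
        scores = torch.zeros(B, T, device=embeddings.device)
        for t in range(T):
            for _ in range(n_samples):
                noisy = embeddings.clone()
                noisy[:, t] += sigma * torch.randn(B, D, device=embeddings.device)
                logits = model(noisy, mask)
                # KL divergence between perturbed and clean predictions:
                # tokens whose perturbation moves the output most are important.
                scores[:, t] += F.kl_div(
                    F.log_softmax(logits, dim=-1),
                    F.softmax(base_logits, dim=-1),
                    reduction="none",
                ).sum(-1) / n_samples
    # Normalize importance into a pseudo-attention target over real tokens.
    scores = scores.masked_fill(mask == 0, float("-inf"))
    return F.softmax(scores, dim=-1)

def attention_supervision_loss(attn_weights, importance):
    """Auxiliary loss pulling the model's attention distribution toward the
    perturbation-derived importance (one plausible self-supervision choice)."""
    return F.kl_div(attn_weights.clamp_min(1e-9).log(), importance,
                    reduction="batchmean")
```

The double loop here costs T * n_samples extra forward passes per batch and is written for clarity, not efficiency; the key point is that the supervision target comes from the model's own behavior under perturbation, so no attention labels are ever needed.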

Bibliographic Details
Published in: IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2023, Vol. 31, p. 3139-3151
Authors: Feng, Huawen; Lin, Zhenxi; Ma, Qianli
Format: Article
Language: English
Online Access: Order full text