An Introductory Survey on Attention Mechanisms in NLP Problems
Saved in:
Main Author: | |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Order full text |

Abstract: First derived from human intuition and later adapted to machine translation for automatic token alignment, the attention mechanism, a simple method for encoding sequence data according to the importance score assigned to each element, has been widely applied to, and has yielded significant improvements on, a variety of natural language processing tasks, including sentiment classification, text summarization, question answering, and dependency parsing. In this paper, we survey recent works and give an introductory summary of the attention mechanism across different NLP problems, aiming to provide readers with basic knowledge of this widely used method, discuss its variants for different tasks, explore its connections with other machine learning techniques, and examine methods for evaluating its performance.
DOI: 10.48550/arxiv.1811.05544
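
The abstract characterizes attention as encoding a sequence according to the importance score assigned to each element. As a purely illustrative aid that is not taken from the surveyed paper, the NumPy sketch below implements one common variant, scaled dot-product attention; the function names, variable names, and shapes are assumptions made for this example.

```python
# Minimal sketch of scaled dot-product attention over a token sequence.
# Illustrative only -- names and shapes are assumptions, not the paper's notation.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attend(query, keys, values):
    """Encode a sequence as a weighted sum of its value vectors.

    query:  (d,)      vector the importance scores are computed against
    keys:   (n, d)    one key vector per sequence element
    values: (n, d_v)  one value vector per sequence element
    """
    d = keys.shape[-1]
    scores = keys @ query / np.sqrt(d)   # one importance score per element
    weights = softmax(scores)            # normalized so the scores sum to 1
    return weights @ values, weights     # context vector and the weights

# Toy usage: 4 tokens with 8-dimensional keys/values, random for illustration.
rng = np.random.default_rng(0)
keys = rng.normal(size=(4, 8))
values = rng.normal(size=(4, 8))
query = rng.normal(size=(8,))
context, weights = attend(query, keys, values)
print(weights.round(3), context.shape)
```

The returned weights sum to one and play the role of the per-element importance scores mentioned in the abstract; the context vector is the resulting fixed-size encoding of the sequence.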