Multi-task learning using a hybrid representation for text classification

Bibliographic details
Published in: Neural Computing & Applications, 2020-06, Vol. 32 (11), p. 6467-6480
Authors: Lu, Guangquan; Gan, Jiangzhang; Yin, Jian; Luo, Zhiping; Li, Bo; Zhao, Xishun
Format: Article
Language: English
Description
Summary: Text classification is an important task in machine learning. Specifically, deep neural networks have shown a strong capability to improve performance in various fields, for example speech recognition, object recognition, and natural language processing. However, in most previous work, the extracted feature models do not serve the related text tasks well. To address this issue, we introduce a novel multi-task learning approach called a hybrid representation-learning network for text classification tasks. Our method consists of two network components: a bidirectional gated recurrent unit with an attention network module and a convolutional neural network module. In particular, the attention module allows each task to learn a private feature representation that captures local dependencies in its training texts, while the convolutional neural network module learns a global representation that is shared across tasks. Experiments on 16 subsets of Amazon review data show that our method outperforms several baselines and demonstrate the effectiveness of jointly learning multiple related tasks.
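The following is a minimal PyTorch sketch of the architecture the abstract describes: a CNN module shared across all tasks for the global representation, plus a per-task bidirectional GRU with attention for the private, locally dependent features. All layer sizes, the concatenation-based fusion, and the class names (SharedCNN, PrivateBiGRUAttention, HybridMultiTaskNet) are illustrative assumptions, not the authors' implementation, whose details are not given in this record.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedCNN(nn.Module):
    """Shared across all tasks: 1-D convolutions over word embeddings."""
    def __init__(self, emb_dim, num_filters=100, kernel_sizes=(3, 4, 5)):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, num_filters, k) for k in kernel_sizes
        )

    def forward(self, emb):                       # emb: (batch, seq, emb_dim)
        x = emb.transpose(1, 2)                   # (batch, emb_dim, seq)
        # Max-pool each feature map over time, then concatenate.
        pooled = [F.relu(c(x)).max(dim=2).values for c in self.convs]
        return torch.cat(pooled, dim=1)           # (batch, filters * n_kernels)

class PrivateBiGRUAttention(nn.Module):
    """Task-specific: BiGRU encoder with additive attention pooling."""
    def __init__(self, emb_dim, hidden=128):
        super().__init__()
        self.gru = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)

    def forward(self, emb):                       # emb: (batch, seq, emb_dim)
        h, _ = self.gru(emb)                      # (batch, seq, 2 * hidden)
        scores = torch.softmax(self.attn(h), dim=1)
        return (scores * h).sum(dim=1)            # (batch, 2 * hidden)

class HybridMultiTaskNet(nn.Module):
    """One shared CNN, one private BiGRU+attention encoder per task."""
    def __init__(self, vocab_size, num_tasks, emb_dim=300, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.shared = SharedCNN(emb_dim)
        self.private = nn.ModuleList(
            PrivateBiGRUAttention(emb_dim) for _ in range(num_tasks)
        )
        fused_dim = 300 + 256   # shared (3 kernels * 100) + private (2 * 128)
        self.heads = nn.ModuleList(
            nn.Linear(fused_dim, num_classes) for _ in range(num_tasks)
        )

    def forward(self, token_ids, task_id):
        emb = self.embed(token_ids)
        # Fuse shared global and task-private local representations.
        fused = torch.cat([self.shared(emb), self.private[task_id](emb)], dim=1)
        return self.heads[task_id](fused)

# Example: binary sentiment classification for one of 16 review tasks.
model = HybridMultiTaskNet(vocab_size=30000, num_tasks=16)
logits = model(torch.randint(0, 30000, (8, 50)), task_id=3)  # shape (8, 2)

In such a setup, each training batch comes from one task: the shared CNN is updated by every task's gradients, while only the selected task's private encoder and classification head are updated.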
ISSN: 0941-0643
eISSN: 1433-3058
DOI: 10.1007/s00521-018-3934-y