AHCL-TC: Adaptive Hypergraph Contrastive Learning Networks for Text Classification
Saved in:
Published in: Neurocomputing (Amsterdam) 2024-09, Vol. 597, p. 127989, Article 127989
Main authors: , , , , , ,
Format: Article
Language: English
Keywords:
Online access: Full text
Abstract: Text classification is an essential and classic problem in natural language processing. In recent years, Graph Convolutional Networks (GCNs) have been widely applied to text classification tasks. However, three critical challenges remain in practical applications: (1) limited ability to model domain-specific terminology; (2) imbalanced sample distributions that degrade model performance; (3) excessive computational cost. To address these issues, this paper proposes an Adaptive Hypergraph Contrastive Learning Network (AHCL-TC) for text classification, which combines graph contrastive learning with hypergraph neural networks to better capture the internal relational structure of domain-specific terminology and to perform well under imbalanced sample distributions. AHCL-TC designs a neural network architecture based on hypergraphs, using the high-order relationships of hypergraphs to model complex structures in text data. This structure allows the model to better understand multiple relationships between terms, thereby improving classification performance. Graph contrastive learning serves as the model's training framework: by learning the intrinsic characteristics and structure of the graph data, and by using data augmentation algorithms to expand the dataset, the model's robustness to data imbalance is improved. Additionally, the paper presents a hypergraph adaptive augmentation algorithm designed for hypergraph structures to address the data imbalance problem. The proposed model is evaluated on multiple benchmark datasets, and the experimental results demonstrate its effectiveness for text classification, outperforming baseline models even with a reduced percentage of training data. Furthermore, a comprehensive comparison of computational efficiency shows that the computational consumption of the model is notably lower than that of other models.
Highlights:
• The model can capture comprehensive semantic information and expand sample data.
• This study introduces Hypergraph Attention Networks, leveraging hypergraph structures.
• Comparative analysis shows both superior performance and computational efficiency.
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2024.127989
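The abstract combines two standard building blocks: hypergraph message passing over high-order term relations, and a contrastive objective between augmented views. As an illustrative sketch only (not the authors' AHCL-TC implementation, whose details are not given in this record), the sketch below uses the common HGNN-style normalized propagation and an InfoNCE-style contrastive loss; the function names, uniform hyperedge weights, and NumPy formulation are assumptions for demonstration.

```python
import numpy as np

def hypergraph_conv(X, H, Theta):
    """One HGNN-style hypergraph convolution layer (illustrative, not AHCL-TC itself).
    X: (n_nodes, d_in) node features; H: (n_nodes, n_edges) incidence matrix;
    Theta: (d_in, d_out) learnable weights. Hyperedge weights assumed uniform."""
    Dv = np.clip(H.sum(axis=1), 1, None)                # node degrees
    De = np.clip(H.sum(axis=0), 1, None)                # hyperedge degrees
    Dv_inv_sqrt = np.diag(Dv ** -0.5)
    De_inv = np.diag(1.0 / De)
    # Symmetric normalized propagation: Dv^-1/2 H De^-1 H^T Dv^-1/2
    A = Dv_inv_sqrt @ H @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)               # ReLU activation

def info_nce(z1, z2, tau=0.5):
    """InfoNCE-style contrastive loss between two augmented views.
    Rows of z1 and z2 are aligned positive pairs; all other rows are negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                               # cosine similarities / temperature
    sim = sim - sim.max(axis=1, keepdims=True)          # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))                  # maximize aligned-pair probability
```

In a contrastive training loop of this kind, two augmented incidence matrices (e.g. from an adaptive edge-dropping scheme) would each be passed through `hypergraph_conv` and the resulting embeddings fed to `info_nce`.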