Segmentation-aware relational graph convolutional network with multi-layer CRF for nested named entity recognition



Bibliographic Details
Published in: Complex & Intelligent Systems 2024-12, Vol. 10 (6), p. 7893-7905
Main Authors: Han, Daojun; Wang, Zemin; Li, Yunsong; Ma, Xiangbo; Zhang, Juntao
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: Named Entity Recognition (NER) is fundamental in natural language processing, involving identifying entity spans and types within a sentence. In nested NER, entities contain other entities, which poses a significant challenge, especially pronounced in the domain of medical named entities due to intricate nesting patterns inherent in medical terminology. Existing studies cannot capture interdependencies among different entity categories, resulting in inadequate performance on nested NER tasks. To address this problem, we propose a novel Layer-based architecture with a Segmentation-aware Relational Graph Convolutional Network (LSRGCN) for nested NER in the medical domain. LSRGCN comprises two key modules: a shared segmentation-aware encoder and a multi-layer conditional random field decoder. The former provides token representations that include boundary information from sentence segmentation. The latter learns the connections between different entity classes and improves recognition accuracy through secondary decoding. We conduct experiments on four datasets, and the experimental results demonstrate the effectiveness of our model. Additionally, extensive studies are conducted to enhance our understanding of the model and its capabilities.
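The multi-layer conditional random field decoder described above applies linear-chain CRF decoding once per entity layer. As a minimal sketch of what a single such decoding pass does, the following pure-Python Viterbi decoder finds the highest-scoring BIO tag path for one sentence. This is an illustration only, not the authors' implementation; the tag set, emission scores, and transition scores are toy values invented for the example.

```python
# Sketch of one linear-chain CRF decoding pass (Viterbi), the building block a
# multi-layer CRF decoder repeats per entity layer. All names and scores below
# are illustrative assumptions, not values from the paper.

def viterbi(emissions, transitions, tags):
    """Return the highest-scoring tag path for one sentence.

    emissions:   list of {tag: score} dicts, one per token
    transitions: {(prev_tag, tag): score}, missing pairs score 0.0
    """
    scores = {t: emissions[0].get(t, 0.0) for t in tags}  # initial step
    backptrs = []
    for emit in emissions[1:]:
        new_scores, bp = {}, {}
        for t in tags:
            # pick the best previous tag for t: path score + transition score
            s, p = max((scores[q] + transitions.get((q, t), 0.0), q) for q in tags)
            new_scores[t] = s + emit.get(t, 0.0)
            bp[t] = p
        backptrs.append(bp)
        scores = new_scores
    # backtrack from the best final tag
    tag = max(scores, key=scores.get)
    path = [tag]
    for bp in reversed(backptrs):
        tag = bp[tag]
        path.append(tag)
    return path[::-1]


# Toy example: three tokens, one hypothetical disease entity.
tags = ["O", "B-DIS", "I-DIS"]
emissions = [
    {"B-DIS": 2.0, "O": 0.5},   # token 1: likely entity start
    {"I-DIS": 2.0, "O": 0.5},   # token 2: likely entity continuation
    {"O": 2.0},                 # token 3: likely outside
]
transitions = {
    ("O", "I-DIS"): -10.0,      # forbid I-DIS without a preceding B-DIS
    ("B-DIS", "I-DIS"): 1.0,    # reward a well-formed B -> I continuation
}
print(viterbi(emissions, transitions, tags))  # ['B-DIS', 'I-DIS', 'O']
```

In a layered decoder, the tag path from one pass would feed the next decoding layer, which is what allows nested (inner and outer) entities to be recovered over the same tokens.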
ISSN: 2199-4536
ISSN: 2198-6053
DOI: 10.1007/s40747-024-01551-8