Improved Word Segmentation System for Chinese Criminal Judgment Documents


Bibliographic Details
Published in: Applied Artificial Intelligence 2024-12, Vol. 38 (1)
Author: Zhang, Chi
Format: Article
Language: English
Online access: Full text
Description
Abstract: In this paper, a system for automatic word segmentation of Chinese criminal judgment documents is proposed. The system uses a hybrid model composed of fine-tuned BERT (Bidirectional Encoder Representations from Transformers), BiLSTM (Bidirectional Long Short-Term Memory), and CRF (Conditional Random Field) for named entity recognition. It also introduces a custom dictionary of professional terms common in Chinese criminal trial documents, as well as a rule system based on regulations governing the judicial system and litigation procedure, to further improve segmentation accuracy. BERT uses a deep bidirectional Transformer encoder to pre-train general language representations from large-scale unlabeled text corpora. BiLSTM uses two LSTM networks, one reading the input sequence forward and one backward, to capture context from both sides. CRF uses a set of features and weights to define a log-linear distribution over the output tag sequence. Experimental results show that the proposed system significantly improves word segmentation accuracy compared with commonly used Chinese word segmentation models: on the test data, the F1 scores for jieba, THULAC, and the proposed system are 85.59%, 87.94%, and 94.82%, respectively.
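The CRF component described in the abstract can be illustrated with a minimal, self-contained sketch (not the authors' implementation): a log-linear score over BMES tag sequences, the tagging scheme commonly used for Chinese word segmentation, decoded with the Viterbi algorithm. The emission scores below stand in for what the BERT+BiLSTM layers would produce; all tag names, scores, and helper names are toy values chosen for illustration.

```python
# Minimal linear-chain CRF decoder over the BMES tag scheme used for
# Chinese word segmentation (B = word begin, M = middle, E = end,
# S = single-character word).
TAGS = ["B", "M", "E", "S"]

def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag path under the log-linear CRF score:
    the sum of per-position emission scores and tag-to-tag transition scores."""
    n = len(emissions)
    score = [{tag: emissions[0][tag] for tag in TAGS}]  # best score ending in tag
    back = []                                           # backpointers per position
    for t in range(1, n):
        score.append({})
        back.append({})
        for cur in TAGS:
            prev = max(TAGS, key=lambda p: score[t - 1][p] + transitions[(p, cur)])
            back[t - 1][cur] = prev
            score[t][cur] = (score[t - 1][prev]
                             + transitions[(prev, cur)]
                             + emissions[t][cur])
    last = max(TAGS, key=lambda tag: score[-1][tag])
    path = [last]
    for pointers in reversed(back):
        path.append(pointers[path[-1]])
    return list(reversed(path))

# Transitions encode the hard BMES constraints (e.g. B may only be
# followed by M or E); invalid pairs get a large negative score.
VALID = {("B", "M"), ("B", "E"), ("M", "M"), ("M", "E"),
         ("E", "B"), ("E", "S"), ("S", "B"), ("S", "S")}
transitions = {(p, c): (0.0 if (p, c) in VALID else -1e9)
               for p in TAGS for c in TAGS}

# Toy emissions for a 3-character sentence: a two-character word
# followed by a single-character word.
emissions = [
    {"B": 2.0, "M": 0.0, "E": 0.0, "S": 1.0},
    {"B": 0.0, "M": 0.0, "E": 2.0, "S": 0.0},
    {"B": 0.0, "M": 0.0, "E": 0.0, "S": 2.0},
]
best_path = viterbi_decode(emissions, transitions)
print(best_path)  # ['B', 'E', 'S']
```

In the full system described above, the emission scores would come from the fine-tuned BERT+BiLSTM encoder, and the transition weights would be learned jointly rather than set by hand; the hard-constraint trick shown here is a common way to guarantee well-formed BMES output.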
ISSN: 0883-9514
eISSN: 1087-6545
DOI: 10.1080/08839514.2023.2297524