TTN‐FCN: A Tangut character classification framework by tree tensor network and fully connected neural network
Published in: IET Image Processing, 2023-11, Vol. 17 (13), pp. 3815-3829
Main authors:
Format: Article
Language: English
Keywords:
Online access: Full text
Abstract: The classification of Tangut characters plays a significant role in Western Xia research, yet it remains a great challenge to recognize Tangut characters accurately because of the low inter-class similarity and small intra-class variation of Tangut character images. The main reason is that Tangut characters possess an extremely intricate structure, despite the emergence of other character recognition methods. For this reason, the authors propose a novel framework for Tangut character classification, named tree tensor network-fully connected neural network (TTN-FCN), in which a TTN is embedded into a fully connected neural network. Firstly, Tangut images are encoded into quantum product states without entanglement during pre-processing. Then the TTN contracts the quantum product states into intermediate low-dimensional quantum states. Finally, the low-dimensional quantum states are fed into the FCN to perform the classification task. The model is evaluated on a Tangut character dataset constructed by scanning Tangut-character-related documents, consisting of 30,293 Tangut character images in 6077 categories. Experimental results show that TTN-FCN converges faster and achieves a classification precision (AC) of 99.98% and a loss of 0.688% with a maximum batch size of 2042, outperforming 30 compared networks. Moreover, the proposed model can also be generalized to other character recognition tasks, which enhances its potential for cultural relic research and development.
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/ipr2.12899
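
The abstract describes a three-stage pipeline: pixel-wise encoding into unentangled product states, a tree tensor network (TTN) that contracts those states down to a low-dimensional representation, and a fully connected network (FCN) that classifies the result. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch, not the authors' implementation: the 32×32 input size, the cos/sin feature map, the bond dimension of 8, and the small FCN head are assumptions chosen only to make the example self-contained (only the 6077-class output matches the dataset described in the abstract).

```python
# Minimal illustrative sketch of a TTN-FCN-style pipeline (not the paper's code).
# Assumptions (hypothetical): 32x32 grayscale inputs, local dimension 2,
# a binary tree of rank-3 tensors with bond dimension 8, and a small FCN head.
import math
import torch
import torch.nn as nn


def product_state_encode(images: torch.Tensor) -> torch.Tensor:
    """Map pixel intensities in [0, 1] to unentangled 2-dim local states.

    Uses the common feature map phi(x) = [cos(pi*x/2), sin(pi*x/2)];
    the paper's exact encoding may differ.
    """
    x = images.flatten(1)  # (batch, n_pixels)
    return torch.stack((torch.cos(math.pi * x / 2),
                        torch.sin(math.pi * x / 2)), dim=-1)  # (batch, n_pixels, 2)


class TTNLayer(nn.Module):
    """One layer of a binary tree: contracts adjacent pairs of site vectors."""

    def __init__(self, n_sites: int, in_dim: int, out_dim: int):
        super().__init__()
        assert n_sites % 2 == 0
        # One 3-index tensor per parent node: (out_dim, in_dim, in_dim).
        self.tensors = nn.Parameter(
            torch.randn(n_sites // 2, out_dim, in_dim, in_dim) / in_dim)

    def forward(self, states: torch.Tensor) -> torch.Tensor:
        # states: (batch, n_sites, in_dim) -> (batch, n_sites // 2, out_dim)
        left, right = states[:, 0::2], states[:, 1::2]
        return torch.einsum('bni,bnj,noij->bno', left, right, self.tensors)


class TTNFCN(nn.Module):
    """Tree tensor network followed by a fully connected classifier head."""

    def __init__(self, n_pixels: int = 1024, bond_dim: int = 8,
                 n_classes: int = 6077):
        super().__init__()
        layers, n, d = [], n_pixels, 2
        while n > 1:
            layers.append(TTNLayer(n, d, bond_dim))
            n, d = n // 2, bond_dim
        self.tree = nn.ModuleList(layers)
        self.fcn = nn.Sequential(nn.Linear(bond_dim, 128), nn.ReLU(),
                                 nn.Linear(128, n_classes))

    def forward(self, images: torch.Tensor) -> torch.Tensor:
        states = product_state_encode(images)  # (batch, n_pixels, 2)
        for layer in self.tree:
            states = layer(states)  # halve the number of sites
            # Re-normalize each site vector to keep the contraction stable.
            states = states / states.norm(dim=-1, keepdim=True).clamp_min(1e-8)
        return self.fcn(states.squeeze(1))  # logits over classes


# Example: classify a batch of 4 hypothetical 32x32 character images.
model = TTNFCN()
logits = model(torch.rand(4, 32, 32))
print(logits.shape)  # torch.Size([4, 6077])
```

Each TTNLayer halves the number of sites, so the tree reduces the 1024 local states to a single bond-dimension-sized vector after log2(1024) = 10 layers, and that vector is what the FCN head consumes for classification.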