Research on Chinese ancient characters image recognition method based on adaptive receptive field
Saved in:
Published in: Soft Computing (Berlin, Germany), 2022-09, Vol. 26 (17), p. 8273-8282
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: As the main carrier of Chinese civilization, ancient Chinese documents are of inestimable importance. In modern society, however, they suffer from the problem of "heavy storage, light use": they are carefully preserved but rarely read. To improve the utilization of their content, and to address the recognition difficulties caused by the complex glyphs and diverse styles of Chinese characters in ancient books, by mismatches between single- and double-column layouts, and by varying character sizes, this paper proposes an adaptive receptive field-based image recognition network for ancient Chinese characters. The network exploits the high sensitivity of residual dense networks to data fluctuation and uses selective convolution kernels to construct its residual dense blocks, providing a dynamic selection mechanism for the receptive field of each image. This allows the network to focus on the fine structure of Chinese characters and to realize cross-channel information interaction and integration. Experimental results show that the proposed network, SKRDN, reaches 93.48% accuracy and 95.37% precision, a significant improvement over other recognition methods, demonstrating that SKRDN is well suited to recognizing ancient Chinese characters.
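The abstract describes the core architectural idea: residual dense blocks built from selective convolution kernels, so that the receptive field is chosen dynamically for each input image. No implementation is given in this record, so the following is a minimal PyTorch sketch of how such a block might be assembled, following the standard selective-kernel (SK) convolution and residual dense block designs the abstract references. All class names, layer widths, and branch choices (SKConv, SKResidualDenseBlock, growth, reduction) are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class SKConv(nn.Module):
    """Selective-kernel convolution (illustrative): two branches with
    different receptive fields, fused by a learned soft attention
    over the branches."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        # Split: a 3x3 branch and a dilated 3x3 branch (5x5-equivalent field).
        self.branch3 = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True))
        self.branch5 = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=2, dilation=2, bias=False),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True))
        # Fuse: global pooling plus a bottleneck FC to a compact descriptor.
        hidden = max(channels // reduction, 8)
        self.fc = nn.Sequential(nn.Linear(channels, hidden), nn.ReLU(inplace=True))
        # Select: one attention head per branch, softmax across branches.
        self.att3 = nn.Linear(hidden, channels)
        self.att5 = nn.Linear(hidden, channels)

    def forward(self, x):
        u3, u5 = self.branch3(x), self.branch5(x)
        s = (u3 + u5).mean(dim=(2, 3))            # global average pooling
        z = self.fc(s)
        att = torch.stack([self.att3(z), self.att5(z)], dim=1)
        att = torch.softmax(att, dim=1)            # soft selection over branches
        a3, a5 = att[:, 0], att[:, 1]
        return u3 * a3[:, :, None, None] + u5 * a5[:, :, None, None]

class SKResidualDenseBlock(nn.Module):
    """Residual dense block whose inner layers are SK convolutions:
    each layer sees all preceding feature maps (dense connectivity),
    followed by 1x1 local feature fusion and a residual connection."""
    def __init__(self, channels, growth=32, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        c = channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(c, growth, 1, bias=False),  # mix concatenated features
                SKConv(growth)))
            c += growth
        self.fusion = nn.Conv2d(c, channels, 1)       # local feature fusion

    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            feats.append(layer(torch.cat(feats, dim=1)))
        return x + self.fusion(torch.cat(feats, dim=1))  # local residual learning
```

As a shape check, `SKResidualDenseBlock(64)(torch.randn(1, 64, 32, 32))` returns a tensor of the same shape, so blocks of this kind can be stacked inside a larger recognition backbone.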
ISSN: 1432-7643, 1433-7479
DOI: 10.1007/s00500-022-07270-x