Dimension reduction based on small sample entropy learning for hand-writing image
Published in: Multimedia Tools and Applications, 2021-05, Vol. 80 (11), pp. 17365-17376
Main authors: , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Since deep learning requires a large number of training samples, which hinders its real-world application, small-sample learning has recently attracted considerable attention. However, even when training with small samples, high-dimensional data still impede the efficiency of general machine learning models. To address this problem, we propose a dimension reduction method based on small sample entropy learning and apply it to handwriting images. An entropy-based index is introduced to measure the importance of each feature. Group entropy and labeled entropies are defined from the distributions of the whole data set and of the labeled data on that feature, respectively, and their estimators are derived separately for the discrete and continuous cases. Finally, the index is approximated and used for dimension reduction. Numerical results on handwriting image data sets verify that many irrelevant dimensions of handwriting images are found and removed: the average computational time of image classification is shortened, while classification accuracy is retained and even slightly improved at some labeled proportions. The one-shot learning case is validated as well. The proposed method is therefore meaningful and has practical application value.
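The entropy-based importance index described in the abstract can be illustrated with a small sketch. On a plain reading, "group entropy" is taken here as the entropy of a feature over all samples and "labeled entropy" as its class-conditional entropy, giving an information-gain-style score; the continuous case is handled by histogram binning with shared bin edges. All function names, the binning choice, and the ranking rule below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def binned_entropy(values, edges):
    # Plug-in Shannon entropy (bits) of a 1-D feature over fixed bin
    # edges; a simple estimator for the continuous case. Discrete
    # features could use exact value counts instead.
    counts, _ = np.histogram(values, bins=edges)
    probs = counts / counts.sum()
    probs = probs[probs > 0]
    return -np.sum(probs * np.log2(probs))

def feature_importance(X, y, bins=10):
    # For each feature: "group entropy" over all samples minus the
    # class-weighted average of per-class entropies ("labeled entropy").
    # Shared bin edges make the two quantities comparable; by concavity
    # of entropy, the score is nonnegative.
    # NOTE: a generic sketch of the idea, not the paper's exact index.
    n_features = X.shape[1]
    scores = np.empty(n_features)
    classes = np.unique(y)
    for j in range(n_features):
        edges = np.histogram_bin_edges(X[:, j], bins=bins)
        group_h = binned_entropy(X[:, j], edges)
        labeled_h = 0.0
        for c in classes:
            mask = (y == c)
            labeled_h += mask.mean() * binned_entropy(X[mask, j], edges)
        scores[j] = group_h - labeled_h
    return scores

def reduce_dimensions(X, y, keep=0.5, bins=10):
    # Keep the top fraction of features ranked by the entropy score.
    scores = feature_importance(X, y, bins)
    k = max(1, int(keep * X.shape[1]))
    idx = np.argsort(scores)[::-1][:k]
    return X[:, idx], idx
```

On toy data where one feature separates the classes and the rest are noise, the informative feature receives the largest score and survives an aggressive reduction, mirroring the abstract's claim that irrelevant dimensions are found and removed.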
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-020-09019-w