Scale-selective and noise-robust extended local binary pattern for texture classification

Bibliographic Details
Published in: Pattern Recognition 2022-12, Vol. 132, p. 108901, Article 108901
Authors: Luo, Qiwu, Su, Jiaojiao, Yang, Chunhua, Silven, Olli, Liu, Li
Format: Article
Language: English
Description
Abstract:
•A novel texture descriptor to address both scale transformation and noise interference.
•An extended LBP with a lightweight feature dimension.
•Maintains both macro and micro descriptive information in the spatial and spectral domains.
•Outperforms thirty classical LBP variants as well as eight typical deep learning methods.
•Experiments on five public databases and one fresh texture database.

As one of the most successful local feature descriptors, the local binary pattern (LBP) estimates the texture distribution of an image from the signs of differences between neighboring pixels, providing intensity and rotation invariance. In this paper, we propose a novel image descriptor that addresses scale transformation and noise interference simultaneously, named the scale-selective and noise-robust extended LBP (SNELBP). First, each image in the training set is transformed into different scale spaces by a Gaussian filter. Second, noise-robust pattern histograms are obtained from each scale space using our previously proposed median robust extended LBP (MRELBP). Then, scale-invariant histograms are determined by selecting the maximum among all scale levels for a given image. Finally, the most informative patterns are selected from a dictionary pretrained by the two-stage compact dominant feature selection method (CDFS), keeping the descriptor lightweight at sufficiently low time cost. Extensive experiments on five public databases (Outex_TC_00011, TC_00012, KTH-TIPS, UMD and NEU) and one fresh texture database (JoJo) under two kinds of interference (Gaussian and salt-and-pepper noise) indicate that our SNELBP yields more competitive results than thirty classical LBP variants as well as eight typical deep learning methods.
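The scale-selection step described in the abstract (building a Gaussian scale space, computing a noise-tolerant LBP histogram per scale, and keeping the per-bin maximum across scales) can be sketched roughly as below. This is only an illustrative sketch, not the authors' implementation: it substitutes scikit-image's plain uniform LBP for the paper's MRELBP, omits the CDFS feature-selection stage, and the function name and sigma values are hypothetical choices.

    # Rough sketch of the scale-selection idea (assumed parameters, not the paper's).
    import numpy as np
    from scipy.ndimage import gaussian_filter
    from skimage.feature import local_binary_pattern

    def scale_selective_lbp_histogram(image, sigmas=(0.5, 1.0, 2.0, 4.0), P=8, R=1):
        """Per-scale uniform-LBP histograms, merged by an element-wise maximum."""
        n_bins = P + 2                      # number of 'uniform' LBP codes for P neighbors
        per_scale = []
        for sigma in sigmas:
            # Build one level of the Gaussian scale space.
            smoothed = gaussian_filter(image.astype(float), sigma=sigma)
            # Stand-in for MRELBP: plain uniform LBP codes on the smoothed image.
            codes = local_binary_pattern(smoothed, P, R, method='uniform')
            hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
            per_scale.append(hist)
        # Scale selection: keep, for each pattern bin, its maximum over all scale levels.
        return np.max(np.stack(per_scale), axis=0)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        img = rng.random((128, 128))        # replace with a real texture patch
        feat = scale_selective_lbp_histogram(img)
        print(feat.shape)                   # (10,) for P=8 uniform patterns

In the paper the per-scale histograms come from MRELBP and are further compressed by CDFS; the max-over-scales merge shown here is the part the abstract itself spells out.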
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2022.108901