A novel bio-inspired texture descriptor based on biodiversity and taxonomic measures


Bibliographic details
Published in: Pattern Recognition, 2022-03, Vol. 123, p. 108382, Article 108382
Main authors: Ataky, Steve Tsham Mpinda; Lameiras Koerich, Alessandro
Format: Article
Language: English
Online access: Full text
Description
Abstract:
• An image as an abstract model of an ecosystem.
• Combination of species diversity, richness, and taxonomic distinctiveness for texture description.
• A texture descriptor invariant to rotation, translation, and scale.
Texture can be defined as the variation of image intensity that forms repetitive patterns, resulting from the physical properties of an object's surface roughness or from differences in reflection on its surface. Since texture forms a system of patterns in a non-deterministic way, biodiversity concepts can help characterize it from an image. This paper proposes a novel approach to quantify such a complex system of diverse patterns through species diversity, richness, and taxonomic distinctiveness. The proposed approach treats each image channel as a species ecosystem and computes species diversity and richness, as well as taxonomic measures, to describe the texture. Furthermore, it exploits the invariance characteristics of ecological patterns to build a permutation-, rotation-, and translation-invariant descriptor. Experimental results on three datasets of natural texture images and two datasets of histopathological images show that the proposed texture descriptor has advantages over several texture descriptors and deep learning methods.
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2021.108382
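
As a rough illustration of the ecosystem analogy described in the abstract, the sketch below treats the gray levels of one image channel as species and their pixel counts as abundances, and computes two of the quantities the authors name: species richness and a Shannon diversity index. This is an illustrative assumption, not the authors' implementation; the published descriptor combines further diversity and taxonomic indices (including taxonomic distinctiveness) that are not reproduced here, and the function names and the choice of the Shannon index are this sketch's own.

import numpy as np

def shannon_diversity(channel):
    """Shannon diversity H' with gray levels as species and pixel counts as abundances."""
    counts = np.bincount(channel.ravel(), minlength=256).astype(float)
    p = counts[counts > 0] / counts.sum()          # relative abundances
    return float(-(p * np.log(p)).sum())

def species_richness(channel):
    """Number of distinct gray levels ("species") present in the channel."""
    return int(np.count_nonzero(np.bincount(channel.ravel(), minlength=256)))

# Toy usage: describe each channel of a random RGB image with the two indices.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
descriptor = [f(image[..., c]) for c in range(3)
              for f in (shannon_diversity, species_richness)]
print(descriptor)   # 6-dimensional feature vector: (H', richness) per channel

Because these indices depend only on each channel's gray-level histogram, their values are unchanged by rotating, translating, or permuting the pixels, which is consistent with the invariance properties claimed in the abstract.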