Consensus rule for wheat cultivar classification on VL, VNIR and SWIR imaging
Saved in:
Published in: IET Image Processing, 2022-09, Vol. 16 (11), pp. 2834-2844
Main authors: , , ,
Format: Article
Language: English
Online access: Full text
Abstract: To facilitate the quality assessment of wheat cultivars, diverse imaging tools and techniques have been applied in order to remove the reliance on expert decisions, which can cause failures in identifying both a wheat cultivar's label and its quality. To minimize the risks posed by expert judgment, a reliable identification framework is required to assess wheat type more effectively. To that end, two methods have been developed by applying traditional and modern feature extraction algorithms to visible light (VL), visible near‐infrared (VNIR) and short‐wave infrared (SWIR) imaging, as well as to a fusion of these imaging systems. The proposed systems are the bag of words (BoW) framework and the convolutional neural network (CNN) framework. For wheat cultivar detection, a consensus rule was established based on the decisions predicted by the CNN and BoW frameworks. The accuracy results obtained with the consensus rule are 99.94% for the CNN framework and 68.94% for the BoW framework. Experimental results suggest that BoW features are not suitable for representing and matching texture patterns such as repeated wheat kernels in an image, whereas CNN features outperform handcrafted features and properties on all datasets.
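The abstract does not spell out how the consensus rule resolves disagreement between the two frameworks. As a hypothetical illustration only (the function name, signature, and fallback policy are assumptions, not taken from the paper), a minimal consensus rule might accept a label when both frameworks agree and otherwise defer to the stronger CNN framework:

```python
def consensus(cnn_label: str, bow_label: str) -> tuple[str, bool]:
    """Combine two classifiers' predictions into one decision.

    Hypothetical sketch: when the CNN and BoW frameworks agree, the
    shared label is accepted; on disagreement, the CNN prediction is
    preferred, since the CNN framework reached 99.94% accuracy versus
    68.94% for the BoW framework. The boolean flags whether the two
    frameworks agreed.
    """
    if cnn_label == bow_label:
        return cnn_label, True   # both frameworks predict the same cultivar
    return cnn_label, False      # disagreement: defer to the CNN prediction


# Usage with made-up cultivar labels:
label, agreed = consensus("cultivar_A", "cultivar_A")  # -> ("cultivar_A", True)
label, agreed = consensus("cultivar_A", "cultivar_B")  # -> ("cultivar_A", False)
```

The agreement flag could also serve as a simple confidence signal, e.g. flagging disagreeing samples for manual review.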
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/ipr2.12206