Survey of feature extraction and classification techniques to identify plant through leaves
Published in: Expert Systems with Applications, 2021-04, Vol. 167, p. 114181, Article 114181
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract:
Highlights:
• Features of leaf images used in image processing.
• Different combinations of features that make plant identification efficient.
• Review and comparison of different feature extraction techniques.
• Review and comparison of different classification techniques.
This paper provides a comprehensive survey of computer vision techniques for the automatic identification of plants from leaf images. The extracted information helps botanists identify different plant species and make use of their medicinal and other properties. With increasing human interference, the number of plant species appears to be declining, and automatic identification can support their conservation. Leaf images may be acquired with a phone camera or with a digital camera mounted on a tripod; the leaves may be covered by dirt, cast in shadow, or partly hidden under other leaves. Real-life applications based on automatic plant identification can successfully identify even similar-looking plant leaves under all such environmental conditions. This paper provides a state-of-the-art review of leaf feature extraction techniques, categorized according to the leaf features they use, along with their pros and cons. We also discuss and compare the various classifiers used in the identification process. The conclusion of the paper outlines areas of improvement and future work.
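To illustrate the kind of pipeline the survey covers, the sketch below extracts a few classic shape descriptors (aspect ratio, rectangularity, circularity) from a leaf image with OpenCV and classifies them with an SVM from scikit-learn. This is a minimal, hypothetical example, not code from the surveyed paper; the feature set, preprocessing, and classifier choice are illustrative assumptions only.

```python
# Minimal, hypothetical sketch of a leaf-identification pipeline:
# shape-feature extraction with OpenCV followed by an SVM classifier.
import cv2
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def leaf_shape_features(gray: np.ndarray) -> np.ndarray:
    """Return [aspect ratio, rectangularity, circularity] of the largest leaf blob."""
    # Otsu thresholding assumes a dark leaf on a bright, plain background.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    leaf = max(contours, key=cv2.contourArea)          # largest contour = the leaf
    area = cv2.contourArea(leaf)
    perimeter = cv2.arcLength(leaf, True)
    _, _, w, h = cv2.boundingRect(leaf)
    aspect_ratio = w / h                               # slenderness of the leaf
    rectangularity = area / (w * h)                    # how much of its box it fills
    circularity = 4.0 * np.pi * area / (perimeter ** 2 + 1e-9)
    return np.array([aspect_ratio, rectangularity, circularity])


# Hypothetical usage: `images` is a list of grayscale leaf photos read with
# cv2.imread(path, cv2.IMREAD_GRAYSCALE) and `labels` the corresponding species names.
# X = np.vstack([leaf_shape_features(img) for img in images])
# clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
# clf.fit(X, labels)
# print(clf.predict(X[:1]))                            # predicted species
```

Real systems surveyed in the paper typically combine several such shape, texture, and color features and compare multiple classifiers; this sketch only shows the overall structure of one such combination.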
ISSN: 0957-4174, 1873-6793
DOI: 10.1016/j.eswa.2020.114181