Applications of Convolutional Neural Network for Classification of Land Cover and Groundwater Potentiality Zones
Published in: Journal of Engineering 2022-01, Vol. 2022, p. 1-8
Author:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: In the field of groundwater engineering, convolutional neural networks (CNNs) have come to play an important role in assessing spatial groundwater potentiality zones and land use/land cover changes based on remote sensing (RS) technology. CNNs offer great potential for extracting complex spatial features at multiple high levels of generalization. However, geometric distortion, fuzzy entity boundaries, and the heavy burden of data preparation are the main constraints affecting the spatial potential of CNN applications for land cover classification. This study aims to assess the proficiency of deep learning algorithms, i.e., CNNs, for the spatial assessment of groundwater potential zones and land cover. Among the groundwater influencing factors, classification of land cover (agriculture, built-up areas, water bodies, forests, and bare land) has been reported by several researchers for different purposes, and they confirmed the capability of CNNs for predicting spatial groundwater potentiality zones such as very high, high, moderate, poor, and very poor areas. In this study, the CNN is recommended as an essential algorithm for identifying groundwater potential zones and classifying land use/land cover change. CNNs also give scholars a better option when only limited data sets are available for validation.
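The abstract describes the CNN pipeline only at a high level. As a minimal illustrative sketch (not code from the article), the core steps it names, extracting spatial features from a raster patch by convolution and mapping them to one of five classes (e.g., very high through very poor), can be written in plain NumPy; the patch, kernel, and weight values below are random stand-ins, not data from the study:

```python
import numpy as np

def conv2d(patch, kernel):
    """Valid 2-D cross-correlation of a single-band raster patch with a kernel."""
    kh, kw = kernel.shape
    ph, pw = patch.shape
    out = np.zeros((ph - kh + 1, pw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(patch[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical example: one 8x8 remote-sensing patch, one 3x3 learned filter,
# and a 5-class linear head (classes could be the five potentiality zones).
rng = np.random.default_rng(0)
patch = rng.random((8, 8))                        # stand-in raster patch
kernel = rng.random((3, 3))                       # stand-in learned filter
features = np.maximum(conv2d(patch, kernel), 0)   # ReLU-activated 6x6 feature map
weights = rng.random((5, features.size))          # stand-in classifier weights
probs = softmax(weights @ features.ravel())       # probability per class
predicted_class = int(np.argmax(probs))
```

A trained CNN stacks many such convolution layers and learns the kernels and weights from labeled imagery; this sketch only makes the single forward step concrete.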
ISSN: 2314-4904, 2314-4912
DOI: 10.1155/2022/6372089