Land use/land cover change classification and prediction using deep learning approaches
Published in: Signal, Image and Video Processing, 2024-02, Vol. 18 (1), pp. 223-232
Main authors: ,
Format: Article
Language: English
Online access: Full text
Abstract: Nowadays, land use and land cover (LULC) change is a major problem for decision-makers and ecologists on account of its impact on natural ecosystems. In this manuscript, LU/LC change classification and prediction using a deep convolutional spiking neural network (DCSNN) and an enhanced Elman spike neural network (EESNN) (LU/LC-DCSNN-EESNN) is proposed. The input images are gathered from IRS Satellite Resourcesat-1 LISS-III with Cartosat-1 digital elevation model (DEM) satellite imagery of the Javadi Hills, Tamil Nadu. The images are then pre-processed with the fast discrete curvelet transform with wrapping (FDCT-WRP) method, which extracts the region of interest (ROI) coordinates from the Javadi Hills satellite image. The DCSNN then classifies the area into forest and non-forest. The classified images are passed to a post-classification step that removes noise and misclassification errors using a Markov chain random field (MCRF) co-simulation approach, and the LU/LC changes are predicted with the EESNN method. Performance metrics, including precision, accuracy, F1 score, error rate, specificity, recall, kappa coefficient and ROC, are analyzed. The proposed LU/LC-DCSNN-EESNN method attains 19.45%, 20.56% and 23.67% higher accuracy, 19.45%, 32.56% and 17.45% higher F-measure, and 16.78%, 22.09% and 32.39% lower error rate compared with the existing methods.
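The metrics named in the abstract are all standard binary-classification measures. The following is a minimal sketch of how they can be computed, not the authors' code: the forest/non-forest labels, predictions, and scores are hypothetical placeholders, and error rate and specificity are derived from the confusion matrix since scikit-learn exposes no direct functions for them.

```python
# Sketch: the evaluation metrics listed in the abstract, on hypothetical data.
# Binary convention assumed here: 1 = forest, 0 = non-forest.
import numpy as np
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, cohen_kappa_score, roc_auc_score,
                             confusion_matrix)

y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])    # ground-truth classes
y_pred = np.array([1, 0, 1, 0, 0, 1, 0, 1, 1, 0])    # predicted classes
y_score = np.array([.9, .2, .8, .4, .1, .7, .3, .6, .95, .05])  # class-1 scores

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
acc = accuracy_score(y_true, y_pred)

print("accuracy   :", acc)
print("error rate :", 1 - acc)                        # error rate = 1 - accuracy
print("precision  :", precision_score(y_true, y_pred))
print("recall     :", recall_score(y_true, y_pred))   # a.k.a. sensitivity
print("specificity:", tn / (tn + fp))                 # true-negative rate
print("F1 score   :", f1_score(y_true, y_pred))
print("kappa      :", cohen_kappa_score(y_true, y_pred))
print("ROC AUC    :", roc_auc_score(y_true, y_score))
```

The kappa coefficient is the conventional companion to overall accuracy in remote-sensing accuracy assessment, which matches its inclusion in the abstract's metric list.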
ISSN: 1863-1703, 1863-1711
DOI: 10.1007/s11760-023-02701-0