Automatic extraction of built-up areas from bare land using Sentinel-2A imagery in El Khroub city, Algeria

Published in: Bulletin de la Société royale des sciences de Liège, 2023, p. 1-22
Main authors: Tabet, Ahmed Amine; Abdaoui, Gihen Rym; Layeb, Hafid
Format: Article
Language: French
Online access: Full text
Description
Summary: In this research work, the separation of built-up areas from bare land in El Khroub city is carried out using a supervised classification approach involving several indices and combinations of spectral bands from the Sentinel-2A sensor. The multi-index approach combines seven indices in order to discriminate between the three main categories of land cover: water bodies, green areas, and buildings. First, this operation requires the NDVI, BAEI, NDBI, NDTI, BUI, MNDWI and NDVIre indices, which have a strong capacity to discriminate built-up areas from the other land-cover features. The neo-images obtained by combining these indices are then classified with the maximum likelihood algorithm to extract the six land-cover classes (built-up areas, bare land, vegetation, forest, water bodies, and asphalt). The multi-index obtained from the combination of BUI, NDTI and NDVIre is the most effective, as shown by the evaluation values: the overall accuracy is 96.44%, the Kappa coefficient (K) is 95.72%, and the user accuracy for the built-up class is 100%, with a commission error rate of zero. Therefore, the multi-index (BUI, NDTI, NDVIre) is retained for built-up area extraction due to its superior discrimination capability.
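As a rough illustration of the index computations the abstract describes, the sketch below evaluates several of the named indices from Sentinel-2 band reflectances using their commonly published normalized-difference formulas (NDVI, NDBI, NDTI, MNDWI, NDVIre, and BUI as NDBI minus NDVI). The band arrays are hypothetical placeholder values, and the exact formulas and band choices used by the authors may differ; this is an assumption-laden sketch, not the paper's method.

```python
import numpy as np

# Hypothetical Sentinel-2A surface reflectances in [0, 1] for two pixels.
# Common band mapping assumed: B3 = green, B4 = red, B5 = red edge,
# B8 = NIR, B11 = SWIR1, B12 = SWIR2.
green   = np.array([0.10, 0.08])
red     = np.array([0.12, 0.06])
rededge = np.array([0.15, 0.20])
nir     = np.array([0.18, 0.40])
swir1   = np.array([0.25, 0.15])
swir2   = np.array([0.20, 0.10])

def nd(a, b):
    """Normalized difference (a - b) / (a + b)."""
    return (a - b) / (a + b)

ndvi   = nd(nir, red)        # vegetation vigor
ndvire = nd(nir, rededge)    # red-edge variant of NDVI
ndbi   = nd(swir1, nir)      # built-up surfaces
ndti   = nd(swir1, swir2)    # tillage / bare-soil contrast
mndwi  = nd(green, swir1)    # open water
bui    = ndbi - ndvi         # built-up index as NDBI - NDVI

# A multi-index composite like (BUI, NDTI, NDVIre) is simply the
# per-pixel stack of three index layers, ready for classification.
composite = np.stack([bui, ndti, ndvire], axis=-1)
print(composite.shape)  # (2, 3): two pixels, three index channels
```

In practice each band would be a full raster array read from the Sentinel-2 product, and the stacked composite would be fed to a supervised classifier; the arithmetic per pixel is exactly as above.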
ISSN:0037-9565
1783-5720
DOI:10.25518/0037-9565.11175