Semantic segmentation of satellite images with different building types using deep learning methods


Bibliographic Details
Published in: Remote sensing applications 2024-04, Vol.34, p.101176, Article 101176
Authors: Amirgan, Burcu, Erener, Arzu
Format: Article
Language: English
Keywords:
Online access: Full text
Description
Abstract: In this study, deep learning-based semantic segmentation methods were used to carry out automatic building segmentation on a remote sensing image of a sample area covering a small part of Istanbul. First, fully convolutional networks, semantic segmentation inference principles, and open-source building datasets available to the public were examined. Within the scope of the study, the IST building dataset, containing examples from 5 different building-type classes, was created using very-high-resolution Pleiades satellite images. Building segmentation training was then carried out on the UNet and UNet++ architectures with this dataset. Segmentation success was compared between the trained models and across the building classes according to their types. Experimental results showed that the UNet and UNet++ architectures achieved IoU scores of 0.9167 and 0.9150 for the Industrial class, 0.8124 and 0.8175 for the Adjacent class, 0.8459 and 0.8446 for the Housing-Villa class, 0.7629 and 0.7477 for the Slum class, and 0.6697 and 0.6140 for the Other class, respectively. Finally, building segmentation difficulties arising from the types of buildings were identified, and suggestions were made to overcome these problems.
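The segmentation scores above are IoU (Intersection over Union) values per building class. The article itself does not include code; the following is a minimal sketch of how per-class IoU is typically computed for binary segmentation masks, assuming NumPy arrays where 1 marks building pixels of a given class (the function name and example masks are illustrative, not from the paper):

```python
import numpy as np

def iou(pred: np.ndarray, target: np.ndarray) -> float:
    """Intersection over Union between two binary masks.

    Both inputs are arrays of 0/1 (or bool) of the same shape.
    Returns 1.0 when both masks are empty (nothing to segment,
    nothing predicted), a common convention.
    """
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return float(intersection) / float(union) if union > 0 else 1.0

# Illustrative example: 4x4 masks where prediction and ground truth
# overlap on 3 pixels, with 5 pixels in their union -> IoU = 3/5 = 0.6
pred = np.zeros((4, 4), dtype=np.uint8)
target = np.zeros((4, 4), dtype=np.uint8)
pred[0, 0:4] = 1              # predicted building pixels
target[0, 1:4] = 1            # ground-truth building pixels ...
target[1, 1] = 1              # ... plus one pixel the model missed
print(round(iou(pred, target), 4))  # → 0.6
```

In a multi-class setting such as the five building types here, this computation is repeated once per class, treating that class's pixels as foreground and everything else as background, which yields the per-class scores reported in the abstract.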
ISSN: 2352-9385
DOI: 10.1016/j.rsase.2024.101176