DEEP BUILDING FOOTPRINT EXTRACTION FOR URBAN RISK ASSESSMENT – REMOTE SENSING AND DEEP LEARNING BASED APPROACH



Bibliographic Details
Published in: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 2022-12, Vol. XLVIII-4/W3-2022, p. 83-86
Authors: Mharzi Alaoui, H., Radoine, H., Chenal, J., Hajji, H., Yakubu, H.
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Mapping building footprints can play a crucial role in urban dynamics monitoring, risk assessment and disaster management. Freely available building footprints, such as those from OpenStreetMap, provide manually annotated building footprint information for some urban areas; however, they frequently do not cover urban areas entirely in many parts of the world and are not always available. The considerable potential for meaningful ground-information extraction from high-resolution remote sensing imagery makes it an alternative and reliable data source for building footprint generation. The aim of this study is therefore to explore the use of satellite imagery and state-of-the-art deep learning tools to fully automate building footprint extraction. To better understand the usability and generalization ability of these approaches, the study presents a comparative analysis of the performance and characteristics of two recent deep learning models, U-Net and Attention U-Net, for building footprint generation.
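As a rough illustration of the model family compared in the abstract, the following is a minimal U-Net-style encoder-decoder sketch in PyTorch for per-pixel building segmentation. It is not the authors' implementation: the TinyUNet name, layer widths, depth, and 256x256 RGB tile size are assumptions chosen only to keep the example short and runnable.

```python
# Minimal U-Net-style sketch for binary building-footprint segmentation.
# Illustrative only; widths, depth, and tile size are assumed, not from the paper.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self, in_ch=3, base=32):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)        # encoder level 1
        self.enc2 = conv_block(base, base * 2)     # encoder level 2
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(base * 2, base * 4)
        self.up2 = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2)
        self.dec2 = conv_block(base * 4, base * 2) # decoder level 2
        self.up1 = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = conv_block(base * 2, base)     # decoder level 1
        self.head = nn.Conv2d(base, 1, 1)          # 1-channel mask logits

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        # Skip connections: concatenate encoder features into the decoder.
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)

model = TinyUNet()
tile = torch.randn(1, 3, 256, 256)     # one RGB satellite tile
mask = torch.sigmoid(model(tile))      # (1, 1, 256, 256) building probabilities
```

Attention U-Net, the second model compared, differs chiefly in placing attention gates on the skip connections, so encoder features are reweighted before being concatenated into the decoder.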
ISSN: 2194-9034; 1682-1750
DOI: 10.5194/isprs-archives-XLVIII-4-W3-2022-83-2022