A Novel Impervious Surface Extraction Method Integrating POI, Vehicle Trajectories, and Satellite Imagery

Bibliographic Details
Published in: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2021, Vol. 14, pp. 8804-8814
Main Authors: Wan, Yiliang; Fei, Yuwen; Wu, Tao; Jin, Rui; Xiao, Tong
Format: Article
Language: English
Online Access: Full text
Description
Summary: Impervious surfaces are essential elements of the urban ecological environment. Machine-learning-based approaches have achieved breakthroughs in impervious surface extraction, but these methods require large sets of labeled impervious surface data to train a model, and acquiring such massive sample data is challenging because of its complexity, time consumption, and high cost. To address this issue, we explore a method to generate massive impervious surface training samples using point of interest (POI) data and vehicle trajectory global positioning system (GPS) data. Furthermore, a neural-network-based method is proposed for impervious surface extraction based on the generated training samples. One Landsat-8 image of Shenzhen City, China, was selected to test our approach. The extraction accuracy of the impervious surface was 90.88%, and the overall accuracy of this method was improved by 8.57% and 8.45% compared with the support vector data description and weighted one-class support vector machine methods, respectively. The results show that a method integrating POI, trajectory data, and satellite imagery can be a viable candidate for impervious surface extraction.
ISSN: 1939-1404, 2151-1535
DOI: 10.1109/JSTARS.2021.3103785
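
The abstract outlines a three-step pipeline: derive impervious training labels from the density of POI and vehicle-trajectory GPS points, assemble samples from the Landsat-8 band values at those pixels, and train a neural network to classify the whole scene. The Python sketch below is a minimal, hypothetical illustration of that idea using synthetic stand-in arrays; the density threshold, band count, network size, and all variable names are assumptions for illustration, not the authors' implementation.

# Hypothetical sketch: generate impervious-surface training labels from POI and
# trajectory GPS point density, then train a small neural network on band values.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Stand-ins for the real inputs (assumed shapes, not real data).
H, W, BANDS = 200, 200, 6                  # image size and number of Landsat-8 bands
image = rng.random((H, W, BANDS))          # reflectance cube (placeholder values)
poi_xy = rng.integers(0, W, (5_000, 2))    # POI locations in pixel coordinates
gps_xy = rng.integers(0, W, (50_000, 2))   # trajectory GPS fixes in pixel coordinates

# Step 1: count human-activity evidence per pixel and derive candidate labels.
density = np.zeros((H, W))
for px, py in np.vstack([poi_xy, gps_xy]):
    density[py, px] += 1
impervious_mask = density >= 3             # assumed density threshold
pervious_mask = density == 0               # pixels with no POI/trajectory evidence

# Step 2: assemble training samples from the spectral bands at labeled pixels.
X = np.vstack([image[impervious_mask], image[pervious_mask]])
y = np.concatenate([np.ones(impervious_mask.sum()), np.zeros(pervious_mask.sum())])

# Step 3: train a small neural network and classify every pixel of the image.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=300, random_state=0)
clf.fit(X, y)
impervious_map = clf.predict(image.reshape(-1, BANDS)).reshape(H, W)
print("Predicted impervious fraction:", impervious_map.mean())

On real data, the reflectance cube would come from the Landsat-8 scene and the point sets from the POI and trajectory datasets, with the labeling threshold tuned to the actual point densities.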