Combining deep learning and crowd-sourcing images to predict housing quality in rural China

Published in: Scientific Reports, 2022-11, Vol. 12 (1), p. 19558, Article 19558
Authors: Xu, Weipan; Gu, Yu; Chen, Yifan; Wang, Yongtian; Chen, Luan; Deng, Weihuan; Li, Xun
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Housing quality is essential to human well-being, security, and health. Monitoring housing quality is crucial for revealing socioeconomic development status and informing policy proposals. However, large-scale, fine-grained assessments of nationwide housing quality are exceedingly rare in remote rural areas owing to the high cost of canonical survey methods. Taking rural China as an example, we collect massive numbers of rural house images, have volunteers assess their housing quality, and build a deep learning model on the assessed images to automatically predict quality for large volumes of raw house images. The model achieves a high R² of 0.76. The housing quality of 10,000 Chinese villages is then estimated from 50,000 unlabeled geo-images, revealing pronounced spatial heterogeneity: divided by the Qinling Mountains-Huaihe River Line, housing quality in southern China is much better than in northern China. Our method provides high-resolution predictions of housing quality across extensive rural areas and could serve as a complementary tool for automatically monitoring housing change and supporting housing-related policymaking.
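This record does not include the authors' code; the following is a minimal, hypothetical sketch of the kind of pipeline the abstract describes: a convolutional regressor fine-tuned on volunteer-assessed house photos and evaluated with R². The backbone choice (ResNet-18), the 0-5 scoring scale, and all hyperparameters are illustrative assumptions, not details taken from the paper.

# Minimal sketch of an image-to-score regressor, assuming PyTorch/torchvision.
# NOT the authors' released code; architecture and scoring scale are assumed.
import torch
import torch.nn as nn
from torchvision import models

def build_quality_regressor() -> nn.Module:
    """CNN regressor: ImageNet-pretrained backbone with one scalar output."""
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # downloads pretrained weights
    backbone.fc = nn.Linear(backbone.fc.in_features, 1)  # housing-quality score
    return backbone

def r2_score(y_true: torch.Tensor, y_pred: torch.Tensor) -> float:
    """Coefficient of determination, the metric reported in the abstract (R² = 0.76)."""
    ss_res = torch.sum((y_true - y_pred) ** 2)
    ss_tot = torch.sum((y_true - y_true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

if __name__ == "__main__":
    model = build_quality_regressor()
    # Stand-in batch: real inputs would be crowd-sourced rural house photos
    # resized to 224x224, with volunteer-assessed quality scores as targets.
    images = torch.randn(8, 3, 224, 224)
    scores = torch.rand(8, 1) * 5.0          # hypothetical 0-5 quality scale
    criterion = nn.MSELoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    model.train()
    preds = model(images)
    loss = criterion(preds, scores)
    loss.backward()
    optimizer.step()
    print("batch MSE:", loss.item(), "batch R²:", r2_score(scores, preds.detach()))

In practice, the fitted model's per-image scores would be averaged over the roughly 50,000 unlabeled geo-images to obtain village-level estimates, which is how the abstract describes mapping housing quality across 10,000 villages.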
ISSN: 2045-2322
DOI: 10.1038/s41598-022-23679-8