Exploring spatiotemporal changes in cities and villages through remote sensing using multibranch networks
Saved in:
Published in: Heritage Science, 2021-09, Vol. 9 (1), p. 1-15, Article 120
Main authors: ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: With the rapid development of the social economy, monumental changes have taken place in urban and rural environments. Urban and rural areas play a vital role in the interactions between humans and society. Traditional machine learning methods are used to perceive these large-scale changes in urban and rural areas, but they easily overlook detailed information about changes to the intentional target, so their perception accuracy needs to be improved. Therefore, based on a deep neural network, this paper proposes a method to perceive the spatiotemporal changes in urban and rural intentional connotations from the perspective of remote sensing. The framework first uses a multibranch DenseNet to model the multiscale spatiotemporal information of the intentional target and realizes the interaction between high-level semantics and low-level details of the physical appearance. Second, a multibranch, cross-channel attention module is designed to refine and converge multilevel, multiscale temporal and spatial semantics, so that subtle changes in urban and rural intentional targets are perceived through both semantics and physical appearance. Finally, the experimental results show that the proposed multibranch perception framework achieves the best performance on the two baseline datasets A and B, with F-scores of 88.04% and 53.72%, respectively.
ISSN: 2050-7445
DOI: 10.1186/s40494-021-00595-0
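The abstract describes a bi-temporal change-detection architecture: a shared multibranch DenseNet encoder extracts multiscale spatiotemporal features from two image dates, a cross-channel attention module refines the concatenated features at each scale, and the scales are fused into a pixel-wise change map. The PyTorch sketch below illustrates one plausible reading of that design; it is not the authors' released code, and the DenseNet-121 tap points, channel widths, squeeze-and-excitation-style gate, and all module names are assumptions introduced for illustration.

```python
# Hypothetical reconstruction of the multibranch change-detection pipeline
# sketched in the abstract; every design choice below is an assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import densenet121


class MultiScaleDenseEncoder(nn.Module):
    """Shared DenseNet-121 trunk tapped at three dense blocks (1/4, 1/8, 1/16 res)."""
    TAPS = {"denseblock1": 256, "denseblock2": 512, "denseblock3": 1024}

    def __init__(self):
        super().__init__()
        self.features = densenet121(weights=None).features

    def forward(self, x):
        feats = []
        for name, layer in self.features.named_children():
            x = layer(x)
            if name in self.TAPS:
                feats.append(x)
            if name == "denseblock3":        # deeper layers are not needed here
                break
        return feats                         # feature maps with 256, 512, 1024 channels


class CrossChannelAttention(nn.Module):
    """Squeeze-and-excitation-style gate over the concatenated bi-temporal channels."""

    def __init__(self, channels, reduction=8):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)              # channel-wise re-weighting


class MultiBranchChangeNet(nn.Module):
    """Bi-temporal change detector: shared encoder + per-scale attention + fusion head."""

    def __init__(self, fused_channels=64):
        super().__init__()
        self.encoder = MultiScaleDenseEncoder()
        widths = list(MultiScaleDenseEncoder.TAPS.values())
        self.attn = nn.ModuleList(CrossChannelAttention(2 * c) for c in widths)
        self.proj = nn.ModuleList(nn.Conv2d(2 * c, fused_channels, 1) for c in widths)
        self.head = nn.Sequential(
            nn.Conv2d(fused_channels * len(widths), fused_channels, 3, padding=1),
            nn.BatchNorm2d(fused_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(fused_channels, 1, 1),
        )

    def forward(self, img_t1, img_t2):
        f1, f2 = self.encoder(img_t1), self.encoder(img_t2)
        fused = []
        for attn, proj, x1, x2 in zip(self.attn, self.proj, f1, f2):
            x = proj(attn(torch.cat([x1, x2], dim=1)))     # attend, then project
            fused.append(F.interpolate(x, size=f1[0].shape[-2:],
                                       mode="bilinear", align_corners=False))
        logits = self.head(torch.cat(fused, dim=1))        # logits at 1/4 resolution
        return F.interpolate(logits, scale_factor=4,
                             mode="bilinear", align_corners=False)


if __name__ == "__main__":
    net = MultiBranchChangeNet()
    t1 = torch.randn(1, 3, 256, 256)         # co-registered image at time 1
    t2 = torch.randn(1, 3, 256, 256)         # co-registered image at time 2
    print(net(t1, t2).shape)                 # torch.Size([1, 1, 256, 256])
```

Concatenating the two dates before the channel gate is one simple way to let the attention weights depend on both time steps, which loosely mirrors the abstract's claim that subtle changes are perceived through semantics and physical appearance jointly.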