GeoSparseNet: A Multi-Source Geometry-Aware CNN for Urban Scene Analysis
Saved in:
| Published in: | Remote Sensing (Basel, Switzerland), 2024-06, Vol. 16 (11), p. 1827 |
|---|---|
| Main authors: | , , , , , , , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Full text |
| Abstract: | Convolutional neural networks (CNNs) that perform geometric learning on large-scale urban 3D meshes are indispensable because such meshes have substantial, complex, and deformed shapes. To address this challenge, we propose a novel Geometry-Aware Multi-Source Sparse-Attention CNN (GeoSparseNet) for the large-scale urban triangular mesh classification task. GeoSparseNet leverages the non-uniformity of 3D meshes to represent both broad flat areas and finely detailed features by adopting multi-scale convolutional kernels. Operating on mesh edges to prepare features for subsequent convolutions, our method exploits the inherent geodesic connectivity and uses Large Kernel Attention (LKA) based pooling and unpooling layers to preserve the shape topology for accurate classification. By learning which edges of a mesh face to collapse, GeoSparseNet establishes a task-oriented process in which the network highlights and enhances crucial features while eliminating unnecessary ones. Our approach significantly outperforms previous methods by directly processing extensive 3D mesh data, producing more discriminative feature maps. We achieved an accuracy of 87.5% on a large-scale urban model dataset of the Australian city of Adelaide. |
| ISSN: | 2072-4292 |
| DOI: | 10.3390/rs16111827 |
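
The abstract describes Large Kernel Attention (LKA) based pooling and unpooling over mesh-edge features. As an illustration only, the sketch below shows one way an LKA-style reweighting could be applied to per-edge features; it assumes PyTorch, a 1D adaptation of LKA, and hypothetical names (`EdgeLKA`) and kernel sizes, and is not the authors' implementation.

```python
# Hypothetical sketch (not the paper's released code): a 1D Large Kernel
# Attention (LKA) block applied to per-edge features of a triangular mesh.
import torch
import torch.nn as nn


class EdgeLKA(nn.Module):
    """LKA-style attention over a sequence of mesh-edge features.

    Input:  (batch, channels, num_edges) edge-feature tensor.
    Output: same shape, reweighted by a large-receptive-field attention map.
    """

    def __init__(self, channels: int):
        super().__init__()
        # Depthwise conv with a small kernel captures local edge neighbourhoods.
        self.dw_conv = nn.Conv1d(channels, channels, kernel_size=5,
                                 padding=2, groups=channels)
        # Depthwise dilated conv enlarges the receptive field cheaply.
        self.dw_dilated = nn.Conv1d(channels, channels, kernel_size=7,
                                    padding=9, dilation=3, groups=channels)
        # Pointwise conv mixes channels to form the attention map.
        self.pw_conv = nn.Conv1d(channels, channels, kernel_size=1)

    def forward(self, edge_feats: torch.Tensor) -> torch.Tensor:
        attn = self.pw_conv(self.dw_dilated(self.dw_conv(edge_feats)))
        return edge_feats * attn  # element-wise reweighting of edge features


if __name__ == "__main__":
    # Toy usage: 1 mesh, 64-dimensional features on 1,000 edges.
    feats = torch.randn(1, 64, 1000)
    lka = EdgeLKA(64)
    print(lka(feats).shape)  # torch.Size([1, 64, 1000])
```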