LCPFormer: Towards Effective 3D Point Cloud Analysis via Local Context Propagation in Transformers
Main authors: | , , , |
---|---|
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | The transformer, with its underlying attention mechanism and its
ability to capture long-range dependencies, is a natural choice for unordered
point cloud data. However, the isolated local regions produced by the common
sampling architecture corrupt the structural information of instances, and the
inherent relationships between adjacent local regions remain unexplored, even
though local structural information is crucial in a transformer-based 3D point
cloud model. Therefore, in this paper, we propose a novel module named Local
Context Propagation (LCP) to exploit message passing between neighboring local
regions and make their representations more informative and discriminative.
More specifically, we use the overlap points of adjacent local regions (which
are statistically shown to be prevalent) as intermediaries, then re-weight the
features of these shared points from different local regions before passing
them to the next layers. Inserting the LCP module between two transformer
layers results in a significant improvement in network expressiveness. Finally,
we design a flexible LCPFormer architecture equipped with the LCP module. The
proposed method is applicable to different tasks and outperforms various
transformer-based methods on benchmarks including 3D shape classification and
dense prediction tasks such as 3D object detection and semantic segmentation.
Code will be released for reproduction. |
DOI: | 10.48550/arxiv.2210.12755 |
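
The record contains no code, but the mechanism the abstract describes, fusing the features that overlapping local regions assign to the same point, can be sketched in a few lines. The following is a minimal illustration under assumptions of my own, not the authors' released implementation: the function name `local_context_propagation`, the `(regions, points_per_region, channels)` tensor layout, and the feature-norm score standing in for the paper's learned re-weighting are all hypothetical.

```python
import torch

def local_context_propagation(region_feats, point_idx, num_points):
    """Fuse per-region features of points shared by overlapping local regions.

    region_feats: (R, K, C) features of the K sampled points in each of R regions
    point_idx:    (R, K) long tensor with the global index of each sampled point;
                  regions overlap, so an index can appear in several regions
    Returns (num_points, C) per-point features, where each shared point's
    occurrences are softmax re-weighted across the regions containing it.
    """
    R, K, C = region_feats.shape
    flat_idx = point_idx.reshape(-1)            # (R*K,)
    flat_feat = region_feats.reshape(-1, C)     # (R*K, C)

    # Stand-in score per occurrence; the paper would use a learned weighting.
    scores = flat_feat.norm(dim=-1)             # (R*K,)

    # Segment softmax: normalize scores over all occurrences of the same point.
    max_per_point = torch.full((num_points,), float('-inf')).scatter_reduce(
        0, flat_idx, scores, reduce='amax')     # stable softmax via per-point max
    exp = torch.exp(scores - max_per_point[flat_idx])
    denom = torch.zeros(num_points).scatter_add(0, flat_idx, exp)
    w = exp / denom[flat_idx]                   # (R*K,) weights summing to 1 per point

    # Weighted sum of each point's occurrences back into a global feature map.
    fused = torch.zeros(num_points, C).scatter_add(
        0, flat_idx.unsqueeze(-1).expand(-1, C), flat_feat * w.unsqueeze(-1))
    return fused

# Toy usage: 4 regions of 3 points drawn (with overlap) from 8 points.
feats = torch.randn(4, 3, 16)
idx = torch.randint(0, 8, (4, 3))
out = local_context_propagation(feats, idx, num_points=8)
print(out.shape)  # torch.Size([8, 16])
```

In this sketch the re-weighting is what lets adjacent regions exchange information: a point seen by several regions carries a blend of their features into the next transformer layer. The authors' actual module (per the abstract, to be released) sits between two transformer layers and presumably learns the weighting rather than deriving it from feature norms.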