Pseudo-labeling for Scalable 3D Object Detection
Main authors: 
Format: Article
Language: eng
Subjects: 
Online access: Order full text
Abstract: To safely deploy autonomous vehicles, onboard perception systems must work
reliably at high accuracy across a diverse set of environments and geographies.
One of the most common techniques to improve the efficacy of such systems in
new domains involves collecting large labeled datasets, but such datasets can
be extremely costly to obtain, especially if each new deployment geography
requires additional data with expensive 3D bounding box annotations. We
demonstrate that pseudo-labeling for 3D object detection is an effective way to
exploit less expensive and more widely available unlabeled data, and can lead
to performance gains across various architectures, data augmentation
strategies, and sizes of the labeled dataset. Overall, we show that better
teacher models lead to better student models, and that we can distill expensive
teachers into efficient, simple students.
Specifically, we demonstrate that pseudo-label-trained student models can
outperform supervised models trained on 3-10 times the amount of labeled
examples. Using PointPillars [24], a two-year-old architecture, as our student
model, we are able to achieve state of the art accuracy simply by leveraging
large quantities of pseudo-labeled data. Lastly, we show that these student
models generalize better than supervised models to a new domain in which we
only have unlabeled data, making pseudo-label training an effective form of
unsupervised domain adaptation.
DOI: 10.48550/arxiv.2103.02093
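
The abstract describes a teacher-student pseudo-label pipeline: a strong teacher is trained on the labeled set, it labels a large unlabeled pool, and a simpler student (e.g., PointPillars) is trained on the combined data. The sketch below illustrates that pattern on toy scikit-learn classifiers rather than 3D detectors; the model choices, the 0.9 confidence threshold, and all variable names are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of the pseudo-label training pattern described in the
# abstract, using toy classifiers in place of 3D object detectors.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Toy data: a small labeled set and a large unlabeled pool.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_lab, X_unlab, y_lab, _ = train_test_split(X, y, train_size=500, random_state=0)

# 1. Train an expensive "teacher" on the labeled data.
teacher = RandomForestClassifier(n_estimators=300, random_state=0)
teacher.fit(X_lab, y_lab)

# 2. Pseudo-label the unlabeled pool, keeping only confident predictions
#    (a detection pipeline would filter predicted boxes by score instead).
probs = teacher.predict_proba(X_unlab)
confident = probs.max(axis=1) >= 0.9  # assumed confidence threshold
X_pseudo = X_unlab[confident]
y_pseudo = probs[confident].argmax(axis=1)

# 3. Train a cheaper, simpler "student" on labeled + pseudo-labeled data.
student = LogisticRegression(max_iter=1000)
student.fit(np.vstack([X_lab, X_pseudo]), np.concatenate([y_lab, y_pseudo]))
```

In the paper's setting, the teacher would be a large LiDAR detector, the pseudo-labels would be score-filtered 3D bounding boxes rather than class predictions, and the student would be an efficient architecture such as PointPillars.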