BEVSeg2TP: Surround View Camera Bird's-Eye-View Based Joint Vehicle Segmentation and Ego Vehicle Trajectory Prediction
Format: Article
Language: English
Abstract: Trajectory prediction is, naturally, a key task for vehicle autonomy. While the number of traffic rules is limited, the combinations and uncertainties associated with each agent's behaviour in real-world scenarios are nearly impossible to encode. Consequently, there is growing interest in learning-based trajectory prediction. The method proposed in this paper predicts trajectories by treating perception and trajectory prediction as a unified system, and we show that this joint treatment also has the potential to improve perception performance. To achieve these goals, we present BEVSeg2TP - a surround-view camera bird's-eye-view-based joint vehicle segmentation and ego vehicle trajectory prediction system for autonomous vehicles. The proposed system uses a network trained on multiple camera views. The images are transformed using several deep learning techniques to perform semantic segmentation of objects, including other vehicles, in the scene. The segmentation outputs are fused across the camera views to obtain a comprehensive representation of the surrounding vehicles from the bird's-eye-view perspective. The system then predicts the future trajectory of the ego vehicle using a spatiotemporal probabilistic network (STPN), which leverages information from the encoder-decoder transformers and the joint vehicle segmentation.
DOI: 10.48550/arxiv.2312.13081
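
The abstract describes a two-headed pipeline: surround-view camera features are fused into a bird's-eye-view grid, from which one head produces a vehicle segmentation map and another predicts ego-vehicle waypoints. The following is a minimal PyTorch sketch of that overall structure only; the class name `BEVSegTrajectorySketch`, the placeholder CNN encoder, the mean-pooled view fusion, the grid size, and the waypoint horizon are all illustrative assumptions and do not reproduce the paper's actual camera-to-BEV transform, encoder-decoder transformers, or STPN.

```python
import torch
import torch.nn as nn


class BEVSegTrajectorySketch(nn.Module):
    """Illustrative pipeline: surround-view images -> fused BEV grid
    -> (vehicle segmentation logits, ego trajectory waypoints).

    Hypothetical layer sizes and naive mean-pooled view fusion; not the
    paper's camera-to-BEV transform, transformers, or STPN.
    """

    def __init__(self, bev_size=200, seg_classes=2, horizon=12):
        super().__init__()
        # Shared per-camera image encoder (placeholder CNN).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((bev_size // 4, bev_size // 4)),
        )
        # Decode the fused features to a full-resolution BEV grid.
        self.bev_decoder = nn.Sequential(
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
        )
        # Segmentation head: per-cell class logits over the BEV grid.
        self.seg_head = nn.Conv2d(64, seg_classes, 1)
        # Trajectory head: pooled BEV features -> future (x, y) ego waypoints.
        self.traj_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, horizon * 2),
        )
        self.horizon = horizon

    def forward(self, images):
        # images: (batch, num_cameras, 3, H, W) surround-view frames.
        b, n, c, h, w = images.shape
        feats = self.encoder(images.view(b * n, c, h, w))
        # Fuse camera views by averaging their feature maps (a naive
        # stand-in for a learned cross-view fusion).
        feats = feats.view(b, n, *feats.shape[1:]).mean(dim=1)
        bev = self.bev_decoder(feats)
        seg_logits = self.seg_head(bev)                           # (b, classes, bev, bev)
        waypoints = self.traj_head(bev).view(b, self.horizon, 2)  # (b, horizon, 2)
        return seg_logits, waypoints


if __name__ == "__main__":
    model = BEVSegTrajectorySketch()
    imgs = torch.randn(1, 6, 3, 224, 400)  # one sample, six surround-view cameras
    seg, traj = model(imgs)
    print(seg.shape, traj.shape)  # (1, 2, 200, 200) and (1, 12, 2)
```

The sketch only illustrates how the two tasks can share one fused BEV representation, which is the coupling the abstract argues can benefit perception; the actual fusion, temporal modelling, and probabilistic trajectory decoding are described in the paper itself.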