A-BDD: Leveraging Data Augmentations for Safe Autonomous Driving in Adverse Weather and Lighting
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Summary: High-autonomy vehicle functions rely on machine learning (ML) algorithms to understand the environment. Despite displaying remarkable performance in fair weather scenarios, perception algorithms are heavily affected by adverse weather and lighting conditions. To overcome these difficulties, ML engineers mainly rely on comprehensive real-world datasets. However, the difficulties in real-world data collection for critical areas of the operational design domain (ODD) often mean that synthetic data is required for perception training and safety validation. Thus, we present A-BDD, a large set of over 60,000 synthetically augmented images based on BDD100K that are equipped with semantic segmentation and bounding box annotations (inherited from the BDD100K dataset). The dataset contains augmented data for rain, fog, overcast and sunglare/shadow with varying intensity levels. We further introduce novel strategies utilizing feature-based image quality metrics like FID and CMMD, which help identify useful augmented and real-world data for ML training and testing. By conducting experiments on A-BDD, we provide evidence that data augmentations can play a pivotal role in closing performance gaps in adverse weather and lighting conditions.
DOI: 10.48550/arxiv.2408.06071
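
To make the feature-based selection idea from the abstract concrete, below is a minimal Python sketch that ranks augmented image batches by their FID against a reference set of real adverse-weather frames. It is not code from the A-BDD paper: the tensors, batch sizes, and the "lower FID is more useful" heuristic are illustrative assumptions, and it relies on the `FrechetInceptionDistance` metric from `torchmetrics` (which in turn needs the `torch-fidelity` backend installed).

```python
# Hypothetical sketch, not the A-BDD authors' method: rank augmented image
# batches by FID against real adverse-weather frames as a proxy for how
# useful they are for ML training/testing.
# Assumes: pip install torchmetrics torch-fidelity
import torch
from torchmetrics.image.fid import FrechetInceptionDistance


def fid_score(real: torch.Tensor, augmented: torch.Tensor) -> float:
    """FID between two uint8 image batches of shape (N, 3, H, W).

    feature=64 keeps this toy example fast and stable with small N; real
    evaluations typically use the 2048-dim InceptionV3 feature layer.
    """
    metric = FrechetInceptionDistance(feature=64)
    metric.update(real, real=True)        # reference: real adverse-weather frames
    metric.update(augmented, real=False)  # candidate: synthetically augmented frames
    return float(metric.compute())


if __name__ == "__main__":
    # Random stand-ins for real rain images and two augmentation intensity levels.
    real_rain = torch.randint(0, 256, (32, 3, 299, 299), dtype=torch.uint8)
    light_rain_aug = torch.randint(0, 256, (32, 3, 299, 299), dtype=torch.uint8)
    heavy_rain_aug = torch.randint(0, 256, (32, 3, 299, 299), dtype=torch.uint8)

    scores = {
        "light_rain_aug": fid_score(real_rain, light_rain_aug),
        "heavy_rain_aug": fid_score(real_rain, heavy_rain_aug),
    }
    # Lower FID means the augmented distribution sits closer to the real
    # adverse-weather domain, one plausible proxy for usefulness.
    for name, score in sorted(scores.items(), key=lambda kv: kv[1]):
        print(f"{name}: FID = {score:.2f}")
```

A CMMD-based variant would follow the same pattern, swapping the Inception features for CLIP embeddings and the Fréchet distance for a maximum mean discrepancy; either score can then guide which augmentation types and intensity levels to include in a training or validation split.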