RodNet: An Advanced Multidomain Object Detection Approach Using Feature Transformation With Generative Adversarial Networks
Published in: IEEE Sensors Journal, 2023-08, Vol. 23 (15), pp. 17531-17540
Main authors: , , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Advanced object detection (OD) techniques have been widely studied in recent years and successfully applied in real-world applications. However, existing algorithms may struggle with nighttime image detection, especially in low-luminance conditions. Researchers have attempted to overcome this issue by collecting large amounts of multidomain data, but performance remains poor because these methods train on images from both the low- and sufficient-luminance domains without a specific training policy. In this work, we present a lightweight framework for multidomain OD using feature domain transformation with generative adversarial networks (GANs). The proposed GAN framework trains a generator network to transform features from the low-luminance domain to the sufficient-luminance domain, so that the discriminator networks cannot distinguish whether the features came from a low-luminance or a normal image, thus achieving luminance-invariant feature extraction. To preserve the semantic meaning of the transformed features, we introduce a training policy for OD and feature transformation across domains. The proposed method achieves state-of-the-art performance with a 9.95-point improvement in average precision without incurring additional computational costs.
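To make the mechanism in the abstract concrete, below is a minimal PyTorch sketch of an adversarial feature-domain transformation of this kind. It is not the authors' implementation: the module designs (`FeatureGenerator`, `DomainDiscriminator`), channel counts, and losses are illustrative assumptions, and in the actual method a detection loss and the stated training policy would also act on the transformed features to preserve semantics.

```python
# Minimal sketch, not the authors' released code: adversarial
# feature-domain transformation in the spirit of the abstract.
# All module designs, channel counts, and losses are assumptions.
import torch
import torch.nn as nn

class FeatureGenerator(nn.Module):
    """Transforms backbone features from the low-luminance domain toward
    the sufficient-luminance feature distribution (hypothetical design)."""
    def __init__(self, channels: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        # Residual transform: shift domain statistics while keeping the
        # spatial structure the detection head relies on.
        return feat + self.net(feat)

class DomainDiscriminator(nn.Module):
    """Scores whether a feature map came from a low-luminance or a
    sufficient-luminance (normal) image."""
    def __init__(self, channels: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 128, 3, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(128, 1, 3, stride=2, padding=1),
            nn.AdaptiveAvgPool2d(1),
        )

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        return self.net(feat).flatten(1)  # (N, 1) domain logit

def adversarial_losses(gen, disc, low_feat, normal_feat):
    """The discriminator learns to separate the two feature domains; the
    generator learns to make transformed low-luminance features
    indistinguishable from sufficient-luminance ones."""
    bce = nn.BCEWithLogitsLoss()
    fake = gen(low_feat)
    real_logit = disc(normal_feat)
    fake_logit = disc(fake.detach())
    d_loss = (bce(real_logit, torch.ones_like(real_logit)) +
              bce(fake_logit, torch.zeros_like(fake_logit)))
    # In the paper's framing, a detection loss on `fake` would be added
    # here so the transformed features keep their semantic meaning.
    g_logit = disc(fake)
    g_loss = bce(g_logit, torch.ones_like(g_logit))
    return d_loss, g_loss

# Smoke test with random feature maps standing in for backbone outputs.
if __name__ == "__main__":
    gen, disc = FeatureGenerator(), DomainDiscriminator()
    low = torch.randn(2, 256, 32, 32)     # features of a low-luminance image
    normal = torch.randn(2, 256, 32, 32)  # features of a normal image
    d_loss, g_loss = adversarial_losses(gen, disc, low, normal)
    print(d_loss.item(), g_loss.item())
```

In a full training loop, `d_loss` and `g_loss` would drive separate optimizer steps for the discriminator and the generator, alternating as in standard GAN training.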
ISSN: 1530-437X, 1558-1748
DOI: 10.1109/JSEN.2023.3281399