Adaptive Sensing Data Augmentation for Drones Using Attention-Based GAN

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2024-08, Vol. 24 (16), p. 5451
Main authors: Yoon, Namkyung; Kim, Kiseok; Lee, Sangmin; Bai, Jin Hyoung; Kim, Hwangnam
Format: Article
Language: English
Online access: Full text
Description
Summary: Drones have become essential tools across various industries due to their ability to provide real-time data and perform automated tasks. However, integrating multiple sensors on a single drone poses challenges such as payload limitations and data management issues. This paper proposes a comprehensive system that leverages advanced deep learning techniques, specifically an attention-based generative adversarial network (GAN), to address data scarcity in drone-collected time-series sensor data. By adjusting the sensing frequency based on operational conditions while maintaining data resolution, our system ensures consistent and high-quality data collection. The attention mechanism within the GAN enhances the generation of synthetic data, filling the gaps caused by reduced sensing frequency with realistic data. This approach improves the efficiency and performance of various applications, such as precision agriculture, environmental monitoring, and surveillance. The experimental results demonstrated the effectiveness of our methodology in extending the operational range and duration of drones and in providing reliable augmented data, evaluated using a variety of metrics. Furthermore, the superior performance of the proposed system was verified by comparing it with various baseline GAN models.
ISSN: 1424-8220
DOI: 10.3390/s24165451
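
To illustrate the kind of architecture the abstract describes, below is a minimal sketch of an attention-based GAN for multivariate sensor time series. This is not the authors' implementation: the class names (SensorGenerator, SensorDiscriminator), dimensions, and the specific design (a GRU backbone with multi-head self-attention over time steps) are assumptions made only to show how an attention mechanism can be inserted into a time-series GAN that generates synthetic readings to fill gaps left by a reduced sensing frequency.

```python
# Hypothetical sketch of an attention-based time-series GAN (PyTorch).
# All names and hyperparameters are illustrative assumptions, not the paper's model.
import torch
import torch.nn as nn

class SensorGenerator(nn.Module):
    """Maps noise sequences to synthetic multivariate sensor sequences.
    Self-attention lets each generated time step attend to the whole
    sequence, which helps when filling gaps between sparse real samples."""
    def __init__(self, noise_dim=32, hidden_dim=64, n_features=6):
        super().__init__()
        self.rnn = nn.GRU(noise_dim, hidden_dim, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads=4, batch_first=True)
        self.out = nn.Linear(hidden_dim, n_features)

    def forward(self, z):                 # z: (batch, seq_len, noise_dim)
        h, _ = self.rnn(z)                # temporal features
        a, _ = self.attn(h, h, h)         # self-attention over time steps
        return self.out(h + a)            # (batch, seq_len, n_features)

class SensorDiscriminator(nn.Module):
    """Scores a sensor sequence as real or synthetic."""
    def __init__(self, n_features=6, hidden_dim=64):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, x):                 # x: (batch, seq_len, n_features)
        _, h = self.rnn(x)
        return self.head(h[-1])           # one real/fake logit per sequence

# Usage sketch: generate synthetic readings to stand in for skipped samples.
G, D = SensorGenerator(), SensorDiscriminator()
z = torch.randn(8, 128, 32)               # 8 noise sequences of 128 time steps
fake_readings = G(z)                      # (8, 128, 6) synthetic sensor data
scores = D(fake_readings)                 # (8, 1) discriminator logits
```

In a full adversarial training loop, the discriminator would also see real drone sensor windows, and the generator would be conditioned on the sparse measurements it must interpolate; those details are omitted here.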