NeRF-enabled Analysis-Through-Synthesis for ISAR Imaging of Small Everyday Objects with Sparse and Noisy UWB Radar Data
Saved in:
Main authors: , , ,
Format: Article
Language: English
Keywords:
Online access: Order full text
Abstract: Inverse Synthetic Aperture Radar (ISAR) imaging presents a formidable
challenge for small everyday objects due to their limited Radar
Cross-Section (RCS) and the inherent resolution constraints of radar systems.
Existing ISAR reconstruction methods, including backprojection (BP), often
require complex setups and controlled environments, rendering them impractical
for many real-world noisy scenarios. In this paper, we propose a novel
Analysis-through-Synthesis (ATS) framework enabled by Neural Radiance Fields
(NeRF) for high-resolution coherent ISAR imaging of small objects using sparse
and noisy Ultra-Wideband (UWB) radar data with an inexpensive and portable
setup. Our end-to-end framework integrates ultra-wideband radar wave
propagation, reflection characteristics, and scene priors, enabling efficient
2D scene reconstruction without the need for costly anechoic chambers or
complex measurement test beds. With qualitative and quantitative comparisons,
we demonstrate that the proposed method outperforms traditional techniques and
generates ISAR images of complex scenes with multiple targets and intricate
structures in Non-Line-of-Sight (NLOS) and noisy scenarios, particularly with
a limited number of views and sparse UWB radar scans. This work represents a
significant step towards practical, cost-effective ISAR imaging of small
everyday objects, with broad implications for robotics and mobile sensing
applications.
DOI: 10.48550/arxiv.2410.10085
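
The abstract describes an analysis-through-synthesis loop in which a NeRF-style scene model is optimized so that simulated UWB radar echoes match the measured ones. The sketch below only illustrates that general idea; it is not the authors' implementation. The MLP scene model, the Gaussian stand-in for the UWB pulse, the sampling rate, and all names (`SceneMLP`, `render_echo`, `sensor_positions`, `measured_echoes`) are assumptions made for this example.

```python
# Hypothetical minimal sketch of NeRF-style analysis-through-synthesis for 2D ISAR.
# All names and modeling choices here are illustrative, not taken from the paper.
import torch
import torch.nn as nn

C = 3e8            # propagation speed (m/s)
FS = 10e9          # assumed UWB fast-time sampling rate (Hz)
N_SAMPLES = 256    # samples per radar echo

class SceneMLP(nn.Module):
    """Maps a 2D scene coordinate to a non-negative reflectivity (NeRF-like scene prior)."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Softplus(),
        )

    def forward(self, xy):
        return self.net(xy).squeeze(-1)

def render_echo(model, sensor_xy, grid_xy):
    """Differentiable forward model: sum of point responses delayed by round-trip range."""
    rho = model(grid_xy)                                    # (N,) reflectivity per grid point
    rng = torch.linalg.norm(grid_xy - sensor_xy, dim=-1)    # (N,) one-way range to sensor
    delay = 2.0 * rng / C                                   # round-trip delay (s)
    t = torch.arange(N_SAMPLES) / FS                        # fast-time axis (s)
    # Gaussian pulse as a crude stand-in for the real UWB waveform.
    pulse = torch.exp(-((t[None, :] - delay[:, None]) * FS) ** 2)
    return (rho[:, None] * pulse).sum(dim=0)                # (N_SAMPLES,) synthetic echo

# Dense 2D grid of candidate scatterer locations (0.3 m x 0.3 m scene).
xs = torch.linspace(-0.15, 0.15, 48)
gx, gy = torch.meshgrid(xs, xs, indexing="ij")
grid_xy = torch.stack([gx, gy], dim=-1).reshape(-1, 2)

model = SceneMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# In practice these would come from the sparse UWB scans; placeholders here.
sensor_positions = torch.tensor([[0.0, 1.0], [0.5, 1.0]])
measured_echoes = torch.zeros(len(sensor_positions), N_SAMPLES)

for step in range(200):
    opt.zero_grad()
    loss = 0.0
    for pos, meas in zip(sensor_positions, measured_echoes):
        loss = loss + ((render_echo(model, pos, grid_xy) - meas) ** 2).mean()
    loss.backward()
    opt.step()

# Query the optimized scene model on the grid to obtain the 2D reflectivity image.
image = model(grid_xy).detach().reshape(48, 48)
```

Because the forward model is differentiable end to end, additional priors or noise models can be folded into the same loss, which is the core appeal of the ATS formulation compared with one-shot reconstruction methods such as backprojection.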