Fast Wideband Beamforming Using Convolutional Neural Network
Published in: Remote Sensing (Basel, Switzerland), 2023-02, Vol. 15 (3), p. 712
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: With wideband beamforming approaches, synthetic aperture radar (SAR) can achieve high azimuth resolution together with a wide swath. However, the performance of conventional adaptive wideband time-domain beamforming degrades severely when the received signal snapshots are insufficient for adaptive processing. In this paper, a wideband beamformer based on a convolutional neural network (CNN), namely the frequency-constraint wideband beamforming prediction network (WBPNet), is proposed to obtain satisfactory performance under scanty-snapshot conditions. The proposed WBPNet successfully estimates the direction of arrival of the interference from scanty snapshots and obtains optimal weights that place an effective null on the interference, exploiting the ability of the CNN to extract latent nonlinear features of the input. Meanwhile, the novel beamformer maintains an undistorted response to the wideband signal of interest. Compared with the conventional time-domain wideband beamforming algorithm, the proposed method obtains adaptive weights quickly because it uses few snapshots. Moreover, the proposed WBPNet achieves satisfactory wideband beamforming performance at low computational complexity because it avoids inverting the covariance matrix. Simulation results show the superiority and feasibility of the proposed approach.
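To make the contrast in the abstract concrete, the sketch below illustrates the conventional adaptive baseline the paper argues against: estimating a sample covariance matrix from snapshots and solving an MVDR-style distortionless-response problem, which requires the covariance inversion that WBPNet avoids. This is a minimal illustrative example, not the paper's method; the array geometry, angles, and interference level are hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 8    # array elements (hypothetical uniform linear array)
N = 200  # snapshots

def steer(theta_deg, m=M):
    """Steering vector for a half-wavelength-spaced ULA."""
    k = np.pi * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * k * np.arange(m))

a_sig = steer(0.0)    # look direction (signal of interest)
a_int = steer(30.0)   # interference direction

# Simulated signal-free snapshots: strong interference plus unit noise.
s_int = 10 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
noise = rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))
x = a_int[:, None] * s_int + noise

# Sample covariance and MVDR weights: w = R^{-1} a / (a^H R^{-1} a).
# This linear solve (effectively a covariance inversion) is the costly,
# snapshot-hungry step that the CNN-based predictor sidesteps.
R = x @ x.conj().T / N
w = np.linalg.solve(R, a_sig)
w /= a_sig.conj() @ w   # enforce the distortionless constraint w^H a = 1

print(abs(w.conj() @ a_sig))  # response toward the signal: 1.0 (undistorted)
print(abs(w.conj() @ a_int))  # response toward the interference: deep null
```

With few snapshots, the sample covariance `R` becomes ill-conditioned and the resulting null degrades, which is the regime the abstract targets with the learned predictor.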
ISSN: 2072-4292
DOI: 10.3390/rs15030712