DSRKD: Joint Despecking and Super-Resolution of SAR Images via Knowledge Distillation
Saved in:
| Published in: | IEEE Transactions on Geoscience and Remote Sensing, 2024, Vol. 62, pp. 1-13 |
|---|---|
| Main authors: | , , , , |
| Format: | Article |
| Language: | eng |
| Subjects: | |
| Online access: | Order full text |
Abstract: Deep learning has achieved success in optical image super-resolution (SR), but few methods have been proposed for synthetic aperture radar (SAR) images: SAR images inherently suffer from severe speckle noise, and their resolution is typically much lower than that of optical images, which makes SR more challenging. To simultaneously denoise and preserve texture details in SAR image SR, we propose a joint despeckling and SR network via knowledge distillation (DSRKD). By applying feature distillation (FD) in the encoding stage, the student network learns noise-free latent variables and can thus restore clean SR images, while target distillation (TD) provides additional effective supervision for the student. This distillation strategy incurs no additional computational cost during inference. Experimental results demonstrate that our algorithm effectively suppresses speckle noise while preserving image texture details on both synthetic and SAR datasets. Compared with other state-of-the-art algorithms, the proposed method achieves superior performance in both visual results and quantitative metrics, including PSNR, SSIM, and no-reference indicators, under various SR scales and speckle intensities.
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2024.3432193
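
To make the distillation strategy described in the abstract more concrete, below is a minimal sketch of how feature distillation (FD) on encoder latents and target distillation (TD) on SR outputs could be combined with a standard reconstruction loss in PyTorch. The toy networks, layer choices, loss weights, and function names are illustrative assumptions, not the authors' DSRKD implementation.

```python
# Minimal sketch of an FD + TD distillation loss for joint despeckling and SR.
# All architecture details and weights here are assumptions for illustration only.
import torch
import torch.nn as nn


class TinyEncoderDecoderSR(nn.Module):
    """Toy encoder-decoder SR network standing in for both teacher and student."""

    def __init__(self, scale=2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.Conv2d(32, scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, x):
        feat = self.encoder(x)   # latent features used for feature distillation
        sr = self.decoder(feat)  # SR output used for target distillation / reconstruction
        return sr, feat


def dsrkd_style_loss(student, teacher, noisy_lr, clean_lr, clean_hr,
                     w_fd=1.0, w_td=1.0):
    """Combine reconstruction, FD, and TD terms; the weights are placeholders."""
    l1 = nn.L1Loss()
    with torch.no_grad():                 # teacher sees clean (noise-free) inputs
        t_sr, t_feat = teacher(clean_lr)
    s_sr, s_feat = student(noisy_lr)      # student sees speckled inputs
    loss_fd = l1(s_feat, t_feat)          # pull student latents toward noise-free ones
    loss_td = l1(s_sr, t_sr)              # soft supervision from the teacher's SR output
    loss_rec = l1(s_sr, clean_hr)         # hard supervision from the clean HR target
    return loss_rec + w_fd * loss_fd + w_td * loss_td


if __name__ == "__main__":
    teacher, student = TinyEncoderDecoderSR(), TinyEncoderDecoderSR()
    noisy_lr = torch.rand(2, 1, 32, 32)   # speckled low-resolution patches
    clean_lr = torch.rand(2, 1, 32, 32)   # noise-free low-resolution patches
    clean_hr = torch.rand(2, 1, 64, 64)   # clean high-resolution targets
    loss = dsrkd_style_loss(student, teacher, noisy_lr, clean_lr, clean_hr)
    loss.backward()                       # only the student receives gradients
    print(float(loss))
```

Because both distillation terms act only on the training loss, inference reduces to a single student forward pass, which is consistent with the abstract's claim that the distillation strategy adds no computational cost at inference time.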