Adaptive Coherency Matrix Estimation for Polarimetric SAR Imagery Based on Local Heterogeneity Coefficients

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2016-11, Vol. 54 (11), p. 6732-6745
Main authors: Yang, Shuai; Chen, Qihao; Yuan, Xiaohui; Liu, Xiuguo
Format: Article
Language: English
Subjects:
Online access: Order full text
Description
Summary: Polarimetric synthetic aperture radar (SAR) images usually contain a mixture of homogeneous and heterogeneous regions, which makes estimation of the coherency matrix a very challenging task. In this paper, we propose an adaptive coherency matrix estimation method that employs local heterogeneity coefficients, applying sample covariance matrix estimation to the homogeneous components and fixed-point estimation to the heterogeneous components. Evaluations were conducted with synthetic polarimetric data and real-world SAR imagery, including UAVSAR, RADARSAT-2, and ESAR. Our experimental results demonstrated that the heterogeneity coefficient effectively characterizes the scattering properties of ground objects, which enables adaptive estimation of the coherency matrix in high-resolution polarimetric SAR imagery. Our method handles single- and multilook polarimetric SAR imagery gracefully. Compared with the sample covariance matrix estimator, the fixed-point estimator, and the Lee sigma filter, our method achieved the best performance in retaining spatial structure, suppressing speckle, and preserving the polarimetric information of SAR imagery with different degrees of heterogeneity.
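The two estimators named in the summary are standard tools in polarimetric SAR processing: the sample covariance matrix (SCM) for homogeneous clutter and a Tyler-type fixed-point estimate for heterogeneous clutter. A minimal NumPy sketch of both is given below; the function names, iteration count, and trace normalisation are illustrative choices, not details taken from the paper itself.

```python
import numpy as np

def sample_covariance(X):
    """SCM estimate for homogeneous regions.

    X: (N, d) complex array whose rows are d-dimensional scattering vectors.
    Returns (1/N) * sum_n x_n x_n^H as a (d, d) Hermitian matrix.
    """
    N = X.shape[0]
    return (X.T @ X.conj()) / N

def fixed_point_estimate(X, n_iter=30, tol=1e-8):
    """Fixed-point (Tyler-type) covariance estimate for heterogeneous regions.

    Iterates Sigma <- (d/N) * sum_n x_n x_n^H / (x_n^H Sigma^{-1} x_n),
    with a trace normalisation (trace = d) to fix the scale ambiguity.
    """
    N, d = X.shape
    sigma = np.eye(d, dtype=complex)
    for _ in range(n_iter):
        inv = np.linalg.inv(sigma)
        # Quadratic forms x_n^H Sigma^{-1} x_n for every sample (real, positive)
        q = np.real(np.einsum('ni,ij,nj->n', X.conj(), inv, X))
        # Weighted outer products: sum_n x_n x_n^H / q_n
        new = (d / N) * ((X.T * (1.0 / q)) @ X.conj())
        new *= d / np.trace(new).real       # fix the arbitrary scale
        if np.linalg.norm(new - sigma) < tol * np.linalg.norm(sigma):
            sigma = new
            break
        sigma = new
    return sigma
```

In an adaptive scheme of the kind the abstract describes, a per-pixel heterogeneity statistic would select between the two estimators over a local window; the selection rule itself is specific to the paper and is not reproduced here.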
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2016.2589279