A novel classification method based on texture analysis using high-resolution SAR and optical data
| Field | Value |
|---|---|
| Main authors | |
| Format | Conference paper |
| Language | English |
| Subjects | |
| Online access | Order full text |
Abstract: Data fusion is an efficient way to exploit multi-source, multi-platform, and multi-angle remotely sensed information. Optical imagery and SAR (synthetic aperture radar) data are complementary in their data-acquisition capabilities and image characteristics. Because each sensor type offers distinct capabilities and unique information content, fusing high-resolution SAR and optical multi-spectral imagery can improve land-use classification accuracy. Texture information plays an important role in class discrimination, especially in SAR imagery, because backscatter is sensitive to the type, orientation, homogeneity, and spatial relationships of ground objects. To take full advantage of multi-source remotely sensed data and to combine their different features, this paper puts forward a data fusion method for high-spatial-resolution remotely sensed data based on texture analysis. Texture features of the high-resolution SAR imagery were extracted using the GLCM (Grey Level Co-occurrence Matrix) method. The texture features were computed in four directions (0°, 45°, 90°, and 135°), and moving window sizes from 3×3 and 5×5 up to 31×31, 41×41, 51×51, and 61×61 were tested to analyze their influence on the results. The selected texture features were then combined with the SAR data for classification. Both images were classified using an object-based, rule-based approach, after which a decision-level fusion was implemented, improving the classification accuracy from 78.7% and 83.0% for the individual results to 88.8%.
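The GLCM texture-extraction step described in the abstract can be illustrated with a short sketch. The snippet below is a minimal, illustrative implementation using scikit-image's `graycomatrix`/`graycoprops`; the 5×5 window, 32 grey levels, and the contrast measure are example assumptions, not the paper's actual settings, and real use would sweep the window sizes the paper tests (3×3 up to 61×61).

```python
# Minimal sketch of per-pixel GLCM texture in four directions,
# assuming scikit-image >= 0.19 (older versions spell it greycomatrix).
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_texture(sar, window=5, levels=32, prop="contrast"):
    """Per-pixel GLCM texture, averaged over 0, 45, 90, and 135 degrees."""
    angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]  # the four directions
    # Quantize backscatter to a few grey levels so the co-occurrence
    # matrix stays small; the paper does not specify its quantization.
    edges = np.linspace(sar.min(), sar.max(), levels)
    q = (np.digitize(sar, edges) - 1).clip(0, levels - 1).astype(np.uint8)
    half = window // 2
    padded = np.pad(q, half, mode="reflect")
    out = np.zeros(sar.shape, dtype=np.float64)
    for i in range(sar.shape[0]):
        for j in range(sar.shape[1]):
            win = padded[i:i + window, j:j + window]
            glcm = graycomatrix(win, distances=[1], angles=angles,
                                levels=levels, symmetric=True, normed=True)
            # Average the chosen texture measure over all four directions.
            out[i, j] = graycoprops(glcm, prop).mean()
    return out

if __name__ == "__main__":
    # Synthetic speckle-like patch purely for demonstration.
    rng = np.random.default_rng(0)
    patch = rng.gamma(shape=2.0, scale=50.0, size=(64, 64))
    tex = glcm_texture(patch, window=5)
    print(tex.shape, tex.mean())
```

The resulting texture band would then be stacked with the SAR channel as an extra classification feature; the subsequent object-based classification and decision-level fusion steps are not shown here.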
DOI: 10.1109/EORSA.2012.6261162