Comparison of the multilayer perceptron with neuro-fuzzy techniques in the estimation of cover class mixture in remotely sensed data
Published in: | IEEE transactions on geoscience and remote sensing 2001-05, Vol.39 (5), p.994-1005 |
---|---|
Main authors: | , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | Mixed pixels are a major source of inconvenience in the classification of remotely sensed data. This paper compares MLP with so-called neuro-fuzzy algorithms in the estimation of pixel component cover classes. Two neuro-fuzzy networks are selected from the literature as representatives of soft classifiers featuring different combinations of fuzzy set-theoretic principles with neural network learning mechanisms. These networks are: 1) the fuzzy multilayer perceptron (FMLP) and 2) a two-stage hybrid (TSH) learning neural network whose unsupervised first stage consists of the fully self-organizing simplified adaptive resonance theory (FOSART) clustering model. FMLP, TSH, and MLP are compared on CLASSITEST, a standard set of synthetic images where per-pixel proportions of cover class mixtures are known a priori. Results are assessed by means of evaluation tools specifically developed for the comparison of soft classifiers. Experimental results show that classification accuracies of FMLP and TSH are comparable, whereas TSH is faster to train than FMLP. On the other hand, FMLP and TSH outperform MLP when little prior knowledge is available for training the network, i.e., when no fuzzy training sites, describing intermediate label assignments, are available. |
---|---|
ISSN: | 0196-2892 1558-0644 |
DOI: | 10.1109/36.921417 |
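The abstract's notion of a "soft classifier" — a network whose outputs are read as per-pixel cover-class proportions rather than a single hard label — can be sketched as below. This is a hedged illustration only, not the paper's implementation: the band count, class count, layer sizes, and random weights are all assumptions chosen for the example, and a real MLP would of course be trained on labeled (or fuzzy-labeled) training sites.

```python
import numpy as np

# Illustrative sketch (assumed architecture, untrained random weights):
# a one-hidden-layer MLP whose softmax outputs form a vector of
# non-negative values summing to 1, interpretable as the fractional
# cover of each class within a mixed pixel.
rng = np.random.default_rng(0)

n_bands, n_hidden, n_classes = 4, 8, 3  # e.g. 4 spectral bands, 3 cover classes

# Random weights stand in for a trained network in this sketch.
W1 = rng.normal(scale=0.5, size=(n_bands, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, n_classes))
b2 = np.zeros(n_classes)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def soft_classify(pixels):
    """Map spectral vectors (n_pixels, n_bands) to per-pixel
    cover-class proportion estimates (n_pixels, n_classes)."""
    h = np.tanh(pixels @ W1 + b1)
    return softmax(h @ W2 + b2)

pixels = rng.uniform(size=(5, n_bands))   # 5 synthetic mixed pixels
proportions = soft_classify(pixels)
# Each row is a soft label: proportions are non-negative and sum to 1.
assert np.allclose(proportions.sum(axis=1), 1.0)
```

A hard classifier would instead report only `proportions.argmax(axis=1)`; the evaluation tools the abstract mentions compare the full proportion vectors against the known a priori mixtures of the CLASSITEST images.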