A Granular Reflex Fuzzy Min-Max Neural Network for Classification

Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Neural Networks, 2009-07, Vol. 20 (7), p. 1117-1134
Main authors: Nandedkar, A.V., Biswas, P.K.
Format: Article
Language: English
Description
Abstract: Granular data classification and clustering is an emerging and important problem in the field of pattern recognition. Conventionally, computing is regarded as the manipulation of numbers or symbols. Human recognition capabilities, however, rest on the ability to process nonnumeric clumps of information (information granules) in addition to individual numeric values. This paper proposes a granular neural network (GNN), called the granular reflex fuzzy min-max neural network (GrRFMN), which can learn and classify granular data. GrRFMN uses hyperbox fuzzy sets to represent granular data. Its architecture includes a reflex mechanism, inspired by the human brain, to handle class overlaps. The network can be trained online using granular or point data. The neuron activation functions in GrRFMN are designed to handle data of different granularity (size). The paper also addresses how to granulate the training data and learn from it; it is observed that such preprocessing can improve a classifier's performance. Experimental results on real data sets show that the proposed GrRFMN classifies granules of different granularity more accurately. Results are compared with the general fuzzy min-max neural network (GFMN) proposed by Gabrys and Bargiela and with some classical methods.
ISSN: 1045-9227
1941-0093
DOI: 10.1109/TNN.2009.2016419
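
The abstract centers on hyperbox fuzzy sets whose activation (membership) functions accept granular, i.e. interval-valued, inputs. As a minimal sketch, the Python function below implements the hyperbox membership of the general fuzzy min-max network (GFMN) of Gabrys and Bargiela, the comparison baseline named in the abstract, evaluated on an interval input. The exact GrRFMN activation functions additionally account for granule size and for the reflex (compensatory) neurons and are defined in the paper; the names hyperbox_membership, ramp, and the sensitivity parameter gamma are illustrative choices here, not the paper's notation.

import numpy as np

def ramp(r, gamma):
    # Threshold ramp: 0 where r*gamma < 0, linear in between, saturating at 1.
    return np.clip(r * gamma, 0.0, 1.0)

def hyperbox_membership(x_lower, x_upper, v, w, gamma=4.0):
    # Membership of a granular input [x_lower, x_upper] in the hyperbox
    # with min point v and max point w, following the GFMN formulation;
    # a point input simply has x_lower == x_upper.
    x_lower, x_upper = np.asarray(x_lower, float), np.asarray(x_upper, float)
    v, w = np.asarray(v, float), np.asarray(w, float)
    # Penalty for the input granule extending past the hyperbox on each side.
    above = 1.0 - ramp(x_upper - w, gamma)
    below = 1.0 - ramp(v - x_lower, gamma)
    # Per-dimension membership is the worse of the two sides; overall
    # membership is the minimum over all dimensions, a value in [0, 1].
    return float(np.min(np.minimum(above, below)))

# A 2-D granule partially overlapping a unit-normalized hyperbox:
print(hyperbox_membership([0.2, 0.2], [0.5, 0.6], v=[0.1, 0.1], w=[0.4, 0.7]))
# -> 0.6 with gamma = 4.0

A granule fully inside the box scores 1.0, and membership decays linearly, at a rate set by gamma, as the granule extends past the box faces; this is what lets a single hyperbox neuron respond gracefully to inputs of different granularity.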