A hybrid feature extraction method-based object recognition by neural network

Bibliographic Details
Main authors: Wahi, A., Athiq, F.M., Palanisamy, C.
Format: Conference paper
Language: English
Description
Summary: This paper presents a neural-network-based approach for recognizing rotated objects in still black-and-white images. The system comprises two phases: feature extraction, in which information is extracted from the edges of the objects, and classification of the objects by an artificial neural network (ANN) trained with the back-propagation algorithm. Two different feature extraction methods are applied to the rotated edge images of the objects. In the first, the magnitudes of the 2D discrete Fourier transform (DFT) of a rotated edge image are computed and stored in matrix form; features are calculated from these coefficients and stored as a vector, and this procedure is repeated for all rotated images of the objects. The second is an efficient hybrid feature extraction method that combines the features from the first method with features obtained from a 3-level decomposition of the rotated edge image by the 2D discrete wavelet transform (DWT); the same procedure is followed to extract features from all rotated images. The neural classifiers are trained on 75% of the data sets obtained from the two feature sets, and their performance is evaluated on the remaining 25% held out as test data. Comparing the results for both cases shows that the ANN achieves a higher object recognition rate when trained with the method 2 (hybrid) data sets than with the method 1 data sets.
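As a rough illustration of the pipeline described in the abstract, the following is a minimal Python sketch, not the authors' implementation, assuming NumPy, PyWavelets, and scikit-learn. The feature-reduction choices (keeping the largest DFT magnitudes, per-subband mean and standard deviation for the DWT), the network size, and the synthetic stand-in data are all illustrative assumptions rather than details taken from the paper.

import numpy as np
import pywt
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def dft_features(edge_image, n_features=32):
    # Method 1: magnitudes of the 2D DFT of a rotated edge image (matrix form),
    # reduced to a fixed-length vector; keeping the largest magnitudes is an assumption.
    magnitudes = np.abs(np.fft.fft2(edge_image))
    return np.sort(magnitudes.ravel())[::-1][:n_features]

def hybrid_features(edge_image, wavelet="haar"):
    # Method 2: concatenate the method-1 features with statistics of a
    # 3-level 2D DWT decomposition of the same edge image.
    coeffs = pywt.wavedec2(edge_image, wavelet=wavelet, level=3)
    dwt_feats = [coeffs[0].mean(), coeffs[0].std()]            # approximation subband
    for detail_level in coeffs[1:]:                            # (cH, cV, cD) per level
        for band in detail_level:
            dwt_feats.extend([band.mean(), band.std()])        # per-subband statistics (assumption)
    return np.concatenate([dft_features(edge_image), np.array(dwt_feats)])

# Synthetic stand-ins for the rotated edge images and their object labels,
# only so the sketch runs end to end; real binary edge images would replace these.
rng = np.random.default_rng(0)
rotated_edge_images = [rng.integers(0, 2, size=(64, 64)).astype(float) for _ in range(40)]
object_labels = rng.integers(0, 4, size=40)

X = np.stack([hybrid_features(img) for img in rotated_edge_images])
y = np.array(object_labels)

# Train a back-propagation network on 75% of the feature vectors and
# evaluate the recognition rate on the held-out 25%.
X_train, X_test, y_train, y_test = train_test_split(X, y, train_size=0.75, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000).fit(X_train, y_train)
print("recognition rate on 25% test data:", clf.score(X_test, y_test))

Swapping hybrid_features for dft_features in the line that builds X reproduces the method 1 versus method 2 comparison described above, under the same assumptions.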
DOI:10.1109/ICCCNET.2008.4787728