Designing translation invariant operations via neural network training

Bibliographic details
Main authors: de Sousa, R.P., de Carvalho, J.M., de Assis, F.M., Pessoa, L.F.C.
Format: Conference paper
Language: English

Abstract: The main objective of this work is to develop an analytical method for designing translation invariant operators via neural network training. A new neural network architecture, called the modular morphological neural network (MMNN), is defined using a fundamental result on minimal representations of translation invariant set mappings via mathematical morphology, due to Banon and Barrera (1991). The general MMNN architecture can learn both binary and gray-scale translation invariant operators. Its training combines ideas from the backpropagation (BP) algorithm with the methodology proposed by Pessoa and Maragos (Ph.D. thesis, Georgia Institute of Technology, 1997) for overcoming the non-differentiability of rank functions. An alternative MMNN training method based on genetic algorithms (GA) is also developed, and a comparative analysis of BP vs. GA training on image restoration and pattern recognition problems is provided. The MMNN structure can be viewed as a special case of the morphological/rank/linear neural network (MRL-NN) proposed by Pessoa and Maragos (1997), but with a specific architecture and training rules. The proposed BP and GA training algorithms for MMNNs yield encouraging results, offering alternative design tools for the important class of translation invariant operators.
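
For context, the Banon and Barrera result underlying the MMNN architecture states that (under suitable regularity conditions) any translation invariant set mapping decomposes into a union of interval, hit-or-miss type operators taken over the maximal intervals of its kernel. A sketch of the decomposition, assuming the erosion convention X ⊖ A = {x : A_x ⊆ X}; the exact reflection and complement conventions in the original paper may differ:

\[
\psi(X) \;=\; \bigcup_{[A,B] \in \mathbf{B}(\psi)} \lambda_{[A,B]}(X),
\qquad
\lambda_{[A,B]}(X) \;=\; (X \ominus A) \cap (X^{c} \ominus B^{c}),
\]

where \(\mathbf{B}(\psi)\) is the set of maximal intervals \([A,B]\) contained in the kernel of \(\psi\), and \(\lambda_{[A,B]}(X) = \{x : A_x \subseteq X \subseteq B_x\}\).

The training difficulty the abstract refers to is that rank (order-statistic) functions are not differentiable everywhere, so plain backpropagation cannot be applied directly. The paper follows Pessoa and Maragos's smoothing methodology; the sketch below instead uses a generic softmax-weighted order statistic as an illustrative smooth surrogate. The function names, the `beta` sharpness parameter, and the node form are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def soft_rank(values, r, beta=10.0):
    """Smooth surrogate for the r-th order statistic of `values`.

    Instead of hard-selecting the r-th smallest sample (non-differentiable
    in the ordering), weight the sorted samples with a softmax peaked at
    rank r; larger beta approximates the exact rank more closely.
    Illustrative stand-in, not the paper's smoothed unit-sample scheme.
    """
    x = np.sort(np.asarray(values, dtype=float))
    idx = np.arange(x.size)
    logits = -beta * (idx - (r - 1)) ** 2   # peak at the target rank
    w = np.exp(logits - logits.max())
    w /= w.sum()
    return float(w @ x)

def morph_rank_node(patch, weights, r, beta=10.0):
    """Hypothetical MMNN-style node on a local window.

    Adds trainable structuring-element weights to the input window and
    takes a smoothed rank: r = n approximates a gray-scale dilation
    (max of patch + weights), r = 1 an erosion-like operation, and
    intermediate r a rank filter.
    """
    patch = np.asarray(patch, dtype=float)
    return soft_rank(patch + np.asarray(weights, dtype=float), r, beta)

# Example: a 3-sample window with zero weights; r = 3 approximates max.
patch = [0.2, 0.9, 0.5]
print(morph_rank_node(patch, [0.0, 0.0, 0.0], r=3))   # close to 0.9
print(morph_rank_node(patch, [0.0, 0.0, 0.0], r=1))   # close to 0.2
```

Because the surrogate is differentiable almost everywhere in the inputs and the weights, gradients can flow through the rank selection during BP training, which is the role the smoothed rank functions play in the MMNN.
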
ISSN: 1522-4880, 2381-8549
DOI: 10.1109/ICIP.2000.901107