Good weights and hyperbolic kernels for neural networks, projection pursuit, and pattern classification: Fourier strategies for extracting information from high-dimensional data

Bibliographic Details
Published in: IEEE Transactions on Information Theory, March 1994, Vol. 40 (2), pp. 439-454
Author: Jones, L.K.
Format: Article
Language: English
Description
Abstract: Fourier approximation and estimation of discriminant, regression, and density functions are considered. A preference order is established for the frequency weights in multiple Fourier expansions and the connection weights in single hidden-layer neural networks. These preferred weight vectors, called good weights (good lattice weights for estimation of periodic functions), are generalizations for arbitrary periods of the hyperbolic lattice points of Korobov (1959) and Hlawka (1962) associated with classes of smooth functions of period one in each variable. Although previous results on approximation and quadrature are affinely invariant to the scale of the underlying periods, some of our results deal with optimization over finite sets and strongly depend on the choice of scale. It is shown how to count and generate good lattice weights. Finite sample bounds on mean integrated squared error are calculated for ridge estimates of periodic pattern class densities. The bounds are combined with a table of cardinalities of good lattice weight sets to furnish classifier design with prescribed class density estimation errors. Applications are presented for neural networks and projection pursuit. A hyperbolic kernel gradient transform is developed which automatically determines the training weights (projection directions). Its sampling properties are discussed. Algorithms are presented for generating good weights for projection pursuit.
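The abstract notes that the paper shows how to count and generate good lattice weights. For orientation only, the sketch below enumerates the classical hyperbolic lattice points of Korobov and Hlawka, i.e. integer frequency vectors k with ∏_j max(1, |k_j|) ≤ N, the index sets that the paper's good lattice weights generalize to arbitrary periods. The function name and the recursive enumeration strategy are illustrative assumptions, not the authors' algorithm.

```python
# Minimal sketch, not the paper's method: enumerate the hyperbolic lattice
# points of Korobov (1959) and Hlawka (1962), i.e. all integer vectors
# k in Z^d with prod_j max(1, |k_j|) <= N.

def hyperbolic_lattice_points(d, N):
    """Yield every k in Z^d with prod_j max(1, |k_j|) <= N (N >= 1)."""
    if d == 0:
        yield ()
        return
    # Fix the magnitude m of the first coordinate, then recurse with the
    # remaining "hyperbolic budget" N // max(1, m) for the other coordinates.
    for m in range(N + 1):
        for rest in hyperbolic_lattice_points(d - 1, N // max(1, m)):
            if m == 0:
                yield (0,) + rest
            else:
                yield (m,) + rest
                yield (-m,) + rest

# Example: cardinality of the index set in dimension 2 with budget N = 4.
print(len(list(hyperbolic_lattice_points(2, 4))))  # 49
```

Counting the points this way mirrors the role such cardinality tables play in the paper, where they are combined with finite-sample error bounds to guide classifier design.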
ISSN:0018-9448
1557-9654
DOI:10.1109/18.312166