Universal Manifold Embedding for Geometrically Deformed Functions
| Published in: | IEEE transactions on information theory, 2016-06, Vol. 62 (6), p. 3676-3684 |
|---|---|
| Main authors: | , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Order full text |
| Abstract: | Assume we have a set of observations (for example, images) of different objects, each undergoing a different geometric deformation, yet all the deformations belong to the same family. As a result of the action of these deformations, the set of observations of each object generally forms a manifold in the ambient space of observations. In this paper we show that whenever the set of deformations admits a finite-dimensional representation, there is a mapping from the space of observations to a low-dimensional linear space. The manifold corresponding to each object is mapped to a distinct linear subspace of Euclidean space, whose dimension equals that of the manifold. This mapping, which we call the universal manifold embedding, enables the estimation of geometric deformations using classical linear theory. The universal manifold embedding further enables the object classification and detection problems to be cast in a linear subspace matching framework. The embedding of the space of observations depends on the deformation model but is independent of the specific observed object; hence, it is universal. We study two cases of this embedding: elastic deformations of 1-D signals, and affine deformations of n-dimensional signals. |
|---|---|
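The classification-by-subspace-matching idea described in the abstract can be sketched as follows. This is an illustrative toy, not the paper's implementation: the helper names (`subspace_basis`, `classify`) are hypothetical, and two synthetic 2-D subspaces of R^5 stand in for the linear subspaces that the universal manifold embedding would assign to each object.

```python
import numpy as np

def subspace_basis(embedded_samples, dim):
    """Orthonormal basis for the subspace spanned by the embedded
    observations of one object class.

    embedded_samples: (n_samples, ambient_dim) array of embedded points.
    dim: assumed dimension of the underlying manifold/subspace.
    """
    # The rows span the subspace; the top right-singular vectors give
    # an orthonormal basis for the row space.
    _, _, vt = np.linalg.svd(embedded_samples, full_matrices=False)
    return vt[:dim].T  # (ambient_dim, dim), orthonormal columns

def classify(x, bases):
    """Assign x to the class whose subspace leaves the smallest residual
    after orthogonal projection (linear subspace matching)."""
    residuals = [np.linalg.norm(x - B @ (B.T @ x)) for B in bases]
    return int(np.argmin(residuals))

# Toy data: class 0 lies in span{e0, e1}, class 1 in span{e2, e3}.
rng = np.random.default_rng(0)
samples0 = rng.normal(size=(20, 2)) @ np.eye(2, 5)        # rows in span{e0, e1}
samples1 = rng.normal(size=(20, 2)) @ np.eye(2, 5, k=2)   # rows in span{e2, e3}
bases = [subspace_basis(samples0, 2), subspace_basis(samples1, 2)]

x = np.array([0.0, 0.0, 1.0, -2.0, 0.0])  # lies in span{e2, e3}
print(classify(x, bases))  # → 1
```

Because each manifold maps to a distinct linear subspace, the nearest-subspace residual test above replaces nonlinear manifold matching with ordinary linear projections.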
| ISSN: | 0018-9448 (print); 1557-9654 (electronic) |
| DOI: | 10.1109/TIT.2016.2555324 |