A lower bound for generalized median based consensus learning using kernel-induced distance functions
Published in: Pattern Recognition Letters, 2020-12, Vol. 140, pp. 339-347
Main authors: ,
Format: Article
Language: English
Online access: Full text
Abstract:

Highlights:
- A lower bound for the generalized median using kernel-induced distances is presented.
- The lower bound is domain-independent and can be applied in any domain with a suitable kernel function.
- The new lower bound is demonstrated to be considerably tighter than existing ones, while requiring less computational time.

Computing a consensus object from a set of given objects is a core problem in machine learning and pattern recognition. One commonly used approach is to formulate it as an optimization problem using the generalized median. However, in many domains the construction of a median object is NP-hard, requiring approximate solutions instead. In these cases a lower bound is helpful for assessing the quality of an approximate median without needing a ground-truth median. In this work we introduce a domain-independent lower bound formulation for the generalized median using kernel-induced distances. Since kernel functions induce a scalar product in a high-dimensional vector space, they can be used to construct new nonlinear distance functions in the original space with desirable properties. Using the properties of the kernel functions underlying the kernel-induced distances, this kernel-based lower bound formulation is shown to be considerably tighter than existing lower bounds on a number of datasets in four different domains, while requiring less computational time.
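The two notions the abstract builds on can be sketched in a few lines. A kernel k(x, y) = ⟨φ(x), φ(y)⟩ induces a distance in feature space via d_k(x, y)² = k(x, x) − 2·k(x, y) + k(y, y), and the generalized median of a set is the object minimizing the sum of distances to all set members. The sketch below is illustrative only and not the paper's method; the function names (`rbf_kernel`, `kernel_distance`, `generalized_median`) and the restriction of the median search to a finite candidate set are assumptions made for this example.

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    # Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2),
    # a common choice of kernel (illustrative, not prescribed by the paper).
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.exp(-gamma * np.dot(diff, diff))

def kernel_distance(x, y, kernel=rbf_kernel):
    # Kernel-induced distance: d_k(x, y)^2 = k(x,x) - 2 k(x,y) + k(y,y).
    # max(..., 0.0) guards against tiny negative values from rounding.
    sq = kernel(x, x) - 2.0 * kernel(x, y) + kernel(y, y)
    return np.sqrt(max(sq, 0.0))

def generalized_median(objects, candidates, kernel=rbf_kernel):
    # Generalized median restricted to a finite candidate set: the
    # candidate minimizing the sum of kernel-induced distances (SOD)
    # to all objects. In general domains this optimization is NP-hard,
    # which is why the paper's lower bound is useful.
    sod = lambda c: sum(kernel_distance(c, o, kernel) for o in objects)
    return min(candidates, key=sod)

points = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])]
med = generalized_median(points, points)  # set median over the input set
```

Here the search is restricted to the input objects themselves (the "set median"), a common tractable surrogate when constructing a true median object is NP-hard.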
ISSN: 0167-8655, 1872-7344
DOI: 10.1016/j.patrec.2020.11.003