Local-to-Global Support Vector Machines (LGSVMs)
Saved in:
Published in: Pattern Recognition, 2022-12, Vol. 132, p. 108920, Article 108920
Main authors: ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract:
• Support Vector Machines (SVMs) are a popular kernel method for supervised learning.
• Complexity costs and memory needs become prohibitive as the number of samples grows.
• LGSVMs split the original problem into overlapping local SVM classification tasks.
• The Partition of Unity (PU) scheme ensures the definition of a global classifier.
• As theoretically analyzed and shown, LGSVMs reduce the required execution time.
For supervised classification tasks that involve a large number of instances, we propose and study a new efficient tool, namely the Local-to-Global Support Vector Machine (LGSVM) method. Its background lies in the framework of approximation theory and of local kernel-based models, such as the Partition of Unity (PU) method. While the latter needs to be accurately tailored for classification tasks, for instance by allowing the use of the cosine semi-metric for defining the patches, the LGSVM is a global method constructed by gluing together the local SVM contributions via compactly supported weights. When the number of instances grows, such a construction of a global classifier significantly reduces the usually high complexity cost of SVMs. This claim is supported by a theoretical analysis of the LGSVM and of its complexity, as well as by extensive numerical experiments on benchmark datasets.
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2022.108920
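The local-to-global construction described in the abstract can be sketched in a few lines: train SVMs on overlapping patches of the data, then blend their decision values with compactly supported weights that sum (after normalization) to one. This is only an illustrative sketch, not the authors' exact construction — the patch centers, the fixed radius, the Wendland-type weight function, and all parameters below are assumptions chosen for brevity.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

def wendland_weight(r):
    # Compactly supported Wendland-type function: positive on [0, 1), zero beyond.
    r = np.clip(r, 0.0, 1.0)
    return (1.0 - r) ** 4 * (4.0 * r + 1.0)

# Toy binary classification data (stand-in for a benchmark dataset).
X, y = make_classification(n_samples=400, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)

# Illustrative overlapping patches: a few data points as centers, one shared
# radius large enough that patches overlap and cover the data.
rng = np.random.default_rng(0)
centers = X[rng.choice(len(X), size=4, replace=False)]
radius = 0.6 * np.linalg.norm(X.max(axis=0) - X.min(axis=0))

local_models = []
for c in centers:
    mask = np.linalg.norm(X - c, axis=1) <= radius
    if np.unique(y[mask]).size < 2:
        local_models.append(None)  # degenerate one-class patch: skip it
        continue
    local_models.append(SVC(kernel="rbf").fit(X[mask], y[mask]))

def predict(Xq):
    # Glue local decision values with normalized compactly supported weights.
    num = np.zeros(len(Xq))
    den = np.zeros(len(Xq))
    for c, model in zip(centers, local_models):
        if model is None:
            continue
        w = wendland_weight(np.linalg.norm(Xq - c, axis=1) / radius)
        num += w * model.decision_function(Xq)
        den += w
    return (num / np.maximum(den, 1e-12) > 0).astype(int)

train_acc = (predict(X) == y).mean()
```

Because each weight vanishes outside its patch, only the SVMs whose patches contain a query point contribute to its prediction, which is what lets the local problems stay small as the total number of instances grows.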