K-nearest neighbor-based weighted multi-class twin support vector machine

Bibliographic Details
Published in: Neurocomputing (Amsterdam), September 2016, Vol. 205, pp. 430-438
Main Author: Xu, Yitian
Format: Article
Language: English
Online Access: Full text
Description
Summary: Twin-KSVC, a novel multi-class classification algorithm, finds two nonparallel hyperplanes for the two focused classes of samples by solving a pair of smaller-sized quadratic programming problems (QPPs), which makes its learning speed faster than that of other multi-class classification algorithms. However, it ignores the local information of the samples, so every sample receives the same weight when the separating hyperplanes are constructed, even though different samples in fact influence the hyperplanes differently. Motivated by these observations, this paper proposes a K-nearest neighbor (KNN)-based weighted multi-class twin support vector machine (KWMTSVM). A weight matrix W is employed in the objective function to exploit the intra-class local information, while weight vectors f and h are introduced into the constraints to exploit the inter-class information. When a component f_j = 0 or h_k = 0, the corresponding j-th or k-th constraint is redundant, and removing such redundant constraints effectively improves the computational speed of the classifier. Experimental results on eleven benchmark datasets and the ABCD dataset demonstrate the validity of the proposed algorithm.
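To make the weighting scheme in the summary more concrete, the sketch below shows one plausible way to build an intra-class KNN weight matrix W and an inter-class weight vector f, and to drop the constraints whose weight is zero. It is only an illustration of the idea described in the abstract, not the authors' exact formulation; the function name knn_weights, the neighbor count k, and the toy data are assumptions.

```python
# Minimal sketch (assumed formulation, not the paper's exact one) of the
# KNN-based weighting described in the abstract: an intra-class weight
# matrix W and an inter-class weight vector f, with pruning of the
# redundant constraints where f_j = 0.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_weights(X_focus, X_other, k=5):
    """Build W for the focused class and f for the other class."""
    # Intra-class: W[i, j] = 1 if x_j is among the k nearest neighbors
    # of x_i inside the focused class (captures local density).
    nn_intra = NearestNeighbors(n_neighbors=k + 1).fit(X_focus)
    _, idx = nn_intra.kneighbors(X_focus)   # first neighbor is the point itself
    W = np.zeros((len(X_focus), len(X_focus)))
    for i, row in enumerate(idx):
        W[i, row[1:]] = 1.0                 # skip self at position 0

    # Inter-class: f[j] = 1 if the j-th other-class sample is among the
    # k nearest other-class neighbors of at least one focused-class
    # sample; f[j] = 0 marks a redundant constraint.
    nn_inter = NearestNeighbors(n_neighbors=k).fit(X_other)
    _, idx_inter = nn_inter.kneighbors(X_focus)
    f = np.zeros(len(X_other))
    f[np.unique(idx_inter)] = 1.0
    return W, f

# Usage on toy data: keep only the other-class samples whose
# constraints remain active after pruning.
rng = np.random.default_rng(0)
X_focus = rng.normal(0.0, 1.0, size=(40, 2))
X_other = rng.normal(2.0, 1.0, size=(60, 2))
W, f = knn_weights(X_focus, X_other, k=5)
X_other_active = X_other[f > 0]             # redundant constraints dropped
print(W.shape, int(f.sum()), X_other_active.shape)
```

Only the retained other-class samples would enter the constraints of the two QPPs, which is where the claimed speed-up over uniform weighting comes from.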
ISSN: 0925-2312; 1872-8286
DOI: 10.1016/j.neucom.2016.04.024