An optimization method for selecting parameters in support vector machines

Bibliographic Details
Main Authors: Yulin Dong, Manghui Tu, Zhonghang Xia, Guangming Xing
Format: Conference Paper
Language: English
Description
Summary: It has been shown that the cost parameters and kernel parameters are critical to the performance of support vector machines (SVMs). A standard parameter selection method compares parameters among a discrete set of values, called the candidate set, and picks the one with the best classification accuracy. As a result, the choice of parameters depends strongly on the pre-defined candidate set. In this paper, we formulate the selection of the cost parameter and kernel parameter as a two-level optimization problem in which the parameter values vary continuously, so that optimization techniques can be applied to select ideal parameters. Because the objective function in our model is non-smooth, a genetic algorithm is presented. Numerical results show that the two-level approach can significantly improve the performance of the SVM classifier in terms of classification accuracy.
DOI: 10.1109/ICMLA.2007.38
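
Note: the sketch below is not taken from the paper; it is a rough Python illustration of the contrast the abstract describes, comparing a discrete candidate-set (grid) search over the SVM cost parameter C and RBF kernel parameter gamma with a simple genetic algorithm that searches both parameters continuously. The dataset, population size, mutation scheme, and use of cross-validated accuracy as the fitness function are illustrative assumptions, not the authors' exact two-level formulation.

```python
# Hedged sketch: candidate-set (grid) search vs. a simple continuous GA
# search over the SVM cost parameter C and RBF kernel parameter gamma.
# Dataset, bounds, and GA settings are illustrative assumptions only.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Standard approach described in the abstract: compare a pre-defined
# candidate set of (C, gamma) values and keep the most accurate one.
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1, 1]},
                    cv=5)
grid.fit(X, y)
print("candidate-set search:", grid.best_params_, grid.best_score_)

# Continuous alternative: a small genetic algorithm over log10(C) and
# log10(gamma), with 5-fold cross-validated accuracy as the fitness.
rng = np.random.default_rng(0)

def fitness(log_c, log_gamma):
    clf = SVC(kernel="rbf", C=10.0 ** log_c, gamma=10.0 ** log_gamma)
    return cross_val_score(clf, X, y, cv=5).mean()

# Population of [log10 C, log10 gamma] pairs; bounds chosen for illustration.
pop = rng.uniform(low=[-1.0, -4.0], high=[3.0, 1.0], size=(20, 2))
for _ in range(15):
    scores = np.array([fitness(c, g) for c, g in pop])
    parents = pop[np.argsort(scores)[-10:]]                   # keep best half
    children = parents[rng.integers(0, 10, size=10)]          # clone parents
    children = children + rng.normal(0.0, 0.2, size=(10, 2))  # Gaussian mutation
    pop = np.vstack([parents, children])

scores = np.array([fitness(c, g) for c, g in pop])
best_log_c, best_log_gamma = pop[int(np.argmax(scores))]
print("GA search: C=%.3g, gamma=%.3g, cv accuracy=%.3f"
      % (10.0 ** best_log_c, 10.0 ** best_log_gamma, scores.max()))
```

The grid search can only return one of the sixteen pre-defined candidates, whereas the GA explores (C, gamma) continuously in log-space; this mirrors, in simplified form, the dependence on the candidate set that the paper argues its two-level formulation removes.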