Preconditioning the support vector machine algorithm to suit margin and outlier priors of Gaussian data
Saved in:
Main Authors: | , |
---|---|
Format: | Conference Proceeding |
Language: | eng |
Subjects: | |
Online Access: | Full Text |
Summary: | The Support Vector Machine (SVM) Algorithm is one of the most popular classification methods in machine learning and statistics. However, in the presence of outliers, the classifier may be adversely affected. In this paper, we experiment on the hinge loss function of the unconstrained SVM Algorithm to suit prior information about nonlinearly separable sets of Gaussian data. First, we determine whether an altered hinge loss function x ↦ max(0, α − x) with several positive values of α classifies significantly better than the standard choice α = 1. Then, taking inspiration from Huber’s least informative distribution model, which desensitizes regression to outliers, we smooth the hinge loss function to promote insensitivity of the classification to outliers. Using statistical analysis, we determine that at some level of significance, there is a considerable improvement in classification with respect to the number of misclassified data. |
---|---|
ISSN: | 0094-243X 1551-7616 |
DOI: | 10.1063/1.5139170 |
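The abstract describes two loss variants: the altered hinge loss x ↦ max(0, α − x) and a Huber-style smoothing of it. A minimal sketch of both follows; the paper's exact smoothed form is not given in the abstract, so the Huberized variant below (quadratic near the kink, linear far from it, with a hypothetical smoothing width `delta`) is an assumption based on the standard Huberized hinge, not the authors' definition:

```python
import numpy as np

def altered_hinge(x, alpha=1.0):
    """Altered hinge loss x -> max(0, alpha - x); alpha = 1 recovers
    the usual SVM hinge loss."""
    return np.maximum(0.0, alpha - x)

def huberized_hinge(x, alpha=1.0, delta=0.5):
    """One common Huber-style smoothing of the altered hinge loss
    (an assumed form, not necessarily the paper's):
      0                          for x >= alpha          (no loss)
      (alpha - x)^2 / (2*delta)  for alpha-delta <= x < alpha (quadratic)
      alpha - x - delta/2        for x < alpha - delta   (linear tail)
    The linear tail grows more slowly than the quadratic would,
    which is what makes the loss less sensitive to outliers."""
    x = np.asarray(x, dtype=float)
    return np.where(
        x >= alpha,
        0.0,
        np.where(
            x >= alpha - delta,
            (alpha - x) ** 2 / (2.0 * delta),
            alpha - x - delta / 2.0,
        ),
    )
```

The two branches of the smoothed loss meet continuously at x = alpha − delta, where both evaluate to delta / 2, so the loss is differentiable everywhere, unlike the plain hinge with its kink at x = alpha.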