Yet faster method to optimize SVR hyperparameters based on minimizing cross-validation error
Format: Conference paper
Language: English
Abstract: The performance of support vector (SV) regression depends strongly on its hyperparameters, such as the thickness of the insensitive zone, the penalty factor, and the kernel function parameters. A method called MCV-SVR was previously proposed that optimizes the SVR hyperparameters λ so that the cross-validation error is minimized. The method iterates two steps until convergence: step 1 optimizes the model parameters θ for a given λ, while step 2 improves λ for the given θ. Recently a faster version called MCV-SVR-light was proposed, which accelerates step 2 by pruning. The present paper further accelerates step 1 of MCV-SVR-light by pruning, without affecting solution quality; here, pruning means confining the computation to the support vectors. Our experiments on three data sets show that the proposed method converged faster than the existing methods while the generalization performance remained comparable.
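The core idea in the abstract, choosing SVR hyperparameters λ by minimizing a cross-validation error, can be sketched as follows. This is not the authors' MCV-SVR or MCV-SVR-light code: it uses scikit-learn's SVR with an RBF kernel, and a small grid search stands in for the paper's gradient-style λ update and support-vector pruning. The names `cv_error` and `select_hyperparameters` are hypothetical helpers for illustration.

```python
# Minimal sketch of CV-error-based SVR hyperparameter selection.
# Assumptions: scikit-learn's SVR with an RBF kernel; a grid search
# replaces the paper's iterative lambda update; no pruning is done.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

def cv_error(lmbda, X, y, k=5):
    """Mean squared k-fold CV error for hyperparameters lmbda = (C, epsilon, gamma)."""
    C, epsilon, gamma = lmbda
    model = SVR(kernel="rbf", C=C, epsilon=epsilon, gamma=gamma)
    # "Step 1" (fitting the model parameters theta for a given lambda)
    # happens inside each CV fold's fit; cross_val_score returns
    # negative MSE, so negate it back into an error.
    scores = cross_val_score(model, X, y, cv=k,
                             scoring="neg_mean_squared_error")
    return -scores.mean()

def select_hyperparameters(X, y):
    """'Step 2' analogue: improve lambda, here by exhaustive grid search."""
    best, best_err = None, np.inf
    for C in (1.0, 10.0, 100.0):
        for epsilon in (0.01, 0.1, 0.5):
            for gamma in (0.01, 0.1, 1.0):
                err = cv_error((C, epsilon, gamma), X, y)
                if err < best_err:
                    best, best_err = (C, epsilon, gamma), err
    return best, best_err

if __name__ == "__main__":
    # Synthetic 1-D regression problem for demonstration only.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(200)
    lmbda, err = select_hyperparameters(X, y)
    print("best (C, epsilon, gamma):", lmbda, "CV MSE:", err)
```

The point of the paper is precisely that this inner loop is expensive: every candidate λ requires k full SVR fits, which is why MCV-SVR-light and the method above it prune the computation down to the support vectors.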
ISSN: 2161-4393, 2161-4407
DOI: 10.1109/IJCNN.2005.1555967