Incremental / decremental SVM for function approximation

Document Type

Conference Proceeding

Department or Administrative Unit

Computer Science

Publication Date

5-20-2008

Abstract

Training a support vector regression (SVR) machine reduces to the process of migrating vectors into and out of the support set while modifying the associated thresholds. This paper gives a complete overview of the boundary conditions implied by vector migration throughout this process. The procedure is similar to training an SVM, although moving vectors into or out of the solution does not coincide with increasing or decreasing the associated threshold. The analysis details the incremental and decremental procedures used to train the SVR, and vectors with duplicate contributions are also considered. Particular attention is given to the migration of vectors among the sets when the regularization parameter C is decreased. Finally, experimental data show that this parameter can be varied over a wide range, from complete training (overfitting) down to a calibrated value, in order to tune the approximation performance of the regression.
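For context, the vector migration described in the abstract is typically governed by the Karush-Kuhn-Tucker (KKT) conditions of the epsilon-insensitive SVR problem. The following is a sketch in the notation commonly used in the online-SVR literature (not necessarily the exact notation of this paper), where h(x_i) = f(x_i) - y_i is the margin function and beta_i the coefficient of vector i:

% Assumed standard partition of training vectors (sketch, not the paper's own derivation)
h(x_i) \ge +\varepsilon,\quad \beta_i = -C          \quad\text{(error set } E\text{)}
h(x_i) = +\varepsilon,\quad -C \le \beta_i \le 0    \quad\text{(margin/support set } S\text{)}
|h(x_i)| \le \varepsilon,\quad \beta_i = 0          \quad\text{(remaining set } R\text{)}
h(x_i) = -\varepsilon,\quad 0 \le \beta_i \le +C    \quad\text{(margin/support set } S\text{)}
h(x_i) \le -\varepsilon,\quad \beta_i = +C          \quad\text{(error set } E\text{)}

Under these assumptions, incrementing a new vector starts it with beta = 0 and adjusts the support-set coefficients until all conditions hold again, decrementing reverses the adjustment, and lowering C forces error-set vectors whose coefficients sit at the bound |beta_i| = C to migrate toward the margin or remaining sets.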

Comments

This article was originally published in the proceedings of the 2008 11th International Conference on Optimization of Electrical and Electronic Equipment. The full-text article from the publisher can be found here.

Due to copyright restrictions, this article is not available for free download from ScholarWorks @ CWU.

Journal

2008 11th International Conference on Optimization of Electrical and Electronic Equipment

Rights

Copyright © 2008, IEEE
