Implementation Issues of an Incremental and Decremental SVM
Document Type
Article
Department or Administrative Unit
Computer Science
Publication Date
9-3-2008
Abstract
Incremental and decremental training of a support vector machine (SVM) amounts to migrating vectors into and out of the support set while modifying the associated thresholds. This paper gives an overview of all the boundary conditions implied by vector migration during the incremental/decremental process. The analysis shows that the same procedures, with very slight variations, can be used for both incremental and decremental learning. The case of vectors with duplicate contributions is also considered. Particular attention is given to the migration of vectors among sets when the regularization parameter is decreased. Experimental data show that this parameter can be modified on a large scale, varying it from complete training (overfitting) to a calibrated value.
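The sketch below (not taken from the paper; the function name, tolerance, and the margin convention g_i = y_i f(x_i) - 1 are illustrative assumptions) shows the kind of bookkeeping such a procedure maintains: partitioning the training vectors into margin support, error, and remaining sets according to their Karush-Kuhn-Tucker conditions, which is what vector migration then updates as examples are added, removed, or the regularization parameter C is changed.

    import numpy as np

    def partition_vectors(alpha, g, C, tol=1e-8):
        """Classify training vectors by their KKT conditions.

        alpha : dual coefficients alpha_i, with 0 <= alpha_i <= C
        g     : margin values, assumed here to be g_i = y_i * f(x_i) - 1
        C     : regularization parameter
        tol   : numerical tolerance for the comparisons

        Returns index arrays for the three sets an incremental/decremental
        procedure typically maintains:
          S - margin support vectors (0 < alpha_i < C, g_i = 0)
          E - error vectors          (alpha_i = C,     g_i <= 0)
          R - remaining vectors      (alpha_i = 0,     g_i >= 0)
        """
        S = np.where((alpha > tol) & (alpha < C - tol))[0]
        E = np.where(alpha >= C - tol)[0]
        R = np.where(alpha <= tol)[0]
        return S, E, R

For example, when C is decreased, any vector in E whose coefficient exceeds the new bound must have its coefficient clipped and may migrate between E, S, and R, which is the case the abstract highlights.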
Recommended Citation
Gâlmeanu H., Andonie R. (2008) Implementation Issues of an Incremental and Decremental SVM. In: Kůrková V., Neruda R., Koutník J. (eds) Artificial Neural Networks - ICANN 2008. ICANN 2008. Lecture Notes in Computer Science, vol 5163. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-87536-9_34
Journal
Artificial Neural Networks - ICANN 2008
Rights
© Springer-Verlag Berlin Heidelberg 2008
Comments
This article was originally published in Artificial Neural Networks - ICANN 2008. The article from the publisher can be found here.
Due to copyright restrictions, this article is not available for free download from ScholarWorks @ CWU.