Implementation Issues of an Incremental and Decremental SVM

Document Type


Department or Administrative Unit

Computer Science

Publication Date



Incremental and decremental training of a support vector machine (SVM) reduces to migrating vectors in and out of the support set while updating the associated coefficients. This paper gives an overview of all the boundary conditions implied by vector migration during the incremental/decremental process. The analysis shows that the same procedures, with very slight variations, can be used for both incremental and decremental learning. The case of vectors with duplicate contributions is also considered. Particular attention is given to the migration of vectors among sets when the regularization parameter is decreased. Experimental data show that this parameter can be varied over a wide range, from complete training (overfitting) down to a calibrated value.
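As a minimal sketch of the set membership the abstract refers to: in the standard incremental/decremental SVM formulation (Cauwenberghs–Poggio style), each trained vector belongs to the margin support set, the error set, or the remaining set, depending on its coefficient alpha and its margin value. The function and variable names below are illustrative assumptions, not the paper's actual implementation.

```python
def classify_vector(alpha: float, g: float, C: float, tol: float = 1e-6) -> str:
    """Assign a vector to one of the three standard sets, given its
    coefficient alpha, its margin value g = y_i * f(x_i) - 1, and the
    regularization parameter C. (Illustrative sketch; tolerances and
    names are assumptions, not taken from the paper.)

    - remaining set:        alpha = 0 and g >= 0
    - error set:            alpha = C and g <= 0
    - margin support set:   0 < alpha < C and g = 0
    """
    if alpha <= tol and g >= -tol:
        return "remaining"
    if alpha >= C - tol and g <= tol:
        return "error"
    if -tol < g < tol:
        return "support"
    # A vector satisfying none of these conditions violates the KKT
    # conditions and must migrate between sets during an update step.
    raise ValueError("KKT conditions violated: vector must migrate")
```

Incremental and decremental updates, as well as changes to C, then amount to moving vectors between these sets whenever an update would push a vector out of its region.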


This article was originally published in Artificial Neural Networks - ICANN 2008 and is available from the publisher.

Due to copyright restrictions, this article is not available for free download from ScholarWorks @ CWU.


Artificial Neural Networks - ICANN 2008


© Springer-Verlag Berlin Heidelberg 2008