Document Type

Article

Department or Administrative Unit

Computer Science

Publication Date

8-2022

Abstract

We study the problem of learning the distribution of data samples as it changes over time. This change, known as concept drift, complicates model training, since predictions become progressively less accurate. It is known that Support Vector Machines (SVMs) can learn from weighted input instances and can also be trained online (incremental–decremental learning). An open problem is to combine these two properties into an online SVM concept drift model with a weighted shifting window; a classic SVM must be retrained from scratch after each window shift. We introduce the Weighted Incremental–Decremental SVM (WIDSVM), a generalization of the incremental–decremental SVM to shifting windows. WIDSVM can learn from data streams with concept drift using the weighted shifting window technique. The soft-margin constrained optimization problem imposed on the shifting window is reduced to an incremental–decremental SVM, and at each window shift we determine the exact conditions for vector migration during the incremental–decremental process. Experiments on artificial and real-world concept drift datasets show that the classification accuracy of WIDSVM improves significantly compared to an SVM with no shifting window. The WIDSVM training phase is fast, since it does not retrain from scratch after each window shift.
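The weighted shifting-window setup described in the abstract can be illustrated with a short sketch. The Python snippet below is not the authors' WIDSVM: it retrains a weighted SVM from scratch at every window shift, which is exactly the cost that WIDSVM's incremental–decremental updates avoid. The window size, the linear recency-weighting scheme, and the use of scikit-learn's SVC with per-sample weights are illustrative assumptions chosen to show the baseline behavior.

    # Illustrative sketch only (not the authors' WIDSVM implementation):
    # a weighted shifting-window SVM baseline that retrains from scratch
    # at every shift. WIDSVM replaces this retraining with incremental-
    # decremental updates of the same weighted soft-margin solution.
    import numpy as np
    from sklearn.svm import SVC

    WINDOW = 200  # hypothetical window size

    def window_weights(n):
        # Emphasize recent samples: linearly increasing weights
        # (one possible weighting scheme, assumed here for illustration).
        return np.linspace(0.1, 1.0, n)

    def stream_classify(X_stream, y_stream):
        """Slide a weighted window over the stream; retrain after each shift.

        Assumes X_stream is a 2-D array, y_stream is a label array, and
        every window contains at least two classes (required by SVC.fit).
        """
        preds = []
        for t in range(WINDOW, len(X_stream)):
            Xw = X_stream[t - WINDOW:t]
            yw = y_stream[t - WINDOW:t]
            clf = SVC(kernel="rbf", C=1.0)
            # Weighted instances: recent samples get larger sample_weight.
            clf.fit(Xw, yw, sample_weight=window_weights(WINDOW))
            preds.append(clf.predict(X_stream[t:t + 1])[0])  # predict next sample
        return np.array(preds)

On a drifting stream, this baseline tracks the current concept only at the price of a full retrain per shift; the paper's contribution is obtaining the same weighted shifting-window solution incrementally.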

Comments

This article was originally published open access in Neural Networks. The full-text article from the publisher can be found here.

Journal

Neural Networks

Rights

©2022 The Author(s).
