Incremental and Decremental Training for Linear Classification
Date Issued
2014
Author(s)
Tsai, Cheng-Hao
Abstract
In classification, when a small number of instances is added or removed, incremental and decremental techniques can be applied to quickly update the model. However, designing incremental and decremental algorithms involves many considerations. In this thesis, we focus on linear classifiers, including logistic regression and linear SVM, because of their simplicity compared with kernel and other methods. By applying a warm-start strategy, we investigate issues such as choosing between primal and dual formulations, selecting optimization methods, and creating practical implementations. Through theoretical analysis and experiments, we conclude that a warm-start setting with a high-order optimization method on the primal formulation is more suitable than the alternatives for incremental and decremental learning of linear classifiers.
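The warm-start idea in the abstract, reusing the previous model's weights as the starting point when instances are added or removed, can be illustrated with a minimal sketch. This is not the thesis's implementation; it uses scikit-learn's `LogisticRegression` with `warm_start=True` (its L-BFGS solver then initializes from the prior solution), and all data here is synthetic for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5, 0.0, 1.5])  # hypothetical ground-truth weights
X = rng.normal(size=(200, 5))
y = (X @ w_true > 0).astype(int)

# warm_start=True makes each subsequent fit() start from the current weights
# instead of re-initializing, mirroring the warm-start strategy in the thesis.
clf = LogisticRegression(warm_start=True, max_iter=1000)
clf.fit(X, y)  # initial model

# Incremental step: a few new instances arrive; refit starts from the old weights.
X_new = rng.normal(size=(10, 5))
y_new = (X_new @ w_true > 0).astype(int)
clf.fit(np.vstack([X, X_new]), np.concatenate([y, y_new]))

# Decremental step: drop the first 10 instances and refit, again warm-started.
clf.fit(X[10:], y[10:])
print(clf.coef_.shape)  # one weight vector over the 5 features
```

Because only a small fraction of the data changes between fits, the warm-started solver typically converges in far fewer iterations than training from scratch, which is the efficiency argument the abstract makes.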
Subjects
Warm start
Incremental learning
Decremental learning
Linear classification
Type
thesis
File(s)
Name
ntu-103-R01922025-1.pdf
Size
23.32 KB
Format
Adobe PDF
Checksum
(MD5):fa6083465ded108329dbeb5af98b1cbb
