Department of Computer Science and Information Engineering, National Taiwan University

Title: Leave-one-out Bounds for Support Vector Regression Model Selection
Authors: Chang, Ming-Wei; Lin, Chih-Jen
Year: 2005
Type: journal article
Record dates: 2006-09-27; 2018-07-05
DOI: 10.1162/0899766053491869
Handle: http://ntur.lib.ntu.edu.tw//handle/246246/20060927122856648470
Full text: http://ntur.lib.ntu.edu.tw/bitstream/246246/20060927122856648470/1/svrbound.pdf
Format: application/pdf (256168 bytes)

Abstract: Minimizing bounds on leave-one-out (loo) errors is an important and efficient approach to support vector machine (SVM) model selection. Past research has focused on their use for classification rather than regression. In this article, we derive various loo bounds for support vector regression (SVR) and discuss how they differ from those for classification. Experiments demonstrate that the proposed bounds are competitive with Bayesian SVR for parameter selection. We also discuss the differentiability of the loo bounds.
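The motivation for loo bounds is that exact leave-one-out error estimation requires refitting the model n times, once per held-out point, which is what the bounds let one avoid. The sketch below shows the naive exact-LOO grid search being approximated, using kernel ridge regression with an RBF kernel as a simple, solver-free stand-in for SVR (the function names, toy data, and hyperparameter grid are all illustrative assumptions, not the paper's method):

```python
import numpy as np

def rbf_kernel(X1, X2, gamma):
    # Gaussian (RBF) kernel matrix between two sets of points
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def loo_mse(X, y, gamma, lam):
    # Exact leave-one-out mean squared error by naive refitting:
    # for each i, train on the other n-1 points, predict point i.
    # Kernel ridge regression stands in for SVR in this sketch.
    n = len(y)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        K = rbf_kernel(X[mask], X[mask], gamma)
        alpha = np.linalg.solve(K + lam * np.eye(n - 1), y[mask])
        pred = rbf_kernel(X[i:i + 1], X[mask], gamma) @ alpha
        errs.append((pred[0] - y[i]) ** 2)
    return float(np.mean(errs))

# Toy regression data: noisy sine curve
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

# Hyperparameter selection: pick (gamma, lambda) minimizing LOO error
grid = [(g, l) for g in (0.1, 1.0, 10.0) for l in (1e-3, 1e-1, 1.0)]
best = min(grid, key=lambda p: loo_mse(X, y, *p))
print("selected (gamma, lambda):", best)
```

Each candidate hyperparameter pair here costs n model refits; the loo bounds derived in the article replace this inner loop with a quantity computed from a single fit, which is what makes bound minimization an efficient model-selection strategy.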