https://scholars.lib.ntu.edu.tw/handle/123456789/297619
Title: | A note on the decomposition methods for support vector regression |
Authors: | Liao, Shuo-Peng; Lin, Hsuan-Tien; Lin, Chih-Jen |
Issue Date: | 2002 |
Journal Volume: | 14 | Journal Issue: | 6 | Start page/Pages: | 1267-1281 |
Source: | Neural Computation |
Abstract: | The dual formulation of support vector regression involves two closely related sets of variables. When the decomposition method is used, many existing approaches use pairs of indices from these two sets as the working set. Basically, they select a base set first and then expand it so that all indices are pairs. This makes the implementation different from that for support vector classification. In addition, a larger optimization subproblem has to be solved in each iteration. We provide theoretical proofs and conduct experiments to show that using the base set as the working set leads to similar convergence (number of iterations). Therefore, by using a smaller working set while keeping a similar number of iterations, the program can be simpler and more efficient. |
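The selection step the abstract refers to can be sketched as follows. This is a minimal illustration, assuming the standard maximal-violating-pair rule used by LIBSVM-style solvers, where the SVR dual is treated as a single problem in the stacked variables beta = [alpha; alpha*] with coefficients y = [+1, ..., +1, -1, ..., -1] in the equality constraint. The function name and the toy values are illustrative, not taken from the paper:

```python
import numpy as np

def select_working_set(grad, beta, y, C, tol=1e-6):
    """Pick the maximal violating pair from the stacked SVR dual
    variables beta = [alpha; alpha*].

    grad : gradient of the dual objective at beta
    y    : +1 for alpha entries, -1 for alpha* entries (their
           coefficients in sum_i (alpha_i - alpha*_i) = 0)
    C    : upper box bound on every variable

    Returns (i, j), the pair that most violates the KKT conditions,
    or None when the point is optimal to within tol.
    """
    # Variables that can still move up / down without leaving the box.
    I_up  = ((y == 1) & (beta < C)) | ((y == -1) & (beta > 0))
    I_low = ((y == -1) & (beta < C)) | ((y == 1) & (beta > 0))

    vals = -y * grad
    i = int(np.argmax(np.where(I_up,  vals, -np.inf)))
    j = int(np.argmin(np.where(I_low, vals,  np.inf)))

    if vals[i] - vals[j] < tol:
        return None          # KKT conditions satisfied
    return i, j

# Toy call with two training points (so four stacked variables):
grad = np.array([-2.0, 1.0, 0.5, -1.5])
beta = np.zeros(4)
y = np.array([1, 1, -1, -1])
print(select_working_set(grad, beta, y, C=1.0))  # -> (0, 3)
```

Note that the selected set here mixes an alpha index and an alpha* index directly; the paper's point is that such a base set can be used as the working set as-is, rather than expanding it so that every chosen index brings along its paired counterpart, which would double the subproblem size.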
URI: | http://www.scopus.com/inward/record.url?eid=2-s2.0-0040081684&partnerID=MN8TOARS |
| http://scholars.lib.ntu.edu.tw/handle/123456789/297619 |
| https://www.scopus.com/inward/record.uri?eid=2-s2.0-0040081684&doi=10.1162%2f089976602753712936&partnerID=40&md5=2ebd1fa931b1d60488b5907e01a3ced5 |
ISSN: | 0899-7667 | DOI: | 10.1162/089976602753712936 | SDG/Keyword: | article |
Appears in Collections: | 資訊工程學系 (Department of Computer Science and Information Engineering) |