https://scholars.lib.ntu.edu.tw/handle/123456789/387209
Title: Sparse random features algorithm as Coordinate Descent in Hilbert Space
Authors: Yen, I.E.H.; Lin, T.-W.; SHOU-DE LIN; Ravikumar, P.; Dhillon, I.S.
Date Issued: 2014
Volume: 3
Issue: January
Pages: 2456-2464
Source: Advances in Neural Information Processing Systems
Abstract: In this paper, we propose a Sparse Random Features algorithm, which learns a sparse non-linear predictor by minimizing an ℓ1-regularized objective function over the Hilbert Space induced from a kernel function. By interpreting the algorithm as Randomized Coordinate Descent in an infinite-dimensional space, we show the proposed approach converges to a solution within ε-precision of that using an exact kernel method, by drawing O(1/ε) random features, in contrast to the O(1/ε²) convergence achieved by current Monte-Carlo analyses of Random Features. In our experiments, the Sparse Random Features algorithm obtains a sparse solution that requires less memory and prediction time, while maintaining comparable performance on regression and classification tasks. Moreover, as an approximate solver for the infinite-dimensional ℓ1-regularized problem, the randomized approach also enjoys better convergence guarantees than a Boosting approach in the setting where the greedy Boosting step cannot be performed exactly.
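To make the abstract's setup concrete, the sketch below draws random Fourier features for an RBF kernel and fits an ℓ1-regularized least-squares model on them, so that only a sparse subset of the drawn features survives. This is a minimal numpy illustration of the sparse-random-features idea (here solved with plain proximal gradient / ISTA), not the paper's randomized coordinate-descent procedure; all parameter values (`D`, `gamma`, `lam`, the toy data) are assumptions chosen for the example.

```python
import numpy as np

# Hedged sketch: sparse predictor over random Fourier features.
# Not the authors' algorithm; an l1-regularized fit on sampled features.
rng = np.random.default_rng(0)

# Toy 1-D regression data: y = sin(3x) + noise (assumed for illustration)
X = rng.uniform(-2, 2, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

# Random Fourier features approximating an RBF kernel with bandwidth gamma
D, gamma = 300, 1.0
W = rng.normal(scale=np.sqrt(2 * gamma), size=(X.shape[1], D))
b = rng.uniform(0, 2 * np.pi, size=D)
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)  # (n, D) feature map

# Minimize (1/2n)||Zw - y||^2 + lam*||w||_1 by proximal gradient (ISTA)
lam = 0.01
step = len(y) / np.linalg.norm(Z, 2) ** 2  # 1/L for L = ||Z||^2 / n
w = np.zeros(D)
for _ in range(2000):
    grad = Z.T @ (Z @ w - y) / len(y)
    w = w - step * grad
    # soft-thresholding: the prox operator of the l1 penalty
    w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)

nonzeros = int((w != 0).sum())
mse = np.mean((Z @ w - y) ** 2)
print(f"nonzero features: {nonzeros}/{D}, train MSE: {mse:.4f}")
```

The ℓ1 penalty is what distinguishes this from the usual Monte-Carlo use of random features: features whose correlation with the residual falls below the threshold are zeroed out, so only the useful random directions are kept at prediction time.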
URI: http://www.scopus.com/inward/record.url?eid=2-s2.0-84937906787&partnerID=MN8TOARS
URI: http://scholars.lib.ntu.edu.tw/handle/123456789/387209
ISSN: 1049-5258
SDG/Keywords: Hilbert spaces; Information science; Monte Carlo methods; Vector spaces; Boosting approach; Classification tasks; Coordinate descent; Infinite dimensional; Monte Carlo analysis; Nonlinear predictors; Objective functions; Randomized approach; Algorithms
Appears in Collections: Department of Computer Science and Information Engineering
Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.