DC Field | Value | Language |
dc.contributor | 林智仁 | zh-TW |
dc.contributor | Lin, Chih-Jen | en |
dc.contributor | 臺灣大學:資訊工程學研究所 | zh-TW |
dc.contributor.author | 李振宇 | zh-TW |
dc.contributor.author | Lee, Cheng-Yu | en |
dc.creator | 李振宇 | zh-TW |
dc.creator | Lee, Cheng-Yu | en |
dc.date | 2008 | en |
dc.date.accessioned | 2010-06-02T03:24:45Z | - |
dc.date.accessioned | 2018-07-05T01:57:27Z | - |
dc.date.available | 2010-06-02T03:24:45Z | - |
dc.date.available | 2018-07-05T01:57:27Z | - |
dc.date.issued | 2008 | - |
dc.identifier.other | U0001-2207200801295600 | en |
dc.identifier.uri | http://ntur.lib.ntu.edu.tw//handle/246246/184966 | - |
dc.description.abstract | 邏輯迴歸是一種常被應用在文件分類與計算語言學上的技術。L1 正規化的邏輯迴歸可被視為一種特徵選取的方式,然而它不可微分的特性增加了問題的困難度。近年來有多種最佳化方法被用在解決這個問題上,但這些方法彼此之間卻缺乏嚴謹的比較。在這篇論文之中,我們提出了一種信賴區間牛頓法,並將它與數種已知的最佳化方法比較。實驗結果顯示我們提出的方法並不亞於目前最新的最佳化方法。另一個實驗比較了 L1 與 L2 正規化的邏輯迴歸,結果證實了在達到相似準確度的前提之下,使用 L1 正規化邏輯迴歸可得到比 L2 正規化邏輯迴歸更為稀疏的向量解。 | zh-TW |
dc.description.abstract | Large-scale logistic regression is useful for document classification and computational linguistics. The L1-regularized form can be used for feature selection, but its non-differentiability makes training more difficult. Various optimization methods have been proposed in recent years, but no thorough comparison among them has been made. In this thesis we propose a trust region Newton method and compare it with several existing methods. Results show that our method is competitive with state-of-the-art L1-regularized logistic regression solvers. To investigate the applicability of L1-regularized logistic regression, we also conduct an experiment showing that, compared with L2-regularized logistic regression, it obtains a sparser solution with similar accuracy. | en |
dc.description.tableofcontents | 口試委員審定書 i; 中文摘要 ii; ABSTRACT iii; LIST OF FIGURES vi; LIST OF TABLES vii; CHAPTER I. Introduction 1; II. Review of Existing Methods 5; 2.1 Limited memory BFGS 5; 2.2 Interior point methods (IPM) 8; 2.3 Coordinate Descent 10; III. A Trust Region Newton Method for Large-Scale L1-Regularized Logistic Regression 15; 3.1 The Framework 15; 3.2 Cauchy Point 17; 3.3 Newton Direction 18; 3.4 Discussion and Implementation Issues 20; IV. Experiments 23; 4.1 Settings 23; 4.2 Issue of Obtaining Sparse Solutions via IPM 24; 4.3 Comparison of Different Optimization Methods 25; 4.4 Comparison of L1-regularized and L2-regularized Logistic Regression 27; V. Conclusions and Future Work 32; BIBLIOGRAPHY 34 | en |
dc.format | application/pdf | en |
dc.format.extent | 4053175 bytes | - |
dc.format.mimetype | application/pdf | - |
dc.language | en | en |
dc.language.iso | en_US | - |
dc.subject | 邏輯迴歸 | zh-TW |
dc.subject | 最佳化 | zh-TW |
dc.subject | L1正規化 | zh-TW |
dc.subject | 牛頓法 | zh-TW |
dc.subject | 特徵選取 | zh-TW |
dc.subject | Logistic regression | en |
dc.subject | Optimization | en |
dc.subject | L1-regularized | en |
dc.subject | Newton method | en |
dc.subject | Feature selection | en |
dc.title | 大規模L1正規化邏輯迴歸最佳化方法之比較 | zh-TW |
dc.title | A Comparison of Optimization Methods for Large-scale L1-regularized Logistic Regression | en |
dc.type | thesis | en |
dc.identifier.uri.fulltext | http://ntur.lib.ntu.edu.tw/bitstream/246246/184966/1/ntu-97-R95922035-1.pdf | - |
item.fulltext | with fulltext | - |
item.grantfulltext | open | - |
item.languageiso639-1 | en_US | - |
item.openairetype | thesis | - |
item.openairecristype | http://purl.org/coar/resource_type/c_46ec | - |
item.cerifentitytype | Publications | - |
Appears in: | 資訊工程學系 (Department of Computer Science and Information Engineering) |
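
The abstract above claims that L1-regularized logistic regression yields sparser weight vectors than its L2 counterpart at similar accuracy. As a rough illustration of why the L1 penalty zeroes out weights, here is a minimal proximal-gradient (ISTA) sketch in plain Python. It is not the trust region Newton method proposed in the thesis, and the toy data, step size, and regularization parameter are invented for illustration only.

```python
import math

def soft_threshold(v, t):
    # Proximal operator of t*|.|: shrinks v toward zero, snapping
    # small values to exactly 0 -- the source of L1 sparsity.
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def grad_logloss(w, X, y):
    # Gradient of the logistic loss sum_i log(1 + exp(-y_i * w.x_i)).
    g = [0.0] * len(w)
    for xi, yi in zip(X, y):
        z = yi * sum(wj * xj for wj, xj in zip(w, xi))
        s = -yi / (1.0 + math.exp(z))
        for j, xj in enumerate(xi):
            g[j] += s * xj
    return g

def ista_l1_logreg(X, y, lam=0.5, lr=0.1, iters=500):
    # Proximal gradient descent on logloss(w) + lam * ||w||_1:
    # a gradient step on the smooth part, then soft-thresholding.
    w = [0.0] * len(X[0])
    for _ in range(iters):
        g = grad_logloss(w, X, y)
        w = [soft_threshold(wj - lr * gj, lr * lam)
             for wj, gj in zip(w, g)]
    return w

# Toy data: only the first feature is informative; the others are noise.
X = [[1.0, 0.3, 0.1], [0.9, -0.2, 0.1], [-1.0, 0.1, 0.1], [-0.8, -0.3, 0.1]]
y = [1, 1, -1, -1]
w = ista_l1_logreg(X, y)
# The uninformative weights end up exactly zero, not merely small,
# which is the sparsity behavior the abstract contrasts with L2.
```

With an L2 penalty the same uninformative weights would shrink toward zero but typically remain nonzero, which matches the abstract's observation that L1 regularization is the natural choice when feature selection is the goal.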