Title: Globally Sparse PLS Regression
Authors: Tzu Yu Liu; Laura Trinchera; Arthur Tenenhaus; Dennis Wei; Alfred O. Hero
Date available: 2023-10-06
Date issued: 2013-10-28
ISBN: 978-1-4614-8282-6
ISSN: 2194-1009
URI: https://scholars.lib.ntu.edu.tw/handle/123456789/635970
Type: journal article
DOI: 10.1007/978-1-4614-8283-3_7
Scopus ID: 2-s2.0-84885981953
Scopus URL: https://api.elsevier.com/content/abstract/scopus_id/84885981953
Keywords: Over-fitting | Principal component analysis | Regularization | Sparse PLS | Sparsity

Abstract: Partial least squares (PLS) regression combines dimensionality reduction and prediction using a latent variable model. It provides better predictive ability than principal component analysis because it takes both the independent and response variables into account in the dimension-reduction procedure. However, PLS is prone to over-fitting when there are few samples but many variables. We formulate a new criterion for sparse PLS by adding a structured sparsity constraint to the global SIMPLS optimization. The constraint is a sparsity-inducing norm, which is useful for selecting the important variables shared among all the components. The optimization is solved by an augmented Lagrangian method to obtain the PLS components and to perform variable selection simultaneously. We propose a novel greedy algorithm to overcome the computational difficulties. Experiments demonstrate that our approach to PLS regression attains better performance with fewer selected predictors. © Springer Science+Business Media New York 2013.