Title: Group Lasso Regularized Multiple Kernel Learning for Heterogeneous Feature Selection
Authors: Y.-R. Yeh; Y.-Y. Chung; T.-C. Lin; Y.-C. F. Wang (Yu-Chiang Wang 王鈺強)
Type: conference paper
Year: 2011
DOI: 10.1109/ijcnn.2011.6033554
Scopus ID: 2-s2.0-80054762064
URI: https://scholars.lib.ntu.edu.tw/handle/123456789/427535
Scopus URL: https://www.scopus.com/inward/record.uri?eid=2-s2.0-80054762064&doi=10.1109%2fIJCNN.2011.6033554&partnerID=40&md5=bb3b6c1f9c1334ddc1af9976eaaaf66c
Record added: 2019-10-24
Copyright: © 2011 IEEE.

Abstract: We propose a novel multiple kernel learning (MKL) algorithm with a group lasso regularizer, called group lasso regularized MKL (GL-MKL), for heterogeneous feature selection. We extend the existing MKL framework by imposing a mixed ℓ1 and ℓ2 norm constraint (known as group lasso) as the regularizer. Our GL-MKL determines the optimal base kernels, including the associated weights and kernel parameters, and yields a compact set of features with comparable or improved recognition performance. Using GL-MKL also avoids the problem of choosing a proper technique to normalize feature attributes collected from heterogeneous domains (which therefore have different properties and distribution ranges). Unlike prior sequential feature selection methods, our approach does not exhaustively search the entire feature space, nor does it require any prior knowledge of the optimal size of the feature subset. Comparisons with existing MKL and sequential feature selection methods on a variety of datasets confirm the effectiveness of our method in selecting a compact feature subset for comparable or improved classification performance.

Keywords: Classification performance; Compact sets; Data sets; Feature selection methods; Feature space; Feature subset; Heterogeneous domains; Heterogeneous features; Kernel parameter; Multiple kernel learning; Optimal size; Prior knowledge; Recognition performance; Regularizer; Algorithms; Classification (of information); Neural networks; Optimization; Feature extraction
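The two ingredients described in the abstract, a weighted combination of base kernels (one or more per heterogeneous feature group) and a mixed ℓ1/ℓ2 group-lasso penalty on the kernel weights, can be illustrated with a minimal sketch. This is not the paper's actual GL-MKL optimization; the RBF kernels, the toy data, the weight vector `d`, and the grouping of weights by feature are all illustrative assumptions. The point is only to show how the ℓ2 norm is taken within each group of weights and the ℓ1 sum across groups, so that a zeroed group drops an entire feature from the combined kernel.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Standard RBF kernel on the selected feature columns (illustrative choice).
    sq = np.sum(X ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

def mixed_norm(d, groups):
    # Group lasso: l1 sum across groups of the l2 norm within each group.
    return sum(np.linalg.norm(d[g]) for g in groups)

# Toy data: 8 samples, 5 attributes split into 3 heterogeneous feature groups.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 5))
feature_groups = [[0, 1], [2], [3, 4]]

# Two candidate kernel parameters per feature group -> 6 base kernels.
gammas = [0.5, 2.0]
kernels = [rbf_kernel(X[:, f], g) for f in feature_groups for g in gammas]

# Kernel weights, grouped so each group covers one feature's base kernels.
# A zero group (here the second) removes that feature entirely.
weight_groups = [[0, 1], [2, 3], [4, 5]]
d = np.array([0.6, 0.4, 0.0, 0.0, 0.7, 0.3])  # illustrative, not learned

K = sum(w * Km for w, Km in zip(d, kernels))  # combined kernel matrix
penalty = mixed_norm(d, weight_groups)        # group-lasso regularizer value
```

In an actual GL-MKL solver, `d` would be optimized jointly with the classifier under this penalty, so that whole groups of weights are driven to zero and the surviving groups identify the selected features.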