Group Lasso Regularized Multiple Kernel Learning for Heterogeneous Feature Selection
Conference
IEEE International Joint Conference on Neural Networks (IJCNN)
Pages
11-18
Date Issued
2011
Author(s)
Abstract
We propose a novel multiple kernel learning (MKL) algorithm with a group lasso regularizer, called group lasso regularized MKL (GL-MKL), for heterogeneous feature selection. We extend the existing MKL formulation by imposing a mixed ℓ1/ℓ2-norm constraint (known as group lasso) as the regularizer. Our GL-MKL determines the optimal base kernels, including the associated weights and kernel parameters, and yields a compact set of features with comparable or improved recognition performance. GL-MKL also avoids the problem of choosing a proper technique for normalizing feature attributes collected from heterogeneous domains (which therefore have different properties and distribution ranges). Unlike prior sequential feature selection methods, our approach neither needs to exhaustively search the feature space when performing feature selection nor requires any prior knowledge of the optimal size of the feature subset. Comparisons with existing MKL and sequential feature selection methods on a variety of datasets confirm the effectiveness of our method in selecting a compact feature subset with comparable or improved classification performance. © 2011 IEEE.
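The mixed ℓ1/ℓ2-norm (group lasso) regularizer mentioned in the abstract can be illustrated with a minimal sketch: the penalty is the sum, over predefined groups of coefficients (here, one group per base kernel / heterogeneous feature), of each group's ℓ2 norm. The function and variable names below are hypothetical illustrations, not the paper's actual implementation; the optimization that drives group norms to zero (and thus removes features) is omitted.

```python
import math

def group_lasso_norm(weights, groups):
    # Mixed l1/l2 norm: l1 across groups of the per-group l2 norms.
    # Driving an entire group's l2 norm to zero removes the
    # corresponding base kernel (and its feature) from the model.
    total = 0.0
    for idx in groups:
        total += math.sqrt(sum(weights[i] ** 2 for i in idx))
    return total

# Hypothetical kernel weights for three feature groups.
w = [0.6, 0.8,   # group 1: l2 norm 1.0 (feature kept)
     0.0, 0.0,   # group 2: l2 norm 0.0 (feature dropped)
     0.3]        # group 3: l2 norm 0.3 (feature kept)
groups = [[0, 1], [2, 3], [4]]
print(group_lasso_norm(w, groups))  # ≈ 1.0 + 0.0 + 0.3 = 1.3
```

Because the penalty is the ℓ1 norm of the per-group ℓ2 norms, sparsity is induced at the group level rather than on individual coefficients, which is what allows GL-MKL to discard whole base kernels and thereby select a compact feature subset.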
Other Subjects
Classification performance; Compact sets; Data sets; Feature selection methods; Feature space; Feature subset; Heterogeneous domains; Heterogeneous features; Kernel parameter; Multiple Kernel Learning; Optimal size; Prior knowledge; Recognition performance; Regularizer; Algorithms; Classification (of information); Neural networks; Optimization; Feature extraction
Type
conference paper
