Effective Multi-class Kernel MSE Classifier with Sherman-Woodbury Formula
Date Issued
2006
Author(s)
Li, Chen-Wei
Language
en-US
Abstract
In general, there are two kinds of linear classification methods: one is the minimum squared error method (MSE), and the other is Fisher's linear discriminant (FLD). Because linear methods are insufficient for data with nonlinear patterns, the nonlinear methods KMSE and KFD were developed from MSE and FLD, respectively. Both transform instances from the original attribute space to a high-dimensional feature space, where the linear methods are then applied. The objective of FLD and KFD is to find the directions onto which the projections of the training instances provide the maximal separability of the classes. FLD and KFD are known to be inefficient for datasets with a large number of attributes and a large number of instances, respectively. To improve computational efficiency, we use MSE for linear classification problems. However, MSE, like SVM, can solve multi-class problems only through the one-against-one or one-against-the-rest approach. Both are inefficient compared with FLD and KFD, where a single model is built to discriminate multiple classes simultaneously. We therefore develop a multi-class MSE with the Sherman-Woodbury formula to improve computational efficiency. It handles multiple classes simultaneously through a class-labeling scheme, and the different class-labeling schemes are determined by the Gram-Schmidt process. The nonlinear counterpart, multi-class KMSE, is also developed from the multi-class MSE. A simulated example is then used to show how the proposed method works and to visualize the meaning of the class-labeling scheme. Finally, two real-world datasets are used to compare the proposed method with other conventional methods.
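The full thesis is not included in this record, but the Sherman(-Morrison)-Woodbury identity it builds on, (A + UCV)^(-1) = A^(-1) - A^(-1)U(C^(-1) + VA^(-1)U)^(-1)VA^(-1), can be sketched numerically. The sketch below only verifies the identity itself; the matrix sizes and the choice of a regularized base matrix are illustrative assumptions, not the thesis's actual formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 5  # n: number of instances; k: rank of the update (k << n)

# A: a well-conditioned base matrix, playing the role of a regularized
# Gram matrix; U, C, V define a rank-k update A + U C V.
A = 2.0 * np.eye(n) + 0.01 * rng.standard_normal((n, n))
U = rng.standard_normal((n, k))
C = np.eye(k)
V = rng.standard_normal((k, n))

# Direct inverse of the updated n x n matrix: O(n^3) from scratch.
direct = np.linalg.inv(A + U @ C @ V)

# Woodbury identity: once A^{-1} is known, only a k x k system is inverted,
# which is the source of the computational savings when k << n.
A_inv = np.linalg.inv(A)
small = np.linalg.inv(np.linalg.inv(C) + V @ A_inv @ U)  # k x k
woodbury = A_inv - A_inv @ U @ small @ V @ A_inv

assert np.allclose(direct, woodbury)
```

The payoff is that when a model already has A^(-1) and receives a low-rank change (e.g. a few added instances), the updated inverse costs roughly O(n^2 k) instead of O(n^3).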
Subjects
Classification method
Fisher linear discriminant (FLD)
Kernel Fisher discriminant (KFD)
Minimum squared error (MSE)
Kernel minimum squared error (KMSE)
Efficiency
Type
thesis
File(s)
Name
ntu-95-R93546018-1.pdf
Size
23.53 KB
Format
Adobe PDF
Checksum
(MD5):f93433f3aacdca05c26c28a362234ff9