KFD分析的分類方法與評估研究
Classification Methods and Their Evaluation for Kernel Fisher Discriminant Analysis
Date Issued
2005
Author(s)
Chou, Hsuan-Hsien
DOI
Abstract
Fisher’s Linear Discriminant (FLD) analysis is an efficient method for feature extraction. Its objective is to find the direction along which the linear projections of the training instances, called “discriminant scores”, provide the maximal separability between classes. Kernel Fisher Discriminant (KFD) analysis maps the training instances into a higher-dimensional space and performs FLD analysis there. This nonlinear approach is made feasible by reformulating the FLD problem solely in terms of inner products and applying the kernel trick.
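The kernel-trick reformulation described above can be sketched as follows for the two-class case. This is a minimal illustration, not the thesis’s implementation: the RBF kernel, the function names, and the ridge-style regularization of the within-class scatter are all assumptions made for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF (Gaussian) kernel matrix between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfd_train(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant.

    Returns the dual coefficients alpha such that the discriminant score of
    a point x is sum_j alpha_j * k(x_j, x), maximizing between-class over
    within-class scatter in the kernel-induced feature space.
    """
    K = rbf_kernel(X, X, gamma)          # n x n Gram matrix
    n = len(y)
    N = np.zeros((n, n))                 # within-class scatter (dual form)
    M = []                               # per-class kernel mean vectors
    for c in (0, 1):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                   # columns for class c
        M.append(Kc.mean(axis=1))
        nc = len(idx)
        H = np.eye(nc) - np.full((nc, nc), 1.0 / nc)   # centering matrix
        N += Kc @ H @ Kc.T
    N += reg * np.eye(n)                 # regularize to keep N invertible
    return np.linalg.solve(N, M[0] - M[1])

def kfd_scores(alpha, X_train, X_test, gamma=1.0):
    # Discriminant score of each test point via the kernel expansion.
    return rbf_kernel(X_test, X_train, gamma) @ alpha
```

Note that the algorithm never computes coordinates in the feature space; only the Gram matrix of inner products is needed, which is what makes the nonlinear extension tractable.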
Based on the discriminant scores, various classifiers can be used to classify testing instances. The nearest center classifier is usually combined with FLD and KFD, but when the score distribution is skewed it yields weak classification performance regardless of whether Euclidean or Mahalanobis distance is used. We propose two classifiers to improve classification in this situation. Both use only a proportion of the training instances to compute the covariance matrix. One method selects instances based on the included angle defined by the testing and training instances; the other selects them according to the hyperplanes determined by paired-class centers. The advantage of these classifiers is that excluding the training instances deemed irrelevant avoids a loose estimate of the covariance matrix.
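The general idea of a nearest-center rule with a selectively estimated covariance matrix can be sketched as below. The abstract does not specify the exact selection rules, so the angle threshold, the fallback behavior, and all function names here are hypothetical illustrations of the approach, not the thesis’s actual classifiers.

```python
import numpy as np

def nearest_center_predict(z, centers, covs):
    """Assign score vector z to the class whose center is nearest under
    that class's Mahalanobis distance (covariance covs[c])."""
    dists = [(z - mu) @ np.linalg.inv(S) @ (z - mu)
             for mu, S in zip(centers, covs)]
    return int(np.argmin(dists))

def angle_selected_cov(Z_c, mu_c, z_test, max_angle_deg=60.0, reg=1e-6):
    """Hypothetical angle-based instance selection: estimate the class
    covariance from only those training scores whose direction from the
    class center lies within max_angle_deg of the test point's direction."""
    u = z_test - mu_c
    u = u / (np.linalg.norm(u) + 1e-12)          # unit vector toward test point
    V = Z_c - mu_c
    cosines = V @ u / (np.linalg.norm(V, axis=1) + 1e-12)
    keep = cosines >= np.cos(np.deg2rad(max_angle_deg))
    sel = Z_c[keep] if keep.sum() >= 2 else Z_c  # fall back to all instances
    return np.cov(sel, rowvar=False) + reg * np.eye(Z_c.shape[1])
```

The selection step restricts the covariance estimate to the region of score space relevant to the test instance, which is the mechanism the abstract credits for avoiding a loose covariance estimate under skewed score distributions.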
Another issue we address is how sensitive a classifier is to environment changes. The environment changes we consider include changes in parameter values, training dataset size, and training dataset sampling. We also discuss combined sensitivity, i.e., how a classifier’s sensitivity to one type of environment change is affected by the other types.
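One common way to probe such sensitivity is to retrain a classifier under repeated perturbations of one environment factor and measure the spread of the resulting accuracies. The sketch below varies the training-set size and sampling jointly; it is an assumed generic procedure, not the evaluation protocol used in the thesis.

```python
import numpy as np

def sensitivity_to_sampling(train_fn, eval_fn, X, y,
                            fractions=(0.5, 0.7, 0.9),
                            n_resamples=10, seed=0):
    """For each training-set fraction, retrain on random subsamples and
    record the standard deviation of accuracy across resamples.
    A larger spread indicates higher sensitivity to that change."""
    rng = np.random.default_rng(seed)
    n = len(y)
    spread = {}
    for frac in fractions:
        accs = []
        for _ in range(n_resamples):
            idx = rng.choice(n, size=int(frac * n), replace=False)
            model = train_fn(X[idx], y[idx])     # retrain under perturbation
            accs.append(eval_fn(model, X, y))    # evaluate on the full set
        spread[frac] = float(np.std(accs))
    return spread
```

Combined sensitivity can then be examined by nesting a second perturbation (e.g., a kernel parameter sweep) around this loop and comparing how the spread changes.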
Finally, we demonstrate the proposed methods on two real-world datasets in a case study. We compare the performance of various classifiers on these datasets and perform sensitivity analysis on them. The results indicate that the performance of the proposed methods is comparable, and sometimes superior, to that of other KFD-based classifiers. We also find that one of our methods is more sensitive to the various environment changes than the other classifiers.
Subjects
核心費雪區別
最近中心點分類法
馬氏距離
敏感度分析
kernel Fisher discriminant
nearest center classifier
Mahalanobis distance
sensitivity analysis
Type
thesis
File(s)
Name
ntu-94-R92546001-1.pdf
Size
23.53 KB
Format
Adobe PDF
Checksum
(MD5):f5c74de61166419864a63c31db656644
