Novel Linear/Nonlinear Classification Methods using Fukunaga-Koontz and Fisher Discriminant approaches
Date Issued
2006
Author(s)
Chen, Po-Liang
Abstract
Fisher’s Linear Discriminant (FLD) and the Fukunaga-Koontz Linear Discriminant (FKLD) have different characteristics in classification analysis. However, they share the same objective: to find a “discriminant vector” onto which training instances are projected, yielding “discriminant scores” that can be used to discriminate between classes.
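The two-class projection described above can be sketched as follows. This is a minimal, generic FLD implementation (the classical form w = Sw⁻¹(m1 − m2)), not the thesis’s exact formulation; the function names are hypothetical.

```python
import numpy as np

def fisher_discriminant(X1, X2):
    """Return the discriminant vector w = Sw^{-1} (m1 - m2), where Sw is
    the within-class scatter summed over the two classes."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter: sum of the per-class scatter matrices.
    S1 = (X1 - m1).T @ (X1 - m1)
    S2 = (X2 - m2).T @ (X2 - m2)
    w = np.linalg.solve(S1 + S2, m1 - m2)
    return w / np.linalg.norm(w)

def discriminant_scores(X, w):
    """Project instances onto w to obtain the discriminant scores."""
    return X @ w
```

A new instance is then classified by comparing its score with a threshold (e.g. the midpoint of the projected class means).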
In this research, we propose novel approaches to address issues with the FKLD approach and improve its performance. First, to make the FKLD approach more meaningful for classification, we combine FKLD with the concept of Principal Component Analysis (PCA). We then extend the FKLD approach with two “discriminant spaces” to improve classification accuracy, and name the resulting linear classification method the “Two-Space Fukunaga-Koontz Linear Discriminant (Two-Space FKLD)”. Because the Two-Space FKLD method is still insufficient in some cases, we further combine it with the concept of FLD analysis. Finally, we extend the novel classification methods to multi-class classification problems using the one-against-one approach.
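The one-against-one reduction mentioned above can be sketched generically: train one binary discriminant per unordered class pair and combine their decisions by majority vote. The helpers `train_pair` and `predict_pair` are hypothetical stand-ins for any binary method (such as the Two-Space FKLD); this is not the thesis’s implementation.

```python
from itertools import combinations
from collections import Counter

def one_against_one(train_pair, data_by_class):
    """Fit one binary model per unordered pair of class labels."""
    return {(a, b): train_pair(data_by_class[a], data_by_class[b])
            for a, b in combinations(sorted(data_by_class), 2)}

def vote(models, predict_pair, x):
    """Each pairwise model casts one vote; return the majority class.
    predict_pair(model, x) is True when x is assigned to the first
    class of the pair."""
    ballots = [a if predict_pair(m, x) else b
               for (a, b), m in models.items()]
    return Counter(ballots).most_common(1)[0][0]
```

For k classes this trains k(k − 1)/2 binary models, each on only two classes’ data, which keeps every subproblem a two-class discriminant problem.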
Because linear methods are insufficient for analyzing data with nonlinear characteristics, the nonlinear Kernel Fisher Discriminant (KFD) extends the FLD using a kernel function. KFD transforms instances from the original attribute space into a higher-dimensional feature space. In this research we also develop a kernel-based nonlinear Fukunaga-Koontz discriminant approach called the “Two-Space Kernel Fukunaga-Koontz Discriminant (Two-Space KFKD)”, which can effectively discriminate data with nonlinear characteristics.
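The kernel trick underlying this mapping can be sketched as follows: inner products in the implicit high-dimensional feature space are computed via a kernel function on the original attributes, so the feature space is never constructed explicitly. The RBF kernel here is one common choice, not necessarily the one used in this thesis.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2), i.e. the
    feature-space inner product <phi(x_i), phi(y_j)> for the RBF map."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq_dists)
```

A kernel discriminant method then solves the Fisher-type criterion with this Gram matrix in place of raw inner products, so only the n × n kernel matrix is needed regardless of the feature space’s dimension.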
Finally, we demonstrate our proposed methods on three real-world datasets in our case study and compare the performance of various classifiers on these datasets. The results indicate that our methods perform comparably, and in some cases superiorly, to the FLD, KFD, and SVM approaches.
Subjects
Fisher’s Linear Discriminant
Fukunaga-Koontz Linear Discriminant
Kernel Fisher Discriminant
Type
thesis
File(s)
Name
ntu-95-R93546003-1.pdf
Size
23.53 KB
Format
Adobe PDF
Checksum
(MD5):cfc5573f0d06508f8dd4c6fa6270d5aa