dc.description.abstract | Fisher’s Linear Discriminant (FLD) and the Fukunaga-Koontz Linear Discriminant (FKLD) have different characteristics in classification analysis. However, they share the same objective: to find a “discriminant vector” onto which the projections of the training instances, called “discriminant scores”, can be used to discriminate between classes.
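The discriminant-vector idea can be illustrated with a minimal two-class sketch. This is a plain NumPy illustration of classical FLD (w = S_w^{-1}(m1 - m2)), not the thesis's FKLD variant; the function and variable names are ours:

```python
import numpy as np

def fisher_discriminant(X1, X2):
    """Two-class Fisher's Linear Discriminant: w = Sw^{-1} (m1 - m2)."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter: sum of the per-class scatter matrices
    S1 = (X1 - m1).T @ (X1 - m1)
    S2 = (X2 - m2).T @ (X2 - m2)
    Sw = S1 + S2
    w = np.linalg.solve(Sw, m1 - m2)  # the discriminant vector
    return w / np.linalg.norm(w)

# Toy data: two well-separated Gaussian classes
rng = np.random.default_rng(0)
X1 = rng.normal([0.0, 0.0], 1.0, size=(50, 2))
X2 = rng.normal([3.0, 3.0], 1.0, size=(50, 2))
w = fisher_discriminant(X1, X2)
scores1, scores2 = X1 @ w, X2 @ w  # projections = "discriminant scores"
```

Projecting each instance onto `w` reduces classification to thresholding a single scalar score.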
In this research, we propose novel approaches to address the issues of the FKLD approach and improve its performance. First, to make the FKLD approach more meaningful for classification, we combine it with the concept of Principal Component Analysis (PCA). We then extend the FKLD approach with two “discriminant spaces” to improve classification accuracy, and name the resulting linear classification method the “Two-Space Fukunaga-Koontz Linear Discriminant (Two-Space FKLD)”. Because the Two-Space FKLD method is still insufficient in some cases, we further combine it with the concept of FLD analysis. Finally, we extend these novel classification methods to multi-class classification problems using the one-against-one approach.
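The one-against-one extension can be sketched generically: one binary model is trained per pair of classes, and a new instance is assigned the class that wins the most pairwise votes. The binary rule below (nearest class mean along the mean-difference direction) is only a stand-in, since the internals of the Two-Space FKLD classifier are not given here; all names are ours:

```python
import numpy as np
from itertools import combinations

def fit_binary(Xa, Xb):
    """Stand-in binary rule: threshold the mean-difference projection at the midpoint."""
    ma, mb = Xa.mean(axis=0), Xb.mean(axis=0)
    w = ma - mb
    t = w @ (ma + mb) / 2.0
    return lambda x: (w @ x) > t  # True -> first class of the pair

def train_pairwise(X, y, fit):
    """One-against-one: fit one binary model per unordered class pair."""
    return {(a, b): fit(X[y == a], X[y == b])
            for a, b in combinations(np.unique(y), 2)}

def predict_vote(x, models):
    """Each pairwise model casts one vote; the class with most votes wins."""
    votes = {}
    for (a, b), predict in models.items():
        winner = a if predict(x) else b
        votes[winner] = votes.get(winner, 0) + 1
    return max(votes, key=votes.get)

# Usage: three Gaussian classes -> 3 pairwise models
rng = np.random.default_rng(1)
means = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
X = np.vstack([rng.normal(m, 0.5, size=(30, 2)) for m in means])
y = np.repeat([0, 1, 2], 30)
models = train_pairwise(X, y, fit_binary)
```

For k classes this trains k(k-1)/2 binary classifiers; any two-class method, such as the Two-Space FKLD, can be plugged in as `fit`.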
Because linear methods are insufficient for analyzing data with nonlinear characteristics, the Kernel Fisher Discriminant (KFD) extends FLD through kernel functions: KFD implicitly transforms the instances from the original attribute space into a higher-dimensional feature space. In this research we also develop a kernel-based nonlinear Fukunaga-Koontz discriminant approach called the “Two-Space Kernel Fukunaga-Koontz Discriminant (Two-Space KFKD)”, which can also effectively discriminate data with nonlinear characteristics.
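The kernel mechanism can be sketched with the dual-form KFD of Mika et al. [10]: all computations use only kernel evaluations, so the feature space is never formed explicitly. The RBF kernel, the regularizer `mu`, and all names are our assumptions for illustration, not the thesis's Two-Space KFKD:

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    """RBF kernel matrix k(a, b) = exp(-gamma * ||a - b||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfd(X1, X2, gamma=1.0, mu=1e-3):
    """Two-class Kernel Fisher Discriminant in dual form (after Mika et al.)."""
    X = np.vstack([X1, X2])
    n, l1, l2 = len(X), len(X1), len(X2)
    K1, K2 = rbf(X, X1, gamma), rbf(X, X2, gamma)   # n x l_i kernel blocks
    M1, M2 = K1.mean(axis=1), K2.mean(axis=1)       # class mean kernel columns
    # Within-class scatter in feature space: N = sum_i K_i (I - 1/l_i) K_i^T
    N = sum(Ki @ (np.eye(l) - np.full((l, l), 1.0 / l)) @ Ki.T
            for Ki, l in [(K1, l1), (K2, l2)])
    # Regularized dual discriminant coefficients
    alpha = np.linalg.solve(N + mu * np.eye(n), M1 - M2)
    return lambda Z: rbf(Z, X, gamma) @ alpha       # discriminant scores

# Usage: a nonlinear problem (inner cluster vs. surrounding ring)
rng = np.random.default_rng(2)
X1 = rng.normal(0.0, 0.3, size=(40, 2))
theta = rng.uniform(0.0, 2 * np.pi, 40)
X2 = np.c_[3 * np.cos(theta), 3 * np.sin(theta)] + rng.normal(0.0, 0.2, size=(40, 2))
project = kfd(X1, X2, gamma=0.5)
s1, s2 = project(X1), project(X2)
```

No linear discriminant vector in the original two-dimensional space can separate these classes, but the kernelized scores do.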
Finally, we demonstrate our proposed methods on three real-world datasets in a case study, and compare the performance of various classifiers on these datasets. The results indicate that our methods perform comparably, and sometimes superiorly, to the FLD, KFD, and SVM approaches. | en |
dc.relation.reference | [1] Sandrine Dudoit and Robert Gentleman, “Classification in microarray experiments”, 2003.
[2] Qingshan Liu, Hanqing Lu, and Songde Ma, “Improving Kernel Fisher Discriminant Analysis for Face Recognition”, IEEE Transactions on Circuits and Systems for Video Technology, Vol. 14, No. 1, January 2004.
[3] Donald H. Foley and John W. Sammon, Jr., “An Optimal Set of Discriminant Vectors”, IEEE Transactions on Computers, Vol. C-24, No. 3, March 1975, 281-289.
[4] John, “Applied Multivariate Statistical Analysis- Discriminant and classification”, 2002.
[5] Jianguo Zhang and Kai-Kuang Ma, “Kernel Fisher Discriminant for Texture Classification”, 2004.
[6] Gene H. Golub, Michael Heath, and Grace Wahba, “Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter”, Technometrics, Vol. 21, No. 2 (May 1979), pp. 215-223.
[7] Fisher, R.A. (1936). The use of multiple measurements in taxonomic problems. Annals of Eugenics 7, 179-188.
[8] Keinosuke Fukunaga and Warren L. G. Koontz, “Application of the Karhunen-Loève Expansion to Feature Selection and Ordering”, IEEE Transactions on Computers, Vol. C-19, No. 4, April 1970, 311-318.
[9] R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification. New York: Wiley-Interscience, 2001.
[10] Mika, S., Rätsch, G., Weston, J., Schölkopf, B., & Müller, K.-R. (1999). “Fisher Discriminant Analysis With Kernels”. IEEE International Workshop on Neural Networks for Signal Processing, Vol. IX, Madison, USA, August 1999, 41-48.
[11] B. Schölkopf, A. Smola, & K-R Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10: 1299-1319, 1998.
[12] I. T. Jolliffe. Principal Component Analysis. Springer-Verlag, New York, 1986.
[13] B. Schölkopf, A. Smola, & K.-R. Müller. Kernel principal component analysis. In B. Schölkopf, C. J. C. Burges, & A. J. Smola, editors, Advances in Kernel Methods – Support Vector Learning, 327-352. MIT Press, Cambridge, MA, 1999.
[14] B. Schölkopf, C. J. C. Burges, & A. J. Smola, editors. Advances in Kernel Methods – Support Vector Learning. MIT Press, Cambridge, MA, 1999.
[15] S. Saitoh, Theory of Reproducing Kernels and its Applications, Longman Scientific & Technical, Harlow, England, 1988.
[16] Malte Kuss, “Nonlinear Multivariate Analysis,” Technische Universität Berlin, Diplomarbeit, 14. Februar 2002.
[17] Chih-Wei Hsu, Chih-Chung Chang, and Chih-Jen Lin, “A Practical Guide to Support Vector Classification,” available at http://www.csie.ntu.edu.tw/~cjlin/papers/guide/guide.pdf.
[18] Chih-Chung Chang and Chih-Jen Lin. LIBSVM: A Library for Support Vector Machines, 2001. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.
[19] University of Minnesota. 2006. Computer Science & Engineering. 12 Jun. 2006 http://www-users.cs.umn.edu/~hpark/data.html.
[20] UCI database available at http://www.ics.uci.edu/~mlearn/MLSummary.html. | en |