
Generalized Discriminant Analysis Using a Kernel Approach

Published: 01 October 2000

Abstract

We present a new method, which we call generalized discriminant analysis (GDA), for performing nonlinear discriminant analysis using a kernel function operator. The underlying theory is close to that of support vector machines (SVM) insofar as the GDA method provides a mapping of the input vectors into a high-dimensional feature space. In the transformed space, linear properties make it easy to extend and generalize the classical linear discriminant analysis (LDA) to nonlinear discriminant analysis. The formulation is expressed as the resolution of an eigenvalue problem. By using different kernels, one can cover a wide class of nonlinearities. For both simulated data and alternate kernels, we give classification results as well as the shape of the decision function. The results are confirmed using real data to perform seed classification.
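To illustrate the idea sketched in the abstract — map inputs into a kernel-induced feature space, then apply Fisher-style discriminant analysis there via an eigenvalue/linear-system formulation — here is a minimal two-class kernel Fisher discriminant in NumPy. This is a close relative of GDA, not the paper's exact multiclass formulation; the RBF kernel choice, the ridge regularizer `reg`, and all function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_fisher_direction(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant.

    Returns dual coefficients alpha so that a new point x projects to
    sum_i alpha_i k(x_i, x). Everything is expressed through the kernel
    matrix, so the feature-space mapping is never computed explicitly.
    """
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    classes = np.unique(y)
    # Kernel-space class means, expressed as length-n dual vectors.
    m = [K[:, y == c].mean(axis=1) for c in classes]
    # Within-class scatter in the feature space: N = sum_c K_c (I - 1/n_c) K_c^T.
    N = np.zeros((n, n))
    for c in classes:
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T
    # For two classes, the leading eigenvector of N^{-1} M reduces to
    # solving (N + reg*I) alpha = m_0 - m_1 (ridge term for stability).
    alpha = np.linalg.solve(N + reg * np.eye(n), m[0] - m[1])
    return alpha

def project(alpha, Xtrain, Xnew, gamma=1.0):
    # One-dimensional discriminant projection of new points.
    return rbf_kernel(Xnew, Xtrain, gamma) @ alpha
```

With a nonlinear kernel such as the RBF above, this separates classes that no linear discriminant can (e.g., a cluster surrounded by a ring), which is the "wide class of nonlinearities" the abstract refers to.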

Published In

Neural Computation, Volume 12, Issue 10, October 2000, 242 pages

Publisher

MIT Press, Cambridge, MA, United States