Person:
Park, Haesun


Publication Search Results

  • Item
    Multiclass Classifiers Based on Dimension Reduction with Generalized LDA
    (Georgia Institute of Technology, 2006-01-27) Kim, Hyunsoo ; Drake, Barry L. ; Park, Haesun
    Linear discriminant analysis (LDA) has been widely used for dimension reduction of data sets with multiple classes. LDA has recently been extended to various generalized LDA methods that are applicable regardless of the relative sizes of the data dimension and the number of data items. In this paper, we propose several multiclass classifiers based on generalized LDA algorithms, taking advantage of the dimension-reducing transformation matrix without requiring additional training or any parameter optimization. A marginal linear discriminant classifier, a Bayesian linear discriminant classifier, and a one-dimensional Bayesian linear discriminant classifier are introduced for multiclass classification. Our experimental results illustrate that these classifiers produce higher ten-fold cross-validation accuracy than kNN and centroid-based classification in the reduced dimensional space, providing efficient general multiclass classifiers.
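    The pipeline this abstract describes — a generalized-LDA dimension-reducing transformation followed by a simple classifier in the reduced space — can be sketched roughly as below. This is a minimal illustration only: the pseudoinverse-based eigensolver and the nearest-centroid classifier are stand-in assumptions, not the paper's actual algorithms (which include marginal and Bayesian linear discriminant classifiers).

    ```python
    import numpy as np

    def generalized_lda(X, y, k):
        """Dimension-reducing transformation in the spirit of generalized LDA.
        Uses a pseudoinverse so it does not fail outright when the
        within-class scatter Sw is singular (illustrative choice)."""
        classes = np.unique(y)
        mean = X.mean(axis=0)
        Sw = np.zeros((X.shape[1], X.shape[1]))  # within-class scatter
        Sb = np.zeros_like(Sw)                   # between-class scatter
        for c in classes:
            Xc = X[y == c]
            mc = Xc.mean(axis=0)
            Sw += (Xc - mc).T @ (Xc - mc)
            d = (mc - mean).reshape(-1, 1)
            Sb += len(Xc) * (d @ d.T)
        evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
        order = np.argsort(-evals.real)
        return evecs[:, order[:k]].real          # columns span the reduced space

    def nearest_centroid_predict(Z_train, y_train, Z_test):
        """Centroid-based classification in the reduced space."""
        classes = np.unique(y_train)
        cents = np.array([Z_train[y_train == c].mean(axis=0) for c in classes])
        dist = ((Z_test[:, None, :] - cents[None, :, :]) ** 2).sum(axis=2)
        return classes[np.argmin(dist, axis=1)]

    # Toy 3-class data: well-separated class means, 5 features.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(m, 0.3, size=(20, 5)) for m in (0.0, 2.0, 4.0)])
    y = np.repeat([0, 1, 2], 20)

    G = generalized_lda(X, y, k=2)   # reduce 5 -> 2 dims (at most #classes - 1 useful)
    Z = X @ G
    pred = nearest_centroid_predict(Z, y, Z)
    print((pred == y).mean())        # training accuracy on the toy set
    ```

    Note that, as the abstract emphasizes, once the transformation matrix G is available the classification step needs no additional training or parameter optimization.
    
    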
  • Item
    Relationships Between Support Vector Classifiers and Generalized Linear Discriminant Analysis on Support Vectors
    (Georgia Institute of Technology, 2006) Kim, Hyunsoo ; Drake, Barry L. ; Park, Haesun
    Linear discriminant analysis based on the generalized singular value decomposition (LDA/GSVD) has been introduced to circumvent the nonsingularity restriction inherent in classical LDA. LDA/GSVD provides a framework in which a dimension-reducing transformation can be effectively obtained for undersampled problems. In this paper, relationships between support vector machines (SVMs) and generalized linear discriminant analysis applied to the support vectors are studied. Based on the GSVD, the weight vector of the hard-margin SVM is proved to be equivalent to the dimension-reducing transformation vector generated by LDA/GSVD applied to the support vectors of the binary classes. We also show that the dimension-reducing transformation vector and the weight vector of soft-margin SVMs are related when a subset of the support vectors is considered. These results generalize to kernelized SVMs and the kernelized LDA/GSVD, called KDA/GSVD. Through these relationships, it is shown that support vector classification is related to data reduction as well as to dimension reduction by LDA/GSVD.
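    The stated equivalence — the hard-margin SVM weight vector matching the LDA/GSVD direction computed on the support vectors — can be checked numerically in the simplest case of one support vector per class, where the SVM weight is proportional to the difference of the two support vectors. The discriminant routine below is an illustrative stand-in (it prefers directions in the null space of the within-class scatter, falling back to a pseudoinverse), not the paper's GSVD-based construction:

    ```python
    import numpy as np

    def lda_gsvd_direction(X, y):
        """Binary-class discriminant direction in the spirit of LDA/GSVD,
        usable even when the within-class scatter Sw is singular
        (the undersampled case). Illustrative sketch, not the GSVD algorithm."""
        m1, m0 = X[y == 1].mean(axis=0), X[y == 0].mean(axis=0)
        d = m1 - m0
        Sw = np.zeros((X.shape[1], X.shape[1]))
        for c in (0, 1):
            Xc = X[y == c] - X[y == c].mean(axis=0)
            Sw += Xc.T @ Xc
        # Projector onto the null space of Sw via its numerical rank.
        U, s, _ = np.linalg.svd(Sw)
        r = int((s > 1e-10 * max(s.max(), 1e-30)).sum())
        null_proj = np.eye(X.shape[1]) - U[:, :r] @ U[:, :r].T
        w = null_proj @ d                      # prefer null(Sw) directions
        if np.linalg.norm(w) < 1e-12:          # otherwise fall back to Sw^+ d
            w = np.linalg.pinv(Sw) @ d
        return w / np.linalg.norm(w)

    # Simplest separable case: exactly one support vector per class.
    sv_pos = np.array([1.0, 1.0])
    sv_neg = np.array([-1.0, -1.0])
    X_sv = np.vstack([sv_neg, sv_pos])
    y_sv = np.array([0, 1])

    # With one support vector per class, the hard-margin SVM weight
    # is proportional to their difference.
    w_svm = sv_pos - sv_neg
    w_svm = w_svm / np.linalg.norm(w_svm)

    w_lda = lda_gsvd_direction(X_sv, y_sv)
    print(abs(w_svm @ w_lda))   # cosine similarity between the two directions
    ```

    Here Sw on the support vectors is the zero matrix, so the discriminant direction comes entirely from the between-class separation and coincides (up to sign and scale) with the SVM weight, matching the equivalence the abstract claims in this degenerate case.
    
    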