Organizational Unit:
School of Computational Science and Engineering

Publication Search Results

Fast Linear Discriminant Analysis using QR Decomposition and Regularization

2007-03-23. Park, Haesun; Drake, Barry L.; Lee, Sangmin; Park, Cheong Hee

Linear Discriminant Analysis (LDA) is one of the most effective dimension reduction methods for classification, providing a high degree of class separability in numerous applications across science and engineering. However, problems arise with this classical method when one or both of the scatter matrices are singular. Singular scatter matrices are not unusual in many applications, especially for high-dimensional data. For high-dimensional undersampled and oversampled problems, classical LDA requires modification in order to remain applicable. In recent work, the generalized singular value decomposition (GSVD) has been shown to mitigate the issue of singular scatter matrices, and the resulting algorithm, LDA/GSVD, has proven very robust for many machine learning applications. However, the GSVD inherently carries considerable computational overhead. In this paper, we propose fast algorithms based on the QR decomposition and regularization that remove the LDA/GSVD computational bottleneck. In addition, we present fast algorithms for classical LDA and regularized LDA that build on the LDA/GSVD framework and use preprocessing by the Cholesky decomposition. Experimental results demonstrate substantial speedups for classical LDA, regularized LDA, and LDA/GSVD, without any sacrifice in classification performance, across a wide range of machine learning applications.
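
The regularization idea mentioned in the abstract can be illustrated with a short sketch: regularized LDA solved via a Cholesky factorization of the regularized within-class scatter matrix, which turns the generalized eigenvalue problem Sb w = lambda (Sw + reg*I) w into an ordinary symmetric one. This is a minimal illustration under those assumptions, written in Python with NumPy/SciPy; it is not the authors' QR-based or LDA/GSVD algorithm, and the function name regularized_lda and the parameter reg are chosen here only for exposition.

import numpy as np
from scipy.linalg import cholesky, solve_triangular, eigh

def regularized_lda(X, y, n_components=None, reg=1e-3):
    # Regularized LDA: find directions W that maximize between-class scatter
    # relative to the regularized within-class scatter Sw + reg*I.
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    classes = np.unique(y)
    d = X.shape[1]
    mean_all = X.mean(axis=0)

    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += Xc.shape[0] * (diff @ diff.T)

    # Cholesky factor of the regularized within-class scatter: Sw + reg*I = L L^T
    L = cholesky(Sw + reg * np.eye(d), lower=True)

    # Whitened between-class scatter M = L^{-1} Sb L^{-T}; its eigenvectors,
    # mapped back through L^{-T}, solve Sb w = lambda (Sw + reg*I) w.
    M = solve_triangular(L, solve_triangular(L, Sb, lower=True).T, lower=True)
    evals, V = eigh(M)

    k = n_components if n_components is not None else len(classes) - 1
    order = np.argsort(evals)[::-1][:k]
    return solve_triangular(L.T, V[:, order], lower=False)  # columns are directions

For example, W = regularized_lda(X, y) followed by X @ W projects the data onto at most (number of classes - 1) discriminant directions. Note that this sketch forms the full d-by-d scatter matrices, which is exactly the cost the paper's QR-based approach avoids for high-dimensional data.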

A Comparison of Generalized Linear Discriminant Analysis Algorithms

2006-01-28. Park, Cheong Hee; Park, Haesun

Linear Discriminant Analysis (LDA) is a dimension reduction method that finds an optimal linear transformation maximizing class separability. However, in undersampled problems, where the number of data samples is smaller than the dimension of the data space, LDA is difficult to apply because the high dimensionality makes the scatter matrices singular. To make LDA applicable in this setting, several generalizations have been proposed recently. In this paper, we present theoretical and algorithmic relationships among several generalized LDA algorithms and compare their computational complexities and performance in text classification and face recognition. Toward a practical dimension reduction method for high-dimensional data, we propose an efficient algorithm that greatly reduces the computational complexity while achieving competitive prediction accuracy. We also present nonlinear extensions of these LDA algorithms based on kernel methods: a generalized eigenvalue problem can be formulated in the kernel-induced feature space, and the generalized LDA algorithms can be applied to solve it, resulting in nonlinear discriminant analysis. The performance of these linear and nonlinear discriminant analysis algorithms is compared extensively.
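
The kernel-based generalized eigenvalue problem described in the abstract can be sketched as follows: with a Gaussian kernel, between- and within-class analogues of the scatter matrices are built from the training kernel matrix, and a regularized symmetric generalized eigenvalue problem is solved for the expansion coefficients of the discriminant directions. This is a generic kernel discriminant analysis sketch in Python with NumPy/SciPy, not any of the specific algorithms compared in the paper; the kernel choice, the regularization term reg, and the function names are illustrative assumptions.

import numpy as np
from scipy.linalg import eigh

def rbf_kernel(A, B, gamma=0.1):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    sq = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-gamma * sq)

def kernel_lda(X, y, n_components=None, gamma=0.1, reg=1e-3):
    # Kernel discriminant analysis: solve M a = lambda (N + reg*I) a, where
    # M and N play the roles of between- and within-class scatter in the
    # kernel-induced feature space and a holds expansion coefficients.
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    classes = np.unique(y)
    n = X.shape[0]

    K = rbf_kernel(X, X, gamma)   # n x n training kernel matrix
    m_all = K.mean(axis=1)

    M = np.zeros((n, n))
    N = np.zeros((n, n))
    for c in classes:
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                        # kernel columns for class c
        m_c = Kc.mean(axis=1)
        diff = (m_c - m_all).reshape(-1, 1)
        M += len(idx) * (diff @ diff.T)       # between-class part
        H = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ H @ Kc.T                    # within-class part

    k = n_components if n_components is not None else len(classes) - 1
    evals, A = eigh(M, N + reg * np.eye(n))   # generalized symmetric eigenproblem
    order = np.argsort(evals)[::-1][:k]
    return A[:, order]

Given A = kernel_lda(X_train, y_train), new samples are projected as rbf_kernel(X_new, X_train, gamma) @ A; the discriminant directions live in the span of the training samples and are evaluated through kernel functions only, which is what makes the nonlinear extension tractable.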