• Title/Summary/Keyword: Linear Discriminant

Linear Discriminant Clustering in Pattern Recognition

  • Sun, Zhaojia;Choi, Mi-Seon;Kim, Young-Kuk
    • Proceedings of the IEEK Conference / 2008.06a / pp.717-718 / 2008
  • Fisher Linear Discriminant (FLD) is a simple and intuitive linear feature extraction method in pattern recognition. However, in some special cases, such as non-separable data or a class whose data is dispersed into several clusters, FLD does not work well. In this paper, a new discriminant named K-means Fisher Linear Discriminant, which combines FLD with K-means clustering, is proposed. It deals with such cases efficiently, possessing not only FLD's global-view merit but also K-means' local-view property. Finally, simulation results demonstrate its advantage over K-means and FLD individually.
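
The idea the abstract describes can be sketched as follows: split each class into k sub-clusters with K-means, then apply Fisher's criterion over the resulting sub-classes, so that multi-modal classes are handled locally. This is a hypothetical minimal reconstruction in NumPy, not the authors' implementation; the function names and the choice of k are assumptions.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's algorithm; returns a cluster label per row of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def kmeans_fld(X1, X2, k=2):
    """K-means FLD sketch: treat each K-means cluster of each class as a
    sub-class, then maximize between- over within-sub-class scatter."""
    groups = [X1[kmeans(X1, k) == j] for j in range(k)]
    groups += [X2[kmeans(X2, k) == j] for j in range(k)]
    groups = [g for g in groups if len(g) > 0]
    m = np.vstack(groups).mean(axis=0)
    Sw = sum((g - g.mean(0)).T @ (g - g.mean(0)) for g in groups)
    Sb = sum(len(g) * np.outer(g.mean(0) - m, g.mean(0) - m) for g in groups)
    # leading eigenvector of Sw^{-1} Sb gives the discriminant direction
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / np.linalg.norm(w)
```

With a bimodal class (two blobs) facing a unimodal one, plain FLD averages the two modes into one misleading mean, while the sub-class scatters above keep each mode separate.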

Transformation Technique for Null Space-Based Linear Discriminant Analysis with Lagrange Method (라그랑지 기법을 쓴 영 공간 기반 선형 판별 분석법의 변형 기법)

  • Hou, Yuxi;Min, Hwang-Ki;Song, Iickho;Choi, Myeong Soo;Park, Sun;Lee, Seong Ro
    • The Journal of Korean Institute of Communications and Information Sciences / v.38C no.2 / pp.208-212 / 2013
  • Due to the singularity of the within-class scatter, linear discriminant analysis (LDA) becomes ill-posed for small sample size (SSS) problems. An extension of LDA, the null space-based LDA (NLDA), provides good discriminant performance for SSS problems. In this paper, by applying the Lagrange technique, a procedure is derived that transforms the problem of finding the feature extractor of NLDA into a linear equation problem.
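
As a rough illustration of what NLDA computes (the Lagrange-based transformation itself is the paper's contribution and is not reproduced here): restrict attention to the null space of the within-class scatter Sw, where within-class variation vanishes, and maximize the between-class scatter Sb there. A hypothetical NumPy sketch:

```python
import numpy as np

def nlda(X, y, n_components=1, tol=1e-10):
    """Null space-based LDA sketch for small-sample-size data:
    find directions in null(Sw) that maximize between-class scatter Sb."""
    classes = np.unique(y)
    m = X.mean(axis=0)
    Sw = sum((X[y == c] - X[y == c].mean(0)).T @ (X[y == c] - X[y == c].mean(0))
             for c in classes)
    Sb = sum((y == c).sum() * np.outer(X[y == c].mean(0) - m, X[y == c].mean(0) - m)
             for c in classes)
    vals, vecs = np.linalg.eigh(Sw)
    N = vecs[:, vals < tol]                        # orthonormal basis of null(Sw)
    vals_b, vecs_b = np.linalg.eigh(N.T @ Sb @ N)  # maximize Sb inside null(Sw)
    return N @ vecs_b[:, ::-1][:, :n_components]   # top directions, back in input space
```

In the SSS regime (more features than samples) null(Sw) is non-trivial, and any direction in it projects each class onto a single point, which is why NLDA discriminates well there.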

A Study on the Optimal Discriminant Model Predicting the likelihood of Insolvency for Technology Financing (기술금융을 위한 부실 가능성 예측 최적 판별모형에 대한 연구)

  • Sung, Oong-Hyun
    • Journal of Korea Technology Innovation Society / v.10 no.2 / pp.183-205 / 2007
  • An investigation was undertaken of the optimal discriminant model for predicting in advance the likelihood of insolvency of medium-sized firms, based on technology evaluation. The explanatory variables in the discriminant model were selected by both factor analysis and stepwise discriminant analysis: five variables were selected by factor analysis in terms of explanatory ratio and communality, and six by stepwise discriminant analysis. The effectiveness of the linear and logistic discriminant models was assessed by the criteria of critical probability and correct classification rate. Results showed that both models had similar correct classification rates, and the linear discriminant model was preferred to the logistic one in terms of the critical probability criterion. For the linear discriminant model with a critical probability of 0.5, the total-group correct classification rate was 70.4%, and the rates for the insolvent and solvent groups were 73.4% and 69.5% respectively. The correct classification rate estimates the probability that the estimated discriminant function will correctly classify the present sample, whereas the actual correct classification rate estimates the probability that it will correctly classify a future observation. Unfortunately, the former overstates the latter, because the data set used to estimate the discriminant function is also used to evaluate it. Cross-validation was therefore used to estimate this bias: the estimated bias was 2.9%, giving a predicted actual correct classification rate of 67.5%. A threshold value was also set to establish an in-doubt category. The results of the linear discriminant model can be applied by technology financing banks to evaluate the possibility of insolvency and to rank applicant firms.
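
The optimism the abstract quantifies, a resubstitution rate that overstates future accuracy, can be illustrated on synthetic data with a two-group linear discriminant and leave-one-out cross-validation. This is a hedged sketch: the data, dimensions, and classifier details are assumptions, not the paper's.

```python
import numpy as np

def lda_predict(Xtr, ytr, Xte):
    """Two-group linear discriminant with equal priors: project on
    w = Sw^{-1}(m1 - m0) and classify by which side of the midpoint falls."""
    X0, X1 = Xtr[ytr == 0], Xtr[ytr == 1]
    m0, m1 = X0.mean(0), X1.mean(0)
    Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
    w = np.linalg.solve(Sw, m1 - m0)
    return (Xte @ w > w @ (m0 + m1) / 2).astype(int)

rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0.0, 1.0, (40, 5)), rng.normal(0.8, 1.0, (40, 5))])
y = np.repeat([0, 1], 40)

# apparent (resubstitution) rate: the same sample trains and evaluates the rule
apparent = (lda_predict(X, y, X) == y).mean()

# leave-one-out estimate of the actual (future) rate; the gap estimates the bias
loo = np.mean([lda_predict(np.delete(X, i, 0), np.delete(y, i), X[i:i + 1])[0] == y[i]
               for i in range(len(y))])
```

The difference `apparent - loo` plays the role of the 2.9% bias estimated in the study.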

Detection of Pathological Voice Using Linear Discriminant Analysis

  • Lee, Ji-Yeoun;Jeong, Sang-Bae;Choi, Hong-Shik;Hahn, Min-Soo
    • MALSORI / no.64 / pp.77-88 / 2007
  • Nowadays, mel-frequency cepstral coefficients (MFCCs) and Gaussian mixture models (GMMs) are widely used for pathological voice detection. This paper suggests a method to improve the performance of pathological/normal voice classification based on the MFCC-based GMM. We analyze the characteristics of the mel frequency-based filterbank energies using the Fisher discriminant ratio (FDR), and implement feature vectors through linear discriminant analysis (LDA) transformation of the filterbank energies (FBE) and the MFCCs. Accuracy is measured by a GMM classifier. This paper shows that the FBE-LDA-based GMM is a sufficiently discriminative method for pathological/normal voice classification, with a 96.6% classification rate. The proposed method outperforms the MFCC-based GMM, with a noticeable 54.05% error reduction.

Generalization of Fisher′s linear discriminant analysis via the approach of sliced inverse regression

  • Chen, Chun-Houh;Li, Ker-Chau
    • Journal of the Korean Statistical Society / v.30 no.2 / pp.193-217 / 2001
  • Despite the rich literature on discriminant analysis, much about this complicated subject remains to be explored. In this article, we study the theoretical foundation that supports Fisher's linear discriminant analysis (LDA) by setting up the classification problem under the dimension reduction framework of Li (1991) for introducing sliced inverse regression (SIR). Through the connection between SIR and LDA, our theory helps identify sources of strength and weakness in using CRIMCOORDS (Gnanadesikan 1977) as a graphical tool for displaying group separation patterns. This connection also leads to several ways of generalizing LDA for better exploration and exploitation of nonlinear data patterns.

Sonar Target Classification using Generalized Discriminant Analysis (일반화된 판별분석 기법을 이용한 능동소나 표적 식별)

  • Kim, Dong-wook;Kim, Tae-hwan;Seok, Jong-won;Bae, Keun-sung
    • Journal of the Korea Institute of Information and Communication Engineering / v.22 no.1 / pp.125-130 / 2018
  • Linear discriminant analysis is a statistical method generally used for dimensionality reduction of feature vectors or for class classification. In the case of a data set that cannot be linearly separated, however, linear separation can be achieved by mapping a feature vector into a higher-dimensional space using a nonlinear function. This method is called generalized discriminant analysis or kernel discriminant analysis. In this paper, we carried out target classification experiments with active sonar target signals available on the Internet, using both linear discriminant and generalized discriminant analysis methods, and analyzed and compared the results. For 104 test data, the LDA method showed a correct recognition rate of 73.08%, while the GDA method achieved 95.19%, which is also better than the conventional MLP or kernel-based SVM.
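
The abstract's key point, that non-linearly-separable data becomes linearly separable after a nonlinear map to a higher-dimensional space, can be demonstrated with XOR-style data and an explicit lift. The specific map and data below are illustrative assumptions; kernel discriminant analysis works with such maps implicitly via a kernel function.

```python
import numpy as np

def fisher_direction(X1, X2):
    """Two-class Fisher discriminant direction w ~ Sw^{-1}(m1 - m2),
    with a tiny ridge for numerical stability."""
    m1, m2 = X1.mean(0), X2.mean(0)
    S = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    return np.linalg.solve(S + 1e-8 * np.eye(len(m1)), m1 - m2)

def lift(X):
    """Hypothetical nonlinear map: append the product feature x1*x2,
    which makes XOR-style data linearly separable."""
    return np.hstack([X, X[:, :1] * X[:, 1:2]])

# XOR-like data: class A at (+,+) and (-,-); class B at (+,-) and (-,+)
rng = np.random.default_rng(0)
A = np.vstack([rng.normal([2, 2], 0.3, (25, 2)), rng.normal([-2, -2], 0.3, (25, 2))])
B = np.vstack([rng.normal([2, -2], 0.3, (25, 2)), rng.normal([-2, 2], 0.3, (25, 2))])

# no single line separates A from B in 2-D, but in the lifted 3-D space
# the classes separate cleanly along the Fisher direction
w = fisher_direction(lift(A), lift(B))
scores_A, scores_B = lift(A) @ w, lift(B) @ w
```

Replacing the explicit `lift` with a kernel (e.g. Gaussian) and working in the span of the training samples yields the generalized/kernel discriminant analysis the paper applies to sonar returns.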

Concave penalized linear discriminant analysis on high dimensions

  • Sunghoon Kwon;Hyebin Kim;Dongha Kim;Sangin Lee
    • Communications for Statistical Applications and Methods / v.31 no.4 / pp.393-408 / 2024
  • The sparse linear discriminant analysis can be incorporated into the penalized linear regression framework, but most studies have been limited to specific convex penalties, including the least absolute shrinkage and selection operator and its variants. Within this framework, concave penalties can serve as natural counterparts of the convex penalties. Implementing the concave penalized direction vector of discrimination appears to be straightforward, but developing its theoretical properties remains challenging. In this paper, we explore a class of concave penalties that covers the smoothly clipped absolute deviation and minimax concave penalties as examples. We prove that employing concave penalties guarantees an oracle property uniformly within this penalty class, even for high-dimensional samples. Here, the oracle property implies that an ideal direction vector of discrimination can be exactly recovered through concave penalized least squares estimation. Numerical studies confirm that the theoretical results hold with finite samples.
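
For concreteness, the smoothly clipped absolute deviation (SCAD) penalty mentioned in the abstract admits a closed-form univariate thresholding rule (Fan and Li's operator), the building block of coordinate-wise concave penalized least squares. This sketch is independent of the paper's own estimator and proofs.

```python
import numpy as np

def scad_threshold(z, lam, a=3.7):
    """Univariate SCAD thresholding: the minimizer of
    0.5*(b - z)**2 + p_lam(|b|) for the SCAD penalty p_lam (requires a > 2)."""
    az = abs(z)
    if az <= 2.0 * lam:
        return np.sign(z) * max(az - lam, 0.0)            # soft-thresholding zone
    if az <= a * lam:
        return ((a - 1.0) * z - np.sign(z) * a * lam) / (a - 2.0)  # transition zone
    return float(z)                                        # large |z|: no shrinkage
```

Unlike the lasso's soft threshold, large coefficients pass through unshrunk, which is what drives the oracle (exact recovery) behavior the paper establishes for the whole concave class.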

The Enhanced Power Analysis Using Linear Discriminant Analysis (선형판별분석을 이용한 전력분석 기법의 성능 향상)

  • Kang, Ji-Su;Kim, HeeSeok;Hong, Seokhie
    • Journal of the Korea Institute of Information Security & Cryptology / v.24 no.6 / pp.1055-1063 / 2014
  • Recently, various methods have been proposed for improving the performance of side channel analysis based on power consumption. Among them, waveform compression is applied in the pre-processing step to reduce the noise component. In this paper, we propose a new LDA (linear discriminant analysis)-based signal compression method that finds a unique feature vector. Through experiments, we compare the proposed method with the PCA (principal component analysis)-based method, which has been known as the best-performing among existing signal compression methods.

Photon-counting linear discriminant analysis for face recognition at a distance

  • Yeom, Seok-Won
    • International Journal of Fuzzy Logic and Intelligent Systems / v.12 no.3 / pp.250-255 / 2012
  • Face recognition has wide applications in security and surveillance systems as well as in robot vision and machine interfaces. Conventional challenges in face recognition include pose, illumination, and expression, and face recognition at a distance involves additional challenges because long-distance images are often degraded by poor focusing and motion blurring. This study investigates the effectiveness of applying photon-counting linear discriminant analysis (Pc-LDA) to face recognition in harsh environments. A related technique, Fisher linear discriminant analysis, has been found to be optimal, but it often suffers from the singularity problem because the number of available training images is generally much smaller than the number of pixels. Pc-LDA, on the other hand, realizes the Fisher criterion in high-dimensional space without any dimensionality reduction, and therefore provides solutions more invariant to image distortion and degradation. Two decision rules are employed: one based on Euclidean distance, the other on normalized correlation. In the experiments, the asymptotic equivalence of the photon-counting method to the Fisher method is verified with simulated data. Degraded facial images are employed to demonstrate the robustness of the photon-counting classifier in harsh environments. Four types of blurring point spread functions are applied to the test images in order to simulate long-distance acquisition. The results are compared with those of the conventional eigenface and Fisherface methods, and indicate that Pc-LDA outperforms these conventional facial recognition techniques.