• Title/Summary/Keyword: Linear discriminant function analysis

Search results: 43

Palatability Grading Analysis of Hanwoo Beef using Sensory Properties and Discriminant Analysis (관능특성 및 판별함수를 이용한 한우고기 맛 등급 분석)

  • Cho, Soo-Hyun;Seo, Gu-Reo-Un-Dal-Nim;Kim, Dong-Hun;Kim, Jae-Hee
    • Food Science of Animal Resources
    • /
    • v.29 no.1
    • /
    • pp.132-139
    • /
    • 2009
  • The objective of this study was to identify the most effective analysis method for palatability grading of Hanwoo beef by comparing the results of discriminant analysis with sensory data. The sensory data were obtained from sensory tests in which 1,300 consumers evaluated the tenderness, juiciness, flavor-likeness and overall acceptability of Hanwoo beef samples prepared by boiling, roasting and grilling. For the discriminant analysis with one factor, overall acceptability, linear discriminant functions and a non-parametric discriminant function with a Gaussian kernel were estimated. The linear discriminant functions were simple and easy to understand, while the non-parametric discriminant functions were not explicit and posed the problem of selecting a kernel function and bandwidth. With the three palatability factors of tenderness, juiciness and flavor-likeness, canonical discriminant analysis was used, and classification ability was assessed with the correct classification rate and the error rate. Canonical discriminant analysis requires no specific distributional assumptions and uses only principal components and canonical correlations. It also incorporated all three factors (tenderness, juiciness and flavor-likeness), and its correct classification rate was similar to those of the other discriminant methods. Therefore, canonical discriminant analysis was the most appropriate method for analyzing the palatability grading of Hanwoo beef.
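The one-factor linear discriminant function described above can be sketched as a two-class Fisher discriminant. The sensory scores and class means below are invented for illustration and are not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical sensory scores (tenderness, juiciness, flavor-likeness)
# for two palatability grades -- made-up numbers, not the paper's data.
grade_a = rng.normal([7.5, 7.0, 7.2], 0.4, size=(30, 3))
grade_b = rng.normal([5.0, 4.8, 5.1], 0.4, size=(30, 3))

m_a, m_b = grade_a.mean(axis=0), grade_b.mean(axis=0)
# Pooled within-class scatter matrix
Sw = np.cov(grade_a.T) * (len(grade_a) - 1) + np.cov(grade_b.T) * (len(grade_b) - 1)
w = np.linalg.solve(Sw, m_a - m_b)      # Fisher discriminant direction
c = w @ (m_a + m_b) / 2                 # midpoint decision threshold

def classify(x):
    """Assign grade A if the projected score exceeds the threshold."""
    return "A" if x @ w > c else "B"

acc = np.mean([classify(x) == "A" for x in grade_a] +
              [classify(x) == "B" for x in grade_b])
```

Projecting onto w = Sw⁻¹(m_A − m_B) and thresholding at the midpoint of the projected class means is the standard linear discriminant rule the abstract refers to.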

A Study on the Optimal Discriminant Model Predicting the likelihood of Insolvency for Technology Financing (기술금융을 위한 부실 가능성 예측 최적 판별모형에 대한 연구)

  • Sung, Oong-Hyun
    • Journal of Korea Technology Innovation Society
    • /
    • v.10 no.2
    • /
    • pp.183-205
    • /
    • 2007
  • An investigation was undertaken of the optimal discriminant model for predicting in advance the likelihood of insolvency of medium-sized firms, based on technology evaluation. The explanatory variables included in the discriminant model were selected by both factor analysis and stepwise discriminant analysis. Five explanatory variables were selected in factor analysis in terms of explanatory ratio and communality, and six in stepwise discriminant analysis. The effectiveness of the linear discriminant model and the logistic discriminant model was assessed by the criteria of critical probability and correct classification rate. Results showed that both models had similar correct classification rates, and the linear discriminant model was preferred to the logistic discriminant model in terms of the critical probability criterion. For the linear discriminant model with a critical probability of 0.5, the total-group correct classification rate was 70.4%, and the correct classification rates of the insolvent and solvent groups were 73.4% and 69.5%, respectively. The correct classification rate is an estimate of the probability that the estimated discriminant function will correctly classify the present sample, whereas the actual correct classification rate is an estimate of the probability that it will correctly classify a future observation. Unfortunately, the correct classification rate overestimates the actual rate, because the data set used to estimate the discriminant function is also used to evaluate it. Cross-validation was therefore used to estimate the bias of the correct classification rate: the estimated bias was 2.9%, and the predicted actual correct classification rate was 67.5%. A threshold value was also set to establish an in-doubt category.
The results of the linear discriminant model can be applied by technology-financing banks to evaluate the possibility of insolvency and to rank applicant firms.
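The bias the abstract describes, that a discriminant rule scored on its own training sample looks better than it will on future firms, can be illustrated with leave-one-out cross-validation. The toy data and the nearest-mean classifier below are stand-ins, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical 2-D financial indicators: solvent firms (0) vs insolvent (1).
X = np.vstack([rng.normal(0.0, 1.0, (40, 2)), rng.normal(2.0, 1.0, (40, 2))])
y = np.array([0] * 40 + [1] * 40)

def fit_predict(X_train, y_train, X_test):
    """Nearest-class-mean rule, a stand-in for the paper's discriminant model."""
    m0 = X_train[y_train == 0].mean(axis=0)
    m1 = X_train[y_train == 1].mean(axis=0)
    d0 = np.linalg.norm(X_test - m0, axis=1)
    d1 = np.linalg.norm(X_test - m1, axis=1)
    return (d1 < d0).astype(int)

# Resubstitution rate: the rule is scored on its own training sample (optimistic).
resub = np.mean(fit_predict(X, y, X) == y)

# Leave-one-out cross-validation: each firm is held out and predicted in turn.
hits = []
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    hits.append(fit_predict(X[mask], y[mask], X[i:i + 1])[0] == y[i])
loo = np.mean(hits)

bias = resub - loo   # the abstract's point: resubstitution tends to overstate accuracy
```

The difference between the resubstitution rate and the cross-validated rate estimates the bias, which is exactly the 2.9% correction the abstract reports for its own model.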

Face Recognition by Combining Linear Discriminant Analysis and Radial Basis Function Network Classifiers (선형판별법과 레이디얼 기저함수 신경망 결합에 의한 얼굴인식)

  • Oh Byung-Joo
    • The Journal of the Korea Contents Association
    • /
    • v.5 no.6
    • /
    • pp.41-48
    • /
    • 2005
  • This paper presents a face recognition method based on combining the well-known statistical representations of Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) with Radial Basis Function Networks. The original face image is first processed by PCA to reduce its dimension and thereby avoid the singularity of the within-class scatter matrix in the LDA calculation. In the first approach, the result of the PCA step is applied to an LDA classifier. In the second approach, the LDA step produces discriminative features of the face image, which are taken as the input of a Radial Basis Function Network (RBFN). The proposed approaches have been tested on the ORL face database, and a recognition rate of more than 93.5% has been achieved.
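The PCA-then-LDA pipeline can be sketched in a few lines of numpy. The 100-dimensional "face" vectors here are synthetic; with fewer samples than dimensions, the within-class scatter matrix would be singular in the raw space, which is exactly why PCA is applied first:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic "face" vectors: 20 samples per class in 100 dimensions, so the
# raw within-class scatter (100 x 100 from 40 samples) would be singular.
d, n = 100, 20
base0, base1 = rng.normal(size=d), rng.normal(size=d)
X = np.vstack([base0 + 0.3 * rng.normal(size=(n, d)),
               base1 + 0.3 * rng.normal(size=(n, d))])
y = np.array([0] * n + [1] * n)

# Step 1: PCA -- project onto the top principal components so that the
# within-class scatter becomes invertible in the reduced space.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:10].T                      # 10-dimensional PCA features

# Step 2: Fisher LDA in the PCA subspace.
m0, m1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
Sw = np.cov(Z[y == 0].T) * (n - 1) + np.cov(Z[y == 1].T) * (n - 1)
w = np.linalg.solve(Sw, m1 - m0)
pred = (Z @ w > w @ (m0 + m1) / 2).astype(int)
acc = np.mean(pred == y)
```

The paper's second approach would replace the final linear threshold with an RBF network trained on the LDA features; the dimensionality-reduction steps are the same.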

Principal Discriminant Variate (PDV) Method for Classification of Multicollinear Data: Application to Diagnosis of Mastitic Cows Using Near-Infrared Spectra of Plasma Samples

  • Jiang, Jian-Hui;Tsenkova, Roumiana;Yu, Ru-Qin;Ozaki, Yukihiro
    • Proceedings of the Korean Society of Near Infrared Spectroscopy Conference
    • /
    • 2001.06a
    • /
    • pp.1244-1244
    • /
    • 2001
  • In linear discriminant analysis there are two important properties concerning the effectiveness of discriminant function modeling. The first is the separability of the discriminant function for different classes. The separability reaches its optimum by maximizing the ratio of between-class to within-class variance. The second is the stability of the discriminant function against noise present in the measurement variables. One can optimize the stability by exploring the discriminant variates in a principal variation subspace, i.e., the directions that account for a majority of the total variation of the data. An unstable discriminant function will exhibit inflated variance in the prediction of future unclassified objects, and is thus exposed to a significantly increased risk of erroneous prediction. Therefore, an ideal discriminant function should not only separate different classes with a minimum misclassification rate for the training set, but also possess good stability such that the prediction variance for unclassified objects can be as small as possible. In other words, an optimal classifier should find a balance between separability and stability. This is of special significance for multivariate spectroscopy-based classification, where multicollinearity always leads to discriminant directions located in low-spread subspaces. A new regularized discriminant analysis technique, the principal discriminant variate (PDV) method, has been developed for effectively handling the multicollinear data commonly encountered in multivariate spectroscopy-based classification. The motivation behind this method is to seek a sequence of discriminant directions that not only optimize the separability between different classes, but also account for a maximized variation present in the data. Three different formulations of the PDV method are suggested, and an effective computing procedure is proposed.
Near-infrared (NIR) spectra of blood plasma samples from mastitic and healthy cows have been used to evaluate the behavior of the PDV method in comparison with principal component analysis (PCA), discriminant partial least squares (DPLS), soft independent modeling of class analogies (SIMCA) and Fisher linear discriminant analysis (FLDA). The results demonstrate that the PDV method exhibits improved stability in prediction without significant loss of separability. The NIR spectra of blood plasma samples from mastitic and healthy cows are clearly discriminated by the PDV method. Moreover, the proposed method provides superior performance to PCA, DPLS, SIMCA and FLDA, indicating that PDV is a promising tool in discriminant analysis of spectra-characterized samples with only small compositional differences, thereby providing a useful means for spectroscopy-based clinical applications.
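The stability problem the abstract describes can be seen in a small regularized-discriminant sketch: with multicollinear "spectra" (60 variables driven by only 3 latent factors, fewer samples than variables), the pooled within-class scatter is singular, and a ridge term is one simple regularizer, not the paper's PDV method:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical multicollinear "spectra": 60 wavelengths driven by only
# 3 latent factors, 30 samples per class (stand-ins for healthy/mastitic).
n, p, k = 30, 60, 3
loadings = rng.normal(size=(k, p))
X0 = rng.normal(0.0, 1.0, size=(n, k)) @ loadings + 0.05 * rng.normal(size=(n, p))
X1 = rng.normal(2.0, 1.0, size=(n, k)) @ loadings + 0.05 * rng.normal(size=(n, p))
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
Sw = np.cov(X0.T) * (n - 1) + np.cov(X1.T) * (n - 1)
# Sw has rank at most 2*(n - 1) = 58 < 60, so plain Fisher LDA would be
# ill-posed here; a ridge term keeps the discriminant direction stable
# (PDV instead constrains the direction to a principal variation subspace).
lam = 1.0
w = np.linalg.solve(Sw + lam * np.eye(p), m1 - m0)
pred = ((X - (m0 + m1) / 2) @ w > 0).astype(int)
acc = np.mean(pred == y)
```

Regularization trades a little training-set separability for much lower prediction variance, which is the separability/stability balance the abstract argues for.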

PRINCIPAL DISCRIMINANT VARIATE (PDV) METHOD FOR CLASSIFICATION OF MULTICOLLINEAR DATA WITH APPLICATION TO NEAR-INFRARED SPECTRA OF COW PLASMA SAMPLES

  • Jiang, Jian-Hui;Yuqing Wu;Yu, Ru-Qin;Yukihiro Ozaki
    • Proceedings of the Korean Society of Near Infrared Spectroscopy Conference
    • /
    • 2001.06a
    • /
    • pp.1042-1042
    • /
    • 2001
  • In linear discriminant analysis there are two important properties concerning the effectiveness of discriminant function modeling. The first is the separability of the discriminant function for different classes. The separability reaches its optimum by maximizing the ratio of between-class to within-class variance. The second is the stability of the discriminant function against noise present in the measurement variables. One can optimize the stability by exploring the discriminant variates in a principal variation subspace, i.e., the directions that account for a majority of the total variation of the data. An unstable discriminant function will exhibit inflated variance in the prediction of future unclassified objects, and is thus exposed to a significantly increased risk of erroneous prediction. Therefore, an ideal discriminant function should not only separate different classes with a minimum misclassification rate for the training set, but also possess good stability such that the prediction variance for unclassified objects can be as small as possible. In other words, an optimal classifier should find a balance between separability and stability. This is of special significance for multivariate spectroscopy-based classification, where multicollinearity always leads to discriminant directions located in low-spread subspaces. A new regularized discriminant analysis technique, the principal discriminant variate (PDV) method, has been developed for effectively handling the multicollinear data commonly encountered in multivariate spectroscopy-based classification. The motivation behind this method is to seek a sequence of discriminant directions that not only optimize the separability between different classes, but also account for a maximized variation present in the data. Three different formulations of the PDV method are suggested, and an effective computing procedure is proposed.
Near-infrared (NIR) spectra of blood plasma samples from daily monitoring of two Japanese cows have been used to evaluate the behavior of the PDV method in comparison with principal component analysis (PCA), discriminant partial least squares (DPLS), soft independent modeling of class analogies (SIMCA) and Fisher linear discriminant analysis (FLDA). The results demonstrate that the PDV method exhibits improved stability in prediction without significant loss of separability. The NIR spectra of blood plasma samples from the two cows are clearly discriminated by the PDV method. Moreover, the proposed method provides superior performance to PCA, DPLS, SIMCA and FLDA, indicating that PDV is a promising tool in discriminant analysis of spectra-characterized samples with only small compositional differences.

Linear Discriminant Analysis in Agricultural Experiment (농업실험에서 직선분리함수의 이용)

  • Young-Am Chae
    • KOREAN JOURNAL OF CROP SCIENCE
    • /
    • v.22 no.1
    • /
    • pp.80-82
    • /
    • 1977
  • Using head length and head width of two wheat monosomic lines, a linear discriminant function of these two variables was calculated, and it was illustrated how unknown individuals can be effectively classified into the correct group by means of this function. A brief suggestion on the utilization of this analysis in genetics and breeding programs was given.

Discriminant Analysis with Incomplete Pattern Vectors

  • Hie Choon Chung
    • Communications for Statistical Applications and Methods
    • /
    • v.4 no.1
    • /
    • pp.49-63
    • /
    • 1997
  • We consider the problem of classifying a p x 1 observation into one of two multivariate normal populations when the training samples contain a block of missing observations. A new classification procedure is proposed which is a linear combination of two discriminant functions, one based on the complete samples and the other on the incomplete samples. The new discriminant function is easy to use.

Study on Classification Function into Sasang Constitution Using Data Mining Techniques (데이터마이닝 기법을 이용한 사상체질 판별함수에 관한 연구)

  • Kim Kyu Kon;Kim Jong Won;Lee Eui Ju;Kim Jong Yeol;Choi Sun-Mi
    • Journal of Physiology & Pathology in Korean Medicine
    • /
    • v.18 no.6
    • /
    • pp.1938-1944
    • /
    • 2004
  • In this study, data mining techniques are applied to find a classification function that improves the accuracy of constitution diagnosis with the QSCC II (Questionnaire of Sasang Constitution Classification). The data used in the analysis are the questionnaires of 1,051 patients who had been treated at Dong Eui Oriental Medical Hospital and Kyung Hee Oriental Medical Hospital. The criteria for data cleansing are the response pattern on opposing questionnaire items and the positive proportion of specific items in each constitution, while the criteria for variable selection are the test of homogeneity in frequency analysis and the coefficients in the linear discriminant function. A discriminant analysis model and a decision tree model are applied to find the classification function for Sasang constitution. The accuracy on the learning sample is similar for the two models, while the higher accuracy on the test sample is obtained with the discriminant analysis model.

Sonar Target Classification using Generalized Discriminant Analysis (일반화된 판별분석 기법을 이용한 능동소나 표적 식별)

  • Kim, Dong-wook;Kim, Tae-hwan;Seok, Jong-won;Bae, Keun-sung
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.22 no.1
    • /
    • pp.125-130
    • /
    • 2018
  • Linear discriminant analysis is a statistical method that is generally used for dimensionality reduction of feature vectors or for class classification. In the case of a data set that cannot be linearly separated, however, a linear separation can be achieved by mapping the feature vectors into a higher-dimensional space using a nonlinear function. This method is called generalized discriminant analysis or kernel discriminant analysis. In this paper, we carried out target classification experiments on active sonar target signals available on the Internet using both linear discriminant and generalized discriminant analysis methods. Experimental results are analyzed and compared. For 104 test data, the LDA method showed a correct recognition rate of 73.08%, whereas the GDA method achieved 95.19%, which is also better than a conventional MLP or kernel-based SVM.
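The idea behind generalized (kernel) discriminant analysis, mapping the data nonlinearly before applying a linear discriminant, can be illustrated with an explicit quadratic feature map on ring-shaped toy data. A true kernel method works implicitly through a kernel matrix, but the effect is the same; all of the data here is invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(4)
# Toy ring data: an inner disc (class 0) inside an annulus (class 1).
# No straight line separates them, mirroring the abstract's setting.
n = 100
theta = rng.uniform(0.0, 2.0 * np.pi, 2 * n)
r = np.concatenate([rng.uniform(0.0, 1.0, n), rng.uniform(2.0, 3.0, n)])
X = np.c_[r * np.cos(theta), r * np.sin(theta)]
y = np.array([0] * n + [1] * n)

def quad_features(X):
    """Explicit degree-2 map, standing in for the implicit kernel mapping."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.c_[x1, x2, x1 ** 2, x2 ** 2, x1 * x2]

def fisher_predict(Phi, y):
    """Fisher LDA with midpoint threshold (tiny ridge for numerical safety)."""
    m0, m1 = Phi[y == 0].mean(axis=0), Phi[y == 1].mean(axis=0)
    Sw = (np.cov(Phi[y == 0].T) * (np.sum(y == 0) - 1)
          + np.cov(Phi[y == 1].T) * (np.sum(y == 1) - 1))
    w = np.linalg.solve(Sw + 1e-6 * np.eye(Phi.shape[1]), m1 - m0)
    return ((Phi - (m0 + m1) / 2) @ w > 0).astype(int)

acc_linear = np.mean(fisher_predict(X, y) == y)           # roughly chance level
acc_kernelized = np.mean(fisher_predict(quad_features(X), y) == y)
```

The gap between the two accuracies mirrors the paper's LDA-versus-GDA comparison: the linear rule fails on data that is only separable after a nonlinear mapping.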

Design of Optimized Radial Basis Function Neural Networks Classifier with the Aid of Principal Component Analysis and Linear Discriminant Analysis (주성분 분석법과 선형판별 분석법을 이용한 최적화된 방사형 기저 함수 신경회로망 분류기의 설계)

  • Kim, Wook-Dong;Oh, Sung-Kwun
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.22 no.6
    • /
    • pp.735-740
    • /
    • 2012
  • In this paper, we introduce design methodologies for a polynomial radial basis function neural network classifier with the aid of Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). Feature data that minimizes the information loss of the given data is obtained through PCA and LDA preprocessing and is then used as the input to the RBFNNs. The hidden layer of the RBFNNs is built by the Fuzzy C-Means (FCM) clustering algorithm instead of receptive fields, and linear polynomial functions are used as the connection weights between the hidden and output layers. To design an optimized classifier, structural and parametric values such as the number of eigenvectors of PCA and LDA and the fuzzification coefficient of the FCM algorithm are optimized by the Artificial Bee Colony (ABC) optimization algorithm. The proposed classifier is applied to several machine learning datasets, and its results are compared with those of other classifiers.