• Title/Summary/Keyword: Fisher linear discriminant analysis

Search Results: 38

Action Recognition with deep network features and dimension reduction

  • Li, Lijun; Dai, Shuling
    • KSII Transactions on Internet and Information Systems (TIIS), v.13 no.2, pp.832-854, 2019
  • Action recognition has been studied in the computer vision field for years. We present an effective approach to action recognition that uses dimension reduction as a crucial step to reduce the dimensionality of feature descriptors after feature extraction. We propose to modify Local Fisher Discriminant Analysis with a sparse matrix and a randomized kd-tree, yielding a modified Local Fisher Discriminant Analysis (mLFDA) that greatly reduces the required memory and accelerates the standard method. For feature encoding, we propose a mix encoding scheme that combines Fisher vector encoding and locality-constrained linear coding to obtain the final video representations. To add more meaningful features to the recognition process, a convolutional neural network is combined with mix encoding to produce deep network features. Experimental results show that, when all these components are combined, our algorithm is competitive on the KTH, HMDB51, and UCF101 datasets.
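The memory and speed gains of mLFDA described above come from sparsifying the neighbor-affinity computation that local Fisher discriminant analysis requires. Below is a minimal sketch of that step, assuming numpy and scipy are available; it uses an exact cKDTree rather than the randomized kd-tree the authors describe, and the kernel and scaling choices are illustrative only.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import lil_matrix

def sparse_knn_affinity(X, k=7):
    """Build a sparse k-NN affinity matrix, the costly step that local
    Fisher discriminant analysis needs for its locality weighting.
    X: (n_samples, n_features) descriptor matrix."""
    n = X.shape[0]
    tree = cKDTree(X)                    # exact kd-tree; the paper uses a randomized variant
    dists, idx = tree.query(X, k=k + 1)  # the first neighbor is the point itself
    W = lil_matrix((n, n))
    sigma = dists[:, -1] + 1e-12         # local scale: distance to the k-th neighbor
    for i in range(n):
        for d, j in zip(dists[i, 1:], idx[i, 1:]):
            W[i, j] = np.exp(-d**2 / (sigma[i] * sigma[j]))
    return W.tocsr()                     # symmetrization omitted in this sketch
```

Only the k nonzero affinities per row are stored, which is where the memory saving over the dense affinity matrix of standard LFDA comes from.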

Local Similarity based Discriminant Analysis for Face Recognition

  • Xiang, Xinguang; Liu, Fan; Bi, Ye; Wang, Yanfang; Tang, Jinhui
    • KSII Transactions on Internet and Information Systems (TIIS), v.9 no.11, pp.4502-4518, 2015
  • Fisher linear discriminant analysis (LDA) is one of the most popular projection techniques for feature extraction and has been widely applied in face recognition. However, it cannot be used under the single sample per person (SSPP) problem because the intra-class variations cannot be estimated. In this paper, we propose a novel method called local similarity based linear discriminant analysis (LS_LDA) to solve this problem. Motivated by the "divide and conquer" strategy, we first divide the face into local blocks, classify each local block, and then integrate all the classification results to make the final decision. To make LDA feasible for the SSPP problem, we further divide each block into overlapping patches and assume that these patches come from the same class. To improve the robustness of LS_LDA to outliers, we also propose local similarity based median discriminant analysis (LS_MDA), which uses the class median vector to estimate the class population mean in LDA modeling. Experimental results on three popular databases show that our methods not only generalize well to the SSPP problem but also show strong robustness to expression, illumination, occlusion, and time variation.
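As a rough illustration of the "divide and conquer" step described above, the sketch below splits a face image into non-overlapping blocks and samples overlapping patches inside each block, so that each block contributes multiple same-class samples even under SSPP. The block, patch, and stride sizes are arbitrary placeholders, not the paper's settings.

```python
import numpy as np

def block_patches(face, block=(32, 32), patch=(16, 16), stride=8):
    """Split a face image into non-overlapping blocks, then sample
    overlapping patches inside each block. Treating the patches of one
    block as samples of the same class is what makes LDA usable when
    only a single image per person is available (SSPP).
    Returns {block_index: (n_patches, patch_h*patch_w) array}."""
    H, W = face.shape
    out, b = {}, 0
    for r0 in range(0, H - block[0] + 1, block[0]):
        for c0 in range(0, W - block[1] + 1, block[1]):
            blk = face[r0:r0 + block[0], c0:c0 + block[1]]
            patches = [
                blk[r:r + patch[0], c:c + patch[1]].ravel()
                for r in range(0, block[0] - patch[0] + 1, stride)
                for c in range(0, block[1] - patch[1] + 1, stride)
            ]
            out[b] = np.asarray(patches, dtype=float)
            b += 1
    return out
```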

A Note on Linear SVM in Gaussian Classes

  • Jeon, Yongho
    • Communications for Statistical Applications and Methods, v.20 no.3, pp.225-233, 2013
  • The linear support vector machine (SVM) is motivated by the maximal margin separating hyperplane and is a popular tool for binary classification tasks. Many studies exist on the consistency properties of SVM; however, it is unknown whether the linear SVM is consistent for estimating the optimal classification boundary even in the simple case of two Gaussian classes with a common covariance, where the optimal classification boundary is linear. In this paper we show that the linear SVM can be inconsistent in the univariate Gaussian classification problem with a common variance, even when the best tuning parameter is used.
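To make the setting of this note concrete, the hedged simulation below draws two univariate Gaussian classes with a common variance, where the Bayes-optimal boundary under equal priors is the midpoint of the two means, and compares it with the boundary learned by a linear SVM. scikit-learn is assumed, and the fixed C is an arbitrary choice, not the best tuning parameter studied in the paper.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n = 2000
mu0, mu1, sigma = -1.0, 1.0, 1.5

# Two univariate Gaussian classes with common variance
X = np.concatenate([rng.normal(mu0, sigma, n), rng.normal(mu1, sigma, n)]).reshape(-1, 1)
y = np.concatenate([np.zeros(n), np.ones(n)])

# Bayes-optimal boundary for equal priors and common variance: midpoint of the means
bayes_boundary = 0.5 * (mu0 + mu1)

svm = LinearSVC(C=1.0).fit(X, y)
svm_boundary = -svm.intercept_[0] / svm.coef_[0, 0]  # point where the decision function is zero

print(f"Bayes boundary: {bayes_boundary:.3f}, linear SVM boundary: {svm_boundary:.3f}")
```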

An Improved method of Two Stage Linear Discriminant Analysis

  • Chen, Yarui; Tao, Xin; Xiong, Congcong; Yang, Jucheng
    • KSII Transactions on Internet and Information Systems (TIIS), v.12 no.3, pp.1243-1263, 2018
  • Two-stage linear discriminant analysis (TSLDA) is a feature extraction technique for the small sample size problem in image recognition. TSLDA retains all subspace information of the between-class and within-class scatter. However, the feature information in the four subspaces may not be entirely beneficial for classification, and the regularization procedure for eliminating singular matrices in TSLDA has high time complexity. To address these drawbacks, this paper proposes an improved two-stage linear discriminant analysis (Improved TSLDA). Improved TSLDA uses a selection and compression method to extract the most discriminative feature information from the four subspaces and form the optimal projection space, defining a single-vector Fisher criterion to measure the importance of each feature vector. Improved TSLDA also applies an approximation matrix method to eliminate the singular matrices and reduce the time complexity. Comparative experiments on five face databases and one handwritten digit database validate the effectiveness of Improved TSLDA.
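The single-vector Fisher criterion mentioned above can be read as the usual generalized Rayleigh quotient evaluated one candidate direction at a time. Below is a minimal numpy sketch of such a scoring step, under the assumption that candidate directions are supplied as columns of a matrix; the scatter estimation and scoring are illustrative, not the paper's exact procedure.

```python
import numpy as np

def scatter_matrices(X, y):
    """Between-class (Sb) and within-class (Sw) scatter of data X with labels y."""
    mean_all = X.mean(axis=0)
    Sb = np.zeros((X.shape[1], X.shape[1]))
    Sw = np.zeros_like(Sb)
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean_all, mc - mean_all)
        Sw += (Xc - mc).T @ (Xc - mc)
    return Sb, Sw

def fisher_scores(directions, Sb, Sw, eps=1e-12):
    """Score each candidate projection vector w (a column of `directions`)
    by w'Sb w / w'Sw w, the single-vector Fisher criterion used to rank
    the importance of directions drawn from the candidate subspaces."""
    return np.array([w @ Sb @ w / (w @ Sw @ w + eps) for w in directions.T])
```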

Multi-Modal Biometrics Recognition Method of Face Recognition using Fuzzy-EBGM and Iris Recognition using Fuzzy LDA (Fuzzy-EBGM을 이용한 얼굴인식과 Fuzzy-LDA를 이용한 홍채인식의 다중생체인식 기법 연구)

  • Go Hyoun-Joo; Kwon Mann-Jun; Chun Myung-Ceun
    • Proceedings of the Korean Institute of Intelligent Systems Conference, 2005.11a, pp.299-301, 2005
  • This study investigates a multi-modal biometrics recognition technique that combines iris and face recognition to authenticate and verify individuals, compensating for the shortcomings of single-modality biometric methods. Feature vectors for the iris were obtained from the CASIA (Chinese Academy of Sciences) iris database using Gabor wavelets and Fuzzy Linear Discriminant Analysis (FLDA), and feature vectors for the face were obtained from FERET (Face Recognition Technology) face images using the EBGM algorithm, which showed excellent performance in the FERET evaluation. Various normalization procedures were applied to the two resulting score values, and a Bayesian classifier, a support vector machine, and Fisher's linear discriminant were used as fusion algorithms to distinguish enrolled users from impostors. In addition, the performance of the multi-modal recognition was compared using weighted summation, one of the most widely used fusion methods.
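A hedged sketch of the score-level fusion step described in this abstract, combining two normalized matcher scores by weighted summation and by Fisher's linear discriminant. scikit-learn is assumed, and the min-max normalization and the fusion weight are illustrative choices rather than the paper's exact settings.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def minmax_norm(s):
    """Min-max score normalization, one common choice for the
    score-equalization step mentioned in the abstract."""
    return (s - s.min()) / (s.max() - s.min() + 1e-12)

def fuse_scores(face_scores, iris_scores, labels, w=0.5):
    """Fuse two matcher scores per sample.
    labels: 1 for genuine (enrolled user), 0 for impostor.
    Returns weighted-sum fused scores and Fisher-LDA fused scores."""
    f = minmax_norm(np.asarray(face_scores, dtype=float))
    i = minmax_norm(np.asarray(iris_scores, dtype=float))
    weighted = w * f + (1 - w) * i                        # weighted summation fusion
    S = np.column_stack([f, i])
    lda = LinearDiscriminantAnalysis().fit(S, labels)     # Fisher's linear discriminant fusion
    fused_lda = lda.decision_function(S)
    return weighted, fused_lda
```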


Principal Discriminant Variate (PDV) Method for Classification of Multicollinear Data: Application to Diagnosis of Mastitic Cows Using Near-Infrared Spectra of Plasma Samples

  • Jiang, Jian-Hui; Tsenkova, Roumiana; Yu, Ru-Qin; Ozaki, Yukihiro
    • Proceedings of the Korean Society of Near Infrared Spectroscopy Conference, 2001.06a, pp.1244-1244, 2001
  • In linear discriminant analysis there are two important properties concerning the effectiveness of discriminant function modeling. The first is the separability of the discriminant function for different classes; separability reaches its optimum by maximizing the ratio of between-class to within-class variance. The second is the stability of the discriminant function against noise present in the measurement variables; stability can be optimized by exploring the discriminant variates in a principal variation subspace, i.e., the directions that account for a majority of the total variation of the data. An unstable discriminant function will exhibit inflated variance in the prediction of future unclassified objects and is exposed to a significantly increased risk of erroneous prediction. Therefore, an ideal discriminant function should not only separate different classes with a minimum misclassification rate for the training set, but also possess good stability so that the prediction variance for unclassified objects can be as small as possible. In other words, an optimal classifier should find a balance between separability and stability. This is of special significance for multivariate spectroscopy-based classification, where multicollinearity always leads to discriminant directions located in low-spread subspaces. A new regularized discriminant analysis technique, the principal discriminant variate (PDV) method, has been developed for effectively handling the multicollinear data commonly encountered in multivariate spectroscopy-based classification. The motivation behind this method is to seek a sequence of discriminant directions that not only optimize the separability between different classes but also account for a maximized variation present in the data. Three different formulations of the PDV method are suggested, and an effective computing procedure is proposed. Near-infrared (NIR) spectra of blood plasma samples from mastitic and healthy cows have been used to evaluate the behavior of the PDV method in comparison with principal component analysis (PCA), discriminant partial least squares (DPLS), soft independent modeling of class analogies (SIMCA), and Fisher linear discriminant analysis (FLDA). The results demonstrate that the PDV method exhibits improved stability in prediction without significant loss of separability. The NIR spectra of blood plasma samples from mastitic and healthy cows are clearly discriminated by the PDV method. Moreover, the proposed method provides superior performance to PCA, DPLS, SIMCA, and FLDA, indicating that PDV is a promising tool in discriminant analysis of spectrum-characterized samples with only small compositional differences, thereby providing a useful means for spectroscopy-based clinical applications.
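For reference, the separability criterion discussed in this abstract is Fisher's ratio of between-class to within-class variance, while the stability requirement asks the direction to carry a large share of the total variation. With S_b, S_w, and S_t the between-class, within-class, and total scatter matrices, the two quantities can be written as below; the exact PDV objective that combines them is given in the paper, not here.

```latex
J_{\mathrm{Fisher}}(\mathbf{w}) \;=\; \frac{\mathbf{w}^{\top}\mathbf{S}_b\,\mathbf{w}}{\mathbf{w}^{\top}\mathbf{S}_w\,\mathbf{w}},
\qquad
\mathrm{var\,share}(\mathbf{w}) \;=\; \frac{\mathbf{w}^{\top}\mathbf{S}_t\,\mathbf{w}}{\operatorname{tr}(\mathbf{S}_t)}.
```

A stable discriminant variate in the PDV sense should score well on both quantities rather than on the Fisher ratio alone.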


PRINCIPAL DISCRIMINANT VARIATE (PDV) METHOD FOR CLASSIFICATION OF MULTICOLLINEAR DATA WITH APPLICATION TO NEAR-INFRARED SPECTRA OF COW PLASMA SAMPLES

  • Jiang, Jian-Hui; Yuqing Wu; Yu, Ru-Qin; Yukihiro Ozaki
    • Proceedings of the Korean Society of Near Infrared Spectroscopy Conference, 2001.06a, pp.1042-1042, 2001
  • In linear discriminant analysis there are two important properties concerning the effectiveness of discriminant function modeling. The first is the separability of the discriminant function for different classes; separability reaches its optimum by maximizing the ratio of between-class to within-class variance. The second is the stability of the discriminant function against noise present in the measurement variables; stability can be optimized by exploring the discriminant variates in a principal variation subspace, i.e., the directions that account for a majority of the total variation of the data. An unstable discriminant function will exhibit inflated variance in the prediction of future unclassified objects and is exposed to a significantly increased risk of erroneous prediction. Therefore, an ideal discriminant function should not only separate different classes with a minimum misclassification rate for the training set, but also possess good stability so that the prediction variance for unclassified objects can be as small as possible. In other words, an optimal classifier should find a balance between separability and stability. This is of special significance for multivariate spectroscopy-based classification, where multicollinearity always leads to discriminant directions located in low-spread subspaces. A new regularized discriminant analysis technique, the principal discriminant variate (PDV) method, has been developed for effectively handling the multicollinear data commonly encountered in multivariate spectroscopy-based classification. The motivation behind this method is to seek a sequence of discriminant directions that not only optimize the separability between different classes but also account for a maximized variation present in the data. Three different formulations of the PDV method are suggested, and an effective computing procedure is proposed. Near-infrared (NIR) spectra of blood plasma samples from daily monitoring of two Japanese cows have been used to evaluate the behavior of the PDV method in comparison with principal component analysis (PCA), discriminant partial least squares (DPLS), soft independent modeling of class analogies (SIMCA), and Fisher linear discriminant analysis (FLDA). The results demonstrate that the PDV method exhibits improved stability in prediction without significant loss of separability. The NIR spectra of blood plasma samples from the two cows are clearly discriminated by the PDV method. Moreover, the proposed method provides superior performance to PCA, DPLS, SIMCA, and FLDA, indicating that PDV is a promising tool in discriminant analysis of spectrum-characterized samples with only small compositional differences.
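Since this abstract restates the same separability/stability tradeoff, the small numpy sketch below shows how a candidate direction can be scored on both quantities at once; the way the two scores are combined here is purely illustrative and is not the PDV formulation.

```python
import numpy as np

def separability_and_stability(w, Sb, Sw, St):
    """Return (Fisher ratio, fraction of total variation) for direction w.
    Sb, Sw, St: between-class, within-class, and total scatter matrices."""
    fisher = (w @ Sb @ w) / (w @ Sw @ w + 1e-12)
    var_share = (w @ St @ w) / (np.trace(St) + 1e-12)
    return fisher, var_share

def pick_balanced_directions(candidates, Sb, Sw, St, alpha=0.5, n_keep=5):
    """Rank candidate directions (columns of `candidates`) by a weighted
    product of separability and variance share, keeping the top n_keep.
    This only mimics the PDV motivation of balancing the two criteria."""
    scores = []
    for w in candidates.T:
        fisher, share = separability_and_stability(w, Sb, Sw, St)
        scores.append(fisher ** alpha * share ** (1 - alpha))
    keep = np.argsort(scores)[::-1][:n_keep]
    return candidates[:, keep]
```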


Rapid discrimination of commercial strawberry cultivars using Fourier transform infrared spectroscopy data combined by multivariate analysis

  • Kim, Suk Weon; Min, Sung Ran; Kim, Jonghyun; Park, Sang Kyu; Kim, Tae Il; Liu, Jang R.
    • Plant Biotechnology Reports, v.3 no.1, pp.87-93, 2009
  • To determine whether pattern recognition based on metabolite fingerprinting of whole-cell extracts can be used to discriminate cultivars metabolically, leaves and fruits of five commercial strawberry cultivars were subjected to Fourier transform infrared (FT-IR) spectroscopy. FT-IR spectral data from leaves were analyzed by principal component analysis (PCA) and Fisher's linear discriminant function analysis. The dendrogram based on hierarchical clustering analysis of these spectral data separated the five commercial cultivars into two major groups reflecting their origin. The first group consisted of the Korean cultivars 'Maehyang', 'Seolhyang', and 'Gumhyang', whereas in the second group 'Ryukbo' clustered with 'Janghee', both Japanese cultivars. The results from the analysis of fruits were the same as those from leaves. We therefore conclude that the hierarchical dendrogram based on PCA of FT-IR data from leaves represents the most probable chemotaxonomic relationship between cultivars, enabling discrimination of cultivars in a rapid and simple manner.
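A minimal sketch of the pipeline described above (PCA of FT-IR spectra, Fisher's linear discriminant analysis of the scores, and a hierarchical-clustering dendrogram), assuming scikit-learn and scipy are available; the array names, component count, and linkage method are placeholders rather than the paper's settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from scipy.cluster.hierarchy import linkage, dendrogram

def ftir_pipeline(spectra, cultivar_labels, n_components=10):
    """spectra: (n_samples, n_wavenumbers) array of preprocessed FT-IR spectra.
    Returns a fitted Fisher LDA model on the PCA scores and a hierarchical
    clustering linkage matrix for drawing the dendrogram."""
    scores = PCA(n_components=n_components).fit_transform(spectra)
    lda = LinearDiscriminantAnalysis().fit(scores, cultivar_labels)
    tree = linkage(scores, method="average")  # linkage method is a sketch assumption
    return lda, tree

# dendrogram(tree, labels=cultivar_labels) would draw the cluster tree
# that groups cultivars by their spectral similarity.
```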

Real-time BCI for imagery movement and Classification for uncued EEG signal (상상 움직임에 대한 실시간 뇌전도 뇌 컴퓨터 상호작용, 큐 없는 상상 움직임에서의 뇌 신호 분류)

  • Kang, Sung-Wook; Jun, Sung-Chan
    • Proceedings of the HCI Society of Korea Conference, 2009.02a, pp.2083-2085, 2009
  • Brain Computer Interface (BCI) is a communication pathway between devices (computers) and the human brain. It processes brain signals on a real-time basis and extracts information about what the brain is doing. In this work, we develop an EEG BCI system using common spatial pattern (CSP) feature extraction and a classifier based on Fisher linear discriminant analysis (FLDA). Two-class EEG motor imagery datasets, both cued and uncued, are tested to verify its feasibility.
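A compact sketch of the CSP-plus-FLDA pipeline mentioned above for two-class motor imagery, assuming numpy and scipy; the covariance estimation, the number of filter pairs, and the final classifier call are illustrative assumptions, not the system's exact configuration.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(cov_a, cov_b, n_pairs=3):
    """Common spatial patterns: generalized eigenvectors of the two
    class-average covariance matrices, keeping the extreme filters."""
    vals, vecs = eigh(cov_a, cov_a + cov_b)    # solves cov_a w = lambda (cov_a + cov_b) w
    order = np.argsort(vals)
    sel = np.r_[order[:n_pairs], order[-n_pairs:]]
    return vecs[:, sel]                         # (n_channels, 2*n_pairs) spatial filters

def csp_features(trial, W):
    """Log-variance features of a spatially filtered trial (n_channels, n_samples)."""
    Z = W.T @ trial
    var = Z.var(axis=1)
    return np.log(var / var.sum())

# FLDA on the log-variance features, e.g. with
# sklearn.discriminant_analysis.LinearDiscriminantAnalysis, gives the
# two-class decision used for cued and uncued imagery trials.
```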


Types of Train Delay of High-Speed Rail : Indicators and Criteria for Classification (고속철도 열차지연 유형의 구분지표 및 기준)

  • Kim, Hansoo; Kang, Joonghyuk; Bae, Yeong-Gyu
    • Journal of the Korean Operations Research and Management Science Society, v.38 no.3, pp.37-50, 2013
  • The purpose of this study is to determine the indicators and criteria for classifying types of train delays on high-speed rail in South Korea. Types of train delays are divided into chronic delays and knock-on delays. Based on relevance, reliability, and comparability, the selected indicators were: the rate of arrival delays over five minutes, the median arrival delay of preceding and following trains, the rate of knock-on delays over five minutes, the correlation of delays between preceding and following trains at intermediate and last stations, average train headway, average number of passengers per train, and average seat usage. Types of train delays were separated using Ward's hierarchical cluster analysis, and the classification criteria were derived with Fisher's linear discriminant. The analysis of the situational characteristics of train delays is as follows: if the train headway at the last station is short, the probability of chronic delay is high; if the planned running time of a train is short, the seriousness of chronic delay is high. The main causes of train delays are short train headways, short planned running times, delays of the preceding train, and an excessive number of passengers per train.
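As a rough illustration of the methodology described above (Ward's hierarchical clustering of delay indicators followed by Fisher's linear discriminant as an explicit classification criterion), assuming scipy and scikit-learn are available; the indicator matrix and the two-cluster cut below are placeholders rather than the study's data or exact procedure.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def delay_type_rule(indicators, n_types=2):
    """indicators: (n_observations, n_indicators) matrix of the delay
    indicators listed above (delay rates, medians, headway, passengers, ...).
    Ward's hierarchical clustering labels the delay types; Fisher's linear
    discriminant then yields an explicit classification criterion."""
    Z = linkage(indicators, method="ward")
    types = fcluster(Z, t=n_types, criterion="maxclust")  # e.g. chronic vs knock-on clusters
    lda = LinearDiscriminantAnalysis().fit(indicators, types)
    # lda.coef_ and lda.intercept_ define the linear boundary that can be
    # read off as a classification criterion for new observations.
    return types, lda
```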