• Title/Summary/Keyword: dimension reduction method


Intensive comparison of semi-parametric and non-parametric dimension reduction methods in forward regression

  • Shin, Minju;Yoo, Jae Keun
    • Communications for Statistical Applications and Methods
    • /
    • v.29 no.5
    • /
    • pp.615-627
    • /
    • 2022
  • Principal Fitted Component (PFC) is a semi-parametric sufficient dimension reduction (SDR) method, originally proposed in Cook (2007). According to Cook (2007), PFC is connected to the usual non-parametric SDR methods, but the connection is limited to sliced inverse regression (Li, 1991) and ordinary least squares. Since the two approaches have not been directly compared across various forward regressions to date, practical guidance for choosing between them is needed by statistical practitioners. To fill this need, in this paper we newly derive a connection of PFC to covariance methods (Yin and Cook, 2002), one of the most popular SDR methods. Intensive numerical studies are also conducted to closely examine and compare the estimation performance of the semi- and non-parametric SDR methods in various forward regressions. The findings from the numerical studies are confirmed in a real data example.
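The non-parametric side of this comparison includes sliced inverse regression (SIR; Li, 1991). As a rough illustration of what SIR does, here is a minimal pure-Python sketch on hypothetical toy data (two independent standard-normal predictors, so marginal standardization stands in for the usual joint whitening): slice the response, average the standardized predictors within each slice, and take the leading eigenvector of the resulting kernel matrix as the estimated sufficient direction.

```python
import math, random

random.seed(0)
n, H = 500, 10
# toy forward regression: y depends on X only through the direction (1, 0)
X = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
y = [x1 + 0.3 * random.gauss(0, 1) for x1, _ in X]

# standardize each predictor (a full SIR would whiten X jointly; the
# predictors here are independent, so marginal standardization suffices)
def standardize(col):
    m = sum(col) / n
    s = math.sqrt(sum((c - m) ** 2 for c in col) / n)
    return [(c - m) / s for c in col]

Z = list(zip(*(standardize(list(col)) for col in zip(*X))))

# slice on y and average the standardized predictors within each slice
order = sorted(range(n), key=lambda i: y[i])
M = [[0.0, 0.0], [0.0, 0.0]]
for h in range(H):
    idx = order[h * n // H:(h + 1) * n // H]
    mh = [sum(Z[i][j] for i in idx) / len(idx) for j in range(2)]
    for a in range(2):
        for b in range(2):
            M[a][b] += (len(idx) / n) * mh[a] * mh[b]

# leading eigenvector of the 2x2 kernel matrix M via power iteration
v = [1.0, 1.0]
for _ in range(100):
    v = [M[0][0] * v[0] + M[0][1] * v[1],
         M[1][0] * v[0] + M[1][1] * v[1]]
    s = math.hypot(*v)
    v = [c / s for c in v]

print([round(c, 2) for c in v])  # close to (±1, 0)
```

The estimated direction recovers (up to sign) the single direction through which y depends on X.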

Dimension Reduction Method of Speech Feature Vector for Real-Time Adaptation of Voice Activity Detection (음성구간 검출기의 실시간 적응화를 위한 음성 특징벡터의 차원 축소 방법)

  • Park Jin-Young;Lee Kwang-Seok;Hur Kang-In
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.7 no.3
    • /
    • pp.116-121
    • /
    • 2006
  • In this paper, we propose a dimension reduction method for multi-dimensional speech feature vectors, for use in a real-time adaptation procedure in various noisy environments. The method reduces the dimension non-linearly by mapping the feature vector to the likelihoods of the speech and noise models. A likelihood ratio test (LRT) is used to classify speech versus non-speech. The detection results are comparable to those obtained with the full multi-dimensional speech feature vector, and speech recognition on the detected speech data is likewise comparable to using the multi-dimensional (10th-order MFCC, Mel-Frequency Cepstral Coefficient) speech feature vector.
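The likelihood mapping described above can be illustrated with a minimal sketch. The Gaussian class models and the three-coefficient frames below are hypothetical, and a diagonal covariance is assumed; the point is only how per-coefficient likelihoods collapse a multi-dimensional frame into a single LRT statistic.

```python
import math

def gauss_logpdf(x, mu, var):
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def log_likelihood_ratio(frame, speech_model, noise_model):
    """Collapse a multi-dimensional feature frame to one scalar: the
    summed per-coefficient log-likelihood ratio of speech vs. noise."""
    llr = 0.0
    for x, (ms, vs), (mn, vn) in zip(frame, speech_model, noise_model):
        llr += gauss_logpdf(x, ms, vs) - gauss_logpdf(x, mn, vn)
    return llr

def is_speech(frame, speech_model, noise_model, threshold=0.0):
    return log_likelihood_ratio(frame, speech_model, noise_model) > threshold

# hypothetical 3-coefficient (mean, variance) models per class
speech = [(5.0, 4.0), (3.0, 4.0), (1.0, 2.0)]
noise = [(0.0, 1.0), (0.0, 1.0), (0.0, 1.0)]
print(is_speech([4.5, 2.8, 0.9], speech, noise))   # speech-like frame: True
print(is_speech([0.2, -0.1, 0.05], speech, noise)) # noise-like frame: False
```

The threshold trades missed speech against false alarms and would be tuned per noise environment.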

Analysis of the Effect of Manufacturing Tolerance on Induction Motor Performance by Univariate Dimension Reduction Method (단변수 차원 감소법을 이용한 제작 공차가 유도전동기 성능에 미치는 영향력 분석)

  • Lee, Sang-Kyun;Kang, Byung-Su;Back, Jong Hyun;Kim, Dong-Hun
    • Journal of the Korean Magnetics Society
    • /
    • v.25 no.6
    • /
    • pp.203-207
    • /
    • 2015
  • This paper introduces a probabilistic analysis method to analyze the effect of manufacturing tolerance, which arises in mass production, on induction motor performance. The univariate dimension reduction method is adopted to predict the probabilistic characteristics of a performance function given the probability distributions of the design variables. Moreover, the sensitivity of the mean and variance of the performance function is estimated, and the effect of the randomness of individual design variables on the probabilistic performance function is analyzed. The effectiveness and accuracy of the method are investigated with a mathematical model and an induction motor.
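The univariate dimension reduction method approximates a multi-dimensional moment integral by a sum of one-dimensional ones: E[g(X)] is estimated as the sum of the one-dimensional expectations with the other variables fixed at their means, minus (N-1) times g evaluated at the mean vector. A minimal sketch, with a hypothetical polynomial standing in for a motor performance function and independent normal design variables (three-point Gauss-Hermite quadrature for each one-dimensional expectation):

```python
import math

# 3-point Gauss-Hermite rule for a standard normal variable
GH_NODES = [(-math.sqrt(3.0), 1 / 6), (0.0, 2 / 3), (math.sqrt(3.0), 1 / 6)]

def udr_mean(g, means, sds):
    """Univariate dimension reduction estimate of E[g(X)] for
    independent normal inputs."""
    n = len(means)
    total = -(n - 1) * g(means)
    for i in range(n):
        e1d = 0.0
        for node, weight in GH_NODES:
            x = list(means)
            x[i] = means[i] + sds[i] * node  # vary one variable at a time
            e1d += weight * g(x)
        total += e1d
    return total

# hypothetical performance function standing in for a motor response
g = lambda x: x[0] ** 2 + 2 * x[0] * x[1] + x[1]
mu = udr_mean(g, [1.0, 2.0], [0.1, 0.2])
print(round(mu, 4))  # 7.01 — exact for this low-order polynomial
```

Variance and sensitivities follow the same pattern, replacing g by its square or its partial differences.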

Voice Activity Detection in Noisy Environment based on Statistical Nonlinear Dimension Reduction Techniques (통계적 비선형 차원축소기법에 기반한 잡음 환경에서의 음성구간검출)

  • Han Hag-Yong;Lee Kwang-Seok;Go Si-Yong;Hur Kang-In
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.9 no.5
    • /
    • pp.986-994
    • /
    • 2005
  • This paper proposes a likelihood-based nonlinear dimension reduction method for speech feature parameters, in order to construct a voice activity detector that adapts to noisy environments. The proposed method uses the nonlinear values of the Gaussian probability density function with new parameters for the speech/non-speech classes. We applied a likelihood ratio test (LRT) to find speech segments and compared its performance with that of the linear discriminant analysis (LDA) technique. In experiments, we found that the proposed method gives results similar to those of Gaussian mixture models.
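For contrast with the likelihood-based approach, the LDA baseline mentioned above can be sketched in a few lines: the Fisher direction Sw⁻¹(m₁ − m₀) gives the one-dimensional projection that best separates the two classes. The 2-D Gaussian "speech" and "noise" features below are hypothetical.

```python
import math, random

random.seed(2)
# hypothetical 2-D features for noise (class 0) and speech (class 1) frames
noise = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(300)]
speech = [(random.gauss(2, 1), random.gauss(1, 1)) for _ in range(300)]

def mean(pts):
    return [sum(p[j] for p in pts) / len(pts) for j in range(2)]

def scatter(pts, m):
    S = [[0.0, 0.0], [0.0, 0.0]]
    for p in pts:
        d = [p[0] - m[0], p[1] - m[1]]
        for a in range(2):
            for b in range(2):
                S[a][b] += d[a] * d[b]
    return S

m0, m1 = mean(noise), mean(speech)
Sw = scatter(noise, m0)
S1 = scatter(speech, m1)
for a in range(2):
    for b in range(2):
        Sw[a][b] += S1[a][b]

# Fisher direction w = Sw^{-1}(m1 - m0), via the closed-form 2x2 inverse
det = Sw[0][0] * Sw[1][1] - Sw[0][1] * Sw[1][0]
dm = [m1[0] - m0[0], m1[1] - m0[1]]
w = [(Sw[1][1] * dm[0] - Sw[0][1] * dm[1]) / det,
     (-Sw[1][0] * dm[0] + Sw[0][0] * dm[1]) / det]

# classify by thresholding the 1-D projection at the class midpoint
thr = sum(w[j] * (m0[j] + m1[j]) / 2 for j in range(2))
project = lambda p: w[0] * p[0] + w[1] * p[1]
acc = (sum(project(p) <= thr for p in noise)
       + sum(project(p) > thr for p in speech)) / 600
print(round(acc, 3))
```

The scalar projection plays the same role as the likelihood statistic: a multi-dimensional frame collapses to one number that is thresholded.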

Comparison of Dimension Reduction Methods for Time Series Factor Analysis: A Case Study (Value at Risk의 사후검증을 통한 다변량 시계열자료의 차원축소 방법의 비교: 사례분석)

  • Lee, Dae-Su;Song, Seong-Joo
    • The Korean Journal of Applied Statistics
    • /
    • v.24 no.4
    • /
    • pp.597-607
    • /
    • 2011
  • Value at Risk (VaR) is widely used as a simple tool for measuring financial risk. Although VaR has a few weak points, it is used as a basic risk measure due to its simplicity and ease of understanding. However, it becomes very difficult to estimate the volatility of a portfolio (essential for computing its VaR) when the number of assets in the portfolio is large. In this case, a dimension reduction technique can be applied; however, ordinary factor analysis cannot be applied directly to financial data due to autocorrelation. In this paper, we suggest a dimension reduction method that uses time-series factor analysis and a DCC (Dynamic Conditional Correlation) GARCH model. We also compare the method using time-series factor analysis with the existing method using ordinary factor analysis by backtesting the VaR of real data from the Korean stock market.
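Once a factor structure is available, the portfolio volatility needed for VaR follows from the factor decomposition of the covariance matrix: Cov(returns) = BFB' + D. A minimal parametric sketch with hypothetical one-factor loadings and variances (the paper's DCC-GARCH dynamics are not modeled here, so this is the static skeleton only):

```python
import math

def portfolio_var(weights, loadings, factor_cov, idio_var, z=1.645):
    """Parametric VaR under a factor model:
    Cov(returns) = B F B' + diag(idio_var), VaR = z * portfolio st.dev.
    Returns are assumed zero-mean; inputs are plain nested lists."""
    n, k = len(weights), len(factor_cov)
    # portfolio factor exposure b = B' w
    b = [sum(weights[i] * loadings[i][j] for i in range(n)) for j in range(k)]
    # systematic variance b' F b
    sys_var = sum(b[p] * factor_cov[p][q] * b[q]
                  for p in range(k) for q in range(k))
    # idiosyncratic variance w' D w
    idio = sum(weights[i] ** 2 * idio_var[i] for i in range(n))
    return z * math.sqrt(sys_var + idio)

# hypothetical 3-asset, 1-factor example in daily-return units
w = [0.5, 0.3, 0.2]
B = [[1.2], [0.8], [1.0]]         # factor loadings
F = [[0.0001]]                    # factor variance
D = [0.00005, 0.00008, 0.00002]   # idiosyncratic variances
var95 = portfolio_var(w, B, F, D)
print(round(var95, 5))  # about 0.019, i.e. a roughly 1.9% one-day VaR
```

Dimension reduction enters through k: only k factor variances plus n idiosyncratic terms are estimated, instead of a full n-by-n covariance matrix.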

Automatic Generation of Analysis Model Using Multi-resolution Modeling Algorithm (다중해상도 알고리즘을 이용한 자동 해석모델 생성)

  • Kim M.C.;Lee K.W.;Kim S.C.
    • Korean Journal of Computational Design and Engineering
    • /
    • v.11 no.3
    • /
    • pp.172-182
    • /
    • 2006
  • This paper presents a method to convert a 3D CAD model into an appropriate analysis model using the wrap-around, smooth-out, and thinning operators that were originally developed to realize multi-resolution modeling. The wrap-around and smooth-out operators are used to simplify the 3D model, and the thinning operator reduces the dimension of a target object while decomposing the simplified 3D model into 1D or 2D shapes. By combining the simplification and dimension-reduction operations appropriately, the user can generate an analysis model that matches a specific application. The advantage of this method is that the user can create optimized analysis models at various simplification levels by selecting an appropriate number of detailed features and removing them.

Face recognition by PLS

  • Baek, Jang-Sun
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2003.10a
    • /
    • pp.69-72
    • /
    • 2003
  • The paper considers partial least squares (PLS) as a new dimension reduction technique for the feature vector, to overcome the small sample size problem in face recognition. Principal component analysis (PCA), a conventional dimension reduction method, selects the components with maximum variability, irrespective of the class information, so PCA does not necessarily extract features that are important for the discrimination of classes. PLS, on the other hand, constructs the components so that their correlation with the class variable is maximized; therefore PLS components are more predictive than PCA components in classification. Experimental results on the Manchester and ORL databases show that PLS is to be preferred over PCA when classification is the goal and dimension reduction is needed.
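The contrast drawn above can be made concrete: for a single response, the first PLS weight vector is proportional to Cov(X, y), while the first PCA direction is the leading eigenvector of Cov(X). A toy sketch with hypothetical data in which the class signal sits in the low-variance coordinate:

```python
import math, random

random.seed(1)
# toy data: x1 has large class-independent variance, x2 carries the class signal
labels = [i % 2 for i in range(200)]
X = [(random.gauss(0, 5), random.gauss(2 * c - 1, 0.5)) for c in labels]

n = len(X)
mx = [sum(col) / n for col in zip(*X)]
my = sum(labels) / n
Xc = [(p[0] - mx[0], p[1] - mx[1]) for p in X]
yc = [c - my for c in labels]

# first PCA direction: leading eigenvector of the 2x2 covariance matrix
C = [[sum(r[a] * r[b] for r in Xc) / n for b in range(2)] for a in range(2)]
v = [1.0, 1.0]
for _ in range(200):  # power iteration
    v = [C[0][0] * v[0] + C[0][1] * v[1], C[1][0] * v[0] + C[1][1] * v[1]]
    s = math.hypot(*v)
    v = [u / s for u in v]

# first PLS weight: the direction of Cov(X, y), i.e. Xc' yc normalized
w = [sum(r[a] * yc[i] for i, r in enumerate(Xc)) for a in range(2)]
s = math.hypot(*w)
w = [u / s for u in w]

print("PCA direction:", [round(u, 2) for u in v])  # dominated by x1
print("PLS direction:", [round(u, 2) for u in w])  # dominated by x2
```

PCA chases the high-variance but uninformative x1 axis, while the PLS weight points along x2, where the classes actually separate.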

Comparison of Methods for Reducing the Dimension of Compositional Data with Zero Values

  • Song, Taeg-Youn;Choi, Byung-Jin
    • Communications for Statistical Applications and Methods
    • /
    • v.19 no.4
    • /
    • pp.559-569
    • /
    • 2012
  • Compositional data consist of compositions that are non-negative vectors of proportions with the unit-sum constraint. In disciplines such as petrology and archaeometry, it is fundamental to statistically analyze this type of data. Aitchison (1983) introduced a log-contrast principal component analysis that involves logratio transformed data, as a dimension-reduction technique to understand and interpret the structure of compositional data. However, the analysis is not usable when zero values are present in the data. In this paper, we introduce 4 possible methods to reduce the dimension of compositional data with zero values. Two real data sets are analyzed using the methods and the obtained results are compared.
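One common zero-handling strategy (not necessarily among the four compared in the paper) is multiplicative replacement of the zero parts followed by a log-ratio transform. A minimal sketch of that combination, using the centered log-ratio (clr):

```python
import math

def multiplicative_replacement(comp, delta=1e-5):
    """Replace zero parts with a small delta and shrink the nonzero
    parts so the composition still sums to one."""
    z = sum(1 for p in comp if p == 0)
    return [delta if p == 0 else p * (1 - z * delta) for p in comp]

def clr(comp):
    """Centered log-ratio transform: log(p_i) minus the mean log part."""
    logs = [math.log(p) for p in comp]
    g = sum(logs) / len(logs)
    return [l - g for l in logs]

x = multiplicative_replacement([0.6, 0.3, 0.1, 0.0])
y = clr(x)
print([round(p, 6) for p in x])  # zero replaced, still sums to 1
print([round(c, 3) for c in y])  # clr coordinates sum to zero
```

After the transform, standard multivariate tools such as principal component analysis can be applied to the clr coordinates.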

More on directional regression

  • Kim, Kyongwon;Yoo, Jae Keun
    • Communications for Statistical Applications and Methods
    • /
    • v.28 no.5
    • /
    • pp.553-562
    • /
    • 2021
  • Directional regression (DR; Li and Wang, 2007) is well known as an exhaustive sufficient dimension reduction method and performs well in complex regression models with both linear and nonlinear trends. However, DR has not been extended much to date, so we extend it to accommodate multivariate regression and large p, small n regression. We propose three versions of DR for multivariate regression and discuss how DR is applicable in the large p, small n case. Numerical studies confirm that DR is robust to the number of clusters and to the choice between hierarchical-clustering and pooled DR.

DIMENSION REDUCTION FOR APPROXIMATION OF ADVANCED RETRIAL QUEUES: TUTORIAL AND REVIEW

  • SHIN, YANG WOO
    • Journal of applied mathematics & informatics
    • /
    • v.35 no.5_6
    • /
    • pp.623-649
    • /
    • 2017
  • Retrial queues have been widely used to model the many practical situations arising in telephone systems, telecommunication networks, and call centers. An approximation method for a simple Markovian retrial queue, which reduces the two-dimensional problem to a one-dimensional one, was presented by Fredericks and Reisner in 1979. The method seems to be a promising approach to approximating retrial queues with complex structure, but it did not attract much attention for about thirty years. In this paper, we expound the method in detail and show its usefulness by presenting recent results on approximating retrial queues with complex structure, such as multi-server retrial queues with a phase-type distribution of retrial times, impatient customers with a general persistence function, and/or multiclass customers.