• Title/Summary/Keyword: Locally Linear Embedding

Training of Locally Linear Embedding using Multilayer Perceptrons (LLE(Locally Linear Embedding)의 함수관계에 대한 다층퍼셉트론 학습)

  • Oh, Sang-Hoon
    • Proceedings of the Korea Contents Association Conference
    • /
    • 2007.11a
    • /
    • pp.217-220
    • /
    • 2007
  • LLE (Locally Linear Embedding) has been proposed to compute low-dimensional, neighborhood-preserving embeddings of high-dimensional data. However, the whole LLE procedure must be re-run whenever previously unseen patterns are presented. In this paper, we propose training MLPs (Multilayer Perceptrons) to learn the LLE mapping from high-dimensional data to their low-dimensional embeddings, so that new patterns can be mapped directly; a minimal sketch of this idea follows this entry.

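The out-of-sample issue described above can be illustrated with a short experiment: fit LLE on training patterns only, then train an MLP regressor to reproduce the resulting high-to-low-dimensional mapping so that new patterns are embedded by a single forward pass. This is only a rough sketch of the idea using scikit-learn, not the authors' implementation; the swiss-roll data and all hyperparameters are arbitrary illustrative choices.

```python
# Sketch: approximate the LLE mapping with an MLP so that new (untrained)
# patterns can be embedded without re-running the whole LLE procedure.
# Not the authors' code; data and hyperparameters are illustrative only.
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

X, _ = make_swiss_roll(n_samples=2000, noise=0.05, random_state=0)
X_train, X_new = train_test_split(X, test_size=0.25, random_state=0)

# 1) Compute the LLE embedding on the training patterns only.
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2, random_state=0)
Y_train = lle.fit_transform(X_train)

# 2) Train an MLP to mimic the mapping X_train -> Y_train.
mlp = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
mlp.fit(X_train, Y_train)

# 3) Untrained patterns are now embedded by a single forward pass.
Y_new = mlp.predict(X_new)
print("MLP fit on training embedding (R^2):", mlp.score(X_train, Y_train))
print("Embedded new patterns:", Y_new.shape)
```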

Face Image Synthesis using Nonlinear Manifold Learning (비선형 매니폴드 학습을 이용한 얼굴 이미지 합성)

  • Cho, Eun-Ok;Kim, Dae-Jin;Bang, Sung-Yang
    • Journal of KIISE:Software and Applications
    • /
    • v.31 no.2
    • /
    • pp.182-188
    • /
    • 2004
  • This paper proposes to synthesize facial images from a few parameters describing the pose and expression of their constituent components. This parameterization makes the representation, storage, and transmission of face images efficient. However, parameterizing facial images is difficult because their variations form a complicated nonlinear manifold in a high-dimensional data space. To tackle this problem, we use the LLE (Locally Linear Embedding) technique to obtain a good representation of face images, in which the relationships among face images are preserved well and the manifold projected into the reduced feature space becomes smoother and more continuous. Next, we apply a snake model to estimate the feature value in the reduced feature space that corresponds to a specific pose and/or expression parameter. Finally, a synthetic face image is obtained by interpolating several neighboring face images in the vicinity of the estimated feature value; a rough sketch of this interpolation step follows this entry. Experimental results show that the proposed method exhibits a negligible overlapping effect and creates accurate and consistent synthetic face images with respect to changes in pose and/or expression parameters.
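
The last step of the pipeline above, interpolating neighboring samples around an estimated point in the reduced feature space, can be sketched roughly as follows. This is a simplified stand-in for the paper's method: the snake model is omitted, handwritten-digit vectors stand in for face images, and the target feature value is chosen arbitrarily.

```python
# Rough sketch of the interpolation step only: embed samples with LLE, pick a
# target point in the reduced space, and synthesize a new sample as a
# distance-weighted average of its nearest neighbors' original vectors.
# Digit images stand in for face images; the snake-model step is omitted.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.neighbors import NearestNeighbors

X = load_digits().data                     # (n_samples, 64) vectorized images
Y = LocallyLinearEmbedding(n_neighbors=10, n_components=2).fit_transform(X)

target = Y.mean(axis=0)                    # hypothetical estimated feature value
nn = NearestNeighbors(n_neighbors=5).fit(Y)
dist, idx = nn.kneighbors(target.reshape(1, -1))

w = 1.0 / (dist[0] + 1e-8)                 # inverse-distance weights
w /= w.sum()
synthetic = (w[:, None] * X[idx[0]]).sum(axis=0)   # interpolated "image"
print(synthetic.shape)
```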

An Analysis of 3-D Object Characteristics Using Locally Linear Embedding (시점별 형상의 지역적 선형 사상을 통한 3차원 물체의 특성 분석)

  • Lee, Soo-Chahn;Yun, Il-Dong
    • Journal of Broadcast Engineering
    • /
    • v.14 no.1
    • /
    • pp.81-84
    • /
    • 2009
  • This paper explores the possibility of describing objects from the change in shape according to the change in viewpoint. Specifically, we sample the shapes of a 3-D model from various viewpoints and apply dimension reduction by locally linear embedding. A low-dimensional distribution of points is constructed, and characteristics of the object are described from this distribution. We also propose two 3-D retrieval methods, one that applies the iterative closest point algorithm and one that applies the Fourier transform and measures similarity with the modified Hausdorff distance, and present experimental results; a sketch of the modified Hausdorff distance follows this entry. The results show that the change in shape according to the change in viewpoint can describe the characteristics of an object.
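
Of the two similarity measures mentioned, the modified Hausdorff distance between two low-dimensional point distributions is simple to sketch. The version below is the common Dubuisson-Jain form (mean of minimum distances, symmetrized with a maximum); the paper's exact variant, the ICP alignment, and the Fourier-transform step are not reproduced, and the point sets are random toy data.

```python
# Sketch of a modified Hausdorff distance (Dubuisson-Jain form) between two
# 2-D point distributions, e.g. LLE embeddings of two view-sampled objects.
# The ICP alignment and Fourier-transform steps of the paper are omitted.
import numpy as np
from scipy.spatial.distance import cdist

def modified_hausdorff(A, B):
    """Mean of minimum pairwise distances, symmetrized by taking the maximum."""
    D = cdist(A, B)                 # pairwise Euclidean distances
    d_ab = D.min(axis=1).mean()     # A -> B
    d_ba = D.min(axis=0).mean()     # B -> A
    return max(d_ab, d_ba)

rng = np.random.default_rng(0)
A = rng.normal(size=(100, 2))           # toy embedding of object 1
B = rng.normal(loc=0.5, size=(120, 2))  # toy embedding of object 2
print(modified_hausdorff(A, B))
```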

Locally Linear Embedding for Face Recognition with Simultaneous Diagonalization (얼굴 인식을 위한 연립 대각화와 국부 선형 임베딩)

  • Kim, Eun-Sol;Noh, Yung-Kyun;Zhang, Byoung-Tak
    • Journal of KIISE
    • /
    • v.42 no.2
    • /
    • pp.235-241
    • /
    • 2015
  • Locally linear embedding (LLE) [1] is a manifold learning algorithm that preserves the inner products between high-dimensional data points when embedding them into a low-dimensional space. LLE embeds data points lying on the same subspace close together in the low-dimensional space, because such points have large inner products. On the other hand, if data points are orthogonal to each other, they are embedded separately in the low-dimensional space, even if they are in close proximity in the high-dimensional space. Meanwhile, it is well known that facial images of the same person under varying illumination lie in a low-dimensional linear subspace [2]. In this study, we suggest an improved LLE method for the face recognition problem. The method exploits the characteristic of LLE that data points are embedded completely separately when they are orthogonal to each other. To accomplish this, the subspaces spanned by the individual classes are forced to be mutually orthogonal, which is achieved with the simultaneous diagonalization (SD) technique; a sketch of simultaneous diagonalization follows this entry. Experimental results show that the suggested method dramatically improves both the embedding and the classification performance.
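
The simultaneous diagonalization step can be illustrated on its own: a generalized eigendecomposition of two symmetric matrices yields a single transform that diagonalizes both at once. The sketch below is a generic recipe on toy scatter matrices, not the paper's full algorithm (the per-class subspace construction and the LLE embedding itself are omitted).

```python
# Sketch of simultaneous diagonalization (SD) of two symmetric matrices A and B
# via the generalized eigenproblem A w = lambda B w. The transform W satisfies
# W.T @ B @ W = I and W.T @ A @ W = diag(eigenvalues).
# Toy class scatter matrices stand in for the paper's per-class subspaces.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
X1 = rng.normal(size=(200, 5))   # toy samples of "class 1"
X2 = rng.normal(size=(200, 5))   # toy samples of "class 2"
A = np.cov(X1, rowvar=False)
B = np.cov(X2, rowvar=False) + 1e-6 * np.eye(5)  # keep B positive definite

eigvals, W = eigh(A, B)          # generalized symmetric eigenproblem
print(np.round(W.T @ B @ W, 6))  # ~ identity matrix
print(np.round(W.T @ A @ W, 6))  # ~ diag(eigvals)
```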

Feature Extraction via Sparse Difference Embedding (SDE)

  • Wan, Minghua;Lai, Zhihui
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.11 no.7
    • /
    • pp.3594-3607
    • /
    • 2017
  • Traditional feature extraction methods such as principal component analysis (PCA) cannot capture the local structure of the samples, and locally linear embedding (LLE) cannot capture their global structure. Moreover, a common drawback of the existing PCA and LLE algorithms is that they do not handle the sparseness of the samples well. Therefore, by integrating the globality of PCA and the locality of LLE with a sparse constraint, we developed an improved, unsupervised difference algorithm called Sparse Difference Embedding (SDE) for dimensionality reduction of high-dimensional data in small-sample-size problems. Differing significantly from the existing PCA and LLE algorithms, SDE seeks a set of projections that not only preserve the intraclass locality and maximize the interclass globality, but also use Lasso regression to obtain a sparse transformation matrix; the role of the Lasso step is sketched after this entry. This characteristic makes SDE more intuitive and more powerful than PCA and LLE. Finally, the proposed algorithm was evaluated through experiments on the Yale and AR face image databases and the USPS handwritten digit database. The experimental results show that SDE outperforms PCA, LLE, and UDP owing to its sparse discriminating characteristics, which also indicates that SDE is an effective method for face recognition.
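
The role of Lasso regression in producing a sparse transformation matrix can be shown in isolation: given some target low-dimensional coordinates (here plain PCA scores stand in for SDE's globality/locality difference objective), regressing each coordinate on the original features with an L1 penalty yields projection vectors with many exact zeros. This is only a hedged illustration of the sparsity mechanism, not an implementation of SDE.

```python
# Sketch of the Lasso step only: learn a sparse transformation matrix by
# regressing target low-dimensional coordinates on the original features.
# PCA scores stand in for SDE's actual globality/locality objective.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

X = load_digits().data
Y = PCA(n_components=5).fit_transform(X)   # stand-in target embedding

# One Lasso regression per embedding dimension -> columns of a sparse matrix.
W = np.column_stack([
    Lasso(alpha=0.1, max_iter=10000).fit(X, Y[:, j]).coef_
    for j in range(Y.shape[1])
])
print("transformation matrix shape:", W.shape)
print("fraction of exactly-zero entries:", np.mean(W == 0.0))
```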

Discovering Meaningful Trends in the Inaugural Addresses of United States Presidents Via Text Mining (텍스트마이닝을 활용한 미국 대통령 취임 연설문의 트렌드 연구)

  • Cho, Su Gon;Cho, Jaehee;Kim, Seoung Bum
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.41 no.5
    • /
    • pp.453-460
    • /
    • 2015
  • Identification of meaningful patterns and trends in large volumes of text data is an important task in various research areas. In the present study, we propose a procedure to find meaningful tendencies based on a combination of text mining, cluster analysis, and low-dimensional embedding; a generic sketch of such a pipeline follows this entry. To demonstrate the applicability and effectiveness of the proposed procedure, we analyzed the inaugural addresses of the presidents of the United States from 1789 to 2009. The main results of this study show that trends in the national policy agenda can be discovered using clustering and visualization algorithms.
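
The text mining, clustering, and embedding combination described here follows a fairly standard pipeline, and a generic sketch is shown below: TF-IDF features, k-means clusters, and LLE for a 2-D layout. The toy corpus, number of clusters, and choice of embedding method are placeholders and not the study's actual data or settings.

```python
# Generic sketch of the pipeline: vectorize documents, cluster them, and embed
# the feature vectors in 2-D for visualization.
# The toy corpus and parameter choices are placeholders, not the study's data.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.manifold import LocallyLinearEmbedding

documents = [
    "we must preserve the union and the constitution",
    "the economy and jobs for every citizen",
    "peace abroad and strength at home",
    "freedom and liberty for all people",
    "taxes trade and the national debt",
    "war demands sacrifice from the nation",
]  # placeholder corpus; the study used inaugural addresses from 1789 to 2009

X = TfidfVectorizer(stop_words="english").fit_transform(documents).toarray()
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
coords = LocallyLinearEmbedding(n_neighbors=3, n_components=2).fit_transform(X)

for doc, label, xy in zip(documents, labels, coords):
    print(label, xy.round(3), doc[:40])
```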

ON DIFFERENTIAL INVARIANTS OF HYPERPLANE SYSTEMS ON NONDEGENERATE EQUIVARIANT EMBEDDINGS OF HOMOGENEOUS SPACES

  • HONG, JAEHYUN
    • Communications of the Korean Mathematical Society
    • /
    • v.30 no.3
    • /
    • pp.253-267
    • /
    • 2015
  • Given a complex submanifold $M$ of the projective space $\mathbb{P}(T)$, the hyperplane system $R$ on $M$ characterizes the projective embedding of $M$ into $\mathbb{P}(T)$ in the following sense: for any two nondegenerate complex submanifolds $M \subset \mathbb{P}(T)$ and $M' \subset \mathbb{P}(T')$, there is a projective linear transformation that sends an open subset of $M$ onto an open subset of $M'$ if and only if $(M, R)$ is locally equivalent to $(M', R')$. Se-ashi developed a theory for the differential invariants of these types of systems of linear differential equations. In particular, the theory applies to systems of linear differential equations that have symbols equivalent to the hyperplane systems on nondegenerate equivariant embeddings of compact Hermitian symmetric spaces. In this paper, we extend this result to hyperplane systems on nondegenerate equivariant embeddings of homogeneous spaces of the first kind.

A review on the t-distributed stochastic neighbors embedding (t-SNE에 대한 요약)

  • Kipoong Kim;Choongrak Kim
    • The Korean Journal of Applied Statistics
    • /
    • v.36 no.2
    • /
    • pp.167-173
    • /
    • 2023
  • This paper investigates several methods for visualizing high-dimensional data in a low-dimensional space. First, principal component analysis and multidimensional scaling are briefly introduced as linear approaches; then kernel principal component analysis, the self-organizing map, locally linear embedding, Isomap, Laplacian eigenmaps, and local multidimensional scaling are introduced as nonlinear approaches. In particular, t-SNE, which is widely used but relatively unfamiliar in the field of statistics, is described in more detail. We also present a simple example applying several of these methods, including t-SNE; a sketch of such a comparison follows this entry. Finally, we review several recent studies pointing out the limitations of t-SNE and discuss the future research problems they raise.
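
A simple example of the kind mentioned in the abstract can be reproduced with scikit-learn by running a few of the listed methods on a single dataset. The digits data and all parameter values below are arbitrary illustrations, not those used in the paper.

```python
# Sketch of a simple comparison of several of the listed methods on one
# dataset; the digits data and parameters are illustrative, not the paper's.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA, KernelPCA
from sklearn.manifold import Isomap, LocallyLinearEmbedding, TSNE

X, y = load_digits(return_X_y=True)

embeddings = {
    "PCA": PCA(n_components=2).fit_transform(X),
    "kernel PCA": KernelPCA(n_components=2, kernel="rbf").fit_transform(X),
    "LLE": LocallyLinearEmbedding(n_neighbors=10, n_components=2).fit_transform(X),
    "Isomap": Isomap(n_neighbors=10, n_components=2).fit_transform(X),
    "t-SNE": TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X),
}
for name, Z in embeddings.items():
    print(f"{name}: {Z.shape}")   # each method yields a 2-D embedding
```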

Modelling of timber joints made with steel dowels and locally reinforced by DVW discs

  • Guan, Zhongwei;Rodd, Peter
    • Structural Engineering and Mechanics
    • /
    • v.16 no.4
    • /
    • pp.391-404
    • /
    • 2003
  • Local reinforcement in dowel-type timber joints is essential to improve ductility, to increase load-carrying capacity, and to reduce the risk of brittle failure, especially when solid dowels are used. Of the many reinforcing materials available today, DVW (densified veneer wood) has been demonstrated to be the most advantageous in terms of compatibility, embedding performance, and ductility. Preliminary studies show that appropriately sized DVW discs bonded into the timber interfaces may be an effective way to reinforce the connection. In this paper, non-linear three-dimensional finite element models, incorporating orthotropic and non-linear material behaviour, have been developed to simulate the structural performance of timber joints locally reinforced by DVW discs. Different contact algorithms were applied to simulate contact conditions in the joints. The models were validated against the corresponding structural tests, and the correlation between the experimental results and the finite element simulations is reasonably good. Using the validated finite element models, parametric studies were undertaken to investigate the effects of the DVW disc sizes and the end distances on the shear and normal stresses in a possible failure plane of the joint.

Comparative Study of Dimension Reduction Methods for Highly Imbalanced Overlapping Churn Data

  • Lee, Sujee;Koo, Bonhyo;Jung, Kyu-Hwan
    • Industrial Engineering and Management Systems
    • /
    • v.13 no.4
    • /
    • pp.454-462
    • /
    • 2014
  • Retention of potentially churning customers is one of the most important issues in customer relationship management, so companies try to predict churning customers using their large-scale, high-dimensional data. This study focuses on dealing with such large data sets by reducing their dimensionality. Using six different dimension reduction methods (principal component analysis (PCA), factor analysis (FA), locally linear embedding (LLE), local tangent space alignment (LTSA), locality preserving projections (LPP), and a deep auto-encoder), our experiments apply each method to the training data, build a classification model on the mapped data, and then measure the performance using the hit rate to compare the methods. In the results, PCA shows good performance despite its simplicity, and the deep auto-encoder gives the best overall performance. These results can be explained by the characteristics of the churn prediction data, which are highly correlated and overlap across the classes. We also propose a simple out-of-sample extension method for the nonlinear dimension reduction methods LLE and LTSA, utilizing this characteristic of the data; a generic sketch of such an extension for LLE follows this entry.
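
A simple out-of-sample extension for LLE of the kind mentioned above can be sketched as follows: for each new point, compute barycentric reconstruction weights over its nearest training neighbors and apply the same weights to those neighbors' embedded coordinates. This is a generic, textbook-style extension offered only as an illustration; it is not claimed to be the paper's exact proposal, and the swiss-roll data and parameters are arbitrary.

```python
# Sketch of a simple out-of-sample extension for LLE: reconstruct each new
# point from its k nearest training points, then apply the same weights to
# the training points' embedded coordinates. A generic recipe, not
# necessarily the paper's exact method.
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.neighbors import NearestNeighbors

X_train, _ = make_swiss_roll(n_samples=1500, random_state=0)
X_new, _ = make_swiss_roll(n_samples=10, random_state=1)

k = 12
lle = LocallyLinearEmbedding(n_neighbors=k, n_components=2, random_state=0)
Y_train = lle.fit_transform(X_train)
nn = NearestNeighbors(n_neighbors=k).fit(X_train)

def embed_new(x):
    _, idx = nn.kneighbors(x.reshape(1, -1))
    Z = X_train[idx[0]] - x                # neighbors centred on the new point
    G = Z @ Z.T + 1e-3 * np.eye(k)         # regularized local Gram matrix
    w = np.linalg.solve(G, np.ones(k))
    w /= w.sum()                           # reconstruction weights sum to 1
    return w @ Y_train[idx[0]]             # reuse the weights in embedding space

Y_new = np.array([embed_new(x) for x in X_new])
print(Y_new.shape)  # (10, 2)
```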