• Title/Summary/Keyword: Embeddings

Search results: 91

PROJECTIONS OF BOUQUET GRAPH WITH TWO CYCLES

  • Huh, Young-Sik
    • Journal of the Korean Mathematical Society / v.45 no.5 / pp.1341-1360 / 2008
  • In this paper we investigate the projections of the bouquet graph B with two cycles. A projection of B is said to be trivial if only trivial embeddings are obtained from the projection. It is shown that, to cover all nontrivial projections of B, at least three embeddings of B are needed. We also show that a nontrivial projection of B is covered by one of two specific embeddings if the image of each cycle has at most one self-crossing.

SMOOTH HOROSPHERICAL VARIETIES OF PICARD NUMBER ONE AS LINEAR SECTIONS OF RATIONAL HOMOGENEOUS VARIETIES

  • Hong, Jaehyun
    • Journal of the Korean Mathematical Society / v.53 no.2 / pp.433-446 / 2016
  • We construct projective embeddings of horospherical varieties of Picard number one by means of Fano varieties of cones over rational homogeneous varieties. Then we use them to give embeddings of smooth horospherical varieties of Picard number one as linear sections of rational homogeneous varieties.

ENUMERATING EMBEDDINGS OF A DARTBOARD GRAPH INTO SURFACES

  • Kim, Jin-Hwan;Kim, Hye-Kyung;Lim, Dae-Keun
    • Communications of the Korean Mathematical Society / v.11 no.4 / pp.1095-1104 / 1996
  • We enumerate the congruence classes of 2-cell embeddings of a dartboard graph into surfaces with respect to a group consisting of graph automorphisms of a dartboard graph.

A Discourse-based Compositional Approach to Overcome Drawbacks of Sequence-based Composition in Text Modeling via Neural Networks

  • Lee, Kangwook;Han, Sanggyu;Myaeng, Sung-Hyon
    • KIISE Transactions on Computing Practices / v.23 no.12 / pp.698-702 / 2017
  • Since the introduction of Deep Neural Networks to the Natural Language Processing field, two major approaches have been considered for modeling text. One is to learn embeddings, i.e., distributed representations capturing the abstract semantics of words or sentences, from their textual context. The other is to compose the embeddings trained this way into embeddings of longer texts. However, most studies of composition methods simply adopt word embeddings without considering the optimal embedding unit or the optimal method of composition. In this paper, we conduct experiments to analyze the optimal embedding unit and the optimal composition method for modeling longer texts, such as documents. In addition, we suggest a new discourse-based composition to overcome the limitations of the sequential composition method when composing sentence embeddings.
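The contrast drawn in this abstract, composing embeddings sequentially versus grouping them by discourse unit first, can be sketched as follows. The toy word vectors, the use of sentences as discourse units, and the two-level averaging are illustrative assumptions, not the authors' actual models.

```python
# Sequential vs. (simplified) discourse-based composition of embeddings.
# Toy 2-d word vectors; a real system would use trained embeddings.

WORD_VECS = {
    "deep": [0.9, 0.1], "nets": [0.8, 0.2],
    "learn": [0.7, 0.3], "text": [0.2, 0.8], "well": [0.1, 0.9],
}

def mean(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def sequential_composition(doc):
    """Average all word vectors in the document, ignoring sentence breaks."""
    words = [w for sent in doc for w in sent]
    return mean([WORD_VECS[w] for w in words])

def discourse_composition(doc):
    """Compose each discourse unit (here: sentence) first, then combine."""
    sent_embs = [mean([WORD_VECS[w] for w in sent]) for sent in doc]
    return mean(sent_embs)

doc = [["deep", "nets", "learn"], ["text", "well"]]
print(sequential_composition(doc))
print(discourse_composition(doc))
```

The two results differ because the short second sentence contributes equally in the discourse-based version but is outweighed word-for-word in the sequential one.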

ON DIFFERENTIAL INVARIANTS OF HYPERPLANE SYSTEMS ON NONDEGENERATE EQUIVARIANT EMBEDDINGS OF HOMOGENEOUS SPACES

  • HONG, JAEHYUN
    • Communications of the Korean Mathematical Society / v.30 no.3 / pp.253-267 / 2015
  • Given a complex submanifold M of the projective space $\mathbb{P}(T)$, the hyperplane system R on M characterizes the projective embedding of M into $\mathbb{P}(T)$ in the following sense: for any two nondegenerate complex submanifolds $M \subset \mathbb{P}(T)$ and $M' \subset \mathbb{P}(T')$, there is a projective linear transformation that sends an open subset of M onto an open subset of M' if and only if (M, R) is locally equivalent to (M', R'). Se-ashi developed a theory for the differential invariants of these types of systems of linear differential equations. In particular, the theory applies to systems of linear differential equations that have symbols equivalent to the hyperplane systems on nondegenerate equivariant embeddings of compact Hermitian symmetric spaces. In this paper, we extend this result to hyperplane systems on nondegenerate equivariant embeddings of homogeneous spaces of the first kind.

ON ORBIFOLD EMBEDDINGS

  • Cho, Cheol-Hyun;Hong, Hansol;Shin, Hyung-Seok
    • Journal of the Korean Mathematical Society / v.50 no.6 / pp.1369-1400 / 2013
  • The concept of "orbifold embedding" is introduced. This notion is more general than that of a sub-orbifold. Some properties of orbifold embeddings are studied, and in the case of translation groupoids, an orbifold embedding is shown to be equivalent to a strong equivariant immersion.

CLASSIFICATION OF REFLEXIBLE EDGE-TRANSITIVE EMBEDDINGS OF $K_{m,n}$ FOR ODD m, n

  • Kwon, Young-Soo
    • East Asian Mathematical Journal / v.25 no.4 / pp.533-541 / 2009
  • In this paper, we classify the reflexible edge-transitive embeddings of the complete bipartite graphs $K_{m,n}$ for any odd positive integers m and n. As a result, for any odd m and n, we show that there exists exactly one reflexible edge-transitive embedding of $K_{m,n}$ up to isomorphism.

An Efficient Deep Learning Ensemble Using a Distribution of Label Embedding

  • Park, Saerom
    • Journal of the Korea Society of Computer and Information / v.26 no.1 / pp.27-35 / 2021
  • In this paper, we propose a new stacking ensemble framework for deep learning models that reflects the distribution of label embeddings. Our ensemble framework consists of two phases: training the baseline deep learning classifier, and training the sub-classifiers based on the clustering results of the label embeddings. The framework aims to divide a multi-class classification problem into small sub-problems based on these clustering results. The clustering is conducted on the label embeddings obtained from the weights of the last layer of the baseline classifier. After clustering, sub-classifiers are constructed to classify the sub-classes in each cluster. From the experimental results, we found that the label embeddings reflect the relationships between classification labels well, and that our ensemble framework can improve classification performance on the CIFAR-100 dataset.
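The core step described in this abstract, treating each row of the classifier's final-layer weight matrix as that label's embedding and clustering the labels, can be sketched as follows. The toy weight vectors, the two initial centers, and the minimal k-means are illustrative assumptions; the paper's actual classifier, dataset, and clustering setup differ.

```python
# Treat each row of the final-layer weight matrix as the embedding of the
# corresponding label, then cluster the labels into sub-problems.
# Toy 2-d weights and a minimal k-means for illustration only.

LABEL_WEIGHTS = {
    "cat": [1.0, 0.1],
    "dog": [0.9, 0.2],
    "car": [0.1, 1.0],
    "bus": [0.2, 0.9],
}

def kmeans(points, centers, iters=10):
    assign = {}
    for _ in range(iters):
        # Assign each label to its nearest center (squared Euclidean).
        for name, p in points.items():
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            assign[name] = dists.index(min(dists))
        # Recompute each center as the mean of its assigned label embeddings.
        for k in range(len(centers)):
            members = [points[n] for n, c in assign.items() if c == k]
            if members:
                centers[k] = [sum(xs) / len(members) for xs in zip(*members)]
    return assign

clusters = kmeans(LABEL_WEIGHTS, centers=[[1.0, 0.0], [0.0, 1.0]])
print(clusters)
```

Each resulting cluster then gets its own sub-classifier that only distinguishes the labels within that cluster.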

Latent Semantic Analysis Approach for Document Summarization Based on Word Embeddings

  • Al-Sabahi, Kamal;Zuping, Zhang;Kang, Yang
    • KSII Transactions on Internet and Information Systems (TIIS) / v.13 no.1 / pp.254-276 / 2019
  • Since the amount of information on the internet is growing rapidly, it is not easy for a user to find information relevant to his or her query. To tackle this issue, researchers are paying much attention to Document Summarization. The key to any successful document summarizer is a good document representation. Traditional approaches based on word overlap mostly fail to produce that kind of representation. Word embeddings have shown good performance, allowing words to be matched on a semantic level. Naively concatenating word embeddings makes common words dominant, which in turn diminishes the representation quality. In this paper, we employ word embeddings to improve the weighting schemes used to calculate the Latent Semantic Analysis input matrix. Two embedding-based weighting schemes are proposed and then combined to calculate the values of this matrix. They are modified versions of the augment weight and the entropy frequency that combine the strengths of traditional weighting schemes and word embeddings. The proposed approach is evaluated on three English datasets: DUC 2002, DUC 2004, and Multilingual 2015 Single-document Summarization. Experimental results on the three datasets show that the proposed model achieves performance competitive with the state-of-the-art, leading to the conclusion that it provides a better document representation and, as a result, a better document summary.
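The idea of folding embedding information into the weights of the LSA input matrix can be sketched as follows. The toy word vectors, the classic augmented term-frequency formula, and the cosine-similarity factor are illustrative assumptions; the paper's actual modified augment weight and entropy frequency schemes differ in detail.

```python
# Build a term-by-sentence matrix for LSA, weighting each cell by an
# augmented term frequency scaled by the cosine similarity between the
# term's embedding and the sentence centroid. Toy 2-d vectors only.
import math

WORD_VECS = {"data": [1.0, 0.0], "model": [0.8, 0.6], "cake": [0.0, 1.0]}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def centroid(words):
    vecs = [WORD_VECS[w] for w in words]
    return [sum(c) / len(vecs) for c in zip(*vecs)]

def weighted_matrix(sentences, vocab):
    matrix = []
    for term in vocab:
        row = []
        for sent in sentences:
            tf = sent.count(term)
            max_tf = max(sent.count(w) for w in set(sent))
            aug = 0.5 + 0.5 * tf / max_tf              # classic augmented TF
            sim = cosine(WORD_VECS[term], centroid(sent))  # embedding factor
            row.append(aug * sim if tf else 0.0)
        matrix.append(row)
    return matrix

sents = [["data", "model", "data"], ["cake"]]
M = weighted_matrix(sents, vocab=["data", "model", "cake"])
print(M)
```

The resulting matrix would then be fed to a singular value decomposition, as in standard LSA-based summarization, so that semantically central terms receive higher weight than raw counts alone would give them.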