• Title/Summary/Keyword: Lexical Analysis (어휘분석)

Search Results: 863, Processing Time: 0.02 seconds

A Semantic Interpretation of the Design Language in the ChwuiseokJeong Wonlim of Gochang - Focusing on the Alegory and Mimesis in 'Chwuiseok' and 'Chilseongam' - (취석정원림에 담긴 조형언어의 의미론적 해석 - '취석'과 '칠성암'에 담긴 알레고리와 미메시스를 중심으로 -)

  • Rho, Jae-Hyun;Lee, Hyun-Woo;Lee, Jung-Han
    • Journal of the Korean Institute of Traditional Landscape Architecture
    • /
    • v.30 no.1
    • /
    • pp.76-89
    • /
    • 2012
  • This study carried out a semantic interpretation of the core design language that appears to have deeply influenced the creation of the ChwuiseokJeong wonlim of Gochang. In particular, the paper aimed to infer how the spiritual culture of seclusion in the 16th century influenced the creation of the wonlim, by grasping the metaphor and symbolism — that is, the meanings transmitted and received by the creator and the people concerned — around keywords such as Eunil(隱逸: seclusion), Chwuiseok(醉石), and Chilseongam(七星巖). 'Building up a wall' was intentionally carried out in order to represent the 'Seven Stars(The Big Dipper)' inside the wonlim. This is a kind of two-dimensional 'enframement' and the result of the active creation of a meaningful landscape. From Chilseongam, which was created by assembling rocks, we infer that the creator, Kyung-Hee Kim, Nohgye(蘆溪), expressed the astronomical recognition and thought of a Confucian scholar: that the ChwuiseokJeong wonlim where he secluded himself was the center of the universe. The interpretation of words in Nohgyezip, an anthology, showed that the writings of Nohgye, his descendants, and the people associated with ChwuiseokJeong frequently mention alcohol, Chwuiseok, Yeon-Myung Do, and Yuli(栗里), where Do secluded himself; this means that Nohgye ranked himself with Do, because Nohgye also lived in peace by drinking and enjoying nature as Do did. 'Drinking' expressed the mind of Nohgye, who wanted to be free and to enjoy mountains, water, and their landscape as Do did. In other words, 'drinking' is a symbol of the freedom that made him forget himself and equate himself with nature. These are the representation, imitation, and mimesis of his respect for Yeon-Myung Do.
As the allegory of 'speaking of one thing through another' suggests, 'Chwuiseok', which derives from the story of Yeon-Myung Do, can be read in multiple ways: superficially it points to 'a rock on which he lay when he was drinking', but it can also be interpreted as 'an object' that made him forget his personal troubles. In addition, it signifies the free will that protects an unselfish mind through the spiritual aberration of drunkenness, 'Chwui(醉)'; it can also be interpreted metaphorically and broadly as a means by which Nohgye reached the state of nature through the contented mind of Yeon-Myung Do. 'Chwuiseok' was a design language that showed Nohgye's situation by comparing his mind with that of Yeon-Myung Do from a Confucian point of view, and a kind of behavioral mimesis based on his respect for Do and on the 'aesthetic representation of objective reality.' It is no coincidence that this mimesis appears both in the words engraved on Chwuiseok and in the creation of pavilions bearing the name ChwuiseokJeong in Korea and China.

An Interpretation of the Landscape Meaning and Culture of Anpyung-Daegun(Prince)'s Bihaedang Garden (안평대군 비해당(匪懈堂) 원림의 의미경관과 조경문화)

  • Shin, Sang-Sup;Rho, Jae-Hyun
    • Journal of the Korean Institute of Traditional Landscape Architecture
    • /
    • v.29 no.1
    • /
    • pp.28-37
    • /
    • 2011
  • In this study, the series of poems Bihaedangsasippalyoung(48 poems on the beautiful scenery of Bihaedang), written by the scholars of Jiphyonjeon for the Bihaedang garden of Anpyung-Daegun(Prince Anpyung, 1416-1453), was analyzed with a focus on scenery lexemes in order to interpret the meaning of scenery and the gardening culture of the Sadaebu(noblemen) in the early Chosun Dynasty. The study results are as follows. First, the subtitles of the Sasippalyoung(48 poems), written while Anpyung-Daegun cultivated the Bihaedang garden at the foot of Inwang Mountain, showed a repetitive normativity pairing yin and yang, such as the life and form of animals and plants, time and space, and meaning and symbolism. Among the scenery lexemes, 38 represented plants and flowers, and 8 represented gardening ornaments and animals. Second, the gardens were referred to as Wonrim, Jongje, Imchon(trees and ponds), or Hwawon(flower garden), or as Gongjeong(empty garden), Manwon(full garden), Jungjeong(middle garden), Huwon(backyard), Wonrak(inner court), or Byulwon(separated garden) depending on density and location. In addition, there were pavilions and ponds, stepping stones and stairs, a pergola, a flat bench, flowerpots, an artificial hill, oddly shaped stones, wells, an aviary, flower beds, and hedges. A gardener was called Sahwa(flower keeper), the planting and tending of garden trees was called Jaebae(cultivation), a pond island was called Boogoo(floating hill), and miniature landscapes were called Chukjee(reduced land). Third, willows were planted in the outer yard, and plum trees were planted in front of the library, which led to a road through bamboo woods. Peony, camellia, tree peony, and crepe myrtle were planted in the inner court together with mossy rocks, small artificial hills, glass rocks, and flower pots. There were rectangular ponds, and deer, doves, roosters, and cranes were raised.
Fourth, landscape elements were enjoyed as a metaphysical, symbolic landscape through anthropomorphism, representing (1) gentlemen and loyalty, (2) wealth and prosperity, (3) the Taoist hermit and the poetic life, (4) reclusion and seclusion, and (5) filial piety, virtue, and introspection. In other words, the garden presented a variety of gardening cultures appreciating meaningful landscapes, such as the investigation of things, reclusion and seclusion, and the orientation toward a fairyland yearning for eternal youth and Mureungdowon(Taoist Arcadia), by making a garden that blended beautiful flowers and trees with precious birds and animals. Fifth, there were many schemes of landscape appreciation, such as Angkyung(looking up), Bukyung(looking down), Jeokyung(looking under), Chakyung(bringing outer scenery inside), Yookyung(flower viewing), Yojeong(walking around the garden enjoying flowers), Hwasaekhyangbyuk(flower gardening), and garden appreciation enjoying the landscape through time and seasons with different inspirations.

Korean Sentence Generation Using Phoneme-Level LSTM Language Model (한국어 음소 단위 LSTM 언어모델을 이용한 문장 생성)

  • Ahn, SungMahn;Chung, Yeojin;Lee, Jaejoon;Yang, Jiheon
    • Journal of Intelligence and Information Systems
    • /
    • v.23 no.2
    • /
    • pp.71-88
    • /
    • 2017
  • Language models were originally developed for speech recognition and language processing. Using a set of example sentences, a language model predicts the next word or character from the sequential input data. N-gram models have been widely used, but they cannot model the correlation between input units efficiently, since an n-gram is a probabilistic model based on the frequency of each unit in the training set. Recently, as deep learning algorithms have developed, recurrent neural network (RNN) models and long short-term memory (LSTM) models have been widely used as neural language models (Ahn, 2016; Kim et al., 2016; Lee et al., 2016). These models can reflect the dependency between objects that enter the model sequentially (Gers and Schmidhuber, 2001; Mikolov et al., 2010; Sundermeyer et al., 2012). To train a neural language model, texts need to be decomposed into words or morphemes. Since a training set of sentences generally includes a huge number of distinct words or morphemes, however, the dictionary becomes very large, which increases model complexity. In addition, word-level or morpheme-level models can generate only the vocabulary contained in the training set. Furthermore, for highly morphological languages such as Turkish, Hungarian, Russian, Finnish, or Korean, morpheme analyzers are more likely to introduce errors in the decomposition process (Lankinen et al., 2016). Therefore, this paper proposes a phoneme-level language model for Korean based on LSTM models. A phoneme, such as a vowel or a consonant, is the smallest unit that composes Korean texts. We constructed the language models using three or four LSTM layers. Each model was trained using the stochastic gradient algorithm as well as more advanced optimization algorithms such as Adagrad, RMSprop, Adadelta, Adam, Adamax, and Nadam. A simulation study was conducted on Old Testament texts using the deep learning package Keras with the Theano backend.
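The phoneme-level decomposition described above can be sketched in plain Python. Modern composed Hangul syllables occupy a contiguous Unicode block (U+AC00 to U+D7A3), so each syllable splits arithmetically into a leading consonant, a vowel, and an optional final consonant. This is an illustrative sketch of the general technique, not the authors' actual preprocessing code:

```python
# Decompose Hangul syllables into phonemes (jamo) via Unicode arithmetic.
# Illustrative sketch only; the paper's preprocessing code is not shown.

LEADS = list("ㄱㄲㄴㄷㄸㄹㅁㅂㅃㅅㅆㅇㅈㅉㅊㅋㅌㅍㅎ")            # 19 leading consonants
VOWELS = list("ㅏㅐㅑㅒㅓㅔㅕㅖㅗㅘㅙㅚㅛㅜㅝㅞㅟㅠㅡㅢㅣ")        # 21 vowels
TAILS = [""] + list("ㄱㄲㄳㄴㄵㄶㄷㄹㄺㄻㄼㄽㄾㄿㅀㅁㅂㅄㅅㅆㅇㅈㅊㅋㅌㅍㅎ")  # 27 finals + empty

def to_phonemes(text):
    """Split each composed Hangul syllable into jamo; pass other chars through."""
    out = []
    for ch in text:
        code = ord(ch) - 0xAC00
        if 0 <= code <= 11171:                  # inside the Hangul syllable block
            lead, rest = divmod(code, 588)      # 588 = 21 vowels * 28 finals
            vowel, tail = divmod(rest, 28)
            out.extend([LEADS[lead], VOWELS[vowel]])
            if TAILS[tail]:                     # tail index 0 means no final
                out.append(TAILS[tail])
        else:
            out.append(ch)                      # spaces, punctuation, etc.
    return out

print(to_phonemes("한국"))  # ['ㅎ', 'ㅏ', 'ㄴ', 'ㄱ', 'ㅜ', 'ㄱ']
```

Decomposing this way keeps the model's alphabet tiny (a few dozen symbols rather than thousands of syllables or words), which is the motivation stated in the abstract.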
After pre-processing the texts, the dataset included 74 unique characters, including vowels, consonants, and punctuation marks. We then constructed each input vector from 20 consecutive characters, with the following 21st character as the output. In total, 1,023,411 input-output pairs were included in the dataset, divided into training, validation, and test sets in a 70:15:15 ratio. All simulations were conducted on a system equipped with an Intel Xeon CPU (16 cores) and an NVIDIA GeForce GTX 1080 GPU. We compared the loss function evaluated on the validation set, the perplexity evaluated on the test set, and the time taken to train each model. As a result, all the optimization algorithms except the stochastic gradient algorithm showed similar validation loss and perplexity, which were clearly superior to those of the stochastic gradient algorithm. The stochastic gradient algorithm also took the longest to train for both the 3- and 4-layer LSTM models. On average, the 4-LSTM-layer model took 69% longer to train than the 3-LSTM-layer model, yet its validation loss and perplexity were not significantly improved and even became worse under some conditions. On the other hand, when comparing the automatically generated sentences, the 4-LSTM-layer model tended to generate sentences closer to natural language than the 3-LSTM-layer model. Although there were slight differences in the completeness of the generated sentences between the models, the sentence generation performance was quite satisfactory under all simulation conditions: the models generated only legitimate Korean letters, and the use of postpositions and the conjugation of verbs were almost grammatically perfect. The results of this study are expected to be widely used for Korean language processing in the fields of language processing and speech recognition, which form the basis of artificial intelligence systems.
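The windowing scheme described above (20 consecutive characters as input, the 21st as the target) and the standard definition of perplexity as the exponential of the mean negative log-likelihood can be sketched as follows; the character sequence below is a hypothetical stand-in for the actual phoneme stream:

```python
import math

def make_windows(seq, width=20):
    """Slide a window over the sequence: each width-length slice is an input
    and the character immediately after it is the prediction target."""
    return [(seq[i:i + width], seq[i + width]) for i in range(len(seq) - width)]

def perplexity(probs):
    """Perplexity = exp of the mean negative log-likelihood assigned by the
    model to the true targets (probs are the model's probabilities)."""
    nll = -sum(math.log(p) for p in probs) / len(probs)
    return math.exp(nll)

seq = "abcdefghijklmnopqrstuvwxyz"   # hypothetical stand-in for the phoneme stream
pairs = make_windows(seq)
print(len(pairs))                     # 26 - 20 = 6 input-output pairs
print(pairs[0])                       # ('abcdefghijklmnopqrst', 'u')
```

Note that a sequence of length N yields N - 20 pairs, which is consistent with the abstract's roughly one-million-character corpus producing 1,023,411 input-output pairs.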