• Title/Summary/Keyword: Problem Generation

Search Results: 4,184, Processing Time: 0.038 seconds

Evaluation of Control Pollination Efficiency and Management Status in Control Pollinated Progeny Populations of Pinus densiflora using Pedigree Analysis based on Microsatellite Markers (소나무 인공교배 차대집단에서 Microsatellite marker 혈통분석을 이용한 인공교배 효율 및 관리상태 평가)

  • Tae-Lim Yeo;Jihun Kim;Dayoung Lee;Kyu-Suk Kang
    • Journal of Korean Society of Forest Science
    • /
    • v.112 no.2
    • /
    • pp.157-172
    • /
    • 2023
  • Controlled pollination (CP) is an important method in tree breeding programs because CP quickly generates desirable genotypes and rapidly maximizes genetic gains. However, few studies have evaluated the efficiency and success rate of CP in the breeding program of Pinus densiflora. To evaluate CP and the management of control-pollinated progenies, we used 159 individuals from CB2 × KW40 or KW40 × CB2 populations established in 2015. After genotyping microsatellite loci, we assessed whether the number of primers was sufficient and then performed pedigree analysis. The number of primers proved sufficient. By pedigree analysis, we found that 60 of the 159 individuals had been generated by the mating between CB2 and KW40. The maternity analysis gave evidence indicating possible management problems. We therefore excluded 54 individuals and repeated the pedigree analysis; in this second analysis, 47 of 105 individuals were generated by the mating between CB2 and KW40. To increase the efficiency of CP in tree breeding programs, several precautions are required. It is necessary to identify the exact clone names of the mother and father trees. In addition, CP processes should be performed properly, including deciding the schedule of CP and the isolation of female strobili or flowers. Finally, monitoring the management of hybrid progenies after mating is important. Molecular markers should be used to identify the clone names of the mother and father trees and to monitor post hoc management. This study provides a reference for future controlled pollination of pine species in tree breeding programs.
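The pedigree analysis described above rests on a Mendelian-compatibility check across microsatellite loci; the sketch below illustrates the idea in Python (the clone names are taken from the abstract, but the loci, allele sizes, and genotypes are invented, and the study itself used dedicated pedigree-analysis software):

```python
# Minimal sketch of a Mendelian-compatibility parentage check over
# microsatellite loci: an offspring is compatible with a putative
# parent pair if, at every locus, its two alleles can be split so
# that one came from each parent. All genotypes here are invented.

def locus_compatible(offspring, mother, father):
    """True if the offspring's allele pair at one locus can be
    explained by inheriting one allele from each parent."""
    a, b = offspring
    return (a in mother and b in father) or (b in mother and a in father)

def compatible_with_cross(offspring_gt, mother_gt, father_gt):
    """Check every genotyped locus; one incompatible locus excludes the cross."""
    return all(
        locus_compatible(offspring_gt[loc], mother_gt[loc], father_gt[loc])
        for loc in offspring_gt
    )

# Hypothetical genotypes at two microsatellite loci (allele sizes in bp).
cb2  = {"locus1": (152, 158), "locus2": (201, 205)}
kw40 = {"locus1": (150, 156), "locus2": (203, 209)}

true_progeny = {"locus1": (152, 156), "locus2": (205, 209)}
contaminant  = {"locus1": (140, 156), "locus2": (205, 209)}

print(compatible_with_cross(true_progeny, cb2, kw40))  # True
print(compatible_with_cross(contaminant, cb2, kw40))   # False: allele 140 absent in both parents
```

In practice many loci are genotyped, because a contaminant can be compatible with a cross by chance at any single locus; sufficiency of the marker set is exactly what the study assessed first.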

Prediction of Key Variables Affecting NBA Playoffs Advancement: Focusing on 3 Points and Turnover Features (미국 프로농구(NBA)의 플레이오프 진출에 영향을 미치는 주요 변수 예측: 3점과 턴오버 속성을 중심으로)

  • An, Sehwan;Kim, Youngmin
    • Journal of Intelligence and Information Systems
    • /
    • v.28 no.1
    • /
    • pp.263-286
    • /
    • 2022
  • This study acquires NBA statistical information for a total of 32 years from 1990 to 2022 using web crawling, observes variables of interest through exploratory data analysis, and generates related derived variables. Unused variables were removed through a purification process on the input data, and correlation analysis, t-tests, and ANOVA were performed on the remaining variables. For each variable of interest, the difference in means between the groups that did and did not advance to the playoffs was tested; to corroborate this, the mean differences among three ranking-based groups (upper/middle/lower) were reconfirmed. Of the input data, only the current season's data was used as the test set, and 5-fold cross-validation was performed by dividing the remainder into training and validation sets for model training. The overfitting problem was addressed by comparing the cross-validation results with the final results on the test set and confirming that there was no difference in the performance metrics. Because the quality of the raw data is high and the statistical assumptions are satisfied, most of the models showed good results despite the small dataset. This study not only predicts NBA game results and classifies playoff advancement using machine learning, but also examines whether the variables of interest rank among the major variables by analyzing the importance of the input attributes. Visualizing SHAP values made it possible to overcome the limitation that feature-importance results alone cannot be interpreted, and to compensate for the lack of consistency in importance calculations when variables are entered or removed. A number of variables related to three-pointers and turnovers, the subjects of interest in this study, were found among the major variables affecting playoff advancement in the NBA.
Although this study is similar to existing sports data analysis research in covering topics such as match results, playoffs, and championship prediction, and in comparatively analyzing several machine learning models, it differs in that the features of interest were set in advance and statistically verified before being compared against the machine learning results. It is also differentiated from existing studies by presenting explanatory visualizations using SHAP, one of the XAI methods.
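The group mean-difference test described in the abstract can be illustrated with a small sketch (Welch's t-statistic on invented per-game 3-point numbers, not the study's actual NBA data or pipeline):

```python
# Welch's t-statistic for the difference in mean made 3-pointers between
# playoff and non-playoff teams (toy numbers, not real NBA data).
import math
from statistics import mean, variance

def welch_t(x, y):
    """Welch's two-sample t-statistic (unequal variances)."""
    nx, ny = len(x), len(y)
    return (mean(x) - mean(y)) / math.sqrt(variance(x) / nx + variance(y) / ny)

playoff     = [12.1, 13.4, 11.8, 12.9, 13.0]   # made 3-pointers per game
non_playoff = [10.2, 11.0, 10.8, 9.9, 10.5]

t = welch_t(playoff, non_playoff)
print(round(t, 2))  # positive t: playoff teams make more 3s in this toy sample
```

A large positive t here would motivate checking, as the study did, whether 3-point variables also surface among the top features of a trained classifier.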

Carbon Dioxide-based Plastic Pyrolysis for Hydrogen Production Process: Sustainable Recycling of Waste Fishing Nets (이산화탄소 기반 플라스틱 열분해 수소 생산 공정: 지속가능한 폐어망 재활용)

  • Yurim Kim;Seulgi Lee;Sungyup Jung;Jaewon Lee;Hyungtae Cho
    • Korean Chemical Engineering Research
    • /
    • v.62 no.1
    • /
    • pp.36-43
    • /
    • 2024
  • Fishing net waste (FNW) constitutes over half of all marine plastic waste and is a major contributor to the degradation of marine ecosystems. While current treatment options for FNW include incineration, landfilling, and mechanical recycling, these methods often result in low-value products and pollutant emissions. Importantly, FNW, comprised of plastic polymers, can be converted into valuable resources like syngas and pyrolysis oil through pyrolysis. Thus, this study presents a process for generating high-purity hydrogen (H2) by catalytically pyrolyzing FNW in a CO2 environment. The proposed process comprises three stages: first, the pretreated FNW undergoes Ni/SiO2-catalyzed pyrolysis under CO2 to produce syngas and pyrolysis oil. Second, the produced pyrolysis oil is incinerated and repurposed as an energy source for the pyrolysis reaction. Lastly, the syngas is transformed into high-purity H2 via the water-gas shift (WGS) reaction and pressure swing adsorption (PSA). This study compares the results of the proposed process with those of traditional pyrolysis conducted under N2. Simulation results show that pyrolyzing 500 kg/h of FNW produced 2.933 kmol/h of high-purity H2 under N2 conditions and 3.605 kmol/h under CO2 conditions. Pyrolysis under CO2 improved CO production, increasing H2 output. Additionally, CO2 emissions were reduced by 89.8% compared to N2 conditions due to the capture and utilization of CO2 released during the process. Therefore, the proposed CO2-based process can efficiently recycle FNW and generate eco-friendly hydrogen.
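The reported molar flows can be converted into mass yields with a short back-of-the-envelope calculation (using the abstract's simulation figures and the standard H2 molar mass of 2.016 kg/kmol):

```python
# Convert the reported H2 molar flows into mass flows and a yield per kg
# of fishing-net waste (FNW), using the abstract's simulation figures.
M_H2 = 2.016        # kg/kmol, molar mass of H2
fnw_feed = 500.0    # kg/h of FNW fed to the pyrolyzer

h2_n2  = 2.933 * M_H2   # kg/h of H2 under N2 pyrolysis
h2_co2 = 3.605 * M_H2   # kg/h of H2 under CO2 pyrolysis

print(round(h2_co2, 2))                          # ~7.27 kg/h of H2
print(round(h2_co2 / fnw_feed * 1000, 2))        # ~14.54 g H2 per kg FNW
print(round((3.605 - 2.933) / 2.933 * 100, 1))   # ~22.9% more H2 under CO2
```

So the CO2 atmosphere raises hydrogen output by roughly 23% at the same 500 kg/h feed, consistent with the abstract's claim that the extra CO formed under CO2 boosts the WGS hydrogen yield.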

Establishment of Test Conditions and Interlaboratory Comparison Study of Neuro-2a Assay for Saxitoxin Detection (Saxitoxin 검출을 위한 Neuro-2a 시험법 조건 확립 및 실험실 간 변동성 비교 연구)

  • Youngjin Kim;Jooree Seo;Jun Kim;Jeong-In Park;Jong Hee Kim;Hyun Park;Young-Seok Han;Youn-Jung Kim
    • Journal of Marine Life Science
    • /
    • v.9 no.1
    • /
    • pp.9-21
    • /
    • 2024
  • Paralytic shellfish poisoning (PSP), caused by toxins including saxitoxin (STX), originates from harmful algae, and poisoning occurs when contaminated seafood is consumed. The mouse bioassay (MBA), a standard test method for detecting PSP, is being phased out in many countries due to its limited sensitivity and animal welfare concerns. An alternative to the MBA is the Neuro-2a cell-based assay. This study aimed to establish test conditions for the Neuro-2a assay, including cell density, culture conditions, and STX treatment conditions, suited to the domestic laboratory environment. As a result, the initial cell density was set to 40,000 cells/well and the incubation time to 24 hours. Additionally, the concentration of ouabain and veratridine (O/V) was set to 500/50 μM, at which most cells died. We identified eight concentrations of STX, ranging from 368 to 47,056 fg/μl, that produced an S-shaped dose-response curve when cells were treated with O/V. Through inter-laboratory comparison of the Neuro-2a assay, we established five Quality Control Criteria to verify the appropriateness of the experiments and six Data Criteria (top and bottom OD, EC50, EC20, Hill slope, and R2 of the curve) to determine the reliability of the experimental data. The Neuro-2a assay conducted under the established conditions showed an EC50 of approximately 1,800~3,500 fg/μl. The intra- and inter-laboratory comparisons showed coefficients of variation (CVs) for the Quality Control and Data values ranging from 1.98% to 29.15%, confirming the reproducibility of the experiments. This study presented Quality Control Criteria and Data Criteria to assess the appropriateness of the experiments and confirmed the excellent repeatability and reproducibility of the Neuro-2a assay.
To apply the Neuro-2a assay as an alternative method for detecting PSP in domestic seafood, it is essential to establish toxin extraction and quantification methods for seafood and to perform correlation analyses with the MBA and instrumental methods.
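The S-shaped dose-response curve mentioned above is conventionally modeled with a four-parameter logistic (Hill) function; a minimal sketch follows, with invented parameter values (the study's fitted top, bottom, EC50, and Hill slope will differ):

```python
# Four-parameter logistic (Hill) dose-response curve, as used for
# Neuro-2a viability data, and the inverse that recovers ECf (the
# concentration producing a fraction f of the maximal effect).
# Parameter values below are invented for illustration.

def four_pl(x, top, bottom, ec50, hill):
    """Viability at concentration x; decreases from `top` to `bottom`."""
    return bottom + (top - bottom) / (1.0 + (x / ec50) ** hill)

def ec_fraction(f, ec50, hill):
    """Concentration at which a fraction f of the maximal effect occurs."""
    return ec50 * (f / (1.0 - f)) ** (1.0 / hill)

top, bottom, ec50, hill = 1.0, 0.05, 2500.0, 1.8   # ec50 in fg/ul (invented)

print(four_pl(ec50, top, bottom, ec50, hill))  # midpoint: (top + bottom) / 2
print(ec_fraction(0.5, ec50, hill))            # 2500.0, the EC50 by construction
print(ec_fraction(0.2, ec50, hill))            # EC20, below the EC50
```

The Data Criteria in the abstract (top and bottom OD, EC50, EC20, Hill slope, R2) are exactly the quantities such a fit yields, which is why between-laboratory CVs are computed on them.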

A Study on the Digital Drawing of Archaeological Relics Using Open-Source Software (오픈소스 소프트웨어를 활용한 고고 유물의 디지털 실측 연구)

  • LEE Hosun;AHN Hyoungki
    • Korean Journal of Heritage: History & Science
    • /
    • v.57 no.1
    • /
    • pp.82-108
    • /
    • 2024
  • With the transition of archaeological recording methods from analog to digital, 3D scanning technology has been actively adopted in the field. Research on digital archaeological data gathered from 3D scanning and photogrammetry is continuously being conducted. However, due to cost and manpower issues, most buried cultural heritage organizations hesitate to adopt such digital technology. This paper presents a digital recording method for relics utilizing open-source software and photogrammetry, which is believed to be the most efficient of the 3D scanning approaches. The digital recording process consists of three stages: acquiring a 3D model, creating a joining map with the edited 3D model, and creating a digital drawing. To enhance accessibility, the method uses only open-source software throughout the entire process. The results confirm that, in quantitative evaluation, the deviation in numerical measurements between the actual artifact and the 3D model was minimal. In addition, quantitative quality analyses from the open-source and commercial software showed high similarity. However, data processing was overwhelmingly faster in the commercial software, presumably the result of higher computational speed from improved algorithms. In qualitative evaluation, some differences in mesh and texture quality occurred. The 3D models generated by open-source software exhibited noise on the mesh surface, harsh mesh surfaces, and difficulty in confirming the production marks and expressed patterns of relics. Nevertheless, some of the open-source software generated quality comparable to that of commercial software in both quantitative and qualitative evaluations.
Open-source 3D model editing software was able not only to post-process, match, and merge 3D models, but also to adjust scale, produce joining surfaces, and render the images necessary for the actual measurement of relics. The final drawing was traced in a CAD program that is also open-source. In archaeological research, photogrammetry is applicable to various processes, including excavation, report writing, and research on numerical data from 3D models. With the breakthrough development of computer vision, the types of open-source software have diversified and their performance has significantly improved. With such accessible digital technology, the acquisition of 3D model data in archaeology can serve as basic data for the preservation and active study of cultural heritage.

Air Pollution and Its Effects on E.N.T. Field (대기오염과 이비인후과)

  • Park, In-Yong (박인용)
    • Proceedings of the KOR-BRONCHOESO Conference
    • /
    • 1972.03a
    • /
    • pp.6-7
    • /
    • 1972
  • Air pollutants can be classified into irritant gases and asphyxiant gases, and the irritant gases are closely related to otorhinolaryngological diseases. The common irritant gases are nitrogen oxides, sulfur oxides, hydrocarbon compounds, and the potent and irritating PAN (peroxyacyl nitrate), which is liberated secondarily from photochemical reactions. These gases adhere to the mucous membrane, resulting in ulceration and secondary infection due to their potent oxidizing power. 1. Sulfur dioxide gas: Sulfur dioxide has the typical characteristics of an air pollutant. Because of its high solubility it is easily absorbed in the respiratory tract, where the symptoms and signs of irritation manifest initially; later it brings about pulmonary edema and respiratory paralysis of central origin. Chronic exposure to the gas leads to rhinitis, pharyngitis, laryngitis, and olfactory or gustatory disturbances. 2. Carbon monoxide: The toxicity of carbon monoxide is due to its deprivation of the oxygen-carrying capacity of hemoglobin. The degree of carbon monoxide intoxication varies with its concentration and the duration of inhalation. It starts with headache, vertigo, nausea, vomiting, and tinnitus, which can progress to respiratory difficulty, muscular laxity, syncope, and coma leading to death. 3. Nitrogen dioxide: Nitrogen dioxide causes respiratory disturbances by the formation of methemoglobin. In acute poisoning, it can cause pulmonary congestion, pulmonary edema, bronchitis, and pneumonia due to its strong irritation of the eyes and nose. In chronic poisoning, it causes chronic pulmonary fibrosis and pulmonary edema. 4. Ozone: Ozone has an offensive, irritating odor and causes dryness of the nasopharyngolaryngeal mucosa, headache, and depressed pulmonary function, which may eventually lead to pulmonary congestion or edema. 5. Smog: The most outstanding smog incident occurred in London from December 5 through 8, 1952, during which mortality from respiratory diseases increased fourfold. That smog was attributed to the smoke produced by incomplete combustion and its byproduct, the sulfur oxides, with dust thought to play a secondary role. In the modern sense, the hazard is photochemical smog, produced by the combination of light energy with the hydrocarbons and oxidants in the air. The Yonsei University Institute for Environmental Pollution Research launched a project to determine the relationship between pollution and medical, ophthalmological, and rhinopharyngological disorders. The students (469) of the "S" Technical School in the most heavily polluted area of Pusan (Uham Dong district) were compared with those (345) of "K" High School in a less polluted area. The investigated group had twice as many students with subjective symptoms as the control group: 22.6% (106) versus 11.3% (39). Among the symptomatic students of the investigated group, there were 29 with respiratory symptoms (29%), 22 with eye symptoms (21%), 50 with stuffy nose and rhinorrhea (47%), and 5 with sore throat (5%), revealing that more than half of these students (52%) had subjective symptoms of a rhinopharyngological nature. Physical examination revealed that the investigated group had about 10% more students with signs than the control group: 180 (38.4%) versus 99 (28.8%). Among the 180 students with signs in the investigated group, there were 80 with eye diseases (44%), 1 with a respiratory disease (0.6%), 97 with rhinitis (54%), and 74 with pharyngotonsillitis (41%), which means that 95% of them had rhinopharyngological diseases.
These data reveal that otolaryngological diseases are conspicuously more numerous in the heavily polluted area, that there must be a very close relationship between air pollution and otolaryngological diseases, and that anti-pollution measures are urgently needed.


A Study of 'Emotion Trigger' by Text Mining Techniques (텍스트 마이닝을 이용한 감정 유발 요인 'Emotion Trigger'에 관한 연구)

  • An, Juyoung;Bae, Junghwan;Han, Namgi;Song, Min
    • Journal of Intelligence and Information Systems
    • /
    • v.21 no.2
    • /
    • pp.69-92
    • /
    • 2015
  • The explosion of social media data has led to the application of text-mining techniques to analyze big social media data in a more rigorous manner. Even as social media text analysis algorithms have improved, previous approaches have some limitations. In the field of sentiment analysis of social media written in Korean, there are two typical approaches. One is the linguistic approach using machine learning, which is the most common; some studies have added grammatical factors to the feature sets used for training classification models. The other adopts semantic analysis for sentiment analysis, but this approach has mainly been applied to English texts. To overcome these limitations, this study applies the Word2Vec algorithm, an extension of neural network algorithms, to capture the more extensive semantic features that were underestimated in existing sentiment analysis. The result of adopting the Word2Vec algorithm is compared with the result of co-occurrence analysis to identify the difference between the two approaches. The results show that the Word2Vec algorithm extracts three times more related words expressing emotion about the keyword than co-occurrence analysis does. The difference arises from Word2Vec's vectorization of semantic features. Therefore, the Word2Vec algorithm is able to catch hidden related words that have not been found by traditional analysis. In addition, Part-of-Speech (POS) tagging for Korean is used to detect adjectives as "emotional words." The emotional words extracted from the text are then converted into word vectors by the Word2Vec algorithm to find related words, among which nouns are selected because each would have a causal relationship with the "emotional word" in the sentence.
The process of extracting these trigger factors of emotional words is named "Emotion Trigger" in this study. As a case study, the datasets were collected by searching with three keywords: professor, prosecutor, and doctor, chosen because they carry rich public emotion and opinion. Preliminary data collection was conducted to select secondary keywords for data gathering. The secondary keywords used for each keyword are as follows: professor (sexual assault, misappropriation of research money, recruitment irregularities, polifessor); doctor (Shin Hae-chul Sky Hospital, drinking and plastic surgery, rebate); prosecutor (lewd behavior, sponsor). The size of the text data is about 100,000 documents (professor: 25,720; doctor: 35,110; prosecutor: 43,225), gathered from news, blogs, and Twitter to reflect various levels of public emotion. For visualization, Gephi (http://gephi.github.io) was used, and all programs for text processing and analysis were written in Java. The contributions of this study are as follows. First, different approaches to sentiment analysis are integrated to overcome the limitations of existing approaches. Second, finding Emotion Triggers can detect hidden connections to public emotion that existing methods cannot. Finally, the approach could be generalized regardless of the type of text data. The limitation of this study is that it is hard to establish that a word extracted by Emotion Trigger processing has a statistically significant causal relationship with the emotional word in a sentence. Future work will clarify the causal relationship between emotional words and the words extracted by Emotion Trigger through comparison with manually tagged relationships. Furthermore, the text data used in Emotion Trigger include Twitter data, which has a number of distinct features that we did not deal with in this study; these features will be considered in further study.
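The related-word lookup at the core of the Emotion Trigger procedure can be sketched as a cosine-similarity search over word vectors (a pure-Python toy with invented 3-dimensional vectors standing in for trained Word2Vec embeddings; the study itself trained Word2Vec on Korean text and used POS tags to pick adjectives and nouns):

```python
# Toy version of the Emotion Trigger lookup: given the vector of an
# "emotional word" (adjective), rank candidate trigger nouns by cosine
# similarity. The 3-d vectors below are invented; real Word2Vec vectors
# are typically 100-300 dimensions learned from a corpus.
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

vectors = {
    "angry":    (0.9, 0.1, 0.0),   # emotional word (adjective)
    "scandal":  (0.8, 0.2, 0.1),   # candidate trigger nouns
    "lecture":  (0.1, 0.9, 0.2),
    "hospital": (0.2, 0.1, 0.9),
}

def related_nouns(emotion_word, nouns, k=2):
    """Top-k nouns most similar to the emotional word's vector."""
    target = vectors[emotion_word]
    return sorted(nouns, key=lambda n: cosine(vectors[n], target), reverse=True)[:k]

print(related_nouns("angry", ["scandal", "lecture", "hospital"]))
# "scandal" ranks first: its vector points the same way as "angry"
```

In the study's terms, the top-ranked nouns are the candidate "Emotion Triggers" for the emotional word; co-occurrence analysis would instead rank nouns by raw co-appearance counts.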

Relationships between Micronutrient Contents in Soils and Crops of Plastic Film House (시설재배 토양과 작물 잎 중의 미량원소 함량 관계)

  • Chung, Jong-Bae;Kim, Bok-Jin;Ryu, Kwan-Sig;Lee, Seung-Ho;Shin, Hyun-Jin;Hwang, Tae-Kyung;Choi, Hee-Youl;Lee, Yong-Woo;Lee, Yoon-Jeong;Kim, Jong-Jib
    • Korean Journal of Environmental Agriculture
    • /
    • v.25 no.3
    • /
    • pp.217-227
    • /
    • 2006
  • Micronutrient status in soils and crops of plastic film houses and their relationships were investigated. A total of 203 plastic film houses were selected in the Yeongnam region (red pepper, 66; cucumber, 63; tomato, 74), and soil and leaf samples were collected. Hot-water-extractable B and 0.1 N HCl-extractable Cu, Zn, Fe, and Mn in the soil samples, and total micronutrients in the leaf samples, were analyzed. Contents of Zn, Fe, and Mn in most of the investigated soils were higher than the upper limits of the optimum range for general crop cultivation. Contents of Cu in most soils under cucumber and tomato cultivation were higher than the upper limit of the optimum range, but Cu contents in about 30% of red pepper cultivation soils were below the sufficient level. Contents of B in most soils under cucumber and tomato were above the sufficient level, but B was found to be deficient in 48% of red pepper cultivation soils. Micronutrient contents in the leaves of the investigated crops were highly variable. Contents of B, Fe, and Mn were mostly within the sufficient range, while Cu was below the deficient level in 71% of red pepper samples and above the upper sufficient limit in 44% of cucumber samples. Contents of Zn in red pepper and cucumber samples were mostly within the sufficient range, but Zn contents in 62% of tomato samples were below the deficient level. However, no visible deficiency or toxicity symptoms of micronutrients were found in the crops. No consistent relationships were found between micronutrient contents in soil and leaf, indicating that root growth and absorption activity, and interactions among nutrients in the soil, may be important factors in the overall micronutrient uptake of crops. For the best management of micronutrients in plastic film houses, attention should be focused on managing the soil and plant characteristics that control micronutrient uptake.

Performance analysis of Frequent Itemset Mining Technique based on Transaction Weight Constraints (트랜잭션 가중치 기반의 빈발 아이템셋 마이닝 기법의 성능분석)

  • Yun, Unil;Pyun, Gwangbum
    • Journal of Internet Computing and Services
    • /
    • v.16 no.1
    • /
    • pp.67-74
    • /
    • 2015
  • In recent years, frequent itemset mining that considers the importance of each item has been intensively studied as one of the important issues in the data mining field. According to the strategies utilizing item importance, itemset mining approaches for discovering itemsets based on item importance are classified as follows: weighted frequent itemset mining, frequent itemset mining using transactional weights, and utility itemset mining. In this paper, we perform an empirical analysis of frequent itemset mining algorithms based on transactional weights. The mining algorithms compute transactional weights by utilizing the weight of each item in large databases. In addition, these algorithms discover weighted frequent itemsets on the basis of item frequency and the weight of each transaction. Consequently, we can see the importance of a certain transaction through database analysis, because a transaction's weight is higher if it contains many items with high weights. We not only analyze the advantages and disadvantages but also compare the performance of the best-known algorithms in the frequent itemset mining field based on transactional weights. As a representative of frequent itemset mining using transactional weights, WIS introduced the concept and strategies of transactional weights. In addition, there are various other state-of-the-art algorithms, WIT-FWIs, WIT-FWIs-MODIFY, and WIT-FWIs-DIFF, for extracting itemsets with weight information. To efficiently mine weighted frequent itemsets, these three algorithms use a special lattice-like data structure called the WIT-tree. The algorithms do not need an additional database scan after construction of the WIT-tree is finished, since each node of the WIT-tree holds item information such as the item and its transaction IDs.
In particular, the traditional algorithms perform many database scans to mine weighted itemsets, whereas the WIT-tree-based algorithms avoid this overhead by reading the database only once. Additionally, the algorithms generate each new itemset of length N+1 from two different itemsets of length N. To discover new weighted itemsets, WIT-FWIs combines itemsets using the information of the transactions that contain them. WIT-FWIs-MODIFY has a unique feature that decreases the operations required to calculate the frequency of a new itemset. WIT-FWIs-DIFF utilizes a technique based on the difference of two itemsets. To compare and analyze the performance of the algorithms in various environments, we use real datasets of two types (dense and sparse) and measure runtime and maximum memory usage. Moreover, a scalability test is conducted to evaluate the stability of each algorithm as the database size changes. As a result, WIT-FWIs and WIT-FWIs-MODIFY show the best performance on the dense dataset, while on the sparse dataset WIT-FWIs-DIFF has better mining efficiency than the other algorithms. Compared to the WIT-tree-based algorithms, WIS, based on the Apriori technique, has the worst efficiency because it requires far more computations than the others on average.
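The transaction-weight idea itself can be sketched directly (a naive pure-Python enumeration rather than the WIT-tree structure the compared algorithms use; the item weights and transactions are invented):

```python
# Naive weighted frequent itemset mining with transaction weights:
# each transaction's weight is the mean of its item weights, and an
# itemset's weighted support is the summed weight of the transactions
# containing it. (The WIT-tree algorithms achieve the same result with
# a single database scan; this sketch rescans for clarity.)
from itertools import combinations

item_weight = {"a": 0.9, "b": 0.6, "c": 0.3, "d": 0.8}
transactions = [{"a", "b"}, {"a", "c"}, {"a", "b", "d"}, {"b", "c", "d"}]

def t_weight(tx):
    """Transaction weight = mean weight of the items it contains."""
    return sum(item_weight[i] for i in tx) / len(tx)

def weighted_support(itemset):
    """Summed weight of the transactions containing the itemset."""
    s = set(itemset)
    return sum(t_weight(tx) for tx in transactions if s <= tx)

def mine(min_wsup):
    """Enumerate all itemsets whose weighted support meets the threshold."""
    items = sorted(item_weight)
    found = []
    for size in range(1, len(items) + 1):
        for cand in combinations(items, size):
            if weighted_support(cand) >= min_wsup:
                found.append(cand)
    return found

print(mine(min_wsup=1.4))
```

Transactions full of high-weight items (here those containing "a" and "d") contribute more support, which is exactly how these algorithms surface the "important" transactions the abstract describes.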

Characteristics of Everyday Movement Represented in Steve Paxton's Works: Focused on Satisfyin' Lover, Bound, and Contact at 10th & 2nd (스티브 팩스톤(Steve Paxton)의 작품에서 나타난 일상적 움직임의 특성에 관한 연구: 〈Satisfyin' Lover〉, 〈Bound〉, 〈Contact at 10th & 2nd〉를 중심으로)

  • KIM, Hyunhee
    • Trans-
    • /
    • v.3
    • /
    • pp.109-135
    • /
    • 2017
  • The purpose of this thesis is to analyze the characteristics of everyday movement shown in the performances of Steve Paxton. For a long time, works of art were treated as special objects enjoyed by the upper classes as high culture. A wide gap therefore existed between everyday life and art, and the emergence of everyday elements in works of art signals a change in public awareness accompanying social change. Postmodernism, the period in which the boundary between art and everyday life became uncertain, arose in the postwar society after the Second World War, amid a social situation rapidly changing into a capitalist one. The changes of this time led scholars to approach concepts of everyday life academically, and affected artists through the pluralistic postmodern spirit of the times that refused totality. In the same period, modern dance also faced a turning point with post-modern dance. After the Second World War, modern dance began to be seen as having reached its limit, and at this juncture a new current emerged, headed by dancers including those of the Judson Dance Theatre. Having danced in Merce Cunningham's company, Steve Paxton, one of the founders of the Judson Dance Theatre, was critical of the social structure of dance companies and of the process by which movement is made. This thinking appeared in his early performances as an attempt to realize everyday motion itself on stage. His early work, represented by the walking motion, attracted attention as a simple motion that excludes all the artful elements of existing dance performances and can be performed by a person who is not a dancer. Although the adoption of everyday movement is regarded as a defining open characteristic of post-modern dance, prior research on it was rare, which motivated this study. In addition, existing studies of Steve Paxton are skewed toward Contact Improvisation, of which he was an active practitioner.
This study examines his use of ordinary movement before he concentrated on Contact Improvisation, and treats Contact Improvisation as one of several attempts that followed the beginning of his performance work. It therefore analyzes Paxton's performances Satisfyin' Lover, Contact at 10th & 2nd, and Bound, and on this basis draws out their everyday characteristics. Related books, academic essays, dance articles, and reviews were consulted to consider concepts of everyday life and to understand the dance-historical movement of post-modern dance. Paxton attracted attention for an activity that began as a critical approach to the movement of existing modern dance. Performed by walkers who were not dancers, the walking motion in Satisfyin' Lover gave aesthetic meaning to everyday movement. He was later influenced by Eastern thought and developed Contact Improvisation, generating movement through the energy of natural laws. He also brought everyday objects into his performances and used mundane movement and impromptu gestures arising from a relaxed body to deliver various images. The everyday movement in his performances represents a change in awareness of traditionally maintained dance performance, including a change in the dance genre itself. His unprecedented, experimental activity should be highly evaluated as an effort to overcome the limits of modern dance.
