• Title/Summary/Keyword: application case


Detection of Surface Changes by the 6th North Korea Nuclear Test Using High-resolution Satellite Imagery (고해상도 위성영상을 활용한 북한 6차 핵실험 이후 지표변화 관측)

  • Lee, Won-Jin;Sun, Jongsun;Jung, Hyung-Sup;Park, Sun-Cheon;Lee, Duk Kee;Oh, Kwan-Young
    • Korean Journal of Remote Sensing / v.34 no.6_4 / pp.1479-1488 / 2018
  • On September 3, 2017, strong artificial seismic signals from North Korea were detected by the KMA (Korea Meteorological Administration) seismic network. The epicenter was estimated to be at the Punggye-ri nuclear test site, and the event was the most powerful to date. The event could not be studied well on the ground because of restricted access and the absence of in-situ geodetic measurements, so we used remote sensing data to analyze surface changes around the Mt. Mantap area. First, we attempted to detect surface deformation with the InSAR method using Advanced Land Observing Satellite-2 (ALOS-2) data. Although ALOS-2 operates at the long L-band wavelength, InSAR did not work well in this case because of decorrelation in the interferogram, most likely caused by the large deformation near Mt. Mantap. To overcome this decorrelation, we applied the offset tracking method to measure the deformation. Because this method is sensitive to the window kernel size, we applied window sizes from 32 to 224 in steps of 16. We retrieved two-dimensional surface deformation of up to about 3 m on the west side of Mt. Mantap. Second, we used Pleiades-A/B high-resolution optical satellite images acquired before and after the 6th nuclear test, and detected widespread surface damage around the summit of Mt. Mantap, such as landslides and a suspected collapse area. This damage may have been caused by a very strong underground nuclear explosion. These results show that high-resolution satellite imagery can be used to analyze inaccessible areas.
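A minimal sketch of the offset-tracking step with varying window sizes, using synthetic images and scikit-image's phase cross-correlation as a stand-in for the SAR amplitude matching actually used (the study's exact matching implementation is not specified here):

```python
import numpy as np
from skimage.registration import phase_cross_correlation

# Synthetic pre-/post-event amplitude images with a known 3-pixel shift;
# illustrative only (the study used ALOS-2 SAR amplitude data).
rng = np.random.default_rng(0)
pre = rng.random((512, 512))
post = np.roll(pre, shift=-3, axis=1)   # simulate ~3 px of east-west displacement

row, col = 256, 256                     # point of interest
for win in range(32, 225, 16):          # window kernel sizes 32, 48, ..., 224
    half = win // 2
    ref = pre[row - half:row + half, col - half:col + half]
    mov = post[row - half:row + half, col - half:col + half]
    shift, error, _ = phase_cross_correlation(ref, mov, upsample_factor=10)
    print(f"window {win:3d}: estimated offset (row, col) = ({shift[0]:+.2f}, {shift[1]:+.2f}) px")
```

Repeating the estimate over a grid of points, rather than a single point as here, yields the two-dimensional offset field from which a deformation map can be built.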

Analysis of Commercial Organic Compost Manufactured with Livestock Manure (국내 유통중인 가축분퇴비의 품질 특성)

  • Kim, Myung-Sook;Kim, Seok-Cheol;Park, Seong-Jin;Lee, Chang-Hoon
    • Journal of the Korea Organic Resources Recycling Association / v.26 no.4 / pp.21-29 / 2018
  • The contents of total nitrogen (T-N), phosphate ($T-P_2O_5$), and potash ($T-K_2O$) are important factors in determining the application rate of livestock compost so as to prevent nutrient accumulation and maintain appropriate nutrient levels in arable land. The concentrations of nutrients, organic matter, salt, water, and heavy metals in commercially distributed livestock composts were investigated in 659 samples collected from 2016 to 2017. To investigate the fluctuation in nutrient content among livestock composts sold under the same product name, 19 samples were collected and their T-N, $T-P_2O_5$, and $T-K_2O$ concentrations were analyzed over the two years. The mean levels of T-N, $T-P_2O_5$, and $T-K_2O$ in livestock composts from 2016 to 2017 were 1.73%, 1.88%, and 1.66%, respectively. The average contents of organic matter, water, and salt were 38.9%, 40.9%, and 1.2%, respectively. The maximum concentrations of Cr, Ni, Cu, and Zn in some livestock composts exceeded the criteria of the official standard for commercial fertilizer. The maximum coefficients of variation of the T-N, $T-P_2O_5$, and $T-K_2O$ contents of livestock composts were 24%, 27%, and 50%, respectively. To manage nutrients in agricultural soils, it would be reasonable to recommend an error range, expressed as a coefficient of variation, of about 27% for the T-N and $T-P_2O_5$ contents declared on livestock compost labels.
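The coefficient of variation used above is simply the standard deviation divided by the mean. A minimal sketch with made-up measurements (not the paper's data):

```python
import numpy as np

# Hypothetical T-N, T-P2O5, T-K2O contents (%) measured for one compost
# product over repeated samplings; illustrative values only.
nutrients = {
    "T-N":    np.array([1.60, 1.80, 1.70, 1.90, 1.75]),
    "T-P2O5": np.array([1.90, 2.10, 1.70, 1.85, 2.00]),
    "T-K2O":  np.array([1.20, 1.90, 1.50, 2.20, 1.60]),
}

for name, values in nutrients.items():
    mean = values.mean()
    cv = values.std(ddof=1) / mean * 100  # coefficient of variation in %
    print(f"{name}: mean = {mean:.2f}%, CV = {cv:.1f}%")
```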

Technological Diversities Observed in Bronze Objects of the Late Goryo Period - Case Study on the Bronze Bowls Excavated from the Burial Complex at Deobu-gol in Goyang - (고려 말 청동용기에 적용된 제작기술의 다양성 연구 - 고양 더부골 고분군 출토 청동용기를 중심으로 -)

  • Jeon, Ik Hwan;Lee, Jae Sung;Park, Jang Sik
    • Korean Journal of Heritage: History & Science / v.46 no.1 / pp.208-227 / 2013
  • Twenty-seven bronze bowls excavated from the Goryo burial complex at Deobu-gol were examined for their microstructure and chemical composition to characterize the bronze technology practiced by commoners at the time. The results showed that the objects can be classified into four groups: 1) objects forged from Cu alloys of nearly 22% Sn and then quenched; 2) objects cast from Cu alloys of below 10% Sn containing lead; 3) objects cast from Cu alloys of 10%~20% Sn containing lead and then quenched; and 4) objects forged from Cu alloys of 10~20% Sn containing lead and then quenched. The study revealed that the fabrication technique, as determined by alloy composition, plays an important role in bronze technology. The use of lead was clearly associated with the selection of quenching temperature, the character of inclusions, and the color characteristics of bronze surfaces. Objects containing lead were quenched at temperatures of $520^{\circ}{\sim}586^{\circ}C$, while those without lead were quenched in the range of $586^{\circ}{\sim}799^{\circ}C$. Selenium in impurity inclusions was detected only in alloys containing lead, suggesting that the raw materials, Cu and Sn, used for the lead-free alloys of the first group were carefully selected from those smelted from ores without lead contamination. Furthermore, the addition of lead was found to have a significant effect on the surface color of bronze alloys subjected to corrosion during interment: in leaded alloys, corrosion turns the surface light green or dark green, while in unleaded alloys it turns the surface dark brown or black. The wall thickness of the bronze bowls also varies with the application of quenching; most quenched objects have walls 1 mm thick or thinner, while unquenched objects have walls 1 mm thick or thicker. Fabrication techniques in bronze making usually reflect the social environment of a community. It is likely that in the late Goryo period, which suffered a lack of skilled bronze workers, the increased demand for bronze was met in two ways: by using cheap lead instead of expensive tin, and by using casting, which is suitable for mass production. These results show that Goryo bronze workers tried to overcome such a resource-limited environment through technological innovation, as is apparent in the different fabrication techniques applied to different alloys. Numerous bronze objects have recently been excavated and are available for investigation. This study shows that, with proper analytical techniques, they can serve as a valuable source of information for characterizing the associated technology as well as the social environment that led to its establishment.

Long-Term Trend Analysis in Nuclear Medicine Examinations (핵의학 영상 검사의 중장기 추세 분석 - 서울 소재 일개 상급 종합병원을 중심으로 -)

  • Jung, Woo-Young;Shim, Dong-Oh;Choi, Jae-Min
    • The Korean Journal of Nuclear Medicine Technology / v.23 no.1 / pp.15-28 / 2019
  • Purpose: Nuclear medicine was first introduced in Korea in 1969 and was widely applied to the treatment of hyperthyroidism with $^{131}I$. The gamma camera was also first adopted in 1969, and its applications have grown continually in many areas. We analyzed long-term trends in nuclear medicine examinations over the last two decades, with the aim of making predictions and setting plans and directions for the development of nuclear medicine. Materials and Methods: We analyzed the nuclear medicine examinations and therapies performed at Asan Medical Center from 1998 to 2017. Results: Over the last 20 years, bone scans, renal scans, MUGA scans, $^{18}F$-FPCIT imaging, and bone mineral density examinations increased, while myocardial perfusion SPECT, thyroid scans, and lung scans decreased and $^{18}F$-FDG PET remained steady. Therapy volumes grew until 2010 but have remained flat since the controversy over excessive thyroid screening. Key events such as the medical strike (2000) and the Middle East Respiratory Syndrome outbreak (2015) also influenced overall volumes. Conclusion: To promote long-term growth in nuclear medicine examinations and therapy, it is essential to respond to changes in the current medical environment. Furthermore, continued efforts to maintain existing examinations and to develop new examinations and clinical indications are strongly recommended.

A Study on Seeking a Multilateral Cooperation Framework for the Inter-Korean Exchange of Intangible Cultural Heritage - Through a Multinational Nomination of a Representative List of Intangible Cultural Heritage of Humanity - (남북 무형유산 교류 협력의 다자간 협력 틀 모색 - 유네스코 인류무형문화유산 남북 공동 등재 사례 -)

  • Kim, Deoksoon
    • Korean Journal of Heritage: History & Science / v.52 no.3 / pp.252-269 / 2019
  • Since the inauguration of the Kim Jong-un regime in 2012, the system for safeguarding and managing cultural heritage in the Democratic People's Republic of Korea (DPRK) has been changing into a form similar to the legal systems of democratic countries. In addition, the National Authority for the Protection of Cultural Heritage (NAPCH) has continuously recorded and cataloged intangible cultural heritage elements in the DPRK, listing Arirang, kimchi-making, and ssireum on the UNESCO Representative List of the Intangible Cultural Heritage of Humanity. In particular, the multinational nomination of ssireum in October 2018 was symbolic in terms of inter-Korean exchange and cooperation for peace and reconciliation, raising expectations for further multinational nominations of the two Koreas' intangible cultural heritage. Currently, South Korea has 20 items on the Representative List, three of which are shared with other countries through multinational nominations: falconry, tug-of-war, and ssireum. However, a comparison of the nomination processes for these three elements raises the question of whether they truly reflect the nature of multinational nomination. In particular, in the case of ssireum, the two Koreas each applied for a single nomination without any working-level consultation on preparing a joint application; the applications were exceptionally approved as a multinational nomination by the Intergovernmental Committee under the leadership of the Director-General of UNESCO, and no bilateral exchanges have taken place since. The result is symbolic and formal, and in substance it remains similar to individual listings when measured against the spirit of joint listing premised on mutual exchange and cooperation. Therefore, the only way to strengthen the effectiveness of multinational nominations between the two Koreas and to honor their spirit is to pursue multilateral joint nominations that include both Koreas. To this end, the Korean government needs a strategic approach, such as identifying elements suitable for multilateral joint listing; accumulating expertise, capability, and experience as a leading country in multilateral joint listing; and building cooperative governance with stakeholders. In addition, to reduce the volatility of inter-Korean cultural exchange and cooperation caused by political situations and the special nature of inter-Korean relations, measures should be taken to pursue inter-Korean cultural heritage exchange and cooperation within a multilateral cooperation framework through UNESCO, an international organization.

Multi-Vector Document Embedding Using Semantic Decomposition of Complex Documents (복합 문서의 의미적 분해를 통한 다중 벡터 문서 임베딩 방법론)

  • Park, Jongin;Kim, Namgyu
    • Journal of Intelligence and Information Systems / v.25 no.3 / pp.19-41 / 2019
  • In line with the rapidly increasing demand for text data analysis, research and investment in text mining are being actively pursued not only in academia but also in various industries. Text mining is generally conducted in two steps. In the first step, the text of the collected documents is tokenized and structured to convert the original documents into a computer-readable form. In the second step, tasks such as document classification, clustering, and topic modeling are conducted according to the purpose of the analysis. Until recently, text mining studies have focused on the second step. However, with the recognition that the text structuring process substantially influences the quality of the analysis, various embedding methods have been actively studied to improve results by preserving the meaning of words and documents when representing text data as vectors. Unlike structured data, which can be fed directly into a variety of operations and traditional analysis techniques, unstructured text must first be structured into a form the computer can understand. "Embedding" refers to mapping arbitrary objects into a space of a given dimension while preserving algebraic properties, and it is used to structure text data. Recently, attempts have been made to embed not only words but also sentences, paragraphs, and entire documents. In particular, as the demand for document embedding grows rapidly, many algorithms have been developed to support it. Among them, doc2Vec, which extends word2Vec and embeds each document into a single vector, is the most widely used. However, traditional document embedding methods such as doc2Vec generate a vector for each document from all of the words it contains, so the document vector is affected not only by core words but also by miscellaneous words. In addition, traditional schemes usually map each document to a single vector, which makes it difficult to accurately represent a complex document covering multiple subjects. In this paper, we propose a new multi-vector document embedding method to overcome these limitations. This study targets documents that explicitly separate body content and keywords; for documents without keywords, the method can be applied after keywords are extracted through various analysis methods, but since keyword extraction is not the core subject of the proposed method, we describe the process for documents whose keywords are predefined in the text. The proposed method consists of (1) parsing, (2) word embedding, (3) keyword vector extraction, (4) keyword clustering, and (5) multiple-vector generation. The specific process is as follows: all text in a document is tokenized, and each token is represented as an N-dimensional real-valued vector through word embedding. Then, to overcome the limitation that traditional embeddings are influenced by miscellaneous words as well as core words, the vectors corresponding to each document's keywords are extracted to form a keyword vector set for that document. Next, clustering is conducted on each document's keyword vector set to identify the multiple subjects included in the document. Finally, a multi-vector representation is generated from the keyword vectors constituting each cluster. Experiments on 3,147 academic papers revealed that the single-vector traditional approach cannot properly map complex documents because of interference among subjects within each vector. With the proposed multi-vector method, we found that complex documents can be vectorized more accurately by eliminating this interference.
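A minimal sketch of steps (2) through (5) under stated assumptions: gensim's Word2Vec and scikit-learn's KMeans stand in for the paper's specific embedding and clustering choices, and the toy documents, keywords, and hyperparameters are illustrative:

```python
import numpy as np
from gensim.models import Word2Vec
from sklearn.cluster import KMeans

# Toy corpus: each document is a token list plus predefined keywords
# (illustrative only; the paper used 3,147 academic papers).
docs = [
    {"tokens": ["deep", "learning", "image", "network", "finance", "stock", "market"],
     "keywords": ["learning", "network", "stock", "market"]},
    {"tokens": ["archive", "records", "metadata", "management"],
     "keywords": ["archive", "metadata"]},
]

# (2) Word embedding trained over the whole corpus
model = Word2Vec([d["tokens"] for d in docs], vector_size=50, min_count=1, seed=1)

for doc in docs:
    # (3) Keyword vector extraction
    kw_vecs = np.array([model.wv[w] for w in doc["keywords"]])
    # (4) Keyword clustering to expose multiple subjects (here at most 2 clusters)
    n_clusters = min(2, len(kw_vecs))
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(kw_vecs)
    # (5) Multi-vector generation: one centroid vector per subject cluster
    doc["vectors"] = [kw_vecs[km.labels_ == c].mean(axis=0) for c in range(n_clusters)]
    print(len(doc["vectors"]), "subject vectors of dimension", doc["vectors"][0].shape[0])
```

Each document thus ends up with one vector per keyword cluster, i.e. per subject, rather than a single vector for the whole document.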

A Study on the Effect of the Document Summarization Technique on the Fake News Detection Model (문서 요약 기법이 가짜 뉴스 탐지 모형에 미치는 영향에 관한 연구)

  • Shim, Jae-Seung;Won, Ha-Ram;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems / v.25 no.3 / pp.201-220 / 2019
  • Fake news has emerged as a significant issue over the last few years, igniting discussion and research on how to solve the problem. In particular, studies on automated fact-checking and fake news detection using artificial intelligence and text analysis techniques have drawn attention. Fake news detection is a form of document classification, so document classification techniques have been widely used in this research, whereas document summarization techniques have been inconspicuous. At the same time, automatic news summarization services have become popular, and a recent study found that using news summarized through abstractive summarization strengthened the predictive performance of fake news detection models. This makes it worth studying the integration of document summarization technology in the Korean news data environment. To examine the effect of extractive summarization on fake news detection models, we first summarized news articles through extractive summarization, then built a detection model based on the summarized news, and finally compared it with a full-text-based detection model. We found that BPN (Back Propagation Neural Network) and SVM (Support Vector Machine) did not exhibit a large difference in performance, whereas for DT (Decision Tree) the full-text-based model performed somewhat better. For LR (Logistic Regression), the summary-based model exhibited superior performance, although the difference from the full-text-based model was not statistically significant. This suggests that extractive summarization preserves at least the core information of the news and that the LR-based model has potential for performance improvement. This study is an experimental application of extractive summarization in fake news detection research employing various machine learning algorithms. Its main limitations are the relatively small amount of data and the lack of comparison between different summarization technologies; an in-depth analysis applying various techniques to a larger data volume would be helpful in the future.
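A minimal sketch of the pipeline under stated assumptions: a simple TF-IDF sentence-scoring summarizer and a logistic regression detector stand in for the paper's actual summarizer, classifiers, and Korean news data, and the toy articles and labels are illustrative:

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def extractive_summary(text, n_sentences=2):
    """Keep the n highest-scoring sentences by mean TF-IDF weight."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    if len(sentences) <= n_sentences:
        return text
    tfidf = TfidfVectorizer().fit_transform(sentences)
    scores = np.asarray(tfidf.mean(axis=1)).ravel()
    keep = sorted(np.argsort(scores)[-n_sentences:])
    return ". ".join(sentences[i] for i in keep)

# Toy labeled articles (1 = fake, 0 = real); illustrative only.
articles = ["Miracle cure found. Doctors hate it. Buy now and live forever. It works.",
            "The city council met on Tuesday. A new budget was approved. Roads will be repaved."]
labels = [1, 0]

summaries = [extractive_summary(a) for a in articles]
X = TfidfVectorizer().fit_transform(summaries)   # summary-based features
model = LogisticRegression().fit(X, labels)
print(model.predict(X))
```

The full-text-based baseline would simply fit the same vectorizer and classifier on the full articles instead of the summaries.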

A study on Multiple Entity Data Model Design for Visual-Arts Archives and Information Management in the case of the KS X ISO 23081 Multiple Entity Model (시각예술기록정보 관리를 위한 데이터모델 설계 KS X ISO 23081 다중 엔티티 모델의 적용을 중심으로)

  • Hwang, Jin-hyun;Yim, Jin-hee
    • The Korean Journal of Archival Studies / no.33 / pp.155-206 / 2012
  • In the years since the enactment of the "Act on the Management of Public Archives" in 1999, interest in archives management has expanded from the public sector into the cultural and artistic field. However, because the importance of archives in this field is not well recognized, information is often scattered and archives are frequently lost. For example, the absence of precise contracts or records of bestowal makes it impossible to locate a great number of cultural properties, which are consequently at risk of theft, closed-door auctions, or trade through unofficial channels. Since how a nation manages its cultural and artistic creations reflects its cultural level, the way these works are circulated can serve as one index of that level; this is where the present study started. A growing economy and rising interest in culture and art have made society more aware of the importance and value of visual artworks, but the archives and information that show the context of these artworks and are produced through social interaction are relatively disregarded because too much emphasis is placed on the works themselves. In Korea, it is harder than in other advanced countries to find archives or documentation about the artists themselves or the philosophical discourse behind their works. There is also little interest in preserving the archives and information produced after exhibitions, and they are used for little more than promotion or reference. Recognizing the importance of visual arts archives, we believe that systematic management of them is urgently needed, and metadata is essential for such management, since artworks and their archives are now managed through agency systems even when they are not produced electronically. The objective of this study is to support systematic management of visual arts archives by designing a data model that reflects their characteristics. Metadata are needed at every stage, from acquisition to management, preservation, and use. Visual arts archives reveal their full value only when systematic relationships are established among information about artists, artworks, and events, including exhibitions. By establishing a multiple entity data model in which artworks, artists, and events (exhibitions) are related to one another, metadata for managing visual arts archives become more efficient and, at the same time, more descriptive. For this reason, this study designs a data model that treats each of these as an independent entity and specifies the relations among them, as a way to manage visual arts archives more systematically.
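A minimal sketch of the multiple-entity idea, with hypothetical, highly simplified entities and attribute names inspired by the abstract (the actual KS X ISO 23081-based model defines far richer metadata elements):

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical simplified entities; real KS X ISO 23081 metadata is richer.
@dataclass
class Artist:
    artist_id: str
    name: str

@dataclass
class Artwork:
    work_id: str
    title: str
    creator_ids: List[str] = field(default_factory=list)        # Artwork -> Artist

@dataclass
class Event:  # e.g. an exhibition
    event_id: str
    title: str
    work_ids: List[str] = field(default_factory=list)           # Event -> Artwork
    participant_ids: List[str] = field(default_factory=list)    # Event -> Artist

# Each entity is described independently; relationships are explicit ID
# references, so links can be traversed from any side.
artist = Artist("A001", "Hong Gildong")
work = Artwork("W001", "Untitled", creator_ids=[artist.artist_id])
show = Event("E001", "2012 Retrospective",
             work_ids=[work.work_id], participant_ids=[artist.artist_id])
print(show)
```

Keeping artists, artworks, and events as separate entities means, for example, that an exhibition record can be enriched without touching the artwork records it references.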

A Study on Usability of Open Source Software for Developing Records System: A Case of ICA AtoM (공개 소프트웨어를 이용한 기록시스템 구축가능성 연구 ICA AtoM을 중심으로)

  • Lee, Bo-Ram;Hwang, Jin-Hyun;Park, Min-Yung;Kim, Hyung-Hee;Choi, Dong-Woon;Choi, Yun-Jin;Yim, Jin-Hee
    • The Korean Journal of Archival Studies / no.39 / pp.193-228 / 2014
  • In recent years, interest has been growing not only in the management of public records but also in private archives large and small. Dedicated archives take various forms, but because of shortages of personnel and budget and the absence of records management professionals, it is not easy to maintain their records systematically. Demand for records systems continues to rise, yet the budget and professionals needed to meet it are lacking. As a way to ease the burden of building a dedicated archival system, this study introduces the trends and significance of open source records systems and examines the functions of AtoM in detail. AtoM is open source software that can be deployed with a web server and a database server. It is available free of charge, is not tied to a specific application or operating system, and is convenient to install and operate. It is also compatible with standards and highly scalable, which makes it convenient for private archives suffering from shortages of personnel and budget. In terms of data management it offers strong interoperability, search, sharing, and use, and in the future it could also support use of holdings through a network linking institutional and private archives. Further discussion is needed on enhancing exhibition services through integration with Omeka and on long-term preservation through Archivematica. As records management expands from the public sector into the private sector, open source software can play an important role in keeping records systems in balance. Continued efforts by academia and practitioners, including close collaboration through user studies of open source records systems, are needed, and it is hoped that cooperation and sharing among private archives will be realized.

Evaluation of static fracture resistances and patterns of pulpless tooth restored with poly-ether-ketone-ketone (PEKK) post (Poly-ether-ketone-ketone (PEKK) 포스트로 수복한 근관 치료 치아의 정적 파절 저항성 및 파절 형태에 관한 평가)

  • Park, Ha Eun;Lee, Cheol Won;Lee, Won Sup;Yang, Sung Eun;Lee, Su Young
    • The Journal of Korean Academy of Prosthodontics / v.57 no.2 / pp.127-133 / 2019
  • Purpose: The purpose of the present study was to investigate the fracture strength and mode of failure of endodontically treated teeth restored with a metal cast post-core system, a prefabricated fiber post system, and a newly introduced polyetherketoneketone (PEKK) post-core system. Materials and methods: A total of 21 mandibular premolars were randomly divided into 3 groups of 7 according to the post material: Group A, metal cast post core; Group B, prefabricated glass fiber post and resin core; and Group C, milled PEKK post core. All specimens were restored with metal crowns. The fracture strength of each specimen was measured by applying a static load to the tooth at 135 degrees at a crosshead speed of 2 mm/min using a universal testing machine. After the fracture strength measurement, the mode of failure was observed. The results were analyzed using the Kruskal-Wallis test and post hoc Mann-Whitney U tests at ${\alpha}=.05$. Results: The fracture resistance of the PEKK post core was lower than that of the cast metal post and the fiber-reinforced post with composite resin core. Regarding fracture mode, root fractures occurred mostly with the metal post core, whereas post detachment occurred mainly with the fiber-reinforced post; in the PEKK post core group, the tooth and post fractured together. Conclusion: Appropriate post materials must be selected when restoring extensively damaged teeth, and clinical application of the PEKK post appears to require further research on improving its strength.
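A minimal sketch of the statistical comparison described above, assuming made-up fracture-load values (not the paper's measurements) and using SciPy's Kruskal-Wallis and pairwise Mann-Whitney U tests:

```python
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu

# Hypothetical fracture loads (N) for 3 groups of 7 specimens each;
# illustrative values only, not the paper's data.
groups = {
    "A_cast_metal": [820, 790, 860, 805, 880, 795, 840],
    "B_fiber_post": [640, 610, 700, 655, 690, 625, 670],
    "C_PEKK_post":  [520, 495, 560, 510, 545, 530, 505],
}

stat, p = kruskal(*groups.values())
print(f"Kruskal-Wallis: H = {stat:.2f}, p = {p:.4f}")

# Post hoc pairwise Mann-Whitney U tests (alpha = .05); in practice a
# multiple-comparison correction such as Bonferroni would also be applied.
for (name1, g1), (name2, g2) in combinations(groups.items(), 2):
    u, p_pair = mannwhitneyu(g1, g2, alternative="two-sided")
    print(f"{name1} vs {name2}: U = {u:.1f}, p = {p_pair:.4f}")
```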