
Technological Diversities Observed in Bronze Objects of the Late Goryo Period - Case Study on the Bronze Bowls Excavated from the Burial Complex at Deobu-gol in Goyang - (고려 말 청동용기에 적용된 제작기술의 다양성 연구 - 고양 더부골 고분군 출토 청동용기를 중심으로 -)

  • Jeon, Ik Hwan;Lee, Jae Sung;Park, Jang Sik
    • Korean Journal of Heritage: History & Science / v.46 no.1 / pp.208-227 / 2013
  • Twenty-seven bronze bowls excavated from the Goryo burial complex at Deobu-gol were examined for their microstructure and chemical composition to characterize the bronze technology practiced by commoners at the time. The objects examined fall into four groups: 1) objects forged from Cu-approximately 22% Sn alloys and then quenched; 2) objects cast from leaded Cu-below 10% Sn alloys; 3) objects cast from leaded Cu-10~20% Sn alloys and then quenched; 4) objects forged from leaded Cu-10~20% Sn alloys and then quenched. This study revealed that the fabrication technique, as determined by alloy composition, plays an important role in bronze technology. The use of lead was clearly associated with the selection of quenching temperatures, the character of inclusions, and the color characteristics of bronze surfaces. Objects containing lead were quenched at 520~586 °C, while those without lead were quenched in the range of 586~799 °C. Selenium in impurity inclusions was detected only in alloys containing lead, suggesting that the raw materials, Cu and Sn, used in making the lead-free alloys of the first group were carefully selected from metals smelted from ores without lead contamination. Furthermore, the addition of lead had significant effects on the surface color of the alloys when subjected to corrosion during interment: corrosion turns leaded alloys light green or dark green, whereas it turns unleaded alloys dark brown or black. Wall thickness also varies with the application of quenching; most quenched objects have walls 1 mm thick or less, while unquenched objects have walls 1 mm thick or more.
Fabrication techniques in bronze making usually reflect the social environment of a community. It is likely that in the late Goryo period, which suffered a lack of skilled bronze workers, the increased demand for bronze was met in two ways: by using cheap lead instead of expensive tin, and by using casting, which is suitable for mass production. The above results show that Goryo bronze workers tried to overcome such a resource-limited environment through technological innovation, as is apparent in the use of different fabrication techniques for different alloys. Recently, numerous bronze objects have been excavated and become available for investigation. This study shows that, with proper analytical techniques, they can serve as a valuable source of information for characterizing the associated technology as well as the social environment that led to its establishment.
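The four groups above are fully determined by tin content, the presence of lead, and the forming/quenching route. As a reading aid, that classification rule can be sketched as follows; the function, field names, and threshold handling are an illustrative paraphrase of the reported groups, not the authors' analysis code:

```python
def classify_bronze(sn_pct, has_lead, forged, quenched):
    """Assign a bowl to one of the four technology groups reported above.

    Boundaries follow the reported composition ranges; names and the
    exact cutoffs are illustrative only.
    """
    if not has_lead and forged and quenched and sn_pct >= 20:
        return 1  # forged high-tin (Cu-~22% Sn) bronze, quenched
    if has_lead and not forged and sn_pct < 10:
        return 2  # cast leaded low-tin bronze
    if has_lead and not forged and 10 <= sn_pct <= 20 and quenched:
        return 3  # cast leaded Cu-10~20% Sn bronze, quenched
    if has_lead and forged and 10 <= sn_pct <= 20 and quenched:
        return 4  # forged leaded Cu-10~20% Sn bronze, quenched
    return None   # outside the four observed groups

print(classify_bronze(22, False, True, True))   # high-tin forged-and-quenched bowl
print(classify_bronze(8, True, False, False))   # cast leaded low-tin bowl
```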

Long-Term Trend Analysis in Nuclear Medicine Examinations (핵의학 영상 검사의 중장기 추세 분석 - 서울 소재 일개 상급 종합병원을 중심으로 -)

  • Jung, Woo-Young;Shim, Dong-Oh;Choi, Jae-Min
    • The Korean Journal of Nuclear Medicine Technology / v.23 no.1 / pp.15-28 / 2019
  • Purpose: Nuclear medicine was first introduced in Korea in 1969 and was widely applied to treat hyperthyroidism with ¹³¹I. A gamma camera was also first adopted in 1969, and its applications have grown continually in many directions. We analyzed the long-term trend in nuclear medicine examinations over the last two decades. The purpose of this paper is to make predictions and to set plans and directions for the development of nuclear medicine. Materials and Methods: We analyzed the nuclear medicine examinations and therapies performed at Asan Medical Center from 1998 to 2017. Results: Over the last 20 years, bone scans, renal scans, MUGA scans, ¹⁸F-FP-CIT, and bone mineral density examinations increased; myocardial perfusion SPECT, thyroid scans, and lung scans decreased; and ¹⁸F-FDG PET remained steady. Therapy volumes grew until 2010, but after the controversy over excessive thyroid screening, therapy performance has remained at the status quo. Key events such as a medical strike (2000) and the Middle East Respiratory Syndrome outbreak (2015) influenced overall therapy volumes. Conclusion: To promote long-term growth in nuclear medicine examinations and therapy, it is essential to respond to changes in the current medical environment. Furthermore, sustained efforts to maintain existing examinations and to develop new examinations and clinical indicators are strongly recommended.
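A long-term trend of annual examination counts, as analyzed above, is commonly summarized by an ordinary least-squares slope over the years. A minimal from-scratch sketch, using synthetic counts rather than the study's data:

```python
def ols_slope(years, counts):
    """Least-squares slope of counts vs. years (change in examinations per year)."""
    n = len(years)
    mx = sum(years) / n
    my = sum(counts) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, counts))
    den = sum((x - mx) ** 2 for x in years)
    return num / den

years = list(range(1998, 2003))
bone_scans = [1000, 1100, 1180, 1310, 1400]  # synthetic counts, illustrative only
print(ols_slope(years, bone_scans))  # positive slope indicates an increasing trend
```

A positive slope corresponds to modalities "on the increase" (e.g., bone scans), a negative slope to those declining (e.g., thyroid scans).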

A Study on Seeking a Multilateral Cooperation Framework for the Inter-Korean Exchange of Intangible Cultural Heritage - Through a Multinational Nomination of a Representative List of Intangible Cultural Heritage of Humanity - (남북 무형유산 교류 협력의 다자간 협력 틀 모색 - 유네스코 인류무형문화유산 남북 공동 등재 사례 -)

  • Kim, Deoksoon
    • Korean Journal of Heritage: History & Science / v.52 no.3 / pp.252-269 / 2019
  • Since the inauguration of the Kim Jong-un regime in 2012, the safeguarding and management system for cultural heritage in the Democratic People's Republic of Korea (DPRK) has been changing into a form similar to the legal systems of democratic countries. In addition, the National Authority for the Protection of Cultural Heritage (NAPCH) has continuously recorded and cataloged intangible cultural heritage elements in the DPRK, listing Arirang, kimchi-making, and ssireum on the UNESCO Representative List of the Intangible Cultural Heritage of Humanity. In particular, the multinational nomination of ssireum in October 2018 is symbolic in terms of inter-Korean exchange and cooperation for peace and reconciliation, raising expectations for further multinational nominations of the two Koreas' intangible cultural heritage. Currently, South Korea has 20 items on the Representative List, three of which (falconry, tug-of-war, and ssireum) are shared with other countries through multinational nominations. However, comparing the application processes of these three elements raises the question of whether they truly reflect the nature of multinational nomination. In the case of ssireum in particular, each Korea applied for a single-country listing without any working-level inter-Korean consultation on a joint application; the applications were exceptionally approved as a multinational nomination by the Intergovernmental Committee under the leadership of the Director-General of UNESCO, and no bilateral exchanges have taken place to date. The listing is thus symbolic and formal, and in substance resembles individual listings rather than the spirit of co-listing premised on mutual exchange and cooperation.
Therefore, the only way to strengthen the effectiveness of multinational nomination between the two Koreas and to guarantee its spirit is to pursue multilateral co-listing that includes both Koreas. For this, the Korean government needs a strategic approach: finding elements suitable for multilateral co-listing; accumulating expertise, capability, and experience as a leading country in multilateral co-listing; and building cooperative governance with stakeholders. In addition, to reduce the volatility of inter-Korean cultural exchange and cooperation, which depends on political circumstances and the special nature of inter-Korean relations, measures should be taken to pursue inter-Korean cultural heritage exchange and cooperation within a multilateral cooperation system through UNESCO, an international organization.

Multi-Vector Document Embedding Using Semantic Decomposition of Complex Documents (복합 문서의 의미적 분해를 통한 다중 벡터 문서 임베딩 방법론)

  • Park, Jongin;Kim, Namgyu
    • Journal of Intelligence and Information Systems / v.25 no.3 / pp.19-41 / 2019
  • With the rapidly increasing demand for text data analysis, research and investment in text mining are being actively conducted not only in academia but also in various industries. Text mining is generally conducted in two steps. In the first step, the text of the collected documents is tokenized and structured to convert the original documents into a computer-readable form. In the second step, tasks such as document classification, clustering, and topic modeling are conducted according to the purpose of the analysis. Until recently, text mining studies focused on the second step. However, with the realization that the text structuring process substantially influences the quality of the analysis results, various embedding methods that preserve the meaning of words and documents when representing text data as vectors have been actively studied. Unlike structured data, which can be fed directly to a variety of operations and traditional analysis techniques, unstructured text must first be transformed into a form that a computer can process. Mapping arbitrary objects into a vector space while maintaining their algebraic properties is called "embedding." Recently, attempts have been made to embed not only words but also sentences, paragraphs, and entire documents. In particular, as the demand for document embedding grows rapidly, many algorithms have been developed to support it. Among them, doc2vec, which extends word2vec to embed each document into a single vector, is the most widely used. However, traditional document embedding methods represented by doc2vec generate a vector for each document using all the words the document contains.
This causes the limitation that the document vector is affected not only by core words but also by miscellaneous words. Additionally, traditional document embedding schemes usually map each document to a single vector, making it difficult to accurately represent a complex document covering multiple subjects. In this paper, we propose a new multi-vector document embedding method to overcome these limitations. This study targets documents that explicitly separate body content and keywords; for a document without keywords, the method can be applied after extracting keywords through various analysis techniques, but since keyword extraction is not the core subject of the proposed method, we describe the process for documents with predefined keywords. The proposed method consists of (1) parsing, (2) word embedding, (3) keyword vector extraction, (4) keyword clustering, and (5) multiple-vector generation. The specific process is as follows. All text in a document is tokenized, and each token is represented through word embedding as an N-dimensional real-valued vector. Then, to avoid the influence of miscellaneous words, the vectors corresponding to each document's keywords are extracted to form a set of keyword vectors per document. Next, clustering is performed on each document's keyword set to identify the multiple subjects the document contains. Finally, a multi-vector representation is generated from the keyword vectors constituting each cluster. Experiments on 3,147 academic papers revealed that the single-vector traditional approach cannot properly map complex documents because of interference among subjects within each vector. With the proposed multi-vector method, we confirmed that complex documents can be vectorized more accurately by eliminating this interference.
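The five-step pipeline lends itself to a compact sketch. Here the toy word vectors, the from-scratch k-means, and all names are illustrative stand-ins for the word2vec-style components the paper builds on, not the authors' code:

```python
import random

# Step 2 stand-in: tiny pretrained word vectors; real ones come from word2vec.
WV = {
    "neural": [1.0, 0.1], "network": [0.9, 0.2], "learning": [0.8, 0.0],
    "archive": [0.1, 1.0], "record": [0.0, 0.9], "metadata": [0.2, 0.8],
}

def kmeans(vectors, k, iters=20, seed=0):
    """Minimal k-means for the keyword clustering step (4)."""
    rng = random.Random(seed)
    centers = rng.sample(vectors, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in vectors:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(v, centers[c])))
            clusters[i].append(v)
        centers = [[sum(d) / len(c) for d in zip(*c)] if c else centers[j]
                   for j, c in enumerate(clusters)]
    return clusters

def multi_vector_embedding(keywords, k=2):
    """Steps 3-5: look up keyword vectors, cluster them, and average each
    cluster into one subject vector, yielding several vectors per document."""
    vecs = [WV[w] for w in keywords if w in WV]
    return [[sum(d) / len(c) for d in zip(*c)] for c in kmeans(vecs, k) if c]

doc_keywords = ["neural", "network", "learning", "archive", "record", "metadata"]
print(multi_vector_embedding(doc_keywords, k=2))  # one vector per detected subject
```

A document mixing machine-learning and archival-science keywords thus yields two subject vectors instead of one averaged, interference-prone vector.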

A Study on the Effect of the Document Summarization Technique on the Fake News Detection Model (문서 요약 기법이 가짜 뉴스 탐지 모형에 미치는 영향에 관한 연구)

  • Shim, Jae-Seung;Won, Ha-Ram;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems / v.25 no.3 / pp.201-220 / 2019
  • Fake news has emerged as a significant issue over the last few years, igniting discussion and research on how to solve the problem. In particular, studies on automated fact-checking and fake news detection using artificial intelligence and text analysis techniques have drawn attention. Fake news detection is a form of document classification; thus, document classification techniques have been widely used in this research, whereas document summarization techniques have been inconspicuous. At the same time, automatic news summarization services have become popular, and a recent study found that using news summarized through abstractive summarization strengthened the predictive performance of fake news detection models. Therefore, the integration of document summarization technology in the domestic news data environment deserves study. To examine the effect of extractive summarization on fake news detection models, we first summarized news articles through extractive summarization. Second, we built a detection model on the summarized news. Finally, we compared our model with a full-text-based detection model. For BPN (back-propagation neural network) and SVM (support vector machine), the two models did not differ much in performance; for DT (decision tree), the full-text-based model performed somewhat better; and for LR (logistic regression), our model performed best, although the difference from the full-text-based model was not statistically significant. This suggests that summarization preserves at least the core information of fake news, and that an LR-based model may gain performance from it.
This study features an experimental application of extractive summarization in fake news detection research, employing various machine-learning algorithms. Its limitations are the relatively small amount of data and the lack of comparison among summarization techniques. An in-depth analysis applying various analytical techniques to a larger data volume would therefore be helpful in the future.
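The extractive summarization step described above can be sketched with a simple frequency-based sentence ranker; this is one common extractive scheme, a stand-in for whichever summarizer the study actually used, and the example article is invented:

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=2):
    """Rank sentences by mean corpus word frequency and keep the top n,
    preserving their original order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", text.lower()))  # corpus term frequencies

    def score(s):
        toks = re.findall(r"[a-z']+", s.lower())
        return sum(freq[t] for t in toks) / max(len(toks), 1)

    top = sorted(sorted(sentences, key=score, reverse=True)[:n_sentences],
                 key=sentences.index)
    return " ".join(top)

article = ("The claim spread quickly online. Fact checkers traced the claim "
           "to a single unverified post. The post offered no evidence. "
           "Unrelated celebrity gossip filled the rest of the page.")
print(extractive_summary(article, 2))
```

The summarized text would then feed the same feature extraction and classifiers (LR, SVM, DT, BPN) as the full text, allowing the comparison the study performs.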

A study on Multiple Entity Data Model Design for Visual-Arts Archives and Information Management in the case of the KS X ISO 23081 Multiple Entity Model (시각예술기록정보 관리를 위한 데이터모델 설계 KS X ISO 23081 다중 엔티티 모델의 적용을 중심으로)

  • Hwang, Jin-hyun;Yim, Jin-hee
    • The Korean Journal of Archival Studies / no.33 / pp.155-206 / 2012
  • Interest in archives management has expanded from the public sector into the cultural and artistic fields in the years since the enactment of the "Act on the Management of Public Archives" in 1999. However, owing to the lack of recognition of the importance of archives in these fields, information is often scattered and records are frequently lost. For example, the absence of precise contracts or deeds of gift makes it impossible to locate a great number of cultural properties, leaving these works exposed to theft, closed-door auctions, and trade through unofficial channels. How a nation manages its cultural and artistic creations reflects its cultural level, and how such works are documented and circulated is one index of that level. This study started from this point. Economic growth and rising interest in culture and art have made society more cognizant of the importance and value of visual artworks, but the archives and information that document the context of these works, produced in the course of social interaction, are relatively disregarded because of the overwhelming emphasis on the works themselves. Archives and documentation about the artists themselves, or about the philosophical discourse behind their works, are harder to find in Korea than in other advanced countries. There is also little interest in preserving the archives and information produced after exhibitions, which are used for little more than promotion or reference. The researcher therefore recognized the importance of visual arts archives and the strong need for their systematic management. Metadata is essential to such management, as artworks and their archives are now managed through institutional systems even when they were not produced electronically.
The objective of this study is to manage visual arts archives systematically by designing a data model that reflects their traits. Metadata are needed at every stage of an archive's life, from acquisition through management and preservation to use. Visual arts archives realize their full value only when systematic relationships are established among information on artists, artworks, and events such as exhibitions. By establishing a multiple entity data model in which artworks, artists, and events (exhibitions) are all related to one another, metadata for managing visual arts archives become more efficient, and the explanatory power of the archives increases. For this reason, this study designed a data model that treats each of these as an independent entity and designates the relations between them, as a way to manage visual arts archives more systematically.
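The multiple-entity idea, in which artists, artworks, and events are independent entities and context is carried by explicit relationships rather than copied into each record, can be rendered as a minimal sketch. All class and field names here are hypothetical illustrations, not the study's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Artist:
    artist_id: str
    name: str

@dataclass
class Artwork:
    work_id: str
    title: str
    creator_id: str  # relationship: Artwork -> Artist

@dataclass
class Event:
    event_id: str
    title: str
    work_ids: list = field(default_factory=list)  # relationship: Event -> Artwork

def works_shown(event, works):
    """Follow the Event -> Artwork links to recover exhibition context."""
    index = {w.work_id: w for w in works}
    return [index[i].title for i in event.work_ids if i in index]

lee = Artist("A1", "Lee")
w1 = Artwork("W1", "Untitled No. 1", creator_id="A1")
w2 = Artwork("W2", "Blue Interior", creator_id="A1")
show = Event("E1", "Spring Exhibition", work_ids=["W1", "W2"])
print(works_shown(show, [w1, w2]))
```

Because each entity is described once and only linked elsewhere, updating an artist's record does not require touching every artwork or exhibition record, which is the efficiency gain the abstract describes.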

A Study on Usability of Open Source Software for Developing Records System : A Case of ICA AtoM (공개 소프트웨어를 이용한 기록시스템 구축가능성 연구 ICA AtoM을 중심으로)

  • Lee, Bo-Ram;Hwang, Jin-Hyun;Park, Min-Yung;Kim, Hyung-Hee;Choi, Dong-Woon;Choi, Yun-Jin;Yim, Jin-Hee
    • The Korean Journal of Archival Studies / no.39 / pp.193-228 / 2014
  • In recent years, interest has been growing not only in the management of public records but also in private archives large and small. Dedicated archives take various forms, and because of shortages of personnel and budget and the absence of records management professionals, it is not easy for them to maintain records systematically. Demand for records systems has continued to rise, but the budget and professionals needed to meet it are lacking. As a way to ease the burden of building a dedicated archival system, this study introduces the trends and significance of open-source records systems and examines the functions of ICA AtoM in detail. AtoM is open-source software that can be deployed with a web server and a database server. It can be used free of charge without licensing restrictions, is not tied to a particular application or operating system, and is convenient to install and operate. It is also highly compatible and scalable, which makes it well suited to private archives suffering from shortages of personnel and budget. Because it offers excellent interoperability, search, sharing, and reuse of data, it could in the future also support the use of records through networks linking institutional and private archives. Meanwhile, enhancements such as exhibition services through integration with Omeka and long-term preservation through Archivematica still require much discussion. As records management expands from the public sector into the private sphere, open-source software can play an important role in building balanced records systems. Close collaboration between academia and the field, including user studies of open-source records systems, should continue, and through it we hope that cooperation and sharing among private archives will be realized.

Evaluation of static fracture resistances and patterns of pulpless tooth restored with poly-ether-ketone-ketone (PEKK) post (Poly-ether-ketone-ketone (PEKK) 포스트로 수복한 근관 치료 치아의 정적 파절 저항성 및 파절 형태에 관한 평가)

  • Park, Ha Eun;Lee, Cheol Won;Lee, Won Sup;Yang, Sung Eun;Lee, Su Young
    • The Journal of Korean Academy of Prosthodontics / v.57 no.2 / pp.127-133 / 2019
  • Purpose: The purpose of the present study was to investigate the fracture strength and mode of failure of endodontically treated teeth restored with a cast metal post-and-core system, a prefabricated fiber post system, and a newly introduced polyetherketoneketone (PEKK) post-and-core system. Materials and methods: A total of 21 mandibular premolars were randomly divided into three groups of seven according to post material: Group A, cast metal post and core; Group B, prefabricated glass fiber post with resin core; and Group C, milled PEKK post and core. All specimens were restored with metal crowns. The fracture strength of each specimen was measured by applying a static load to the tooth at 135 degrees and a crosshead speed of 2 mm/min using a universal testing machine. After the fracture strength measurement, the mode of failure was observed. The results were analyzed using the Kruskal-Wallis test and the post hoc Mann-Whitney U test at α = .05. Results: The fracture resistance of the PEKK post and core was lower than those of the cast metal post and the fiber-reinforced post with composite resin core. Regarding failure mode, root fractures occurred mostly in the cast metal post group, whereas post detachment occurred mainly in the fiber-reinforced post group; in the PEKK group, tooth and post fractured together. Conclusion: Appropriate post materials must be selected when restoring extensively damaged teeth, and clinical application of the PEKK post appears to require further research on strength improvement.
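The post hoc Mann-Whitney U test named above compares two independent samples without assuming normality. A from-scratch sketch of the U statistic (the fracture loads below are synthetic, not the study's data, and real analyses would use statistical software):

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U for sample a vs. b, counting ties as 1/2.
    U close to len(a)*len(b) or to 0 indicates strong separation."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Synthetic fracture loads (N), illustrative only.
metal_post = [820, 790, 860, 845, 810, 835, 850]
pekk_post  = [640, 615, 660, 655, 630, 620, 648]
print(mann_whitney_u(metal_post, pekk_post))  # 49.0 = 7*7, complete separation
```

The statistic would then be compared against critical values (or converted to a p-value) at the study's α = .05 level.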

Application of Automated Microscopy Equipment for Rock Analog Material Experiments: Static Grain Growth and Simple Shear Deformation Experiments Using Norcamphor (유사물질 실험을 위한 자동화 현미경 실험 기기의 적용과 노캠퍼를 이용한 입자 성장 및 단순 전단 변형 실험의 예)

  • Ha, Changsu;Kim, Sungshil
    • Economic and Environmental Geology / v.54 no.2 / pp.233-245 / 2021
  • Many studies of microstructures in rocks have used experimental methods with various types of equipment, alongside studies of natural rocks, to observe the development of microstructures and understand their mechanisms. Grain boundary migration in mineral aggregates, one of the main recrystallization mechanisms, can cause grain growth or grain size changes during metamorphism or deformation. This study presents an improved approach to analog material experiments using reformed equipment that allows sequential observation of grain boundary migration; it can be more efficient than existing techniques and supports appropriate microstructural analysis. The equipment enables optical manipulation by mounting rotatable polarizing plates on a stereoscopic microscope together with a deformation rig for analog material experiments. Microcontrollers and software automatically control the temperature and strain rate of the deformation rig, and digital photomicrographs are taken at constant time intervals during an experiment to record microstructural changes. Composite images synthesized from images taken with rotated polarizing plates reveal grain boundaries more accurately. Norcamphor (C7H10O), which has birefringence similar to that of quartz, was used as the rock analog material. Static grain growth and simple shear deformation experiments were performed with norcamphor to verify the effectiveness of the equipment. The static grain growth experiments showed typical grain growth behavior: the number of grains decreased and the average grain size increased over time, with a clear difference between the growth curves under the three temperature conditions.
The simple shear deformation experiment under medium temperature and a low strain rate showed no significant change in average grain size, but grain shapes became increasingly elongated at about 53° to the direction perpendicular to the shear direction as shear strain accumulated over time. These microstructures are interpreted to reflect a balance between plastic deformation and internal recovery within grains under the given experimental conditions. These experiments demonstrate that the reformed equipment can sequentially observe microstructural changes throughout an entire analog material experiment, as desired.
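One plausible way to composite frames taken at different polarizer angles, so that every grain is bright in at least one frame while boundaries stay dark, is a per-pixel maximum; the paper does not specify its exact compositing operator, so this is an illustrative assumption on toy intensity grids:

```python
def composite_max(frames):
    """Per-pixel maximum across frames taken at different polarizer angles.
    Each grain reaches brightness at some angle, so the composite shows all
    grains at once, making the dark boundaries between them easier to trace."""
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[max(f[r][c] for f in frames) for c in range(cols)]
            for r in range(rows)]

# Two toy 1x4 "images": alternate grains extinguish at each polarizer angle.
frame_0deg  = [[200,   5, 180,  10]]
frame_45deg = [[ 15, 190,  20, 170]]
print(composite_max([frame_0deg, frame_45deg]))
```

In the composite, all four toy grains are bright simultaneously, whereas in either single frame half of them are near extinction.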

Knowledge graph-based knowledge map for efficient expression and inference of associated knowledge (연관지식의 효율적인 표현 및 추론이 가능한 지식그래프 기반 지식지도)

  • Yoo, Keedong
    • Journal of Intelligence and Information Systems / v.27 no.4 / pp.49-71 / 2021
  • Users who intend to utilize knowledge to actively solve given problems proceed by cross-wise and sequential exploration of knowledge items associated with one another by criteria such as content relevance. A knowledge map is a diagram or taxonomy giving an overview of the knowledge currently managed in a knowledge base, and it supports users' knowledge exploration based on the relationships between knowledge items. A knowledge map must therefore be expressed in networked form, linking related knowledge by explicit types of relationships, and should be implemented with technologies or tools specialized in defining and inferring such relationships. To this end, this study proposes a methodology for developing a knowledge graph-based knowledge map using a graph DB, which is known to be well suited to expressing and inferring the relationships between the entities stored in a knowledge base. The procedure of the proposed methodology comprises modeling the graph data; creating nodes, properties, and relationships; and composing knowledge networks by combining the identified links between knowledge items. Among the various graph DBs, Neo4j is used in this study for its credibility and applicability, demonstrated in wide and varied application cases. To examine the validity of the proposed methodology, a knowledge graph-based knowledge map was implemented on the graph DB, and a performance comparison was conducted by applying the data of a previous study to check whether this study's knowledge map yields the same level of performance. The previous study built a process-based knowledge map using ontology technology, identifying links between related knowledge items based on the sequences of tasks that produce knowledge or are activated by it. In other words, since a task is activated by knowledge as input and produces knowledge as output, input and output knowledge are linked as a flow through the task; and since a business process is composed of affiliated tasks fulfilling the purpose of the process, the knowledge network within a business process follows from the sequences of its tasks. Accordingly, using Neo4j, processes, tasks, and knowledge items, as well as the relationships among them, were defined as nodes and relationships so that knowledge links could be identified from task sequences. The resultant knowledge network, aggregated from the identified knowledge links, is a knowledge map with the functionality of a knowledge graph, so its performance was tested against the validation results of the previous study. The performance test examined two aspects: the correctness of knowledge links, checked with seven questions, and the possibility of inferring new types of knowledge, checked by extracting two new types. As a result, the knowledge map constructed through the proposed methodology showed the same level of performance as the previous one, while handling knowledge definition and relationship inference more efficiently. Furthermore, compared with the previous ontology-based approach, the graph DB-based approach proved more capable of intensively managing only the knowledge of interest; dynamically defining knowledge and relationships to reflect meanings ranging from situations to purposes; agilely inferring knowledge and relationships through Cypher queries; and easily creating new relationships by aggregating existing ones.
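The link-inference rule described above (if a task consumes knowledge K_in and produces K_out, then K_in and K_out are linked as a flow) can be sketched in plain Python; the task and knowledge identifiers are hypothetical, and the study itself expresses this as graph patterns in Neo4j/Cypher rather than in application code:

```python
# Toy process: each task consumes and produces knowledge items.
tasks = [
    {"task": "T1", "inputs": ["K1"], "outputs": ["K2"]},
    {"task": "T2", "inputs": ["K2"], "outputs": ["K3", "K4"]},
]

def infer_knowledge_links(tasks):
    """Derive knowledge-to-knowledge edges from task input/output pairs:
    every input of a task is linked to every output of the same task."""
    edges = set()
    for t in tasks:
        for k_in in t["inputs"]:
            for k_out in t["outputs"]:
                edges.add((k_in, k_out))
    return sorted(edges)

print(infer_knowledge_links(tasks))
```

In a graph DB the same inference would be a pattern match over task nodes and their input/output relationships, which is what makes the Cypher-based approach agile: the edges need not be stored but can be derived on demand.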
The artifacts of this study can be applied to implement user-friendly knowledge exploration that reflects users' cognitive movement toward associated knowledge, and can further underpin the development of an intelligent knowledge base that expands autonomously by discovering new knowledge and relationships through inference. More immediately, the study contributes to implementing the networked knowledge map needed by contemporary users searching for the right knowledge to use.