• Title/Summary/Keyword: information and documentation


A Study on "Dongyeopyengo" Housed by the National Library of Korea (국립중앙도서관 소장 "동여편고" 연구)

  • Lee, Ki-Bong
    • Journal of the Korean association of regional geographers
    • /
    • v.18 no.1
    • /
    • pp.27-41
    • /
    • 2012
  • This study organizes and introduces "Dongyeopyengo(東輿便攷)," held by the National Library of Korea, which contains numerous proofreading and supplementary marks in its page margins, and examines its academic value in the history of geography-book compilation. The conclusions are as follows. First, "Dongyeopyengo" was compiled under the reign of King Sunjo(純祖, 1800~1834), with content originating from "Sinjungdonggukyeojisungnam(新增東國輿地勝覽)" removed. Second, mostly during King Sunjo's reign, "Dongyeopyengo" was proofread and supplemented on the basis of information from "Donggukmooneonbigo(東國文獻備考)". Third, under the reign of King Heonjong(憲宗, 1834~1849), further proofreading and supplementation drew on various materials, including "Jungjeongnamhanji(重訂南漢志)". Fourth, judging from the records on the relocation of the seat of Yangju(楊州) and from the overall tendencies of its compilation, the compiler of "Dongyeopyengo" was surely Kim Jeong-ho(金正浩). "Dongyeopyengo" is an important document showing the geography-book compilation carried out in the early life of Korea's greatest geographer, and it is academically valuable in helping us understand Kim Jeong-ho's life before 1834, a period that has received little attention owing to the scarcity of sources.


Comparison and Analyzing System for Protein Tertiary Structure Database expands LOCK (LOCK을 확장한 3차원 단백질 구조비교 및 분석시스템의 설계 및 구현)

  • Jung Kwang Su;Han Yu;Park Sung Hee;Ryu Keun Ho
    • The KIPS Transactions:PartD
    • /
    • v.12D no.2 s.98
    • /
    • pp.247-258
    • /
    • 2005
  • Protein structure is closely related to protein function, and comparing protein structures is essential for identifying structural motifs, families, and functions. In this paper, we construct an integrated database system that holds protein structure data together with the related literature. Structure queries submitted through the web interface are compared against the target structures in the database, and the results are returned to the user for further analysis. To build the system, we analyze the flat files of the Protein Data Bank, select the necessary structure data, and store them in a new format. The literature data related to these structures are stored in a relational database so that many kinds of data can be queried easily. In our structure comparison system, the matched structural patterns and their RMSD values are calculated and then presented to the user together with the related documentation data. The system provides faster comparison and a convenient analysis environment.
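
The RMSD value reported by such a comparison system is the standard score of structural agreement. The following is a minimal, stdlib-only Python sketch of RMSD between two already-superposed coordinate sets; the data are toy values, not Protein Data Bank entries, and the paper's actual pipeline (an extension of LOCK) is more involved:

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two equal-length lists of
    (x, y, z) atom coordinates, assumed to be already superposed."""
    if len(coords_a) != len(coords_b):
        raise ValueError("coordinate lists must have the same length")
    sq_sum = sum(
        (ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
        for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b)
    )
    return math.sqrt(sq_sum / len(coords_a))

# Toy example: structure b is structure a shifted 1 angstrom along x,
# so every atom-pair distance is 1 and the RMSD is exactly 1.0.
a = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.0, 0.0)]
b = [(1.0, 0.0, 0.0), (2.5, 0.0, 0.0), (4.0, 0.0, 0.0)]
print(rmsd(a, b))  # 1.0
```

Real comparison tools first compute an optimal superposition before taking this deviation; the sketch assumes that step has already been done.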

A Case Study on Improving SW Quality through Software Visualization (소프트웨어 가시화를 통한 품질 개선 사례 연구)

  • Park, Bo Kyung;Kwon, Ha Eun;Son, Hyun Seung;Kim, Young Soo;Lee, Sang-Eun;Kim, R. Young Chul
    • Journal of KIISE
    • /
    • v.41 no.11
    • /
    • pp.935-942
    • /
    • 2014
  • Achieving high software quality is a pressing issue today, given huge code bases and time-to-market pressure. In industrial practice, developers still focus on code-centric development. We therefore consider two problems: 1) developers' poor development habits, and 2) maintenance carried out without design documents, documentation, or code visualization. To address these problems, the inner structure of the code must be visualized. In this paper, we suggest how to visualize the inner structure of code and how to improve quality by constructing a tool chain that visualizes the inner structure of Java code. As a practical case, we applied NIPA's SW Visualization to object-oriented code and reduced code complexity through quantitative analysis and visualization, taking the class of object-oriented code as the basic module unit.
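
Code complexity, which the case study reduces through visualization, can be quantified in many ways. As a hypothetical illustration (this is not NIPA's SW Visualization tool chain, which targets Java), here is a stdlib-only Python sketch that approximates a McCabe-style score by counting branching constructs with the `ast` module:

```python
import ast

# Node types treated as branch points (a rough approximation).
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp)

def crude_complexity(source: str) -> int:
    """Return 1 plus the number of branching constructs in the source,
    a crude stand-in for cyclomatic complexity."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

sample = """
def classify(x):
    if x < 0:
        return "neg"
    for i in range(x):
        if i % 2 == 0:
            pass
    return "non-neg"
"""
print(crude_complexity(sample))  # 1 base + two ifs + one for = 4
```

A visualization tool chain would compute metrics like this per class or method and render the hotspots, which is the kind of feedback loop the abstract describes.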

Analysis of the Nursing Interventions performed by neurosurgery unit using NIC (간호중재분류체계(NIC)에 근거한 간호중재 수행분석 - 신경외과 간호단위 간호사를 중심으로 -)

  • Oh, Myung-Seon;Park, Kyung-Sook
    • Korean Journal of Adult Nursing
    • /
    • v.14 no.2
    • /
    • pp.265-275
    • /
    • 2002
  • Purpose: The purpose of this study was to evaluate selected nursing interventions and to describe the most common nursing interventions used by neurosurgery unit nurses. Method: Data were collected from 65 nurses at 5 general hospitals from Jan. 8, 2001 to Feb. 28, 2001. The instrument for this study was the Korean translation of the 486 interventions of the Nursing Interventions Classification developed by McCloskey & Bulechek in 2000. From the 486 nursing interventions, 310 were selected by 8 of the 10 professional nurses in the neurosurgery care unit group. The 310 nursing interventions were used in a secondary questionnaire, in which all 310 intervention labels and definitions were listed. The data were analyzed with the SPSS program. Result: The results of this study are as follows. 1. The most frequently used nursing intervention domains were "Physiological: Complex", "Physiological: Basic", "Health System", "Behavioral", "Safety", and "Family". 2. Core nursing interventions of the neurosurgery care unit were those performed several times a day by 50% or more of the neurosurgery care unit nurses; they comprised 48 core interventions across 16 classes in 5 domains ("Physiological: Complex", "Physiological: Basic", "Health System", "Safety", "Behavioral"). The most frequently used core nursing interventions were Intravenous Therapy, Pressure Ulcer Prevention, Documentation, Airway Suctioning, Medication: Intravenous, Pain Management, Medication: Intramuscular, Shift Report, Intravenous Insertion, Positioning, Aspiration Precautions, Pressure Management, Physician Support, and Pressure Ulcer Care. 3. In relation to nurses' career length and age, the domain showing the greatest difference in use was "Family"; in relation to workplace, "Health System" interventions were used most by nurses in university hospitals.
Conclusion: This study analyzed the nursing interventions and core nursing interventions performed by neurosurgery unit nurses. Based on these results, neurosurgery nursing interventions can be systematized, the quality of nursing can be advanced, and the findings can serve as data for a computerized nursing information system.


Implementation of Smart Collaboration Environment Framework (지능형 협업 환경 프레임워크 구현)

  • Han, Sang-Woo;Kim, Nam-Gon;Choi, Ki-Ho;Ko, Su-Jin;Bae, Chang-Hyeok;Kim, Jong-Won
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.14 no.1
    • /
    • pp.40-51
    • /
    • 2008
  • There has been extensive research on ubiquitous computing environments aimed at realizing advanced collaboration environments for geographically distributed knowledge workers. In particular, various framework-level approaches attempt to cope with known problems of traditional collaboration tools, such as limited display resolution, inconvenient document sharing, and difficult operation of the collaboration environment. In this paper, we design a framework for collaboration environments covering hardware, software, and networking architecture that flexibly coordinates a set of collaboration services and devices in light of users' expectations and node capabilities. Based on the proposed framework, we develop a collaboration environment supporting an interactive networked tiled display that enables media/data sharing over the network, display interaction through pointing/tracking, and high-resolution tiled display. Finally, a demonstration of the developed prototype is presented to show its feasibility.
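
A networked tiled display stitches many node-driven screens into one large virtual canvas, so content must be routed to the node whose tile covers it. The following stdlib-only Python sketch (the grid geometry and node numbering are hypothetical, not the paper's implementation) maps a canvas pixel to its tile and rendering node:

```python
def tile_for_pixel(x, y, tile_w, tile_h, cols):
    """Map a virtual-canvas pixel to the (column, row) of the tile that
    renders it, plus a node index assigned in row-major order."""
    col, row = x // tile_w, y // tile_h
    return col, row, row * cols + col

# Hypothetical 3x2 wall of 1920x1080 tiles: a 5760x2160 virtual canvas.
print(tile_for_pixel(4000, 1500, 1920, 1080, 3))  # (2, 1, 5)
```

A real system would also split a shared window that straddles tile boundaries and stream each fragment to its node; the mapping above is the first step of that process.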

Using the METHONTOLOGY Approach to a Graduation Screen Ontology Development: An Experiential Investigation of the METHONTOLOGY Framework

  • Park, Jin-Soo;Sung, Ki-Moon;Moon, Se-Won
    • Asia pacific journal of information systems
    • /
    • v.20 no.2
    • /
    • pp.125-155
    • /
    • 2010
  • Ontologies have been adopted in various business and scientific communities as a key component of the Semantic Web. Despite the increasing importance of ontologies, ontology developers still perceive construction tasks as a challenge. A clearly defined and well-structured methodology can reduce the time required to develop an ontology and increase the probability of success of a project. However, no reliable knowledge-engineering methodology for ontology development currently exists; every methodology has been tailored toward the development of a particular ontology. In this study, we developed a Graduation Screen Ontology (GSO). The graduation screen domain was chosen for several reasons. First, the graduation screen process is a complicated task requiring a complex reasoning process. Second, GSO may be reused by other universities because the graduation screen process is similar at most universities. Finally, GSO can be built within a given period because the size of the selected domain is reasonable. No standard ontology development methodology exists; thus, one of the existing ontology development methodologies had to be chosen. The most important considerations in selecting the ontology development methodology for GSO included whether it can be applied to a new domain, whether it covers a broad set of development tasks, and whether it gives sufficient explanation of each development task. We evaluated various ontology development methodologies based on the evaluation framework proposed by Gómez-Pérez et al. We concluded that METHONTOLOGY was the most applicable to the building of GSO for this study. METHONTOLOGY was derived from the experience of developing the Chemical Ontology at the Polytechnic University of Madrid by Fernández-López et al. and is regarded as the most mature ontology development methodology.
METHONTOLOGY describes a very detailed approach to building an ontology at the conceptual level under a centralized development environment. The methodology consists of three broad processes, each containing specific sub-processes: management (scheduling, control, and quality assurance); development (specification, conceptualization, formalization, implementation, and maintenance); and support (knowledge acquisition, evaluation, documentation, configuration management, and integration). An ontology development language and an ontology development tool for constructing GSO also had to be selected. We adopted OWL-DL as the ontology development language because of OWL's computational support for consistency checking and classification, which is crucial for developing coherent and useful ontological models of very complex domains. In addition, Protégé-OWL was chosen as the ontology development tool because it is supported by METHONTOLOGY and is widely used thanks to its platform-independent characteristics. Based on the researchers' experience of developing GSO, several issues relating to METHONTOLOGY, OWL-DL, and Protégé-OWL were identified. We focus on presenting the drawbacks of METHONTOLOGY and discussing how each weakness could be addressed. First, METHONTOLOGY insists that domain experts without ontology construction experience can easily build ontologies. However, it is still difficult for such domain experts to develop a sophisticated ontology, especially if they have insufficient background knowledge related to the ontology. Second, METHONTOLOGY does not include a pre-development stage such as a "feasibility study." This stage helps developers ensure not only that a planned ontology is necessary and sufficiently valuable to justify an ontology building project, but also that the project is likely to succeed.
Third, METHONTOLOGY excludes any explanation of the use and integration of existing ontologies. If an additional stage for considering reuse were introduced, developers could share the benefits of reuse. Fourth, METHONTOLOGY fails to address the importance of collaboration. The methodology needs to explain how specific tasks are allocated to different developer groups and how these tasks are combined once the assigned jobs are completed. Fifth, METHONTOLOGY does not sufficiently describe the methods and techniques applied in the conceptualization stage. Introducing methods of concept extraction from multiple informal sources, or methods of identifying relations, could enhance the quality of ontologies. Sixth, METHONTOLOGY does not provide an evaluation process to confirm whether WebODE perfectly transforms a conceptual ontology into a formal ontology. It also does not guarantee that the outcomes of the conceptualization stage are completely reflected in the implementation stage. Seventh, METHONTOLOGY needs additional criteria for user evaluation of the actual use of the constructed ontology in user environments. Eighth, although METHONTOLOGY allows continual knowledge acquisition throughout the ontology development process, consistent updates can be difficult for developers. Ninth, METHONTOLOGY demands that developers complete various documents during the conceptualization stage; thus, it can be considered a heavyweight methodology. Adopting an agile methodology would reinforce active communication among developers and reduce the burden of completing documentation. Finally, this study concludes with contributions and practical implications. No previous research has addressed issues with METHONTOLOGY from empirical experience; this study is an initial attempt. In addition, several lessons learned from the development experience are discussed.
This study also affords some insights for ontology methodology researchers who want to design a more advanced ontology development methodology.
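
The consistency checking and classification that motivated the choice of OWL-DL amount to reasoning over class hierarchies. The sketch below is a deliberately tiny, hypothetical Python stand-in (the real GSO was modeled in OWL-DL with Protégé-OWL, and these class names are invented, not taken from the study): it walks is-a links to answer subsumption queries of the kind a DL reasoner classifies automatically.

```python
# Hypothetical is-a edges for a toy graduation-screen taxonomy.
IS_A = {
    "MajorRequiredCourse": "Course",
    "GeneralElectiveCourse": "Course",
    "Course": "AcademicEntity",
    "Student": "AcademicEntity",
}

def is_subclass_of(cls, ancestor):
    """Follow is-a links transitively (assumes a tree-shaped, acyclic
    taxonomy with single inheritance)."""
    while True:
        if cls == ancestor:
            return True
        if cls not in IS_A:
            return False
        cls = IS_A[cls]

print(is_subclass_of("MajorRequiredCourse", "AcademicEntity"))  # True
print(is_subclass_of("Student", "Course"))                      # False
```

An OWL-DL reasoner does far more (multiple inheritance, property restrictions, inconsistency detection), which is precisely why the authors chose a formal language instead of ad hoc code like this.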

Usability Evaluation Criteria Development and Application for Map-Based Data Visualization (지도 기반 데이터 시각화 플랫폼 사용성 평가 기준 개발 및 적용 연구)

  • Sungha Moon;Hyunsoo Yoon;Seungwon Yang;Sanghee Oh
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.58 no.2
    • /
    • pp.225-249
    • /
    • 2024
  • The purpose of this study is to develop an evaluation tool for map-based data visualization platforms and to conduct heuristic usability evaluations of existing platforms that represent inter-regional information. We compared and analyzed the usability evaluation criteria for map-based platforms from previous studies alongside Nielsen's (1994) 10 usability heuristics. We proposed nine evaluation criteria: (1) visibility, (2) representation of the real world, (3) consistency and standards, (4) user control and friendliness, (5) flexibility, (6) design, (7) compatibility, (8) error prevention and handling, and (9) help provision and documentation. Additionally, to confirm the effectiveness of the proposed criteria, four experts were invited to evaluate five domestic and international map-based data visualization platforms. Using the proposed criteria, which combined quantified scores and subjective opinions, the experts were able to rank the usability of the five platforms. The results of this study are expected to serve as foundational material for the future development and evaluation of map-based visualization platforms.
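
A heuristic evaluation of this kind typically aggregates per-criterion scores into a ranking. The stdlib-only Python sketch below uses invented scores and platform names (the study's actual data are not reproduced here) to show how nine-criterion ratings could be averaged and ranked:

```python
from statistics import mean

# Hypothetical scores (1-5) on the nine criteria for three platforms;
# the study itself evaluated five platforms with four experts.
scores = {
    "PlatformA": [4, 5, 4, 3, 4, 5, 4, 3, 4],
    "PlatformB": [3, 3, 4, 4, 3, 3, 4, 4, 3],
    "PlatformC": [5, 4, 5, 4, 5, 4, 4, 5, 4],
}

def rank_platforms(score_table):
    """Rank platforms by their mean score across all criteria,
    best first."""
    return sorted(score_table, key=lambda p: mean(score_table[p]),
                  reverse=True)

print(rank_platforms(scores))  # ['PlatformC', 'PlatformA', 'PlatformB']
```

In practice the quantitative ranking is read alongside the experts' subjective comments, as the study does, rather than taken as the sole outcome.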

Documentation of the History of Ok-Cheon Catholic Church by standardized 2D CAD and 3D Digital Modeling (표준화된 2D CAD와 3D Digital Modeling을 이용한 옥천천주교회의 연혁 기록)

  • Kim, Myung-Sun;Choi, Soon-Yong
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.12 no.1
    • /
    • pp.523-528
    • /
    • 2011
  • Ok-Cheon Catholic Church has been altered four times since its initial construction in 1955. The first three changes were small ones to windows, doors, roof finishes, and the like, but the last alteration extended the plan from a 一 shape to a long cross shape, changing the building's size, structure, and form along with it. This history of the church has been recorded not in drawings but only in text, leaving its indistinct features undocumented. This study produces new 2D CAD files, using layers matched to the changes, together with 3D digital models; these carry not only the present state of the church but also information about its changes. They are useful data for the effective management, conservation, restoration, or possible reuse of the building.

E-Commerce in the Historical Approach to Usage and Practice of International Trade ("무역상무(貿易商務)에의 역사적(歷史的) 어프로치와 무역취인(貿易取引)의 전자화(電子化)")

  • Tsubaki, Koji
    • THE INTERNATIONAL COMMERCE & LAW REVIEW
    • /
    • v.19
    • /
    • pp.224-242
    • /
    • 2003
  • The author believes that the main task of studies in international trade usage and practice is the management of the transactional risks involved in the international sale of goods: foreign exchange risk, transportation risk, credit risk, the risk of miscommunication, and so on. In most cases, these risks are more serious and larger than those involved in domestic sales. Historically, merchant adventurers organized voyages abroad, secured trade finance, and crossed the ocean with their own or consigned cargo until around the mid-19th century. They did business face-to-face at trade fairs or at open ports where they maintained local offices, the so-called "Trading Houses"(商館). Therefore, the transactional risks may have been one-sided, resting with either the seller or the buyer. Bottomry seems to have been a typical arrangement for sharing risk among the parties interested in the venture. In this way, such organizational arrangements coped with or bore the transactional risks. With the advent of ocean liner services and wireless communication across national borders in the 19th century, the business of the merchant adventurers developed toward a clear division of labor: sales by mercantile agents, and ocean transportation by steamship companies. International banking helped accelerate the process. Bills of lading, backed by statute, then made it possible to conduct documentary sales with a foreign partner in a different country. Thus, FOB terms (including ocean freight) and CIF terms gradually emerged as standard trade terms under which transactional risks were allocated through negotiation between a seller and a buyer located in different countries. Neither had to go abroad with the cargo. Instead, documentation complying with the terms of the contract (plus an L/C in some cases) had to be 'strictly' fulfilled.
In other words, the set of contractual documents had to be tendered before the arrival of the goods at the port of discharge. Trust, or reliance, was placed on these contractual paper documents. However, the container transport services introduced as international intermodal transport since the late 1960s frequently brought the goods to their destination before the presentation of the set of paper documents, whose handling may cost 5 to 10% of the transaction amount. In addition, the size of container vessels required speedy transport documentation before sailing from the port of loading. In these circumstances, computerized processing of transport-related documents became essential for low transaction costs and the uninterrupted distribution of goods. Such computerization does not stop at the transportation phase but extends to cover the whole process of international trade, transforming documentary sales into less-paper trade and further into paperless trade, i.e., EDI or E-Commerce. Now we face the other side of the coin: data security and the paperless transfer of legal rights and obligations. Unfortunately, these issues are not effectively covered by a set of contracts alone. Obviously, EDI or E-Commerce rests on common business processes and a harmonized system of data codes, as well as standard message formats. This essential feature of E-Commerce requires effective coordination of the different divisions of a business and tight control over credit arrangements, in addition to the standard contract of sale. In a few words, information does not always invite "trust". Credit flows from people, or from close organizational tie-ups. It is our common understanding that, without well-orchestrated organizational arrangements made by leading companies, E-Commerce does not work well for paperless trade. With such arrangements in place, participating E-business members need not worry seriously about credit risk.
Finally, it is also clear that international E-Commerce must be linked to a set of government EDIs such as NACCS, Port EDI, and JETRAS in Japan. There is therefore still a long way to go before E-Commerce works in practice, and not merely on top of the information manager's desk.


A Study on Computation of the Reduction Rate in the Total Cost of Ownership of the Open Source Software in Comparison to the Commercial Software (상용소프트웨어대비 공개소프트웨어 총소유비용 절감비율 산정에 관한 연구)

  • Kim, Shin-Pyo;Kim, Tae-Yeol;Park, Keun-Ha
    • Journal of Digital Convergence
    • /
    • v.11 no.3
    • /
    • pp.115-126
    • /
    • 2013
  • The purpose of this study was to determine the extent of the reduction in total cost of ownership when open source software, rather than commercial software, is installed for information systems, PCs, and cloud computing. Accordingly, the actual reduction rates in total cost of ownership were computed and analyzed for 51 organizations in the information system area, 18 in the PC area, and 6 in the cloud computing area, covering government institutions, educational institutions, and private enterprises. The results of the expert survey showed that the reduction rates are (1) 63.3% on average across the 4 areas of information systems, namely DBMS, WAS, Web, and OS; (2) 59.4% on average across the 6 areas of PCs, namely OS, documentation programs, back-up and restoration, screen capture, vaccine (antivirus), and others; and (3) 61.2% on average across the 6 areas of cloud computing, namely virtualization, OS, WEB/WAS, DBMS, DFS, and cloud management.
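
The reported figures are simple arithmetic means over per-area reduction rates. The Python sketch below uses hypothetical per-area figures, chosen only so that they average to the reported 63.3% for information systems (the study does not publish this breakdown):

```python
from statistics import mean

# Hypothetical per-area TCO reduction rates (%) for the four
# information-system areas; invented for illustration only.
info_system = {"DBMS": 65.0, "WAS": 62.0, "Web": 64.0, "OS": 62.2}

def average_reduction(rates):
    """Mean TCO reduction rate across areas, rounded to one decimal."""
    return round(mean(rates.values()), 1)

print(average_reduction(info_system))  # 63.3
```

The same averaging over 6 PC areas and 6 cloud areas would yield the study's other headline figures of 59.4% and 61.2%.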