• Title/Abstract/Keywords: Information Processing Process

Search results: 4,568 items (processing time: 0.035 seconds)

Extracting Ontology from Medical Documents with Ontology Maturing Process

  • Nyamsuren, Enkhbold;Kang, Dong-Yeop;Kim, Su-Kyoung;Choi, Ho-Jin
    • Korea Information Processing Society: Conference Proceedings / Proceedings of the KIPS 2009 Spring Conference / pp.50-52 / 2009
  • Ontology maintenance is a time-consuming and costly process that requires special skill and knowledge. Properly maintaining an ontology and updating the knowledge in it requires the joint effort of both an ontology engineer and a domain specialist. This is especially true for the medical domain, which is highly specialized. This paper proposes a novel approach for the maintenance and update of existing ontologies in the medical domain. The proposed approach is based on a modified Ontology Maturing Process, which was originally developed for the web domain. It provides a way to populate a medical ontology with new knowledge obtained from medical documents. This is achieved through natural language processing techniques and highly specialized medical knowledge bases such as the Unified Medical Language System.
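
A rough sketch of the maturing step described above, assuming a mocked UMLS lookup table; a real system would query the UMLS Metathesaurus through a proper NLP pipeline rather than the simple n-gram matcher used here:

```python
# Illustrative only: the UMLS lookup is mocked with a tiny in-memory
# dictionary, and concept mentions are found by naive n-gram matching.
import re

# Mock UMLS-style knowledge base: surface term -> (CUI, semantic type)
UMLS_MOCK = {
    "myocardial infarction": ("C0027051", "Disease or Syndrome"),
    "aspirin": ("C0004057", "Pharmacologic Substance"),
    "chest pain": ("C0008031", "Sign or Symptom"),
}

def candidate_terms(text, max_len=3):
    """Generate word n-grams (n <= max_len) as candidate concept mentions."""
    words = re.findall(r"[a-z]+", text.lower())
    for n in range(max_len, 0, -1):
        for i in range(len(words) - n + 1):
            yield " ".join(words[i:i + n])

def mature_ontology(ontology, document):
    """Add concepts found in the document that the ontology lacks."""
    for term in candidate_terms(document):
        if term in UMLS_MOCK and term not in ontology:
            cui, sem_type = UMLS_MOCK[term]
            ontology[term] = {"cui": cui, "type": sem_type, "status": "candidate"}
    return ontology

doc = "Patient with chest pain was treated with aspirin after myocardial infarction."
print(mature_ontology({}, doc))
```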

NIST Lightweight Cryptography Standardization Process: Classification of Second Round Candidates, Open Challenges, and Recommendations

  • Gookyi, Dennis Agyemanh Nana;Kanda, Guard;Ryoo, Kwangki
    • Journal of Information Processing Systems / Vol. 17, No. 2 / pp.253-270 / 2021
  • In January 2013, the National Institute of Standards and Technology (NIST) announced the CAESAR (Competition for Authenticated Encryption: Security, Applicability, and Robustness) contest to identify authenticated ciphers suitable for a wide range of applications. A total of 57 submissions made it into the first round of the competition, out of which 6 were announced as winners in March 2019. During the competition, NIST realized that most of the submitted authenticated ciphers were not suitable for the resource-constrained devices used as end nodes in Internet-of-Things (IoT) platforms. For that reason, the NIST Lightweight Cryptography Standardization Process was set up to identify authenticated encryption and hashing algorithms for IoT devices. The call for submissions was initiated in 2018, and in April 2019, 56 submissions made it into the first round of the competition. In August 2019, 32 of the 56 submissions were selected for the second round, which is due to end in 2021. This work surveys the 32 authenticated encryption schemes that made it into the second round of the NIST Lightweight Cryptography Standardization Process. The paper presents an easy-to-understand comparative overview of the recommended parameters, primitives, modes of operation, features, security parameters, and hardware/software performance of the 32 candidate algorithms. It goes further by discussing the challenges of the Lightweight Cryptography Standardization Process and provides some suitable recommendations.
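
Although the paper is a survey, all 32 candidates expose essentially one AEAD interface, with minimum parameter sizes fixed by the NIST call (128-bit key, 96-bit nonce, 64-bit tag). The toy sketch below illustrates that interface only; the keystream/MAC construction is a stand-in built on Python's hashlib, not any actual candidate:

```python
# Toy encrypt-then-MAC stand-in illustrating the common AEAD contract.
# NOT a real lightweight cipher and NOT secure; interface shape only.
import hashlib, hmac

KEY_BITS, NONCE_BITS, TAG_BITS = 128, 96, 64  # NIST LWC minimum sizes

def toy_aead_encrypt(key: bytes, nonce: bytes, ad: bytes, pt: bytes):
    assert len(key) * 8 >= KEY_BITS and len(nonce) * 8 >= NONCE_BITS
    # Keystream from a hash in counter mode (illustrative only).
    stream, ctr = b"", 0
    while len(stream) < len(pt):
        stream += hashlib.sha256(key + nonce + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    ct = bytes(p ^ s for p, s in zip(pt, stream))
    # Tag authenticates nonce, associated data, and ciphertext.
    tag = hmac.new(key, nonce + ad + ct, hashlib.sha256).digest()[:TAG_BITS // 8]
    return ct, tag

ct, tag = toy_aead_encrypt(b"\x00" * 16, b"\x01" * 12, b"header", b"attack at dawn")
print(ct.hex(), tag.hex())
```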

Fuzzy Petri-net Approach to Fault Diagnosis in Power Systems Using the Time Sequence Information of Protection System

  • Roh, Myong-Gyun;Hong, Sang-Eun
    • Institute of Control, Robotics and Systems: Conference Proceedings / ICCAS 2003 / pp.1727-1731 / 2003
  • In this paper we propose a backward fuzzy Petri net that diagnoses faults in power systems by using the time-sequence information of the protection system. As the complexity of power systems increases, especially in the case of multiple faults or the incorrect operation of protective devices, fault diagnosis requires new and systematic reasoning methods that improve both accuracy and efficiency. The fuzzy Petri-net models of the protection system are composed of the operating process of the protective devices and the fault diagnosis process. The fault diagnosis model, which exploits the nature of fuzzy Petri nets, is developed to overcome the drawbacks of methods that depend on operator knowledge. The proposed method can reduce processing time and increase accuracy compared with traditional methods, and it also covers online processing of real-time data from SCADA (Supervisory Control and Data Acquisition).
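
A minimal sketch of fuzzy Petri-net inference of the kind described, where each rule transition fires with degree min(inputs) × certainty factor and competing rules combine by max; the toy net below is illustrative, not the paper's power-system model:

```python
# Each rule is (input_places, output_place, certainty_factor).
RULES = [
    (("breaker_tripped", "relay_operated"), "line_fault", 0.9),
    (("alarm_received",), "relay_operated", 0.8),
]

def evaluate(marking, rules, iterations=5):
    """Propagate fuzzy truth degrees through the net until stable."""
    degrees = dict(marking)
    for _ in range(iterations):
        for inputs, output, cf in rules:
            d = min(degrees.get(p, 0.0) for p in inputs) * cf
            degrees[output] = max(degrees.get(output, 0.0), d)
    return degrees

# Observed protection-system events with confidence degrees.
observed = {"breaker_tripped": 1.0, "alarm_received": 0.7}
print(evaluate(observed, RULES)["line_fault"])   # ~0.504 = min(1.0, 0.7*0.8)*0.9
```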

문서 라우팅 기법을 이용한 간호진단 과정에서의 정보접근 (Applying document routing mode of information access in nursing diagnosis process)

  • 백우진
    • Korean Society for Information Management: Conference Proceedings / Proceedings of the 13th KSIM Conference (2006) / pp.163-168 / 2006
  • The nursing diagnosis process is described as nurses assessing a patient's condition by applying reasoning and looking for patterns that fit the defining characteristics of one or more diagnoses. This process is similar to using a typical document retrieval system if we consider the patient's condition as a query, nursing diagnoses as documents, and the defining characteristics as the documents' index terms. However, in a typical hospital setting there is a small, fixed number of nursing diagnoses and a virtually unlimited number of patient conditions. This situation is better suited to the document routing mode of information access, in which incoming items are compared against a set of stored profiles rather than against individual documents. In this paper, we describe a ROUting-based Nursing Diagnosis (ROUND) system and its natural language processing based query processing component, which converts the defining characteristics of nursing diagnoses into query representations.
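
A minimal sketch of the routing mode described above, with nursing diagnoses stored as profiles of defining characteristics and patient findings routed to the best match; the diagnoses and terms are illustrative, not taken from ROUND:

```python
# Profiles: diagnosis -> set of defining characteristics (illustrative).
PROFILES = {
    "Ineffective Airway Clearance": {"dyspnea", "abnormal breath sounds", "cough"},
    "Acute Pain": {"guarding behavior", "self-report of pain", "grimacing"},
}

def route(patient_findings, profiles, top_k=1):
    """Score each profile by the fraction of its characteristics observed."""
    scores = {
        dx: len(chars & patient_findings) / len(chars)
        for dx, chars in profiles.items()
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]

findings = {"dyspnea", "cough", "fever"}
print(route(findings, PROFILES))  # [('Ineffective Airway Clearance', 0.666...)]
```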

제품디자인 방법에서의 정보 처리 모델 연구 (A Study of the Information Processing Model in Product Design Methods)

  • 조성근
    • 디자인학연구 (Archives of Design Research) / Vol. 16, No. 1 / pp.289-296 / 2003
  • This paper is a conceptual study of an information processing model for the information gathering and organization process in product design methods. In the past, designers shaped materials by hand to create products, but today, with materials replaced by information, product design can essentially be seen as a process of gathering and organizing information. If product design is an information processing process, it means mainly the qualitative transformation of information, not the quantitative transformation of information theory. The focus of this study is on conceptualizing a model in which the objects handled in product design, when oriented toward use rather than material, are converted into and processed as information. Whereas previous discussions of product design methods for rational problem solving were mainly formalistic, concerning quantitative, qualitative, and organic methods based on the <analysis-synthesis-evaluation> process model and modeling as a means, an information processing method for product design must abstract and structure the material of the object and model it as design-oriented information before giving form to matter. In a digital environment, unlike in the past when materials were worked on directly, the success or failure of product design depends on how information is handled in order to decide what should be done and how.

Intelligent Missing Persons Index System Implementation Based on OpenCV Image Processing and TensorFlow Deep-Learning Image Processing

  • Baek, Yeong-Tae;Lee, Se-Hoon;Kim, Ji-Seong
    • Journal of the Korea Society of Computer and Information / Vol. 22, No. 1 / pp.15-21 / 2017
  • In this paper, we present a solution to the problems caused by using only text-based information as an index element when a commercial missing-person indexing system indexes the missing persons registered in its database. The existing system could not be used to look up missing persons by photograph, because it could not process the images registered along with each missing person into a searchable form. To solve this problem, we propose a method that measures image similarity using OpenCV image processing and TensorFlow deep-learning image processing, turning the images of missing persons into meaningful information. To verify the indexing method used in this paper, we built a web server that presents users with the most likely matches first, using an image of the same subject taken in an unconstrained environment as the search element.
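
A hedged sketch of the two similarity signals named above, combining an OpenCV colour-histogram comparison with a deep-network embedding distance. The paper does not specify its models or weighting, so MobileNetV2 and the equal weighting here are assumptions:

```python
import cv2
import numpy as np
import tensorflow as tf

# Pretrained CNN used only as a feature extractor (assumption, see above).
embedder = tf.keras.applications.MobileNetV2(
    include_top=False, pooling="avg", weights="imagenet")

def hist_similarity(img_a, img_b):
    """Correlation of HSV hue-saturation histograms (1.0 = identical)."""
    def hist(img):
        hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
        h = cv2.calcHist([hsv], [0, 1], None, [50, 60], [0, 180, 0, 256])
        return cv2.normalize(h, h).flatten()
    return cv2.compareHist(hist(img_a), hist(img_b), cv2.HISTCMP_CORREL)

def embed_similarity(img_a, img_b):
    """Cosine similarity of CNN embeddings."""
    def embed(img):
        x = cv2.resize(img, (224, 224))[:, :, ::-1].astype("float32")  # BGR->RGB
        x = tf.keras.applications.mobilenet_v2.preprocess_input(x[None])
        return embedder(x).numpy()[0]
    a, b = embed(img_a), embed(img_b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def similarity(img_a, img_b, w=0.5):
    """Weighted combination of both signals (equal weighting assumed)."""
    return w * hist_similarity(img_a, img_b) + (1 - w) * embed_similarity(img_a, img_b)
```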

Zero-knowledge proof algorithm for Data Privacy

  • Min, Youn-A
    • International Journal of Internet, Broadcasting and Communication / Vol. 13, No. 2 / pp.67-75 / 2021
  • With the passage of the three revised data bills, the Personal Information Protection Act was amended to allow broader use of personal information. For industrial development through the efficient and secure use of personal information, the existing anonymization methods need to be revised. This paper modifies the zero-knowledge proof algorithm among the anonymization methods so that the anonymity computation takes into account the reliability of the service company using the data. In more detail, the ZKP (Zero-Knowledge Proof) formulation used by zk-SNARKs is adapted to pseudonymize personal information. The core of the proposed algorithm is the addition of user variables and the adjustment of the difficulty level according to the reliability of the data-using organization and the scope of use. Through Setup_p, the additional variable γ can be applied selectively according to the reliability of the user institution, and the degree of agreement of the Witness is adjusted according to the institutional reliability entered through Prove_p. The difficulty of the verification process is adjusted by considering the institutional reliability entered through Verify_p. SimProve, a simulator, also refers to the scope of use and the reliability of the input authority. With this proposal, it is possible to increase the reliability and security of the anonymization and distribution of personal information.
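
The paper's modified Setup_p/Prove_p/Verify_p algorithms cannot be reconstructed from the abstract alone; as background, the sketch below shows a classic interactive zero-knowledge proof (Schnorr identification), the kind of protocol being adapted. Parameters are toy-sized for readability:

```python
# Prover shows knowledge of x with g^x = y (mod p) without revealing x.
import secrets

p, g = 467, 2          # toy prime modulus and generator (demo-sized only)
x = 153                # prover's secret
y = pow(g, x, p)       # public value; proving knowledge of x with g^x = y

def schnorr_round():
    r = secrets.randbelow(p - 1)
    t = pow(g, r, p)                   # 1. prover commits
    c = secrets.randbelow(2)           # 2. verifier sends a challenge bit
    s = (r + c * x) % (p - 1)          # 3. prover responds
    return pow(g, s, p) == (t * pow(y, c, p)) % p   # 4. verifier checks

assert all(schnorr_round() for _ in range(20))      # soundness error 2^-20
```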

The Mapping Method for Parallel Processing of SAR Data

  • In-Pyo Hong;Jae-Woo Joo;Han-Kyu Park
    • Journal of the Korean Institute of Communications and Information Sciences / Vol. 26, No. 11A / pp.1963-1970 / 2001
  • Analyzing the processing method and setting out the top-level hardware configuration from the main parameters is an essential design step before implementing a SAR processor. This paper identifies the impact of I/O and algorithm structure on parallel processing and suggests a practical mapping method for parallel processing of SAR data. A simulation is also performed on the E-SAR processor to examine the usefulness of the method, and the results are analyzed and discussed.
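
The abstract does not give the mapping itself; the sketch below illustrates one common mapping under stated assumptions: partition the raw data by azimuth lines and run range compression in parallel workers, with I/O done once per block. The chirp parameters and data sizes are placeholders, not E-SAR values:

```python
import numpy as np
from multiprocessing import Pool

# Placeholder reference chirp (not real E-SAR parameters).
CHIRP = np.exp(1j * np.pi * 1e11 * (np.arange(512) / 1e7) ** 2)

def range_compress(line):
    """Matched filtering of one range line via FFT convolution."""
    n = len(line) + len(CHIRP) - 1
    return np.fft.ifft(np.fft.fft(line, n) * np.conj(np.fft.fft(CHIRP, n)))

if __name__ == "__main__":
    raw = np.random.randn(128, 512) + 1j * np.random.randn(128, 512)
    with Pool() as pool:                           # one worker per CPU core
        compressed = pool.map(range_compress, raw) # map range lines to workers
    print(np.shape(compressed))                    # (128, 1023)
```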

삼차원 재구성을 위한 Data-Flow 기반의 프레임워크 (A data-flow oriented framework for video-based 3D reconstruction)

  • 김희관
    • Korea Information Processing Society: Conference Proceedings / Proceedings of the KIPS 2009 Spring Conference / pp.71-74 / 2009
  • The data-flow paradigm has been employed in various application areas. It is particularly useful where large data streams must be processed, for example in video and audio processing or in scientific visualization. A video-based 3D reconstruction system must process multiple synchronized video streams, and it exhibits many properties that suit a data-flow approach, since it naturally divides into a sequence of processing tasks. In this paper we introduce our concept for applying the data-flow approach to a multi-video 3D reconstruction system.
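
A minimal sketch of the data-flow concept, assuming a typical segment-then-fuse reconstruction chain; the stage names are illustrative, not the paper's actual module set:

```python
# Each processing task is a node that consumes and produces frames;
# nodes are chained into a pipeline through which the data flows.
class Node:
    def process(self, data):
        raise NotImplementedError

class Segment(Node):
    def process(self, frames):
        return [f"fg({f})" for f in frames]      # foreground mask per view

class VisualHull(Node):
    def process(self, masks):
        return f"hull({', '.join(masks)})"       # fuse views into a 3D model

class Pipeline:
    def __init__(self, *nodes):
        self.nodes = nodes
    def push(self, data):
        for node in self.nodes:                  # data flows node to node
            data = node.process(data)
        return data

pipeline = Pipeline(Segment(), VisualHull())
print(pipeline.push(["cam0_frame", "cam1_frame"]))
```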

Performance Study of Satellite Image Processing on Graphics Processing Units Using CUDA

  • Jeong, In-Kyu;Hong, Min-Gee;Hahn, Kwang-Soo;Choi, Joonsoo;Kim, Choen
    • Korean Journal of Remote Sensing / Vol. 28, No. 6 / pp.683-691 / 2012
  • High-resolution satellite images are now widely used for a variety of mapping applications, including photogrammetry, GIS data acquisition, and visualization. As the spectral and spatial data size of satellite images increases, greater processing power is needed to process them, and parallel systems are the natural solution. Parallel processing techniques have been developed to improve the performance of image processing along with the growth of computational power. However, conventional CPU-based parallel computing is often not enough to meet the computational speed demanded by image processing, and the GPU is a good candidate to achieve this goal. Recently, GPUs have been used for highly complex processing involving many loop operations, such as mathematical transforms and ray tracing. In this study we propose a technique for parallel processing of high-resolution satellite images using the GPU. We implemented a spectral radiometric processing algorithm on Landsat-7 ETM+ imagery using CUDA, a parallel computing architecture developed by NVIDIA for GPUs, and compared the performance of the algorithm on the GPU and the CPU.
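
The paper implements its radiometric step in CUDA; the sketch below expresses the same kind of per-pixel Landsat-7 ETM+ DN-to-radiance conversion in Python with CuPy, which compiles elementwise operations to CUDA kernels. The calibration constants are band-dependent placeholders, not values from the paper; real values come from the scene's metadata:

```python
import cupy as cp

LMIN, LMAX = -6.2, 191.6        # radiance limits, band-dependent (placeholder)
QCALMIN, QCALMAX = 1.0, 255.0   # quantized calibrated pixel range

def dn_to_radiance(dn_gpu):
    """L = (LMAX-LMIN)/(QCALMAX-QCALMIN) * (QCAL-QCALMIN) + LMIN, per pixel."""
    gain = (LMAX - LMIN) / (QCALMAX - QCALMIN)
    return gain * (dn_gpu - QCALMIN) + LMIN     # fused elementwise GPU kernel

# Synthetic digital numbers standing in for one ETM+ band on the GPU.
dn = cp.random.randint(1, 256, size=(2048, 2048)).astype(cp.float32)
radiance = dn_to_radiance(dn)                   # runs entirely on the GPU
print(float(radiance.mean()))
```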