• Title/Summary/Keyword: Computer code


Legislation Cases, Management Policies and Countermeasures on Scientific Data -Focusing on Australia, the United States and China- (과학데이터에 관한 입법례와 관리정책 그리고 대응방안 -호주, 미국, 중국을 중심으로-)

  • Yoon, Chong-Min;Kim, Kyubin
    • Journal of Korea Technology Innovation Society
    • /
    • v.16 no.1
    • /
    • pp.63-100
    • /
    • 2013
  • Research data are data in the form of facts, observations, images, computer program results, recordings, measurements, or experiences on which an argument, theory, test, hypothesis, or other research output is based. Such data may be numerical, descriptive, visual, or tactile. Scientific research is undergoing a paradigm shift: affected by the data deluge, a data-intensive science paradigm is emerging, and this shift has increased the value and importance of scientific data. Establishing a management system for the sharing and utilization of scientific data, so that the data can be reused efficiently, is essential to creative research and development. Such a system should be established at the national level, but compared with those of Europe, Australia, the United States, and China, Korea's management system lacks linkage, efficiency, and internal stability. Australia, the United States, and China continue to expand mid- and long-term policy making, legislation, and investment in infrastructure so as to promote the utilization of data, including the collection, management, and maintenance of scientific data through relevant agencies at the national level. This study examines the legislation and management policies of these countries in order to propose a management system and legal framework for the efficient and fair sharing and utilization of scientific data, and to inform Korea's future legislation and policies on scientific data.


The Performance Analysis of Equalizer for Next Generation W-LAN with OFDM System (OFDM 방식의 차세대 무선 LAN 환경에서 등화기의 성능 분석)

  • Han, Kyung-Su;Youn, Hee-Sang
    • Journal of Advanced Navigation Technology
    • /
    • v.6 no.1
    • /
    • pp.44-51
    • /
    • 2002
  • This paper evaluates and analyzes an Orthogonal Frequency-Division Multiplexing (OFDM) system that minimizes Inter-Symbol Interference (ISI) in a multi-path fading channel environment. Wireless Local Area Networks (W-LANs) conforming to IEEE 802.11a and IEEE 802.11b provide high-speed transmission to universities, businesses, and various other places, and service providers can also offer public W-LAN service in restricted areas such as subways. The proliferation of W-LANs has increased service demand, but problems in delivering good service are also on the rise. In particular, urban areas with high radio-wave interference and many buildings suffer degraded QoS, including disconnections and data errors. When high-speed data is transmitted in such areas, the relatively high frequency generates ISI between Access Points (APs) and mobile terminals (such as notebook computers), producing a frequency-selective fading channel in which good W-LAN service is difficult to achieve. The simulation shows that the OFDM system enables W-LANs to maintain QoS for high-speed data transmission in a multi-path fading channel environment. The performance of OFDM with 52 sub-carriers is verified for the BPSK, QPSK, and 16QAM data modulation methods of IEEE 802.11a, using punctured convolutional codes with code rates of 1/2 and 3/4 and a constraint length of 7. In particular, the simulation finds that applying a single-tap equalizer and a decision-feedback equalizer in a mobile channel environment with heavy fading yields better performance, with no data disconnection even while moving. Given these results, the OFDM system is well suited to guaranteeing W-LAN QoS in a high-speed mobile environment.

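The single-tap equalizer evaluated above exploits the fact that, after the receiver's FFT, a multipath channel reduces to one complex gain per sub-carrier. A minimal Python/NumPy sketch of this idea (the two-tap toy channel and random QPSK symbols are illustrative assumptions, not the paper's simulation setup):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 52  # sub-carriers, as in IEEE 802.11a

# Random QPSK symbols, one per sub-carrier.
bits = rng.integers(0, 2, size=(N, 2))
tx = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# OFDM modulation: IFFT to the time domain.
time_sig = np.fft.ifft(tx)

# Toy two-tap multipath channel (frequency-selective fading). With a
# sufficiently long cyclic prefix, the channel acts as a circular
# convolution, modeled here directly in the frequency domain.
h = np.array([1.0, 0.4 + 0.3j])
H = np.fft.fft(h, N)                      # per-sub-carrier channel gains
rx = np.fft.ifft(np.fft.fft(time_sig) * H)

# Receiver: FFT, then the single-tap equalizer -- one complex division
# per sub-carrier undoes the frequency-selective fading.
equalized = np.fft.fft(rx) / H
```

The appeal of the single-tap design is exactly this: equalization costs one complex multiply per sub-carrier, regardless of the channel's delay spread.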

A Data Hiding Scheme for Binary Image Authentication with Small Image Distortion (이미지 왜곡을 줄인 이진 이미지 인증을 위한 정보 은닉 기법)

  • Lee, Youn-Ho;Kim, Byoung-Ho
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.36 no.2
    • /
    • pp.73-86
    • /
    • 2009
  • This paper proposes a new data hiding scheme for binary image authentication that minimizes distortion of the host image. Based on a Hamming-code-based data embedding algorithm, the proposed scheme embeds authentication information into the host image while flipping only a small number of pixels. To minimize visual distortion, it modifies only the values of flippable pixels selected according to Yang et al.'s flippability criteria. In addition, by randomly shuffling the bit order of the authentication information before embedding, only the designated receiver, who holds the secret key used for embedding, can extract the embedded data. To show the superiority of the proposed scheme, two metrics, the miss detection rate and the number of pixels flipped by embedding, are used to compare it with previous schemes. The analysis shows that the proposed scheme flips fewer pixels than previous schemes to embed authentication information of the same bit length, and the experimental results show that it causes less visual distortion and is more resilient against recent steganalysis attacks.
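The Hamming-code-based embedding the scheme builds on is classic matrix embedding: using the (7,4) Hamming parity-check matrix, 3 message bits can be hidden in a group of 7 pixels while flipping at most one of them. A minimal sketch of that idea (the pixel values and message bits are illustrative; the flippable-pixel selection and bit shuffling from the paper are omitted):

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column i is the
# binary representation of i+1, least-significant bit in row 0.
H = np.array([[(i + 1) >> b & 1 for i in range(7)] for b in range(3)])

def embed(cover, msg):
    """Embed 3 message bits into 7 cover pixels, flipping at most one.

    The syndrome mismatch s directly names the position to flip,
    because column j of H is the binary encoding of j+1."""
    s = (H @ cover + msg) % 2
    pos = int(s[0] + 2 * s[1] + 4 * s[2])
    stego = cover.copy()
    if pos:                      # pos == 0 means the syndrome already matches
        stego[pos - 1] ^= 1
    return stego

def extract(stego):
    """Recover the 3 embedded bits as the syndrome of the stego group."""
    return H @ stego % 2

cover = np.array([1, 0, 1, 1, 0, 0, 1])
msg = np.array([0, 1, 1])
stego = embed(cover, msg)
```

This is what keeps the flip count low: on average well under one pixel change per 3 embedded bits, which is the property the paper's comparison metrics measure.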

Cloning of Autoregulator Receptor Gene from Saccharopolyspora erythraea IFO 13426 (Saccharopolyspora erythraea IFO 13426으로부터 Autoregulator Receptor Protein Gene의 Cloning)

  • 김현수;이경화;조재만
    • Microbiology and Biotechnology Letters
    • /
    • v.31 no.2
    • /
    • pp.117-123
    • /
    • 2003
  • To screen for an autoregulator receptor gene from Saccharopolyspora erythraea, PCR was performed with primers designed from the amino acid sequences of autoregulator receptor proteins of known function. The PCR products were subcloned into the BamHI site of pUC19 and transformed into E. coli DH5α. The plasmid isolated from a transformant contained a 120 bp fragment, detected on a 2% gel after BamHI treatment. Sequencing confirmed that the 120 bp insert was the expected internal segment of a gene encoding an autoregulator receptor protein. Southern and colony hybridization of Saccha. erythraea chromosomal DNA were performed with the insert as a probe, and a plasmid (pEsg) carrying a 3.2 kbp SacI DNA fragment from Saccha. erythraea was obtained. The 3.2 kbp SacI fragment was sequenced by dye-terminator sequencing, and the nucleotide sequence was analyzed with the GENETYX-WIN (ver. 3.2) computer program and a DNA database. Frame analysis of the nucleotide sequence revealed a gene encoding an autoregulator receptor protein in a region spanning the KpnI and SalI sites of the 3.2 kbp SacI fragment. The receptor protein, consisting of 205 amino acids, was named EsgR by the authors. In comparison with known autoregulator receptor proteins, EsgR showed more than 30% homology.
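The frame analysis step, locating a protein-coding region within the sequenced fragment, amounts to scanning the reading frames for an open reading frame (ORF). A toy Python sketch of that scan (this is not the GENETYX-WIN analysis; the sequence and minimum length are made up, and only forward frames are checked):

```python
def find_orfs(seq, min_len=30):
    """Scan the three forward reading frames for ATG...stop ORFs.

    Returns (start, end) index pairs, end exclusive, for ORFs of at
    least min_len nucleotides."""
    stops = {"TAA", "TAG", "TGA"}
    orfs = []
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i                      # first start codon in frame
            elif codon in stops and start is not None:
                if i + 3 - start >= min_len:
                    orfs.append((start, i + 3))
                start = None                   # resume scanning after the stop
    return orfs

# Toy sequence: a single short ORF in frame 0.
seq = "ATGAAACCCGGGTTTTAA"
orfs = find_orfs(seq, min_len=9)
```

A real analysis would also scan the reverse complement and translate the ORF to compare against known receptor proteins, as the study did against a DNA database.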

Design and Implementation of a BPEL Engine for Dynamic Function using Aspect-Oriented Programming (동적 기능 추가를 위하여 관점지향 프로그래밍 기법을 이용한 BPEL 엔진의 설계와 구현)

  • Kwak, Dong-Gyu;Choi, Jae-Young
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.37 no.4
    • /
    • pp.205-214
    • /
    • 2010
  • BPEL is a standard workflow language that interacts with Web Services and is used in various applications, but it is difficult to use for applications that require additional functions. This paper presents a system that can add new functions to BPEL based on an aspect-oriented programming (AOP) technique. To this end, we define the JWX document format, an XML-based format that describes new functions, together with the corresponding Java code, to be applied dynamically to a BPEL document. With the AOP technique, new functions can be added to a BPEL workflow without modifying existing programs, which guarantees low coupling between the core and additional requirements. The system extends the B2J BPEL engine with AOP so that it weaves the Java functions described in a JWX document into the BPEL process and executes the result. A BPEL engine with additional functions can therefore be developed easily and at low cost, and can execute functionality, such as a new rule engine, that current BPEL engines do not provide.
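The weaving idea, attaching an additional requirement around an untouched core function, can be sketched outside the Java/BPEL setting with a minimal "around advice" in Python (the function names and the logging aspect are illustrative, not the JWX/B2J API):

```python
import functools

def around(advice):
    """Weave an 'around' advice onto a core function without editing it."""
    def weave(core):
        @functools.wraps(core)
        def woven(*args, **kwargs):
            # The advice receives the core function and decides when
            # (and whether) to proceed with it.
            return advice(core, *args, **kwargs)
        return woven
    return weave

# Core requirement: an existing workflow activity, left untouched.
def invoke_service(name):
    return f"result-from-{name}"

# Additional requirement, described separately (like an aspect in a
# JWX document): log around every invocation.
log = []
def logging_advice(proceed, *args, **kwargs):
    log.append(f"before {args[0]}")
    result = proceed(*args, **kwargs)
    log.append("after")
    return result

# Weaving step: replace the activity with its advised version.
invoke_service = around(logging_advice)(invoke_service)
result = invoke_service("billing")
```

The point mirrored here is the low coupling the paper claims: the core activity and the advice are written independently and only combined by the weaver.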

An Ontology-based Data Variability Processing Method (온톨로지 기반 데이터 가변성 처리 기법)

  • Lim, Yoon-Sun;Kim, Myung
    • Journal of KIISE:Software and Applications
    • /
    • v.37 no.4
    • /
    • pp.239-251
    • /
    • 2010
  • In modern distributed enterprise applications with multilayered architectures, business entities are a kind of crosscutting concern running through the service components that implement business logic in each layer. When business entities are modified, the service components related to them must also be modified so that they can handle the new entity types, even though their functionality remains the same. Our previous paper proposed the DTT (Data Type-Tolerant) component model to efficiently process the variability of business entities, which are data externalized from service components. While the DTT component model removes the direct coupling between service components and business entities, and thus eliminates the need to rewrite service components when business entities are modified, it incurs the burden of implementing the data type converters that mediate between them. To solve this problem, this paper proposes a method that uses an ontology as the shared metadata of both the SCDTs (Self-Contained Data Types) in service components and the business entities, and a method to generate data type converter code from that ontology. This ontology-based DTT component model greatly enhances the reusability of service components and the efficiency of processing data variability by allowing the computer to generate data type converters automatically and without error.
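The key step, generating converters from shared metadata rather than hand-writing them, can be sketched as follows (a toy Python sketch: the field names and the dict-based "entity" stand in for ontology-annotated SCDTs and business entities, and are not the paper's notation):

```python
# Shared metadata (standing in for ontology annotations): each field of
# the service component's self-contained data type is mapped to the
# business-entity field that carries the same concept.
mapping = {"cust_name": "customerName", "cust_id": "customerId"}

def make_converter(field_map):
    """Generate a dict-to-dict converter from the shared metadata.

    When the business entity changes shape, only the metadata is
    updated; the service component and this generator stay untouched."""
    def convert(entity):
        return {internal: entity[external]
                for internal, external in field_map.items()}
    return convert

to_scdt = make_converter(mapping)
entity = {"customerName": "Kim", "customerId": 42}
scdt = to_scdt(entity)
```

The generated-converter approach is what removes the hand-coding burden the abstract mentions: the converter is derived mechanically, so it cannot drift out of sync with the metadata.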

An On-chip Cache and Main Memory Compression System Optimized by Considering the Compression Rate Distribution of Compressed Blocks (압축블록의 압축률 분포를 고려해 설계한 내장캐시 및 주 메모리 압축시스템)

  • Yim, Keun-Soo;Lee, Jang-Soo;Hong, In-Pyo;Kim, Ji-Hong;Kim, Shin-Dug;Lee, Yong-Surk;Koh, Kern
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.31 no.1_2
    • /
    • pp.125-134
    • /
    • 2004
  • Recently, an on-chip compressed cache system was presented to alleviate the processor-memory performance gap by reducing the on-chip cache miss rate and expanding memory bandwidth. This research presents an extended on-chip compressed cache system that also significantly expands main memory capacity. Several techniques are employed to expand main memory capacity, on-chip cache capacity, and memory bandwidth, as well as to reduce decompression time and metadata size. To evaluate the performance of the proposed system against existing systems, we use an execution-driven simulation method built by modifying a superscalar microprocessor simulator; this methodology is more accurate than the previous trace-driven simulation method. The simulation results show that the proposed system reduces execution time by 4-23% compared with a conventional memory system, without counting the benefits of main memory expansion. The expansion rates of the data and code areas of main memory are 57-120% and 27-36%, respectively.
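The reported expansion rates follow directly from the distribution of per-block compressed sizes. A small Python sketch of that calculation (the block sizes below are hypothetical, chosen only to mirror the observation that data blocks compress better than code blocks):

```python
def expansion_rate(compressed_sizes, block_size=64):
    """Percent extra capacity gained by storing blocks compressed.

    E.g. if N blocks of block_size bytes fit into half the space once
    compressed, the same physical memory holds ~100% more blocks."""
    compressed = sum(compressed_sizes)
    uncompressed = len(compressed_sizes) * block_size
    return (uncompressed - compressed) / compressed * 100

# Hypothetical per-block compressed sizes (bytes), 64-byte blocks.
data_blocks = [28, 32, 30, 36]   # data tends to compress well
code_blocks = [48, 52, 50, 46]   # code compresses less
data_gain = expansion_rate(data_blocks)
code_gain = expansion_rate(code_blocks)
```

This is why the paper reports separate rates for data and code areas: the achievable expansion is set entirely by each area's compression-rate distribution.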

Automated Schedulability-Aware Mapping of Real-Time Object-Oriented Models to Multi-Threaded Implementations (실시간 객체 모델의 다중 스레드 구현으로의 스케줄링을 고려한 자동화된 변환)

  • Hong, Sung-Soo
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.8 no.2
    • /
    • pp.174-182
    • /
    • 2002
  • Object-oriented design methods and their CASE tools are widely used in practice by many real-time software developers. However, object-oriented CASE tools require an additional step of identifying tasks from a given design model. Unfortunately, this step is difficult to automate for two reasons: (1) there are inherent discrepancies between objects and tasks; and (2) deriving tasks while maximizing real-time schedulability is a non-trivial optimization problem. As a result, in practical object-oriented CASE tools, task identification is usually performed in an ad-hoc manner using hints provided by human designers. In this paper, we present a systematic, schedulability-aware approach that helps map real-time object-oriented models to multi-threaded implementations. In our approach, a task contains a group of mutually exclusive transactions that may possess different periods and deadlines, and we provide a new schedulability analysis algorithm for this task model. We also show how the run-time system is implemented and how executable code is generated in our framework. A case study demonstrates the difficulty of the task derivation problem, the utility of automated synthesis of implementations, and the inappropriateness of single-threaded implementations.

Verification and Implementation of a Service Bundle Authentication Mechanism in the OSGi Service Platform Environment (OSGi 서비스 플랫폼 환경에서 서비스 번들 인증 메커니즘의 검증 및 구현)

  • 김영갑;문창주;박대하;백두권
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.31 no.1_2
    • /
    • pp.27-40
    • /
    • 2004
  • The OSGi service platform has several characteristics. First, a service is deployed in the form of a self-installable component called a service bundle. Second, a service is dynamic according to its life cycle and interacts with other services. Third, the system resources of a home gateway are restricted. Because of these characteristics, there is considerable room for malicious services to be installed and for the nature of a service to be changed, and such service bundles can adversely affect service gateways and users. However, there has been no service bundle authentication mechanism that considers these characteristics of the home gateway. In this paper, we propose such a mechanism for the home gateway environment. We design a key sharing mechanism that safely transports a service bundle during the bootstrapping step, which recognizes and initializes devices, and we propose a service bundle authentication mechanism based on a MAC that uses the shared secret created in that step. We also verify the safety of the key sharing and service bundle authentication mechanisms using BAN logic. The proposed mechanism is more efficient than PKI-based service bundle authentication or the RSH protocol on a service platform with restricted resources such as storage space and processing power.
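A MAC-based bundle check of this kind can be sketched with an HMAC over the bundle bytes, keyed by the secret shared during bootstrapping (the key and bundle contents below are placeholders, and HMAC-SHA-256 is an assumed construction; the paper's exact MAC may differ):

```python
import hashlib
import hmac

def sign_bundle(shared_secret, bundle_bytes):
    """MAC a service bundle with the secret shared at bootstrap.

    Symmetric MACs need only a hash computation, which is far cheaper
    than PKI signature verification on a resource-limited gateway."""
    return hmac.new(shared_secret, bundle_bytes, hashlib.sha256).digest()

def verify_bundle(shared_secret, bundle_bytes, tag):
    """Constant-time check that the bundle was MACed with our secret."""
    return hmac.compare_digest(sign_bundle(shared_secret, bundle_bytes), tag)

secret = b"bootstrap-shared-secret"       # hypothetical bootstrapped key
bundle = b"service-bundle.jar contents"   # placeholder bundle payload
tag = sign_bundle(secret, bundle)
```

Any tampering with the bundle in transit changes the MAC, so the gateway rejects modified or unauthorized bundles before installation.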

A Study on Photon Characteristics Generated from Target of Electron Linear Accelerator for Container Security Inspection using MCNP6 Code (MCNP6 코드를 이용한 컨테이너 보안 검색용 전자 선형가속기 표적에서 발생한 광자 평가에 관한 연구)

  • Lee, Chang-Ho;Kim, Jang-Oh;Lee, Yoon-Ji;Jeon, Chan-hee;Lee, Ji-Eun;Min, Byung-In
    • Journal of the Korean Society of Radiology
    • /
    • v.14 no.3
    • /
    • pp.193-201
    • /
    • 2020
  • The purpose of this study is to evaluate the photon characteristics produced by electrons incident from a linear accelerator, according to target material and thickness. The computer simulation models two linear accelerator targets: a single-material target of 2 mm thick tungsten, and a composite target of tungsten and copper, 1.8 mm and 2.3 mm thick, respectively. First, the behavior of the primary particles in the target was evaluated by electron fluence and electron energy deposition. Second, the photons generated within the target were evaluated by photon fluence. Finally, the photon angle-energy distribution at a distance of 1 m from the target was evaluated by photon fluence. The results were as follows. First, for both the single-material and composite targets, the primary electrons were not released from the target, and the electron fluence decreased linearly with target thickness. Second, the composite target produced more photons than the single-material target, confirming that material composition and thickness influence photon production. Finally, the photon fluence as a function of angular distribution, required for shielding analysis, was calculated. These results confirm that the photon generation rate differs depending on the material and thickness of the linear accelerator target. This study can therefore inform the design and operation of linear accelerator facilities for the container security screening being introduced in Korea, and may serve as basic data for radiation protection.
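The shielding analysis that the angular photon fluence feeds into rests on narrow-beam exponential attenuation, I/I0 = exp(-μx). A one-function Python sketch of that background relation (the attenuation coefficient below is a made-up illustration, not a value from the study or from MCNP6):

```python
import math

def transmitted_fraction(mu_per_cm, thickness_cm):
    """Narrow-beam photon attenuation: I/I0 = exp(-mu * x).

    mu_per_cm is the linear attenuation coefficient of the shield
    material at the photon energy of interest; thickness_cm is the
    shield thickness along the beam direction."""
    return math.exp(-mu_per_cm * thickness_cm)

# Hypothetical coefficient for illustration only.
mu = 0.5   # cm^-1
frac = transmitted_fraction(mu, 2.0)
```

Because μ depends on photon energy, the angle-energy fluence distribution computed in the study is exactly what a shield designer needs to evaluate this relation direction by direction.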