• Title/Summary/Keyword: Basic Design System

An Emulation System for Efficient Verification of ASIC Design (ASIC 설계의 효과적인 검증을 위한 에뮬레이션 시스템)

  • 유광기;정정화
    • Journal of the Korean Institute of Telematics and Electronics C / v.36C no.10 / pp.17-28 / 1999
  • In this paper, an ASIC emulation system called ACE (ASIC Emulator) is proposed. It can produce a prototype of the target ASIC in a short time and verify the function of the ASIC circuit immediately. ACE consists of emulation software, comprising an EDIF reader, library translator, technology mapper, circuit partitioner, and LDF generator, and emulation hardware, including an emulation board and a logic analyzer. Technology mapping consists of three steps: circuit partitioning and extraction of logic functions, minimization of logic functions, and grouping of logic functions. During these procedures, the number of basic logic blocks and the maximum number of levels are minimized by assigning outputs that share product terms and input variables to the same block as much as possible. The circuit partitioner obtains chip-level netlists satisfying constraints on the routing structure of the emulation board as well as on the architecture of the FPGA chip. A new partitioning algorithm is proposed whose objective function is the minimization of the number of interconnections among FPGA chips and among groups of FPGA chips (a small illustrative sketch of this objective follows this entry). The routing structure of the emulation board combines the advantages of the complete-graph and partial-crossbar structures in order to minimize the interconnection delay between FPGA chips regardless of circuit size. The logic analyzer displays on a PC monitor the waveforms of probing signals designated by the user. To evaluate the performance of the proposed emulation system, a video quad-splitter, a commercial ASIC, was implemented on the emulation board. Experimental results show that it operates in real time at 14.3 MHz and functions correctly.
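
The partitioning objective described above can be pictured with a small, hypothetical sketch: count how many nets span more than one FPGA chip for a given assignment of netlist cells to chips; a partitioner searches for assignments that minimize this count. The data structures and function names below are assumptions for illustration, not the paper's implementation.

```python
def interchip_nets(net_to_cells, cell_to_chip):
    """Count nets that span more than one FPGA chip.

    net_to_cells : dict mapping net name -> iterable of cell names on that net
    cell_to_chip : dict mapping cell name -> FPGA chip id it is assigned to
    Returns the number of nets crossing chip boundaries, i.e. the quantity a
    partitioner like the one described above would try to minimize.
    """
    crossing = 0
    for net, cells in net_to_cells.items():
        chips = {cell_to_chip[c] for c in cells}
        if len(chips) > 1:
            crossing += 1
    return crossing

# Toy usage: one net stays inside chip 0, one net crosses chips 0 and 1.
nets = {"n1": ["u1", "u2"], "n2": ["u2", "u3"]}
assignment = {"u1": 0, "u2": 0, "u3": 1}
print(interchip_nets(nets, assignment))  # -> 1
```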

The Study on Design of lead monoxide based radiation detector for Checking the Position of a Radioactive Source in an NDT (비파괴검사 분야에서 방사선원의 위치 확인을 위한 산화납 기반 방사선 검출기 설계에 관한 연구)

  • Ahn, Ki-Jung
    • Journal of the Korean Society of Radiology / v.11 no.4 / pp.183-188 / 2017
  • In recent years, malfunctions of the automatic remote controller of gamma-ray irradiators have repeatedly led to radiation exposure accidents among radiation workers. In the non-destructive testing (NDT) field, considerable time and resources are invested in establishing radioactive source monitoring systems in order to prevent such incidents. In this study, the gamma-ray response properties of a lead monoxide-based radiation detector were estimated through Monte Carlo simulation, as a preliminary study for the development of a radioactive source location monitoring system that can be applied universally to various NDT equipment. The results show that the optimal thickness of the radiation detector varies with the gamma-ray energy emitted from the radioactive source, gradually increasing with increasing energy. The optimal thickness of the lead monoxide-based detector was 200 μm for Ir-192, 150 μm for Se-75, and 300 μm for Co-60. Based on these results, and considering secondary-electron equilibrium, an appropriate thickness of 300 μm was recommended for general application. These results can serve as basic data for determining the detector thickness required when developing a radiation source location monitoring system for universal application to various NDT equipment (a hedged selection sketch follows this entry).
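
As a purely illustrative sketch, not the paper's Monte Carlo workflow, the snippet below shows one common way an "optimal" thickness could be read off a simulated response curve: take the smallest thickness whose response reaches a given fraction of the saturated value. The data values and the 95% criterion are assumptions.

```python
def optimal_thickness(response_by_thickness, saturation_fraction=0.95):
    """Pick the smallest thickness whose simulated response reaches
    `saturation_fraction` of the maximum response.

    response_by_thickness : dict mapping thickness in micrometres -> simulated
    detector response (arbitrary units). All values used here are made up.
    """
    max_response = max(response_by_thickness.values())
    for thickness in sorted(response_by_thickness):
        if response_by_thickness[thickness] >= saturation_fraction * max_response:
            return thickness
    return max(response_by_thickness)

# Hypothetical response curve for a single gamma-ray energy.
simulated = {50: 0.40, 100: 0.70, 150: 0.88, 200: 0.96, 250: 0.99, 300: 1.00}
print(optimal_thickness(simulated))  # -> 200
```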

Bacterial Hash Function Using DNA-Based XOR Logic Reveals Unexpected Behavior of the LuxR Promoter

  • Pearson, Brianna;Lau, Kin H.;Allen, Alicia;Barron, James;Cool, Robert;Davis, Kelly;DeLoache, Will;Feeney, Erin;Gordon, Andrew;Igo, John;Lewis, Aaron;Muscalino, Kristi;Parra, Madeline;Penumetcha, Pallavi;Rinker, Victoria G.;Roland, Karlesha;Zhu, Xiao;Poet, Jeffrey L.;Eckdahl, Todd T.;Heyer, Laurie J.;Campbell, A. Malcolm
    • Interdisciplinary Bio Central / v.3 no.3 / pp.10.1-10.8 / 2011
  • Introduction: Hash functions are computer algorithms that protect information and secure transactions. In response to NIST's "International Call for Hash Function", we developed a biological hash function using the computing capabilities of bacteria. We designed a DNA-based XOR logic gate that allows bacterial colonies arranged in a series on an agar plate to perform hash function calculations (a toy XOR-chain sketch follows this entry). Results and Discussion: In order to provide each colony with adequate time to process inputs and perform XOR logic, we designed and successfully demonstrated a system for time-delayed bacterial growth. Our system is based on the diffusion of β-lactamase, resulting in destruction of ampicillin. Our DNA-based XOR logic gate design is based on the opposition of two promoters. Our results showed that P_lux and P_OmpC functioned as expected individually, but P_lux did not behave as expected in the XOR construct. Our data showed that, contrary to literature reports, the P_lux promoter is bidirectional. In the absence of the 3OC6 inducer, the LuxR activator can bind to the P_lux promoter and induce backwards transcription. Conclusion and Prospects: Our system of time-delayed bacterial growth allows for the successive processing of a bacterial hash function and is expected to have utility in other synthetic biology applications. While testing our DNA-based XOR logic gate, we uncovered a novel function of P_lux: in the absence of the autoinducer 3OC6, LuxR binds to P_lux and activates backwards transcription. This result advances basic research and has important implications for the widespread use of the P_lux promoter.
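
The following is a toy software analogue, assumed for illustration only, of the XOR-chain idea behind such a hash: each stage XORs the running state with its input bit, so the final state depends on every input, which is the property the colony series exploits. None of this code models the biology.

```python
def xor_chain_hash(bits):
    """Fold a sequence of input bits through successive XOR stages.

    Each 'stage' plays the role of one colony in the series: it combines the
    state handed to it with its own input bit and passes the result on.
    """
    state = 0
    for bit in bits:
        state ^= bit  # one XOR 'gate' per stage
    return state

print(xor_chain_hash([1, 0, 1, 1]))  # -> 1 (the parity of the inputs)
```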

A Study of Information Update and Framework for Integrated Maintenance and Operation of River Facilities (하천시설 통합 유지운영을 위한 정보 현행화 및 프레임워크 구축방향 연구)

  • Nam, Jeong-Yong;Kim, Min-Jeong;Jo, Chan-Won
    • Journal of the Korea Academia-Industrial cooperation Society / v.18 no.12 / pp.140-149 / 2017
  • Recently, it has become necessary to consider climate change when managing multi-purpose river functions. In Korea, however, the management of national and local rivers is separated, so river information cannot be handled in an integrated way. In particular, collecting and updating information by reusing design and construction reports is insufficient. In addition, basic river information depends on the GIS-based RIMGIS system, but its reliability is deteriorating because the spatial information is built from basic river plan results. The purpose of this study is to investigate the current status of the information systems used for the maintenance and operation of river facilities. Through the examination of actual cases, an optimal approach is suggested from the viewpoint of practical information management. As a result, we outlined an information system for the reliable maintenance of river facilities and examined an integrated information management plan. The results of this study can be used to improve existing information as well as the technical and institutional procedures for the integrated maintenance and operation of river facilities. Establishing such an information framework will also help introduce BIM and close the information gap with other fields by improving the construction of river-area information.

Developing a General Recycling Method of FRP Boats (FRP선박의 범용 재활용을 위한 재처리시스템의 연구)

  • Yoon, Koo-Young
    • Journal of the Korean Society for Marine Environment & Energy / v.12 no.1 / pp.29-34 / 2009
  • For several decades, many researchers have worked on developing recycling methods for FRP boats. Four basic classes of recycling are covered in the literature. Despite environmental problems (safety hazards), mechanical recycling of FRP boats, which involves shredding and grinding the scrap FRP, is simpler and more technically proven than incineration, reclamation, or chemical methods. Because FRP is made of glass-fiber-reinforced plastic, it is very difficult to break into pieces, which leads to secondary problems in the recycling process, such as air pollution and unacceptable shredding noise levels. Another serious problem of mechanical FRP recycling is the very limited range of reusable applications for the residue. This study proposes a new, efficient, and more environmentally friendly waste-FRP regeneration system with a wider range of applications. The new system adds a cyclone sorting machine to handle airborne pollutants and a modified cutting system that produces glass-fiber chips in several sizes. It is also shown that FRP-chip fiber-reinforced concrete and fiber-reinforced secondary concrete products made from waste FRP boats are more suitable applications than those of existing recycling methods.

A Study on Development of GenBank-based Prototype System for Linking Heterogeneous Content (GenBank를 활용한 이종의 콘텐트 연계 프로토타입 시스템 개발 연구)

  • Ahn, Bu-Young;Shin, Young-Ju;Kim, Dea-Hwan
    • Journal of Information Management / v.40 no.4 / pp.109-133 / 2009
  • Among biological databases, GenBank, provided by the National Center for Biotechnology Information (NCBI) of the United States, is the representative database of genetic information and the one most widely used by researchers around the world. The Korea Institute of Science and Technology Information (KISTI) regularly downloads the latest GenBank release from NCBI and reorganizes it into a database, which is provided to Korean science and technology researchers through the Bio-KRISTAL search engine developed by KISTI. This study aims to design a service model that links GenBank's reference and organism fields with papers, patents, biodiversity records, and other content of NDSL, the integrated scientific and technological information service run by KISTI, and to develop a prototype system. To this end, the paper explores the possibility of a linkage and convergence service between heterogeneous content by: (a) collecting GenBank data from NCBI's FTP site; (b) dividing GenBank text files into basic and reference genetic information and restructuring them into a database; (c) extracting article and patent information from the GenBank reference fields to generate new tables (a hedged parsing sketch follows this entry); and (d) leveraging data-mapping technology to implement a prototype system in which GenBank and NDSL data are interlinked.
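
As a rough sketch of step (c) only, the snippet below uses Biopython (an assumption; the paper does not name its tooling) to pull PubMed IDs and titles from the REFERENCE fields of a GenBank flat file. Rows like these are the kind of key one could join against an article database such as NDSL.

```python
from Bio import SeqIO  # Biopython; assumed here, not named in the paper

def extract_reference_links(genbank_path):
    """Yield (accession, pubmed_id, title) rows from a GenBank flat file.

    These rows could be loaded into a table and joined against an external
    article database by PubMed ID.
    """
    for record in SeqIO.parse(genbank_path, "genbank"):
        for ref in record.annotations.get("references", []):
            if ref.pubmed_id:  # skip references without a PubMed link
                yield record.id, ref.pubmed_id, ref.title

# Example usage with a hypothetical downloaded division file:
# for row in extract_reference_links("gbbct1.seq"):
#     print(row)
```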

Design of T-DMB Automatic Emergency Alert Service Standard : Part 1 Requirements Analysis (지상파 DMB 자동재난경보방송표준 설계 : Part 1 요구사항 분석)

  • Choi, Seong-Jong;Kwon, Dae-Bok;Kim, Jae-Yeon;Oh, Keon-Sik;Chang, Tae-Uk;Hahm, Young-Kwon
    • Journal of Broadcast Engineering / v.12 no.3 / pp.230-241 / 2007
  • This paper presents the requirements analysis for the Terrestrial DMB Automatic Emergency Alert Service (AEAS) standard. First, the basic concepts of disaster management and the AEAS system structure are presented as background. Next, other emergency alert systems and their related standards are analyzed: we propose a taxonomy for categorizing emergency alert systems and analyze the characteristics of each system. We then analyze the advantages of T-DMB as a delivery medium for emergency alert messages and the problems that must be resolved for better performance. Finally, we propose service requirements for a general/special-purpose, non-interrupting, location-adaptive, automatic message delivery service. The paper can serve as a guideline for the development of emergency alert service standards for other broadcasting media.

Performance Optimization Strategies for Fully Utilizing Apache Spark (아파치 스파크 활용 극대화를 위한 성능 최적화 기법)

  • Myung, Rohyoung;Yu, Heonchang;Choi, Sukyong
    • KIPS Transactions on Computer and Communication Systems / v.7 no.1 / pp.9-18 / 2018
  • Enhancing the performance of big data analytics in distributed environments has become an important issue because most big-data applications, such as machine learning and streaming services, run on distributed computing frameworks; accordingly, optimizing the performance of such applications on Spark has been actively researched. Optimization in a distributed environment is challenging because it requires not only optimizing the applications themselves but also tuning the distributed system's configuration parameters. Although prior research has made great efforts to improve execution performance, most of it focused on only one of three optimization aspects, application design, system tuning, or hardware utilization, and therefore could not orchestrate all of them. In this paper, we analyze and model in depth how Spark processes an application. Based on this analysis, we propose performance optimization schemes for each step of the procedure, at the inner-stage and outer-stage levels. We also propose an appropriate partitioning mechanism by analyzing the relationship between partitioning parallelism and application performance (an illustrative configuration sketch follows this entry). Applying these optimization schemes to WordCount, PageRank, and K-means, which are basic big-data analytics workloads, yielded nearly 50% performance improvement when all schemes were applied together.
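
A minimal PySpark sketch, assuming nothing about the paper's actual parameter values, of the two tuning knobs the abstract alludes to: the default shuffle/parallelism settings and explicit control of partition count before a wide operation. The figure of three partitions per core is a common rule of thumb, not the authors' result, and the file paths are hypothetical.

```python
from operator import add
from pyspark.sql import SparkSession

# Illustrative values only: ~3 partitions per available core is a common
# rule of thumb for shuffle parallelism, not a result from the paper.
cores = 8
spark = (SparkSession.builder
         .appName("wordcount-tuning-sketch")
         .config("spark.default.parallelism", cores * 3)
         .config("spark.sql.shuffle.partitions", cores * 3)
         .getOrCreate())
sc = spark.sparkContext

lines = sc.textFile("hdfs:///data/sample.txt")  # hypothetical input path
counts = (lines.flatMap(lambda line: line.split())
               .map(lambda word: (word, 1))
               .reduceByKey(add, numPartitions=cores * 3))  # control shuffle width
counts.saveAsTextFile("hdfs:///data/wordcount-out")  # hypothetical output path

spark.stop()
```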

Design of Compound Knowledge Repository for Recommendation System (추천시스템을 위한 복합지식저장소 설계)

  • Han, Jung-Soo;Kim, Gui-Jung
    • Journal of Digital Convergence / v.10 no.11 / pp.427-432 / 2012
  • This article proposes a compound knowledge repository and a description method for developing a compound knowledge process. The data stored in the proposed repository comprise all compound knowledge metadata and digital resources, which can be divided into three factors according to purpose: user roles, functional elements, and service ranges. These three factors are the basic components for describing abstract models of the repository. The metadata of compound knowledge are defined in terms of two parts: a component, which describes the properties of the agent, activity unit, or resource that uses and creates knowledge, and a context, which describes the setting in which knowledge objects are used (an illustrative data-structure sketch follows this entry). The agent of the compound knowledge process performs classification, registration, and pattern-information management of compound knowledge, and handles data flow and processing between the repository and the user. It provides the following functions: alerting, to signal data search and extraction; data collection and output, for data exchange in a distributed environment; storage and registration of data; and request and transmission, to retrieve the physical material identified by a metadata search. The compound knowledge repository designed here for the recommendation system can enhance learning productivity through real-time visualization of timely knowledge, by presenting well-organized, varied content to users in industrial settings where work and learning occur at the same time.
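
A small, hypothetical sketch of how the component/context split described above might be modeled as metadata records; the field names and values are assumptions, not the paper's schema.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    """Properties of the agent, activity unit, or resource that uses or
    creates the knowledge object (field names are illustrative)."""
    agent: str            # e.g. a user role such as "engineer" or "learner"
    activity: str         # the activity unit the knowledge supports
    resource_uri: str     # location of the digital resource

@dataclass
class Context:
    """The setting in which the knowledge object is used."""
    service_range: str    # e.g. "workplace training", "field maintenance"
    functional_element: str

@dataclass
class CompoundKnowledge:
    knowledge_id: str
    component: Component
    context: Context
    keywords: list = field(default_factory=list)

record = CompoundKnowledge(
    knowledge_id="ck-0001",
    component=Component("engineer", "equipment inspection", "repo://docs/ck-0001"),
    context=Context("workplace training", "visualization"),
    keywords=["recommendation", "compound knowledge"],
)
print(record.knowledge_id, record.context.service_range)
```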

The Study on Speaker Change Verification Using SNR based weighted KL distance (SNR 기반 가중 KL 거리를 활용한 화자 변화 검증에 관한 연구)

  • Cho, Joon-Beom;Lee, Ji-eun;Lee, Kyong-Rok
    • Journal of Convergence for Information Technology / v.7 no.6 / pp.159-166 / 2017
  • In this paper, we conducted experiments to improve the verification performance of speaker change detection on broadcast news, by enhancing the noisy input speech and by applying a KL distance D_s weighted with an SNR-based weighting function w_m. The baseline is a speaker change verification system using a GMM-UBM-based KL distance D (Experiment 0). Experiment 1 adds enhancement of the noisy input speech using MMSE Log-STSA. Experiment 2 applies the new KL distance D_s on top of the system of Experiment 1. All experiments were conducted under the condition of 0% MDR so that no speaker change information is missed. The FAR of Experiment 0 was 71.5%. The FAR of Experiment 1 was 67.3%, an improvement of 4.2 percentage points over Experiment 0, and the FAR of Experiment 2 was 60.7%, an improvement of 10.8 percentage points over Experiment 0 (an illustrative distance sketch follows this entry).
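
As a hedged illustration only (the abstract does not give the exact definitions of D_s and w_m), the sketch below computes a symmetric KL distance between two diagonal-covariance Gaussians and combines the per-dimension terms with a weight derived from SNR, which is the general shape of an SNR-weighted KL measure; the weighting formula here is an assumption, not the paper's.

```python
import numpy as np

def weighted_symmetric_kl(mu1, var1, mu2, var2, snr_db):
    """Per-dimension symmetric KL distance between two diagonal Gaussians,
    combined with a weight derived from the per-dimension SNR.

    The weighting (a softmax-like normalization over SNR in dB) is an
    illustrative assumption, not the w_m defined in the paper.
    """
    mu1, var1, mu2, var2, snr_db = map(np.asarray, (mu1, var1, mu2, var2, snr_db))
    kl_pq = 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)
    kl_qp = 0.5 * (np.log(var1 / var2) + (var2 + (mu2 - mu1) ** 2) / var1 - 1.0)
    d_m = kl_pq + kl_qp                      # symmetric KL per dimension
    w_m = np.exp(snr_db / 10.0)
    w_m = w_m / w_m.sum()                    # emphasize high-SNR dimensions
    return float(np.sum(w_m * d_m))

# Toy example with 3 feature dimensions.
print(weighted_symmetric_kl([0.0, 1.0, -0.5], [1.0, 2.0, 0.5],
                            [0.2, 0.8, -0.4], [1.1, 1.8, 0.6],
                            snr_db=[15.0, 5.0, -5.0]))
```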