• Title/Summary/Keyword: Data-processing program


Korea Standard Earthquake Data Format and Analyst Program for Basic Data Processing (한국지진표준 format 제안 및 기본자료처리용 프로그램)

  • 지현철
    • Proceedings of the Earthquake Engineering Society of Korea Conference
    • /
    • 2000.04a
    • /
    • pp.36-43
    • /
    • 2000
  • Many formats are used for recording and processing data in earthquake seismology and earthquake engineering research. Programming the reading and writing routines for data processing is difficult because these file formats differ widely from one another. A new file format, the Korea Standard Earthquake Data (KSED) format, is proposed in two types, ASCII and binary, that can be read and written easily. A basic data-processing package (Analyst) is developed that provides filtering, spectrum analysis, event gathering, phase picking, and event location. In addition, the program supports conversion from other formats (MiniSEED, OMD, K2) to the KSED format. A minimal illustrative sketch of such a format conversion follows this entry.

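The KSED record layout is not given in the abstract, so the following Python sketch only illustrates the kind of ASCII-to-ASCII conversion the Analyst package performs; the field names, file layout, and function names are hypothetical stand-ins, not the actual KSED or K2 definitions.

```python
# Hypothetical sketch of converting a simple ASCII waveform record into a
# KSED-style ASCII file; every field name and the layout are illustrative only.

def read_simple_ascii(path):
    """Read a toy ASCII record: 'KEY value' header lines, then one sample per line."""
    header, samples = {}, []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if parts and parts[0][0].isalpha():
                header[parts[0]] = " ".join(parts[1:])   # e.g. STATION, SAMPLING_RATE
            elif parts:
                samples.append(float(parts[0]))          # waveform amplitude
    return header, samples

def write_ksed_style_ascii(path, header, samples):
    """Write the record in a hypothetical KSED-style ASCII layout."""
    with open(path, "w") as f:
        f.write("KSED ASCII 1.0\n")
        for key in ("STATION", "CHANNEL", "START_TIME", "SAMPLING_RATE"):
            f.write(f"{key} {header.get(key, 'NA')}\n")
        f.write(f"NSAMPLES {len(samples)}\n")
        for value in samples:
            f.write(f"{value:.6e}\n")

# Usage (hypothetical file names):
# header, data = read_simple_ascii("event_k2.txt")
# write_ksed_style_ascii("event.ksed", header, data)
```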

DEVELOPMENT OF A COMPUTER PROGRAM FOR ASTRONOMICAL IMAGE DATA PROCESSING BY OBSERVATIONAL EQUIPMENT IN ASTRONOMICAL OBSERVATORY OF KYUNG HEE UNIVERSITY (경희대학교 천문대의 천체관측 자료처리용 프로그램 개발)

  • Kim, Gap-Seong
    • Publications of The Korean Astronomical Society
    • /
    • v.10 no.1
    • /
    • pp.135-146
    • /
    • 1995
  • We have developed graphics software for processing astronomical image data obtained with the observational equipment of the Astronomical Observatory of Kyung Hee University. The hardware required to run the program is simply a PC with a graphics card capable of displaying 256 colors and a color graphics monitor, together with a CCD camera system. The software was written for Windows to provide a convenient user environment and applies various image-processing techniques to astronomical image data recorded in FITS format, in a compressed mode, by the KHCCD program (Jin and Kim, 1994). We expect these results to provide a fundamental and useful basis for building data-processing systems and to be usable at other observatories as well as in the data-processing system of Kyung Hee University. A minimal illustrative sketch of reading and rescaling a FITS image follows this entry.

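As a rough modern analogue of the 256-color display path described above, the Python sketch below reads a FITS image with astropy and rescales it to 8-bit gray levels; the file name is hypothetical, and this is not the KHCCD program's own code.

```python
# Minimal sketch: read a FITS image with astropy and rescale it to 256 gray
# levels (8-bit), roughly analogous to displaying data on a 256-color card.
import numpy as np
from astropy.io import fits   # assumes astropy is installed

def fits_to_8bit(path):
    data = fits.getdata(path).astype(np.float64)   # image data of the first HDU
    lo, hi = np.percentile(data, (1.0, 99.0))      # clip the extreme pixels
    scaled = np.clip((data - lo) / max(hi - lo, 1e-12), 0.0, 1.0)
    return (scaled * 255).astype(np.uint8)         # 0..255 display values

# Usage (hypothetical file name):
# img8 = fits_to_8bit("khccd_frame.fits")
```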

Data Structure and Visualization Algorithm in a Post-processing Program (가시화 프로그램에서의 데이터 구조와 가시화 알고리즘)

  • Na J. S.;Kim K. Y.;Kim B. S.
    • Proceedings of the Korean Society of Computational Fluids Engineering Conference
    • /
    • 2003.08a
    • /
    • pp.82-87
    • /
    • 2003
  • Post-processing programs play an important role in CFD data visualization and analysis. A variety of post-processing software packages have been developed and are in use in the CFD community, and developing a high-quality post-processing program requires considerable dedication and effort. This paper presents the experience gained from previous studies and from developing post-processing programs, including the data structures and visualization algorithms involved. A minimal illustrative sketch of such a data structure follows this entry.

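The paper's actual data structures are not reproduced in the abstract; the sketch below is only a small Python illustration of a node-based structured-grid container of the kind a CFD post-processor might hold, with hypothetical field names.

```python
# Illustrative sketch of a minimal node-based structured-grid container for a
# CFD post-processor: node coordinates plus named solution fields.
from dataclasses import dataclass, field
import numpy as np

@dataclass
class StructuredBlock:
    x: np.ndarray                                  # node coordinates, shape (ni, nj, nk)
    y: np.ndarray
    z: np.ndarray
    fields: dict = field(default_factory=dict)     # e.g. {"pressure": array, ...}

    def add_field(self, name, values):
        if values.shape != self.x.shape:
            raise ValueError("field must be defined at every node")
        self.fields[name] = values

# Usage: a 2 x 2 x 2 toy block with a zero pressure field
ni, nj, nk = 2, 2, 2
xx, yy, zz = np.meshgrid(np.arange(ni), np.arange(nj), np.arange(nk), indexing="ij")
block = StructuredBlock(xx.astype(float), yy.astype(float), zz.astype(float))
block.add_field("pressure", np.zeros((ni, nj, nk)))
```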

A Functional Design of Programmable Logic Controller Based on Parallel Architecture (병렬 구조에 의한 가변 논리제어장치의 기능적 설계)

  • 이정훈;신현식
    • The Transactions of the Korean Institute of Electrical Engineers
    • /
    • v.40 no.8
    • /
    • pp.836-844
    • /
    • 1991
  • PLC (programmable logic controller) systems are widely used for factory control. A PLC system receives a ladder diagram drawn by the user to implement hardware logic, converts the ladder diagram into a sequence program executable on the PLC, and executes that sequence program repeatedly until the user stops it. The sequence program processes on/off signal data and tolerates a one-scan delay as well as the loss of pulse-type signals shorter than a scan time, so no data dependency exists between scans. By applying these characteristics to a multiprocessor architecture, we design a parallel PLC functionally and evaluate the resulting performance improvement. The parallel PLC consists of a central processing module, N general processing units, and a shared memory in a master-slave configuration. Each module executes its allocated sequence program under the control of the central processing module. Performance gains can be expected from parallel processing, and reliability gains from relocating the sequence program when an error occurs in a processing module. A minimal illustrative sketch of a scan cycle follows this entry.

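The scan-cycle behavior described above (latch inputs, evaluate the sequence program, write outputs, repeat) can be illustrated with the toy Python loop below; the rung logic and I/O names are hypothetical and do not reproduce the authors' parallel PLC design.

```python
# Toy sketch of a PLC scan cycle: inputs are latched once per scan, so every
# rung sees a consistent input image and output changes take effect one scan later.
def scan_cycle(read_inputs, write_outputs, rungs, scans):
    """rungs: functions mapping an input image to a (coil_name, value) pair."""
    for _ in range(scans):
        image = read_inputs()              # freeze inputs for this scan
        outputs = {}
        for rung in rungs:                 # evaluate the sequence program
            coil, value = rung(image)
            outputs[coil] = value
        write_outputs(outputs)             # update outputs once per scan

# Hypothetical usage: one rung, MOTOR = START and not STOP
inputs = {"START": True, "STOP": False}
state = {}
scan_cycle(lambda: dict(inputs), state.update,
           [lambda img: ("MOTOR", img["START"] and not img["STOP"])], scans=3)
print(state)   # {'MOTOR': True}
```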

Data Extraction of Manufacturing Process for Data Mining (데이터 마이닝을 위한 생산공정 데이터 추출)

  • Park H.K.;Lee G.A.;Choi S.;Lee H.W.;Bae S.M.
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2005.06a
    • /
    • pp.118-122
    • /
    • 2005
  • Data mining is the process of autonomously extracting useful information or knowledge from large data stores or sets. To analyze manufacturing-process data obtained from a database with data mining, the source data must be collected from the production process and transformed into an appropriate form. Extracting those data normally requires a separate program for each database. This paper presents a program that makes it easy to extract data from databases in industry. Its advantage is that users can extract data from any type of database and database table and interface with Teamcenter Manufacturing. A minimal illustrative sketch of such a database extraction follows this entry.

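The extraction code itself is not shown in the abstract; the sketch below, using Python's built-in sqlite3 module and a hypothetical table name, illustrates the general idea of pulling process data out of a database into a flat file suitable for later mining.

```python
# Minimal sketch: dump all rows of a (hypothetical) process table in a SQLite
# database to CSV as a flat input for later data-mining steps.
import csv
import sqlite3

def extract_table(db_path, table, out_csv):
    con = sqlite3.connect(db_path)
    try:
        cur = con.execute(f"SELECT * FROM {table}")   # table name assumed trusted
        columns = [d[0] for d in cur.description]
        with open(out_csv, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(columns)                  # header row
            writer.writerows(cur)                     # all data rows
    finally:
        con.close()

# Usage (hypothetical names):
# extract_table("plant.db", "press_process_log", "press_process_log.csv")
```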

Qualification Test of ROCSAT-2 Image Processing System

  • Liu, Cynthia;Lin, Po-Ting;Chen, Hong-Yu;Lee, Yong-Yao;Kao, Ricky;Wu, An-Ming
    • Proceedings of the KSRS Conference
    • /
    • 2003.11a
    • /
    • pp.1197-1199
    • /
    • 2003
  • The ROCSAT-2 mission is to image Taiwan and the surrounding area daily for disaster monitoring, land use, and ocean surveillance during the 5-year mission lifetime. The satellite will be launched in December 2003 into its mission orbit, which is selected as a 14 rev/day repetitive Sun-synchronous orbit descending over (120 deg E, 24 deg N), crossing the equator at 9:45 a.m., with minimum eccentricity. The National Space Program Office (NSPO) is developing the ROCSAT-2 Image Processing System (IPS), which aims to provide real-time, high-quality image data for the ROCSAT-2 mission. A simulated ROCSAT-2 image, based on Level 1B QuickBird data, is generated for IPS verification. The test image comprises one panchromatic band and four multispectral bands. The qualification process consists of four procedures: (a) QuickBird image processing, (b) generation of a simulated ROCSAT-2 image in Generic Raw Level Data (GERALD) format, (c) ROCSAT-2 image processing, and (d) geometric error analysis. Standard QuickBird photogrammetric camera parameters that model the imaging and optical system are used to calculate the latitude and longitude of each line and sample. The backward (inverse model) approach is applied to relate the geodetic coordinate system (latitude, longitude) to the image coordinate system (line, sample). The bilinear resampling method is used to generate the test image. Ground control points are used to evaluate the data-processing error. The data processing involves various coordinate-system transformations using the attitude quaternion and orbit elements. Through the qualification test process, it is verified that the IPS can handle high-resolution image data with Level 2 processing accuracy within 500 m. A minimal illustrative sketch of bilinear resampling follows this entry.

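Bilinear resampling, named in the abstract, interpolates an output pixel from the four surrounding input pixels; a minimal Python sketch, independent of the actual IPS implementation, is shown below.

```python
# Minimal sketch of bilinear resampling: sample an image at a fractional
# (line, sample) position from the four surrounding pixels.
import numpy as np

def bilinear(image, line, sample):
    i0, j0 = int(np.floor(line)), int(np.floor(sample))
    i1 = min(i0 + 1, image.shape[0] - 1)
    j1 = min(j0 + 1, image.shape[1] - 1)
    di, dj = line - i0, sample - j0
    top = (1 - dj) * image[i0, j0] + dj * image[i0, j1]
    bottom = (1 - dj) * image[i1, j0] + dj * image[i1, j1]
    return (1 - di) * top + di * bottom

# Usage: sampling a 2 x 2 image halfway between its four pixels
img = np.array([[0.0, 1.0], [2.0, 3.0]])
print(bilinear(img, 0.5, 0.5))   # 1.5
```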

Development of a Computation Program for Automatic Processing of Calibration Data of Radiation Instrument (방사선 측정기 교정 데이터의 자동처리를 위한 전산프로그램 개발)

  • Jang, Ji-Woon;Shin, Hee-Sung;Youn, Cheung;Lee, Yun-Hee;Kim, Ho-Dong;Jung, Ki-Jung
    • Journal of the Korean Society for Nondestructive Testing
    • /
    • v.26 no.4
    • /
    • pp.246-254
    • /
    • 2006
  • A computation program has been developed for automatic data processing in the calibration of gamma survey meters. The automatic processing program is based on Visual Basic and is coded to follow the steps of the calibration procedure. It uses OLE (object linking and embedding) Excel automation, a programming technique for controlling Excel, for the automatic data processing. A performance test against reference data was carried out with the developed program; the calibration factors and uncertainties computed by the program were equal to those obtained from the reference data. In addition, the efficiency and precision of the work are significantly increased by using the developed program. A minimal illustrative sketch of the calibration-factor arithmetic follows this entry.
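The authors' program is written in Visual Basic with OLE Excel automation; the Python sketch below only illustrates the underlying arithmetic, assuming the calibration factor is taken as the reference dose rate divided by the mean instrument reading with a simple relative standard uncertainty; the numerical values are invented.

```python
# Illustrative arithmetic only (not the authors' Visual Basic/Excel code):
# calibration factor = reference dose rate / mean instrument reading, with the
# relative standard uncertainty of the mean reading.
import statistics

def calibration_factor(reference, readings):
    mean_reading = statistics.fmean(readings)
    factor = reference / mean_reading
    u_rel = statistics.stdev(readings) / (len(readings) ** 0.5) / mean_reading
    return factor, u_rel

# Hypothetical data: reference 10.0 uSv/h and five repeated readings
factor, u_rel = calibration_factor(10.0, [9.6, 9.8, 9.7, 9.9, 9.8])
print(f"calibration factor = {factor:.3f}, relative uncertainty = {u_rel:.3%}")
```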

On Processing Raw Data from Micrometeorological Field Experiments (미기상학 야외실험에서 얻어지는 자료 처리에 관하여)

  • Hong, Jin-kyu;Kim, Joon
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.4 no.2
    • /
    • pp.119-126
    • /
    • 2002
  • Recently, the flux community in Korea established a new regional flux network, KoFlux, which will provide an infrastructure for collecting, synthesizing, and analyzing long-term measurements of energy and mass exchange between the atmosphere and various vegetated surfaces. KoFlux requires the collection of long time series of raw data, and a large amount of data is expected to accumulate from continuous flux observations at each KoFlux site. Therefore, a systematic and efficient tool is needed to manage these raw data. As part of this effort, a computer program for processing raw data measured in micrometeorological field experiments was developed for the flux community in Korea. In this paper, we introduce this program for processing raw data to estimate fluxes and other turbulence statistics and explain the micrometeorological processes coded in this data-processing program. We also show examples of how to run the program and handle the outputs for specific research purposes. A minimal illustrative sketch of a flux estimate follows this entry.
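As an illustration of the turbulence statistics such a program computes, the Python sketch below estimates a kinematic eddy-covariance flux as the mean product of the fluctuations of vertical wind speed and a scalar; the averaging details and corrections applied in the actual KoFlux processing program are not reproduced.

```python
# Minimal sketch of an eddy-covariance flux estimate: the kinematic flux is the
# covariance of vertical-wind fluctuations (w') and scalar fluctuations (c').
import numpy as np

def kinematic_flux(w, c):
    w = np.asarray(w, dtype=float)
    c = np.asarray(c, dtype=float)
    w_prime = w - w.mean()          # fluctuations about the averaging-period mean
    c_prime = c - c.mean()
    return np.mean(w_prime * c_prime)

# Hypothetical samples of vertical wind (m/s) and air temperature (K)
w = [0.1, -0.2, 0.3, 0.0, -0.1, 0.2]
t = [300.1, 299.9, 300.3, 300.0, 299.8, 300.2]
print(kinematic_flux(w, t))   # kinematic heat flux in K m/s
```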

The Improvement of Point Cloud Data Processing Program For Efficient Earthwork BIM Design (토공 BIM 설계 효율화를 위한 포인트 클라우드 데이터 처리 프로그램 개선에 관한 연구)

  • Kim, Heeyeon;Kim, Jeonghwan;Seo, Jongwon;Shim, Ho
    • Korean Journal of Construction Engineering and Management
    • /
    • v.21 no.5
    • /
    • pp.55-63
    • /
    • 2020
  • Earthwork automation has emerged as a promising technology in the construction industry, and the application of earthwork automation starts from the acquisition and processing of point cloud data of the site. Point cloud data sets contain millions of points because of the vast extent of a construction site, so processing time is critical: generating a Digital Terrain Model (DTM) from the original point cloud can take tens or hundreds of hours, and improvements in processing time have a large impact on modeling efficiency. Currently, a benchmark program (BP) is actively used in Korea as an integrated program for both point cloud data processing and BIM design; however, some aspects of it need to be modified and refined. This study modified the BP and developed an updated program by adopting a compile-based development environment, a newly designed UI/UX, and OpenGL, while maintaining the existing PCD processing functions and expanding compatibility with PCD file formats. We conducted a comparative loading-speed test with different numbers of points, and the results showed a 92 to 99% performance increase for the developed program. This program can serve as a foundation for the future development of a program that reduces the gap between design and construction by integrating PCD and earthwork BIM functions. A minimal illustrative sketch of loading point cloud data follows this entry.
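The developed program is not described at code level in the abstract; the Python sketch below only illustrates loading a plain whitespace-separated XYZ point cloud file into a NumPy array, the kind of bulk read whose speed the study benchmarks; the file name and column layout are assumptions.

```python
# Minimal sketch: load a whitespace-separated XYZ point cloud file into a NumPy
# array; bulk reads like this are the step whose speed the study benchmarks.
import numpy as np

def load_xyz(path):
    return np.loadtxt(path, usecols=(0, 1, 2))   # assumes "x y z ..." per line

# Usage (hypothetical file name):
# pts = load_xyz("site_scan.xyz")
# print(pts.shape[0], "points; bounding box:", pts.min(axis=0), pts.max(axis=0))
```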

Study on a post-processing program for flow analysis based on the object-oriented programming concept (객체지향 개념을 반영한 유동해석 후처리 프로그램에 대한 연구)

  • Na J. S.;Kim K. Y.;Kim B. S.
    • Journal of computational fluids engineering
    • /
    • v.9 no.2
    • /
    • pp.1-10
    • /
    • 2004
  • In the present study, a post-processing program is developed for 3D data visualization and analysis. Because the graphical user interface (GUI) of the program is based on the Qt library while all graphic rendering is performed with the OpenGL library, the program runs not only on MS Windows but also on UNIX and Linux systems without modifying the source code. The structure of the program is designed according to the object-oriented programming (OOP) concept so that it offers extensibility, reusability, and ease of development compared with procedural programming. The program is organized into modules by class, and these classes function through inheritance and cooperation, an important and valuable concept of object-oriented programming. The major functions realized so far, which include mesh plots, contour plots, vector plots, streamline plots, and boundary plots, are demonstrated and the relevant algorithms are described. A minimal illustrative sketch of such a class hierarchy follows this entry.
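The abstract describes a class design in which plot types share a common interface through inheritance; the toy Python hierarchy below illustrates that idea only; the actual program uses Qt and OpenGL, and the class and method names here are hypothetical.

```python
# Toy sketch of the inheritance idea described above: each plot type derives
# from a common Plot base class and overrides a single render() hook.
from abc import ABC, abstractmethod

class Plot(ABC):
    def __init__(self, dataset):
        self.dataset = dataset        # mesh / solution data to visualize

    @abstractmethod
    def render(self):
        """Issue the drawing calls for this plot type."""

class MeshPlot(Plot):
    def render(self):
        print(f"drawing mesh edges of {self.dataset}")

class ContourPlot(Plot):
    def render(self):
        print(f"drawing contour levels of {self.dataset}")

# The viewer only needs the base-class interface:
for plot in (MeshPlot("block-1"), ContourPlot("block-1")):
    plot.render()
```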