• Title/Summary/Keyword: system monitoring tools


A Benchmark of Open Source Data Mining Package for Thermal Environment Modeling in Smart Farm (R, OpenCV, OpenNN and Orange)

  • Lee, Jun-Yeob;Oh, Jong-wo;Lee, DongHoon
    • Proceedings of the Korean Society for Agricultural Machinery Conference
    • /
    • 2017.04a
    • /
    • pp.168-168
    • /
    • 2017
  • Despite the growth of environmental sensors, imaging, and livestock management systems in ICT-converged smart farms, techniques for making effective use of the data these devices produce remain immature. For pig houses in particular, data analysis and modeling technology is needed to monitor and predict animal welfare and growth in real time. This requires early detection of physiological and behavioral changes and real-time monitoring, analysis, and prediction of welfare status; one representative information-and-communication engineering approach to this problem is data mining. Among the many software packages available for data mining research, we compared four open-source tools. For data analysis aimed at thermal environment modeling in a smart pig house, we focused on the time needed to derive an analysis algorithm, visualization features, and interoperability with other libraries. The four tools selected were 1) R (https://cran.r-project.org), 2) OpenCV (http://opencv.org), 3) OpenNN (http://www.opennn.net), and 4) Orange (http://orange.biolab.si). The comparison was run on Linux Ubuntu 16.04.4 LTS (x64) with a 3.6 GHz CPU and 64 GB of memory. In terms of development languages, the tools support 1) R script, 2) C/C++, Python, and Java, 3) C++, and 4) C/C++, Python, and Cython, so the C/C++ and Python languages were relatively advantageous. Where a library is provided at the source-code level, cross-platform development is possible, and results developed on one operating system can be used on others without a separate porting step. Among the built-in libraries, R provides the largest number of data mining algorithms, apparently because its open ecosystem shares newly added libraries online through the cloud. OpenCV's strength is image processing, and OpenNN's is that its neural network training libraries are open at the source-code level. Orange, unlike the other packages that focus on providing library collections, integrates visualization and network-composition features into a unified user interface. Future work will study supplementary data processing techniques to cope with the time complexity of thermal environment modeling, aiming at a real-time implementation of smart farm thermal environment modeling.

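The abstract's first evaluation criterion, the time needed to derive an analysis algorithm, can be illustrated with a minimal wall-clock measurement. The sketch below is not from the paper: the thermal data are synthetic and the least-squares line is a stand-in for the actual model; only the Python standard library is used.

```python
# Illustrative measurement of "algorithm derivation time": fit a simple
# model to synthetic pen-temperature data and time the fit.
import random
import time

def fit_line(xs, ys):
    """Closed-form least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

random.seed(0)
hours = [h / 10 for h in range(240)]                           # 24 h at 6-min steps
temps = [18 + 0.5 * h + random.gauss(0, 0.3) for h in hours]   # synthetic warming trend

t0 = time.perf_counter()
slope, intercept = fit_line(hours, temps)
elapsed = time.perf_counter() - t0
print(f"slope={slope:.3f} degC/h, fit time={elapsed*1e3:.2f} ms")
```

The same stopwatch pattern applies unchanged when the fit call is replaced by an R, OpenCV, OpenNN, or Orange routine, which is what makes the cross-package comparison possible.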

Risk Assessment Tools for Invasive Alien Species in Japan and Europe

  • Kil, Jihyon;Mun, Saeromi;Kim, Chang-Gi
    • Ecology and Resilient Infrastructure
    • /
    • v.2 no.3
    • /
    • pp.191-197
    • /
    • 2015
  • Invasive alien species are considered one of the main drivers of biodiversity loss. Establishing management strategies through continuous monitoring and risk assessment is a key element of invasive alien species management policy. In the present study, we introduce examples of ecological risk assessment tools developed in Japan, Germany and Austria, and Belgium. In Japan, invasive alien species have been designated based on assessments of their risks to ecosystems, human health, and primary industry. The German-Austrian Black List Information System categorizes alien species into a Black List, White List, and Grey List according to their risks to biodiversity. In the Harmonia Information System developed in Belgium, invasiveness, adverse impacts on native species and ecosystem functions, and invasion stage are assessed, and alien species are categorized into a Black List, Watch List, and Alert List. These international risk assessment tools may help improve our national risk assessment protocol for prioritizing invasive alien species management.
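The list-based categorization described for the Harmonia system can be sketched as a small scoring function. Everything below is an illustrative assumption: the 0-10 scores, the threshold of 14, and the mapping to lists are invented for demonstration and are not the published Belgian protocol.

```python
# Hedged sketch of list-based risk prioritization (illustrative thresholds).
def categorize(invasiveness, impact, spread_stage):
    """Map 0-10 risk scores plus invasion stage to a management list."""
    score = invasiveness + impact
    if score >= 14:
        return "Black List"      # high risk regardless of stage
    if spread_stage == "absent":
        return "Alert List"      # not yet present: early-warning list
    return "Watch List"          # present but risk still moderate

print(categorize(9, 8, "established"))  # Black List
print(categorize(5, 4, "absent"))       # Alert List
```

The design point is that the stage of invasion, not just the raw risk score, decides whether a species belongs on an early-warning list or a surveillance list.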

Implementation of a reliable dynamic honeypot file creation system for ransomware attack detection

  • Kyoung Wan Kug;Yeon Seung Ryu;Sam Beom Shin
    • Convergence Security Journal
    • /
    • v.23 no.2
    • /
    • pp.27-36
    • /
    • 2023
  • In recent years, ransomware attacks have become more organized and specialized, targeting specific individuals or organizations with tactics such as social engineering, spear phishing, and even machine learning, with some operating as business models. To respond effectively, various studies and solutions are being developed and operated to detect and prevent attacks before they cause serious damage. In particular, honeypots can minimize the risk of attack on IT systems and networks and act as an early-warning and advanced security-monitoring tool; however, when ransomware does not touch the decoy file first, or bypasses it completely, their effectiveness against ransomware is limited. In this paper, the honeypot is optimized for the user environment to create reliable, real-time dynamic honeypot files, minimizing the chance that an attacker bypasses the honeypot and increasing the detection rate by preventing the attacker from recognizing the decoy as a honeypot file. To this end, four models for dynamic honeypot generation were designed (a basic data collection model, a user-defined model, a sample statistical model, and an experience accumulation model) and their validity was verified.
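The decoy-file idea can be sketched as follows. This is a minimal polling design assumed for illustration, not the paper's four-model system; the decoy file name and payload are hypothetical.

```python
# Minimal decoy-file tamper check: plant a decoy, record its hash, and
# flag any deletion or rewrite (e.g. encryption by ransomware).
import hashlib
import os

def plant_decoy(path, payload=b"confidential"):
    """Write a decoy file and return the baseline hash of its contents."""
    with open(path, "wb") as f:
        f.write(payload)
    return hashlib.sha256(payload).hexdigest()

def is_tampered(path, baseline_hash):
    """True if the decoy was deleted, renamed, or rewritten."""
    if not os.path.exists(path):
        return True
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest() != baseline_hash

if __name__ == "__main__":
    decoy = "passwords_backup.txt"   # hypothetical decoy name
    baseline = plant_decoy(decoy)
    print("tampered:", is_tampered(decoy, baseline))   # False right after planting
    with open(decoy, "wb") as f:
        f.write(b"ENCRYPTED")        # simulate a ransomware overwrite
    print("tampered:", is_tampered(decoy, baseline))   # True
    os.remove(decoy)
```

A production system would additionally randomize decoy names and contents to the user's environment, which is the "dynamic" aspect the paper's four models address.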

Comparison of the wall clock time for extracting remote sensing data in Hierarchical Data Format using the Geospatial Data Abstraction Library by operating system and compiler

  • Yoo, Byoung Hyun;Kim, Kwang Soo;Lee, Jihye
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.21 no.1
    • /
    • pp.65-73
    • /
    • 2019
  • The MODIS (Moderate Resolution Imaging Spectroradiometer) data in Hierarchical Data Format (HDF) have been processed using the Geospatial Data Abstraction Library (GDAL). Because of the relatively large data size, it would be preferable to build and install the data analysis tool for the greatest computing performance, which differs by operating system and form of distribution, e.g., source code or binary package. The objective of this study was to examine the performance of GDAL for processing HDF files, which would guide construction of a computer system for remote sensing data analysis. Execution times were compared between environments under which GDAL was installed. The wall clock time was measured after extracting data for each variable in the MODIS data file using a tool built by linking against GDAL under combinations of operating system (Ubuntu and openSUSE), compiler (GNU and Intel), and distribution form. The MOD07 product, which contains atmosphere data, was processed for eight 2-D variables and two 3-D variables. The GDAL build compiled with the Intel compiler under Ubuntu had the shortest computation time. For openSUSE, the builds compiled with the GNU and Intel compilers performed best for the 2-D and 3-D variables, respectively. The wall clock time was considerably longer for GDAL compiled with the "--with-hdf4=no" configuration option or installed from the RPM package manager under openSUSE. These results indicate that the environment under which GDAL is installed, e.g., the operating system or compiler, can have a considerable impact on the performance of a remote sensing data processing system. Parallel computing approaches would further improve data processing performance for HDF files, which merits further evaluation.
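The per-variable wall-clock measurement described above can be sketched with a small timing harness. The GDAL call in the comment is indicative only and the subdataset string is a hypothetical example; the runnable part uses a stand-in reader so the harness stays self-contained.

```python
# Illustrative per-variable wall-clock benchmark. With GDAL installed the
# reader would be something like:
#   from osgeo import gdal
#   gdal.Open('HDF4_EOS:EOS_SWATH:"MOD07.hdf":mod07:Skin_Temperature')  # hypothetical
import time

def time_extraction(reader, variables):
    """Return {variable: seconds} for extracting each variable once."""
    timings = {}
    for name in variables:
        t0 = time.perf_counter()
        reader(name)                         # extract one variable
        timings[name] = time.perf_counter() - t0
    return timings

if __name__ == "__main__":
    # Stand-in reader over fake in-memory data; real use opens the HDF subdataset.
    fake = {"Skin_Temperature": [300.1] * 10000, "Surface_Pressure": [1013.0] * 10000}
    timings = time_extraction(lambda v: sum(fake[v]), fake)
    for name, sec in sorted(timings.items(), key=lambda kv: kv[1]):
        print(f"{name}: {sec*1e3:.3f} ms")
```

Running the same harness against GDAL builds from different compilers and package sources is what produces the comparison table the study reports.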

Scheme on Environmental Risk Assessment and Management for Carbon Dioxide Sequestration in Sub-seabed Geological Structures in Korea

  • Choi, Tae-Seob;Lee, Jung-Suk;Lee, Kyu-Tae;Park, Young-Gyu;Hwang, Jin-Hwan;Kang, Seong-Gil
    • Journal of the Korean Society for Marine Environment & Energy
    • /
    • v.12 no.4
    • /
    • pp.307-319
    • /
    • 2009
  • Carbon dioxide capture and storage (CCS) technology has been regarded as one of the most feasible and practical options for reducing emissions of carbon dioxide (CO2) and thereby mitigating climate change. The Korean government has also run a 10-year R&D project on CO2 storage in sub-seabed geological structures, including gas fields and deep saline aquifers, since 2005. The relevant research covers initial surveys of suitable geological storage sites, monitoring of the behavior of stored CO2, basic design of the CO2 transport and storage process, and risk assessment and management of CO2 leakage from engineered and geological processes. Leakage of CO2 into the marine environment can change the chemistry of seawater, including its pH and carbonate composition, and can adversely affect the diverse living organisms in marine ecosystems. Recently, the IMO (International Maritime Organization) developed a risk assessment and management framework for CO2 sequestration in sub-seabed geological structures (CS-SSGS) and considers such sequestration a waste management option for mitigating greenhouse gas emissions. The framework aims to provide generic guidance to the Contracting Parties to the London Convention and Protocol for characterizing the risks to the marine environment from CS-SSGS on a site-specific basis and for collecting the information needed to develop a management strategy that addresses uncertainties and any residual risks. An environmental risk assessment (ERA) plan for CO2 storage should include site selection and characterization, exposure assessment with probable leakage scenarios, assessment of direct and indirect impacts on living organisms, and a risk management strategy. Domestic trials of CO2 capture and sequestration in marine geologic formations should likewise be carried out under risk management with ERA approaches based on the IMO framework. The risk assessment procedure for marine CO2 storage should contain the following components: 1) prediction of leakage probabilities with reliable leakage scenarios for both the engineered and the geological parts; 2) understanding of the physico-chemical fate of CO2 in the marine environment, especially at candidate sites; 3) exposure assessment methods for various receptors in marine environments; 4) a database of the toxic effects of CO2 on ecologically and economically important species; and 5) surveillance procedures for environmental changes with adequate monitoring techniques.

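The pH effect of a CO2 leak mentioned above can be put in rough numbers. The calculation below is a deliberate simplification assumed for illustration: it treats only the first dissociation of carbonic acid in pure water and ignores seawater's carbonate buffering, so real seawater would shift less than this estimate.

```python
# Back-of-envelope acidification estimate from dissolved CO2,
# using [H+] ~ sqrt(K1 * [CO2]) from the first dissociation only.
import math

K1 = 4.45e-7          # first dissociation constant of H2CO3 (mol/L, ~25 degC)

def ph_from_co2(co2_molar):
    """Approximate pH of pure water equilibrated with this dissolved CO2 molarity."""
    h = math.sqrt(K1 * co2_molar)
    return -math.log10(h)

for c in (1.2e-5, 1.2e-4, 1.2e-3):   # rising dissolved CO2 concentration
    print(f"[CO2]={c:.1e} M -> pH ~ {ph_from_co2(c):.2f}")
```

Under this approximation each tenfold increase in dissolved CO2 lowers pH by about 0.5 units, which is why even modest leakage scenarios warrant exposure assessment.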

A Study on the Online Newspaper Archive: Focusing on Domestic and International Case Studies

  • Song, Zoo Hyung
    • The Korean Journal of Archival Studies
    • /
    • no.48
    • /
    • pp.93-139
    • /
    • 2016
  • Aside from monitoring and criticizing the government through reviews of and comments on public issues, newspapers can form and spread public opinion. Newspapers also contain photographic records with metadata, and local newspapers in particular are an important means of documenting locality. Newspaper advertisements and editing styles can likewise be read as representations of their times. Because of this archival value, newspapers are considered a top collection priority when a documentation strategy is established. A newspaper archive that handles preservation and management is therefore significant in many ways: journalists use it to write articles, scholars use it for academic purposes, and Newspaper In Education (NIE) programs are a practical application of it. In the digital age, the newspaper archive occupies an important position at the core of media asset management (MAM), which integrates and manages media assets, and an online archive is expected to play a new role in newspaper production and publishing company management. The Korea Integrated News Database System (KINDS), an integrated article database, began service in 1991, while Naver operates an online newspaper archive called "News Library." KINDS initially received an enthusiastic response, but its utilization continues to decrease because major newspapers such as Chosun Ilbo and JoongAng Ilbo are omitted and its user interface poses numerous problems. It still has several advantages: access is free because it is publicly funded, and local papers are easy to reach. The national library has also steadily digitized old newspapers. Individual newspaper companies have started similar services, but these are not yet substantial enough to be called archives. In the United States (US), "Chronicling America," led by the Library of Congress with funding from the National Endowment for the Humanities, is digitizing historic newspapers, and state universities and historical associations fund public libraries to digitize local papers. In the United Kingdom, the British Library is building an online newspaper archive called "The British Newspaper Archive"; unlike the US service, it charges a usage fee. The Joint Information Systems Committee has also invested in "The British Newspaper Archive," and its construction is ongoing. ProQuest Archiver and Gale NewsVault are representative commercial platforms, notable for their efficiency and for standardizing newspaper archiving. A change of perspective and substantial investment are now required to improve online newspaper archives at home and abroad.