• Title/Summary/Keyword: 그레이리스트 (Greylist)

Risk Assessment Tools for Invasive Alien Species in Japan and Europe (일본과 유럽의 침입외래생물 생태계위해성평가 기법)

  • Kil, Jihyon;Mun, Saeromi;Kim, Chang-Gi
    • Ecology and Resilient Infrastructure / v.2 no.3 / pp.191-197 / 2015
  • Invasive alien species are considered one of the main factors causing biodiversity loss. Establishing management strategies through continuous monitoring and risk assessment is a key element of invasive alien species management policy. In the present study, we introduce examples of ecological risk assessment tools developed in Japan, Germany-Austria, and Belgium. In Japan, invasive alien species have been designated based on an assessment of risks to ecosystems, human health, and primary industry. The German-Austrian Black List Information System categorizes alien species into a Black List, White List, and Grey List according to their risk to biodiversity. In the Harmonia Information System developed in Belgium, invasiveness, adverse impacts on native species and ecosystem functions, and invasion stage are assessed, and alien species are categorized into a Black List, Watch List, and Alert List. These international risk assessment tools may help improve our national risk assessment protocol for prioritizing invasive alien species management.
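
The grey-list idea above, sorting species into Black, White, or Grey lists according to assessed risk, can be illustrated with a minimal sketch. The scores, thresholds, and the rule for falling back to the Grey List below are hypothetical and are not taken from the German-Austrian or Belgian systems.

```python
# Minimal sketch of list-based risk categorization (hypothetical scores and thresholds).
def categorize(risk_score: float, evidence_sufficient: bool) -> str:
    """Assign an alien species to a list from a 0-10 risk score."""
    if not evidence_sufficient:
        return "Grey List"   # risk cannot be judged yet -> further assessment needed
    if risk_score >= 7:
        return "Black List"  # high risk to biodiversity
    if risk_score <= 3:
        return "White List"  # low risk
    return "Grey List"       # intermediate or uncertain risk

species = {"Species A": (8.5, True), "Species B": (2.0, True), "Species C": (5.0, False)}
for name, (score, ok) in species.items():
    print(name, "->", categorize(score, ok))
```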

Detecting Malicious Scripts in Web Contents through Remote Code Verification (원격코드검증을 통한 웹컨텐츠의 악성스크립트 탐지)

  • Choi, Jae-Yeong;Kim, Sung-Ki;Lee, Hyuk-Jun;Min, Byoung-Joon
    • The KIPS Transactions: Part C / v.19C no.1 / pp.47-54 / 2012
  • Sharing cross-site resources has been adopted by many recent websites in the form of service mashups and social network services. With this change, exploitation of new vulnerabilities has increased; instead of attacking websites directly, attackers insert malicious code at the interaction points between clients and services. In this paper, we present a system model that identifies malicious script code in web contents by means of remote verification while the web contents downloaded from multiple trusted origins are executed in the client's browser space. Our system classifies verification items according to the origin of the request, based on information about the service code implementation, and stores the verification results in three databases composed of white, gray, and black lists. Through experimental evaluations, we have confirmed that our system provides clients with increased security by effectively detecting malicious scripts in the mashup web environment.
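
As a rough illustration of keeping white, gray, and black verification databases keyed by script origin, here is a minimal sketch. The content-hash fingerprint, the function names, and the idea of skipping re-verification for black-listed entries are assumptions for illustration, not the paper's actual protocol.

```python
import hashlib

WHITE, GRAY, BLACK = "white", "gray", "black"
lists = {WHITE: set(), GRAY: set(), BLACK: set()}  # hypothetical verdict databases

def fingerprint(origin: str, script: str) -> str:
    """Key a script by its origin and a hash of its content."""
    return origin + ":" + hashlib.sha256(script.encode()).hexdigest()

def record_verdict(origin: str, script: str, remote_verdict: str) -> str:
    """Store the result of a remote verification into the white/gray/black lists."""
    lists[remote_verdict].add(fingerprint(origin, script))
    return remote_verdict

# Usage: a script already on the black list could be blocked without re-verification.
key = fingerprint("https://cdn.example.com", "alert('hi')")
blocked = key in lists[BLACK]
```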

DEVS Simulation of Spam Voice Signal Detection in VoIP Service (VoIP 스팸 콜 탐지를 위한 음성신호의 DEVS 모델링 및 시뮬레이션)

  • Kim, Ji-Yeon;Kim, Hyung-Jong;Cho, Young-Duk;Kim, Hwan-Kuk;Won, Yoo-Jae;Kim, Myuhng-Joo
    • Journal of the Korea Society for Simulation / v.16 no.3 / pp.75-87 / 2007
  • As VoIP service quality improves and many of its shortcomings are overcome, users are becoming increasingly interested in the service. It also offers additional features that are convenient for users, such as presence and instant messaging. However, as with any technology there are two sides to the coin, and some security issues make users hesitate to adopt it. This paper deals with one of those issues, the VoIP spam problem. We considered the signal pattern of the voice message in a spam call and constructed voice signal models of a normal call, a normal call with noise, and a spam call. Each voice signal case is fed into our spam decision algorithm, which detects spam calls based on the amount of information in the call signal. We used DEVS-Java for the modeling and simulation. The contribution of this work is to suggest a way to detect spam voice call signals and to test the method using a modeling and simulation methodology.
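
The decision rule described above, flagging calls from the amount of information in the voice signal, can be sketched as follows. Using Shannon entropy over quantized samples, the bin count, and the threshold are all assumptions for illustration; the paper's DEVS models are not reproduced here.

```python
import math
from collections import Counter

def signal_entropy(samples, levels=16):
    """Shannon entropy (bits) of a signal quantized into `levels` amplitude bins."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / levels or 1.0  # guard against a constant signal
    bins = Counter(min(int((s - lo) / width), levels - 1) for s in samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in bins.values())

def looks_like_spam(samples, threshold=2.0):
    """Hypothetical decision rule: flag a call whose signal entropy falls below a threshold."""
    return signal_entropy(samples) < threshold
```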

List-event Data Resampling for Quantitative Improvement of PET Image (PET 영상의 정량적 개선을 위한 리스트-이벤트 데이터 재추출)

  • Woo, Sang-Keun;Ju, Jung Woo;Kim, Ji Min;Kang, Joo Hyun;Lim, Sang Moo;Kim, Kyeong Min
    • Progress in Medical Physics / v.23 no.4 / pp.309-316 / 2012
  • Multimodal imaging techniques have developed rapidly to improve diagnosis and the evaluation of therapeutic effects. Despite integrated hardware, registration accuracy is degraded by discrepancies between the multimodal images and by insufficient counts arising from the different acquisition methods of each modality. The purpose of this study was to improve PET images by event data resampling, based on an analysis of the data format, noise, and statistical properties of small-animal PET list data. Inveon PET listmode data were acquired as a 10-min static acquisition starting 60 min after injection of 37 MBq/0.1 ml 18F-FDG via the tail vein. The listmode data consist of 48-bit packets, each divided into an 8-bit header and a 40-bit payload. Realigned sinograms were generated from event data resampled from the original listmode data by adjusting LOR locations, by simple event magnification, and by nonparametric bootstrap. Sinograms were reconstructed using the OSEM 2D algorithm with 16 subsets and 4 iterations. The prompt coincidence count was 13,940,707 as reported in the PET data header and 13,936,687 as measured from the list-event data analysis. With simple event magnification of the PET data, the maximum increased from 1.336 to 1.743, but noise also increased. The resampling efficiency of the PET data was assessed from the de-noised and improved image obtained by a shift operation on the payload values of sequential packets. The bootstrap resampling technique provides PET images with improved noise and statistical properties. The list-event data resampling method should aid in improving registration accuracy and early-diagnosis efficiency.
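
A minimal sketch of the nonparametric bootstrap over list-event data: events are redrawn with replacement before being rebinned into a sinogram. The 48-bit packet split into an 8-bit header and 40-bit payload follows the abstract, but the field decoding and any sinogram binning are simplified placeholders.

```python
import numpy as np

PACKET_BYTES = 6  # 48-bit packet: 8-bit header + 40-bit payload (per the abstract)

def split_packets(raw: bytes):
    """Split raw listmode bytes into (header, payload) pairs; payload decoding not shown."""
    for i in range(0, len(raw) - PACKET_BYTES + 1, PACKET_BYTES):
        pkt = raw[i:i + PACKET_BYTES]
        yield pkt[0], pkt[1:]  # 1-byte header, 5-byte payload

def bootstrap_events(events, rng=None):
    """Nonparametric bootstrap: redraw the same number of events with replacement."""
    if rng is None:
        rng = np.random.default_rng()
    idx = rng.integers(0, len(events), size=len(events))
    return events[idx]

# Usage: resample decoded event records, then rebin them into a sinogram (not shown).
events = np.arange(1000)           # stand-in for decoded list events
resampled = bootstrap_events(events)
```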

The Model of Appraisal Method on Authentic Records (전자기록의 진본 평가 시스템 모형 연구)

  • Kim, Ik-Han
    • The Korean Journal of Archival Studies / no.14 / pp.91-117 / 2006
  • Electronic records need to be appraised for their authenticity as well as for their value. There has been much discussion of how records should be appraised for value, but little of how electronic records should be appraised for authenticity. This article therefore models specific authenticity appraisal methods and shows the stages at which each method should or may be applied. At the Ingest stage, integrity verification immediately after record creation in the producing organization, quality and integrity verification of the transferred records in the receiving organization, and an integrity check between the SIP and the AIP in the organization that receives and preserves the records are essential. At the Preservation stage, integrity checks between identical AIPs stored separately on different media, validation of whether records have been damaged, and recovery of damaged records are needed. At the various Processing stages, suitability evaluation after changes to a record's management control metadata or classification, integrity checks after records migration, and periodic validation and integrity verification of DIPs are required. For these activities, appraisal methods including integrity verification, content consistency checks, suitability evaluation of a record's metadata, checks for unauthorized updates, and physical status validation should be applied to the electronic records management process.
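
A minimal sketch of one of the checks listed above: integrity verification between a SIP and its AIP by comparing fixity values. Hash-based fixity with SHA-256 is a common approach but is an assumption here; the article does not prescribe a specific algorithm.

```python
import hashlib
from pathlib import Path

def fixity(path: Path) -> str:
    """SHA-256 digest of a record file, computed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_sip_against_aip(sip_file: Path, aip_file: Path) -> bool:
    """True if the preserved copy (AIP) still matches the submitted copy (SIP)."""
    return fixity(sip_file) == fixity(aip_file)
```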

Color Image Coding using Variable Block of Fractal (프랙탈 기반의 가변블록을 이용한 컬러영상 부호화)

  • Park, Jae-Hong;Park, Cheol-Woo
    • Journal of the Korean Society of Radiology / v.8 no.7 / pp.435-441 / 2014
  • This paper suggests techniques to reduce the coding time, which is a problem in traditional fractal compression, and to improve the fidelity of reconstructed images by determining fractal coefficients through adaptive selection of the block approximation formula. First, to reduce coding time, we construct a linear list of domain blocks characterized by their luminance and variance, and we limit the block search according to a first permissible threshold value. Next, when employing a three-level block partition, if a range block at the minimum partition level cannot find a domain block with a satisfactory approximation error, the block approximation formula is selected adaptively. The techniques were applied to 24-bpp color image compression. With the proposed encoding method, almost no loss of image quality occurred, and the compression rate and reconstructed image quality for color RGB images were comparable to those for gray-level images.
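
A rough sketch of the domain-block search described above: domain blocks are characterized by luminance and variance, kept in a linear list, and the search stops at the first block within a permissible threshold. The distance measure and threshold value are illustrative assumptions, not the paper's exact criteria.

```python
import numpy as np

def block_features(block: np.ndarray):
    """Characterize a block by its mean luminance and variance."""
    return float(block.mean()), float(block.var())

def find_domain_block(range_block, domain_blocks, threshold=25.0):
    """Return the index of the first domain block whose (luminance, variance)
    distance to the range block falls under the permissible threshold; None otherwise."""
    r_lum, r_var = block_features(range_block)
    for idx, dom in enumerate(domain_blocks):
        d_lum, d_var = block_features(dom)
        if abs(d_lum - r_lum) + abs(d_var - r_var) < threshold:
            return idx
    return None  # caller may then split the range block or adapt the approximation formula
```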

Apache NiFi-based ETL Process for Building Data Lakes (데이터 레이크 구축을 위한 Apache NiFi기반 ETL 프로세스)

  • Lee, Kyoung Min;Lee, Kyung-Hee;Cho, Wan-Sup
    • The Journal of Bigdata / v.6 no.1 / pp.145-151 / 2021
  • In recent years, digital data have been generated in all areas of human activity, and there are many attempts to store and process these data safely in order to develop useful services. A data lake is a data repository that is independent of both the source of the data and the analytical framework that leverages the data. In this paper, we design and implement a tool that safely stores the various kinds of big data generated by smart cities in a data lake and applies ETL so that the data can be used by services, together with the web-based tools needed to use it effectively. The series of processes (ETL) that quality-checks and refines source data, stores it safely in a data lake, and manages it according to data life cycle policies often requires costly infrastructure, development, and maintenance, and is labor-intensive. The implemented tool makes it possible to set up and execute ETL job monitoring and data life cycle management visually and efficiently, without specialized IT knowledge. Separately, a data quality checklist guide is needed to store and use reliable data in the data lake. In addition, data migration and deletion cycles need to be set and scheduled with the data life cycle management tool to reduce data management costs.
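
As a loose illustration of the data life cycle policy mentioned above, the sketch below ages files in a lake directory and marks them for migration or deletion. The directory layout, cycle lengths, and actions are hypothetical and do not correspond to Apache NiFi's actual processors or to the authors' tool.

```python
import time
from pathlib import Path

MIGRATE_AFTER_DAYS = 180   # hypothetical: move to cold storage after 6 months
DELETE_AFTER_DAYS = 730    # hypothetical: delete after 2 years

def lifecycle_actions(lake_dir: str):
    """Yield (path, action) pairs according to a simple age-based life cycle policy."""
    now = time.time()
    for path in Path(lake_dir).rglob("*"):
        if not path.is_file():
            continue
        age_days = (now - path.stat().st_mtime) / 86400
        if age_days >= DELETE_AFTER_DAYS:
            yield path, "delete"
        elif age_days >= MIGRATE_AFTER_DAYS:
            yield path, "migrate"
```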