• Title/Summary/Keyword: Standard Model


An effective edge detection method for noise images based on linear model and standard deviation (선형모형과 표준편차에 기반한 잡음영상에 효과적인 에지 검출 방법)

  • Park, Youngho
    • The Korean Journal of Applied Statistics
    • /
    • v.33 no.6
    • /
    • pp.813-821
    • /
    • 2020
  • Recently, research using unstructured data such as images and videos has been actively conducted in various fields. Edge detection is one of the most useful image enhancement techniques for improving the quality of image processing. However, it is very difficult to perform edge detection in noisy images because both edges and noise have high-frequency components. This paper uses a linear model and the standard deviation as an effective edge detection method for noisy images. An edge is detected from the difference between the standard deviation of the pixels in a pixel block and the standard deviation of the residuals obtained by fitting the linear model. The results of edge detection are compared with those of the Sobel edge detector. On the original image, the Sobel result and the proposed result are similar, and the proposed method was confirmed to detect edges with reduced noise at various noise levels.
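A minimal sketch of the block-wise idea above, assuming a plane fit (intensity ~ a + b·row + c·col) and a 5×5 block; both choices are assumptions, not the paper's exact settings. In a block containing an edge, the plane absorbs most of the intensity ramp, so the residual standard deviation falls well below the raw pixel standard deviation; in a flat or purely noisy block the two stay close.

```python
import numpy as np

def edge_strength(image, block=5):
    """Edge map from the gap between a block's pixel standard deviation and
    the standard deviation of residuals after fitting a plane to the block."""
    h, w = image.shape
    half = block // 2
    rows, cols = np.mgrid[0:block, 0:block]
    # Design matrix for the linear model: intercept, row, col
    X = np.column_stack([np.ones(block * block), rows.ravel(), cols.ravel()])
    out = np.zeros((h, w))
    for i in range(half, h - half):
        for j in range(half, w - half):
            y = image[i - half:i + half + 1, j - half:j + half + 1].astype(float).ravel()
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            # Large gap: an edge (the plane explains the intensity ramp);
            # small gap: flat region or pure noise.
            out[i, j] = y.std() - resid.std()
    return out
```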

A Study on Feature Analysis of Archival Metadata Standards in the Records Lifecycle

  • Baek, Jae-Eun
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.48 no.3
    • /
    • pp.71-111
    • /
    • 2014
  • Metadata schemas are well recognized as one of the important technological components for the archiving and preservation of digital resources. However, a single standard is not enough to cover the whole lifecycle of archiving and preserving digital resources. This means that we need to appropriately select metadata standards and combine them to develop metadata schemas that cover the whole lifecycle of resources (or records). Creating a unified framework for understanding the features of metadata standards is necessary in order to improve metadata interoperability across the whole resource lifecycle. In this study, the author approached this issue from a task-centric view of metadata, proposing a Task model as a framework and analyzing the features of archival metadata standards. The proposed model provides a new scheme for creating metadata element mappings and making metadata interoperable. From this study, the author found that no single metadata standard can cover the whole lifecycle, and that an in-depth analysis of mappings between metadata standards in accordance with the lifecycle stages is required. The author also found that most metadata standards are primarily resource-centric and that the different tasks in the resource lifecycle are not reflected in the design of metadata standard data models.
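As a toy illustration of the element-mapping idea (not drawn from the study itself), a task-to-element crosswalk can be held directly as data. Only the dc:* names and PREMIS eventType below are real element names; the task labels and the gaps are placeholders meant only to show the shape of such a mapping and the "no single standard covers everything" finding.

```python
# Hypothetical crosswalk fragment: lifecycle tasks -> candidate elements per standard.
task_crosswalk = {
    "create":   {"Dublin Core": "dc:creator", "PREMIS": None},
    "describe": {"Dublin Core": "dc:title",   "PREMIS": None},
    "preserve": {"Dublin Core": None,         "PREMIS": "eventType"},
}

def coverage(standard):
    """Fraction of lifecycle tasks for which a standard offers some element."""
    hits = sum(1 for elements in task_crosswalk.values() if elements.get(standard))
    return hits / len(task_crosswalk)

print(coverage("Dublin Core"), coverage("PREMIS"))   # neither reaches 1.0
```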

Identity-based Threshold Broadcast Encryption in the Standard Model

  • Zhang, Leyou;Hu, Yupu;Wu, Qing
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.4 no.3
    • /
    • pp.400-410
    • /
    • 2010
  • In an identity-based threshold broadcast encryption (IDTHBE) scheme, a broadcaster chooses a set of n recipients and a threshold value t, and the plaintext can be recovered only if at least t receivers cooperate. An IDTHBE scheme differs from standard threshold public key encryption schemes, where the set of receivers and the threshold value are decided from the beginning. This kind of scheme has wide applications in ad hoc networks. Previously proposed IDTHBE schemes have ciphertexts which contain at least n elements, and their security relies on random oracles. In this paper, we introduce two new constructions of IDTHBE for ad hoc networks. Our first scheme achieves S-size private keys while the modified scheme achieves constant-size private keys. Both schemes achieve approximately (n-t)-size ciphertexts. Furthermore, we also show that they are provably secure under the decision Bilinear Diffie-Hellman Exponent (BDHE) assumption in the standard model.
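The paper's pairing-based construction is not reproduced here, but the threshold property it provides (any t of the n chosen receivers can recover the plaintext, fewer cannot) is the same property given by Shamir secret sharing, so a toy sketch over a small prime field is shown purely to illustrate that property; it is not the IDTHBE scheme.

```python
import random

P = 2**61 - 1  # a Mersenne prime; a toy field, far from production parameters

def share(secret, n, t):
    """Split `secret` into n shares so that any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(secret=123456789, n=5, t=3)
assert reconstruct(shares[:3]) == 123456789   # any 3 receivers cooperate -> recover
assert reconstruct(shares[:2]) != 123456789   # 2 alone fail (except with negligible probability)
```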

Modular approach to Petri net modeling of flexible assembly system

  • Park, T.K.;Choi, B.K.
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 1992.04b
    • /
    • pp.436-443
    • /
    • 1992
  • Presented in the paper is a systematic approach to constructing a Petri net model of an FAS (flexible assembly system). Petri nets are widely used in modeling automated manufacturing systems. However, it was found to be very difficult for an FA engineer to build a correct model of an FAS from scratch with Petri net symbols (i.e., place, transition, and token). An automated manufacturing system in general is built from a set of "standard" hardware components; an FAS in particular is usually composed of assembly robots, work tables, conveyor lines, buffer storages, part feeders, etc. In the proposed modeling scheme, each type of standard resource is represented as a standard "module," which is a sub-Petri net. The model of an FAS can then be conveniently constructed using the predefined modules in the same way the FAS itself is built from the standard components. The network representation of an FAS is termed a JR-net (job-resource relation net), which is easy to construct. This JR-net is then mechanically converted into a formal Petri net to simulate the behavior of the FAS. The proposed modeling scheme may easily be extended to the modeling of other types of automated manufacturing systems such as FMS and AS/RS.

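A minimal sketch of the modular idea in the entry above, with hypothetical names: a tiny place/transition net plus one reusable "standard resource" module (a bounded buffer) that exposes in/out places, so that modules can be wired together the way the FAS is assembled from standard components.

```python
class PetriNet:
    """Minimal place/transition net: places hold token counts,
    transitions name their input and output places (all arc weights are 1)."""
    def __init__(self):
        self.marking = {}        # place name -> token count
        self.transitions = {}    # transition name -> (input places, output places)

    def add_place(self, name, tokens=0):
        self.marking[name] = tokens

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking[p] > 0 for p in inputs)

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        assert self.enabled(name), f"{name} is not enabled"
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] += 1


def buffer_module(net, tag, capacity):
    """One reusable 'standard resource' sub-net: a bounded buffer whose
    tag.in / tag.out places are the connection points for other modules."""
    net.add_place(f"{tag}.in")
    net.add_place(f"{tag}.out")
    net.add_place(f"{tag}.free", capacity)
    net.add_place(f"{tag}.full")
    net.add_transition(f"{tag}.put", [f"{tag}.in", f"{tag}.free"], [f"{tag}.full"])
    net.add_transition(f"{tag}.get", [f"{tag}.full"], [f"{tag}.free", f"{tag}.out"])

net = PetriNet()
buffer_module(net, "B1", capacity=2)
net.marking["B1.in"] = 1   # a part arrives at the buffer input
net.fire("B1.put")         # stored: B1.full == 1, one free slot remains
```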

Analysis of Cleavage Fracture Toughness of PCVN Specimens Based on a Scaling Model (PCVN 시편 파괴인성의 균열 깊이 영향에 대한 Scaling 모델 해석)

  • Park, Sang-Yun;Lee, Ho-Jin;Lee, Bong-Sang
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.33 no.4
    • /
    • pp.409-416
    • /
    • 2009
  • Standard procedures for fracture toughness testing impose very severe restrictions on the specimen geometry to eliminate size effects on the measured properties. As a result, using standard fracture toughness data makes the integrity assessment unreasonably conservative. However, a realistic fracture in general structures, such as nuclear power plants, may develop under the low-constraint condition of large-scale yielding with a shallow surface crack. In this paper, cleavage fracture toughness tests were performed on side-grooved PCVN (precracked Charpy V-notch) type specimens (10 by 10 by 55 mm) with various crack depths. The constraint effects for the different crack depth ratios were evaluated quantitatively by the developed scaling method using 3-D finite element analysis. After the fracture toughness correction from the scaling model, the statistical size effects were also corrected according to the standard ASTM E 1921 procedure. The results were evaluated through a comparison with the $T_0$ of the standard CT specimen. The corrected $T_0$ for all of the PCVN specimens showed good agreement to within $5.4^{\circ}C$ regardless of the crack depth, while the averaged PCVN $T_0$ was $13.4^{\circ}C$ higher than the actual CT test results.
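The ASTM E 1921 statistical size correction mentioned above is a fixed step of the Master Curve method: toughness values measured on thin specimens are converted to their 1T (25.4 mm) equivalent by a weakest-link thickness adjustment before $T_0$ is estimated. A sketch is given below for reference; the numbers in the example are illustrative only, and the paper's scaling-model correction for crack depth is a separate step that is not shown.

```python
def kjc_to_1T(kjc_x, b_x_mm, b_1t_mm=25.4, k_min=20.0):
    """ASTM E1921 weakest-link thickness adjustment: convert a cleavage
    toughness value (MPa*sqrt(m)) measured on a specimen of thickness
    b_x_mm to its 1T-equivalent value."""
    return k_min + (kjc_x - k_min) * (b_x_mm / b_1t_mm) ** 0.25

# Illustrative value for a 10 mm thick PCVN specimen:
print(kjc_to_1T(kjc_x=180.0, b_x_mm=10.0))   # ~147 MPa*sqrt(m)
```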

IOT-based SMEs producing standardized information system model analysis and design (IOT기반 중소기업 생산정보화시스템 표준화 모델 분석 및 설계)

  • Yoon, Kyungbae;Chang, Younghyun
    • The Journal of the Convergence on Culture Technology
    • /
    • v.2 no.1
    • /
    • pp.87-91
    • /
    • 2016
  • This study develops a standard model for establishing an IoT-based production information system and analyzes its effects. IT service providers and SMEs that want to build a production information system can apply the standard model to build the system more effectively. The model provides ease of construction and reliability for the IoT production information system by removing irrational elements, improving product quality, and reducing production cost. In addition, it can be applied to standardize the management of raw material supply and demand and the aggregation of production processes, so that a system can be constructed more effectively using standard modules.

An Open Standard-based Terrain Tile Production Chain for Geo-referenced Simulation

  • Yoo, Byoung-Hyun
    • Korean Journal of Remote Sensing
    • /
    • v.24 no.5
    • /
    • pp.497-506
    • /
    • 2008
  • The need for digital models of the real environment, such as 3D terrain or cyber city models, is increasing. Most applications related to modeling and simulation require a virtual environment constructed from geospatial information of the real world in order to guarantee the reliability and accuracy of the simulation. The most fundamental data for building a virtual environment, terrain elevation and orthogonal imagery, are acquired from the optical sensors of satellites or airplanes. Providing interoperable and reusable digital models is important to promote the practical application of high-resolution satellite imagery. This paper presents new research on the representation of geospatial information, especially the 3D shape and appearance of virtual terrain, and describes a framework for constructing a real-time 3D model of large terrain based on high-resolution satellite imagery. It provides an infrastructure for 3D simulation with geographical context. Web architecture, the XML language, and open protocols for building standard-based 3D terrain are presented, together with details of the standard-based approach for providing an infrastructure for real-time 3D simulation using high-resolution satellite imagery. This work would facilitate interchange and interoperability across diverse systems and be usable by governments, industry, scientists, and the general public.
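As a generic illustration of an open tiling convention (not the specific production chain in the paper), the quad-tree indexing used by slippy-map/WMTS-style services maps a geographic coordinate to a tile address at a given zoom level:

```python
import math

def latlon_to_tile(lat_deg, lon_deg, zoom):
    """Slippy-map / WMTS-style quad-tree tile index for a WGS84 coordinate
    (Web Mercator layout); shown only as an example of an open tiling scheme."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    y = int((1.0 - math.log(math.tan(lat) + 1.0 / math.cos(lat)) / math.pi) / 2.0 * n)
    return zoom, x, y

print(latlon_to_tile(37.5665, 126.9780, 12))   # a tile address covering Seoul
```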

The Method to Setup the Path Loss Model by the Partial Interval Analysis in the Cellular Band

  • Park, Kyung-Tae;Bae, Sung-Hyuk
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.14 no.2
    • /
    • pp.105-109
    • /
    • 2013
  • Representative propagation models include the free-space model, the direct-path and ground-reflected model, the Egli model, and the Okumura-Hata model. Measurements taken in the PNG area were used as the experimental data in this paper. The newly proposed partial-interval analysis method is applied to the measured propagation data in the cellular band. The analysis interval is divided from the entire 30 km distance into 5 km segments, and then into 1 km segments. The best-fit propagation model is chosen for each partial interval. The means and standard deviations are calculated for the differences between the measured data and the models on all partial intervals. By using the 5 km or 1 km partial-interval analysis, the standard deviation between the measured data and the partial propagation models was improved by more than 1.7 dB.
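A small sketch of the partial-interval bookkeeping described above, using the free-space model as the reference (the 32.45 dB constant holds for d in km and f in MHz); the 850 MHz carrier and 5 km segment width in the example are placeholder values, not the paper's settings.

```python
import numpy as np

def fspl_db(d_km, f_mhz):
    """Free-space path loss in dB, with d in km and f in MHz."""
    return 32.45 + 20 * np.log10(d_km) + 20 * np.log10(f_mhz)

def interval_std(dist_km, measured_db, model, width_km=5.0):
    """Standard deviation of (measured - model) inside each width_km segment,
    instead of a single deviation over the whole span."""
    edges = np.arange(0.0, dist_km.max() + width_km, width_km)
    stds = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (dist_km >= lo) & (dist_km < hi)
        if mask.any():
            residual = measured_db[mask] - model(dist_km[mask])
            stds.append(residual.std())
    return stds

# e.g. interval_std(d_km, path_loss_db, lambda d: fspl_db(d, 850.0), width_km=5.0)
```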

Mathematical Modeling of VSB-Based Digital Television Systems

  • Kim, Hyoung-Nam;Lee, Yong-Tae;Kim, Seung-Won
    • ETRI Journal
    • /
    • v.25 no.1
    • /
    • pp.9-18
    • /
    • 2003
  • We mathematically analyze the passband vestigial sideband (VSB) system for the Advanced Television Systems Committee (ATSC) digital television standard and present a baseband-equivalent VSB model. The obtained baseband VSB model is represented by convolution of the transmission signal (before modulation) and the baseband equivalent of the complex VSB channel. Due to the operation of the physical channel as an RF passband and the asymmetrical property of VSB modulation, it is necessary to use a complex model. However, the passband channel may be reduced to an equivalent baseband. We show how to apply standard channel model information such as delay, gain, and phase for multiple signal paths to compute both the channel frequency response with a given carrier frequency and the resulting demodulated impulse response. Simulation results illustrate that the baseband VSB model is equivalent to the passband VSB model.

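A generic sketch of how per-path delay, gain, and phase become a complex baseband-equivalent impulse response: each path is de-rotated by exp(-j·2π·f_c·τ) and placed at the nearest symbol-spaced tap. The VSB filtering in the paper's model is omitted, and the 3-path profile and UHF carrier frequency below are hypothetical; only the ATSC symbol rate (about 10.76 Msymbols/s) is taken from the standard.

```python
import numpy as np

def baseband_channel(delays_s, gains_db, phases_deg, f_c_hz, f_sym_hz):
    """Complex baseband-equivalent impulse response, sampled at the symbol rate,
    built from per-path delay/gain/phase of a passband multipath channel."""
    taps = np.zeros(int(round(max(delays_s) * f_sym_hz)) + 1, dtype=complex)
    for tau, g_db, ph in zip(delays_s, gains_db, phases_deg):
        a = 10 ** (g_db / 20.0) * np.exp(1j * np.radians(ph))
        # Carrier de-rotation: each path picks up exp(-j*2*pi*f_c*tau) at baseband.
        taps[int(round(tau * f_sym_hz))] += a * np.exp(-2j * np.pi * f_c_hz * tau)
    return taps

# Hypothetical 3-path profile on a UHF carrier, ATSC symbol rate ~10.762 Msym/s
h = baseband_channel(delays_s=[0.0, 2e-6, 5e-6], gains_db=[0.0, -6.0, -12.0],
                     phases_deg=[0.0, 45.0, 90.0], f_c_hz=617e6, f_sym_hz=10.762e6)
```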

Cancer Genomics Object Model: An Object Model for Cancer Research Using Microarray

  • Park, Yu-Rang;Lee, Hye-Won;Cho, Sung-Bum;Kim, Ju-Han
    • Proceedings of the Korean Society for Bioinformatics Conference
    • /
    • 2005.09a
    • /
    • pp.29-34
    • /
    • 2005
  • DNA microarrays have become a major tool for the investigation of global gene expression in all aspects of cancer and biomedical research. A DNA microarray experiment generates enormous amounts of data, which are meaningful only in the context of a detailed description of the microarrays, biomaterials, and conditions under which they were generated. The MicroArray Gene Expression Data (MGED) Society has established microarray standards for the structured management of these diverse and voluminous data. The MGED MAGE-OM (MicroArray Gene Expression Object Model) is an object-oriented data model that attempts to define standard objects for gene expression. To make DNA microarray analysis relevant to cancer research, clinical and genomic data must be combined. MAGE-OM, however, does not have an appropriate structure for describing the clinical information of cancer. For the systematic integration of gene expression and clinical data, we created a new model, the Cancer Genomics Object Model.

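A hypothetical sketch, in plain dataclasses, of the kind of linkage the abstract calls for: clinical attributes attached to a sample object inside a microarray experiment. The class and field names are illustrative only, not the actual Cancer Genomics Object Model or MAGE-OM definitions.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ClinicalRecord:
    patient_id: str
    diagnosis: str                       # e.g. tumor type / diagnosis code
    stage: Optional[str] = None
    survival_months: Optional[float] = None

@dataclass
class BioSample:
    sample_id: str
    tissue: str
    clinical: ClinicalRecord             # the clinical extension point missing from MAGE-OM

@dataclass
class MicroarrayExperiment:
    experiment_id: str
    platform: str                        # array design identifier
    samples: List[BioSample] = field(default_factory=list)
```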