• Title/Summary/Keyword: Data modeling


Reasonable Load Characteristic Experiment for Component Load Modeling (개별 부하모델링을 위한 부하의 합리적인 특성실험)

  • Ji, Pyeong-Sik;Lee, Jong-Pil;Im, Jae-Yun;Chu, Jin-Bu;Kim, Jeong-Hun
    • The Transactions of the Korean Institute of Electrical Engineers A
    • /
    • v.51 no.2
    • /
    • pp.45-52
    • /
    • 2002
  • Load modeling is classified into two approaches: the so-called measurement-based and component-based methods. The measurement-based method models load characteristics measured directly at substations and feeders, but it is difficult to measure load characteristics continuously from naturally occurring system variation. The component-based method consists of the following processes: component load modeling, composition-rate estimation, and aggregation of component loads. In this paper, characteristic experiments on component loads were performed to obtain data for component load modeling under the component-based method. First, representative component loads were selected by the proposed method, considering both the accuracy of load modeling and the feasibility of performing component load experiments in the laboratory. An algorithm was also proposed to verify the reliability of the data obtained from the component load characteristic experiments. Finally, the results are presented as case studies.
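The composition-and-aggregation step described above can be sketched as follows; the exponential load model and all parameter values here are illustrative assumptions, not the paper's measured data:

```python
# Sketch of component-based load aggregation (hypothetical parameter values).
# Each component load follows an exponential model P = P0 * (V / V0) ** alpha;
# the aggregate load is the composition-rate-weighted sum of the components.

def component_power(p0, alpha, v, v0=1.0):
    """Active power of one component load at per-unit voltage v."""
    return p0 * (v / v0) ** alpha

def aggregate_load(components, rates, v):
    """Weighted aggregation of component loads by composition rate."""
    assert abs(sum(rates) - 1.0) < 1e-9, "composition rates must sum to 1"
    return sum(r * component_power(p0, alpha, v)
               for (p0, alpha), r in zip(components, rates))

# Illustrative components: resistive heating (alpha = 2), induction motor
# (alpha ~ 0.1), lighting (alpha ~ 1.5), each normalized to P0 = 1 p.u.
components = [(1.0, 2.0), (1.0, 0.1), (1.0, 1.5)]
rates = [0.3, 0.5, 0.2]
print(aggregate_load(components, rates, v=0.95))
```

At nominal voltage (v = 1.0) the aggregate is exactly the weighted sum of the nominal powers, which gives a quick sanity check on any estimated composition rates.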

Hacking Detection Mechanism of Cyber Attacks Modeling (외부 해킹 탐지를 위한 사이버 공격 모델링)

  • Cheon, Yang-Ha
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.8 no.9
    • /
    • pp.1313-1318
    • /
    • 2013
  • In order to respond actively to cyber attacks, not only security systems such as IDS, IPS, and firewalls, but also ESM, a system that detects cyber attacks by analyzing various log data, are preferably deployed. However, as attacks become more elaborate and advanced, existing signature-based detection methods face their limitations. In response, research on symptom detection technology based on attack modeling, employing big-data analysis, is actively ongoing. Symptom detection is effective when the features of attacks can be accurately extracted and manipulated to carry out the attack modeling. We propose ways to extract attack features that can serve as the basis of the modeling, and to detect intelligent threats by carrying out scenario-based modeling.
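As an illustration of the kind of feature extraction described above (not the paper's actual method), a minimal sketch that flags a brute-force login scenario from raw log events:

```python
# Illustrative sketch: extracting a simple attack feature from log events for
# scenario-based modeling -- repeated failed logins from one source within a
# time window. The event format, window, and threshold are all assumptions.
from collections import defaultdict

def extract_bruteforce_feature(events, window=60, threshold=5):
    """events: list of (timestamp_s, src_ip, action). Returns suspicious IPs."""
    fails = defaultdict(list)
    for ts, ip, action in events:
        if action == "login_fail":
            fails[ip].append(ts)
    suspicious = set()
    for ip, times in fails.items():
        times.sort()
        for i in range(len(times)):
            # count failures inside the sliding window starting at times[i]
            j = i
            while j < len(times) and times[j] - times[i] <= window:
                j += 1
            if j - i >= threshold:
                suspicious.add(ip)
                break
    return suspicious

events = [(t, "10.0.0.5", "login_fail") for t in range(0, 50, 10)] \
       + [(5, "10.0.0.9", "login_fail")]
print(extract_bruteforce_feature(events))  # → {'10.0.0.5'}
```

Such per-scenario features (counts within windows, per-source rates) are what a symptom-detection model would then consume, rather than raw signatures.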

Information Strategy Planning for Digital Infrastructure Building with Geo-based Nonrenewable Resources Information in Korea: Conceptual Modeling Units

  • Chi, Kwang-Hoon;Yeon, Young-Kwang;Park, No-Wook;Lee, Ki-Won
    • Proceedings of the KSRS Conference
    • /
    • 2002.10a
    • /
    • pp.191-196
    • /
    • 2002
  • From this year, KIGAM, one of the Korean government-supported research institutes, has started a new national program for building a digital geologic/natural resources infrastructure. The goal of this program is to prepare a digitally oriented infrastructure for practical digital database building, management, and public services covering numerous types of paper maps related to geo-scientific resources, or geologic thematic map sets: hydro-geologic maps, applied geologic maps, geo-chemical maps, airborne radiometric/magnetic maps, coal geologic maps, offshore bathymetry maps, and so forth. As for the digital infrastructure, the research issues in this topic comprise: ISP (Information Strategy Planning), geo-framework modeling of each map set, pilot database building, a cyber geo-mineral directory service system, and an upgrade of the web-based geologic information retrieval system that serves Korean digital geologic maps at 1:50K scale. In this study, UML (Unified Modeling Language)-based data modeling of geo-data sets by and in KIGAM is mainly discussed, and its results are presented from the viewpoint of digital geo-modeling ISP. It is expected that this model will be further developed to serve as a guide or framework model for geologic thematic mapping and practical database building, as well as for other types of national thematic map databases.


How Practitioners Perceive a Ternary Relationship in ER Conceptual Modeling

  • Jihae Suh;Jinsoo Park;Buomsoo Kim;Hamirahanim Abdul Rahman
    • Asia pacific journal of information systems
    • /
    • v.28 no.2
    • /
    • pp.75-92
    • /
    • 2018
  • Conceptual modeling is well suited as a subject that constitutes the "core" of the Information Systems (IS) discipline and has grown in response to IS development. Several modeling grammars and methods have been studied extensively in the IS discipline. Previous studies, however, exhibit deficiencies in their research methods and even report contradictory results about the ternary relationship in conceptual modeling. For instance, some studies contend that the semantics of a binary relationship are better for novices, while others argue that a ternary relationship is better than three binary relationships when an association among three entity types clearly exists. The objective of this research is to acquire a complete and accurate understanding of the ternary relationship, specifically of practitioners' modeling performance when utilizing either a ternary or a binary relationship. To the best of our knowledge, no previous work clearly compares real-world modelers' performance differences between binary and ternary representations. By investigating practitioners' understanding of the ternary relationship and identifying practitioners' cognition, this research can broaden the perspective on conceptual modeling.
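The semantic gap behind the binary-versus-ternary debate can be shown concretely; the supplier-part-project example below is a classic textbook illustration (the "connection trap"), not data from the study:

```python
# Classic illustration of why three binary relationships cannot always
# reconstruct a ternary one: joining the binary projections back together
# can produce tuples that were never true.

# Ternary facts: supplier s supplies part p to project j.
ternary = {("s1", "p1", "j2"), ("s2", "p1", "j1"), ("s1", "p2", "j1")}

# Project the ternary relationship onto three binary relationships.
supplies = {(s, p) for s, p, j in ternary}   # supplier-part
ships_to = {(s, j) for s, p, j in ternary}   # supplier-project
uses     = {(p, j) for s, p, j in ternary}   # part-project

# Rejoining the binaries yields a spurious tuple: (s1, p1, j1) was never
# a fact, yet every pairwise projection of it holds.
rejoined = {(s, p, j)
            for s, p in supplies
            for s2, j in ships_to if s2 == s
            for p2, j2 in uses if p2 == p and j2 == j}
print(rejoined - ternary)  # → {('s1', 'p1', 'j1')}
```

This is exactly the situation where, as some of the cited studies argue, a single ternary relationship carries semantics that three binaries cannot.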

A Development of an UML-Based Business Process Modeling Tool Generating Standard-Compliant Workflow Definition Data (표준 워크플로우 정의 데이터를 산출하는 UML 기반 프로세스 모델링 도구 개발)

  • Han Gwan Il;Hwang Tae Il
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 2003.05a
    • /
    • pp.1085-1092
    • /
    • 2003
  • Proposed in this paper is a standard-compliant business process modeling tool that is based on the UML (Unified Modeling Language) activity diagram and produces an XPDL (XML Process Definition Language) file as its output. XPDL is a standard process definition exchange format defined by the WfMC (Workflow Management Coalition). To develop a UML/XPDL-based modeling tool, the modeling elements of the activity diagram were mapped to the XPDL format after a detailed analysis of each modeling specification. This mapping revealed that the modeling elements of both the activity diagram and XPDL must be extended, so new modeling elements were identified and added to each specification. Based on this mapping, a prototype system was developed, and the usefulness of the developed system is shown through a case study.
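The activity-diagram-to-XPDL mapping can be sketched as follows; the element names follow WfMC XPDL conventions, but this is a simplified illustration, not the paper's tool or a schema-complete emitter:

```python
# Simplified sketch of emitting workflow definition data in XPDL-like XML:
# activity-diagram nodes become <Activity> elements, control-flow edges
# become <Transition> elements. Namespaces and most required XPDL
# attributes are omitted for brevity.
import xml.etree.ElementTree as ET

def activities_to_xpdl(process_id, activities, transitions):
    """activities: [(id, name)], transitions: [(id, from_id, to_id)]."""
    pkg = ET.Element("Package")
    proc = ET.SubElement(pkg, "WorkflowProcess", Id=process_id)
    acts = ET.SubElement(proc, "Activities")
    for aid, name in activities:
        ET.SubElement(acts, "Activity", Id=aid, Name=name)
    trans = ET.SubElement(proc, "Transitions")
    for tid, src, dst in transitions:
        ET.SubElement(trans, "Transition", Id=tid, From=src, To=dst)
    return ET.tostring(pkg, encoding="unicode")

xml_text = activities_to_xpdl(
    "order_process",
    [("a1", "Receive Order"), ("a2", "Ship Order")],
    [("t1", "a1", "a2")],
)
print(xml_text)
```

A real tool would additionally carry the extended modeling elements the paper identifies, plus package headers and participant declarations required by the XPDL schema.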


Modeling of the triangle optimum shape in the surface of an Aluminum dome structure (알루미늄 돔 구조물에서 표면의 삼각형 최적 형상 모델링)

  • 이성철;조종두
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 1997.10a
    • /
    • pp.647-650
    • /
    • 1997
  • A complete dome structure is based on a basic dome model, and the basic dome model affects the safety of the dome structure. In order to save manufacturing expenses, optimum-shape modeling of a dome structure is necessary before manufacturing the dome. This study focuses on modeling the optimum triangle shape on the surface of an aluminum dome, to optimize the shape of the dome and save manufacturing expenses. After a systematic procedure for the basic modeling was established, the procedure was programmed. The program is written in C, and its reliability is demonstrated by comparing the program's output with the basic model in PATRAN.


An XPDL-Based Workflow Control-Structure and Data-Sequence Analyzer

  • Kim, Kwanghoon Pio
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.13 no.3
    • /
    • pp.1702-1721
    • /
    • 2019
  • A workflow process (or business process) management system helps to define, execute, monitor, and manage workflow models deployed on a workflow-supported enterprise; the system is generally compartmentalized into a modeling subsystem and an enacting subsystem. The modeling subsystem's functionality is to discover and analyze workflow models via a theoretical modeling methodology like ICN, to define them graphically via a graphical representation notation like BPMN, and to deploy those graphically defined models systematically onto the enacting subsystem by transforming them into textual models represented by a standardized workflow process definition language like XPDL. Before deploying defined workflow models, it is very important to inspect their syntactical correctness as well as their structural properness, to minimize the loss of effectiveness and efficiency in managing the corresponding workflow models. In this paper, we are particularly interested in verifying very large-scale and massively parallel workflow models, and so we need a sophisticated analyzer to automatically analyze these specialized and complex styles of workflow models. The analyzer devised in this paper analyzes not only the structural complexity but also the data-sequence complexity. The structural complexity is based upon combinational usages of control-structure constructs such as subprocess, exclusive-OR, parallel-AND, and iterative-LOOP primitives, preserving the matched-pairing and proper-nesting properties, whereas the data-sequence complexity is based upon combinational usages of the relevant data repositories, such as data definition sequences and data use sequences. Through the analyzer devised and implemented in this paper, we eventually achieve systematic verification of the syntactical correctness as well as effective validation of the structural properness of these complicated and large-scale styles of workflow models. As an experimental study, we apply the implemented analyzer to an exemplary large-scale and massively parallel workflow process model, the Large Bank Transaction Workflow Process Model, and show the structural complexity analysis results via a series of operational screens captured from the implemented analyzer.
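The matched-pairing and proper-nesting check described above is essentially a balanced-delimiter problem over control-structure constructs; a minimal stack-based sketch (construct names here are illustrative, not the analyzer's actual token set):

```python
# Stack-based sketch of a matched-pairing / proper-nesting check on a
# linearized sequence of workflow control-structure constructs: every
# split/begin must be closed by its own matching join/end, with no crossing.

OPENERS = {"AND-split": "AND-join", "XOR-split": "XOR-join",
           "LOOP-begin": "LOOP-end"}
CLOSERS = set(OPENERS.values())

def is_properly_nested(constructs):
    """True iff every split/begin is closed by its matching join/end."""
    stack = []
    for c in constructs:
        if c in OPENERS:
            stack.append(OPENERS[c])       # remember the required closer
        elif c in CLOSERS:
            if not stack or stack.pop() != c:
                return False               # mismatched or unopened closer
    return not stack                       # everything opened was closed

print(is_properly_nested(
    ["AND-split", "XOR-split", "XOR-join", "AND-join"]))  # nested: True
print(is_properly_nested(
    ["AND-split", "XOR-split", "AND-join", "XOR-join"]))  # crossed: False
```

A full analyzer works on the process graph rather than a flat sequence, but the pairing invariant it enforces is the same.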

Speaker Verification with the Constraint of Limited Data

  • Kumari, Thyamagondlu Renukamurthy Jayanthi;Jayanna, Haradagere Siddaramaiah
    • Journal of Information Processing Systems
    • /
    • v.14 no.4
    • /
    • pp.807-823
    • /
    • 2018
  • Speaker verification system performance depends on the utterance of each speaker; to verify the speaker, important information has to be captured from the utterance. Under the constraint of limited data, speaker verification has become a challenging task: the testing and training data amount to only a few seconds. The feature vectors extracted by single frame size and rate (SFSR) analysis are not sufficient for training and testing speakers in speaker verification. This leads to poor speaker modeling during training and may not provide good decisions during testing. The problem can be resolved by increasing the number of feature vectors extracted from training and testing data of the same duration. For that, we use multiple frame size (MFS), multiple frame rate (MFR), and multiple frame size and rate (MFSR) analysis techniques for speaker verification under the limited-data condition. These analysis techniques extract relatively more feature vectors during training and testing, yielding improved modeling and testing with limited data. To demonstrate this, we use mel-frequency cepstral coefficients (MFCC) and linear prediction cepstral coefficients (LPCC) as features. A Gaussian mixture model (GMM) and a GMM-universal background model (GMM-UBM) are used for modeling the speakers. The database used is NIST-2003. The experimental results indicate that the performance of MFS, MFR, and MFSR analysis is radically better than that of SFSR analysis, and that LPCC-based MFSR analysis performs better than the other analysis and feature extraction techniques.
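The core idea of MFS/MFR/MFSR analysis, more feature vectors from the same limited-duration signal, can be sketched with a simple frame count; the frame sizes and shifts below are illustrative, not the paper's settings:

```python
# Sketch of why multiple frame size / frame rate (MFSR) analysis yields more
# feature vectors than single frame size and rate (SFSR) under limited data.
# One feature vector (e.g., MFCC or LPCC) is extracted per analysis frame.

def frame_count(duration_ms, frame_ms, shift_ms):
    """Number of full analysis frames in a signal of the given duration."""
    if duration_ms < frame_ms:
        return 0
    return 1 + (duration_ms - frame_ms) // shift_ms

duration = 3000  # 3 s of speech -- the "limited data" condition

# SFSR: one frame size (20 ms) and one frame shift (10 ms).
sfsr = frame_count(duration, 20, 10)

# MFSR: several frame sizes x several shifts, feature vectors pooled.
sizes, shifts = [16, 20, 24, 32], [5, 10]
mfsr = sum(frame_count(duration, f, s) for f in sizes for s in shifts)

print(sfsr, mfsr)  # → 299 3577
```

The pooled vectors then train the GMM / GMM-UBM speaker models, which is where the extra data pays off when utterances are only a few seconds long.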

A Study on Improvement of the Use and Quality Control for New GNSS RO Satellite Data in Korean Integrated Model (한국형모델의 신규 GNSS RO 자료 활용과 품질검사 개선에 관한 연구)

  • Kim, Eun-Hee;Jo, Youngsoon;Lee, Eunhee;Lee, Yong Hee
    • Atmosphere
    • /
    • v.31 no.3
    • /
    • pp.251-265
    • /
    • 2021
  • This study examined the impact of assimilating the bending angle (BA) obtained via global navigation satellite system radio occultation (GNSS RO) from three new satellites (KOMPSAT-5, FY-3C, and FY-3D) on the analyses and forecasts of a numerical weather prediction model. Numerical data assimilation experiments were performed using a three-dimensional variational data assimilation system in the Korean Integrated Model (KIM) at a 25-km horizontal resolution for August 2019. Three experiments were designed to select the height and quality-control thresholds for using the data. A comparison with an analysis of the European Centre for Medium-Range Weather Forecasts (ECMWF) integrated forecast system showed a clear positive impact of BA assimilation on Southern Hemisphere tropospheric temperature and stratospheric wind, compared with the case without assimilation of the three new satellites. The impact of the new data in the upper atmosphere was also compared with observations from the infrared atmospheric sounding interferometer (IASI). Overall, the high-volume GNSS RO data help reduce the RMSE quantitatively in the analysis and forecast fields. The analysis and forecast performance for upper-level temperature and wind were improved in both the Southern and Northern Hemispheres.
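A generic quality-control filter of the kind mentioned above can be sketched as follows; the height range and departure threshold are hypothetical, not the thresholds selected in the study:

```python
# Illustrative quality-control sketch for observation assimilation: reject
# bending-angle observations whose normalized departure from the model
# background exceeds a threshold, and those outside an assimilated height
# range. All numbers below are hypothetical.

def qc_filter(obs, background, sigma, height_km,
              max_norm_departure=3.0, min_height=0.0, max_height=60.0):
    """Return indices of observations that pass quality control."""
    passed = []
    for i, (o, b, s, z) in enumerate(zip(obs, background, sigma, height_km)):
        if not (min_height <= z <= max_height):
            continue                      # outside assimilated height range
        if abs(o - b) / s > max_norm_departure:
            continue                      # gross departure from background
        passed.append(i)
    return passed

obs        = [0.020, 0.015, 0.080, 0.005]   # observed bending angles (rad)
background = [0.019, 0.014, 0.016, 0.005]   # model-equivalent values
sigma      = [0.002, 0.002, 0.002, 0.002]   # assumed observation errors
height_km  = [10.0, 20.0, 30.0, 70.0]
print(qc_filter(obs, background, sigma, height_km))  # → [0, 1]
```

Tightening or loosening such thresholds trades observation volume against the risk of assimilating bad data, which is precisely what the three experiments in the study were designed to balance.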

Avionics Software Data Modeling Method and Test For FACE Conformance (FACE 적합성을 위한 항공전자 소프트웨어 데이터 모델링 방안 및 검증)

  • Kyeong-Yeon, Cho;Doo-Hwan, Lee;Sang-Cheol, Cha;Jeong-Yeol, Kim
    • Journal of Aerospace System Engineering
    • /
    • v.16 no.6
    • /
    • pp.45-53
    • /
    • 2022
  • The avionics industry has recently adopted open architectures to increase software portability and to reduce the development schedule and cost associated with changing hardware equipment. This paper presents a data modeling method compliant with FACE, a widely used open avionics architecture. A FACE data model is designed and implemented to output data from VOR/ILS avionics equipment, and the FACE Conformance Test Suite (CTS) program is used to verify that the data model meets the FACE standard. The proposed data modeling method is expected to reduce the development schedule and cost associated with modifying communication methods and ICDs (Interface Control Documents).