• Title/Summary/Keyword: data traceability

A Study on the Test and Evaluation Process Development for Korea Next Generation Highspeed Electric Multiple Unit (차세대 고속열차 시험평가 프로세스에 관한 연구)

  • Lee, Tae-Hyung;Kim, Sang-Soo;Kim, Seog-Won;Kim, Ki-Hwan;Chung, Heung-Chai
    • Journal of the Korean Society of Systems Engineering / v.7 no.2 / pp.7-11 / 2011
  • A high-speed railway system is a typical example of a large-scale, multi-disciplinary system, consisting of subsystems such as rolling stock, electrical hardware, electronics, control, information, communication, and civil technology. The system design and acquisition data of such a large-scale system must be subject to strict configuration control and management. The systems engineering technology development project for the Korea next-generation High-speed Electric Multiple Unit (HEMU), currently in progress, is a national large-system development project that is not only large and complex but also multi-disciplinary in nature. Therefore, all stakeholders must understand and share the functional and performance requirements of HEMU throughout its life-cycle phases, and in the test and evaluation phase all system requirements must be verified. Prototype train manufacturing will be completed in 2011; the train will then undergo trial runs on a commercial line, and all system requirements will be verified by 2012. For system verification, the test and evaluation process has to be established before the trial runs. Using a systems engineering tool, a system design database (SDD) with requirements traceability and development process management has to be established over the course of development. This paper presents the development of the test and evaluation process based on the SEMP (Systems Engineering Management Plan) produced in the design stage; the process is refined and updated relative to the design-stage version. The test and evaluation process consists of procedures, test and evaluation methods, and a schedule, so that it defines which test verifies each system requirement and roughly when.
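
To make the traceability idea concrete, here is a minimal, hypothetical sketch of the kind of record an SDD might hold, linking each system requirement to the test that verifies it and its scheduled date. All identifiers below are invented for illustration; they are not taken from the HEMU project.

```python
# Hypothetical sketch: a requirements-to-test traceability record.
from dataclasses import dataclass
from datetime import date

@dataclass
class TraceRecord:
    req_id: str       # system requirement identifier
    statement: str    # requirement text
    method: str       # verification method: test / analysis / inspection / demo
    test_id: str      # test event that verifies this requirement
    scheduled: date   # planned date within the trial-run schedule

records = [
    TraceRecord("HEMU-PERF-001", "Maximum operating speed target met",
                "test", "TRIAL-RUN-01", date(2012, 5, 1)),
    TraceRecord("HEMU-BRK-014", "Emergency braking distance within limit",
                "test", "TRIAL-RUN-07", date(2012, 6, 15)),
]

# Answers "which test verifies each requirement, and roughly when":
for r in records:
    print(f"{r.req_id} -> {r.test_id} ({r.method}) on {r.scheduled}")
```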

On Improving the Test and Evaluation Process by Incorporating the RAMS and Risk Management Processes (무기체계 개발에서 RAMS 및 위험 관리를 통한 시험평가 프로세스의 개선에 관한 연구)

  • Shin, Young-Don;Sim, Sang-Hyun;Lee, Jae-Chon
    • Journal of the Korea Safety Management & Science / v.16 no.2 / pp.31-42 / 2014
  • As weapon systems become more complex in scale and functionality, the time required to complete the test and evaluation (T&E) process inevitably grows. Nevertheless, reducing the T&E period has become one of the core targets of weapon-system acquisition programs, because a shorter T&E process can reduce the defense budget and speed the deployment of the weapon systems, providing a competitive edge over rival countries. On the other hand, in weapon systems development the management of reliability, availability, maintainability and safety (RAMS) and of risk is considered important for maintaining competitiveness, and has thus far been carried out separately. The objective of this paper is therefore to improve the T&E process by integrating the RAMS and risk management processes into it. To do so, the related processes are first analyzed and modeled, and an integrated process model is then developed. The resulting model provides traceability among the data and interfaces generated by the T&E and RAMS/risk processes. As a case study, the model is applied to tank development. Effective use of this traceability is expected to reduce the time and cost required to complete the T&E process.
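
As an illustration only (the paper's actual data model is not shown here), a record in such an integrated model might cross-link a T&E finding to the requirement it verifies and to the RAMS figures and risk-register items it affects; every name below is hypothetical.

```python
# Hypothetical sketch of a cross-linked record in an integrated
# T&E + RAMS/risk process model.
from dataclasses import dataclass, field

@dataclass
class Finding:
    test_id: str
    requirement_id: str
    rams_items: list[str] = field(default_factory=list)  # e.g. MTBF, availability
    risk_ids: list[str] = field(default_factory=list)    # risk-register entries

finding = Finding("TE-042", "REQ-MOB-003",
                  rams_items=["MTBF-powerpack"], risk_ids=["RISK-117"])

# With links like these, a failed test immediately surfaces the affected
# reliability figures and open risks instead of being reconciled by hand.
print(finding)
```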

Practical Software Architecture Design Methods for Non-Conventional Quality Requirements (비전형적인 품질 요구사항을 고려한 실용적 소프트웨어 아키텍처 설계 기법)

  • La, Hyun Jung;Kim, Soo Dong
    • KIPS Transactions on Software and Data Engineering / v.6 no.8 / pp.391-400 / 2017
  • Software architecture plays a key role in satisfying non-functional requirements (NFRs), i.e., quality requirements and constraints. Architecture design methods and tactics for conventional NFRs are widely available in the literature. However, methods for target-system-specific, non-conventional NFRs are not readily available; instead, architects must invent design methods from their own experience and intuition, so the difficulty of designing architectures for non-conventional NFRs is quite high. In this paper, we provide a systematic architecture design methodology for non-conventional NFRs, consisting of a five-step process with detailed instructions for each step. Throughout the process, we treat traceability among artifacts and seamlessness as essential values for supporting effective architecture design. We apply the methodology to designing the architecture of a platform software system, and we believe it can be effectively utilized in designing high-quality architectures for non-conventional NFRs.

A Study on the Flow Analysis on the Software-Defined Networks through Simulation Environment Establishment (시뮬레이션 환경 구축을 통한 소프트웨어-정의 네트워크에서 흐름 분석에 관한 연구)

  • Lee, Dong-Yoon
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.13 no.1 / pp.88-93 / 2020
  • Recently, SDN technology has been applied to real communication services, the user base is growing, and as the amount of data flowing through networks increases, so does interest in managing network data flows. In this process, the confidentiality, integrity, availability, and traceability of the data transmitted on the network must be ensured. In addition, an environment is needed for observing the flow of data on a network in real time, as required in various fields, and for visually confirming its control. In this paper, we first use Mininet to construct a network topology and its various environment attributes; second, we add OpenDaylight to the Mininet environment to develop a simulation environment for visually checking and controlling the network traffic flow in the topology.
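
A minimal sketch of the kind of setup the abstract describes, using Mininet's Python API with a remote-controller entry pointing at OpenDaylight. The controller address and port are assumptions for a local ODL install (its OpenFlow southbound commonly listens on 6633 or 6653); the paper's actual topology is not reproduced here.

```python
# Sketch: build a small Mininet topology and delegate flow control
# to an external OpenDaylight instance.
from mininet.net import Mininet
from mininet.node import RemoteController, OVSSwitch
from mininet.topo import SingleSwitchTopo
from mininet.cli import CLI

net = Mininet(topo=SingleSwitchTopo(k=3), switch=OVSSwitch, controller=None)
net.addController('c0', controller=RemoteController,
                  ip='127.0.0.1', port=6633)   # assumed local ODL endpoint
net.start()
net.pingAll()   # generate traffic whose flows the controller can observe
CLI(net)        # interactive prompt for further flow experiments
net.stop()
```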

Deep learning framework for bovine iris segmentation

  • Heemoon Yoon;Mira Park;Hayoung Lee;Jisoon An;Taehyun Lee;Sang-Hee Lee
    • Journal of Animal Science and Technology / v.66 no.1 / pp.167-177 / 2024
  • Iris segmentation is an initial step for identifying the biometrics of animals when establishing a traceability system for livestock. In this study, we propose a deep learning framework for pixel-wise segmentation of bovine iris with a minimized use of annotation labels utilizing the BovineAAEyes80 public dataset. The proposed image segmentation framework encompasses data collection, data preparation, data augmentation selection, training of 15 deep neural network (DNN) models with varying encoder backbones and segmentation decoder DNNs, and evaluation of the models using multiple metrics and graphical segmentation results. This framework aims to provide comprehensive and in-depth information on each model's training and testing outcomes to optimize bovine iris segmentation performance. In the experiment, U-Net with a VGG16 backbone was identified as the optimal combination of encoder and decoder models for the dataset, achieving an accuracy and dice coefficient score of 99.50% and 98.35%, respectively. Notably, the selected model accurately segmented even corrupted images without proper annotation data. This study contributes to the advancement of iris segmentation and the establishment of a reliable DNN training framework.
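
For orientation, a minimal sketch of the reported best combination (U-Net decoder with a VGG16 encoder), built here with the third-party segmentation_models_pytorch package as one possible implementation; the paper's own training code, weights, and hyperparameters are not reproduced.

```python
# Sketch: U-Net with a VGG16 backbone for binary iris segmentation.
import torch
import segmentation_models_pytorch as smp

model = smp.Unet(
    encoder_name="vgg16",        # VGG16 backbone, as in the reported best model
    encoder_weights="imagenet",  # assumption: ImageNet pre-trained encoder
    in_channels=3,               # RGB eye images
    classes=1,                   # binary mask: iris vs. background
)

x = torch.rand(1, 3, 224, 224)   # dummy batch; H and W must be multiples of 32
with torch.no_grad():
    logits = model(x)            # (1, 1, 224, 224) pixel-wise logits
print(logits.shape)
```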

Draft Design of DataLake Framework based on Abyss Storage Cluster (Abyss Storage Cluster 기반의 DataLake Framework의 설계)

  • Cha, ByungRae;Park, Sun;Shin, Byeong-Chun;Kim, JongWon
    • Smart Media Journal / v.7 no.1 / pp.9-15 / 2018
  • As an organization grows in size, many different types of data are generated in its different systems, so a way is needed to improve efficiency by processing data more intelligently across those systems. As in a DataLake, we create a single domain model that accurately describes the data and can represent the most important data for the entire business. To realize the benefits of a DataLake, it is important to know how a DataLake can be expected to work and which architectural components help to build a fully functional one. DataLake components have a life cycle that follows the data flow: as data flows into the DataLake from the point of acquisition, its metadata is captured and managed, along with data traceability, data lineage, and security aspects based on data sensitivity, across the life cycle. For these reasons, we have designed a DataLake Framework based on the Abyss Storage Cluster.
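
A minimal sketch of metadata and lineage capture at the point of acquisition, as the abstract describes: each ingested item records its source, time, sensitivity class, and upstream datasets. Every field name below is invented for illustration and is not taken from the Abyss Storage Cluster design.

```python
# Hypothetical sketch of metadata/lineage capture on ingest.
import hashlib, json, time

def ingest(payload: bytes, source: str, sensitivity: str, parents=()):
    """Return the metadata record stored alongside the raw payload."""
    return {
        "id": hashlib.sha256(payload).hexdigest(),  # content-addressed id
        "source": source,                           # acquisition point
        "ingested_at": time.time(),
        "sensitivity": sensitivity,                 # drives security handling
        "lineage": list(parents),                   # ids of upstream datasets
    }

raw = b"sensor,reading\n42,0.97\n"
meta = ingest(raw, source="plant-a/sensor-7", sensitivity="internal")
print(json.dumps(meta, indent=2))
```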

Verification on the Measurement Uncertainty for Surface Roughness (표면거칠기측정에 대한 측정불확도 추정방법)

  • Kim, Chang-Soon;Park, Min-Won
    • Journal of the Korean Society for Precision Engineering / v.27 no.4 / pp.40-45 / 2010
  • Evaluation of uncertainty is an ongoing process that can consume time and resources, and it can require the services of someone familiar with data analysis techniques; it is therefore important for laboratory personnel approaching uncertainty analysis for the first time to be aware of the resources required. The international trend in the measurement field, aimed at guaranteeing the traceability and confidence of measurement results, is to discard the error concept and instead analyze the measurement uncertainty. In this paper, we analyze the elements of measurement uncertainty in the surface roughness test, which is important in testing mechanical parts. With the test repeated by three operators, the measurement uncertainty could be calculated.
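
A worked sketch of the Type A evaluation the abstract implies, with made-up readings: repeated measurements give a standard uncertainty of the mean of s / sqrt(n) per the GUM, and a coverage factor k = 2 gives the expanded uncertainty at roughly 95 % confidence (strictly, a Student-t factor would be larger for only three readings).

```python
# Worked sketch with hypothetical Ra readings (not from the paper).
import math, statistics

ra = [0.82, 0.85, 0.80]       # hypothetical Ra readings, micrometres
mean = statistics.mean(ra)
s = statistics.stdev(ra)      # sample standard deviation
u_a = s / math.sqrt(len(ra))  # Type A standard uncertainty of the mean
U = 2 * u_a                   # expanded uncertainty, coverage factor k = 2

print(f"Ra = {mean:.3f} um, U = {U:.3f} um (k = 2)")
```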

Study of Configuration Management Using SE Tool (SE 전산지원도구를 이용한 형상관리 방안 연구)

  • Park, Jong-Sun
    • Journal of the Korean Society of Systems Engineering / v.7 no.1 / pp.53-56 / 2011
  • Configuration management plays a key role in the systems engineering process of any project from the early stages of development. It consists of five major activities, i.e., configuration management planning, configuration identification, configuration control, configuration status accounting, and configuration verification and audit, and is essential for controlling system design, development, and operation throughout the entire life cycle of system development. It is also directly associated with another part of the systems engineering management process, i.e., technical data management, which provides traceability of important decisions and changes during development. In this paper, we describe how to apply the CASE (Computer-Aided Systems Engineering) tool Cradle to configuration management in order to make the technical management process effective.

Application of Systems Engineering in Shipbuilding Industry in Korea

  • Kim, Jinil;Park, Jongsun
    • Journal of the Korean Society of Systems Engineering / v.7 no.2 / pp.39-43 / 2011
  • Modern naval ships are large, complex systems, with the number of requirements ranging from thousands to tens of thousands. To build a quality ship, the satisfaction of these requirements should be traced. In most shipbuilding projects it is almost impossible to manage all the requirements without a proper CASE (computer-aided systems engineering) tool, and for effective management of a shipbuilding project an integrated database of technical data is very important. This paper describes how the requirements are managed and how the integrated database is built in the naval shipbuilding industry in Korea.

Development of a Quality Assurance Safety Assessment Database for Near Surface Radioactive Waste Disposal

  • Park J.W.;Kim C.L.;Park J.B.;Lee E.Y.;Lee Y.M.;Kang C.H.;Zhou W.;Kozak M.W.
    • Nuclear Engineering and Technology / v.35 no.6 / pp.556-565 / 2003
  • A quality assurance safety assessment database, called QUARK (QUality Assurance Program for Radioactive Waste Management in Korea), has been developed to manage both analysis information and the parameter database for the safety assessment of a low- and intermediate-level radioactive waste (LILW) disposal facility in Korea. QUARK is a tool that serves QA purposes by managing safety assessment information properly and securely. In QUARK, information is organized and linked to maximize its integrity and traceability. QUARK provides guidance for conducting safety assessment analyses, from scenario generation to result analysis, and provides a window for inspecting and tracing previous safety assessment analyses and parameter values. QUARK also provides a default database for safety assessment staff who construct input data files using SAGE (Safety Assessment Groundwater Evaluation), a safety assessment computer code.