Title/Summary/Keyword: Data Quality Framework

537 search results

Data Server Oriented Computing Infrastructure for Process Integration and Multidisciplinary Design Optimization (다분야통합최적설계를 위한 데이터 서버 중심의 컴퓨팅 기반구조)

  • 홍은지;이세정;이재호;김승민
    • Korean Journal of Computational Design and Engineering
    • /
    • v.8 no.4
    • /
    • pp.231-242
    • /
    • 2003
  • Multidisciplinary Design Optimization (MDO) is an optimization technique that simultaneously considers multiple disciplines such as dynamics, mechanics, structural analysis, thermal and fluid analysis, and electromagnetic analysis. A software system enabling multidisciplinary design optimization is called an MDO framework. An MDO framework provides an integrated and automated design environment that increases product quality and reliability and decreases design cycle time and cost. The MDO framework also serves as a common collaborative workspace for design experts across multiple disciplines. In this paper, we present an architecture for an MDO framework along with a requirement analysis for the framework. The requirement analysis was performed through interviews with design experts in industry, and we therefore claim that it reflects real industrial needs. The requirements include an integrated design environment, a friendly user interface, a highly extensible open architecture, a distributed design environment, an application program interface, and efficient data management to handle massive design data. The resulting MDO framework is data-server oriented: it is designed around a centralized data server for extensible and effective data exchange among multiple design tools and software in a distributed design environment.

Considerations for generating meaningful HRA data: Lessons learned from HuREX data collection

  • Kim, Yochan
    • Nuclear Engineering and Technology
    • /
    • v.52 no.8
    • /
    • pp.1697-1705
    • /
    • 2020
  • To enhance the credibility of human reliability analysis, various kinds of data have been recently collected and analyzed. Although it is obvious that the quality of data is critical, the practices or considerations for securing data quality have not been sufficiently discussed. In this work, based on the experience of the recent human reliability data extraction projects, which produced more than fifty thousand data-points, we derive a number of issues to be considered for generating meaningful data. As a result, thirteen considerations are presented here as pertaining to the four different data extraction activities: preparation, collection, analysis, and application. Although the lessons were acquired from a single kind of data collection framework, it is believed that these results will guide researchers to consider important issues in the process of extracting data.

A Master Data Management Framework for Medium-Sized Companies (중견기업을 위한 마스터 데이터 관리 프레임웍)

  • Park, Kwang-Ho
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.31 no.4
    • /
    • pp.66-76
    • /
    • 2008
  • In medium-sized enterprises that comprise multiple business branches and companies, various types of information systems are constructed and operated. One difficult problem these enterprises face is that integrated information cannot be delivered because of loosely managed master data. This paper proposes an effective master data management framework for such enterprises. The framework is designed after comparing a tightly controlled centralization model with a coordinated centralization model. Through a case study of a customer master data integration project, the practicality of the framework is explored.

Understanding of the Overview of Quality 4.0 Using Text Mining (텍스트마이닝을 활용한 품질 4.0 연구동향 분석)

  • Kim, Minjun
    • Journal of Korean Society for Quality Management
    • /
    • v.51 no.3
    • /
    • pp.403-418
    • /
    • 2023
  • Purpose: The acceleration of technological innovation, specifically Industry 4.0, has triggered the emergence of a quality management paradigm known as Quality 4.0. This study aims to provide a systematic overview of dispersed studies on Quality 4.0 across various disciplines and to stimulate further academic discussions and industrial transformations. Methods: Text mining and machine learning approaches are applied to learn and identify key research topics, and the suggested key references are manually reviewed to develop a state-of-the-art overview of Quality 4.0. Results: 1) A total of 27 key research topics were identified based on the analysis of 1234 research papers related to Quality 4.0. 2) Relationships among the 27 key research topics were identified. 3) A multilevel framework consisting of the technological enablers, business methods and strategies, goals, and application industries of Quality 4.0 was developed. 4) The trends of the key research topics were analyzed. Conclusion: The identification of 27 key research topics and the development of the Quality 4.0 framework contribute to a better understanding of Quality 4.0. This research lays the groundwork for future academic and industrial advancements in the field and encourages further discussions and transformations within the industry.

A Framework for Supporting RFID-enabled Business Processes Automation

  • Moon, Mi-Kyeing
    • Journal of information and communication convergence engineering
    • /
    • v.9 no.6
    • /
    • pp.712-720
    • /
    • 2011
  • Radio frequency identification (RFID) is an established technology and has the potential, in a variety of applications, to significantly reduce cost and improve performance. As RFID-enabled applications fulfill similar tasks across a range of processes adapted to use the data gained from RFID tags, they can be considered software products derived from a common infrastructure and assets that capture specific abstractions in the domain. This paper discusses a framework that supports the development of RFID-enabled applications based on a business process family model (BPFM), explicitly representing both commonalities and variabilities. To develop this framework, common activities are identified from RFID-enabled applications, and the variabilities in the common activities are analyzed in detail using variation point concepts. Through this framework, RFID data is preprocessed, and thus RFID-enabled applications can be developed without having to process RFID data themselves. Sharing a common model and reusing assets to deploy recurrent services may be considered an advantage in terms of economic significance and overall product quality.

A Framework for Quality Evaluation of Geospatial Data (Geospatial Data의 품질평가를 위한 Framework)

  • Cho, Gi-Sung
    • Journal of Korean Society for Geospatial Information Science
    • /
    • v.4 no.2 s.8
    • /
    • pp.123-136
    • /
    • 1996
  • Recently, the demand for data standardization has increased with the development of information technology and the diversification of society, so that various data can be shared and used jointly. This standardization requires research on the definition and evaluation of data quality, which indicates the accuracy and reliability of geospatial data. In this study, by comparing the definitions and evaluation methods of data quality elements adopted in representative countries, the following results were obtained: (1) For the definitions of data quality elements, it is desirable to apply the ISO/TC211 draft, which includes an accepted evaluation standard, to KSDTS (Korea Spatial Data Transfer Standard). (2) This study presented a more reasonable quality evaluation of geospatial data together with its quality elements, and suggests that this evaluation be applied to KSDTS and included in the digital map product specification of the National Geography Institute, with a clearer report form for data quality evaluation results. (3) Further studies are required on various sampling methods, on establishing an AQL (Acceptable Quality Level) suitable for Korea, and on computer programs that can rapidly and automatically evaluate large volumes of data.

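The AQL-based evaluation mentioned in the abstract amounts to acceptance sampling over a lot of spatial features. A toy sketch of a single-sampling plan follows; the lot contents, sample size, acceptance number, and the positional-error quality check are illustrative assumptions, not values from the study:

```python
# Toy sketch of AQL-style acceptance sampling: draw a random sample from a
# lot of spatial features, count those failing a quality check, and accept
# the lot if the defect count stays at or below the acceptance number.
# All concrete numbers here are illustrative assumptions.
import random

def accept_lot(lot, sample_size, acceptance_number, is_defective, seed=0):
    """Single-sampling plan: accept if defects in the sample <= acceptance number."""
    rng = random.Random(seed)
    sample = rng.sample(lot, sample_size)
    defects = sum(1 for item in sample if is_defective(item))
    return defects <= acceptance_number

# Lot of 1000 "features"; a positional error over 0.5 m counts as defective.
lot = [{"positional_error_m": 0.1 * (i % 7)} for i in range(1000)]
ok = accept_lot(lot, sample_size=80, acceptance_number=5,
                is_defective=lambda f: f["positional_error_m"] > 0.5)
print("lot accepted:", ok)
```

A real AQL plan would pick `sample_size` and `acceptance_number` from a sampling-plan table for the chosen quality level, rather than fixing them by hand as above.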

A Study on the Development of a Quality-Driven CIM System (part l: Framework) (품질 지향적 CIM시스템 개발에 관한 연구 (제1부:Freamwork))

  • Kang, Mujin
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.13 no.12
    • /
    • pp.63-69
    • /
    • 1996
  • As the significance of quality in the sense of customer satisfaction grows, quality management has become one of the main interests in manufacturing systems research. This paper presents the concept of a quality-driven CIM (Computer Integrated Manufacturing) system composed of a business process domain and a quality domain. In the business process domain, business functions are integrated by conventional design and manufacturing databases on the one hand, and an integrated quality system is interlinked to them via several quality modules on the other. A quality information model connects the business process domain with the quality domain, where various types of quality data are stored in a quality database. This framework helps a manufacturing enterprise implement a quality-driven CIM system to achieve its final objective: "customer satisfaction".


A Framework for Implementing Information Systems Integration to Optimize Organizational Performance

  • Ali Sirageldeen Ahmed
    • International Journal of Computer Science & Network Security
    • /
    • v.23 no.10
    • /
    • pp.11-20
    • /
    • 2023
  • The primary aim of this study is to investigate the influence of Service Provider Quality (SPQ), System Quality (SQ), Information Quality (IQ), and Training Quality (TQ) on the interconnected aspect of organizational performance known as growth and development (GD). The study examined the influence of information systems (IS) on organizational performance and provided a theory-based technique for conducting the research. The theoretical foundation for this study is derived from the IS success model [1], which is widely employed in information systems research. The study's framework incorporates several novel elements, drawn from a comprehensive review of both recent and earlier literature, which researchers have used to evaluate the dimensions of the model in [1]. In this study, we collected data from a diverse group of 348 individuals representing various industries through a web-based questionnaire. The collected data were analyzed using SPSS. We conducted a multiple regression analysis involving 15 factors to assess several hypotheses regarding the relationship between the independent construct, IS effectiveness, and the dependent construct, organizational performance. Several noteworthy descriptive statistics emerged, which hold significance for management. The study's findings strongly indicate that information systems exert a significant and beneficial influence on organizational performance. To sustain and continually enhance organizational effectiveness, the study recommends that managers periodically scrutinize and assess their information systems.
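The regression step described in the abstract can be sketched in miniature. The study ran its analysis in SPSS; the version below uses ordinary least squares on synthetic data, with made-up coefficients and Likert-style scores, purely to illustrate regressing a GD outcome on the four quality constructs:

```python
# Minimal sketch of the kind of multiple regression the study ran in SPSS:
# regressing growth-and-development (GD) scores on the four quality
# constructs (SPQ, SQ, IQ, TQ). The data are synthetic and the "true"
# coefficients are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 348  # matches the study's sample size
# Synthetic predictor scores (e.g., 5-point Likert means per respondent)
X = rng.uniform(1, 5, size=(n, 4))          # columns: SPQ, SQ, IQ, TQ
true_beta = np.array([0.4, 0.3, 0.2, 0.1])  # assumed effect sizes
gd = 1.0 + X @ true_beta + rng.normal(0, 0.2, size=n)

# Ordinary least squares with an intercept term
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, gd, rcond=None)

for name, b in zip(["intercept", "SPQ", "SQ", "IQ", "TQ"], coef):
    print(f"{name}: {b:.3f}")
```

With real survey data, one would additionally inspect p-values, R², and multicollinearity diagnostics, which SPSS reports alongside the coefficients.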

A Performance Evaluation Framework for e-Clinical Data Management (임상시험 전자자료 관리를 위한 평가 프레임웍)

  • Lee, Hyun-Ju
    • Journal of Internet Computing and Services
    • /
    • v.13 no.1
    • /
    • pp.45-55
    • /
    • 2012
  • Electronic data management is becoming important for reducing the overall cost and run-time of clinical data management while enhancing data quality. It is also critical for meeting regulated guidelines on the overall quality and safety of electronic clinical trials. The purpose of this paper is to develop a performance evaluation framework for electronic clinical data management. Four key metrics are proposed in the areas of infrastructure, intellectual preparation, study implementation, and study completion, covering the major aspects of clinical trial processes. The performance measures evaluate the extent of regulatory compliance, data quality, cost, and efficiency of the electronic data management process, and provide measurement indicators for each evaluation item. Based on the key metrics, the performance evaluation framework is developed for the three major areas involved in clinical data management: the clinical site, monitoring, and the data coordinating center. As this is an initial attempt, based on a Delphi survey, to evaluate the extent of electronic data management in clinical trials, further empirical studies are planned and recommended.

Development of Framework for Digital Map Time Series Analysis of Earthwork Sites (토공현장 디지털맵 시계열 변화분석 프레임워크 기술개발)

  • Kim, Yong-Gun;Park, Su-Yeul;Kim, Seok
    • Journal of KIBIM
    • /
    • v.13 no.1
    • /
    • pp.22-32
    • /
    • 2023
  • With the increased use of digital maps in the construction industry, there is a growing demand for high-quality digital map analysis. Given the large amounts of data found in digital maps of earthwork sites, there is a particular need to enhance the accuracy and speed of digital map analysis. To address this, our study develops new technologies, and verifies their performance, for the non-ground and range-mismatch issues that commonly arise. Our study also presents a new digital map analysis framework for earthwork sites that utilizes three newly developed technologies to improve analysis performance. With these technologies, the framework achieved about a 95% improvement in analysis performance compared to the existing framework. This study is expected to contribute to improving the quality of digital map analysis data for earthworks.