• Title/Summary/Keyword: data management

A Study on Data Resource Management Comparing Big Data Environments with Traditional Environments (전통적 환경과 빅데이터 환경의 데이터 자원 관리 비교 연구)

  • Park, Jooseok;Kim, Inhyun
    • The Journal of Bigdata / v.1 no.2 / pp.91-102 / 2016
  • In traditional environments, the data life cycle has been called DIKW, which stands for data-information-knowledge-wisdom. In big data environments, on the other hand, it is called DIA, which stands for data-insight-action. The difference between the two data life cycles results in a new architecture for data resource management. In this paper, we study a data resource management architecture for big data environments; in particular, the main components of that architecture are proposed.
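The contrast drawn in this abstract can be pictured as two stage pipelines. The minimal Python sketch below takes only the stage names from the abstract (DIKW versus DIA); representing the life cycles as ordered pipelines is an illustrative assumption, not the architecture proposed in the paper.

```python
# Illustrative only: the stage names come from the abstract (DIKW vs. DIA);
# modeling each life cycle as an ordered pipeline is an assumption for clarity.
from dataclasses import dataclass

@dataclass(frozen=True)
class DataLifeCycle:
    name: str
    stages: tuple  # ordered stages the data passes through

DIKW = DataLifeCycle("traditional", ("data", "information", "knowledge", "wisdom"))
DIA  = DataLifeCycle("big data",    ("data", "insight", "action"))

def describe(cycle: DataLifeCycle) -> str:
    """Render a life cycle as an arrow chain, e.g. 'data -> insight -> action'."""
    return f"{cycle.name}: " + " -> ".join(cycle.stages)

if __name__ == "__main__":
    print(describe(DIKW))  # traditional: data -> information -> knowledge -> wisdom
    print(describe(DIA))   # big data: data -> insight -> action
```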

Selection Criteria of Target Systems for Quality Management of National Defense Data (국방데이터 품질관리를 위한 대상 체계 선정 기준)

  • Jiseong Son;Yun-Young Hwang
    • Journal of Internet Computing and Services / v.24 no.6 / pp.155-160 / 2023
  • In principle, the data in all databases and systems managed by the Ministry of Defense or public institutions must be guaranteed to meet a certain level of quality or higher, but since most information systems are already built and operating, quality management for every system is realistically limited. Most defense data is not disclosed due to the nature of the work, and many systems are strategically developed, or integrated and managed by the military, depending on the necessity and importance of the work. In addition, many types of data that require quality management are being generated and accumulated, such as sensor data produced by weapon systems, unstructured data, and artificial intelligence training data. However, there is no data quality management guide for defense data, nor a guide for selecting quality control targets, and the criteria for selecting databases and systems for defense data quality control according to the public data quality management manual are ambiguous, so in practice the selection depends on the person in charge. Therefore, this paper proposes criteria for selecting target systems for defense data quality control, and describes the relationship between the proposed selection criteria and the selection criteria in the existing manual.
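The abstract notes that, without a dedicated guide, the choice of which defense systems to place under quality management depends on the person in charge. The sketch below shows, under purely hypothetical criteria, weights, and threshold, how explicit selection criteria could make that choice reproducible; it does not reproduce the criteria actually proposed in the paper.

```python
# Hypothetical illustration of criteria-based selection of quality-management
# target systems. Criteria names, weights, and the threshold are assumptions;
# the paper's actual selection criteria are not reproduced here.
CRITERIA_WEIGHTS = {
    "data_volume": 0.3,         # how much data the system accumulates
    "mission_importance": 0.4,  # operational importance of the system
    "data_reuse": 0.3,          # how widely its data is shared or reused
}

def score(system: dict) -> float:
    """Weighted sum of per-criterion scores (each assumed to be on a 0-5 scale)."""
    return sum(CRITERIA_WEIGHTS[c] * system[c] for c in CRITERIA_WEIGHTS)

def select_targets(systems: list[dict], threshold: float = 3.0) -> list[str]:
    """Return names of systems whose weighted score meets the threshold."""
    return [s["name"] for s in systems if score(s) >= threshold]

if __name__ == "__main__":
    candidates = [
        {"name": "sensor-archive",  "data_volume": 5, "mission_importance": 4, "data_reuse": 3},
        {"name": "office-intranet", "data_volume": 2, "mission_importance": 2, "data_reuse": 1},
    ]
    print(select_targets(candidates))  # ['sensor-archive']
```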

A study on data management policy direction for disaster safety management governance (재난안전관리 거버넌스 구축을 위한 데이터관리정책 방향에 관한 소고)

  • Kim, Young Mi
    • Journal of Digital Convergence / v.17 no.12 / pp.83-90 / 2019
  • In addition to the proliferation of intelligent information technology, the field of disaster management is being approached from a multifaceted perspective. In particular, as interest in establishing a data-based disaster safety management system increases, there is a growing need for systematic management of the large amounts of big data generated and distributed in real time. Furthermore, efforts are being made to improve data quality in order to strengthen the preventive effect of disaster data analysis, to build a system that can respond effectively, and to predict the overall situation caused by disasters. Disaster management requires both precautionary measures and quick responses in the event of a disaster, as well as a technical approach to establishing governance and safety. This study explores the policy implications of the significance and structure of data-driven disaster safety management governance.

A Method for Engineering Change Analysis by Using OLAP (OLAP를 이용한 설계변경 분석 방법에 관한 연구)

  • Do, Namchul
    • Korean Journal of Computational Design and Engineering / v.19 no.2 / pp.103-110 / 2014
  • Engineering changes are indispensable engineering and management activities for manufacturers to develop competitive products and to maintain the consistency of their product data. Analysis of engineering changes provides a core functionality to support decision making in engineering change management. This study aims to develop a method for the analysis of engineering changes based on On-Line Analytical Processing (OLAP), a proven database analysis technology that has been applied to various business areas. The approach automates data processing for engineering change analysis from product databases that follow an international standard for product data management (PDM), and enables analysts to examine various aspects of engineering changes with OLAP operations. The study consists of modeling a standard PDM database and a multidimensional data model for engineering change analysis, implementing the standard and multidimensional models with PDM and data cube systems, and applying the implemented data cube to core functions of engineering change management: the evaluation and propagation of engineering changes.
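To make the OLAP operations mentioned in the abstract concrete, the sketch below rolls up a handful of hypothetical engineering change records into a small data cube with pandas. The dimensions (product line, change reason, quarter) and the cost measure are assumptions for illustration; the paper's standard PDM schema is not reproduced.

```python
# A minimal OLAP-style roll-up over hypothetical engineering change (EC) records.
# Dimensions (product_line, change_reason, quarter) and the measure (cost) are
# assumed for illustration; the paper's PDM-based schema is not reproduced.
import pandas as pd

ec_records = pd.DataFrame({
    "product_line":  ["A", "A", "B", "B", "B"],
    "change_reason": ["design error", "customer request", "design error",
                      "design error", "customer request"],
    "quarter":       ["2024Q1", "2024Q1", "2024Q1", "2024Q2", "2024Q2"],
    "cost":          [1200, 800, 450, 600, 300],
})

# Cube-like aggregation: total EC cost per (product_line, change_reason),
# with quarters as columns so analysts can slice along the time dimension.
cube = pd.pivot_table(
    ec_records,
    index=["product_line", "change_reason"],
    columns="quarter",
    values="cost",
    aggfunc="sum",
    fill_value=0,
    margins=True,  # adds 'All' roll-up rows and columns
)
print(cube)
```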

Development of an Integrated Management System for Data Survey of Cadastral Surveying (지적측량 자료조사 통합관리시스템 구축)

  • Choi, Jung Ju;Hong, Sung Eon;Park, Soo Hong
    • Journal of Korean Society for Geospatial Information Science / v.21 no.1 / pp.77-86 / 2013
  • This study presents a plan to develop an integrated management system for the survey of cadastral surveying data, so that cadastral survey data can be managed and researched efficiently. An analysis of the existing cadastral survey data, used to evaluate the efficiency of the developed integrated management system, showed that data survey time can be reduced by approximately half.

A study on the data quality management evaluation model (데이터 품질관리 평가 모델에 관한 연구)

  • Kim, Hyung-Sub
    • Journal of the Korea Convergence Society / v.11 no.7 / pp.217-222 / 2020
  • This study concerns a data quality management evaluation model. As information and communication technology advances and the importance of data storage and management grows, attention to data is increasing. In particular, interest in the Fourth Industrial Revolution and artificial intelligence has grown recently, and data is central to both. In the 21st century, data is likely to play the role of a new crude oil, so managing its quality is very important. However, while research is being conducted at a practical level, research at the academic level is insufficient. Therefore, this study examined the factors affecting data quality management through a survey of experts and suggested implications. The analysis showed differences in the importance assigned to data quality management factors.

An Empirical Analysis on the Effect of Data Quality on Economic Performance in the Financial Industry (금융산업에서의 데이터 품질이 경제적인 성과에 주는 영향의 실증분석)

  • Lee, Sang-Ho;Park, Joo-Seok;Kim, Jae-Kyeong
    • Information Systems Review / v.13 no.1 / pp.1-11 / 2011
  • This study empirically investigated the effect of firm-level data quality on economic performance in the Korean financial industry during 2008-2009. Data quality was measured by the data quality management process index and the data quality criteria of the Korea Database Agency, and financial firm performance data were acquired from the Financial Statistics Information System of the Financial Supervisory Service. The results showed that data quality has statistically significant impacts on financial firm performance measures such as sales, operating profit, and value added. If the data quality management process index increases by one, value added can increase by 2.3 percent; if the data quality criteria score increases by one, value added can increase by 72.6 percent.
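The percentage interpretations quoted in the abstract read like semi-elasticities from a log-linear specification, i.e. ln(value added) regressed on the quality indices. The sketch below only illustrates that reading; the coefficients are chosen to mirror the quoted 2.3% and 72.6% and are not the paper's estimates, and the log-linear form itself is an assumption.

```python
# Illustrative semi-elasticity interpretation for an assumed log-linear model
#   ln(value_added) = b0 + b1 * process_index + b2 * quality_criteria + ...
# The coefficient values below are chosen only to mirror the percentages quoted
# in the abstract (2.3% and 72.6%); they are not the paper's estimates.
import math

def pct_change_for_unit_increase(beta: float) -> float:
    """Exact percentage change in the outcome when the regressor rises by one unit."""
    return (math.exp(beta) - 1.0) * 100.0

b_process_index    = 0.0227  # exp(0.0227) - 1 is roughly 2.3%
b_quality_criteria = 0.5460  # exp(0.5460) - 1 is roughly 72.6%

print(f"process index +1    -> value added +{pct_change_for_unit_increase(b_process_index):.1f}%")
print(f"quality criteria +1 -> value added +{pct_change_for_unit_increase(b_quality_criteria):.1f}%")
```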

An Efficient Cloud Service Quality Performance Management Method Using a Time Series Framework (시계열 프레임워크를 이용한 효율적인 클라우드서비스 품질·성능 관리 방법)

  • Jung, Hyun Chul;Seo, Kwang-Kyu
    • Journal of the Semiconductor & Display Technology / v.20 no.2 / pp.121-125 / 2021
  • Cloud services must always be available and must respond immediately to user requests. This study suggests a method for constructing a proactive, autonomous quality and performance management system that meets these characteristics. To this end, we identify quantitative measurement factors for cloud service quality and performance management, define a structure for applying a time series framework to cloud service quality and performance management for proactive management, and then use big data and artificial intelligence for autonomous management. The flow of data processing and the configuration and flow of the big data and artificial intelligence platforms were defined to combine these intelligent technologies. In addition, the effectiveness of the approach was confirmed by applying it to a cloud service quality and performance management system in a case study. Using the presented methodology, service management systems that have so far been operated manually and retrospectively can be improved. However, because the approach requires collecting and processing various types of data, it also has the limitation that data standardization must come first in each technology and industry.
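As one concrete reading of the proactive time series management described above, the sketch below flags a hypothetical latency metric when it drifts outside a rolling statistical band. The metric, window size, and 3-sigma rule are illustrative assumptions, not the framework defined in the paper.

```python
# A toy proactive quality/performance check on a cloud service metric.
# The metric values, window, and 3-sigma rule are illustrative assumptions,
# not the time series framework proposed in the paper.
from statistics import mean, stdev

def flag_anomalies(samples: list[float], window: int = 12, k: float = 3.0) -> list[int]:
    """Return indices whose value deviates more than k standard deviations
    from the mean of the preceding `window` samples."""
    flagged = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(samples[i] - mu) > k * sigma:
            flagged.append(i)
    return flagged

if __name__ == "__main__":
    # Hypothetical per-minute response times (ms); a drift appears at the end.
    latency_ms = [102, 99, 101, 103, 98, 100, 102, 97, 101, 99, 100, 102,
                  101, 100, 99, 143]  # the last point jumps well outside the band
    print(flag_anomalies(latency_ms))  # -> [15]
```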

Development of Intelligent Database Program for PSI/ISI Data Management of Nuclear Power Plant (Part II) (원자력발전소 PSI/ISI 데이터 관리를 위한 지능형 데이터베이스 프로그램 개발 (제 2보))

  • Park, Un-Su;Park, Ik-Keun;Um, Byong-Guk;Lee, Jong-Po;Han, Chi-Hyun
    • Journal of the Korean Society for Nondestructive Testing / v.20 no.3 / pp.200-205 / 2000
  • In a previous paper, we discussed the intelligent Windows 95-based data management program (IDPIN), which was developed for effective and efficient management of the large amounts of pre-/in-service inspection (PSI/ISI) data of the Kori nuclear power plants. The IDPIN program enables prompt retrieval of previously conducted PSI/ISI conditions and results, so that the time-consuming data management and painstaking data processing and analysis of the past are avoided. In this study, an intelligent Windows-based data management program (WS-IDPIN) has been developed for effective management of PSI/ISI data for the Wolsong nuclear power plants. The WS-IDPIN program includes modules for comprehensive management and analysis of PSI/ISI results, statistical reliability assessment of PSI/ISI results (depth and length sizing performance, etc.), standardization of the UT report form, and computerization of UT results. In addition, the program can be further developed into a PSI/ISI data management expert system that can form part of a PSI/ISI total support system for Korean nuclear power plants.
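As a small illustration of the statistical reliability assessment mentioned in the abstract (depth and length sizing performance), the sketch below computes a root-mean-square depth-sizing error from paired measured and true flaw depths. The values are hypothetical, not data from the WS-IDPIN program, and any acceptance limit would come from the applicable inspection requirements, which are not shown here.

```python
# Illustrative depth-sizing performance check for ultrasonic testing (UT) results.
# The paired values are hypothetical; real PSI/ISI acceptance criteria are not shown.
import math

def rmse(measured: list[float], true: list[float]) -> float:
    """Root-mean-square error between measured and true flaw depths (mm)."""
    return math.sqrt(sum((m - t) ** 2 for m, t in zip(measured, true)) / len(true))

measured_depth_mm = [4.8, 7.1, 3.2, 9.6]
true_depth_mm     = [5.0, 6.5, 3.0, 10.2]

error = rmse(measured_depth_mm, true_depth_mm)
print(f"depth sizing RMSE = {error:.2f} mm")  # compare against the applicable acceptance limit
```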

The Process Reference Model for the Data Quality Management Process Assessment (데이터 품질관리 프로세스 평가를 위한 프로세스 참조모델)

  • Kim, Sunho;Lee, Changsoo
    • The Journal of Society for e-Business Studies / v.18 no.4 / pp.83-105 / 2013
  • There are two ways to assess data quality: measurement of the data itself and assessment of the data quality management process. Recently, maturity assessment of the data quality management process has been used to ensure and certify the data quality level of an organization. Following this trend, the paper presents the process reference model needed to assess data quality management process maturity. First, an overview of the assessment model for data quality management process maturity is presented. Second, a process reference model that can be used to assess process maturity is proposed. The structure of the process reference model and its detailed processes are developed based on a process derivation approach, the basic principles of data quality management, and the basic concept of the process reference model in SPICE. Furthermore, the characteristics of the proposed model are described in comparison with the ISO 8000-150 processes.
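A hedged sketch of the kind of structure such a process reference model feeds into is shown below: each data quality management process receives a capability rating, and an organization-level summary is derived from those ratings. The process names, the six-level SPICE-style scale, and the "weakest process caps the summary" rule are generic assumptions, not the paper's model or ISO 8000-150.

```python
# Generic SPICE-style capability rating sketch for data quality management
# processes. Process names, the 0-5 scale, and the summary rule are assumptions;
# they do not reproduce the paper's process reference model or ISO 8000-150.
CAPABILITY_LEVELS = {
    0: "Incomplete", 1: "Performed", 2: "Managed",
    3: "Established", 4: "Predictable", 5: "Optimizing",
}

def summarize(ratings: dict[str, int]) -> str:
    """Report each process level and the weakest one, which caps the overall summary."""
    lines = [f"{proc}: level {lvl} ({CAPABILITY_LEVELS[lvl]})" for proc, lvl in ratings.items()]
    floor = min(ratings.values())
    lines.append(f"overall capability capped at level {floor} ({CAPABILITY_LEVELS[floor]})")
    return "\n".join(lines)

if __name__ == "__main__":
    assessed = {  # hypothetical assessment outcome
        "data quality planning": 3,
        "data quality control": 2,
        "data quality assurance": 2,
    }
    print(summarize(assessed))
```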