• Title/Summary/Keyword: Data Quality Control

Development of Smart City IoT Data Quality Indicators and Prioritization Focusing on Structured Sensing Data (스마트시티 IoT 품질 지표 개발 및 우선순위 도출)

  • Yang, Hyun-Mo; Han, Kyu-Bo; Lee, Jung Hoon
    • The Journal of Bigdata / v.6 no.1 / pp.161-178 / 2021
  • The importance of 'Big Data' is increasing to the point that it is likened to the 'crude oil of the 21st century'. For smart city IoT data, attention should be paid to quality control, because the quality of the data is tied to the quality of public services. However, the data quality indicators published by ISO/IEC and by domestic and foreign organizations are limited to the user's perspective. To complement this limitation, the study derives supplier-centric indicators and their priorities. After deriving 3 categories and 13 indicators for supplier-oriented smart city IoT data quality evaluation, the priorities of the indicator categories and of the individual quality indicators were derived through AHP analysis, and the feasibility of each indicator was examined. The study can contribute to improving sensor data quality by presenting, to the individuals or companies performing the work, the basic requirements the data should satisfy. Furthermore, quality control can be performed according to the indicator priorities, improving the efficiency of quality control tasks.
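
The priority derivation above relies on AHP. As a rough illustration of how AHP converts pairwise comparisons into priority weights, the Python sketch below computes weights and a consistency ratio for a hypothetical 3-category comparison matrix; the matrix values are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three indicator categories
# (Saaty 1-9 scale); the values are illustrative only, not from the study.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority weights: normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights = weights / weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
ri = 0.58          # Saaty's random index for n = 3
cr = ci / ri       # CR < 0.1 is conventionally considered acceptable

print("category weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))
```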

Estimation of Qualities and Inference of Operating Conditions for Optimization of Wafer Fabrication Using Artificial Intelligent Methods

  • Bae, Hyeon; Kim, Sung-Shin; Woo, Kwang-Bang
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference / 2005.06a / pp.1101-1106 / 2005
  • The purpose of this study was to develop a process management system to manage ingot fabrication and the quality of the ingot, the material from which wafers are first produced. Operating data (trace parameters) were collected on-line, whereas quality data (measurement parameters) were obtained by sampling inspection. The quality parameters were used to evaluate the ingot quality, so preprocessing was necessary to extract useful information from the quality data. First, statistical methods were employed for data generation, and modeling was then carried out on the generated data to improve model performance. The function of the models is to predict the quality corresponding to the control parameters. A dynamic polynomial neural network (DPNN) was used to model the ingot fabrication data.
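
The DPNN model itself is not reproduced here; as a loose stand-in for predicting quality from operating parameters, the sketch below fits a second-order polynomial regression model on synthetic data. All variable names and values are illustrative assumptions, not the paper's data or model.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Synthetic stand-in for on-line operating (trace) parameters and a
# sampled quality measurement; values are illustrative only.
X = rng.normal(size=(200, 3))              # e.g. temperature, pull speed, pressure
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] ** 2 + 0.3 * X[:, 2] + 0.1 * rng.normal(size=200)

# Second-order polynomial model as a simple proxy for the DPNN idea:
# predict the quality value corresponding to the operating parameters.
model = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1.0))
model.fit(X[:150], y[:150])

print("held-out R^2:", round(model.score(X[150:], y[150:]), 3))
```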

Study on Correlation-based Feature Selection in an Automatic Quality Inspection System using Support Vector Machine (SVM) (SVM 기반 자동 품질검사 시스템에서 상관분석 기반 데이터 선정 연구)

  • Song, Donghwan; Oh, Yeong Gwang; Kim, Namhun
    • Journal of Korean Institute of Industrial Engineers / v.42 no.6 / pp.370-376 / 2016
  • Manufacturing data analysis and its applications are gaining popularity in various industries. Despite rapid advances in big data analysis technology, however, the manufacturing quality data monitored by automated inspection systems are sometimes not reliable enough because of the complex patterns of product quality. In this study, we therefore aim to define the level of trust of an automated quality inspection system and to improve the reliability of the quality inspection data. Using correlation analysis and feature selection, the paper presents a method for improving inspection accuracy and efficiency in an SVM-based automatic product quality inspection system that uses thermal image data from an auto-part manufacturing case. The proposed method is implemented in the sealer dispensing process of automobile manufacturing and verified by analyzing the optimal feature selection obtained from the quality analysis results.
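
As a hedged illustration of the general approach (not the paper's exact pipeline), the sketch below ranks features by absolute correlation with the label, keeps the top few, and trains an RBF SVM on them; the synthetic data stands in for features extracted from thermal images.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic stand-in for thermal-image features with a pass/fail label.
X, y = make_classification(n_samples=400, n_features=20, n_informative=6,
                           random_state=0)

# Correlation-based selection: keep the features most correlated with the label.
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
selected = np.argsort(corr)[::-1][:6]      # top-6 features by |correlation|

X_train, X_test, y_train, y_test = train_test_split(
    X[:, selected], y, test_size=0.25, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("selected features:", selected.tolist())
print("test accuracy:", round(clf.score(X_test, y_test), 3))
```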

Assessment and Validation of the Reliability of Quality Measurement System for Wireless Data Services (무선데이터 서비스 품질 측정 시스템의 신뢰성 검증 및 평가)

  • Choi, Dong-Hwan; Park, Seok-Cheon
    • Journal of Information Technology Services / v.11 no.1 / pp.239-246 / 2012
  • As mobile broadband services based on WiBro and on HSDPA, the cellular wireless internet technology, spread, the number of users is increasing and the issue of guaranteeing effective quality for users has emerged; nevertheless, quality control of wireless networks remains incomplete. Continuous performance measurement of wireless data services, together with verification and evaluation of the reliability of the measurement system, is therefore needed for quality control. In this paper, a wireless data service quality measurement system was designed and implemented. To verify and evaluate the reliability of the implemented measurement device, a test environment identical to that of an existing network performance measurement program was built and a comparative performance evaluation was carried out. Analysis of the measured quality values confirmed that the designed and implemented wireless data service quality measurement system operates accurately.
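
As a minimal sketch of what a single wireless data quality measurement might look like (not the system described in the paper), the code below estimates latency and download throughput for one HTTP request; the test URL is a placeholder assumption.

```python
import time
import urllib.request

# Hypothetical test endpoint; any reachable URL with a known payload works.
TEST_URL = "http://example.com/testfile.bin"

def measure_once(url: str) -> tuple[float, float]:
    """Return (latency_s, throughput_bps) for a single download."""
    t0 = time.monotonic()
    with urllib.request.urlopen(url, timeout=10) as resp:
        first_byte = resp.read(1)            # time to first byte ~ latency
        t_first = time.monotonic()
        rest = resp.read()                   # remainder of the payload
    t1 = time.monotonic()
    size_bits = (len(first_byte) + len(rest)) * 8
    return t_first - t0, size_bits / max(t1 - t0, 1e-9)

latency, throughput = measure_once(TEST_URL)
print(f"latency: {latency * 1000:.1f} ms, throughput: {throughput / 1e6:.2f} Mbit/s")
```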

Application and System Establishment on Quality Control of Flowing Concrete (유동화 콘크리트의 현장적용과 품질관리 시스템 구축)

  • 김규용; 길배수; 한장현; 주지현; 박선규; 한승구; 조성기; 김무한
    • Proceedings of the Korea Concrete Institute Conference / 1999.10a / pp.801-804 / 1999
  • Interest in the workability and quality control of concrete is increasing as a way to improve the quality of concrete structures in step with industrialization. The aim of this study is therefore to evaluate the quality of flowing concrete and to systematize its quality control by analyzing concrete quality control data through a basic quality control system, so as to improve pumped flowing concrete in the construction industry.

Quality Control and Assurance of Eddy Covariance Data at the Two KoFlux Sites (KoFlux 관측지에서 에디 공분산 자료의 품질관리 및 보증)

  • Kwon, Hyo-Jung; Park, Sung-Bin; Kang, Min-Seok; Yoo, Jae-Il; Yuan, Renmin; Kim, Joon
    • Korean Journal of Agricultural and Forest Meteorology / v.9 no.4 / pp.260-267 / 2007
  • This research note introduces the quality control and quality assurance procedures applied to the eddy covariance data collected at the two KoFlux sites (i.e., Gwangneung forest and Haenam farmland). The quality control was conducted in several steps based on micrometeorological theory and statistical tests. The data quality was determined at each step of the quality control procedure and denoted by five different quality flags. The programs used to perform the quality control and the quality-assessed data are available on the KoFlux website (http://www.koflux.org/).
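
The note's own tests and flag definitions should be consulted directly; the sketch below merely illustrates flag-based quality control on a flux-like series with a range check and a robust spike check. The thresholds and flag codes are invented for illustration.

```python
import numpy as np

def flag_flux_series(values, lower=-50.0, upper=50.0, spike_z=5.0):
    """Assign illustrative quality flags: 0 = good, 1 = out of range, 2 = spike."""
    x = np.asarray(values, dtype=float)
    flags = np.zeros(x.size, dtype=int)

    # Range check against plausible physical limits (illustrative limits).
    flags[(x < lower) | (x > upper)] = 1

    # Spike check: robust z-score based on the median absolute deviation.
    med = np.nanmedian(x)
    mad = np.nanmedian(np.abs(x - med)) + 1e-9
    z = 0.6745 * (x - med) / mad
    flags[(np.abs(z) > spike_z) & (flags == 0)] = 2
    return flags

series = np.array([1.2, 0.8, 1.1, 40.0, 1.0, 0.9, 120.0, 1.3])
print(flag_flux_series(series))   # flags the 40.0 spike and the out-of-range 120.0
```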

A Design of Control Chart for Fraction Nonconforming Using Fuzzy Data (퍼지 데이터를 이용한 불량률(p) 관리도의 설계)

  • 김계완; 서현수; 윤덕균
    • Journal of Korean Society for Quality Management / v.32 no.2 / pp.191-200 / 2004
  • The p chart is not adequate when there are large amounts of data and it is difficult to classify products as conforming or nonconforming because the binary classification is ambiguous. A new control chart that represents such ambiguous situations efficiently is therefore needed. This study applies fuzzy set theory to perform arithmetic operations that represent fuzzy data as fuzzy sets, and designs a new control chart that takes into account classification over a term set and the membership functions associated with it.
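
This is not the paper's exact construction, but as a minimal sketch of the idea: each inspected unit carries a membership degree of nonconformity in [0, 1] instead of a crisp 0/1 label, the per-sample fraction nonconforming is the mean membership, and conventional 3-sigma p-chart limits are computed from the pooled estimate. All numbers are illustrative.

```python
import numpy as np

# Each row: membership degrees of "nonconforming" for the units in one sample
# (values in [0, 1] instead of a crisp 0/1 classification); illustrative only.
samples = [
    np.array([0.0, 0.2, 0.1, 0.0, 0.9, 0.0, 0.3, 0.0, 0.0, 0.1]),
    np.array([0.1, 0.0, 0.0, 0.4, 0.0, 0.8, 0.0, 0.2, 0.0, 0.0]),
    np.array([0.0, 0.0, 0.6, 0.0, 0.1, 0.0, 0.0, 0.0, 0.3, 0.0]),
]

n = len(samples[0])
p_i = np.array([s.mean() for s in samples])   # fuzzy fraction nonconforming per sample
p_bar = p_i.mean()                            # pooled estimate

# Conventional 3-sigma p-chart limits around the pooled fuzzy fraction.
sigma = np.sqrt(p_bar * (1 - p_bar) / n)
ucl, lcl = p_bar + 3 * sigma, max(p_bar - 3 * sigma, 0.0)

print("sample fractions:", np.round(p_i, 3))
print(f"center {p_bar:.3f}, LCL {lcl:.3f}, UCL {ucl:.3f}")
for i, p in enumerate(p_i):
    if p > ucl or p < lcl:
        print(f"sample {i} out of control")
```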

Quality Analysis for the Data Distribution Service of the Real-time Integrated Railway Safety Monitoring and Control System (실시간 철도안전 통합 감시제어시스템의 데이터 분산 서비스 품질 적합성 분석)

  • Kim, Sang Ahm; Kim, Seon Woo
    • Journal of The Korean Society For Urban Railway / v.6 no.4 / pp.351-361 / 2018
  • In this paper, we experimentally analyse the DDS (Data Distribution Service) QoS (Quality of Service) policies of the Object Management Group DDS standard, which control network transmission quality, to determine whether they can satisfy the data transmission quality requirements of the Real-time Integrated Railway Safety Monitoring and Control System (RIRSMCS). The RIRSMCS collects and analyzes data from various sensors in the railway field to predict and prevent the risk of potential railway accidents, so the collected data must be transmitted in real time, accurately, and stably. Experimental results show that real-time, accurate, and stable data transmission for railway safety monitoring and control can be ensured using DDS QoS.
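
Without tying the example to any particular DDS implementation, the sketch below illustrates two of the transmission-quality concerns the paper evaluates, deadline compliance and sample loss, as simple checks over a received sample stream; the Sample type, the deadline threshold, and the data are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    seq: int          # publisher sequence number
    t_recv: float     # receive timestamp in seconds

def check_transmission_quality(samples, deadline_s=0.1):
    """Illustrative check of two DDS-style QoS concerns on received samples:
    DEADLINE (maximum gap between consecutive samples) and sample loss."""
    missed_deadlines = 0
    lost = 0
    for prev, cur in zip(samples, samples[1:]):
        if cur.t_recv - prev.t_recv > deadline_s:
            missed_deadlines += 1
        lost += max(cur.seq - prev.seq - 1, 0)   # gaps in sequence numbers
    return missed_deadlines, lost

# Hypothetical received stream: one late sample and one lost sample.
stream = [Sample(0, 0.00), Sample(1, 0.05), Sample(3, 0.30), Sample(4, 0.35)]
print(check_transmission_quality(stream))   # -> (1, 1)
```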

Statistical Analysis of Quality Control Data of Blood Components (혈액성분제제 품질관리 자료의 통계학적인 비교)

  • Kim, Chongahm; Seo, Dong Hee; Kwon, So Yong; Oh, Yuong Chul; Lim, Chae Seung; Jang, Choong Hoon; Kim, Soonduck
    • Korean Journal of Clinical Laboratory Science / v.36 no.1 / pp.19-26 / 2004
  • As domestic use of blood components increases, quality control of blood components is necessary to supply good products. The purpose of this study is to provide an index for producing good products by comparing the accuracy and validity of the distributions of quality control data. The mean, standard deviation, 95% confidence interval, and degree of normality of the data were calculated with a univariate procedure, and the monthly means of each blood center for each item were compared using analysis of variance (ANOVA). When the means differed, Duncan's multiple range test was used to confirm the difference. Finally, the accuracy and validity of the quality data were assessed with a contingency table test. The quality data of the five blood centers were normally distributed and within an acceptable range. For each blood center, the monthly means of hematocrit (Hct), platelet count (PLT), and pH were not significantly different, except for the Hct of center C, the PLT of centers B and D, and the pH of center A. The quality data for each item were graded into six quality levels. In the comparative analysis, the monthly mean Hct of centers C and E was significantly higher than that of centers D, B, and A, and the monthly mean PLT of center A and pH of center C were significantly higher than those of the other centers. In terms of the accuracy and validity of the quality control data, center C for Hct, center A for PLT, and center C for pH were better than the others. Blood center C was the most satisfactory and stable in the quality control of blood components. If the quality control method used at center C is adopted by the other centers, the quality of the blood components they prepare is expected to improve.
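
As a small illustration of the statistical comparison described above, the sketch below runs a one-way ANOVA on hypothetical monthly Hct means for three centers using scipy's f_oneway; the values are synthetic, and Duncan's multiple range test used in the paper is only mentioned, not implemented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical monthly Hct means (%) for three blood centers; illustrative only.
center_a = rng.normal(58.0, 1.0, size=12)
center_b = rng.normal(58.5, 1.0, size=12)
center_c = rng.normal(60.0, 1.0, size=12)

# One-way ANOVA on the monthly means.
f_stat, p_value = stats.f_oneway(center_a, center_b, center_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# If p < 0.05 the centers differ; a post-hoc multiple-comparison test
# (the paper uses Duncan's multiple range test) then locates the differences.
if p_value < 0.05:
    print("at least one center's monthly mean differs significantly")
```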

Development and Assessment of Real-Time Quality Control Algorithm for PM10 Data Observed by Continuous Ambient Particulate Monitor (부유분진측정기(PM10) 관측 자료 실시간 품질관리 알고리즘 개발 및 평가)

  • Kim, Sunyoung; Lee, Hee Choon; Ryoo, Sang-Boom
    • Atmosphere / v.26 no.4 / pp.541-551 / 2016
  • A real-time quality control algorithm for PM10 concentrations measured by a Continuous Ambient Particulate Monitor (FH62C14, Thermo Fisher Scientific Inc.) has been developed. The quality control algorithm for PM10 data consists of five main procedures. The first step is a valid-value check: values must lie within the acceptable range, and values outside the instrument's detectable limits (upper 5,000 μg m⁻³, lower 0 μg m⁻³) are eliminated as unrealistic. The second step is an error-code check: whenever an unusual condition occurs, the instrument records an error code, and any value carrying an error code is eliminated. The third step is a persistence check, which requires a minimum variability of the data over a given period; if the PM10 data have not varied over the past 60 minutes by more than the specified limit (0 μg m⁻³), the current 5-minute value fails the check. The fourth step is a time-continuity check, used to eliminate gross outliers. The last step is a spike check, in which spikes are detected from the double-difference time series using the median. Flags indicating normal or abnormal data are added to the raw data after the quality control procedure. The algorithm is applied to PM10 data for Asian dust and non-Asian dust cases at the Seoul site and to a dataset for the period 2013~2014 from 26 sites in Korea.
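
The five steps map naturally onto a small flagging routine. The Python sketch below is an illustrative reconstruction, not the operational code: the 0~5,000 μg m⁻³ range and the 60-minute persistence window come from the abstract, while the continuity and spike thresholds, the flag values, and the function name are assumptions.

```python
import numpy as np

def qc_pm10(values, error_codes, lower=0.0, upper=5000.0,
            persist_window=12, continuity_jump=500.0, spike_limit=300.0):
    """Illustrative version of the five QC steps described above for 5-minute
    PM10 data. The range and 60-minute persistence limits follow the abstract;
    the continuity and spike thresholds are invented for illustration."""
    x = np.asarray(values, dtype=float)
    flags = np.zeros(x.size, dtype=int)              # 0 = normal, 1 = abnormal

    # 1. Valid-value check: values outside the detectable range are unrealistic.
    flags[(x < lower) | (x > upper)] = 1

    # 2. Valid-error check: samples carrying an instrument error code fail.
    flags[np.asarray(error_codes) != 0] = 1

    # 3. Persistence check: no variability over the past 60 min (12 x 5 min).
    for i in range(persist_window, x.size):
        if np.ptp(x[i - persist_window:i + 1]) <= 0.0:
            flags[i] = 1

    # 4. Time-continuity check: eliminate gross jumps between consecutive values.
    jump = np.abs(np.diff(x, prepend=x[0]))
    flags[jump > continuity_jump] = 1

    # 5. Spike check: double-difference series compared with its median.
    dd = x[1:-1] - 0.5 * (x[:-2] + x[2:])
    spike = np.abs(dd - np.median(dd)) > spike_limit
    flags[1:-1][spike] = 1
    return flags

codes = np.zeros(24, dtype=int)
data = np.r_[np.full(14, 50.0), [60, 70, 80, 90, 100, 110, 700, 120, 130, 140]]
print(qc_pm10(data, codes))   # marks the constant stretch and the 700 spike as abnormal
```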