• Title/Summary/Keyword: Data Quality Check


Standardization of Data Quality and Management Regulation for Korean CORS (국내 GNSS 상시관측소 데이터 품질 및 관리규정 표준화에 관한 연구)

  • Jin Sang, Hwang; Hyuk Gil, Kim; Hong Sik, Yun; Jae Myoung, Cho
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.33 no.4 / pp.245-258 / 2015
  • This study aimed to standardize the specifications that determine the proper construction and operation of domestic CORS (Continuously Operating Reference Stations). To that end, standardization was proposed for the major components of a CORS, such as data quality, structure, and equipment. We also studied a method for empirically determining reference values for the QC (quality check) indices of CORS data: large samples of each QC index were collected, and general and recommended reference values were derived by analyzing the sample distributions with empirical and statistical approaches. The results are expected to be used in standardization research, accurate data acquisition, and service operation for domestic CORS.
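
As a rough illustration of the empirical approach the abstract describes, the sketch below derives a recommended QC reference value as an upper percentile of a sample distribution of one QC index. The index name (MP1 multipath from TEQC) and the percentile are illustrative assumptions, not values from the study.

```python
# A minimal sketch, assuming thresholds are taken from an upper percentile of the
# empirical distribution of a QC index (index name and percentile are illustrative).
import numpy as np

def recommended_threshold(samples, percentile=95.0):
    """Derive a reference value for a QC index from a large sample of observations,
    e.g. daily MP1 multipath RMS values reported by TEQC."""
    x = np.asarray(samples, dtype=float)
    x = x[~np.isnan(x)]                       # drop missing values
    return np.percentile(x, percentile)       # empirical upper bound

# Hypothetical daily MP1 values (metres) pooled from many CORS sites
mp1_daily = [0.21, 0.35, 0.28, 0.44, 0.31, 0.52, 0.26]
print(f"recommended MP1 limit: {recommended_threshold(mp1_daily):.2f} m")
```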

Meteorological Data Integrity for Environmental Impact Assessment in Yongdam Catchment (용담댐시험유역 환경영향평가의 신뢰수준 향상을 위한 기상자료의 품질검정)

  • Lee, Khil-Ha
    • Journal of Environmental Science International / v.29 no.10 / pp.981-988 / 2020
  • This study assesses meteorological data integrity to improve environmental quality assessment in the Yongdam catchment. It examines both the extreme ranges and the reliability of the measurements, which include maximum and minimum temperature, relative humidity, dew point temperature, radiation, and heat flux. Some outliers and missing data were found in the measurements; in addition, the latent and sensible heat flux data were not physically reasonable, and the evapotranspiration data did not match at some points. The integrity checks secured the accuracy and consistency of the data stored in the study database. Users should take caution when using meteorological data from the Yongdam catchment for water resources planning, environmental impact assessment, and natural hazard analysis.
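
The range and missing-value screening the abstract describes can be pictured with the minimal sketch below; the variable names, physical limits, and input file are illustrative assumptions rather than the study's actual configuration.

```python
# A minimal sketch of range and missing-value screening; variable names, limits,
# and the input file are assumptions, not the study's configuration.
import pandas as pd

LIMITS = {                       # plausible physical ranges per variable
    "t_max_c": (-40.0, 50.0),
    "t_min_c": (-50.0, 40.0),
    "rh_pct": (0.0, 100.0),
    "radiation_wm2": (0.0, 1400.0),
}

def screen(df: pd.DataFrame) -> pd.DataFrame:
    """Flag missing values and values outside the assumed physical range."""
    report = pd.DataFrame(index=df.index)
    for col, (lo, hi) in LIMITS.items():
        report[f"{col}_missing"] = df[col].isna()
        report[f"{col}_out_of_range"] = df[col].notna() & ~df[col].between(lo, hi)
    return report

met = pd.read_csv("yongdam_met.csv")   # hypothetical station export
print(screen(met).sum())               # number of suspect records per check
```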

Development of Healthcare Data Quality Control Algorithm Using Interactive Decision Tree: Focusing on Hypertension in Diabetes Mellitus Patients (대화식 의사결정나무를 이용한 보건의료 데이터 질 관리 알고리즘 개발: 당뇨환자의 고혈압 동반을 중심으로)

  • Hwang, Kyu-Yeon; Lee, Eun-Sook; Kim, Go-Won; Hong, Seong-Ok; Park, Jung-Sun; Kwak, Mi-Sook; Lee, Ye-Jin; Lim, Chae-Hyeok; Park, Tae-Hyun; Park, Jong-Ho; Kang, Sung-Hong
    • The Korean Journal of Health Service Management / v.10 no.3 / pp.63-74 / 2016
  • Objectives: A data quality management algorithm is needed to improve the quality of healthcare data within a data quality management system. In this study, we developed a data quality control algorithm for hypertension as a comorbidity in patients with diabetes mellitus. Methods: To build the algorithm, we extracted the 2011 and 2012 discharge injury survey records of diabetes mellitus patients. Derived variables were created from the principal diagnosis, diagnostic unit, principal surgery and treatment, and minor surgery and treatment items. Results: Significant factors for hypertension in diabetes mellitus patients were sex, age, ischemic heart disease, and diagnostic ultrasound of the heart. Based on the decision tree results, we found four groups with extreme values among diabetes patients with accompanying hypertension. Conclusions: The actual records contained in the outlier (extreme value) groups need to be checked to improve the quality of the data.
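
A minimal sketch of the decision-tree idea is shown below, using scikit-learn rather than the interactive tool used in the paper; the column names, input file, and tree parameters are assumptions for illustration only.

```python
# A minimal sketch using scikit-learn; column names and parameters are illustrative.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

df = pd.read_csv("discharge_survey.csv")        # hypothetical extract of diabetes records
X = df[["sex", "age", "ischemic_heart_disease", "cardiac_ultrasound"]]
y = df["hypertension"]                          # 1 if hypertension is coded

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))

# Leaves whose observed hypertension rate is far from the overall rate mark the
# "extreme value" groups whose records should be re-checked against source charts.
```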

An Analysis of Emergency Care Based on Prehospital Care Reports (일부 구급대의 응급처치활동 분석 - 구급활동일지를 중심으로 -)

  • Uhm, Tai-Hwan
    • The Korean Journal of Emergency Medical Services / v.9 no.1 / pp.101-109 / 2005
  • The purpose of this study, a survey of 250 Prehospital Care Reports (PCRs) from squads of the Seoul Metropolitan Fire & Disaster Management Department, was to improve prehospital emergency care by means of quality management. The data were collected from 3 squads from Jun. 21 to Jul. 18, 2004 and analyzed with SPSS for Windows 12.0. The conclusions are summarized as follows. The mean event-to-treatment interval was 4.6 ± 4.3 minutes, and in 49.2% of cases the crew reached the patient within 4 minutes. The platinum minute was observed in 61.1% of patients with verbal response, 73.3% with painful response, and 77.8% of unresponsive patients. The great majority of patients could not receive advanced life support because of the limited scope of practice and the strict direct medical control in the Emergency Medical Services Act. Data from quality improvement activities will be useful for expanding indirect medical control, which can activate prehospital care. To be useful for quality improvement, a PCR has to contain data elements, run data, patient data, check boxes, and a narrative, including the US DOT's minimum data set.

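The interval statistics quoted above can be reproduced from PCR timestamps roughly as follows; the column names and input file are hypothetical.

```python
# A minimal sketch of the interval statistics; column names and file are hypothetical.
import pandas as pd

pcr = pd.read_csv("pcr_records.csv", parse_dates=["event_time", "treatment_time"])
interval_min = (pcr["treatment_time"] - pcr["event_time"]).dt.total_seconds() / 60.0

print(f"event-to-treatment interval: {interval_min.mean():.1f} ± {interval_min.std():.1f} min")
print(f"reached within 4 minutes:    {(interval_min <= 4).mean() * 100:.1f} %")
```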

A Study on Medical Laws and External Evaluation Criteria with Reference to the Essential Forms consisting Medical Records and to the Items for Each Medical Record (의료기관 종별 의무기록 중요서식 항목별 작성 실태 및 의무기록 완결점검표 분석)

  • Seo, Sun Won; Kim, Kwang Hwan; Hwang, Yong-Hwa; Kang, Sunny; Kang, Jin Kyung; Cho, Woo Hyun; Hong, Joon Hyun; Pu, Yoo Kyung; Rhee, Hyun Sill
    • Quality Improvement in Health Care / v.9 no.2 / pp.176-197 / 2002
  • Background: This study suggests a standardized format for clinical sheets and standardized items for each sheet. Standardizing the medical record will increase the faithfulness of its contents and contribute to building a sound health information system. Method: From Jan. 1, 2001 to March 31, 2001, we gathered as many paper clinical sheets as possible from every class of institution to review the faithfulness of their clinical contents. Clinical sheets from 9 tertiary care hospitals, 6 general hospitals, and 56 clinics were gathered and reviewed by two experienced medical record administrators. The review checked whether the items recommended by the hospital standardization review criteria and the hospital service evaluation organization appeared in the clinical sheets, and whether the contents of every item were written. Results: Tertiary care hospitals: administrative data were filled in well when the items were fixed. Clinical data such as the chief complaint (C.C.), history, and physical examination were filled in well, but when the items were not fixed, some were omitted; in general, more items were filled in when they were fixed. General hospitals: administrative data were filled in more than 50% of the time and the final diagnosis about 66.7%, but other clinical data were not filled in well and few clinically related items appeared in the sheets. From a legal point of view, the reason for visiting the hospital, the correct diagnosis, and the patient's condition at discharge could not be confirmed well. In surgical cases, the procedures could not be confirmed well because much surgery-related information (operating time, fluids and blood, sponge counts, biopsy, etc.) was omitted. Clinics: more than 70% of administrative data were filled in and fixed as items. Among the clinically related data, laboratory results were the most credible, but drug orders were given without the correct diagnosis and doctors' written signatures were missing in over 96.4% of cases, so the clinical sheets cannot serve as legal documents. Conclusion: Contents tended to be filled in well when the items were fixed in the documents. We also suggest a clinical check list to review the completeness and faithfulness of the clinical sheets. If many hospitals use the suggested check list and make the necessary items fixed in their clinical sheets, the quality of the medical record will increase dramatically.

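A check list of the kind the authors suggest could be reduced to a simple completeness score per record, as in the sketch below; the required item names are examples drawn from the abstract, not the authors' full list.

```python
# A minimal sketch of a completeness check; item names are examples from the abstract.
REQUIRED_ITEMS = [
    "chief_complaint", "history", "physical_examination",
    "final_diagnosis", "condition_at_discharge", "physician_signature",
]

def completeness(record: dict) -> float:
    """Return the share of required items that are actually filled in."""
    filled = [k for k in REQUIRED_ITEMS if record.get(k) not in (None, "", "N/A")]
    return len(filled) / len(REQUIRED_ITEMS)

record = {"chief_complaint": "chest pain", "final_diagnosis": "", "history": "HTN"}
print(f"completeness: {completeness(record):.0%}")   # 2 of 6 items filled
```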

The Model of Appraisal Method on Authentic Records (전자기록의 진본 평가 시스템 모형 연구)

  • Kim, Ik-Han
    • The Korean Journal of Archival Studies / no.14 / pp.91-117 / 2006
  • Electronic records need to be appraised for authenticity as well as for their value. There has been much discussion about how records should be appraised for value, but little about how electronic records should be appraised for authenticity. This article therefore models specific authenticity appraisal methods and shows at which stages those methods should or may be applied. At the ingest stage, three checks are essential: integrity verification right after records creation in the organization that produced the records, quality and integrity verification of the transferred records in the organization that received them, and an integrity check between the SIP and the AIP in the organization that receives and preserves the records. At the preservation stage, integrity checks between copies of the same AIP stored separately on different media, validation of whether records have been damaged, and recovery of damaged records are needed. At the various processing stages, suitability evaluation after changes to a record's management control metadata or classification, integrity checks after records migration, and periodic validation and integrity verification of DIPs are required. For these activities, appraisal methods including integrity verification, content consistency checks, suitability evaluation of record metadata, checks for unauthorized updates, and physical status validation should be applied to the electronic records management process.
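
The integrity verifications listed above typically rest on a fixity (checksum) comparison between stored copies. The sketch below shows the general idea, assuming SHA-256 checksums and hypothetical file paths; the article itself does not prescribe a specific algorithm.

```python
# A minimal sketch of a fixity check between SIP and AIP copies of a record;
# the hash algorithm and paths are illustrative assumptions.
import hashlib

def digest(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute a SHA-256 checksum of a stored record component."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the checksum registered with the SIP against the AIP copy in the archive.
sip_checksum = digest("transfer/record_0001.pdf")
aip_checksum = digest("archive/record_0001.pdf")
print("integrity OK" if sip_checksum == aip_checksum else "integrity FAILED")
```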

A Comparison of Patient-specific Delivery Quality Assurance (DQA) Devices in Radiation Therapy (방사선치료에서 환자맞춤형 선량품질보증 장치의 비교)

  • Kyung Hwan Chang
    • Journal of Radiological Science and Technology / v.46 no.3 / pp.231-238 / 2023
  • This study aimed to compare the results of delivery quality assurance (DQA) using MapCHECK and OCTAVIUS for radiation therapy. Thirty patients whose DQA results passed were retrospectively included. The point dose difference (DD) and gamma passing rate (GPR) were analyzed to evaluate the agreement between the measured and planned data for all cases. Plan complexity was evaluated to analyze dosimetric accuracy by quantifying the degree of modulation of each plan. We analyzed the monitor units (MUs) and total MUs for each plan to evaluate the correlation between MUs and plan complexity, and used a paired t-test to compare the DDs and GPRs obtained with the two devices. The DDs and GPRs were within the tolerance range for all cases. The average GPR difference between the two devices was statistically significant for brain and for head and neck cases at gamma criteria of 3%/3 mm and 2%/2 mm. There was no significant correlation between the modulation index and total MUs for any of the cases. These DQA devices can be used interchangeably for routine patient-specific QA in radiation therapy.
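
The two statistics compared in the study, the point dose difference and the gamma passing rate, and the paired t-test used to compare the devices, can be sketched as follows; the numbers below are placeholders, not measured data.

```python
# A minimal sketch of the comparison statistics; the values are placeholders.
import numpy as np
from scipy import stats

def dose_difference_pct(measured, planned):
    """Point dose difference (%) between measured and planned dose."""
    return 100.0 * (measured - planned) / planned

# Gamma passing rates (%) for the same plans measured with the two devices
gpr_mapcheck = np.array([98.2, 97.5, 99.1, 96.8, 98.9])
gpr_octavius = np.array([97.6, 96.9, 98.7, 97.0, 98.1])

t, p = stats.ttest_rel(gpr_mapcheck, gpr_octavius)   # paired t-test across plans
print(f"paired t-test: t = {t:.2f}, p = {p:.3f}")
print(f"example point DD: {dose_difference_pct(2.05, 2.00):.1f} %")
```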

EVALUATION OF DATA QUALITY OF PERMANENT GPS STATIONS IN SOUTH KOREA

  • Park, Kwan-Dong; Kim, Ki-Nam; Lim, Hyung-Chul; Park, Pil-Ho
    • Journal of Astronomy and Space Sciences / v.19 no.4 / pp.367-376 / 2002
  • As of September 2002, there are more than 60 operational permanent Global Positioning System (GPS) stations in South Korea. Their data are used for a variety of purposes: geodynamics, geodesy, real-time navigation, atmospheric science, and geography. In particular, many of the sites serve as reference stations for DGPS (Differential GPS). However, no comprehensive, qualitative analysis has been published to evaluate their data quality. In this study, we present preliminary results of our assessment of the permanent GPS sites in South Korea. We analyzed the multipath characteristics of each station using the quality-checking software package TEQC, and another multipath analysis tool based on post-fit phase residuals was used to check the repeating patterns and the amount of multipath at each site. The long-term stability of each station was analyzed using the root-mean-square (RMS) error of the estimated site positions over one year, which enabled us to evaluate the mount stability. In addition, the number of cycle slips at each site was derived with TEQC. Based on this series of tests, we compared the stability and data quality of permanent GPS stations in South Korea.
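
The long-term stability metric described above, the RMS scatter of estimated daily positions over a year, can be sketched as follows; the offsets shown are placeholders, and the companion multipath (MP1/MP2) and cycle-slip indices come from TEQC's own summary output rather than from this code.

```python
# A minimal sketch of the long-term stability metric; the offsets are placeholders.
import numpy as np

def rms_about_mean(series):
    """RMS of daily position estimates about their mean (one coordinate component)."""
    x = np.asarray(series, dtype=float)
    return np.sqrt(np.mean((x - x.mean()) ** 2))

north_mm = [1.2, -0.8, 0.5, 2.1, -1.4, 0.3]   # daily north offsets, millimetres
print(f"north-component RMS: {rms_about_mean(north_mm):.2f} mm")
```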

Error Resilient and Concealment Schemes for Still Image Transmission over DSRC System Channel (DSRC시스템 채널 환경에서 정지 영상 전송을 위한 에러 복구 및 은닉 기법)

  • 최은석;백중환
    • Proceedings of the IEEK Conference / 2001.06d / pp.13-16 / 2001
  • In the Dedicated Short Range Communication (DSRC) system channel, a large number of bit errors occur because of additive white Gaussian noise (AWGN) and fading. When image data are transmitted under such conditions, the reconstructed image quality is significantly degraded. In this paper, as an alternative to error correcting codes and/or automatic repeat request schemes, we propose an error recovery scheme for image data transmission. We first analyze how transmission errors in the DSRC system channel degrade image quality, and then, to improve image quality, we propose error resilience and concealment schemes for still image transmission using DCT-based fixed-length coding, a Hamming code, a cyclic redundancy check, and an interleaver. Finally, we demonstrate the performance of the schemes experimentally.

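Two of the building blocks named in the abstract, the cyclic redundancy check and the interleaver, can be sketched as below; the CRC variant (CRC-32) and the interleaver depth are illustrative assumptions, not the parameters used in the paper.

```python
# A minimal sketch of a CRC check and a block interleaver; CRC-32 and the
# interleaver depth are illustrative choices, not the paper's parameters.
import zlib

def add_crc(payload: bytes) -> bytes:
    """Append a CRC-32 so the receiver can detect corrupted blocks."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def crc_ok(frame: bytes) -> bool:
    payload, received = frame[:-4], int.from_bytes(frame[-4:], "big")
    return zlib.crc32(payload) == received

def interleave(data: bytes, depth: int = 8) -> bytes:
    """Write row-wise, read column-wise, so a burst error is spread across blocks."""
    rows = [data[i:i + depth] for i in range(0, len(data), depth)]
    out = bytearray()
    for col in range(depth):
        for row in rows:
            if col < len(row):
                out.append(row[col])
    return bytes(out)

frame = add_crc(b"DCT-coded image block ...")
print(crc_ok(frame), interleave(b"ABCDEFGHIJKLMNOP", depth=4))
```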

Quality Control and Assurance of Eddy Covariance Data at the Two KoFlux Sites (KoFlux 관측지에서 에디 공분산 자료의 품질관리 및 보증)

  • Kwon, Hyo-Jung; Park, Sung-Bin; Kang, Min-Seok; Yoo, Jae-Il; Yuan, Renmin; Kim, Joon
    • Korean Journal of Agricultural and Forest Meteorology / v.9 no.4 / pp.260-267 / 2007
  • This research note introduces the quality control and quality assurance procedure applied to the eddy covariance data collected at the two KoFlux sites (i.e., Gwangneung forest and Haenam farmland). The quality control was conducted in several steps based on micrometeorological theory and statistical tests. Data quality was determined at each step of the procedure and denoted by five different quality flags. The programs used to perform the quality control and the quality-assessed data are available at the KoFlux website (http://www.koflux.org/).
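
One typical step in such a QC procedure, spike detection with flag assignment, might look like the sketch below; the thresholds and the mapping to flag values are illustrative assumptions, not the KoFlux implementation.

```python
# A minimal sketch of spike flagging for a flux time series; thresholds and the
# flag values are illustrative, not the KoFlux scheme.
import numpy as np
import pandas as pd

def spike_flag(series: pd.Series, window: int = 13, z: float = 5.0) -> pd.Series:
    """Flag 0 = pass, 2 = suspect spike, 4 = missing."""
    med = series.rolling(window, center=True, min_periods=1).median()
    mad = (series - med).abs().rolling(window, center=True, min_periods=1).median()
    flags = pd.Series(0, index=series.index)
    flags[(series - med).abs() > z * 1.4826 * mad] = 2   # deviation far beyond local scatter
    flags[series.isna()] = 4                             # missing half-hourly value
    return flags

flux = pd.Series([0.11, 0.10, 0.12, 2.50, 0.13, np.nan, 0.12])  # half-hourly CO2 flux
print(spike_flag(flux).tolist())
```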