• Title/Summary/Keyword: Data Quality Validation

Current State and Challenges of Pharmacoeconomic Evaluation in Korea (우리나라 의약품 경제성평가의 현황과 과제)

  • Choi, Sang-Eun
    • Journal of Preventive Medicine and Public Health
    • /
    • v.41 no.2
    • /
    • pp.74-79
    • /
    • 2008
  • Since the positive listing system for prescription drug reimbursement was introduced in Korea, the number of pharmacoeconomic evaluation studies has increased. However, it is not clear whether the quality of these studies has improved. Due to the lack of randomized clinical studies in the Korean health care setting, Korean economic evaluations have typically combined local cost data with foreign clinical data, raising methodological issues of data coherence and consistency. Yet the quality of these data has not been questioned, and the potential bias has not been investigated. Even though policy changes have encouraged the undertaking of pharmacoeconomic evaluations, there is little public funding for validation studies of cost-effectiveness models and data. Several companies perform economic evaluation studies to be submitted on behalf of their own products but do not want the results disclosed to the academic community or the public. To improve the conduct of pharmacoeconomic evaluations in Korea, various funding sources need to be developed, and, as in other multidisciplinary areas, experts in different fields should collaborate to ensure the validity and credibility of pharmacoeconomic evaluations.

Quality Reporting of Radiomics Analysis in Mild Cognitive Impairment and Alzheimer's Disease: A Roadmap for Moving Forward

  • So Yeon Won;Yae Won Park;Mina Park;Sung Soo Ahn;Jinna Kim;Seung-Koo Lee
    • Korean Journal of Radiology
    • /
    • v.21 no.12
    • /
    • pp.1345-1354
    • /
    • 2020
  • Objective: To evaluate radiomics analysis in studies on mild cognitive impairment (MCI) and Alzheimer's disease (AD) using a radiomics quality score (RQS) system to establish a roadmap for further improvement in clinical use. Materials and Methods: PubMed MEDLINE and EMBASE were searched using the terms 'cognitive impairment' or 'Alzheimer' or 'dementia' and 'radiomic' or 'texture' or 'radiogenomic' for articles published until March 2020. From 258 articles, 26 relevant original research articles were selected. Two neuroradiologists assessed the quality of the methodology according to the RQS. Adherence rates for the following six key domains were evaluated: image protocol and reproducibility, feature reduction and validation, biologic/clinical utility, performance index, high level of evidence, and open science. Results: The hippocampus was the most frequently analyzed (46.2%) anatomical structure. Of the 26 studies, 16 (61.5%) used an open source database (14 from Alzheimer's Disease Neuroimaging Initiative and 2 from Open Access Series of Imaging Studies). The mean RQS was 3.6 out of 36 (9.9%), and the basic adherence rate was 27.6%. Only one study (3.8%) performed external validation. The adherence rate was relatively high for reporting the imaging protocol (96.2%), multiple segmentation (76.9%), discrimination statistics (69.2%), and open science and data (65.4%) but low for conducting test-retest analysis (7.7%) and biologic correlation (3.8%). None of the studies stated potential clinical utility, conducted a phantom study, performed cut-off analysis or calibration statistics, was a prospective study, or conducted cost-effectiveness analysis, resulting in a low level of evidence. Conclusion: The quality of radiomics reporting in MCI and AD studies is suboptimal. 
Validation using external datasets is necessary, and improvements are needed in feature reproducibility, feature selection, clinical utility, model performance indices, and the pursuit of a higher level of evidence.
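The scoring summarized above (a mean RQS of 3.6 out of 36, with per-item adherence rates) can be illustrated with a minimal sketch. The checklist items and point values below are hypothetical stand-ins, not the actual 16-item RQS:

```python
# Hypothetical RQS-style checklist for one study: item -> (points awarded, maximum points).
# These item names and maxima are illustrative, not the real RQS definitions.
rqs_items = {
    "imaging_protocol": (1, 1),
    "multiple_segmentation": (1, 1),
    "test_retest": (0, 1),
    "phantom_study": (0, 1),
    "discrimination_statistics": (1, 2),
    "external_validation": (0, 5),
}

awarded = sum(a for a, _ in rqs_items.values())
maximum = sum(m for _, m in rqs_items.values())
score_pct = 100.0 * awarded / maximum  # overall score as a percentage of the maximum
# "basic adherence": fraction of items where at least one point was awarded
adherence = 100.0 * sum(1 for a, _ in rqs_items.values() if a > 0) / len(rqs_items)
```

With these toy values the study scores 3 of 11 points but adheres (at least partially) to half the items, mirroring how a low mean score and a higher basic adherence rate can coexist.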

GLP Bioanalysis from the US FDA Perspective

  • Wilkinson, James M.
    • Proceedings of the Korean Society of Toxicology Conference
    • /
    • 2006.11a
    • /
    • pp.75-79
    • /
    • 2006
  • The United States Food and Drug Administration is responsible for ensuring US residents receive safe and effective medicines. Since blood levels of drugs are correlated with pharmacological effect, FDA closely regulates how those blood levels are measured. The FDA has established requirements for bioanalytical analyses such as minimum method validation, SOP, and reporting criteria. The FDA also has standards for computer validation which must be followed to ensure the data are reliable. Data presented to the Agency are scrutinized to ensure they are accurate and a true reflection of the raw data generated in a study. To verify the quality of data, FDA has developed an inspection program. The specific requirements of the FDA related to bioanalysis will be discussed.

Calibration of APEX-Paddy Model using Experimental Field Data

  • Mohammad, Kamruzzaman;Hwang, Syewoon;Cho, Jaepil;Choi, Soon-Kun;Park, Chanwoo
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2019.05a
    • /
    • pp.155-155
    • /
    • 2019
  • The Agricultural Policy/Environmental eXtender (APEX) model has been developed for assessing agricultural management efforts and their effects on soil and water at the field scale as well as across more complex multi-subarea landscapes, whole farms, and watersheds. The National Academy of Agricultural Sciences (Wanju, Korea) has modified a key component of APEX, named APEX-Paddy, to simulate water quality while accounting for paddy management practices such as puddling and flood irrigation management. Calibration and validation are expected steps before any model application, and simple techniques are essential for assessing whether a parameter should be adjusted during calibration. However, very few studies have evaluated the ability of APEX-Paddy to simulate the impact of multiple management scenarios on nutrient loss. In this study, observation data from experimental fields at Iksan, South Korea, were used for calibration and evaluation during 2013-2015. The APEX auto-calibration tool (APEX-CUTE) was used for model calibration and sensitivity analysis. Four quantitative statistics, the coefficient of determination ($R^2$), Nash-Sutcliffe efficiency (NSE), percent bias (PBIAS), and root mean square error (RMSE), were used in model evaluation. The hydrological process of the modified model, APEX-Paddy, is calibrated and tested in predicting runoff discharge rate and nutrient yield. Field-scale calibration and validation processes are described with an emphasis on essential calibration parameters and the logical sequence of calibration steps. This study helps readers understand the calibration and validation procedure for applications of APEX-Paddy at the field scale.
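The four evaluation statistics named in this abstract ($R^2$, NSE, PBIAS, RMSE) can be computed directly from paired observed/simulated series. A minimal sketch using made-up values, not data from the study:

```python
import math

def calibration_stats(obs, sim):
    """Compute common model-evaluation statistics: R^2, NSE, PBIAS, RMSE."""
    n = len(obs)
    mean_obs = sum(obs) / n
    mean_sim = sum(sim) / n
    # R^2: squared Pearson correlation between observed and simulated values
    cov = sum((o - mean_obs) * (s - mean_sim) for o, s in zip(obs, sim))
    var_o = sum((o - mean_obs) ** 2 for o in obs)
    var_s = sum((s - mean_sim) ** 2 for s in sim)
    r2 = (cov / math.sqrt(var_o * var_s)) ** 2
    # NSE: 1 minus the ratio of squared error to observed variance (1 is perfect)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    nse = 1.0 - sse / var_o
    # PBIAS: average tendency of simulated values to be smaller (+) or larger (-)
    pbias = 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)
    rmse = math.sqrt(sse / n)
    return {"R2": r2, "NSE": nse, "PBIAS": pbias, "RMSE": rmse}

# Illustrative observed/simulated runoff values (not from the paper)
obs = [12.0, 15.0, 14.0, 10.0, 18.0]
sim = [11.5, 15.5, 13.0, 10.5, 17.0]
stats = calibration_stats(obs, sim)
```

Tools such as APEX-CUTE report the same statistics; computing them by hand like this is a quick sanity check on any calibration run.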

DEVELOPMENT AND VALIDATION OF LAND SURFACE TEMPERATURE RETRIEVAL ALGORITHM FROM MTSAT-1R DATA

  • Hong, Ki-Ok;Kang, Jeon-Ho;Suh, Myoung-Seok
    • Proceedings of the KSRS Conference
    • /
    • 2008.10a
    • /
    • pp.293-296
    • /
    • 2008
  • Land Surface Temperature (LST) is a very useful surface parameter for a wide range of applications, such as agriculture and the numerical weather and climate modelling communities. However, operational observation of LST falls far short of the application community's needs in spatial/temporal resolution and accuracy. We therefore developed a split-window-type LST retrieval algorithm to estimate LST from MTSAT-1R data. The coefficients of the split-window algorithm were obtained by statistical regression analysis from radiative transfer simulations using MODTRAN 4 for a wide range of atmospheric profiles, satellite zenith angles, and lapse-rate conditions, including surface inversions. The sensitivity analysis showed that the LST algorithm reproduces LST with reasonable quality. However, the algorithm overestimates LST under strong surface inversions and underestimates it under superadiabatic conditions, especially at warm temperatures, and its performance is better when the satellite zenith angle is small. The accuracy of the retrieved LST was evaluated against Moderate Resolution Imaging Spectroradiometer (MODIS) LST data. The validation results showed that the correlation coefficients and RMSE are about 0.83~0.98 and 1.38~4.06, respectively, and that LST quality is significantly better during night and winter than during day and summer. These results indicate that the LST retrieval algorithm could be used for operational retrieval of LST from MTSAT-1R and COMS (Communication, Ocean and Meteorological Satellite) data with some modifications.
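A split-window algorithm of the kind described here estimates LST from two thermal brightness temperatures, using their difference as a proxy for atmospheric water-vapour absorption. The sketch below uses a generic quadratic form with purely illustrative coefficients; the paper's actual coefficients come from MODTRAN 4 regressions and additionally vary with zenith angle and lapse rate:

```python
def split_window_lst(t11, t12, coeffs=(-0.5, 2.1, 0.3)):
    """Generic split-window LST estimate from two thermal brightness
    temperatures (Kelvin), e.g. the ~11 um and ~12 um channels.
    The coefficients are illustrative only; in practice they are fit by
    regression against radiative-transfer simulations."""
    a0, a1, a2 = coeffs
    dt = t11 - t12  # channel difference carries the water-vapour signal
    return a0 + t11 + a1 * dt + a2 * dt ** 2

# Example: warm, moderately moist scene (made-up brightness temperatures)
lst = split_window_lst(295.0, 293.0)
```

The quadratic term lets the correction grow faster in moist atmospheres, which is why split-window schemes degrade under unusual lapse-rate conditions such as the strong inversions noted in the abstract.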

Validation of Quality of Life Index-Cancer among Korean Patients with Cancer (Quality of Life Index-Cancer의 구성타당도 검증 -국내 암환자를 대상으로-)

  • 소향숙;이원희;이은현;정복례;허혜경;강은실
    • Journal of Korean Academy of Nursing
    • /
    • v.34 no.5
    • /
    • pp.693-701
    • /
    • 2004
  • Purpose: The purpose of this study was to validate the Quality of Life Index-Cancer (Q.L.I.-C) developed by Ferrans (1990) among Korean cancer patients. Method: This study used an exploratory factor analysis design. The Q.L.I.-C was translated into Korean and reverse-translated into English. The subjects were 357 Korean patients with various cancers. Data were collected by questionnaires from May to August 2000 and were analyzed using descriptive statistics, principal component analysis for construct validity, and Cronbach's alpha coefficient for reliability. Result: Factor loadings ranged from .446 to .841. The five extracted factors explained 63.7% of the total variance: 'family' 35.5%, 'health & physical functioning' 11.5%, 'psychological' 6.9%, 'spiritual' 5.6%, and 'economic' 4.2%. Because of cultural differences between Americans and Koreans, certain items, such as sexuality, job status, and education, were deleted from the factor extraction in this study. Cronbach's alpha coefficient was .9253 for the 28 items. Conclusion: The Q.L.I.-C can be applied to measure the quality of life of Korean cancer patients. Further studies are recommended to validate the American and Korean versions of the Q.L.I.-C with respect to cultural differences.
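Cronbach's alpha, the reliability coefficient reported above (.9253 over 28 items), is computable from the item-by-respondent score matrix. A minimal sketch with toy data, not the study's questionnaire responses:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.
    item_scores: one list per item, each holding all respondents' scores."""
    k = len(item_scores)
    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    # Sum of the individual item variances
    item_var_sum = sum(sample_var(col) for col in item_scores)
    # Variance of each respondent's total score across all items
    n = len(item_scores[0])
    totals = [sum(col[i] for col in item_scores) for i in range(n)]
    return (k / (k - 1)) * (1.0 - item_var_sum / sample_var(totals))

# Toy example: two perfectly correlated items over four respondents
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 4, 6, 8]])
```

When items move together, the total-score variance dominates the summed item variances and alpha approaches 1, which is the pattern behind a high coefficient like .9253.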

The Comparative Analysis of Water Quality Environment Data of Wando Onshore Seawater Farm and Tidal Observatory (완도 육상 해수 양식장과 조위관측소의 수질 환경 데이터 비교 분석)

  • Ye, Seoung-Bin;Kwon, In-Yeong;Kim, Tae-Ho;Park, Jeong-Seon;Han, Soon-Hee;Ceong, Hee-Taek
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.16 no.5
    • /
    • pp.957-968
    • /
    • 2021
  • To improve the reliability of data from the onshore fish farm water-quality monitoring system and to operate the system efficiently, water-quality data from onshore seawater fish farms in test operation were compared and analyzed against data from the marine environmental information network (Wando tidal station). Furthermore, data validation, data range filters, and data displacement checks were applied to eliminate data errors in the water-quality monitoring system and increase the reliability of the measurement data.
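Range filters and displacement checks of the kind applied here can be sketched as a single pass over a sensor series: values outside a physical range are rejected, and in-range values that jump more than a plausible step from the last accepted reading are flagged as spikes. The thresholds below are illustrative, not the system's actual limits:

```python
def validate_series(values, lo, hi, max_step):
    """Flag out-of-range readings and implausible jumps (displacement check).
    Returns a list of (index, reason) tuples."""
    flags = []
    prev = None  # last accepted (in-range) reading
    for i, v in enumerate(values):
        if not (lo <= v <= hi):
            flags.append((i, "out_of_range"))
            continue  # do not let a bad reading become the spike baseline
        if prev is not None and abs(v - prev) > max_step:
            flags.append((i, "spike"))
        prev = v
    return flags

# Illustrative water-temperature series (deg C): one sensor fault, one sudden jump
flags = validate_series([15.0, 15.2, 40.0, 15.3, 22.0],
                        lo=0.0, hi=35.0, max_step=2.0)
```

Skipping rejected values when updating the baseline prevents a single sensor fault from generating a false spike flag on the next good reading.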

Proposal of Public Data Quality Management Level Evaluation Domain Rule Mapping Model

  • Jeong, Ha-Na;Kim, Jae-Woong;Chung, Young-Suk
    • Journal of the Korea Society of Computer and Information
    • /
    • v.27 no.12
    • /
    • pp.189-195
    • /
    • 2022
  • The Korean government has made it a major national task to contribute to revitalizing the creative economy, such as by creating new industries and jobs, through encouraging the private-sector opening and use of public data. To retain high-quality public data, the government promotes quality improvement through activities such as public data quality management level evaluations. However, diagnosis results differ depending on users' understanding of the public data quality diagnosis tool and their data expertise, so it is difficult to ensure the accuracy of the diagnosis results. This paper proposes a domain rule mapping model for public data quality management level evaluation, applicable to the validation diagnosis among the data quality diagnosis standards, which increases the stability and accuracy of public data quality diagnosis.
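A domain rule mapping of the kind proposed here ties each logical data domain to a fixed validation rule, so the validity diagnosis no longer depends on the individual rater's judgment. A minimal sketch with hypothetical domains and rules:

```python
import re

# Hypothetical domain -> rule mapping: each logical domain gets one fixed
# validation predicate, so every rater applies the same diagnosis.
DOMAIN_RULES = {
    "date": lambda v: re.fullmatch(r"\d{4}-\d{2}-\d{2}", v) is not None,
    "postal_code": lambda v: re.fullmatch(r"\d{5}", v) is not None,
    "rate": lambda v: v.replace(".", "", 1).isdigit() and 0.0 <= float(v) <= 100.0,
}

def diagnose(column_domain, values):
    """Return the validity ratio of a column under its mapped domain rule."""
    rule = DOMAIN_RULES[column_domain]
    valid = sum(1 for v in values if rule(v))
    return valid / len(values)
```

For example, a date column containing `"2022/01/04"` fails the mapped format rule regardless of who runs the diagnosis; range or calendar checks (e.g. rejecting month 13) would be additional rules layered on top of this format check.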

Prediction of concrete compressive strength using non-destructive test results

  • Erdal, Hamit;Erdal, Mursel;Simsek, Osman;Erdal, Halil Ibrahim
    • Computers and Concrete
    • /
    • v.21 no.4
    • /
    • pp.407-417
    • /
    • 2018
  • Concrete, a composite material, is one of the most important construction materials, and compressive strength is a commonly used parameter for assessing concrete quality, so its accurate prediction is an important issue. In this study, we used an experimental procedure for the assessment of concrete quality. First, a concrete mix was prepared according to C 20 type concrete, with a fresh-concrete slump of about 20 cm. After placing the fresh concrete in formworks, compaction was achieved using a vibrating screed. After a 28-day curing period, a total of 100 core samples of 75 mm diameter were extracted. Pulse-velocity and compressive-strength tests were performed on the core samples, along with Windsor probe penetration tests and Schmidt hammer tests. After the data set was assembled, twelve artificial intelligence (AI) models were compared for predicting concrete compressive strength. These models fall into three categories: (i) functions (Linear Regression, Simple Linear Regression, Multilayer Perceptron, Support Vector Regression), (ii) lazy-learning algorithms (IBk Linear NN Search, KStar, Locally Weighted Learning), and (iii) tree-based learning algorithms (Decision Stump, Model Trees Regression, Random Forest, Random Tree, Reduced Error Pruning Tree). Four validation schemes (10-fold cross validation, 5-fold cross validation, 10% split-sample validation, and 20% split-sample validation) were used to examine the performance of the predictive models. This study shows that machine learning regression techniques are promising tools for predicting the compressive strength of concrete.
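The k-fold cross validation used to evaluate these models partitions the samples into k folds, trains on k-1 of them, and scores on the held-out fold. A minimal sketch using a simple least-squares line on made-up rebound-number/strength pairs (not the study's data or models):

```python
import math

def kfold_indices(n, k):
    """Yield (train, test) index lists for k-fold cross validation."""
    for j in range(k):
        test = [i for i in range(n) if i % k == j]
        train = [i for i in range(n) if i % k != j]
        yield train, test

def fit_line(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Made-up (rebound number, core strength in MPa) pairs for illustration
x = [30, 32, 34, 36, 38, 40, 42, 44, 46, 48]
y = [20.5, 22.0, 24.1, 25.8, 28.0, 29.7, 32.2, 33.9, 36.1, 37.8]

rmses = []
for train, test in kfold_indices(len(x), 5):
    a, b = fit_line([x[i] for i in train], [y[i] for i in train])
    sq_err = [(y[i] - (a + b * x[i])) ** 2 for i in test]
    rmses.append(math.sqrt(sum(sq_err) / len(sq_err)))
cv_rmse = sum(rmses) / len(rmses)  # average held-out error across folds
```

Averaging the held-out error over all folds gives a performance estimate that, unlike a single train/test split, uses every sample for both training and testing.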

A Study on an Application of the Protection for the Visual Segment of the Approach Procedure focused on Taean Airport (접근절차의 시계구간 보호 적용 연구 - 태안비행장을 중심으로 -)

  • Kim, Dohyun;Hong, Seung Beom
    • Journal of the Korean Society for Aviation and Aeronautics
    • /
    • v.22 no.2
    • /
    • pp.9-15
    • /
    • 2014
  • 'Visual segment surface' means a surface that extends from the missed approach point of non-precision approaches (or the decision altitude location for approaches with vertical guidance and precision approaches) to the threshold, to facilitate the identification of, and protection from, obstacles in this visual segment of the approach. Validation is the necessary final quality-assurance step in the procedure design process, prior to publication; its purpose is the verification of all obstacle and navigation data and an assessment of the flyability of the procedure. This paper shows how to apply protection for the visual segment of an approach procedure and presents the results of visual segment surface validation conducted at Taean Airport.