• Title/Abstract/Keyword: data validation

Search results: 3,309 items (processing time: 0.042 s)

Validation of GNSS TEC from NMSC GNSS Processing System

  • Lee, Jeong-Deok;Oh, Seung-Jun;Kil, Hyo-Sub;Shin, Dae-Yun
    • The Bulletin of The Korean Astronomical Society
    • /
    • Vol. 36, No. 2
    • /
    • pp.101.1-101.1
    • /
    • 2011
  • The National Meteorological Satellite Center (NMSC) of the Korea Meteorological Administration (KMA) collects GNSS data in near-real time from about 80 GNSS stations operated by multiple agencies (e.g., the National Geographic Information Institute (NGII), the Korea Astronomy and Space Science Institute (KASI), and the DGNSS Central Office). Using these data, NMSC developed an automatic Total Electron Content (TEC) derivation system that produces hourly TEC over the Korean Peninsula based on single-station data processing. We present the TEC results and their validation against International GNSS Service (IGS) global TEC data for both quiet-time and storm-time cases. Future plans for system improvement will be discussed.


Development of data analysis tool for combat system integration

  • Shin, Seung-Chun;Shin, Jong-Gye;Oh, Dae-Kyun
    • International Journal of Naval Architecture and Ocean Engineering
    • /
    • Vol. 5, No. 1
    • /
    • pp.147-160
    • /
    • 2013
  • System integration is an important element in the construction of naval combat ships. In particular, because impeccable integration of the combat system with its sensors and weapons ensures the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated against the requirements of the end user. Systematic verification and validation require a data analysis tool. This paper proposes the Data Extraction, Recording and Analysis Tool (DERAT) for analyzing the integrated performance of the combat system, covering the functional definition, architecture, and effectiveness of the DERAT, and presents test results.

Two-step LS-SVR for censored regression

  • Bae, Jong-Sig;Hwang, Chang-Ha;Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 23, No. 2
    • /
    • pp.393-401
    • /
    • 2012
  • This paper deals with estimation for least squares support vector regression (LS-SVR) when the responses are subject to random right censoring. The estimation is performed in two steps: ordinary least squares support vector regression, followed by least squares support vector regression on the censored data. We use the empirical fact that the estimated regression functions under random right censoring are closer to the true regression functions than the observed failure times are. The hyper-parameters of the model, which affect the performance of the proposed procedure, are selected by a generalized cross-validation function. Experimental results are presented that indicate the performance of the proposed procedure.
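For reference, the first of the two steps, an ordinary LS-SVR fit, can be sketched by solving the LS-SVR dual linear system directly. This is a minimal illustration, not the paper's censored-data procedure; the RBF kernel width and regularization values below are arbitrary:

```python
import numpy as np

def ls_svr_fit(X, y, gamma=10.0, sigma=1.0):
    """Fit LS-SVR by solving its dual linear system.
    `gamma` (regularization) and `sigma` (RBF width) are illustrative."""
    n = len(y)
    # RBF kernel matrix over the training inputs
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2 * sigma ** 2))
    # Dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    return alpha, b, X, sigma

def ls_svr_predict(model, Xnew):
    alpha, b, Xtr, sigma = model
    sq = np.sum((Xnew[:, None, :] - Xtr[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2 * sigma ** 2))
    return K @ alpha + b
```

In the paper's procedure the second step would re-fit on censoring-adjusted responses, and the hyper-parameters would be chosen by generalized cross-validation rather than fixed as here.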

A Study on the Prediction of Community Smart Pension Intention Based on Decision Tree Algorithm

  • Liu, Lijuan;Min, Byung-Won
    • International Journal of Contents
    • /
    • Vol. 17, No. 4
    • /
    • pp.79-90
    • /
    • 2021
  • With the deepening of population aging, pensions have become an urgent problem in most countries. Community smart pension can effectively resolve the problems of traditional pension schemes and meet the personalized, multi-level needs of the elderly. To predict the pension intention of the elderly in the community more accurately, this paper applies decision tree classification to the pension data. After missing-value processing, normalization, discretization, and data reduction, a discretized sample data set is obtained. Then, by comparing the information gain and information gain rate of the sample features, the feature ranking is determined and a C4.5 decision tree model is established. Under 10-fold cross-validation the model performs well on accuracy, precision, recall, AUC, and other indicators, with a precision of 89.5%, which can provide a basis for government decision-making.
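The feature-ranking step described above, comparing information gain and gain rate (gain ratio, the criterion C4.5 uses), can be sketched for one discrete feature as follows; the feature values and labels are hypothetical:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_and_ratio(feature, labels):
    """Information gain and gain ratio of one discrete feature."""
    n = len(labels)
    base = entropy(labels)
    cond = 0.0        # conditional entropy of labels given the feature
    split_info = 0.0  # entropy of the feature itself (the gain-ratio denominator)
    for v, cnt in Counter(feature).items():
        subset = [l for f, l in zip(feature, labels) if f == v]
        w = cnt / n
        cond += w * entropy(subset)
        split_info -= w * math.log2(w)
    gain = base - cond
    ratio = gain / split_info if split_info > 0 else 0.0
    return gain, ratio
```

Ranking features by these two quantities, as the abstract describes, then determines the split order of the tree.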

GLP Bioanalysis from the US FDA Perspective

  • Wilkinson, James M.
    • Korean Society of Toxicology: Conference Proceedings
    • /
    • 2006 Autumn Conference of the Korean Society of Toxicology
    • /
    • pp.75-79
    • /
    • 2006
  • The United States Food and Drug Administration (FDA) is responsible for ensuring that US residents receive safe and effective medicines. Since blood levels of drugs correlate with pharmacological effect, the FDA closely regulates how those blood levels are measured. The FDA has established requirements for bioanalytical analyses, including minimum method validation, SOPs, and reporting criteria. It also has standards for computer validation that must be followed to ensure the data are reliable. Data presented to the Agency are scrutinized to ensure they are accurate and a true reflection of the raw data generated in a study. To verify data quality, the FDA has developed an inspection program. The specific FDA requirements related to bioanalysis will be discussed.


Validation of Calibrated Wind Data Sector including Shadow Effects of a Meteorological Mast Using WindSim

  • 박근성;유기완;김현구
    • Journal of Wind Energy
    • /
    • Vol. 4, No. 2
    • /
    • pp.34-39
    • /
    • 2013
  • A wind resource assessment based on at least one year of wind data measured by a meteorological mast is a prerequisite for judging the business feasibility of a wind farm development. Even when the booms mounting the wind vane and anemometer are carefully oriented to escape the wakes generated by the met-mast structure, the shadow effect cannot be completely avoided because of the seasonal winds of the Korean Peninsula. The shadow effect should be properly calibrated because it can distort the assessed wind resource. In this study, a calibration method is introduced for wind data measured at Julpo in Jeonbuk Province. The terrain conditions of each sector along the selected wind directions near the met-mast are investigated, and the wind data distorted by shadow effects are calibrated effectively. A correction factor obtained from WindSim analysis is adopted for quantitative calibration.

An Indoor Localization Algorithm based on Improved Particle Filter and Directional Probabilistic Data Association for Wireless Sensor Network

  • Long Cheng;Jiayin Guan
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • Vol. 17, No. 11
    • /
    • pp.3145-3162
    • /
    • 2023
  • As an important Internet technology, the wireless sensor network (WSN) plays a key role in indoor localization. The non-line-of-sight (NLOS) problem has a large effect on indoor location accuracy. A localization algorithm based on an improved particle filter and directional probabilistic data association (IPF-DPDA) is proposed in this paper to address the NLOS issue in WSNs. First, the improved particle filter is used to reduce distance-measurement error. A hypothesis test then detects whether the measurements of N different groups were made under LOS or NLOS conditions. When there are measurements inside the validation gate, the corresponding association probabilities are used to weight the retained position estimates and obtain the final location estimate. We improve traditional data association by adding directional information. If the validation gate contains no measurement, the Kalman prediction is used for the update instead. Finally, simulation and experimental results show that the IPF-DPDA performs better than existing methods.
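The validation gate mentioned above is, in standard probabilistic data association, a chi-square test on the Mahalanobis distance between a measurement and its prediction. A minimal sketch follows; the threshold shown is an illustrative 99% gate for 2-D measurements, not a value from the paper:

```python
import numpy as np

def in_validation_gate(z, z_pred, S, gate=9.21):
    """Accept measurement z if its squared Mahalanobis distance to the
    predicted measurement z_pred, under innovation covariance S, falls
    within the chi-square gate (9.21 is roughly the 99% quantile for
    2 degrees of freedom). Returns (accepted, distance_squared)."""
    v = np.asarray(z, dtype=float) - np.asarray(z_pred, dtype=float)
    d2 = float(v @ np.linalg.solve(S, v))
    return d2 <= gate, d2
```

Measurements passing this test would then be weighted by their association probabilities; when none pass, the filter falls back on the prediction, as the abstract describes.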

Recovery the Missing Streamflow Data on River Basin Based on the Deep Neural Network Model

  • Le, Xuan-Hien;Lee, Giha
    • Korea Water Resources Association: Conference Proceedings
    • /
    • 2019 Conference of the Korea Water Resources Association
    • /
    • pp.156-156
    • /
    • 2019
  • In this study, a gated recurrent unit (GRU) network is constructed on a deep neural network (DNN) architecture with the aim of restoring missing daily flow data in river basins. The Lai Chau hydrological station, located upstream in the Da River basin (Vietnam), is selected as the target station. The model input is observed daily flow for the 24 years from 1961 to 1984 (before the Hoa Binh dam was built) at 5 hydrological stations: 4 gauge stations downstream in the basin and the restoration target station (Lai Chau). The available data are divided by purpose: the 23-year set (1961-1983) is used for training and validation, at rates of 80% and 20% respectively, while the remaining year (1984) is used for testing to objectively verify the performance and accuracy of the model. Although only a modest amount of input data is required, and the Lai Chau station lies upstream of the Da River, the results of the suggested model agree satisfactorily with the observed data: the Nash-Sutcliffe efficiency (NSE) is higher than 95%. These findings illustrate the outstanding performance of the GRU network in recovering the missing flow data at Lai Chau station. DNN models in general, and GRU network models in particular, therefore have great potential for application in hydrology and hydraulics.
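The Nash-Sutcliffe efficiency used to evaluate the model is one minus the ratio of residual variance to the variance of the observations; a minimal sketch:

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit; 0.0 means the
    model is no better than predicting the observed mean."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
```

An NSE above 0.95, as reported in the abstract, means the residual variance of the reconstructed flows is under 5% of the natural variance of the observed series.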


An Efficient RDF Query Validation for Access Authorization in Subsumption Inference

  • 김재훈;박석
    • Journal of KIISE: Databases
    • /
    • Vol. 36, No. 6
    • /
    • pp.422-433
    • /
    • 2009
  • As a security study for the Semantic Web, this paper introduces an RDF access-authorization specification model based on the ontology hierarchy and RDF triple patterns. The specification model is then applied to validating RDF queries against the granted access authorizations. A SPARQL or RQL query, the representative RDF query languages built on RDF triple patterns, can be refused or permitted according to access authorizations specified in RDF triple-pattern form. To perform this query validation efficiently, the main authorization-conflict conditions arising under RDF subsumption inference are analyzed. An efficient query-validation algorithm is then presented that exploits the analyzed conflict conditions and a Dewey graph-labeling technique. Experiments show that the proposed algorithm achieves reasonable validation time and scales well as the data and the number of access authorizations grow.
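As background on the Dewey labeling the algorithm exploits: a Dewey label encodes a node's path from the ontology root, so a subsumption (ancestor) test reduces to a component-wise prefix comparison instead of a graph traversal. A minimal sketch, not the paper's full conflict-detection algorithm:

```python
def dewey_is_ancestor(a, b):
    """True if the class labeled `a` subsumes (is a proper ancestor of)
    the class labeled `b`. Dewey labels are dot-separated paths from the
    ontology root, e.g. '1.2' is the parent of '1.2.3'."""
    pa, pb = a.split('.'), b.split('.')
    # Compare path components, not raw strings: '1.2' must not match '1.20'.
    return len(pa) < len(pb) and pb[:len(pa)] == pa
```

Checking authorization conflicts between a query triple pattern and a specified authorization then becomes a constant-depth label comparison rather than a hierarchy walk.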

Digital Forensics: Review of Issues in Scientific Validation of Digital Evidence

  • Arshad, Humaira;Jantan, Aman Bin;Abiodun, Oludare Isaac
    • Journal of Information Processing Systems
    • /
    • Vol. 14, No. 2
    • /
    • pp.346-376
    • /
    • 2018
  • Digital forensics is a vital part of almost every criminal investigation, given the amount of information available and the opportunities electronic data offer to investigate and evidence a crime. However, in criminal justice proceedings these electronic pieces of evidence are often regarded with the utmost suspicion and uncertainty, although on occasion this is justifiable. At present, the use of scientifically unproven forensic techniques is highly criticized in legal proceedings. Moreover, the exceedingly distinct and dynamic characteristics of electronic data, together with current legislation and privacy laws, remain challenging aspects of systematically attesting evidence in a court of law. This article presents a comprehensive study of the issues that must be discussed and resolved for evidence to be properly accepted on scientific grounds. It also surveys the state of forensics in emerging sub-fields of digital technology such as cloud computing, social media, and the Internet of Things (IoT), and reviews the challenges that may complicate the systematic validation of electronic evidence. The study further explores various solutions previously proposed by researchers and academics, assessing their appropriateness based on experimental evaluation. In addition, the article identifies open research areas, highlighting issues and problems in the empirical evaluation of these solutions that merit immediate attention from researchers and practitioners. Academics must react to these challenges with appropriate emphasis on methodical verification. To that end, this study reviews the issues in the experiential validation of currently available practices and discusses the difficulty of demonstrating the reliability and validity of these approaches with contemporary evaluation methods. Furthermore, the development of best practices, reliable tools, and formal testing methods for digital forensic techniques is highlighted, which could be of immense value in improving the trustworthiness of electronic evidence in legal proceedings.