• Title/Summary/Keyword: validation (검증)


An Efficient RDF Query Validation for Access Authorization in Subsumption Inference (포함관계 추론에서 접근 권한에 대한 효율적 RDF 질의 유효성 검증)

  • Kim, Jae-Hoon;Park, Seog
    • Journal of KIISE:Databases
    • /
    • v.36 no.6
    • /
    • pp.422-433
    • /
    • 2009
  • As an effort to secure the Semantic Web, in this paper we introduce an RDF access authorization model based on an ontology hierarchy and an RDF triple pattern. In addition, we apply the authorization model to RDF query validation against approved access authorizations. A submitted SPARQL or RQL query, which contains RDF triple patterns, can be denied or granted according to the corresponding access authorizations, each of which also has an RDF triple pattern. In order to perform the query validation process efficiently, we first analyze some primary authorization conflict conditions under RDF subsumption inference, and then introduce an efficient query validation algorithm using the conflict conditions and a Dewey graph labeling technique. Through experiments, we also show that the proposed validation algorithm provides reasonable validation times and scales as the data and authorizations grow.
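The Dewey labeling idea the abstract relies on can be sketched briefly: each node's label extends its ancestor's label by one component, so subsumption reduces to a prefix test. The sketch below is a minimal illustration, not the paper's algorithm; the labels and the simplified grant/deny conflict rule are hypothetical.

```python
def is_ancestor(label_a: str, label_b: str) -> bool:
    """label_a subsumes label_b iff a's components are a proper prefix of b's."""
    a, b = label_a.split("."), label_b.split(".")
    return len(a) < len(b) and b[:len(a)] == a

def conflicts(grant_label: str, deny_label: str) -> bool:
    """Simplified rule: a grant and a deny conflict when either class
    subsumes the other (or they are the same class)."""
    return (grant_label == deny_label
            or is_ancestor(grant_label, deny_label)
            or is_ancestor(deny_label, grant_label))

# "1.2" (a superclass) subsumes "1.2.4" (a subclass), so a deny on the
# superclass conflicts with a grant on the subclass; "1.3" does not.
print(conflicts("1.2.4", "1.2"))   # True
print(conflicts("1.3", "1.2.4"))   # False
```

Because the prefix test is a constant-time string comparison per component, conflict detection avoids walking the ontology hierarchy at query time.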

Validation Testing Tool for Light-Weight Stream Ciphers (경량 스트림 암호 구현 적합성 검증 도구)

  • Kang Ju-Sung;Shin Hyun Koo;Yi Okyeon;Hong Dowon
    • The KIPS Transactions:PartC
    • /
    • v.12C no.4 s.100
    • /
    • pp.495-502
    • /
    • 2005
  • Cryptographic algorithm validation testing is performed to ensure that a specific algorithm is implemented correctly and functions as specified. The CMVP (Cryptographic Module Validation Program) of NIST in the US is the well-known testing system that validates cryptographic modules against the Federal Information Processing Standards (FIPS). However, there is no FIPS-approved stream cipher, so the CMVP includes no validation testing procedure for them. In this paper we provide validation systems for three currently used light-weight stream ciphers: the Bluetooth encryption algorithm E0, the 3GPP encryption algorithm A5/3, and RC4 as used in the WEP and SSL/TLS protocols. Moreover, we describe our validation tools implemented in Java.
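Validation tools of this kind typically revolve around known-answer tests: run the implementation under test on fixed inputs and compare against reference vectors. The sketch below illustrates that harness shape only; the XOR "cipher" and its vector are toy stand-ins, not E0, A5/3, or RC4.

```python
def keystream_xor(key: bytes, plaintext: bytes) -> bytes:
    """Toy stand-in cipher: XOR the plaintext with a repeating key."""
    return bytes(p ^ key[i % len(key)] for i, p in enumerate(plaintext))

def run_known_answer_tests(cipher, vectors):
    """Each vector is (key, plaintext, expected ciphertext); the
    implementation passes only if every output matches."""
    return all(cipher(key, pt) == ct for key, pt, ct in vectors)

# one hand-checked vector: 0x00^0x0f = 0x0f, 0xff^0x0f = 0xf0
vectors = [(b"\x0f", b"\x00\xff", b"\x0f\xf0")]
print(run_known_answer_tests(keystream_xor, vectors))   # True
```

A real tool would load standardized vector files and also exercise boundary cases (key sizes, IV handling), but the pass/fail comparison loop is the same.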

Development of Miniaturized Automatic Chromatography System for Validation Study of Chromatographic Resin Lifetime (크로마토그래피 담체의 수명을 검증하기 위한 자동화 미니 크로마토그래피 시스템 개발)

  • 박재하;서창우
    • KSBB Journal
    • /
    • v.17 no.4
    • /
    • pp.326-332
    • /
    • 2002
  • The quality of biopharmaceutical proteins is strongly affected by the manufacturing process employed to produce them, and thus validation of the manufacturing bioprocess is a very important issue. Chromatography is probably the most widely used bioprocess unit operation for protein purification. In this study, a miniaturized automatic chromatography system was designed and constructed for scale-down studies in process chromatography validation. This system, named MiniValChrom, has the following features: automatic and repeated operation, flexible sequences and intervals among the steps, on-line and real-time monitoring and control, method file saving, etc. Using the MiniValChrom, we performed a case study of an abbreviated experiment to estimate chromatographic resin lifetime. BSA (bovine serum albumin) and Cibacron Blue 3G-A were used as the model protein and the resin, respectively. Resin deterioration was evaluated by determining and monitoring the HETP and NTP values from the chromatograms every 5 cycles. The HETP and NTP values changed by 9% after 15 cycles. The resin lifetime validation could be completed by repeating this experiment until the HETP value reached a predetermined threshold. The MiniValChrom concept and the protocol suggested in this study can serve as a rapid and economical tool for validation studies of bioprocess chromatography systems.
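The HETP/NTP monitoring mentioned above rests on two standard column-efficiency formulas; the sketch below computes them with the common half-height method. The column length and peak readings are illustrative numbers, not data from the paper.

```python
def plate_count(retention_time, half_height_width):
    """Number of theoretical plates (NTP) by the half-height method:
    N = 5.54 * (tR / w_half)^2, with both times in the same units."""
    return 5.54 * (retention_time / half_height_width) ** 2

def hetp(column_length, ntp):
    """Height equivalent to a theoretical plate: HETP = L / N.
    A rising HETP across cycles signals resin deterioration."""
    return column_length / ntp

n = plate_count(10.0, 0.5)   # retention 10 min, half-height width 0.5 min
print(round(n))              # 2216
print(hetp(20.0, n))         # plate height for a 20 cm column, in cm
```

Logging these two numbers every few cycles, as the study does every 5 cycles, gives a simple trend line for deciding when the resin has reached the end of its validated lifetime.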

Real-Time PCR for Validation of Minute Virus of Mice Safety during the Manufacture of Mammalian Cell Culture-Derived Biopharmaceuticals (세포배양 유래 생물의약품 생산 공정에서 Minute Virus of Mice 안전성 검증을 위한 Real-Time PCR)

  • Lee, Dong-Hyuck;Cho, Hang-Mee;Kim, Hyun-Mi;Lee, Jung-Suk;Kim, In-Seop
    • Microbiology and Biotechnology Letters
    • /
    • v.36 no.1
    • /
    • pp.12-20
    • /
    • 2008
  • Validation of viral safety is essential in ensuring the safety of mammalian cell culture-derived biopharmaceuticals, because numerous adventitious viruses have contaminated such products during manufacture. Mammalian cells are highly susceptible to minute virus of mice (MVM), and there are several reports of MVM contamination during the manufacture of biopharmaceuticals. In order to establish a validation system for MVM safety, a real-time PCR method was developed for quantitative detection of MVM in cell lines, raw materials, manufacturing processes, and final products, as well as for MVM clearance validation. Specific primers for amplification of MVM DNA were selected, and MVM DNA was quantified by use of SYBR Green I. The sensitivity of the assay was calculated to be $6{\times}10^{-2}TCID_{50}/mL$. The real-time PCR method was proven to be reproducible and highly specific to MVM. The established real-time PCR assay was successfully applied to the validation of Chinese hamster ovary (CHO) cells artificially infected with MVM. MVM DNA could be quantified in CHO cells as well as in the culture supernatant. When the real-time PCR assay was applied to the validation of virus removal during a virus filtration process, the result was similar to that of the virus infectivity assay. Therefore, it was concluded that this rapid, specific, sensitive, and robust assay can replace the infectivity assay for detection and clearance validation of MVM.
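Quantification in real-time PCR is conventionally done with a standard curve: fit threshold cycle (Ct) against log10 concentration for a dilution series, then invert the line for an unknown sample. The sketch below shows that generic calculation; the Ct values are made-up ideal numbers, not the paper's data.

```python
from math import log10

def fit_standard_curve(concentrations, cts):
    """Least-squares line Ct = slope * log10(concentration) + intercept."""
    xs = [log10(c) for c in concentrations]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cts))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    """Invert the standard curve to estimate an unknown concentration."""
    return 10 ** ((ct - intercept) / slope)

# ideal 10-fold dilution series: Ct rises ~3.32 cycles per dilution step
slope, intercept = fit_standard_curve([1e6, 1e5, 1e4, 1e3],
                                      [15.0, 18.32, 21.64, 24.96])
```

With this curve, a sample reading Ct = 21.64 maps back to a concentration of about 1e4, matching the dilution it came from; a slope near -3.32 also indicates close to 100% amplification efficiency.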

A Study on the Statistical Model Validation using Response-adaptive Experimental Design (반응적응 시험설계법을 이용하는 통계적 해석모델 검증 기법 연구)

  • Jung, Byung Chang;Huh, Young-Chul;Moon, Seok-Jun;Kim, Young Joong
    • Proceedings of the Korean Society for Noise and Vibration Engineering Conference
    • /
    • 2014.10a
    • /
    • pp.347-349
    • /
    • 2014
  • Model verification and validation (V&V) is a current research topic addressing the general concepts, processes, and statistical techniques needed to build computational models with high predictive capability. The hypothesis test for validity check is one such model validation technique and gives a guideline for evaluating the validity of a computational model when only limited experimental data exist due to restricted test resources (e.g., time and budget). The hypothesis test for validity check mainly employs Type I error, the risk of rejecting a valid computational model, for the validity evaluation, since quantification of Type II error is not feasible for model validation. However, Type II error, the risk of accepting an invalid computational model, should also be considered for engineered products whose predicted results carry high risk. This paper proposes a technique, called response-adaptive experimental design, that reduces Type II error by adaptively designing the experimental conditions for the validation experiment. A tire tread block problem and a numerical example are employed to show the effectiveness of the response-adaptive experimental design for the validity evaluation.
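A hypothesis test for validity check can be illustrated with the simplest case, a one-sample t-test against the model's predicted mean. This is a generic sketch of the idea, not the paper's procedure; all numbers are invented for illustration.

```python
from statistics import mean, stdev
from math import sqrt

def validity_check(observations, model_prediction, t_critical=2.776):
    """One-sample t-test: accept the model (return True) when the test
    cannot reject that the observations share the model's predicted mean.
    t_critical=2.776 is the two-sided 5% point for 4 degrees of freedom
    (five observations); that 5% is the Type I error alpha.  Type II error,
    accepting an invalid model, is what more validation data would reduce."""
    n = len(observations)
    t = (mean(observations) - model_prediction) / (stdev(observations) / sqrt(n))
    return abs(t) <= t_critical

experiments = [9.8, 10.1, 10.3, 9.9, 10.0]   # five validation measurements
print(validity_check(experiments, 10.0))      # True: model consistent
print(validity_check(experiments, 12.0))      # False: model rejected
```

The paper's contribution is to choose *which* experimental conditions to run next so that such a test discriminates invalid models more sharply, i.e. so the Type II error shrinks for a fixed number of experiments.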


A Study of Optimal Ratio of Data Partition for Neuro-Fuzzy-Based Software Reliability Prediction (뉴로-퍼지 소프트웨어 신뢰성 예측에 대한 최적의 데이터 분할비율에 관한 연구)

  • Lee, Sang-Un
    • The KIPS Transactions:PartD
    • /
    • v.8D no.2
    • /
    • pp.175-180
    • /
    • 2001
  • This paper presents the optimal fraction of the validation set for obtaining accurate predictions of future software failure counts or failure times with a neuro-fuzzy system. Given a fixed amount of training data, the most popular and effective approach to avoiding underfitting and overfitting, and hence achieving optimal generalization, is early stopping. But an unresolved practical issue remains: how much data should be assigned to the training set and how much to the validation set? Rules of thumb abound, but in practice the split is found by trial and error, which takes a long time. To find the optimal fraction, various specific fractions for the validation set were examined. The results show that a minimal fraction of validation data is sufficient to achieve good next-step prediction. This result can serve as a practical guideline for predicting software reliability with a neuro-fuzzy system.
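The split being tuned above is simple to state in code. The sketch below holds out the most recent fraction of time-ordered failure data for early stopping; the data and fraction are illustrative, and the neuro-fuzzy model itself is not shown.

```python
def split_for_validation(data, validation_fraction):
    """Hold out the last fraction of the (time-ordered) failure data for
    early stopping; train on the rest."""
    cut = round(len(data) * (1.0 - validation_fraction))
    return data[:cut], data[cut:]

failures = list(range(100))          # 100 ordered failure observations
train, val = split_for_validation(failures, 0.1)
print(len(train), len(val))          # 90 10
```

The paper's question is precisely what `validation_fraction` should be; its finding is that a small value already suffices for good next-step prediction, leaving more data for training.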


Analysis of Saccharomyces Cell Cycle Expression Data using Bayesian Validation of Fuzzy Clustering (퍼지 클러스터링의 베이지안 검증 방법을 이용한 발아효모 세포주기 발현 데이타의 분석)

  • Yoo Si-Ho;Won Hong-Hee;Cho Sung-Bae
    • Journal of KIISE:Software and Applications
    • /
    • v.31 no.12
    • /
    • pp.1591-1601
    • /
    • 2004
  • Clustering, a technique for the analysis of genes, organizes patterns into groups by similarity within the dataset and has been used for identifying the functions of the genes in a cluster or analyzing the functions of unknown genes. Since genes usually belong to multiple functional families, fuzzy clustering methods are more appropriate than conventional hard clustering methods, which assign each sample to exactly one group. In this paper, a Bayesian validation method is proposed to evaluate fuzzy partitions effectively. The Bayesian validation method is a probability-based approach that selects the fuzzy partition with the largest posterior probability given the dataset. First, the proposed Bayesian validation method is compared to four representative conventional fuzzy cluster validity measures on four well-known datasets, using the fuzzy c-means algorithm. Then, we analyze the results of Saccharomyces cell cycle expression data evaluated by the proposed method.

An Accurate Cryptocurrency Price Forecasting using Reverse Walk-Forward Validation (역순 워크 포워드 검증을 이용한 암호화폐 가격 예측)

  • Ahn, Hyun;Jang, Baekcheol
    • Journal of Internet Computing and Services
    • /
    • v.23 no.4
    • /
    • pp.45-55
    • /
    • 2022
  • The size of the cryptocurrency market is growing; for example, the market capitalization of bitcoin has exceeded 500 trillion won. Accordingly, many studies have been conducted to predict cryptocurrency prices, and most of them adopt methodologies similar to those for predicting stock prices. However, unlike stock prices, cryptocurrency prices are best predicted by machine learning models; conceptually, cryptocurrency yields no passive income from ownership, and statistically, cryptocurrency has at least three times the liquidity of stocks. That is why we argue that a methodology different from stock price prediction should be applied to cryptocurrency price prediction studies. We propose Reverse Walk-forward Validation (RWFV), which modifies Walk-forward Validation (WFV). Unlike WFV, RWFV measures validation accuracy by pinning the validation dataset directly in front of the test dataset in the time series and gradually increasing the size of the training dataset in front of it. The training data are then cut to the size that yielded the highest validation accuracy, combined with the validation data, and used to measure accuracy on the test data. Logistic regression and Support Vector Machines (SVM) were used as the analysis models, and various algorithms and parameters such as L1, L2, rbf, and poly were applied for the reliability of our proposed RWFV. As a result, all analysis models showed improved accuracy compared to existing studies, with accuracy increasing by 1.23%p on average. This is a significant improvement, given that the accuracy of cryptocurrency price prediction in previous studies mostly remains between 50% and 60%.
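The RWFV loop described above can be sketched with standard library code. The toy majority-class "model" and the synthetic labels below are illustrative stand-ins for the paper's logistic regression and SVM models; only the window layout (validation pinned before test, training window grown backwards) follows the abstract.

```python
from collections import Counter

def fit_majority(labels):
    """Toy model: always predict the most common training label."""
    majority = Counter(labels).most_common(1)[0][0]
    return lambda _x: majority

def accuracy(model, xs, ys):
    return sum(model(x) == y for x, y in zip(xs, ys)) / len(ys)

def rwfv(xs, ys, val_size, test_size, train_sizes):
    """Pin the validation window right before the test window, grow the
    training window backwards in time, and keep the best-validating size."""
    split = len(xs) - test_size                    # test = last points
    test_x, test_y = xs[split:], ys[split:]
    val_x, val_y = xs[split - val_size:split], ys[split - val_size:split]
    best_n, best_acc = None, -1.0
    for n in train_sizes:                          # growing training windows
        lo = split - val_size - n
        model = fit_majority(ys[lo:split - val_size])
        acc = accuracy(model, val_x, val_y)
        if acc > best_acc:
            best_n, best_acc = n, acc
    # retrain on the best training window plus the validation window
    final = fit_majority(ys[split - val_size - best_n:split])
    return accuracy(final, test_x, test_y)

# regime change at index 40: a short recent window validates best
labels = [0] * 40 + [1] * 20
points = list(range(60))
print(rwfv(points, labels, val_size=5, test_size=5, train_sizes=[10, 30]))  # 1.0
```

The example makes the method's rationale visible: when recent behavior differs from older history, validation accuracy picks the shorter training window (10) over the longer, stale one (30).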

A Cross-Validation of Seismic Vulnerability Assessment Model: Application to Earthquake of 9.12 Gyeongju and 2017 Pohang (지진 취약성 평가 모델 교차검증: 경주(2016)와 포항(2017) 지진을 대상으로)

  • Han, Jihye;Kim, Jinsoo
    • Korean Journal of Remote Sensing
    • /
    • v.37 no.3
    • /
    • pp.649-655
    • /
    • 2021
  • This study aims to cross-validate the performance of the optimal seismic vulnerability assessment model from previous studies conducted in Gyeongju by applying it to another region. The test area was Pohang City, the site of the 2017 Pohang Earthquake, and the dataset was built with the same influencing factors and earthquake-damaged buildings as in the previous studies. The validation dataset was built via random sampling, and the prediction accuracy was derived by applying it to a random forest (RF) model trained on Gyeongju. The success and prediction accuracies of the model in Gyeongju were 100% and 94.9%, respectively, and when the Pohang validation dataset was applied, the prediction accuracy was 70.4%.

Performance Improvement of Cert-Validation of Certification based on FM Subcarrier Broadcasting (FM방식을 이용한 인증서 유효성 검증의 성능 향상)

  • 장홍종;이성은;이정현
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.12 no.3
    • /
    • pp.3-13
    • /
    • 2002
  • In a PKI, a certificate may be revoked because of private key disclosure, deprivation of qualification, or expiration of its validity period. A user therefore has to confirm whether the public key in a certificate is still valid. There are many methods for certificate validation, such as CRL, Delta-CRL, and OCSP. But these methods have problems: real-time processing of certificate validation causes traffic overload on the network and on the CRL server. In this paper we propose an improved certificate validation method based on FM subcarrier broadcasting, which solves the CRL data integrity problem caused by the delay between transmission and reception, as well as the traffic overload on the network and the CRL server under real-time management.