• Title/Summary/Keyword: 테스팅자료 (testing data)


The Comparative Software Cost Model of Considering Logarithmic Fault Detection Rate Based on Failure Observation Time (로그형 관측고장시간에 근거한 결함 발생률을 고려한 소프트웨어 비용 모형에 관한 비교 연구)

  • Kim, Kyung-Soo;Kim, Hee-Cheul
    • Journal of Digital Convergence / v.11 no.11 / pp.335-342 / 2013
  • This study examines a software reliability cost model that considers a logarithmic fault detection rate based on failure times observed during software product testing. The probability of introducing new faults is incorporated by extending the Goel-Okumoto model, which is widely used in reliability analysis, and a finite failure non-homogeneous Poisson process (NHPP) model is adopted to reflect faults introduced when the software is corrected or modified. To analyze the cost model under a time-dependent fault detection rate, the parameters are estimated by maximum likelihood from inter-failure time data. The results should help software developers identify the best release time.
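As an illustration of the model class this abstract describes, the following is a minimal sketch (not the authors' code) of fitting the base Goel-Okumoto finite failure NHPP, whose mean value function is m(t) = a(1 − e^(−bt)), to inter-failure time data by maximum likelihood; the paper's logarithmic variant replaces the constant detection rate b with a time-dependent one. The sample data and starting values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical inter-failure times (hours); real studies use observed test data.
inter_failure = np.array([5.2, 3.1, 8.4, 6.0, 11.3, 9.7, 14.2, 20.5])
t = np.cumsum(inter_failure)   # cumulative failure times
T = t[-1]                      # end of the observation period
n = len(t)

def neg_log_likelihood(params):
    """NHPP log-likelihood: sum(log lambda(t_i)) - m(T), with the
    Goel-Okumoto intensity lambda(t) = a*b*exp(-b*t)."""
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    log_intensity = np.log(a) + np.log(b) - b * t
    m_T = a * (1.0 - np.exp(-b * T))   # expected failures by time T
    return -(log_intensity.sum() - m_T)

res = minimize(neg_log_likelihood, x0=[1.5 * n, 0.01], method="Nelder-Mead")
a_hat, b_hat = res.x
print(f"a (expected total faults) = {a_hat:.2f}, b (detection rate) = {b_hat:.4f}")
```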

The Comparative Study of Software Optimal Release Time of Finite NHPP Model Considering Property of Nonlinear Intensity Function (비선형 강도함수 특성을 이용한 유한고장 NHPP모형에 근거한 소프트웨어 최적방출시기 비교 연구)

  • Kim, Kyung-Soo;Kim, Hee-Cheul
    • Journal of Digital Convergence / v.11 no.9 / pp.159-166 / 2013
  • This paper studies the decision problem of setting an optimal release policy after a software system has been tested in the development phase and is transferred to the user. Because faults may be introduced when the software is corrected or modified, a finite failure non-homogeneous Poisson process model is adopted, and release policies are proposed for the half-logistic lifetime distribution, which is used in reliability analysis because of its flexible shape and scale parameters. The paper discusses optimal release policies that minimize the total average cost of development and maintenance under the constraint of satisfying a software reliability requirement. In a numerical example, the parameters are estimated by maximum likelihood from failure time data and the optimal release time is computed. Used as prior information, the estimated release time should help reduce potential security damage.
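A common cost structure in this line of work charges c1 per fault fixed during testing, c2 (> c1) per fault fixed in the field, and c3 per unit of testing time, then releases at the time T that minimizes total cost. Below is a minimal sketch under assumed parameter values, using a finite failure NHPP whose mean value function is built from the half-logistic CDF as the abstract describes; the paper's exact cost function and estimated parameters may differ.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Assumed values for illustration; the paper estimates parameters by MLE.
a, sigma = 30.0, 100.0        # expected total faults; half-logistic scale
c1, c2, c3 = 1.0, 5.0, 0.05   # fix-in-test, fix-in-field, testing cost/hour
t_lc = 2000.0                 # software life-cycle length (hours)

def m(t):
    """Finite failure NHPP mean value function m(t) = a * F(t),
    where F is the half-logistic CDF."""
    return a * (1 - np.exp(-t / sigma)) / (1 + np.exp(-t / sigma))

def total_cost(T):
    # faults fixed in testing + faults fixed in operation + testing time cost
    return c1 * m(T) + c2 * (m(t_lc) - m(T)) + c3 * T

res = minimize_scalar(total_cost, bounds=(1.0, t_lc), method="bounded")
print(f"optimal release time ~ {res.x:.0f} hours, total cost {res.fun:.2f}")
```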

A Software Reliability Cost Model Based on the Shape Parameter of Lomax Distribution (Lomax 분포의 형상모수에 근거한 소프트웨어 신뢰성 비용모형에 관한 연구)

  • Yang, Tae-Jin
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.9 no.2 / pp.171-177 / 2016
  • Software reliability is an important issue in the software development process, and process improvement helps deliver a reliable software product. Infinite failure NHPP software reliability models in the literature exhibit constant, monotonically increasing, or monotonically decreasing failure occurrence rates per fault. This study examines a reliability cost model based on the shape parameter of the Lomax lifetime distribution, using data from the software product testing process, and presents a cost comparison for the Lomax reliability growth model, which is widely used in reliability analysis. The software failure process is modeled as an infinite failure non-homogeneous Poisson process, and the parameters are estimated by maximum likelihood. Since defects can scarcely be avoided when large software systems are changed or fixed, the optimal release time is taken to be the time that satisfies the reliability requirement while minimizing total cost. Comparable analyses of the release problem with other flexible lifetime distributions, such as the Kappa and exponential distributions, would also be worthwhile. The results should help software developers estimate software development cost.
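For intuition only: one standard way to build an infinite failure NHPP from a lifetime distribution F is m(t) = −ln(1 − F(t)); with the Lomax CDF F(t) = 1 − (1 + t/λ)^(−α) this gives m(t) = α·ln(1 + t/λ), so the shape parameter α directly scales the expected failure count. The sketch below uses assumed parameter values and is a plausible reading of the model class, not necessarily the paper's exact formulation.

```python
import numpy as np

# Assumed Lomax parameters for illustration; the paper estimates them by MLE.
alpha = 2.5   # shape parameter (the quantity the cost model focuses on)
lam = 50.0    # scale parameter

def m(t):
    """Mean value function m(t) = -ln(1 - F(t)) = alpha * ln(1 + t/lam)
    for an infinite failure NHPP built on the Lomax distribution."""
    return alpha * np.log1p(t / lam)

def intensity(t):
    """Failure intensity lambda(t) = m'(t) = alpha / (lam + t):
    monotonically decreasing, one of the behaviors the abstract mentions."""
    return alpha / (lam + t)

for t in (10, 100, 1000):
    print(f"t={t:4d}: expected failures {m(t):6.2f}, intensity {intensity(t):.5f}")
```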

Theory and Implementation of Dynamic Taint Analysis for Tracing Tainted Data of Programs (프로그램의 오염 정보 추적을 위한 동적 오염 분석의 이론 및 구현)

  • Lim, Hyun-Il
    • KIPS Transactions on Computer and Communication Systems / v.2 no.7 / pp.303-310 / 2013
  • As the role of software in computing environments grows, software security becomes a more important problem. Dynamic taint analysis is a technique for tracing and managing tainted data that originates from unreliable sources during the execution of a program. It can be applied to software security verification as well as to understanding software behavior, testing for unexpected errors, and debugging. Previous research focused on presenting the results of dynamic taint analysis and did not logically describe the propagation of tainted data or the analysis procedure, which made the procedure difficult to understand or to apply to other analyses. This paper describes the analysis procedure theoretically, shows logically how the propagation of tainted data can be traced, and presents a theoretical model for dynamic taint analysis. The correctness of the proposed model is verified by implementing an analyzer and showing that the model traces the propagation of tainted data. The model can be used to understand how data flows are analyzed in dynamic taint analysis and as base knowledge for designing and implementing methods that apply such analysis.
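To make the propagation rule concrete, here is a toy sketch of dynamic taint tracking in which every value carries the set of taint sources it depends on and operators propagate the union of their operands' taints. It illustrates the general technique only; the paper's formal model is richer.

```python
class Tainted:
    """A runtime value paired with the set of taint sources it depends on."""

    def __init__(self, value, taint=frozenset()):
        self.value = value
        self.taint = frozenset(taint)

    @staticmethod
    def lift(x):
        return x if isinstance(x, Tainted) else Tainted(x)

    def __add__(self, other):
        other = Tainted.lift(other)
        # Core propagation rule: the result's taint is the union
        # of the taints of its operands.
        return Tainted(self.value + other.value, self.taint | other.taint)

def source(value, label):
    """Mark data entering from an unreliable source with a taint label."""
    return Tainted(value, {label})

def sink(x, name):
    """At a security-sensitive sink, report which sources reach it."""
    if isinstance(x, Tainted) and x.taint:
        print(f"sink '{name}' receives data tainted by: {sorted(x.taint)}")

user_input = source(7, "network")   # untrusted input
constant = Tainted(3)               # untainted program data
result = user_input + constant      # taint propagates through '+'
sink(result, "db_query")            # -> reports ['network']
```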

The Study for Performance Analysis of Software Reliability Model using Fault Detection Rate based on Logarithmic and Exponential Type (로그 및 지수형 결함 발생률에 따른 소프트웨어 신뢰성 모형에 관한 신뢰도 성능분석 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.9 no.3 / pp.306-311 / 2016
  • Software reliability is an important issue in the software development process. Infinite failure NHPP software reliability models in the literature exhibit constant, monotonically increasing, or monotonically decreasing failure occurrence rates per fault. This paper studies software reliability models that consider logarithmic and exponential fault detection rates based on failure times observed during software product testing. The probability of introducing new faults is incorporated by extending the Goel-Okumoto model, which is widely used in reliability analysis, and a finite failure non-homogeneous Poisson process model is adopted to reflect faults introduced when the software is corrected or modified. To analyze the models under a time-dependent fault detection rate, the parameters are estimated by maximum likelihood from inter-failure time data. The logarithmic and exponential fault detection models achieve a coefficient of determination of 80% or more, confirming that they are efficient in terms of reliability and can be used as alternatives to conventional models in this field. These results should help software developers identify failure modes through prior knowledge of the software's lifetime distribution.
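Since the abstract judges model adequacy by the coefficient of determination, the following sketch shows one way to compute R² between observed cumulative failure counts and a fitted mean value function. A plain Goel-Okumoto fit with assumed parameter estimates stands in for the paper's logarithmic and exponential variants; the data are hypothetical.

```python
import numpy as np

def r_squared(observed, fitted):
    """Coefficient of determination between observed cumulative failure
    counts and the counts predicted by a fitted mean value function."""
    observed = np.asarray(observed, dtype=float)
    fitted = np.asarray(fitted, dtype=float)
    ss_res = np.sum((observed - fitted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical data: the i-th failure is observed at time t[i-1], so the
# observed cumulative count there is i.
t = np.array([5.2, 8.3, 16.7, 22.7, 34.0, 43.7, 57.9, 78.4])
observed = np.arange(1, len(t) + 1)
a_hat, b_hat = 12.0, 0.02                    # assumed MLE estimates
fitted = a_hat * (1 - np.exp(-b_hat * t))    # Goel-Okumoto m(t)
print(f"R^2 = {r_squared(observed, fitted):.3f}")
```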

A Study on Risk Parity Asset Allocation Model with XGBoost (XGBoost를 활용한 리스크패리티 자산배분 모형에 관한 연구)

  • Kim, Younghoon;Choi, HeungSik;Kim, SunWoong
    • Journal of Intelligence and Information Systems / v.26 no.1 / pp.135-149 / 2020
  • Artificial intelligence is changing the world, and the financial market is no exception. Robo-advisors, which make automated investment decisions with artificial intelligence algorithms, are being actively developed to compensate for the weaknesses of traditional asset allocation methods and to replace the parts those methods handle poorly. They are used with asset allocation models such as the mean-variance model, the Black-Litterman model, and the risk parity model. The risk parity model is a typical risk-based asset allocation model focused on the volatility of assets: it avoids investment risk structurally, offers stability in the management of large funds, and has been widely used in finance. XGBoost is an optimized parallel tree-boosting model designed to be highly efficient and flexible; it handles billions of examples in limited-memory environments, learns much faster than traditional boosting methods, and is frequently used in many fields of data analysis.
    This study proposes a new asset allocation model that combines the risk parity model with the XGBoost machine learning model. The model uses XGBoost to predict the risk of each asset and applies the predicted risk to covariance estimation. Because an optimized asset allocation model estimates investment weights from historical data, estimation errors arise between the estimation period and the actual investment period, and these errors degrade portfolio performance. The study aims to improve the stability and performance of the model by predicting the volatility of the next investment period, thereby reducing the estimation errors and narrowing the gap between theory and practice.
    For the empirical test, Korean stock market price data covering 17 years from 2003 to 2019 were used, composed of the energy, finance, IT, industrial, material, telecommunication, utility, consumer, health care, and staples sectors. Predictions were accumulated with a moving-window method of 1,000 in-sample and 20 out-of-sample observations, producing a total of 154 rebalancing back-testing results, and portfolio performance was analyzed in terms of cumulative rate of return. Compared with the traditional risk parity model, the experiment improved both cumulative return and estimation error: the total cumulative return was 45.748%, about 5% higher than that of the risk parity model, and the estimation errors were reduced in 9 of the 10 industry sectors. Reducing the estimation errors increases the stability of the model and makes it easier to apply in practical investment.
    Many financial and asset allocation models are limited in practical investment by the fundamental question of whether the past characteristics of assets will persist in a changing financial market. This study not only takes advantage of traditional asset allocation models but also supplements their limitations and increases stability by predicting asset risks with a recent algorithm. Among the various studies on parameter estimation methods for reducing estimation errors in portfolio optimization, it suggests a new machine learning based approach, and it is meaningful in that it proposes an advanced artificial intelligence asset allocation model for fast-developing financial markets.
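The pipeline the abstract describes (predict per-asset volatility with XGBoost, feed the predictions into covariance estimation, then solve for equal risk contributions) can be sketched as follows. Everything here, from the synthetic returns to the feature choice and hyperparameters, is a hypothetical stand-in for the study's 1,000/20 moving-window setup on Korean sector data.

```python
import numpy as np
from scipy.optimize import minimize
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
n_assets, n_obs = 4, 500
returns = rng.normal(0, 0.01, size=(n_obs, n_assets))  # synthetic daily returns

# 1) Predict each asset's next-period volatility with XGBoost.
#    Features are just trailing 20-day volatilities in this sketch.
window = 20
vols = np.array([returns[i - window:i].std(axis=0)
                 for i in range(window, n_obs)])
X, y = vols[:-1], vols[1:]                      # predict the next window's vol
models = [XGBRegressor(n_estimators=100, max_depth=3).fit(X, y[:, j])
          for j in range(n_assets)]
pred_vol = np.array([mdl.predict(X[-1:])[0] for mdl in models])

# 2) Rescale the sample correlation so the diagonal risk matches predictions.
corr = np.corrcoef(returns, rowvar=False)
cov = corr * np.outer(pred_vol, pred_vol)

# 3) Solve for risk parity weights: equal risk contribution per asset.
def risk_parity_objective(w):
    port_var = w @ cov @ w
    rc = w * (cov @ w)                  # risk contribution of each asset
    return np.sum((rc - port_var / n_assets) ** 2)

cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
bounds = [(0.0, 1.0)] * n_assets
res = minimize(risk_parity_objective, np.full(n_assets, 1 / n_assets),
               bounds=bounds, constraints=cons)
print("risk parity weights:", np.round(res.x, 3))
```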