• Title/Summary/Keyword: Poisson process

Search Results: 18

The Study for NHPP Software Reliability Growth Model of Percentile Change-point (백분위수 변화점을 고려한 NHPP 소프트웨어 신뢰성장모형에 관한 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • Convergence Security Journal / v.8 no.4 / pp.115-120 / 2008
  • Accurate predictions of software release times and estimation of the reliability and availability of a software product require quantification of a critical element of the software testing process: the change-point problem. In this paper, the exponential (Goel-Okumoto) model is reviewed and the percentile change-point problem is proposed, which makes an efficient application for software reliability. The algorithm to estimate the parameters uses the maximum likelihood estimator and the bisection method, and model selection based on the SSE statistic was employed for the sake of an efficient model. Using NTDS data, a numerical example of the percentile change-point problem is presented.

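The estimation procedure the abstract describes for the Goel-Okumoto model, maximum likelihood with a bisection search, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the mean value function $m(t) = a(1 - e^{-bt})$ is the standard Goel-Okumoto form, and the bisection bracket is an assumed choice.

```python
import math

def go_mle(times, T):
    """Estimate Goel-Okumoto parameters (a, b) by maximum likelihood.
    m(t) = a * (1 - exp(-b*t)); b is found by bisection on the profile
    likelihood equation, then a follows in closed form."""
    n = len(times)
    s = sum(times)

    def g(b):
        # d(logL)/db after substituting a = n / (1 - exp(-b*T));
        # g(0+) ~ n*T - sum(t_i) > 0 and g -> -sum(t_i) < 0 as b grows,
        # so a root exists in the bracket below.
        e = math.exp(-b * T)
        return n / b - n * T * e / (1 - e) - s

    lo, hi = 1e-8, 10.0  # assumed search bracket for b
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    b = 0.5 * (lo + hi)
    a = n / (1 - math.exp(-b * T))
    return a, b
```

The failure-time data here would come from a set such as NTDS; the sample values below are invented for illustration only.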

Assessing Infinite Failure Software Reliability Model Using SPC (Statistical Process Control) (통계적 공정관리(SPC)를 이용한 무한고장 소프트웨어 신뢰성 모형에 대한 접근방법 연구)

  • Kim, Hee Cheul;Shin, Hyun Cheul
    • Convergence Security Journal / v.12 no.6 / pp.85-92 / 2012
  • There are many software reliability models that are based on the times of occurrence of errors in the debugging of software. It is shown that it is possible to do asymptotic likelihood inference for software reliability models based on the infinite failure model and non-homogeneous Poisson processes (NHPP). For someone making a decision about when to market software, the conditional failure rate is an important variable. Finite failure models are used in a wide variety of practical situations; their use in characterization problems, detection of outliers, linear estimation, study of system reliability, life-testing, survival analysis, data compression, and many other fields can be seen in many studies. Statistical process control (SPC) can monitor the forecasting of software failure and thereby contribute significantly to the improvement of software reliability. Control charts are widely used for software process control in the software industry. In this paper, we propose a control mechanism based on NHPP using the mean value functions of the log-Poisson, log-linear, and Pareto distributions.
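A minimal sketch of the kind of control mechanism the abstract describes, using the log-Poisson (Musa-Okumoto) infinite-failure mean value function with Shewhart-style 3-sigma limits. The Poisson variance assumption Var[N(t)] = m(t) and the helper names are this sketch's assumptions, not the paper's notation.

```python
import math

def musa_okumoto_m(t, lam0, theta):
    """Mean value function of the log-Poisson (Musa-Okumoto)
    infinite-failure NHPP: m(t) = ln(1 + lam0*theta*t) / theta."""
    return math.log(1.0 + lam0 * theta * t) / theta

def control_limits(t, lam0, theta, k=3.0):
    """Shewhart-style limits around the expected cumulative failure
    count, using the Poisson variance Var[N(t)] = m(t).  An observed
    count outside (lcl, ucl) would signal an out-of-control process."""
    m = musa_okumoto_m(t, lam0, theta)
    sd = math.sqrt(m)
    return max(0.0, m - k * sd), m, m + k * sd
```

The log-linear and Pareto mean value functions named in the abstract could be substituted for `musa_okumoto_m` without changing the charting logic.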

The Study of NHPP Software Reliability Model from the Perspective of Learning Effects (학습 효과 기법을 이용한 NHPP 소프트웨어 신뢰도 모형에 관한 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • Convergence Security Journal / v.11 no.1 / pp.25-32 / 2011
  • In this study, the learning effects that testing managers, test practices, and test tools bring to bear during the testing of software products are studied from the perspective of NHPP software reliability models. The Weibull distribution was applied, based on the finite failure NHPP. The influencing factors for software error detection are considered: errors found automatically, which are known in advance, and errors found precisely through the learning factor of the testing manager's prior experience; models reflecting these factors are presented and compared. As a result, it could be confirmed that the model in which the learning factor exceeds the automatic error-detection factor is generally efficient. In this paper, a numerical example applying time-between-failures data is given, with parameter estimation by the maximum likelihood method; after checking the data through trend analysis, the efficient model was selected using the mean square error and $R_{sq}$.
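The model-selection statistics named at the end of the abstract, mean square error and $R_{sq}$, are standard goodness-of-fit measures between observed and fitted cumulative failure counts. A minimal illustrative implementation (not the authors' code):

```python
def mse_and_rsq(observed, predicted):
    """Model-selection statistics: mean squared error and the
    coefficient of determination (R_sq) between observed and fitted
    cumulative failure counts.  Lower MSE and higher R_sq indicate
    the more efficient model."""
    n = len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    mse = ss_res / n
    mean_o = sum(observed) / n
    ss_tot = sum((o - mean_o) ** 2 for o in observed)
    rsq = 1.0 - ss_res / ss_tot
    return mse, rsq
```

Candidate mean value functions (e.g. Weibull-based NHPP fits with different learning factors) would each be scored this way on the same data.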

Manufacture of Artificial Stone Using Waste Stone and Stone Powder Sludge (폐석 및 석분 슬러지를 활용한 인조석판재의 제조)

  • 손정수;김병규;김치권
    • Resources Recycling / v.4 no.1 / pp.4-11 / 1995
  • The amounts of waste stone and stone powder sludge generated in the quarries and processing plants of stone plates have increased with the development of the stone industry. The manufacturing process of artificial stone was studied to reduce the outflow of these wastes and to utilize them as raw materials for architecture, interior decoration, and art work. In order to compare the properties of artificial stone with those of natural building stone, the physical properties of artificial stone such as specific gravity, absorption ratio, elastic wave velocity, compressive strength, tensile strength, Shore hardness, elasticity, and Poisson's ratio were measured. From the measured data of physical properties, it was found that the physical properties of artificial stone were controlled by the homogeneous mixing ratio of constituents, the molding pressure, and the amount of binder. Also, from the thermo-gravimetric analysis, it was found that the artificial stone manufactured had good thermal stability up to $300^{\circ}C$. It was concluded that the optimum conditions for the manufacturing process of artificial stone were $200kg/\textrm{cm}^2$ of molding pressure and 12-15 weight % of binder.

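The abstract pairs elastic wave velocity with Poisson's ratio; the standard elastodynamic relation connecting them (not stated in the paper, but the usual way a dynamic Poisson's ratio is obtained from wave measurements) is sketched below.

```python
def poisson_ratio_from_velocities(vp, vs):
    """Dynamic Poisson's ratio from P-wave and S-wave velocities,
    via the standard isotropic-elasticity relation
    nu = (vp^2 - 2*vs^2) / (2*(vp^2 - vs^2)).
    Requires vp > vs * sqrt(2) for a positive ratio."""
    return (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))
```

For example, a vp/vs ratio of sqrt(3) corresponds to nu = 0.25, a typical value for rock-like materials.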

The Study for ENHPP Software Reliability Growth Model Based on Kappa(2) Coverage Function (Kappa(2) 커버리지 함수를 이용한 ENHPP 소프트웨어 신뢰성장모형에 관한 연구)

  • Kim, Hee-Cheul
    • Journal of the Korea Institute of Information and Communication Engineering / v.11 no.12 / pp.2311-2318 / 2007
  • Finite failure NHPP models presented in the literature exhibit either constant, monotonically increasing, or monotonically decreasing failure occurrence rates per fault. Accurate predictions of software release times and estimation of the reliability and availability of a software product require quantification of a critical element of the software testing process: test coverage. This model is called the enhanced non-homogeneous Poisson process (ENHPP). In this paper, the exponential coverage and S-shaped models are reviewed and the Kappa coverage model is proposed, which makes an efficient application for software reliability. The algorithm to estimate the parameters uses the maximum likelihood estimator and the bisection method, and model selection based on the SSE statistic and the Kolmogorov distance was employed for the sake of an efficient model. Numerical examples using a real data set were employed for the sake of proposing the Kappa coverage model. An analysis of failure data comparing the Kappa coverage model with the existing models (using arithmetic and Laplace trend tests, and bias tests) is presented.
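The defining structure of the ENHPP is a mean value function of the form m(t) = a * c(t), where c(t) is a test-coverage function rising from 0 to 1. A minimal sketch follows; since the abstract does not give the exact form of the Kappa(2) coverage function, the exponential coverage it also mentions stands in here as the example.

```python
import math

def enhpp_mean(t, a, coverage):
    """ENHPP mean value function m(t) = a * c(t), where a is the
    expected total number of faults and c(t) is a coverage function
    with c(0) = 0 and c(t) -> 1 as t -> infinity."""
    return a * coverage(t)

def exp_coverage(b):
    """Exponential coverage c(t) = 1 - exp(-b*t); used here as a
    stand-in for the paper's Kappa(2) coverage, whose exact form is
    not given in the abstract."""
    return lambda t: 1.0 - math.exp(-b * t)
```

Swapping in a different `coverage` callable (S-shaped, Kappa, etc.) changes the model without touching `enhpp_mean`.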

An Image Composition Technique using Water-Wave Image Analysis (물결영상 분석을 통한 이미지 합성기법에 관한 연구)

  • Li, Xianji;Kim, Jung-A;Ming, Shi-Hwa;Kim, Dong-Ho
    • Journal of the Korea Society of Computer and Information / v.13 no.1 / pp.193-202 / 2008
  • In this study, we composite a source image and a target image when the target image contains a water surface such as a lake or the sea. The water surface differs from other common environments: on the water surface, an object must be reflected or refracted, and it is sometimes deformed by the waves. In order to composite the object in the source image onto the water image, we analyze the water surface of the target image and let the object be synthesized realistically based on the waves. Our compositing process consists of three steps. First, we use a shape-from-shading technique to extract the normal vectors of the water surface in the target image. Next, the source image is deformed according to the normal vector map. Finally, we composite the deformed object onto the target image.

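The second step above, deforming the source image by the recovered normal map, can be approximated with a simple displacement-map warp. This sketch is an assumption about one plausible implementation, not the paper's method; `strength` and the data layout are invented for illustration.

```python
def deform_by_normals(src, normals, strength=3.0):
    """Warp the source image by displacing each pixel along the
    tangential (x, y) components of the water-surface normal,
    crudely approximating refraction by the wave.  `src` is a 2-D
    list of pixel values and `normals` a matching 2-D list of
    (nx, ny, nz) unit vectors."""
    h, w = len(src), len(src[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            nx, ny, _ = normals[y][x]
            # sample from the displaced location, clamped to bounds
            sx = min(w - 1, max(0, int(round(x + strength * nx))))
            sy = min(h - 1, max(0, int(round(y + strength * ny))))
            out[y][x] = src[sy][sx]
    return out
```

A perfectly flat surface (all normals pointing straight up) leaves the image unchanged, which is a useful sanity check.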

A Construction and Operation Analysis of Group Management Network about Control Devices based on CIM Level 3 (CIM 계층 3에서 제어 기기들의 그룹 관리 네트워크 구축과 운영 해석)

  • 김정호
    • The Journal of Society for e-Business Studies / v.4 no.1 / pp.87-101 / 1999
  • To operate the automatic devices of a manufacturing process more effectively and to meet the need for resource sharing, network technology is applied to the control devices located in a common manufacturing zone, which are operated by connecting them. In this paper, the functional standard of the network layers is set as the physical and data link layers of IEEE 802.2 and 802.4, the VMD application layer, and the ISO CIM reference model. They are then divided into a minimized architecture and designed as group objects, which perform group management, and service objects, which organize and operate the group. For the stability of this network, this paper measures the variation of data packet length and node number and analyzes the resulting waiting time of the network operation. For the method of analysis, the non-exhaustive service method is selected, and the arrival rates of data packets at each node are assumed to form a Poisson distribution. The queueing model is then set as M/G/1, and the analysis equation for the waiting time is derived. In the performance evaluation, when the length of the data packet varies from 10 bytes to 100 bytes in the operation of the group management network, the variation of the waiting time is less than 10 msec; since the waiting time is less than 10 msec, the response time is fast enough. Furthermore, evaluating the real-time processing of the group management network shows that if the number of nodes is less than 40 and the average arrival rate is less than 40 packets/sec, it can perform stable operation even taking into account overheads such as software delay time, indicated packet service, and transmission safety margin.

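The mean waiting time of an M/G/1 queue with Poisson arrivals, the model the abstract sets up, is given by the Pollaczek-Khinchine formula. A minimal sketch (the derived equation in the paper may include the overhead terms it mentions; this is only the textbook core):

```python
def mg1_wait(lam, es, es2):
    """Mean queueing delay in an M/G/1 queue via Pollaczek-Khinchine:
    Wq = lam * E[S^2] / (2 * (1 - rho)), with utilization
    rho = lam * E[S].  lam is the Poisson arrival rate, es = E[S]
    and es2 = E[S^2] are moments of the service time."""
    rho = lam * es
    if rho >= 1.0:
        raise ValueError("unstable queue: rho >= 1")
    return lam * es2 / (2.0 * (1.0 - rho))
```

As a sanity check, with exponential service (E[S^2] = 2*E[S]^2) this reduces to the familiar M/M/1 result Wq = rho / (mu - lam).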

Study on the Methodology of the Microbial Risk Assessment in Food (식품중 미생물 위해성평가 방법론 연구)

  • 이효민;최시내;윤은경;한지연;김창민;김길생
    • Journal of Food Hygiene and Safety / v.14 no.4 / pp.319-326 / 1999
  • Recently, concern has been rising continuously about the health risks induced by microorganisms in food such as Escherichia coli O157:H7 and Listeria monocytogenes. Various organizations and regulatory agencies, including the U.S. FDA, USDA, and FAO/WHO, are preparing methodologies to apply microbial quantitative risk assessment to risk-based food safety programs. Microbial risks are primarily the result of a single exposure, and their health impacts are immediate and serious; therefore, the methodology of risk assessment differs from that of chemical risk assessment. Microbial quantitative risk assessment consists of four steps: hazard identification, exposure assessment, dose-response assessment, and risk characterization. Hazard identification is accomplished by observing and defining the types of adverse health effects in humans associated with exposure to foodborne agents. Epidemiological evidence which links the various diseases with the particular exposure route is an important component of this identification. Exposure assessment includes the quantification of microbial exposure regarding the dynamics of microbial growth in food processing, transport, packaging, and specific time-temperature conditions at various points from animal production to consumption. Dose-response assessment is the process of characterizing the dose-response correlation between microbial exposure and disease incidence. Unlike chemical carcinogens, the dose-response assessment for microbial pathogens has not focused on animal models for extrapolation to humans. Risk characterization links the exposure assessment and dose-response assessment and involves uncertainty analysis. The methodology of microbial dose-response assessment is classified into nonthreshold and threshold approaches. The nonthreshold models assume that a single organism is capable of producing an infection if it arrives at an appropriate site and that organisms act independently. Recently, the Exponential, Beta-Poisson, Gompertz, and Gamma-Weibull models are used as nonthreshold models, and the Log-normal and Log-logistic models are used as threshold models. The threshold approach assumes that a toxic effect is produced by the interaction of organisms. In this study, the detailed process, including risk values obtained from model parameters and microbial exposure doses, was reviewed. This study also suggested a model application methodology in the field of exposure assessment using assumed food microbial data (NaCl, water activity, temperature, pH, etc.) and the commercially used Food MicroModel. We recognized that human volunteer data from healthy subjects are preferred over epidemiological data for obtaining exact dose-response data, but foreign agencies are studying the characterization of the correlation between humans and animals. For the comparison of differences in population sensitivity, domestic studies such as the establishment of dose-response data for Korean volunteers for each microbe and microbial exposure assessment in food must be executed.

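Two of the nonthreshold dose-response models named in the abstract have standard closed forms: the Exponential model (each ingested organism independently causes infection with probability r) and the approximate Beta-Poisson model. A minimal sketch; the parameter values in any usage would be pathogen-specific and are not taken from the paper.

```python
import math

def exponential_dose_response(dose, r):
    """Nonthreshold exponential model: P(infection) = 1 - exp(-r*dose).
    Follows from assuming each organism independently causes infection
    with per-organism probability r."""
    return 1.0 - math.exp(-r * dose)

def beta_poisson_dose_response(dose, alpha, beta):
    """Approximate Beta-Poisson model, the other common nonthreshold
    form: P(infection) = 1 - (1 + dose/beta) ** (-alpha).  It arises
    from letting r vary across hosts according to a Beta distribution."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)
```

Both satisfy the nonthreshold assumption stated in the abstract: the infection probability is already nonzero at arbitrarily small doses.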