• Title/Summary/Keyword: 포아송 과정 (Poisson process)

Search results: 141 (the first 10 are listed below)

Software Reliability Growth Models considering an Imperfect Debugging environments (불완전 디버깅 환경을 고려한 소프트웨어 신뢰도 성장모델)

  • 이재기;이규욱;김창봉;남상식
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.29 no.6A
    • /
    • pp.589-599
    • /
    • 2004
  • Most models for the quantitative evaluation of software reliability assume a perfect debugging environment in which every detected fault is corrected completely. In practice, complete correction is often impossible and debugging work itself can introduce new faults. This paper proposes a software reliability growth model for an imperfect debugging environment that accounts for the possibility of newly introduced faults, and describes the fault-occurrence behavior with a non-homogeneous Poisson process (NHPP) covering both faults arising under operational use and faults introduced at random before testing. Effective quantitative measures for software reliability evaluation are derived, the model is applied to actual failure data, and its goodness of fit is compared with that of existing models.
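
An NHPP of this kind has a mean value function that grows toward a finite fault total while corrections may introduce new faults. As a rough illustration only (not the paper's exact formulation), the sketch below uses the common Ohba/Chou-style imperfect-debugging extension of the Goel-Okumoto model, in which each correction introduces a new fault with probability alpha; the parameter values a, b and alpha are illustrative assumptions.

```python
# A minimal sketch of an imperfect-debugging NHPP mean value function,
# assuming an Ohba/Chou-style extension of the Goel-Okumoto model
# (the paper's exact formulation may differ; a, b, alpha below are
# illustrative values, not estimates from the paper's data).
import numpy as np

def mean_value(t, a=100.0, b=0.05, alpha=0.1):
    """Expected cumulative faults by time t when each correction
    introduces a new fault with probability alpha."""
    return a / (1.0 - alpha) * (1.0 - np.exp(-b * (1.0 - alpha) * t))

def reliability(x, t, **params):
    """P(no failure in (t, t+x]) for an NHPP: exp(-(m(t+x) - m(t)))."""
    return np.exp(-(mean_value(t + x, **params) - mean_value(t, **params)))

if __name__ == "__main__":
    for t in (10, 50, 100):
        print(f"t={t:4d}  m(t)={mean_value(t):7.2f}  R(5|t)={reliability(5, t):.4f}")
```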

Extreme Quantile Estimation of Losses in KRW/USD Exchange Rate (원/달러 환율 투자 손실률에 대한 극단분위수 추정)

  • Yun, Seok-Hoon
    • Communications for Statistical Applications and Methods
    • /
    • v.16 no.5
    • /
    • pp.803-812
    • /
    • 2009
  • The application of extreme value theory to financial data is a fairly recent innovation. The classical annual maximum method fits the generalized extreme value distribution to the annual maxima of a data series. An alternative modern method, the so-called threshold method, fits the generalized Pareto distribution to the excesses over a high threshold. A more substantial variant takes the point-process viewpoint of high-level exceedances: the exceedance times and excess values over a high threshold are viewed as a two-dimensional point process whose limiting form is a non-homogeneous Poisson process. In this paper, we apply the two-dimensional non-homogeneous Poisson process model to daily losses, i.e., daily negative log-returns, in the KRW/USD exchange rate series collected from January 4th, 1982 to December 31st, 2008. The main question is how to estimate extreme quantiles of losses such as the 10-year or 50-year return level.
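
The return-level question can be illustrated with the equivalent threshold (peaks-over-threshold) route: fit a generalized Pareto distribution to the excesses over a high threshold and invert the tail fit. The sketch below is a generic illustration of that idea on synthetic heavy-tailed losses, not a reproduction of the paper's two-dimensional point-process estimates; the threshold, the 250-observations-per-year convention and the simulated data are assumptions.

```python
# A minimal sketch of the threshold (peaks-over-threshold) route to an
# N-year return level. `losses` is assumed to be a NumPy array of daily
# negative log-returns; all numbers below are illustrative.
import numpy as np
from scipy.stats import genpareto

def return_level(losses, threshold, years, obs_per_year=250):
    """m-year return level from a GPD fit to excesses over `threshold`."""
    excesses = losses[losses > threshold] - threshold
    xi, _, sigma = genpareto.fit(excesses, floc=0)   # shape, loc, scale
    zeta = len(excesses) / len(losses)               # P(X > threshold)
    m = years * obs_per_year                         # return period in observations
    if abs(xi) < 1e-8:                               # exponential-tail limit
        return threshold + sigma * np.log(m * zeta)
    return threshold + sigma / xi * ((m * zeta) ** xi - 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    losses = rng.standard_t(df=4, size=6700) * 0.005  # synthetic heavy-tailed losses
    print("10-year level:", return_level(losses, threshold=0.02, years=10))
    print("50-year level:", return_level(losses, threshold=0.02, years=50))
```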

An Improvement of the Approximation of the Ruin Probability in a Risk Process (보험 상품 파산 확률 근사 방법의 개선 연구)

  • Lee, Hye-Sun;Choi, Seung-Kyoung;Lee, Eui-Yong
    • The Korean Journal of Applied Statistics
    • /
    • v.22 no.5
    • /
    • pp.937-942
    • /
    • 2009
  • In this paper, a continuous-time risk process in an insurance business is considered, where the premium rate is constant and the claim process forms a compound Poisson process. A ruin occurs if the surplus of the risk process becomes negative. It is practically impossible to calculate the ruin probability analytically, because its theoretical formula contains recursive convolutions and an infinite sum. Hence, many authors have suggested approximation formulas for the ruin probability. We introduce a new approximation formula that extends the well-known De Vylder and exponential approximations. We compare our formula with the existing ones and show numerically that it gives values closer to the true ruin probability in most cases.
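
For reference, the classical De Vylder approximation that the paper extends can be written down directly: the risk process is replaced by one with exponentially distributed claims whose first three moments match, and the exponential-claims ruin probability is then evaluated in closed form. The sketch below implements only this classical formula with illustrative inputs; the paper's new approximation is not reproduced.

```python
# A minimal sketch of De Vylder's approximation to the ruin probability
# of a compound Poisson risk process. lam (claim rate), c (premium rate)
# and the raw claim-size moments m1, m2, m3 are illustrative inputs.
import math

def de_vylder_ruin(u, lam, c, m1, m2, m3):
    """Approximate P(ruin | initial surplus u) by matching three moments
    with a risk process having exponential claims."""
    beta = 3.0 * m2 / m3                        # claim rate of the fitted exponential
    lam_t = 9.0 * lam * m2**3 / (2.0 * m3**2)   # fitted Poisson claim rate
    c_t = c - lam * m1 + lam_t / beta           # fitted premium rate (matches the drift)
    rho = lam_t / (c_t * beta)                  # must be < 1 for ruin prob. to decay
    return rho * math.exp(-(beta - lam_t / c_t) * u)

if __name__ == "__main__":
    # Exponential(1) claims, Poisson rate 1, 10% safety loading: the
    # approximation is then exact, psi(u) = (1/1.1) * exp(-u/11).
    for u in (0.0, 5.0, 10.0):
        print(u, de_vylder_ruin(u, lam=1.0, c=1.1, m1=1.0, m2=2.0, m3=6.0))
```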

The Comparative Software Reliability Cost Model of Considering Shape Parameter (형상모수를 고려한 소프트웨어 신뢰성 비용 모형에 관한 비교 연구)

  • Kim, Kyung-Soo;Kim, Hee-Cheul
    • Journal of Digital Convergence
    • /
    • v.12 no.3
    • /
    • pp.219-226
    • /
    • 2014
  • In this study, a software reliability cost model with a shape parameter in the lifetime distribution is examined, based on data from software product testing. The Erlang and log-logistic models, both widely used in reliability analysis, are adopted as the shaped lifetime distributions. The software failure process is described by a finite-failure non-homogeneous Poisson process, and the parameters are estimated by maximum likelihood. In the comparison of the resulting cost models, the Erlang-based model yields a usable optimal release time whereas the log-logistic model does not, so the Erlang distribution proves more effective than the log-logistic distribution for this purpose. The results are expected to help software developers assess software development cost to some extent.
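
A finite-failure NHPP cost comparison of this kind can be sketched with the mean value function m(t) = a·F(t), where F is an Erlang or log-logistic CDF, and a simple textbook cost form C(T) = c1·m(T) + c2·(a − m(T)) + c3·T. The cost coefficients, distribution parameters and resulting release times below are illustrative, not the paper's estimates or conclusions.

```python
# A minimal sketch comparing release-time costs under a finite-failure NHPP
# with m(t) = a * F(t), F being an Erlang (gamma with integer shape) or a
# log-logistic CDF. All parameters and cost coefficients are illustrative.
import numpy as np
from scipy.stats import gamma, fisk
from scipy.optimize import minimize_scalar

def make_cost(cdf, a=100.0, c1=1.0, c2=5.0, c3=0.5):
    # c1: cost per fault fixed in testing, c2: per fault left for operation,
    # c3: testing cost per unit time.
    def cost(T):
        m = a * cdf(T)
        return c1 * m + c2 * (a - m) + c3 * T
    return cost

if __name__ == "__main__":
    erlang_cost = make_cost(lambda t: gamma.cdf(t, a=2, scale=10.0))   # shape 2, rate 0.1
    loglog_cost = make_cost(lambda t: fisk.cdf(t, c=2.0, scale=20.0))  # shape 2, scale 20
    for name, cost in (("Erlang", erlang_cost), ("log-logistic", loglog_cost)):
        res = minimize_scalar(cost, bounds=(0.0, 500.0), method="bounded")
        print(f"{name:12s} optimal release T* = {res.x:7.2f}, cost = {res.fun:8.2f}")
```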

The Comparative Software Cost Model of Considering Logarithmic Fault Detection Rate Based on Failure Observation Time (로그형 관측고장시간에 근거한 결함 발생률을 고려한 소프트웨어 비용 모형에 관한 비교 연구)

  • Kim, Kyung-Soo;Kim, Hee-Cheul
    • Journal of Digital Convergence
    • /
    • v.11 no.11
    • /
    • pp.335-342
    • /
    • 2013
  • In this study, a software reliability cost model with a logarithmic fault detection rate based on failure observation time is examined, using data from software product testing. The model builds on the Goel-Okumoto model, which is widely used in reliability analysis, with an added probability of introducing new faults when the software is corrected or modified, within a finite-failure non-homogeneous Poisson process framework. To analyze the cost model with the time-dependent fault detection rate, the parameters are estimated by maximum likelihood from inter-failure time data. The results are expected to help software developers identify the best release time to some extent.
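
Maximum-likelihood estimation for a finite-failure NHPP of the Goel-Okumoto family works from the log-likelihood of the failure epochs observed up to time T: the sum of log-intensities at the failure times minus the mean value at T. The sketch below fits the plain Goel-Okumoto model to synthetic data; the paper's logarithmic fault-detection-rate variant is not reproduced, and the simulated failure times are assumptions.

```python
# A minimal sketch of MLE for the Goel-Okumoto finite-failure NHPP,
# m(t) = a*(1 - exp(-b*t)), from failure times observed up to time T.
# The failure data are synthetic (simulated by thinning).
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, times, T):
    log_a, log_b = params
    a, b = np.exp(log_a), np.exp(log_b)
    n = len(times)
    # sum of log intensities at failure epochs minus the mean value at T
    return -(n * np.log(a * b) - b * np.sum(times) - a * (1.0 - np.exp(-b * T)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_a, true_b, T = 80.0, 0.02, 100.0
    # simulate the NHPP by thinning a homogeneous process with rate a*b
    cand = np.cumsum(rng.exponential(1.0 / (true_a * true_b), size=2000))
    cand = cand[cand < T]
    keep = rng.random(len(cand)) < np.exp(-true_b * cand)
    times = cand[keep]
    res = minimize(neg_log_likelihood, x0=[np.log(50.0), np.log(0.05)],
                   args=(times, T), method="Nelder-Mead")
    a_hat, b_hat = np.exp(res.x)
    print(f"a_hat = {a_hat:.1f}, b_hat = {b_hat:.4f}, n = {len(times)}")
```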

The Comparative Study of Software Optimal Release Time of Finite NHPP Model Considering Property of Nonlinear Intensity Function (비선형 강도함수 특성을 이용한 유한고장 NHPP모형에 근거한 소프트웨어 최적방출시기 비교 연구)

  • Kim, Kyung-Soo;Kim, Hee-Cheul
    • Journal of Digital Convergence
    • /
    • v.11 no.9
    • /
    • pp.159-166
    • /
    • 2013
  • This paper studies the decision problem of choosing optimal release policies after a software system has been tested in the development phase and before it is transferred to the user. When the software is corrected or modified, the failures are described by a finite-failure non-homogeneous Poisson process, and release policies are proposed for the half-logistic lifetime distribution, which is used in reliability analysis because of its flexible shape and scale parameters. Optimal release policies are discussed that minimize the total average cost of development and maintenance subject to a software reliability requirement. In a numerical example, the parameters are estimated by maximum likelihood from failure time data and the optimal release time is estimated. Used as prior information, the software release time should help reduce potential security damage.
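
A release policy of this kind is commonly implemented by releasing at the later of the cost-minimizing time and the earliest time that meets the reliability requirement R(x | T) ≥ R0. The sketch below does this for a finite-failure NHPP with a half-logistic mean value function; the cost coefficients, mission time, required reliability and distribution parameters are illustrative assumptions, not the paper's values.

```python
# A minimal sketch of a cost-plus-reliability release policy for a
# finite-failure NHPP with half-logistic mean value function
# m(t) = A*(1 - exp(-t/S))/(1 + exp(-t/S)). All values are illustrative.
import numpy as np
from scipy.optimize import minimize_scalar

A, S = 100.0, 30.0                     # expected total faults, scale parameter
C1, C2, C3 = 1.0, 5.0, 0.4             # testing fix / field fix / testing-time costs
X, R0 = 10.0, 0.95                     # mission time and required reliability

def m(t):
    return A * (1.0 - np.exp(-t / S)) / (1.0 + np.exp(-t / S))

def cost(T):
    return C1 * m(T) + C2 * (A - m(T)) + C3 * T

def reliability(T):
    return np.exp(-(m(T + X) - m(T)))

if __name__ == "__main__":
    t_cost = minimize_scalar(cost, bounds=(0.0, 1000.0), method="bounded").x
    grid = np.linspace(0.0, 1000.0, 100001)
    t_rel = grid[np.argmax(reliability(grid) >= R0)]   # earliest time meeting R0
    print(f"cost-optimal T = {t_cost:.1f}, reliability-constrained T = {t_rel:.1f}")
    print(f"release at T* = {max(t_cost, t_rel):.1f}")
```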

Decomposition based on Object of Convex Shapes Using Poisson Equation (포아송 방정식을 이용한 컨벡스 모양의 형태 기반 분할)

  • Kim, Seon-Jong;Kim, Joo-Man
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.14 no.5
    • /
    • pp.137-144
    • /
    • 2014
  • This paper proposes a novel procedure that uses a combination of overlapping basic convex shapes to decompose a 2D silhouette image. A basic convex shape serves as a structuring element that gives a meaningful interpretation to 2D images. The Poisson equation is used to obtain the basic shapes for the whole image or for a partial region or segment of an image. A reconstruction procedure then combines the basic convex shapes to regenerate the original shape. The decomposition process consists of a merging stage and a filtering stage and is finalized by a compromising stage. The merging procedure is based on solving Poisson's equation for two regions satisfying the same symmetry conditions, which identifies equivalences between basic shapes that need to be merged. We implemented and tested the algorithm on 2D silhouette images. The test results show that the proposed algorithm leads to an efficient shape decomposition procedure that transforms any shape into simpler basic convex shapes.
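
The first step of such a decomposition, solving the Poisson equation on a silhouette, can be sketched with a simple Jacobi iteration on the pixel grid: ∇²U = −1 inside the shape with U = 0 outside. Only this step is illustrated below, on a synthetic disc-shaped mask; the merging, filtering and compromising stages of the paper are not reproduced.

```python
# A minimal sketch of solving the Poisson equation  laplacian(U) = -1  on a
# binary silhouette with U = 0 outside the shape, via Jacobi iterations.
# The mask is synthetic; this is only the first step of such a decomposition.
import numpy as np

def poisson_solve(mask, n_iter=2000):
    """mask: 2D boolean array, True inside the silhouette."""
    u = np.zeros(mask.shape, dtype=float)
    for _ in range(n_iter):
        # average of the four neighbours plus the constant source term
        nb = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
              np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(mask, 0.25 * (nb + 1.0), 0.0)   # Dirichlet: 0 outside
    return u

if __name__ == "__main__":
    yy, xx = np.mgrid[0:64, 0:64]
    mask = (xx - 32) ** 2 + (yy - 32) ** 2 < 20 ** 2   # a disc-shaped silhouette
    u = poisson_solve(mask)
    print("max U (roughly at the shape centre):", round(u.max(), 2),
          "at", np.unravel_index(np.argmax(u), u.shape))
```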

Noise Modeling for CR Images of High-strength Materials (고강도매질 CR 영상의 잡음 모델링)

  • Hwang, Jung-Won;Hwang, Jae-Ho
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.45 no.5
    • /
    • pp.95-102
    • /
    • 2008
  • This paper presents an approach for modeling noise in Computed Radiography (CR) images of high-strength materials, designed specifically for noise with statistical and nonlinear properties. CR images are degraded even before they are encoded by the computer process, and various types of noise contaminate the radiographic image even though they are only detected after digitization. Quantum noise, which is Poisson distributed, is a shot noise, but the photon distribution on the Image Plate (IP) of a CR system does not always follow a Poisson process; the statistical properties are relative and case-dependent because of the material characteristics. The usual assumptions of Poisson, binomial and Gaussian statistics are considered, and a nonlinear effect is also represented in the statistical noise model. This makes it possible to estimate the noise variance in regions from high to low intensity with an analytical model. The approach is tested on a database of steel-tube step-wedge CR images. The results support comparative parameter studies that measure noise coherence, distribution, signal-to-noise ratio (SNR) and nonlinear interpolation.
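
The intensity-dependent behaviour of quantum noise that motivates such a model can be checked with a small simulation: under a pure Poisson model the noise variance equals the mean signal level, so the SNR grows like the square root of the mean, and a departure from that line is the kind of mixed or nonlinear behaviour the paper models. The step-wedge counts below are synthetic, not the paper's CR data.

```python
# A minimal sketch of an intensity-dependent noise check: simulate quantum
# (Poisson) noise on a synthetic step-wedge signal and estimate the noise
# variance and SNR per intensity level. For Poisson noise, variance ~ mean.
import numpy as np

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    steps = np.array([50, 200, 800, 3200])      # mean photon counts per step
    clean = np.repeat(steps, 10000)             # synthetic step-wedge signal
    noisy = rng.poisson(clean).astype(float)    # quantum noise
    for s in steps:
        region = noisy[clean == s]
        print(f"mean={s:5d}  est. variance={region.var():9.1f}  "
              f"SNR={region.mean() / region.std():6.2f}")   # SNR ~ sqrt(mean)
```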

LMS-Wiener Model for Resources Prediction of Handoff Calls in Multimedia Wireless IP Networks (멀티미디어 무선 IP 망에서 핸드오프 호의 자원예측을 위한 LMS-위너 모델)

  • Lee, Jin-Yi;Lee, Kwang-Hyung
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.30 no.2A
    • /
    • pp.26-33
    • /
    • 2005
  • Exact prediction of the resource demands of future calls improves the utilization of limited resources in schemes that reserve resources for potential calls in wireless IP networks. In this paper, we propose an LMS-Wiener resource (bandwidth) prediction method for future handoff calls and compare it with an existing Wiener-based method in terms of prediction error through simulations. In the simulations, we assume that handoff call arrivals follow a non-Poisson process and that each handoff call has a non-exponentially distributed channel holding time in the cell, reflecting the fact that the handoff arrival pattern in wireless picocellular IP networks is non-Poisson over long periods of time. Simulation results show that the prediction error of the proposed method converges to a low value, whereas that of the existing method increases over time. We therefore conclude that the proposed method improves resource utilization by predicting the resource demands of future handoff calls more accurately than the existing method.
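
The LMS part of such a predictor can be sketched as a one-step-ahead adaptive filter on the bandwidth-demand series. A normalized-LMS update is used below for step-size stability (a substitution on my part); the combination with the Wiener model and the paper's traffic assumptions are not reproduced, and the demand path is synthetic.

```python
# A minimal sketch of a one-step-ahead adaptive predictor of bandwidth
# demand using a normalized-LMS weight update. The demand series is synthetic.
import numpy as np

def lms_predict(series, order=4, mu=0.5, eps=1e-6):
    """Predict series[n] from its previous `order` samples."""
    w = np.zeros(order)
    preds = np.zeros(len(series))
    for n in range(order, len(series)):
        x = series[n - order:n][::-1]       # most recent sample first
        preds[n] = w @ x                    # predicted demand at step n
        err = series[n] - preds[n]
        w += mu * err * x / (x @ x + eps)   # normalized LMS step
    return preds

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    demand = 10.0 + np.cumsum(rng.normal(0, 0.2, 500))   # a Wiener-like demand path
    preds = lms_predict(demand)
    mse = np.mean((demand[100:] - preds[100:]) ** 2)
    print(f"prediction MSE after adaptation: {mse:.3f}")
```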

Analysis of Cell Variation of ATM Transmission for the Poisson and MMPP Input Model in the TDMA Method (TDMA 방식에서 포아송 입력과 MMPP 입력 모델에 따른 ATM 전송의 셀 지연 변이 해석)

  • Kim, Jeong-Ho;Choe, Gyeong-Su
    • The Transactions of the Korea Information Processing Society
    • /
    • v.3 no.3
    • /
    • pp.512-522
    • /
    • 1996
  • To provide broadband ISDN services to users in scattered locations, the use of satellite communication networks is being seriously considered. To transmit ATM cells efficiently over satellite links, the TDMA method is effective; however, a method is needed to compensate for the cell delay variation (CDV) caused by the mismatch between TDMA and ATM. This paper optimizes the cell control time (Tc) for Poisson and Markov-modulated Poisson process (MMPP) traffic inputs by applying the cell delay variation characteristics of the time stamp method, which has the most advantages among CDV compensation methods. It also introduces a way of reducing cell clumping by adopting a discrete time stamp method, including an analysis and evaluation of the CDV distribution range required for ATM transmission quality. The experimental results show that the discrete time stamp method can keep the CDV distribution range within 1.2×Tc, reducing the overall cell delay variation.
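
The two input models compared in the paper can be illustrated by generating a Poisson stream and a two-state MMPP stream and comparing their interarrival variability. The TDMA time-stamp compensation itself is not reproduced, and all rates and state-holding times below are assumptions chosen only to show the burstier statistics of the MMPP input.

```python
# A minimal sketch of Poisson vs two-state MMPP cell-arrival generation,
# comparing interarrival coefficients of variation. All parameters are
# illustrative; the time-stamp CDV compensation is not modelled here.
import numpy as np

def poisson_arrivals(rate, horizon, rng):
    times = np.cumsum(rng.exponential(1.0 / rate, size=int(3 * rate * horizon) + 10))
    return times[times < horizon]

def mmpp2_arrivals(rates, mean_holding, horizon, rng):
    """Two-state MMPP: Poisson arrivals at rates[state] while the modulating
    chain stays in `state` for an Exp(mean_holding[state]) sojourn."""
    t, state, arrivals = 0.0, 0, []
    while t < horizon:
        stay = rng.exponential(mean_holding[state])
        seg = poisson_arrivals(rates[state], stay, rng) + t
        arrivals.extend(seg.tolist())
        t += stay
        state = 1 - state
    return np.array(arrivals)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    pois = poisson_arrivals(rate=1.0, horizon=10000.0, rng=rng)
    mmpp = mmpp2_arrivals(rates=(1.8, 0.2), mean_holding=(50.0, 50.0),
                          horizon=10000.0, rng=rng)
    for name, arr in (("Poisson", pois), ("MMPP", mmpp)):
        gaps = np.diff(arr)
        print(f"{name:8s} mean rate={len(arr)/10000.0:5.2f} "
              f"interarrival CV={gaps.std()/gaps.mean():5.2f}")
```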
