• Title/Summary/Keyword: Poisson process.


Order Based Performance Evaluation of a CONWIP System with Compound Poisson Demands (복합포아송 수요를 가지는 CONWIP 시스템에서 고객집단의 성능평가)

  • Park Chan-U;Lee Hyo-Seong
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 2004.10a
    • /
    • pp.8-12
    • /
    • 2004
  • In this study we consider a CONWIP system in which the processing times at each station follow a Coxian distribution and the demands for the finished products arrive according to a compound Poisson process. Demands that are not satisfied immediately are backordered according to the number of demands present at their arrival instants. For this system we develop an approximation method to calculate order-based performance measures such as the mean time to fulfill a customer order and the mean number of customer orders. For the analysis of the proposed CONWIP system, we model it as a closed queueing network with a synchronization station and analyze that network using a product-form approximation method. Numerical tests show that the approximation method provides fairly good estimates of the performance measures of interest.
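The compound Poisson demand stream described above can be sketched in a few lines. The batch-size distribution and all parameter values below are illustrative assumptions; this is not the paper's CONWIP model or its approximation method.

```python
import random

def compound_poisson_demand(rate, batch_mean, horizon, rng):
    """Simulate a compound Poisson demand stream: order epochs form a
    Poisson process with the given rate, and each order requests a
    random batch of units (the batch law here is an illustrative choice)."""
    t, orders = 0.0, []
    while True:
        t += rng.expovariate(rate)  # exponential inter-arrival times
        if t > horizon:
            break
        size = 1 + int(rng.expovariate(1.0 / batch_mean))  # batch size >= 1
        orders.append((t, size))
    return orders

rng = random.Random(42)
orders = compound_poisson_demand(rate=2.0, batch_mean=3.0, horizon=100.0, rng=rng)
total_units = sum(size for _, size in orders)  # total demanded units over the horizon
```

Each tuple is one customer order; the backordering and Coxian-service analysis of the paper would be layered on top of such a stream.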


Determining the Optimal Buffer Sizes in Poisson Driven 3-node Tandem Queues using (Max, +)-algebra ((Max, +)-대수를 이용한 3-노드 유한 버퍼 일렬대기행렬 망에서 최적 버퍼 크기 결정)

  • Seo, Dong-Won;Hwang, Seung-June
    • Korean Management Science Review
    • /
    • v.24 no.1
    • /
    • pp.25-34
    • /
    • 2007
  • In this study, we consider stationary waiting times in finite-buffer 3-node single-server queues in series with a Poisson arrival process and with either constant or non-overlapping service times. We assume that each node has a finite buffer except for the first node. Explicit expressions for the waiting times in all areas of the stochastic system were derived as functions of the finite buffer capacities. These explicit forms show that the system sojourn time does not depend on the finite buffer sizes, and they also allow one to compute and compare characteristics of the stationary waiting times at all areas under two blocking rules: communication and manufacturing blocking. The goal of this study is to apply these results to an optimization problem that determines the smallest buffer capacities satisfying predetermined probabilistic constraints on the stationary waiting times at all nodes. Numerical examples are also provided.
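The (max,+) flavor of the recursion can be illustrated for the ample-buffer case: departure times obey D_j(k) = max(D_{j-1}(k), D_j(k-1)) + s_j. The finite buffers and the two blocking rules analyzed in the paper are deliberately omitted here; constant service times follow the abstract, but the rates are assumed.

```python
import random

def tandem_departures(arrivals, service):
    """(max,+)-style recursion for departure times in a tandem queue with
    ample buffers: D[j][k] = max(D[j-1][k], D[j][k-1]) + s_j.
    A sketch only: the paper's finite-buffer blocking rules are not modelled."""
    n, num_nodes = len(arrivals), len(service)
    D = [list(arrivals)] + [[0.0] * n for _ in range(num_nodes)]
    for j in range(1, num_nodes + 1):
        prev = 0.0
        for k in range(n):
            prev = max(D[j - 1][k], prev) + service[j - 1]
            D[j][k] = prev
    return D

rng = random.Random(7)
arrivals, t = [], 0.0
for _ in range(1000):
    t += rng.expovariate(1.0)  # Poisson arrival process, rate 1
    arrivals.append(t)
D = tandem_departures(arrivals, service=[0.5, 0.5, 0.5])
sojourn = [D[3][k] - arrivals[k] for k in range(len(arrivals))]  # per-customer system time
```

Under blocking, extra max-terms coupling a node to its downstream buffer would enter the recursion; that coupling is exactly what the paper's explicit expressions capture.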

Analysis on the Ignition Characteristics of Pseudospark Discharge Using Hybrid Fluid-Particle (Monte Carlo) Method (혼성 유체-입자(몬테칼로)법을 이용한 유사스파크 방전의 기동 특성 해석)

  • 심재학;주홍진;강형부
    • Journal of the Korean Institute of Electrical and Electronic Material Engineers
    • /
    • v.11 no.7
    • /
    • pp.571-580
    • /
    • 1998
  • A numerical model that describes the ignition of a pseudospark discharge using a hybrid fluid-particle (Monte Carlo) method has been developed. The model consists of fluid equations for the transport of electrons and ions and Poisson's equation for the electric field. The fluid equations determine the spatiotemporal dependence of the charged-particle densities, and the ionization source term is computed using the Monte Carlo method. The model has been used to study the evolution of a discharge in argon at 0.5 torr with an applied voltage of 1 kV. The evolution of the discharge has been divided into four phases along the potential distribution: (1) Townsend discharge, (2) plasma formation, (3) onset of the hollow cathode effect, (4) plasma expansion. From the numerical results, the physical mechanisms that lead to the rapid rise in current associated with the onset of the pseudospark could be identified.
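The field-solver step of such a hybrid model, solving Poisson's equation for the potential given the charge density, can be sketched with a standard three-point finite-difference stencil. The 1-D grid, grid spacing, and boundary voltages below are illustrative assumptions; the fluid transport and Monte Carlo ionization parts are not included.

```python
import numpy as np

def solve_poisson_1d(rho, dx, v_left, v_right):
    """Solve d2V/dx2 = -rho/eps0 on a 1-D interior grid with Dirichlet
    boundary values, using the standard three-point stencil. This is only
    the field-solver step of a hybrid fluid-particle model."""
    eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
    n = len(rho)
    A = np.zeros((n, n))
    b = -rho * dx * dx / eps0
    for i in range(n):
        A[i, i] = -2.0
        if i > 0:
            A[i, i - 1] = 1.0
        if i < n - 1:
            A[i, i + 1] = 1.0
    b[0] -= v_left    # fold the known boundary potentials into the RHS
    b[-1] -= v_right
    return np.linalg.solve(A, b)

# Zero space charge across a 1 kV gap: the potential must come out linear.
v = solve_poisson_1d(np.zeros(99), dx=1e-4, v_left=0.0, v_right=1000.0)
```

In the full model, `rho` would be rebuilt from the fluid densities every time step and the resulting field fed back into the transport and Monte Carlo stages.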


A generalized regime-switching integer-valued GARCH(1, 1) model and its volatility forecasting

  • Lee, Jiyoung;Hwang, Eunju
    • Communications for Statistical Applications and Methods
    • /
    • v.25 no.1
    • /
    • pp.29-42
    • /
    • 2018
  • We combine the integer-valued GARCH(1, 1) model with a generalized regime-switching model to propose a dynamic count time series model. Our model adopts Markov chains with time-varying, dependent transition probabilities to model dynamic count time series, called the generalized regime-switching integer-valued GARCH(1, 1) (GRS-INGARCH(1, 1)) model. We derive a recursive formula for the conditional probability of the regime in the Markov chain given the past information, in terms of the transition probabilities of the Markov chain and the Poisson parameters of the INGARCH(1, 1) process. In addition, we study the forecasting of the Poisson parameter as well as the cumulative impulse response function of the model, which is a measure of the persistence of volatility. A Monte Carlo simulation is conducted to assess the performance of volatility forecasting, the behavior of the cumulative impulse response coefficients, and conditional maximum likelihood estimation; a real-data application is also given.
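A single-regime Poisson INGARCH(1,1) path, the building block of the GRS-INGARCH model, can be simulated directly from its defining recursion. The regime-switching layer is omitted, and the parameter values below are illustrative assumptions.

```python
import math
import random

def poisson_draw(lam, rng):
    # Knuth's multiplication method for sampling a Poisson variate
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_ingarch(omega, alpha, beta, n, rng):
    """Simulate X_t | past ~ Poisson(lambda_t) with the INGARCH(1,1)
    recursion lambda_t = omega + alpha*X_{t-1} + beta*lambda_{t-1}.
    Requires alpha + beta < 1 for stationarity."""
    lam = omega / (1.0 - alpha - beta)  # start at the stationary mean
    x = poisson_draw(lam, rng)
    counts = []
    for _ in range(n):
        lam = omega + alpha * x + beta * lam
        x = poisson_draw(lam, rng)
        counts.append(x)
    return counts

rng = random.Random(1)
counts = simulate_ingarch(omega=1.0, alpha=0.3, beta=0.4, n=2000, rng=rng)
mean_count = sum(counts) / len(counts)  # hovers near omega/(1-alpha-beta) = 10/3
```

The GRS extension would let (omega, alpha, beta) switch with a Markov-chain regime whose transition probabilities depend on past counts.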

A Study on the Characteristics of Software Reliability Model Using Exponential-Exponential Life Distribution (수명분포가 지수화-지수분포를 따르는 소프트웨어 신뢰모형 특성에 관한 연구)

  • Kim, Hee Cheul;Moon, Song Chul
    • Journal of Information Technology Applications and Management
    • /
    • v.27 no.3
    • /
    • pp.69-75
    • /
    • 2020
  • In this paper, we applied the shape parameter of the exponentiated exponential life distribution, which is widely used in the field of software reliability, and compared the reliability properties of software using a non-homogeneous Poisson process with finite failures. The mean value function takes a non-decreasing form. The larger the shape parameter, the smaller the estimation error of the predicted values relative to the true values, so the model can be regarded as efficient in terms of relative accuracy. Likewise, the larger the shape parameter, the larger the estimated coefficient of determination, so the model can also be regarded as efficient in terms of goodness-of-fit. The reliability function follows a non-increasing pattern, and the larger the shape parameter, the lower the reliability as the mission time elapses. Through this study, software operators can use the patterns of the mean square error, mean value function, and hazard function as a basic guideline for exploring software failures.
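The finite-failure NHPP ingredients mentioned above can be written out as m(t) = theta * F(t), with F the exponentiated exponential CDF. The parameter names `theta`, `lam`, and `shape` are this sketch's own labels, not the paper's notation.

```python
import math

def mean_value(t, theta, lam, shape):
    """Mean value function of a finite-failure NHPP whose per-fault life
    distribution is exponentiated exponential:
    m(t) = theta * (1 - exp(-lam*t))**shape.
    theta is the expected total number of faults."""
    return theta * (1.0 - math.exp(-lam * t)) ** shape

def intensity(t, theta, lam, shape, h=1e-6):
    # failure intensity as a numerical derivative of m(t)
    return (mean_value(t + h, theta, lam, shape)
            - mean_value(t, theta, lam, shape)) / h

# Expected faults found by t = 10 under illustrative parameters:
m10 = mean_value(10.0, theta=100.0, lam=0.1, shape=2.0)
```

Because F is a CDF, m(t) is non-decreasing and saturates at theta, matching the behavior the abstract describes.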

Analysis of Spatial Distribution of Droughts in Korea through Drought Severity-Duration-frequency Analysis (가뭄심도-지속기간-빈도해석을 통한 우리나라 가뭄의 공간분포 분석)

  • Kim Dae-Ha;Yoo Chul-Sang
    • Journal of Korea Water Resources Association
    • /
    • v.39 no.9 s.170
    • /
    • pp.745-754
    • /
    • 2006
  • This study adopted the Rectangular Pulses Poisson Process Model for drought severity-duration-frequency analysis to characterize the spatial pattern of drought over the Korean peninsula, using the rainfall data of 59 rain gauge stations. First of all, the drought severity in the southern part of the Korean peninsula was found to be generally high for any return period. This result is consistent for the cases both with and without considering the overlap probability of rectangular pulses, and it also holds for longer durations. Comparison with the observed drought frequency and maximum severity also showed that the results of this study are sufficiently reliable.
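A Rectangular Pulses Poisson Process can be sketched as Poisson pulse onsets, each carrying a random duration and intensity so that severity = duration * intensity. The exponential marginals and parameter values below are illustrative assumptions, not the distributions fitted in the study.

```python
import random

def simulate_pulses(rate, mean_dur, mean_int, horizon, rng):
    """Rectangular Pulses Poisson Process sketch: pulse onsets form a
    Poisson process; each pulse has an exponential duration and intensity,
    and its severity is the rectangle area duration * intensity."""
    t, pulses = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            break
        dur = rng.expovariate(1.0 / mean_dur)
        inten = rng.expovariate(1.0 / mean_int)
        pulses.append((t, dur, inten, dur * inten))  # onset, duration, intensity, severity
    return pulses

rng = random.Random(3)
pulses = simulate_pulses(rate=0.5, mean_dur=2.0, mean_int=1.5, horizon=500.0, rng=rng)
max_severity = max(sev for *_, sev in pulses)  # worst simulated drought event
```

A severity-duration-frequency analysis would then relate quantiles of such pulse severities to return periods, with or without accounting for overlapping pulses.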

Outage Probability Analysis of Macro Diversity Combining Based on Stochastic Geometry (매크로 다이버시티 결합의 확률 기하 이론 기반 Outage 확률 분석)

  • Zihan, Ewaldo;Choi, Kae-Won
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.9 no.2
    • /
    • pp.187-194
    • /
    • 2014
  • In this paper, we analyze the outage probability of macro diversity combining in cellular networks in consideration of the aggregate interference from other mobile stations (MSs). Unlike existing works analyzing the outage probability of macro diversity combining, we focus on the diversity gain attained by selecting a base station (BS) subject to relatively low aggregate interference. In our model, MSs are randomly located according to a Poisson point process. The outage probability is analyzed by approximating the multivariate distribution of the aggregate interferences at multiple BSs by a multivariate lognormal distribution.
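Sampling a homogeneous Poisson point process of MSs in a disk around a BS and summing their path-loss powers gives the aggregate interference such an analysis starts from. Unit transmit power, the r^-4 law, and the near-field distance cap below are assumptions, and the paper's multivariate lognormal approximation step is not reproduced.

```python
import math
import random

def poisson_draw(lam, rng):
    # Knuth's multiplication method for sampling a Poisson variate
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def ppp_interference(density, radius, path_loss_exp, rng):
    """Aggregate interference at a BS at the origin from MSs dropped as a
    homogeneous PPP in a disk: N ~ Poisson(density * area), points uniform
    in the disk, unit transmit power, r^-alpha path loss."""
    n = poisson_draw(density * math.pi * radius ** 2, rng)
    total = 0.0
    for _ in range(n):
        r = radius * math.sqrt(rng.random())      # sqrt gives uniform area density
        total += max(r, 1.0) ** (-path_loss_exp)  # cap at 1 m to avoid a singularity
    return total

rng = random.Random(5)
samples = [ppp_interference(density=1e-3, radius=100.0, path_loss_exp=4.0, rng=rng)
           for _ in range(200)]
```

The outage analysis would compare such interference samples, jointly across several candidate BSs, against an SIR threshold after macro diversity selection.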

Effects of hydrodynamics and coagulant doses on particle aggregation during a rapid mixing

  • Park, Sang-Min;Heo, Tae-Young;Park, Jun-Gyu;Jun, Hang-Bae
    • Environmental Engineering Research
    • /
    • v.21 no.4
    • /
    • pp.365-372
    • /
    • 2016
  • The effects of hydrodynamics and alum dose on particle growth were investigated by monitoring particle counts in a rapid mixing process. Experiments were performed to measure particle growth and breakup under various conditions. The rapid mixing scheme consisted of the following operating parameters: velocity gradient (G) ($200-300s^{-1}$), alum dose (10-50 mg/L) and mixing time (30-180 s). A Poisson regression model was applied to assess the effects of the doses and the velocity gradient together with mixing time. The mechanism of particle growth and breakup was elucidated. An increase in alum dose was found to accelerate the reduction in particle count. The particle count at a G value of $200s^{-1}$ decreased more rapidly than that at $300s^{-1}$. The growth and breakup of larger particles were observed more clearly at higher alum doses. Variations of particles due to the aggregation and breakup of micro-flocs in the rapid mixing step were interactively affected by G, mixing time and alum dose. Micro-flocculation played an important role in the rapid mixing process.
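A Poisson regression with log-linear mean E[y] = exp(X beta) can be fit with a short Newton-Raphson loop. The synthetic covariate below stands in for the G, dose, and mixing-time factors of the study; the data and coefficients are fabricated for illustration only.

```python
import numpy as np

def poisson_regression(X, y, iters=25):
    """Fit log-linear Poisson regression E[y] = exp(X @ beta) by
    Newton-Raphson on the log-likelihood: gradient X'(y - mu),
    Hessian X' diag(mu) X. A minimal sketch, not a full GLM routine."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        grad = X.T @ (y - mu)
        hess = X.T @ (X * mu[:, None])
        beta += np.linalg.solve(hess, grad)
    return beta

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one covariate
true_beta = np.array([0.5, -0.3])
y = rng.poisson(np.exp(X @ true_beta))  # synthetic counts from the model
beta_hat = poisson_regression(X, y)     # should land near true_beta
```

With real data, the count response would be the measured particle counts and the design matrix would encode G, alum dose, and mixing time, possibly with interactions.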

Mathematical Basis for Establishing Reasonable Objective Periods in Zero Accident Campaign (무재해 목표기간 재설정의 수리적 근거)

  • Lim, Hyeon-Kyo;Kim, Young-Jin;Chang, Seong-Rok
    • Journal of the Korean Society of Safety
    • /
    • v.25 no.4
    • /
    • pp.61-67
    • /
    • 2010
  • Though the "Zero Accident Campaign" is a desirable campaign for preventing industrial accidents and reducing victims, the number of enterprises taking part has been decreasing abruptly in recent years. One reason for this phenomenon may be the irrationality of the 'target accident-free time periods' established by the related organizations. This study was carried out to develop a new, rational scheme for the campaign. As a numerical basis, a Poisson process was introduced, and the problems induced by the current target periods were analyzed mathematically one by one. As a result, it was verified that the current target periods are uneven, since the probability that manufacturing plants achieve them differs from industry to industry. As a countermeasure, a new method was suggested in this research. Its first characteristic is that group classification should be based on the average accident rates of the past several years, and the second is an adjustment probability that makes the target achievement probability even across groups. A questionnaire survey on the suggested method was conducted; most manufacturing plants agreed with it, with such a high affirmative portion that the suggested method is expected to help promote the campaign again.
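The Poisson-process arithmetic behind the unevenness argument is short: a plant with accident rate lambda completes a zero-accident run of length T with probability exp(-lambda * T), so plants with different rates face very different odds for the same target period, and equalizing the odds means inverting this relation. The rates below are illustrative.

```python
import math

def achievement_probability(accident_rate, target_period):
    """Under a Poisson accident process, the chance of a zero-accident run
    of length target_period is P(N = 0) = exp(-rate * period)."""
    return math.exp(-accident_rate * target_period)

def equalized_target_period(accident_rate, target_prob):
    """Invert the relation above: the target period that gives every plant
    the same achievement probability, i.e. T = -ln(p) / rate."""
    return -math.log(target_prob) / accident_rate

# A plant averaging 2 accidents/year has about a 13.5% chance of a 1-year
# zero-accident run, while one averaging 0.5/year has about 60.7%:
p_high = achievement_probability(2.0, 1.0)
p_low = achievement_probability(0.5, 1.0)
```

Grouping plants by historical accident rate and assigning each group the period from `equalized_target_period` makes the achievement probability even, which is the spirit of the adjustment the study proposes.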

An Opportunity-based Age Replacement Policy with Warranty Analysed by Using TTT-Transforms

  • Iskandar, Bermawi P.;Klefsjo, Bengt;Sandoh, Hiroaki
    • International Journal of Reliability and Applications
    • /
    • v.1 no.1
    • /
    • pp.27-38
    • /
    • 2000
  • In a recent paper, Iskandar & Sandoh (1999) studied an opportunity-based age replacement policy for a system which has a warranty period (0, S]. When the system fails at age x $\leq$ S, a minimal repair is performed. If an opportunity occurs at age x, S $\leq$ x $\leq$ T, we take the opportunity with probability p to preventively replace the system, while we conduct a corrective replacement when it fails in (S, T). Finally, if its age reaches T, we perform a preventive replacement. Under this policy the design variable is T. For the case when opportunities occur according to a homogeneous Poisson process, the long-run average cost of this policy was formulated and studied analytically by Iskandar & Sandoh (1999). The same problem is here analysed by using a graphical technique based on scaled TTT-transforms. This technique gives, among other things, excellent possibilities for different types of sensitivity analysis. We also extend the discussion to the situation when we have to estimate T based on times to failure.
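One renewal cycle of the policy can be simulated under simplifying assumptions: opportunities arrive as a homogeneous Poisson process as in the paper, but the exponential post-warranty failure time and all parameter values are this sketch's own assumptions, and costs are not modelled.

```python
import random

def cycle_length(S, T, p, opp_rate, failure_rate, rng):
    """One renewal cycle of the opportunity-based policy, simplified:
    minimal repairs during the warranty (0, S] do not end the cycle;
    an opportunity after age S is taken with probability p; a failure in
    (S, T) triggers corrective replacement; age T forces preventive
    replacement. Exponential failure times are an illustrative assumption."""
    fail = S + rng.expovariate(failure_rate)  # first post-warranty failure age
    t = 0.0
    while True:
        t += rng.expovariate(opp_rate)        # next Poisson opportunity
        if t >= min(fail, T):
            return min(fail, T)               # corrective or scheduled replacement
        if t > S and rng.random() < p:
            return t                          # opportunity taken

rng = random.Random(11)
cycles = [cycle_length(S=1.0, T=5.0, p=0.5, opp_rate=1.0, failure_rate=0.2, rng=rng)
          for _ in range(2000)]
mean_cycle = sum(cycles) / len(cycles)  # every cycle ends in (S, T]
```

Averaging per-cycle cost over per-cycle length for a grid of T values would reproduce, by simulation, the long-run average cost curve that the paper studies analytically and via scaled TTT-transforms.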
