• Title/Summary/Keyword: Markov property


Sensitivity of Conditions for Lumping Finite Markov Chains

  • Suh, Moon-Taek
    • Journal of the military operations research society of Korea
    • /
    • v.11 no.1
    • /
    • pp.111-129
    • /
    • 1985
  • Markov chains with large transition probability matrices occur in many applications, such as manpower models. Under certain conditions the state space of a stationary discrete-parameter finite Markov chain may be partitioned into subsets, each of which may be treated as a single state of a smaller chain that retains the Markov property. Such a chain is said to be 'lumpable', and the resulting lumped chain is a special case of more general functions of Markov chains. There are several reasons why one might wish to lump. First, there may be analytical benefits, including the relative simplicity of the reduced model and the development of a new model which inherits known or assumed strong properties of the original model (the Markov property). Second, there may be statistical benefits, such as increased robustness of the smaller chain as well as improved estimates of transition probabilities. Finally, the identification of lumps may provide new insights about the process under investigation.

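
The lumpability condition described in the abstract (the Kemeny-Snell condition: within each block, every state must have the same aggregate transition probability into each block) can be sketched as follows; the matrix and partition are invented for illustration:

```python
import numpy as np

def is_lumpable(P, partition, tol=1e-9):
    """Kemeny-Snell test for strong lumpability: within each block, all
    states must have identical aggregate transition probabilities into
    every block of the partition."""
    for block in partition:
        # one row per state in `block`: its aggregate mass into each target block
        rows = np.array([[P[i, list(target)].sum() for target in partition]
                         for i in block])
        if not np.allclose(rows, rows[0], atol=tol):
            return False
    return True

def lump(P, partition):
    """Transition matrix of the lumped chain (valid only when lumpable)."""
    k = len(partition)
    Q = np.zeros((k, k))
    for a, block in enumerate(partition):
        i = block[0]  # any representative state works under lumpability
        for b, target in enumerate(partition):
            Q[a, b] = P[i, list(target)].sum()
    return Q

# A 3-state chain lumpable with respect to {{0, 1}, {2}}.
P = np.array([[0.2, 0.3, 0.5],
              [0.4, 0.1, 0.5],
              [0.3, 0.3, 0.4]])
partition = [(0, 1), (2,)]
print(is_lumpable(P, partition))   # True: rows 0 and 1 both send 0.5 to each block
print(lump(P, partition))          # the lumped 2x2 chain
```
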
Average run length calculation of the EWMA control chart using the first passage time of the Markov process (Markov 과정의 최초통과시간을 이용한 지수가중 이동평균 관리도의 평균런길이의 계산)

  • Park, Changsoon
    • The Korean Journal of Applied Statistics
    • /
    • v.30 no.1
    • /
    • pp.1-12
    • /
    • 2017
  • Many stochastic processes satisfy the Markov property exactly or at least approximately. A property of interest in a Markov process is the first passage time. Since Wald's sequential analysis, approximations of the first passage time have been studied extensively, and statistical computing techniques enabled by high-speed computers have made it possible to calculate values of such properties close to the true ones. This article introduces the exponentially weighted moving average (EWMA) control chart as an example of a Markov process and studies how to calculate its average run length, noting the problematic issues that must be handled for a correct calculation. The results derived here for approximating the first passage time can be applied to any Markov process. In particular, the approximation of a continuous-time Markov process by a discrete-time Markov chain is useful for studying the properties of the stochastic process and makes computational approaches easy.
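
The Markov chain calculation of the EWMA average run length can be sketched with the classical Brook-Evans discretization; the smoothing constant, control limit, and grid size below are arbitrary choices for illustration, not values from the paper:

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    # standard normal CDF via the error function (avoids a SciPy dependency)
    return 0.5 * (1.0 + np.vectorize(erf)(x / sqrt(2.0)))

def ewma_arl(lam, h, mu=0.0, m=101):
    """Brook-Evans Markov chain approximation of the EWMA average run
    length: discretize the in-control region (-h, h) into m transient
    states and solve (I - R) ARL = 1."""
    w = 2.0 * h / m                          # width of each sub-interval
    mids = -h + w * (np.arange(m) + 0.5)     # state midpoints
    R = np.empty((m, m))
    for i, s in enumerate(mids):
        # z_next = (1 - lam) * s + lam * x with x ~ N(mu, 1)
        lo = (mids - w / 2 - (1 - lam) * s) / lam
        hi = (mids + w / 2 - (1 - lam) * s) / lam
        R[i] = norm_cdf(hi - mu) - norm_cdf(lo - mu)
    arl = np.linalg.solve(np.eye(m) - R, np.ones(m))
    return arl[m // 2]                       # chart started at z0 = 0

in_control = ewma_arl(lam=0.2, h=1.0)        # mu = 0: no shift
shifted = ewma_arl(lam=0.2, h=1.0, mu=1.0)   # one-sigma mean shift
```

A finer grid (larger m) trades computation for accuracy; the in-control ARL should always exceed the shifted ARL.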

Bounding Methods for Markov Processes Based on Stochastic Monotonicity and Convexity (확률적 단조성과 콘벡스성을 이용한 마코프 프로세스에서의 범위한정 기법)

  • Yoon, Bok-Sik
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.17 no.1
    • /
    • pp.117-126
    • /
    • 1991
  • When {X(t), t ≥ 0} is a Markov process representing time-varying system states, we develop efficient bounding methods for some time-dependent performance measures. We use the discretization technique for stochastically monotone Markov processes, and a combination of discretization and uniformization for Markov processes with the stochastic convexity (concavity) property. Sufficient conditions for stochastic monotonicity and stochastic convexity of a Markov process are also mentioned. A simple example is given to demonstrate the validity of the bounding methods.

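
Uniformization, one of the two techniques combined in the abstract, can be sketched as follows; the two-state generator is made up for illustration and is not the paper's model:

```python
import numpy as np
from math import exp

def transient_dist(Q, p0, t, tol=1e-12):
    """Uniformization: embed the continuous-time chain with generator Q
    into the discrete-time chain P = I + Q/Lam, then sum the powers of P
    weighted by Poisson(Lam * t) probabilities."""
    Lam = -Q.diagonal().min()            # uniformization rate
    P = np.eye(len(Q)) + Q / Lam
    term = np.asarray(p0, dtype=float)   # p0 @ P^k, starting at k = 0
    weight = exp(-Lam * t)               # Poisson pmf at k = 0
    total = weight * term
    acc, k = weight, 0
    while acc < 1.0 - tol:               # truncate once the mass is exhausted
        k += 1
        term = term @ P
        weight *= Lam * t / k
        total += weight * term
        acc += weight
    return total

# Two-state example: up-rate 1.0, down-rate 2.0.
Q = np.array([[-1.0, 1.0],
              [2.0, -2.0]])
p = transient_dist(Q, [1.0, 0.0], t=10.0)
print(p)   # ~ stationary distribution [2/3, 1/3] at large t
```
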
The Ramp-Rate Constraint Effects on the Generators' Equilibrium Strategy in Electricity Markets

  • Joung, Man-Ho;Kim, Jin-Ho
    • Journal of Electrical Engineering and Technology
    • /
    • v.3 no.4
    • /
    • pp.509-513
    • /
    • 2008
  • In this paper, we investigate how generators' ramp-rate constraints may influence their equilibrium strategy formulation. In the market model proposed in this study, the generators' ramp-rate constraints are explicitly represented. In order to fully characterize the inter-temporal nature of the ramp-rate constraints, a dynamic game model is presented. The subgame perfect Nash equilibrium is adopted as the solution concept, and a backward induction procedure for solving the game is designed. The inter-temporal nature of the ramp-rate constraints gives the game the Markov property, and we find that this property significantly simplifies the characterization of the subgame perfect Nash equilibrium. Finally, a simple electricity market numerical illustration demonstrates the application of the proposed approach.
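
Backward induction over a ramp-constrained stage game can be sketched as below. The quantity grid, toy inverse demand curve, and payoffs are invented for illustration and are not the paper's market model; the point is only that the state (last period's outputs) is all the history that matters, which is the Markov property the paper exploits:

```python
from itertools import product

ACTIONS = [1, 2, 3]          # admissible output levels (hypothetical)
RAMP = 1                     # |q_t - q_{t-1}| <= RAMP

def payoff(q1, q2):
    price = 10 - (q1 + q2)   # toy inverse demand
    return q1 * price, q2 * price

def feasible(prev):
    # ramp-rate constraint: only outputs reachable from last period's output
    return [a for a in ACTIONS if abs(a - prev) <= RAMP]

def stage_equilibrium(prev1, prev2, cont):
    """Pure-strategy Nash equilibrium of one stage, given continuation
    values cont(a1, a2) from the remaining subgame."""
    best = None
    for a1, a2 in product(feasible(prev1), feasible(prev2)):
        u1 = payoff(a1, a2)[0] + cont(a1, a2)[0]
        u2 = payoff(a1, a2)[1] + cont(a1, a2)[1]
        # mutual best-response check over each player's feasible deviations
        if all(payoff(b, a2)[0] + cont(b, a2)[0] <= u1 for b in feasible(prev1)) and \
           all(payoff(a1, b)[1] + cont(a1, b)[1] <= u2 for b in feasible(prev2)):
            if best is None or u1 + u2 > best[2] + best[3]:
                best = (a1, a2, u1, u2)
    return best

# Backward induction: solve the final stage for every state, then fold back.
terminal = lambda a1, a2: (0.0, 0.0)
second = {(p1, p2): stage_equilibrium(p1, p2, terminal)
          for p1, p2 in product(ACTIONS, ACTIONS)}
cont = lambda a1, a2: second[(a1, a2)][2:]
first = stage_equilibrium(2, 2, cont)    # both generators start at output 2
print(first)
```
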

First-Passage Time Distribution of Discrete Time Stochastic Process with 0-state

  • Park, Young-Sool
    • Journal of the Korean Data and Information Science Society
    • /
    • v.8 no.2
    • /
    • pp.119-125
    • /
    • 1997
  • We usually handle stochastic processes of independent and identically distributed random variables, but in actual life random variables are often dependent among themselves. In this paper we therefore introduce a new process that does not satisfy the Markov property. We investigate its probability mass functions and study the distribution of the first-passage time. We also find the average frequency of runs of consecutive successes during times 0 to n.

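
For success-run questions of the kind studied here, the standard device is the Markov chain on the current run length (a generic construction, not the paper's specific non-Markov process). As a sketch, the expected time until the first run of n successes follows from a small linear system:

```python
import numpy as np

def mean_time_to_run(p, n):
    """Expected number of Bernoulli(p) trials until the first run of n
    consecutive successes, from the Markov chain on run length
    (states 0..n, state n absorbing).
    Recursion: t_i = 1 + p * t_{i+1} + (1 - p) * t_0, with t_n = 0."""
    A = np.zeros((n, n))
    b = np.ones(n)
    for i in range(n):
        A[i, i] = 1.0
        A[i, 0] -= 1 - p          # a failure resets the run to length 0
        if i + 1 < n:
            A[i, i + 1] -= p      # a success extends the run
    t = np.linalg.solve(A, b)
    return t[0]

print(mean_time_to_run(0.5, 3))   # = (1 - p^3) / ((1 - p) p^3) = 14 for p = 0.5
```
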
Bayesian Analysis of Binary Non-homogeneous Markov Chain with Two Different Time Dependent Structures

  • Sung, Min-Je
    • Management Science and Financial Engineering
    • /
    • v.12 no.2
    • /
    • pp.19-35
    • /
    • 2006
  • We use the hierarchical Bayesian approach to describe the transition probabilities of a binary nonhomogeneous Markov chain. The Markov chain is used to describe the transition behavior of emotionally disturbed children in a treatment program. The effects of covariates on transition probabilities are assessed using a logit link function. To describe the time evolution of transition probabilities, we consider two modeling strategies. The first strategy is based on the concept of exchangeability, whereas the second is based on a first-order Markov property. The deviance information criterion (DIC) is used to compare the models with the two different time-dependent structures. Inferences are made using the Markov chain Monte Carlo technique. The developed methodology is applied to real data.
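
The logit link for covariate-dependent transition probabilities can be sketched as follows; the coefficients are hypothetical, not estimates from the treatment-program data:

```python
import numpy as np

def transition_probs(x, beta01, beta10):
    """Logit-linked transition probabilities of a binary nonhomogeneous
    Markov chain: P(0 -> 1) and P(1 -> 0) vary with a covariate x_t.
    Coefficients (intercept, slope) are hypothetical illustrations."""
    logistic = lambda z: 1.0 / (1.0 + np.exp(-z))
    p01 = logistic(beta01[0] + beta01[1] * x)   # probability of leaving state 0
    p10 = logistic(beta10[0] + beta10[1] * x)   # probability of leaving state 1
    return p01, p10

x = np.linspace(-2.0, 2.0, 5)                   # covariate trajectory over time
p01, p10 = transition_probs(x, beta01=(-1.0, 0.8), beta10=(0.5, -0.4))
```

In the hierarchical Bayesian setting the betas would themselves get priors and be sampled by MCMC; this sketch only shows the link function itself.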

Development of Statistical Downscaling Model Using Nonstationary Markov Chain (비정상성 Markov Chain Model을 이용한 통계학적 Downscaling 기법 개발)

  • Kwon, Hyun-Han;Kim, Byung-Sik
    • Journal of Korea Water Resources Association
    • /
    • v.42 no.3
    • /
    • pp.213-225
    • /
    • 2009
  • A stationary Markov chain model is a stochastic process with the Markov property: given the present state, future states are independent of the past states. The Markov chain model has been widely used as a main tool for water resources design. A main assumption of the stationary Markov model is that its statistical properties remain the same for all times; hence the stationary Markov chain model cannot accommodate changes in mean or variance. In this regard, a primary objective of this study is to develop a model which is able to make use of exogenous variables. Regression-based link functions are employed to dynamically update model parameters given the exogenous variables, and the model parameters are estimated by canonical correlation analysis. The proposed model is applied to the daily rainfall series at the Seoul station, comprising 46 years of data from 1961 to 2006. The model shows a capability to reproduce daily and seasonal characteristics simultaneously. Therefore, the proposed model can be used as a short- or mid-term prediction tool if elaborate GCM forecasts are used as a predictor. The nonstationary Markov chain model can also be applied to climate change studies if GCM-based climate change scenarios are provided as inputs.
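
A nonstationary two-state occurrence chain of the kind described can be sketched as below; the seasonal cycles driving the transition probabilities are invented for illustration, not parameters fitted to the Seoul series:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_occurrence(p_wd, p_ww, days=365):
    """Simulate daily wet/dry occurrence from a nonstationary two-state
    Markov chain with time-varying transition probabilities:
    p_wd[t] = P(wet | dry), p_ww[t] = P(wet | wet)."""
    state = 0                      # start dry
    out = []
    for t in range(days):
        p = p_ww[t] if state == 1 else p_wd[t]
        state = int(rng.random() < p)
        out.append(state)
    return np.array(out)

t = np.arange(365)
# Illustrative seasonal cycle peaking in summer; a real downscaling model
# would drive these probabilities through regression link functions on
# exogenous (e.g. GCM) predictors instead.
season = 0.5 * (1 - np.cos(2 * np.pi * (t - 30) / 365))
wet = simulate_occurrence(p_wd=0.1 + 0.3 * season, p_ww=0.3 + 0.4 * season)
print(wet.mean())                  # fraction of wet days in the simulated year
```
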

FEYNMAN-KAC SEMIGROUPS, MARTINGALES AND WAVE OPERATORS

  • Van Casteren, Jan A.
    • Journal of the Korean Mathematical Society
    • /
    • v.38 no.2
    • /
    • pp.227-274
    • /
    • 2001
  • In this paper we discuss the following topics: (1) Notation, generalities, Markov processes. The close relationship between (generators of) Markov processes and the martingale problem is exhibited, and a link between the Korovkin property and generators of Feller semigroups is established. (2) Feynman-Kac semigroups: 0-order regular perturbations, pinned Markov measures. A basic representation via distributions of Markov processes is depicted. (3) Dirichlet semigroups: 0-order singular perturbations, harmonic functions, multiplicative functionals. Here a representation theorem for solutions of the heat equation is given in terms of the distributions of the underlying Markov process and a suitable stopping time. (4) Sets of finite capacity, wave operators, and related results. In this section a number of results are presented concerning the completeness of scattering systems (and its spectral consequences). (5) Some (abstract) problems related to Neumann semigroups: first-order perturbations. In this section some rather abstract problems are presented, which lie on the borderline between first-order perturbations together with their boundary limits (Neumann-type boundary conditions) and reflected Markov processes.

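
For context, the Feynman-Kac semigroup of topic (2) has the standard representation (stated here from general theory, not taken from the paper):

```latex
u(t,x) \;=\; \mathbb{E}_x\!\left[\exp\!\left(-\int_0^t V(X_s)\,ds\right) f(X_t)\right],
\qquad \partial_t u = Lu - Vu,\quad u(0,\cdot) = f,
```

where $L$ is the generator of the Markov process $(X_t)$ and $V$ is the 0-order perturbing potential.
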
Stochastic Convexity in Markov Additive Processes (마코프 누적 프로세스에서의 확률적 콘벡스성)

  • Yoon, Bok-Sik
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 1991.10a
    • /
    • pp.147-159
    • /
    • 1991
  • Stochastic convexity (concavity) of a stochastic process is a very useful concept for various stochastic optimization problems. In this study we first establish the stochastic convexity of a certain class of Markov additive processes through a probabilistic construction based on the sample-path approach. A Markov additive process is obtained by integrating a functional of the underlying Markov process with respect to time, and its stochastic convexity can be utilized to provide efficient methods for the optimal design or optimal operation schedule of a wide range of stochastic systems. We also clarify the conditions for stochastic monotonicity of the Markov process, which is required for stochastic convexity of the Markov additive process. This result shows that stochastic convexity can be used for the analysis of probabilistic models based on birth-and-death processes, which have a very wide application area. Finally, we demonstrate the validity and usefulness of the theoretical results by developing efficient methods for optimal replacement scheduling based on the stochastic convexity property.

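
The Markov additive processes considered here are time integrals of a functional of the underlying chain; in standard notation (the general definition, not anything specific to this paper):

```latex
A(t) \;=\; \int_0^t f\bigl(X(s)\bigr)\,ds ,
```

and although $A(t)$ alone is not Markov, the pair $(X(t), A(t))$ is, which is what makes sample-path convexity arguments tractable.
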
Enhanced Markov-Difference Based Power Consumption Prediction for Smart Grids

  • Le, Yiwen;He, Jinghan
    • Journal of Electrical Engineering and Technology
    • /
    • v.12 no.3
    • /
    • pp.1053-1063
    • /
    • 2017
  • Power prediction is critical to improving power efficiency in smart grids, and the Markov chain provides a useful tool for it. With careful investigation of practical power datasets, we find an interesting phenomenon: the stochastic behavior of practical power datasets does not satisfy the Markov property. This mismatch degrades prediction accuracy if Markov prediction methods are used directly. In this paper, we propose a spatial-transform-based data processing step to alleviate this inconsistency. Furthermore, we propose an enhanced power prediction method, named Spatial Mapping Markov-Difference (SMMD), to guarantee prediction accuracy. In particular, SMMD adopts a second prediction adjustment based on the differenced data to reduce the stochastic error. Experimental results validate that the proposed SMMD improves prediction accuracy with respect to state-of-the-art solutions.
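
A minimal sketch of difference-based Markov prediction is below. Plain differencing stands in for the paper's spatial-mapping transform (the actual SMMD transform and adjustment step are not reproduced here), and the load series is made up:

```python
import numpy as np

def markov_predict(series, n_bins=8):
    """One-step prediction from a first-order Markov chain fitted to the
    *differenced* series: raw readings often violate Markov assumptions,
    while their increments behave more homogeneously."""
    d = np.diff(series)
    edges = np.linspace(d.min(), d.max(), n_bins + 1)
    states = np.clip(np.digitize(d, edges) - 1, 0, n_bins - 1)
    counts = np.ones((n_bins, n_bins))           # Laplace smoothing
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    P = counts / counts.sum(axis=1, keepdims=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    next_diff = P[states[-1]] @ centers          # expected next increment
    return series[-1] + next_diff

load = np.array([3.1, 3.4, 3.2, 3.8, 4.0, 3.9, 4.3, 4.1, 4.5, 4.4])
pred = markov_predict(load)
print(pred)   # one-step-ahead load forecast
```
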