• Title/Summary/Keyword: Discrete Markov Chain

Search results: 70

A Study on the Dynamic Programming for Control (제어를 위한 동적 프로그래밍에 관한 연구)

  • Cho, Hyang-Duck;Kim, Woo-Shik
    • Proceedings of the Korean Society for Noise and Vibration Engineering Conference / 2007.11a / pp.556-559 / 2007
  • The notion of linearity is fundamental in science and engineering, and much of system and control theory is based on the analysis of linear systems, even when the underlying system is nonlinear and complex. Dynamic programming is a relevant technique when one is interested in choosing the best course of action for the performance and control of a nonlinear or dynamic system. In this paper, we introduce dynamic programming based on discrete systems. When a discrete system is constructed from discrete states, transitions between states, and the events that induce those transitions, it can describe the system operation dynamically, or symbolically from a logical point of view. We introduce techniques related to the controllability of a controlled Markov chain, illustrated by a simple game. Dynamic programming can then be applied to optimal control problems with adaptable performance in discrete systems.
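
The dynamic-programming idea sketched in this abstract can be illustrated with value iteration on a small controlled Markov chain. The three-state chain, two actions, rewards, and discount factor below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Hypothetical 3-state controlled Markov chain with 2 actions.
# P[a][s, s'] = transition probability under action a; R[a][s] = reward.
P = [np.array([[0.8, 0.2, 0.0],
               [0.1, 0.8, 0.1],
               [0.0, 0.2, 0.8]]),
     np.array([[0.5, 0.5, 0.0],
               [0.0, 0.5, 0.5],
               [0.0, 0.0, 1.0]])]
R = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 2.0])]
gamma = 0.9  # discount factor

def value_iteration(P, R, gamma, tol=1e-8):
    """Iterate the Bellman optimality operator to a fixed point."""
    n = P[0].shape[0]
    V = np.zeros(n)
    while True:
        # Q[a, s] = immediate reward plus discounted expected future value
        Q = np.array([R[a] + gamma * P[a] @ V for a in range(len(P))])
        V_new = Q.max(axis=0)
        if np.abs(V_new - V).max() < tol:
            return V_new, Q.argmax(axis=0)   # value function, greedy policy
        V = V_new

V, policy = value_iteration(P, R, gamma)
```

At convergence `V` satisfies the Bellman optimality equation and `policy` selects, in each state, the action achieving the maximum, which is the "best choice" the abstract refers to.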

DISCRETE-TIME BULK-SERVICE QUEUE WITH MARKOVIAN SERVICE INTERRUPTION AND PROBABILISTIC BULK SIZE

  • Lee, Yu-Tae
    • Journal of applied mathematics & informatics / v.28 no.1_2 / pp.275-282 / 2010
  • This paper analyzes a discrete-time bulk-service queue with probabilistic bulk size, in which the service process is interrupted by a Markov chain. We study the joint probability generating function of the system occupancy and the state of the Markov chain, and derive several performance measures of interest, including the average system occupancy and the delay distribution.
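
A Monte-Carlo sketch can make the modeled system concrete: a discrete-time queue with a random bulk size per service and a two-state (up/down) Markov chain interrupting the server. All parameters below are illustrative assumptions; the paper derives the occupancy analytically rather than by simulation.

```python
import random

# Discrete-time bulk-service queue with Markov-modulated interruptions.
# p: per-slot arrival probability; while "up", the server removes a random
# bulk of up to B waiting customers each slot.
def simulate(p=0.3, B=3, up_to_up=0.9, down_to_up=0.5, slots=100_000, seed=1):
    rng = random.Random(seed)
    q, up, area = 0, True, 0
    for _ in range(slots):
        if rng.random() < p:                  # Bernoulli arrival
            q += 1
        if up and q > 0:                      # probabilistic bulk service
            q -= min(q, rng.randint(1, B))
        # server state for the next slot follows a two-state Markov chain
        up = rng.random() < (up_to_up if up else down_to_up)
        area += q
    return area / slots                       # time-average system occupancy

avg_occupancy = simulate()
```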

A Study on the Fatigue Reliability of Structures by Markov Chain Model (Markov Chain Model을 이용한 구조물의 피로 신뢰성 해석에 관한 연구)

  • Y.S. Yang;J.H. Yoon
    • Journal of the Society of Naval Architects of Korea / v.28 no.2 / pp.228-240 / 1991
  • Many experimental data on fatigue crack propagation show that the process is stochastic; the study of crack propagation must therefore be based on a probabilistic approach. In the present paper, the fatigue crack propagation process is assumed to be a discrete Markov process, and a method is developed that can evaluate the reliability of a structural component by using the Markov chain model (unit-step B-model) suggested by Bogdanoff. In this method, leak failure, plastic collapse, and brittle fracture of the critical component are taken as failure modes, and the effects of the initial crack distribution and of periodic and non-periodic inspection on the probability of failure are considered. An equivalent load value is used for random loading, such as wave loads, to facilitate the analysis. Finally, some calculations are carried out to show the usefulness and applicability of the method, and some remarks on it are given.
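
The unit-step B-model accumulates damage one state at a time in a Markov chain with an absorbing failure state; the reliability after n duty cycles is the probability that absorption has not yet occurred. A minimal sketch, with illustrative parameters rather than the paper's fitted values:

```python
import numpy as np

# Unit-step B-model sketch: damage states 0..b; per duty cycle the damage
# stays put with probability p or advances one state with probability 1 - p.
# State b (failure) is absorbing.
def reliability(p=0.95, b=10, n_cycles=400):
    P = np.zeros((b + 1, b + 1))
    for s in range(b):
        P[s, s], P[s, s + 1] = p, 1.0 - p
    P[b, b] = 1.0                             # absorbing failure state
    dist = np.zeros(b + 1)
    dist[0] = 1.0                             # component starts undamaged
    R = []
    for _ in range(n_cycles):
        dist = dist @ P                       # propagate the state distribution
        R.append(1.0 - dist[b])               # survival probability at this cycle
    return np.array(R)

R = reliability()
```

An initial crack distribution, as considered in the paper, would replace the point-mass initial `dist` with a vector over damage states; inspections would truncate or reset `dist` at the inspection cycles.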

Analysis of Real-time Error for Remote Estimation Based on Binary Markov Chain Model (이진 마르코프 연쇄 모형 기반 실시간 원격 추정값의 오차 분석)

  • Lee, Yutae
    • Journal of the Korea Institute of Information and Communication Engineering / v.26 no.2 / pp.317-320 / 2022
  • This paper studies the real-time error in the context of monitoring a symmetric binary information source over a delay system. To obtain the average real-time error, the delay system is modeled and analyzed as a discrete-time Markov chain with a finite state space. Numerical analysis is performed on various system parameters, such as the state transition probabilities of the information source, the transmission times, and the transmission frequencies. Given the state transition probabilities and transmission times, we investigate the relationship between the transmission frequency and the average real-time error. The results can be used to investigate the relationship between real-time error and age of information.
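
A simulation sketch of the monitoring setup makes the real-time error concrete. The flip probability, sampling period, and delay below are illustrative assumptions; the paper computes the average error analytically via a finite-state Markov chain rather than by simulation.

```python
import random

# A symmetric binary Markov source X_t flips with probability a each slot.
# Every T slots the current value is sampled and delivered after a fixed
# delay d; the monitor holds the last delivered value as its estimate.
# The real-time error is the fraction of slots where estimate != X_t.
def real_time_error(a=0.1, T=5, d=2, slots=200_000, seed=7):
    rng = random.Random(seed)
    x, est = 0, 0
    inflight = []                              # (arrival_slot, value) pairs
    errors = 0
    for t in range(slots):
        if t % T == 0:
            inflight.append((t + d, x))        # sample sent now, arrives at t+d
        while inflight and inflight[0][0] <= t:
            est = inflight.pop(0)[1]           # delivery updates the estimate
        errors += (est != x)
        if rng.random() < a:                   # symmetric state flip
            x ^= 1
    return errors / slots

err = real_time_error()
```

Raising the transmission frequency (smaller `T`) keeps the estimate fresher and lowers the error, which is the frequency/error trade-off the abstract investigates.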

Sensitivity of Conditions for Lumping Finite Markov Chains

  • Suh, Moon-Taek
    • Journal of the military operations research society of Korea / v.11 no.1 / pp.111-129 / 1985
  • Markov chains with large transition probability matrices occur in many applications, such as manpower models. Under certain conditions the state space of a stationary discrete-parameter finite Markov chain may be partitioned into subsets, each of which may be treated as a single state of a smaller chain that retains the Markov property. Such a chain is said to be 'lumpable', and the resulting lumped chain is a special case of more general functions of Markov chains. There are several reasons why one might wish to lump. First, there may be analytical benefits, including the relative simplicity of the reduced model and the development of a new model which inherits known or assumed strong properties of the original model (the Markov property). Second, there may be statistical benefits, such as increased robustness of the smaller chain as well as improved estimates of transition probabilities. Finally, the identification of lumps may provide new insights into the process under investigation.
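
The lumpability condition is easy to test numerically: a partition is (strongly) lumpable iff, for every pair of blocks, the probability of jumping from a state in one block into the other block is the same for all states of the starting block. A sketch with an illustrative 4-state chain:

```python
import numpy as np

# Strong-lumpability test: for blocks A and B of the partition,
# sum_{s' in B} P[s, s'] must be the same for every s in A.
def lump(P, blocks, tol=1e-12):
    Q = np.zeros((len(blocks), len(blocks)))
    for i, A in enumerate(blocks):
        for j, B in enumerate(blocks):
            rows = [P[s, B].sum() for s in A]  # block-B mass from each s in A
            if max(rows) - min(rows) > tol:
                return None                    # partition is not lumpable
            Q[i, j] = rows[0]
    return Q                                   # transition matrix of lumped chain

P = np.array([[0.3, 0.2, 0.5, 0.0],
              [0.1, 0.4, 0.3, 0.2],
              [0.1, 0.1, 0.4, 0.4],
              [0.0, 0.2, 0.3, 0.5]])
Q = lump(P, [[0, 1], [2, 3]])                  # lumpable for this partition
```

For this chain the partition {0,1}, {2,3} lumps to a 2-state chain, while other partitions (e.g. {0,2}, {1,3}) fail the condition and `lump` returns `None`.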

ON THE APPLICATION OF LIMITING DIFFUSION IN SPECIAL DIPLOID MODEL

  • Choi, Won
    • Journal of applied mathematics & informatics / v.29 no.3_4 / pp.1043-1048 / 2011
  • W. Choi ([1]) identified and characterized the limiting diffusion of this diploid model by defining a discrete generator for the rescaled Markov chain. We denote by F the homozygosity and by S the average selection intensity. In this note, we define the Fleming-Viot process with the generator of the limiting diffusion and provide an exact result for the relation between F and S.

ON THE LIMITING DIFFUSION OF SPECIAL DIPLOID MODEL IN POPULATION GENETICS

  • CHOI, WON
    • Bulletin of the Korean Mathematical Society / v.42 no.2 / pp.397-404 / 2005
  • In this note, we characterize the limiting diffusion of a diploid model by defining the discrete generator for the rescaled Markov chain. We conclude that this limiting diffusion model has an uncountable state space, mutation selection, and a special 'mutation or gene conversion rate'.

THE QUEUE LENGTH DISTRIBUTION OF PHASE TYPE

  • Lim, Jong-Seul;Ahn, Seong-Joon
    • Journal of applied mathematics & informatics / v.24 no.1_2 / pp.505-511 / 2007
  • In this paper, we examine the Markov chain $\{(X_k, N_k);\; k = 0, 1, \ldots\}$. We show that the marginal steady-state distribution of $X_k$ is of discrete phase type. The implication of this result is that the queue length distribution is of phase type for the large number of examples in which this Markov chain is applicable; a queueing application via matrix-geometric methods is also shown.
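
For reference, a discrete phase-type distribution with initial probability vector alpha and sub-stochastic transient matrix T has pmf P(N = n) = alpha T^(n-1) t, with exit vector t = (I - T)1. The two-phase parameters below are illustrative, not taken from the paper:

```python
import numpy as np

# pmf of a discrete phase-type distribution: P(N = n) = alpha @ T^(n-1) @ t.
alpha = np.array([0.6, 0.4])                  # initial phase probabilities
T = np.array([[0.5, 0.3],
              [0.2, 0.4]])                    # sub-stochastic transient part
t = (np.eye(2) - T) @ np.ones(2)              # exit (absorption) vector

def pmf(n):
    return float(alpha @ np.linalg.matrix_power(T, n - 1) @ t)

# the probabilities sum to 1 because T is sub-stochastic (spectral radius < 1)
total = sum(pmf(n) for n in range(1, 200))
```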

Development of Daily Rainfall Simulation Model Based on Homogeneous Hidden Markov Chain (동질성 Hidden Markov Chain 모형을 이용한 일강수량 모의기법 개발)

  • Kwon, Hyun-Han;Kim, Tae Jeong;Hwang, Seok-Hwan;Kim, Tae-Woong
    • KSCE Journal of Civil and Environmental Engineering Research / v.33 no.5 / pp.1861-1870 / 2013
  • Increased hydrological variability driven by climate change has been widely acknowledged over the past decades, and rainfall simulation techniques are accordingly being applied in many countries to account for this variability. This study proposes a homogeneous Hidden Markov Chain (HMM) model designed to recognize rather complex patterns of rainfall, with discrete hidden states and underlying distribution characteristics captured via mixture probability density functions. The proposed approach was applied to the Seoul and Jeonju stations to verify the model's performance. Statistical moments (e.g. mean, variance, skewness and kurtosis) derived from daily and seasonal rainfall were compared with observations. The proposed HMM showed better performance in reproducing the underlying distribution characteristics and, in particular, was much better than the existing Markov chain model at reproducing extremes. The proposed HMM could therefore be used as input for evaluating long-term runoff and design floods.
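
The generation step of such a model can be sketched as a two-regime hidden Markov chain whose emissions mix a dry-day point mass at zero with an exponential wet-day amount. All numbers below are illustrative assumptions, not the mixture densities fitted to the Seoul and Jeonju records:

```python
import random

# Daily rainfall from a homogeneous hidden Markov chain (illustrative).
# Hidden state 0 = "dry regime", 1 = "wet regime"; each state emits
# rainfall from its own zero-inflated exponential mixture.
def simulate_rainfall(days=365, seed=3):
    rng = random.Random(seed)
    stay = {0: 0.8, 1: 0.6}            # P(remain in current hidden state)
    p_wet = {0: 0.1, 1: 0.7}           # P(rain > 0 | state)
    mean_mm = {0: 2.0, 1: 12.0}        # mean wet-day rainfall by state
    state, series = 0, []
    for _ in range(days):
        if rng.random() < p_wet[state]:
            series.append(rng.expovariate(1.0 / mean_mm[state]))  # wet day
        else:
            series.append(0.0)                                    # dry day
        if rng.random() >= stay[state]:
            state = 1 - state          # switch hidden regime
    return series

rain = simulate_rainfall()
```

Fitting such a model (e.g. by the EM/Baum-Welch algorithm) would replace these hand-picked transition and mixture parameters with station-specific estimates.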