• Title/Summary/Keyword: Markov processes

ON STATIONARY GAUSSIAN SECOND ORDER MARKOV PROCESSES

  • Park, W.J.;Hsu, Y.S.
    • Kyungpook Mathematical Journal
    • /
    • v.19 no.2
    • /
    • pp.249-255
    • /
    • 1979
  • In this paper we give a characterization of stationary Gaussian second-order Markov processes in terms of their covariance function $R(\tau)=E[X(t)X(t+\tau)]$, and we also give some relationships among quasi-Markov, Markov, and second-order Markov processes.

  • PDF
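For context, the first-order analogue of the characterization in this abstract is classical (Doob's theorem): a mean-square continuous, stationary Gaussian process is Markov exactly when its covariance is exponential. The second-order case treated in the paper refines this form; the first-order statement, in the abstract's notation, is:

```latex
% Doob's theorem (first-order case, stated for context only):
% a mean-square continuous, stationary Gaussian process is Markov
% if and only if its covariance is exponential,
R(\tau) \;=\; E[X(t)X(t+\tau)] \;=\; R(0)\,e^{-\beta|\tau|}, \qquad \beta \ge 0.
```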

Bounding Methods for Markov Processes Based on Stochastic Monotonicity and Convexity (확률적 단조성과 콘벡스성을 이용한 마코프 프로세스에서의 범위한정 기법)

  • Yoon, Bok-Sik
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.17 no.1
    • /
    • pp.117-126
    • /
    • 1991
  • When {X(t), t ${\geq}$ 0} is a Markov process representing time-varying system states, we develop efficient bounding methods for some time-dependent performance measures. We use the discretization technique for stochastically monotone Markov processes, and a combination of discretization and uniformization for Markov processes with the stochastic convexity (concavity) property. Sufficient conditions for stochastic monotonicity and stochastic convexity of a Markov process are also stated. A simple example demonstrates the validity of the bounding methods.

  • PDF
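The uniformization technique mentioned in this abstract converts a continuous-time Markov chain into a discrete-time one randomized by a Poisson clock, which gives a numerically stable way to compute transient distributions. A minimal sketch (the two-state generator and its rates are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def uniformize(Q, t, p0, tol=1e-12):
    """Transient distribution p(t) = p0 exp(Qt) of a CTMC via uniformization:
    pick a rate Lam >= every exit rate, form the DTMC P = I + Q/Lam, and
    take a Poisson(Lam*t)-weighted sum of DTMC powers."""
    Lam = max(-np.diag(Q))                 # uniformization rate
    P = np.eye(len(Q)) + Q / Lam           # embedded DTMC, rows sum to 1
    w = np.exp(-Lam * t)                   # Poisson(Lam*t) weight at k = 0
    term = np.array(p0, dtype=float)
    p = w * term
    k = 0
    while w > tol or k < Lam * t:          # stop once the Poisson tail is negligible
        k += 1
        term = term @ P
        w *= Lam * t / k
        p = p + w * term
    return p

# Two-state birth-death generator (rates are illustrative)
Q = np.array([[-1.0,  1.0],
              [ 2.0, -2.0]])
pt = uniformize(Q, 5.0, [1.0, 0.0])
# for a horizon this long, pt is essentially the stationary distribution (2/3, 1/3)
```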

FEYNMAN-KAC SEMIGROUPS, MARTINGALES AND WAVE OPERATORS

  • Van Casteren, Jan A.
    • Journal of the Korean Mathematical Society
    • /
    • v.38 no.2
    • /
    • pp.227-274
    • /
    • 2001
  • In this paper we discuss the following topics: (1) Notation, generalities, Markov processes. The close relationship between (generators of) Markov processes and the martingale problem is exhibited, and a link between the Korovkin property and generators of Feller semigroups is established. (2) Feynman-Kac semigroups: 0-order regular perturbations, pinned Markov measures. A basic representation via distributions of Markov processes is given. (3) Dirichlet semigroups: 0-order singular perturbations, harmonic functions, multiplicative functionals. Here a representation theorem for solutions of the heat equation is given in terms of the distributions of the underlying Markov process and a suitable stopping time. (4) Sets of finite capacity, wave operators, and related results. In this section a number of results are presented concerning the completeness of scattering systems (and its spectral consequences). (5) Some (abstract) problems related to Neumann semigroups: first-order perturbations. In this section some rather abstract problems are presented, which lie on the borderline between first-order perturbations together with their boundary limits (Neumann-type boundary conditions) and reflected Markov processes.

  • PDF
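The Feynman-Kac representation behind topics (2)-(3) above can be sketched by plain Monte Carlo over Brownian paths. A hedged illustration, not the paper's setup: the process is drift-free Brownian motion, the potential V and the terminal function f are illustrative, and the path integral of V is a simple Riemann approximation:

```python
import numpy as np

def feynman_kac(f, V, x, t, n_paths=100_000, n_steps=100, seed=0):
    """Monte Carlo estimate of u(t, x) = E[ exp(-∫_0^t V(X_s) ds) f(X_t) ],
    with X a Brownian motion started at x (a sketch of the representation)."""
    rng = np.random.default_rng(seed)
    dt = t / n_steps
    X = np.full(n_paths, float(x))
    I = np.zeros(n_paths)                      # running ∫ V(X_s) ds per path
    for _ in range(n_steps):
        I += V(X) * dt                         # Riemann approximation of the integral
        X += rng.normal(scale=np.sqrt(dt), size=n_paths)
    return float(np.mean(np.exp(-I) * f(X)))

# With V = 0 this reduces to the heat semigroup: E[(x + B_t)^2] = x^2 + t
u = feynman_kac(lambda y: y**2, lambda y: 0.0 * y, x=1.0, t=0.5)
```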

An Efficient Simulation of Discrete Time Queueing Systems with Markov-modulated Arrival Processes (MMAP 이산시간 큐잉 시스템의 속산 시뮬레이션)

  • Kook Kwang-Ho;Kang Sungyeol
    • Journal of the Korea Society for Simulation
    • /
    • v.13 no.3
    • /
    • pp.1-10
    • /
    • 2004
  • The cell loss probability required in the ATM network is in the range of $10^{-9}$ to $10^{-12}$. If Monte Carlo simulation is used to analyze the performance of an ATM node, an enormous amount of computer time is required. To obtain large speed-up factors, importance sampling may be used. Since Markov-modulated processes have been used to model various high-speed network traffic sources, we consider discrete-time single-server queueing systems with Markov-modulated arrival processes, which can be used to model an ATM node. We apply importance sampling based on large deviation theory to the performance evaluation of MMBP/D/1/K, ∑MMBP/D/1/K, and two-stage tandem queueing networks with Markov-modulated arrival processes and deterministic service times. The simulation results show that the buffer overflow probabilities obtained by importance sampling are very close to those obtained by Monte Carlo simulation, while the computer time is reduced drastically.

  • PDF
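The large-deviations-based importance sampling described in this abstract can be illustrated on a much simpler model than the paper's MMBP queues: an iid Gaussian random walk, where exponential tilting is the standard change of measure. A sketch under these assumptions (all parameters are illustrative, and the exact normal tail is computable here only because the toy model is Gaussian):

```python
import numpy as np
from math import erf, sqrt

def overflow_prob_is(mu, sigma, n, b, n_samples=100_000, seed=0):
    """Importance-sampling estimate of P(S_n > b) for S_n = X_1 + ... + X_n,
    X_i iid N(mu, sigma^2) with mu < 0. Exponential tilting (the
    large-deviations-optimal change of measure) shifts the step mean to b/n
    so the overflow event becomes typical under the sampling law."""
    rng = np.random.default_rng(seed)
    theta = (b / n - mu) / sigma**2                  # tilting parameter
    psi = mu * theta + 0.5 * (sigma * theta)**2      # cumulant generating fn at theta
    steps = rng.normal(mu + theta * sigma**2, sigma, size=(n_samples, n))
    S = steps.sum(axis=1)
    lr = np.exp(-theta * S + n * psi)                # likelihood ratio dP/dP_theta
    return float(np.mean((S > b) * lr))

def normal_tail_exact(mu, sigma, n, b):
    """Exact value for comparison: S_n ~ N(n*mu, n*sigma^2)."""
    z = (b - n * mu) / (sigma * sqrt(n))
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

est = overflow_prob_is(mu=-0.5, sigma=1.0, n=20, b=5.0)
# est agrees with normal_tail_exact(-0.5, 1.0, 20, 5.0), about 4e-4, using far
# fewer samples than plain Monte Carlo would need at this probability level
```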

Stochastic convexity in markov additive processes (마코프 누적 프로세스에서의 확률적 콘벡스성)

  • Yoon, Bok-Sik
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 1991.10a
    • /
    • pp.147-159
    • /
    • 1991
  • Stochastic convexity (concavity) of a stochastic process is a very useful concept for various stochastic optimization problems. In this study we first establish stochastic convexity of a certain class of Markov additive processes through a probabilistic construction based on the sample-path approach. A Markov additive process is obtained by integrating a functional of the underlying Markov process with respect to time, and its stochastic convexity can be utilized to provide efficient methods for the optimal design or optimal operation schedule of a wide range of stochastic systems. We also clarify the conditions for stochastic monotonicity of the Markov process, which is required for stochastic convexity of the Markov additive process. This result shows that stochastic convexity can be used for the analysis of probabilistic models based on birth-and-death processes, which have a very wide application area. Finally we demonstrate the validity and usefulness of the theoretical results by developing efficient methods for optimal replacement scheduling based on the stochastic convexity property.

  • PDF
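The Markov additive process defined in this abstract, obtained by integrating a functional of the underlying Markov process over time, can be sketched directly by simulation. A minimal illustration for a two-state chain (the exit rates and the functional f are assumptions for the example, not taken from the paper):

```python
import numpy as np

def additive_functional(T, exit_rates, f, x0=0, seed=0):
    """Simulate a two-state CTMC that alternates between states 0 and 1
    with the given exit rates, accumulating Z_T = integral_0^T f(X_s) ds."""
    rng = np.random.default_rng(seed)
    t, x, Z = 0.0, x0, 0.0
    while t < T:
        hold = rng.exponential(1.0 / exit_rates[x])  # holding time in state x
        hold = min(hold, T - t)                      # truncate the last holding time
        Z += f[x] * hold                             # accumulate the functional
        t += hold
        x = 1 - x                                    # alternate between the two states
    return Z

# Exit rates (1, 2) give stationary distribution (2/3, 1/3); with f = (0, 1)
# the long-run average Z_T / T converges to 1/3
Z = additive_functional(10_000.0, exit_rates=(1.0, 2.0), f=(0.0, 1.0))
```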

Equivalent Transformations of Undiscounted Nonhomogeneous Markov Decision Processes

  • Park, Yun-Sun
    • Journal of the Korean Operations Research and Management Science Society
    • /
    • v.17 no.2
    • /
    • pp.131-144
    • /
    • 1992
  • Even though nonhomogeneous Markov Decision Processes subsume homogeneous Markov Decision Processes and are more practical in the real world, there are few results for them. In this paper we address the nonhomogeneous Markov Decision Process with the objective of maximizing average reward. By extending the work of Ross [17] in the homogeneous case and adopting the result of Bean and Smith [3] for the discounted deterministic problem, we first transform the original problem into a discounted nonhomogeneous Markov Decision Process. Second, we transform this into a discounted deterministic problem. This approach not only shows the interrelationships between the various problems but also attacks the solution of the undiscounted nonhomogeneous Markov Decision Process.

  • PDF

System Replacement Policy for A Partially Observable Markov Decision Process Model

  • Kim, Chang-Eun
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.16 no.2
    • /
    • pp.1-9
    • /
    • 1990
  • The control of deterioration processes for which only incomplete state information is available is examined in this study. When the deterioration is governed by a Markov process, such processes are known as Partially Observable Markov Decision Processes (POMDPs), which eliminate the assumption that the state or level of deterioration of the system is known exactly. This research investigates a two-state partially observable Markov chain in which only deterioration can occur and for which the only possible actions are to replace or to leave alone. The goal of this research is to develop a new jump algorithm which has the potential for solving system problems dealing with continuous-state-space Markov chains.

  • PDF
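The belief-state machinery behind a two-state replace-or-leave-alone POMDP of this kind can be sketched as a Bayes update followed by a threshold rule. A hedged illustration only: the observation model, deterioration probability, and threshold below are assumptions for the example, not taken from the paper:

```python
def belief_update(b, p_det, obs, p_obs):
    """Bayes update of b = P(machine is deteriorated) for a two-state chain
    in which only deterioration (good -> bad) can occur.

    p_det       : per-period deterioration probability
    obs         : observed signal, 0 or 1
    p_obs[s][o] : P(observe o | true state s)  (illustrative sensor model)
    """
    b_pred = b + (1 - b) * p_det                  # predict: belief can only grow
    num = b_pred * p_obs[1][obs]                  # Bayes correction with the signal
    den = num + (1 - b_pred) * p_obs[0][obs]
    return num / den

def replace_policy(b, threshold=0.6):
    """Threshold rule on the belief (the threshold value is illustrative)."""
    return "replace" if b >= threshold else "leave alone"

# noisy sensor: 90% correct in the good state, 80% correct in the bad state
P_OBS = ((0.9, 0.1), (0.2, 0.8))
b = belief_update(0.1, 0.2, obs=1, p_obs=P_OBS)   # a "bad" signal raises the belief
```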

Stochastic convexity in Markov additive processes and its applications (마코프 누적 프로세스에서의 확률적 콘벡스성과 그 응용)

  • Yoon, Bok-Sik
    • Journal of the Korean Operations Research and Management Science Society
    • /
    • v.16 no.1
    • /
    • pp.76-88
    • /
    • 1991
  • Stochastic convexity (concavity) of a stochastic process is a very useful concept for various stochastic optimization problems. In this study we first establish stochastic convexity of a certain class of Markov additive processes through a probabilistic construction based on the sample-path approach. A Markov additive process is obtained by integrating a functional of the underlying Markov process with respect to time, and its stochastic convexity can be utilized to provide efficient methods for the optimal design or optimal operation schedule of a wide range of stochastic systems. We also clarify the conditions for stochastic monotonicity of the Markov process. From this result it is shown that stochastic convexity can be used for the analysis of probabilistic models based on birth-and-death processes, which have a very wide application area. Finally we demonstrate the validity and usefulness of the theoretical results by developing efficient methods for optimal replacement scheduling based on the stochastic convexity property.

  • PDF

Asymptotics of a class of markov processes generated by $X_{n+1}=f(X_n)+\epsilon_{n+1}$

  • Lee, Oe-Sook
    • Journal of the Korean Statistical Society
    • /
    • v.23 no.1
    • /
    • pp.1-12
    • /
    • 1994
  • We consider the Markov process $\{X_n\}$ on $R$ which is generated by $X_{n+1} = f(X_n) + \epsilon_{n+1}$. Sufficient conditions for irreducibility and geometric ergodicity are obtained for such Markov processes. In addition, when $\{X_n\}$ is geometrically ergodic, the functional central limit theorem is proved for every bounded function on $R$.

  • PDF
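The recursion $X_{n+1} = f(X_n) + \epsilon_{n+1}$ studied in this abstract is easy to simulate, and geometric ergodicity can be observed numerically for a contractive f. A small sketch (the choice f(x) = 0.5x with standard normal noise is illustrative; in that linear case the stationary law is N(0, 1/(1 - 0.25))):

```python
import numpy as np

def simulate(f, n, x0=0.0, seed=0):
    """Simulate the chain X_{k+1} = f(X_k) + eps_{k+1}, eps iid N(0, 1)."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(size=n)
    x = np.empty(n)
    x[0] = x0
    for k in range(1, n):
        x[k] = f(x[k - 1]) + eps[k]
    return x

# A contraction |f(x) - f(y)| <= 0.5 |x - y| yields geometric ergodicity;
# here f(x) = 0.5 x, so the stationary distribution is N(0, 4/3)
x = simulate(lambda v: 0.5 * v, 200_000)
# ergodic averages settle near the stationary mean 0 and variance 4/3
```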

Valuation of European and American Option Prices Under the Levy Processes with a Markov Chain Approximation

  • Han, Gyu-Sik
    • Management Science and Financial Engineering
    • /
    • v.19 no.2
    • /
    • pp.37-42
    • /
    • 2013
  • This paper suggests a numerical method for the valuation of European and American options under two Lévy processes, the Normal Inverse Gaussian model and the Variance Gamma model. The method is based on approximating the underlying asset price with a finite-state, time-homogeneous Markov chain. We examine the effectiveness of the proposed method with simulation results, which are compared with those from an existing lattice-based numerical method.
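The Markov chain approximation described in this abstract can be sketched for the European case. As a hedged stand-in, the transition kernel below uses Gaussian (Black-Scholes) increments rather than NIG or Variance Gamma, since only the increment density would change; all parameters and grid choices are illustrative:

```python
import numpy as np
from math import erf, exp, log, sqrt

def chain_euro_call(S0, K, r, sigma, T, n_steps=50, n_states=401, width=5.0):
    """European call priced on a finite-state, time-homogeneous Markov chain
    over a log-price grid, with backward induction on the transition matrix."""
    dt = T / n_steps
    drift = (r - 0.5 * sigma**2) * dt                  # risk-neutral drift per step
    x = log(S0) + np.linspace(-width, width, n_states) * sigma * sqrt(T)
    diff = x[None, :] - x[:, None] - drift
    P = np.exp(-diff**2 / (2 * sigma**2 * dt))         # sampled increment density
    P /= P.sum(axis=1, keepdims=True)                  # row-stochastic transition matrix
    V = np.maximum(np.exp(x) - K, 0.0)                 # payoff at maturity
    for _ in range(n_steps):
        V = exp(-r * dt) * (P @ V)                     # discounted backward induction
    return float(V[np.argmin(np.abs(x - log(S0)))])    # read off the price at S0

def bs_call(S0, K, r, sigma, T):
    """Closed-form Black-Scholes price for checking the approximation."""
    d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))
    return S0 * N(d1) - K * exp(-r * T) * N(d2)

price = chain_euro_call(100.0, 100.0, 0.05, 0.2, 1.0)
# close to the Black-Scholes value bs_call(100, 100, 0.05, 0.2, 1.0)
```

American options fit the same scheme by taking the maximum of the continuation value and the exercise payoff at each backward step.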