• Title/Summary/Keyword: Markov


Prediction of Mobile Phone Menu Selection with Markov Chains (Markov Chain을 이용한 핸드폰 메뉴 선택 예측)

  • Lee, Suk Won;Myung, Rohae
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.33 no.4
    • /
    • pp.402-409
    • /
    • 2007
  • Markov chains have proven effective in predicting human behavior in areas such as web site access, multimedia educational systems, and driving environments. To extend the application areas of predicting human behavior with Markov chains, this study investigated whether Markov chains could be used to predict human behavior in selecting mobile phone menu items. Compared to the aforementioned application areas, this study uses Markov chains differently: an m-order 1-step Markov model combined with the concept of the Power Law of Learning. The results showed that human behavior in mobile phone menu selection was well fitted by the m-order 1-step Markov model, with the Power Law of Learning used to allocate history path vector weights. In other words, prediction with Markov chains was capable of reproducing users' actual menu selections.
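
The m-order, 1-step prediction idea can be sketched as follows: the last m selected menu items form the history state, and the item most strongly associated with that history is predicted next. The power-law weighting of history paths below is an assumption standing in for the paper's exact allocation scheme, and names such as `MenuPredictor` and the example menu labels are hypothetical. A minimal Python sketch:

```python
from collections import defaultdict

class MenuPredictor:
    """m-order, 1-step Markov predictor for menu selections (illustrative sketch)."""

    def __init__(self, m=2, learning_exponent=0.5):
        self.m = m                      # history length (order of the model)
        self.alpha = learning_exponent  # power-law exponent (assumed value)
        self.counts = defaultdict(lambda: defaultdict(float))

    def train(self, selection_sequence):
        # Slide over the log; each m-item history votes for the item that followed it.
        for i in range(len(selection_sequence) - self.m):
            history = tuple(selection_sequence[i:i + self.m])
            nxt = selection_sequence[i + self.m]
            # Power-law-of-learning style weight: later (more practiced) visits count more.
            weight = (i + 1) ** self.alpha
            self.counts[history][nxt] += weight

    def predict(self, recent_selections):
        history = tuple(recent_selections[-self.m:])
        candidates = self.counts.get(history)
        if not candidates:
            return None                 # unseen history: no prediction
        return max(candidates, key=candidates.get)

# Example usage with hypothetical menu item labels
log = ["Messages", "Inbox", "Messages", "Compose", "Messages", "Inbox", "Messages", "Compose"]
model = MenuPredictor(m=2)
model.train(log)
print(model.predict(["Messages", "Inbox"]))   # -> "Messages" for this toy log
```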

Average run length calculation of the EWMA control chart using the first passage time of the Markov process (Markov 과정의 최초통과시간을 이용한 지수가중 이동평균 관리도의 평균런길이의 계산)

  • Park, Changsoon
    • The Korean Journal of Applied Statistics
    • /
    • v.30 no.1
    • /
    • pp.1-12
    • /
    • 2017
  • Many stochastic processes satisfy the Markov property exactly or at least approximately. A property of interest in a Markov process is the first passage time. Since Wald's sequential analysis, approximation of the first passage time has been studied extensively, and statistical computing techniques enabled by high-speed computers have made it possible to calculate property values close to the true ones. This article introduces the exponentially weighted moving average (EWMA) control chart as an example of a Markov process and studies how to calculate the average run length, pointing out problematic issues that require caution for correct calculation. The results derived here for approximating the first passage time can be applied to any Markov process. In particular, approximating the continuous-time Markov process by a discrete-time Markov chain is useful for studying the properties of the stochastic process and makes computational approaches easy.
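
As background on the average run length (ARL) calculation, a standard Markov chain approximation discretizes the continuous EWMA statistic into a finite number of states, treats the out-of-control region as absorbing, and computes the expected first passage time into absorption. The sketch below assumes a two-sided chart on i.i.d. standard normal observations; the smoothing constant, control limit, and number of states are illustrative choices, not values taken from the paper:

```python
import numpy as np
from scipy.stats import norm

def ewma_arl(lam=0.1, h=0.5, m=101, mean=0.0):
    """Markov-chain (state-discretization) approximation of the EWMA ARL.

    lam  : EWMA smoothing constant
    h    : symmetric control limit for the EWMA statistic
    m    : number of transient states (odd, so one state sits at 0)
    mean : process mean (0 gives the in-control ARL)
    """
    w = 2 * h / m                         # width of each discretized state
    centers = -h + w * (np.arange(m) + 0.5)
    Q = np.zeros((m, m))                  # transitions among transient (in-control) states
    for i, zi in enumerate(centers):
        for j, zj in enumerate(centers):
            # P(next EWMA value lands in state j | current value is zi),
            # using Z_t = lam * X_t + (1 - lam) * Z_{t-1} with X_t ~ N(mean, 1)
            lo = (zj - w / 2 - (1 - lam) * zi) / lam
            hi = (zj + w / 2 - (1 - lam) * zi) / lam
            Q[i, j] = norm.cdf(hi, loc=mean) - norm.cdf(lo, loc=mean)
    # Expected first passage time to absorption: (I - Q) * ARL = 1
    arl = np.linalg.solve(np.eye(m) - Q, np.ones(m))
    start = m // 2                        # chart started at the center (EWMA = 0)
    return arl[start]

print(round(ewma_arl(), 1))   # in-control ARL; accuracy improves as m grows
```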

A Markov Chain Representation of Statistical Process Monitoring Procedure under an ARIMA(0,1,1) Model (ARIMA(0,1,1)모형에서 통계적 공정탐색절차의 MARKOV연쇄 표현)

  • Park, Changsoon
    • The Korean Journal of Applied Statistics
    • /
    • v.16 no.1
    • /
    • pp.71-85
    • /
    • 2003
  • In the economic design of a process control procedure, where quality is measured at certain time intervals, the procedure's properties are difficult to derive because of the discreteness of the measurement intervals. In this paper a Markov chain representation of the process monitoring procedure is developed and used to derive its properties when the process follows an ARIMA(0,1,1) model, which is designed to describe the effect of noise and special causes over the process cycle. The properties of the Markov chain depend on the transition matrix, which is determined by the control procedure and the process distribution. The derived Markov chain representation can be adapted to many different types of control procedures and process distributions by obtaining the corresponding transition matrix.
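
For readers unfamiliar with the process model, an ARIMA(0,1,1) (equivalently IMA(1,1)) series evolves as $X_t = X_{t-1} + \varepsilon_t - \theta\varepsilon_{t-1}$, and its minimum-mean-square-error one-step forecast is an EWMA of past observations with smoothing constant $1-\theta$, which is one reason EWMA-type monitoring statistics pair naturally with this model. The snippet below only simulates such a series under an assumed $\theta$ and noise variance; it does not reproduce the paper's monitoring procedure or its economic design:

```python
import numpy as np

def simulate_ima11(n=500, theta=0.7, sigma=1.0, seed=0):
    """Simulate an ARIMA(0,1,1) / IMA(1,1) series: X_t = X_{t-1} + e_t - theta * e_{t-1}."""
    rng = np.random.default_rng(seed)
    e = rng.normal(0.0, sigma, size=n + 1)
    x = np.zeros(n)
    x[0] = e[1] - theta * e[0]
    for t in range(1, n):
        x[t] = x[t - 1] + e[t + 1] - theta * e[t]
    return x

series = simulate_ima11()
# The minimum-MSE one-step forecast of an IMA(1,1) series is an EWMA of past
# observations with smoothing constant lambda = 1 - theta (a Box-Jenkins result),
# so a monitoring statistic built on such forecasts inherits a Markov structure.
```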

ON STATIONARY GAUSSIAN SECOND ORDER MARKOV PROCESSES

  • Park, W.J.;Hsu, Y.S.
    • Kyungpook Mathematical Journal
    • /
    • v.19 no.2
    • /
    • pp.249-255
    • /
    • 1979
  • In this paper we give a characterization of stationary Gaussian second-order Markov processes in terms of their covariance function $R(\tau)=E[X(t)X(t+\tau)]$ and also give some relationships among quasi-Markov, Markov, and second-order Markov processes.
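
For context, the analogous characterization in the first-order case is classical: a nondegenerate, mean-zero stationary Gaussian process is Markov if and only if its covariance function is a decaying exponential,

$$R(\tau) = R(0)\,e^{-\beta|\tau|} \quad \text{for some } \beta > 0,$$

which is the covariance of the Ornstein-Uhlenbeck process. This is background on the covariance-function viewpoint only, not the second-order result established in the paper.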

Implementation of Markov Chain: Review and New Application (관리도에서 Markov연쇄의 적용: 복습 및 새로운 응용)

  • Park, Chang-Soon
    • The Korean Journal of Applied Statistics
    • /
    • v.24 no.4
    • /
    • pp.657-676
    • /
    • 2011
  • Properties of statistical process control procedures may not be derived analytically in many cases; however, the application of a Markov chain can solve such problems. This article shows how to derive the properties of the process control procedures using the generated Markov chains when the control statistic satisfies the Markov property. Markov chain approaches that appear in the literature (such as the statistical design and economic design of the control chart as well as the variable sampling rate design) are reviewed along with the introduction of research results for application to a new control procedure and reset chart. The joint application of a Markov chain approach and analytical solutions (when available) can guarantee the correct derivation of the properties. A Markov chain approach is recommended over simulation studies due to its precise derivation of properties and short calculation times.
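
When the control statistic takes (or is discretized to) finitely many in-control states and the signal is treated as an absorbing state, the run-length properties discussed above follow directly from the transition matrix via the fundamental matrix. The 3-state transient structure below is made up for illustration; it is not a chart from the review:

```python
import numpy as np

# Q: transition probabilities among transient (in-control) states of the chart.
# Each row sums to less than 1; the remainder is the probability of signalling.
Q = np.array([
    [0.80, 0.15, 0.03],
    [0.10, 0.75, 0.10],
    [0.02, 0.10, 0.70],
])
I = np.eye(Q.shape[0])

# Fundamental matrix N = (I - Q)^-1: N[i, j] is the expected number of visits
# to transient state j before absorption, starting from state i.
N = np.linalg.inv(I - Q)

# Run-length moments from each starting state (Kemeny-Snell absorbing-chain formulas).
arl = N @ np.ones(Q.shape[0])                  # E[RL | start in state i]
arl2 = (2 * N - I) @ N @ np.ones(Q.shape[0])   # E[RL^2 | start in state i]
sdrl = np.sqrt(arl2 - arl**2)                  # standard deviation of the run length

print("ARL:", arl.round(2))
print("SDRL:", sdrl.round(2))
```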

Implementation of Markov chain: Review and new application (관리도에서 Markov연쇄의 적용: 복습 및 새로운 응용)

  • Park, Changsoon
    • The Korean Journal of Applied Statistics
    • /
    • v.34 no.4
    • /
    • pp.537-556
    • /
    • 2021
  • Properties of statistical process control procedures may not be derived analytically in many cases; however, the application of a Markov chain can solve such problems. This article shows how to derive the properties of the process control procedures using the generated Markov chains when the control statistic satisfies the Markov property. Markov chain approaches that appear in the literature (such as the statistical design and economic design of the control chart as well as the variable sampling rate design) are reviewed along with the introduction of research results for application to a new control procedure and reset chart. The joint application of a Markov chain approach and analytical solutions (when available) can guarantee the correct derivation of the properties. A Markov chain approach is recommended over simulation studies due to its precise derivation of properties and short calculation times.

Development of Daily Rainfall Simulation Model Based on Homogeneous Hidden Markov Chain (동질성 Hidden Markov Chain 모형을 이용한 일강수량 모의기법 개발)

  • Kwon, Hyun-Han;Kim, Tae Jeong;Hwang, Seok-Hwan;Kim, Tae-Woong
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.33 no.5
    • /
    • pp.1861-1870
    • /
    • 2013
  • Increased hydrological variability driven by climate change has been widely acknowledged over the past decades. In this regard, rainfall simulation techniques are being applied in many countries to account for the increased variability. This study proposed a homogeneous hidden Markov chain (HMM) model designed to recognize rather complex rainfall patterns through discrete hidden states, with the underlying distribution characteristics captured by mixture probability density functions. The proposed approach was applied to the Seoul and Jeonju stations to verify the model's performance. Statistical moments (e.g. mean, variance, skewness, and kurtosis) derived from daily and seasonal rainfall were compared with observations. The proposed HMM showed better performance in reproducing the underlying distribution characteristics; in particular, it was much better than the existing Markov chain model in reproducing extremes. The proposed HMM could therefore be used to generate inputs for evaluating long-term runoff and design floods.
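
A minimal sketch of the simulation side of such a model: hidden states represent daily rainfall regimes, transitions follow a homogeneous Markov chain, and wet-state amounts are drawn from a mixture emission distribution. The two-state structure, exponential mixture emissions, and all parameter values below are illustrative assumptions, not the fitted values from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hidden states: 0 = dry regime, 1 = wet regime (illustrative two-state chain).
P = np.array([[0.85, 0.15],     # homogeneous transition matrix
              [0.40, 0.60]])

def emit(state):
    """Daily rainfall amount (mm) given the hidden state.

    The wet state uses a two-component exponential mixture so that the model
    can produce both moderate days and heavy-tailed extremes.
    """
    if state == 0:
        return 0.0
    if rng.random() < 0.8:
        return rng.exponential(5.0)     # ordinary wet day
    return rng.exponential(30.0)        # heavy-rain component

def simulate(n_days=365, start_state=0):
    state, series = start_state, []
    for _ in range(n_days):
        state = rng.choice(2, p=P[state])
        series.append(emit(state))
    return np.array(series)

rain = simulate()
print("wet-day fraction:", round(float((rain > 0).mean()), 2))
print("annual total (mm):", round(float(rain.sum()), 1))
```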

Parametric Sensitivity Analysis of Markov Process Based RAM Model (Markov Process 기반 RAM 모델에 대한 파라미터 민감도 분석)

  • Kim, Yeong Seok;Hur, Jang Wook
    • Journal of the Korean Society of Systems Engineering
    • /
    • v.14 no.1
    • /
    • pp.44-51
    • /
    • 2018
  • The purpose of RAM analysis in weapon systems is to reduce life cycle costs and to improve combat readiness by meeting RAM target values. Using a Markov process based model (MPS, Markov Process Simulation) developed for RAM analysis, we analyzed how sensitive the operational performance of the system is to the RAM parameters. Specifically, the Markov process-based RAM analysis model was applied to the 81mm mortar to analyze the sensitivity to the parameters MTBF, MTTR, and ALDT. The time required for the model to reach steady state is about 15,000 hours (roughly two years), and the sensitivity is highest for ALDT. To improve combat readiness, continuous improvement in ALDT is needed.
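
The role of MTBF, MTTR, and ALDT in such a model can be illustrated with a two-state (up/down) Markov availability model, whose steady-state operational availability reduces to MTBF / (MTBF + MTTR + ALDT). The parameter values below are placeholders, not the 81mm mortar figures from the study; the loop simply perturbs one parameter at a time:

```python
def steady_state_availability(mtbf, mttr, aldt):
    """Two-state up/down Markov model: failure rate 1/MTBF, restoration rate 1/(MTTR+ALDT).

    The steady-state probability of the 'up' state reduces to the familiar
    operational-availability formula Ao = MTBF / (MTBF + MTTR + ALDT).
    """
    return mtbf / (mtbf + mttr + aldt)

# Placeholder baseline (hours) -- illustrative only.
base = {"mtbf": 1500.0, "mttr": 8.0, "aldt": 72.0}
a0 = steady_state_availability(**base)
print(f"baseline availability: {a0:.4f}")

# One-at-a-time sensitivity: a 10% improvement in each parameter.
for name in ("mtbf", "mttr", "aldt"):
    perturbed = dict(base)
    perturbed[name] *= 1.1 if name == "mtbf" else 0.9   # 'improvement' direction
    a1 = steady_state_availability(**perturbed)
    print(f"10% improvement in {name.upper()}: availability {a1:.4f} (delta {a1 - a0:+.4f})")
```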

Korean Phoneme Recognition Using duration-dependent 3-State Hidden Markov Model (음소길이를 고려한 3-State Hidden Markov Model 에 의한 한국어 음소인식)

  • Yoo, H.-C.;Lee, H.-J.;Park, B.-C.
    • The Journal of the Acoustical Society of Korea
    • /
    • v.8 no.1
    • /
    • pp.81-87
    • /
    • 1989
  • This paper describes a method for modeling Korean phonemes. Hidden Markov models (HMMs) may be viewed as an effective technique for modeling the inherent nonstationarity of speech signals. We propose a 3-state phoneme model to represent the sequentially changing characteristics of phonemes, i.e., transition-to-stationary-to-transition. We also show that phoneme duration is an important factor affecting recognition accuracy, and that an improvement in recognition rate can be obtained by using duration-dependent 3-state hidden Markov models.
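
A left-to-right 3-state phoneme HMM of this kind can be sketched as follows; the Viterbi decoder below uses 1-D Gaussian emissions and marks where a duration-dependent score would enter. Transition values, emission parameters, and the toy feature sequence are all invented for illustration, not taken from the paper:

```python
import numpy as np
from scipy.stats import norm

# Left-to-right 3-state phoneme model: transition -> stationary -> transition.
logA = np.log(np.array([
    [0.6, 0.4, 0.0],    # state 0 can only stay or move to state 1
    [0.0, 0.7, 0.3],    # state 1 can only stay or move to state 2
    [0.0, 0.0, 1.0],    # state 2 holds until the phoneme ends
]) + 1e-12)
means, stds = np.array([-1.0, 0.0, 1.0]), np.array([0.5, 0.3, 0.5])  # Gaussian emissions

def viterbi(obs):
    """Most likely state path for a 1-D observation sequence (illustrative)."""
    T, S = len(obs), 3
    delta = np.full((T, S), -np.inf)
    psi = np.zeros((T, S), dtype=int)
    delta[0, 0] = norm.logpdf(obs[0], means[0], stds[0])   # phoneme starts in state 0
    for t in range(1, T):
        for j in range(S):
            scores = delta[t - 1] + logA[:, j]
            # A duration-dependent model would add log P(duration | state) here,
            # favouring paths whose state occupancies match typical phoneme lengths.
            psi[t, j] = int(np.argmax(scores))
            delta[t, j] = scores[psi[t, j]] + norm.logpdf(obs[t], means[j], stds[j])
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

obs = np.array([-1.1, -0.9, 0.1, 0.0, -0.1, 0.9, 1.2])    # toy feature trajectory
print(viterbi(obs))    # expected to pass through states 0 -> 1 -> 2
```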

Bayesian Analysis of Binary Non-homogeneous Markov Chain with Two Different Time Dependent Structures

  • Sung, Min-Je
    • Management Science and Financial Engineering
    • /
    • v.12 no.2
    • /
    • pp.19-35
    • /
    • 2006
  • We use a hierarchical Bayesian approach to describe the transition probabilities of a binary nonhomogeneous Markov chain. The Markov chain is used to describe the transition behavior of emotionally disturbed children in a treatment program. The effects of covariates on the transition probabilities are assessed using a logit link function. To describe the time evolution of the transition probabilities, we consider two modeling strategies: the first is based on the concept of exchangeability, whereas the second is based on a first-order Markov property. The deviance information criterion (DIC) is used to compare the models with the two different time-dependent structures. Inference is carried out using the Markov chain Monte Carlo technique, and the developed methodology is applied to real data.
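
The logit-link part of such a model can be sketched directly: each time-varying transition probability is the logistic transform of a linear predictor in the covariates, and the binary chain is then simulated (or its likelihood evaluated) from those probabilities. Coefficient values and the covariate series below are invented; posterior sampling over the coefficients (e.g. by MCMC) is not shown:

```python
import numpy as np

rng = np.random.default_rng(7)

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative coefficients for the two transition types of a binary chain:
# P(X_t = 1 | X_{t-1} = 0) and P(X_t = 1 | X_{t-1} = 1), each with its own
# intercept and covariate effect (a nonhomogeneous, covariate-driven chain).
beta = {0: (-1.5, 0.8),    # (intercept, slope) for transitions out of state 0
        1: ( 0.5, 0.4)}    # (intercept, slope) for transitions out of state 1

def simulate_chain(covariates, x0=0):
    x, path = x0, [x0]
    for z in covariates:
        b0, b1 = beta[x]
        p_to_one = logistic(b0 + b1 * z)   # logit link: time-varying transition probability
        x = int(rng.random() < p_to_one)
        path.append(x)
    return path

z = rng.normal(size=20)                    # hypothetical covariate measured at each step
print(simulate_chain(z))
```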