• Title/Summary/Keyword: Markov Process


Analysis of an Inspection Process Allowing Consecutive Two-time Testing of Products Using Markov Chains (연속되는 이중 검사를 허용하는 제품품질검사 프로세스에 대한 마르코프 체인을 이용한 분석)

  • Ko, Jeong-Han
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.13 no.6
    • /
    • pp.2452-2457
    • /
    • 2012
  • When a quality inspection process rejects a product unit, consecutive repeated inspections are sometimes conducted on the rejected unit to reduce the possibility of a false reject. This paper analyzes a special inspection process that allows up to two consecutive tests for each product in order to decrease type I inspection errors. The study uses a Markov chain to model the steps of the inspection process and a product unit's quality states during inspection. Historical inspection results from a company serve as the data for the Markov chain model. Using this model and data, the study analyzes the effect of the special inspection rule on the proportions of final quality levels and the scrap rate. The results demonstrate that an inspection process permitting double testing can help reduce unnecessary rejects and consequently decrease material and production costs.
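The double-testing rule above can be sketched as an absorbing Markov chain: a rejected unit gets exactly one retest before being scrapped. This is a minimal illustration, not the paper's fitted model; the single-test pass probability `p_pass` is an assumed value rather than the company data used in the study.

```python
import numpy as np

# States: 0 = first test, 1 = retest (after a reject),
#         2 = accepted (absorbing), 3 = scrapped (absorbing).
p_pass = 0.90  # assumed probability a good unit passes one test

P = np.array([
    [0.0, 1 - p_pass, p_pass,     0.0],         # first test
    [0.0, 0.0,        p_pass,     1 - p_pass],  # retest: second reject -> scrap
    [0.0, 0.0,        1.0,        0.0],         # accepted
    [0.0, 0.0,        0.0,        1.0],         # scrapped
])

# Absorption probabilities B = (I - Q)^{-1} R over the transient states {0, 1}.
Q, R = P[:2, :2], P[:2, 2:]
B = np.linalg.solve(np.eye(2) - Q, R)
accept, scrap = B[0]
print(f"accept: {accept:.4f}, scrap: {scrap:.4f}")  # scrap rate = (1 - p_pass)^2
```

With a single test the scrap rate would be `1 - p_pass`; allowing one retest drives it down to `(1 - p_pass)**2`, which is the cost-saving effect the abstract reports.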

A Study on System Availability Analysis Utilizing Markov Process (마르코프 프로세스를 활용한 시스템 가용도 분석 방법 고찰)

  • Kim, Bohyeon;Kim, Seongkyung;Pagulayan, Dhominick;Hur, Jangwook
    • Journal of Applied Reliability
    • /
    • v.16 no.4
    • /
    • pp.295-304
    • /
    • 2016
  • Purpose: This paper presents an application of the Markov process to reliability and availability analysis. For the analysis, we set up a specific case of a tablet PC and its usage scenario, including spares and maintenance and repair processes. Methods: Different configurations of the tablet PC, as well as their functions, are defined. The system configuration and the calculated failure rates of components are modeled in Windchill Quality Solution. Two models, one without a spare and one with a spare, are created and compared using the Markov process. Matlab numerical analysis is used to simulate and show the change of state over time. Availability of the system is computed by determining the time the system stays in each state. Results: The mission availability and the steady-state availability are compared, and the system with spares shows improved availability over the system without spares. The simulation shows that considering spares reduces system downtime, which results in greater availability. Conclusion: There are many techniques and methods for reliability and availability analysis, and most rest on time-independent assumptions. The Markov process, despite its steady-state and ergodic properties, supports time-dependent analysis over any given time period.
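The spare-versus-no-spare comparison can be sketched with two small continuous-time Markov chains and their steady-state availabilities. The failure rate `lam` and repair rate `mu` below are invented for illustration; the paper derives its rates from Windchill Quality Solution, not from these numbers.

```python
import numpy as np

lam, mu = 0.01, 0.5  # assumed failure and repair rates (per hour)

def steady_state(Q):
    """Solve pi @ Q = 0 with sum(pi) = 1 for an irreducible CTMC generator."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])
    b = np.zeros(n + 1); b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# No spare: state 0 = up, state 1 = down (in repair).
Q1 = np.array([[-lam, lam],
               [mu, -mu]])
A_no_spare = steady_state(Q1)[0]

# One spare: 0 = up with spare on hand, 1 = up on spare (repair under way),
#            2 = down (both units failed).
Q2 = np.array([[-lam, lam, 0.0],
               [mu, -(lam + mu), lam],
               [0.0, mu, -mu]])
pi = steady_state(Q2)
A_spare = pi[0] + pi[1]

print(f"availability without spare: {A_no_spare:.6f}")
print(f"availability with spare:    {A_spare:.6f}")
```

The no-spare model reduces to the textbook result `mu / (lam + mu)`, and the spare model's availability is strictly higher, which matches the comparison reported in the abstract.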

A Probabilistic Model of Damage Propagation based on the Markov Process (마코프 프로세스에 기반한 확률적 피해 파급 모델)

  • Kim Young-Gab;Baek Young-Kyo;In Hoh-Peter;Baik Doo-Kwon
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.33 no.8
    • /
    • pp.524-535
    • /
    • 2006
  • With the rapid development of Internet technology, business management in organizations and enterprises depends largely on Internet-based technology. Furthermore, as the dependency and cohesiveness of networks in communication facilities increase, cyber attacks against vulnerable resources in information systems have also increased. Hence, to protect private information and computer resources, research on damage propagation is required. However, the traditional models proposed so far present only mechanisms for risk management, or can be applied only to specific threats such as viruses or worms. Therefore, we propose a probabilistic model of damage propagation based on the Markov process, which can be applied to diverse threats in information systems. Using the proposed model, we can predict the occurrence probability and occurrence frequency of each threat in the entire system.
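One way to read "occurrence probability per threat" off such a model is the stationary distribution of a Markov chain over threat states. The states and transition probabilities below are invented placeholders, not the paper's model.

```python
import numpy as np

# Hypothetical threat-state chain (all numbers are made up for illustration).
states = ["normal", "probe", "intrusion", "recovery"]
P = np.array([
    [0.90, 0.07, 0.03, 0.00],
    [0.60, 0.20, 0.20, 0.00],
    [0.00, 0.00, 0.30, 0.70],
    [0.80, 0.00, 0.00, 0.20],
])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

for s, p in zip(states, pi):
    print(f"long-run probability of {s}: {p:.4f}")
```

The long-run fraction of time spent in each state is a proxy for occurrence frequency; multiplying by an observation horizon gives an expected count of visits to each threat state.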

Sensitivity of Conditions for Lumping Finite Markov Chains

  • Suh, Moon-Taek
    • Journal of the military operations research society of Korea
    • /
    • v.11 no.1
    • /
    • pp.111-129
    • /
    • 1985
  • Markov chains with large transition probability matrices occur in many applications, such as manpower models. Under certain conditions the state space of a stationary discrete-parameter finite Markov chain may be partitioned into subsets, each of which may be treated as a single state of a smaller chain that retains the Markov property. Such a chain is said to be 'lumpable', and the resulting lumped chain is a special case of more general functions of Markov chains. There are several reasons why one might wish to lump. First, there may be analytical benefits, including the relative simplicity of the reduced model and the development of a new model that inherits known or assumed strong properties of the original model (the Markov property). Second, there may be statistical benefits, such as increased robustness of the smaller chain as well as improved estimates of transition probabilities. Finally, the identification of lumps may provide new insights about the process under investigation.

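The standard lumpability condition behind this abstract can be checked directly: a partition is lumpable iff every state in a block has the same total transition probability into each block. The chain and partition below are made-up examples, not taken from the paper.

```python
import numpy as np

P = np.array([
    [0.1, 0.3, 0.3, 0.3],
    [0.2, 0.2, 0.3, 0.3],
    [0.5, 0.1, 0.2, 0.2],
    [0.4, 0.2, 0.2, 0.2],
])
partition = [[0, 1], [2, 3]]  # candidate lumping into two superstates

def is_lumpable(P, partition, tol=1e-12):
    for block in partition:
        # mass from each state in `block` into every block of the partition
        masses = np.array([[P[i, b].sum() for b in partition] for i in block])
        if not np.allclose(masses, masses[0], atol=tol):
            return False
    return True

def lump(P, partition):
    """Build the lumped transition matrix (valid only when is_lumpable holds)."""
    k = len(partition)
    return np.array([[P[partition[a][0], partition[b]].sum() for b in range(k)]
                     for a in range(k)])

print(is_lumpable(P, partition))
print(lump(P, partition))
```

For this example both states in each block put mass 0.4/0.6 (respectively 0.6/0.4) into the two blocks, so the partition is lumpable and the reduced chain is a 2x2 stochastic matrix.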

FEYNMAN-KAC SEMIGROUPS, MARTINGALES AND WAVE OPERATORS

  • Van Casteren, Jan A.
    • Journal of the Korean Mathematical Society
    • /
    • v.38 no.2
    • /
    • pp.227-274
    • /
    • 2001
  • In this paper we intend to discuss the following topics: (1) Notation, generalities, Markov processes. The close relationship between (generators of) Markov processes and the martingale problem is exhibited. A link between the Korovkin property and generators of Feller semigroups is established. (2) Feynman-Kac semigroups: 0-order regular perturbations, pinned Markov measures. A basic representation via distributions of Markov processes is depicted. (3) Dirichlet semigroups: 0-order singular perturbations, harmonic functions, multiplicative functionals. Here a representation theorem for solutions to the heat equation is given in terms of the distributions of the underlying Markov process and a suitable stopping time. (4) Sets of finite capacity, wave operators, and related results. In this section a number of results are presented concerning the completeness of scattering systems (and its spectral consequences). (5) Some (abstract) problems related to Neumann semigroups: first-order perturbations. In this section some rather abstract problems are presented, which lie on the borderline between first-order perturbations together with their boundary limits (Neumann-type boundary conditions) and reflected Markov processes.


A Study on Markov Chains Applied to informetrics (마코프모형의 계량정보학적 응용연구)

  • Moon, Kyung-Hwa
    • Journal of Information Management
    • /
    • v.30 no.2
    • /
    • pp.31-52
    • /
    • 1999
  • This paper studies two experimental cases that utilize the stochastic theory of Markov chains, which is used for forecasting the future, and analyzes recent trends in related studies. Since Markov chains have so far seen little application to informetrics in Korea, further study of Markov chains and their wider adoption are proposed.


A Simulation Model for the Intermittent Hydrologic Process (II) - Markov Chain and Continuous Probability Distribution - (간헐(間歇) 수문과정(水文過程)의 모의발생(模擬發生) 모형(模型)(II) - Markov 연쇄와 연속확률분포(連續確率分布) -)

  • Lee, Jae Joon;Lee, Jung Sik
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.14 no.3
    • /
    • pp.523-534
    • /
    • 1994
  • The purpose of this study is to develop a computer simulation model that produces precipitation patterns from a stochastic model. In paper (I) of this study, the alternating renewal process (ARP) was used for the daily precipitation series. In this paper (II), stochastic simulation models for the daily precipitation series are developed by combining a Markov chain for the precipitation occurrence process with a continuous probability distribution for the precipitation amounts on wet days. The precipitation occurrence is determined by a first-order Markov chain with two states (dry and wet). The amounts of precipitation, given that precipitation has occurred, are described by Gamma, Pearson Type-III, Extremal Type-III, and three-parameter Weibull distributions. Since the daily precipitation series shows seasonal variation, models are identified separately for each month of the year. To illustrate the application of the simulation models, daily precipitation data were taken from records at seven locations in the Nakdong and Seomjin river basins. Simulated data were similar to actual data in terms of the distribution of wet and dry spells, seasonal variability, and precipitation amounts.

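The occurrence-plus-amount scheme described above can be sketched in a few lines: a two-state (dry/wet) first-order Markov chain decides whether a day is wet, and a Gamma draw supplies the amount. All parameter values here are invented; the paper fits separate parameters per month and station and also considers Pearson Type-III, Extremal Type-III, and Weibull amounts.

```python
import numpy as np

rng = np.random.default_rng(42)

p_wd = 0.30  # assumed P(wet tomorrow | dry today)
p_ww = 0.60  # assumed P(wet tomorrow | wet today)
shape, scale = 0.8, 12.0  # assumed Gamma parameters for wet-day rainfall (mm)

def simulate(n_days, wet=False):
    """Simulate a daily precipitation series (0.0 on dry days)."""
    series = np.zeros(n_days)
    for t in range(n_days):
        p = p_ww if wet else p_wd
        wet = rng.random() < p
        if wet:
            series[t] = rng.gamma(shape, scale)
    return series

rain = simulate(365)
print(f"wet days: {(rain > 0).sum()}, total rainfall: {rain.sum():.1f} mm")
```

The long-run wet-day fraction of this chain is `p_wd / (p_wd + 1 - p_ww)`, which is the kind of statistic one would compare against the observed record when validating the simulated series.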

First-Passage Time Distribution of Discrete Time Stochastic Process with 0-state

  • Park, Young-Sool
    • Journal of the Korean Data and Information Science Society
    • /
    • v.8 no.2
    • /
    • pp.119-125
    • /
    • 1997
  • We usually handle stochastic processes of independent and identically distributed random variables, but in real life random variables are often dependent on one another. In this paper, we therefore consider a new process that does not satisfy the Markov property. We investigate its probability mass functions and study the distribution of the first-passage time. We also find the average frequency of consecutive successes from time 0 to n.

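As a baseline for the first-passage quantities discussed above, the first-passage time distribution of an ordinary discrete-time Markov chain can be computed by making the target state absorbing and tracking the growth of its absorption probability. The two-state chain below is an assumed example (the paper itself studies a process without the Markov property).

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])  # assumed two-state transition matrix

def first_passage_pmf(P, start, target, n_max):
    """P(first visit to `target` occurs at step n), for n = 1..n_max."""
    # Make `target` absorbing; the pmf is the per-step increase in
    # absorption probability.
    Q = P.copy()
    Q[target] = 0.0
    Q[target, target] = 1.0
    pmf, dist = [], np.eye(P.shape[0])[start]
    prev = dist[target]
    for _ in range(n_max):
        dist = dist @ Q
        pmf.append(dist[target] - prev)
        prev = dist[target]
    return np.array(pmf)

pmf = first_passage_pmf(P, start=0, target=1, n_max=50)
print(pmf[:3])    # geometric here: 0.3, 0.7 * 0.3, 0.7**2 * 0.3
print(pmf.sum())  # approaches 1 when `target` is reached almost surely
```

For this two-state example the first-passage time from state 0 to state 1 is geometric, since the chain must stay in state 0 until the single jump occurs.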

A Study on the Intelligent Load Management System Based on Queue with Diffusion Markov Process Model (확산 Markov 프로세스 모델을 이용한 Queueing System 기반 지능 부하관리에 관한 연구)

  • Kim, Kyung-Dong;Kim, Seok-Hyun;Lee, Seung-Chul
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.58 no.5
    • /
    • pp.891-897
    • /
    • 2009
  • This paper presents a novel load management technique that can lower the peak demand caused by packaged air-conditioner loads in a large apartment complex. An intelligent hierarchical load management system composed of a Central Intelligent Management System (CIMS) and multiple Local Intelligent Management Systems (LIMS) is proposed to implement the technique. Once the required amount of power reduction is set, CIMS issues tokens, each of which a LIMS can use as a right to turn on an air conditioner. CIMS creates and maintains a queue for fair and proper allocation of the tokens among the LIMS requesting them. By adjusting the number of tokens and the queue management policies, the desired power reduction can be achieved smoothly. The Markov birth-and-death process and the balance equations, utilizing the diffusion model, are employed to evaluate queue performance during the transient periods before the static balances among the states are reached. The proposed technique is tested using summer load data from a large apartment complex and gives promising results, demonstrating its usability in load management while minimizing customer inconvenience.
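The static balance equations mentioned above have a closed form for a finite birth-death token queue. This sketch uses assumed request and token-release rates and a plain birth-death chain; it does not reproduce the paper's diffusion-model treatment of the transient period.

```python
# Finite birth-death queue: LIMS token requests arrive at rate lam,
# tokens are released at rate mu, and at most K requests can wait.
lam, mu, K = 2.0, 3.0, 10  # assumed rates and queue capacity
rho = lam / mu

# Detailed balance for a birth-death chain gives pi_n = rho**n * pi_0.
pi0 = 1.0 / sum(rho ** n for n in range(K + 1))
pi = [pi0 * rho ** n for n in range(K + 1)]

mean_queue = sum(n * p for n, p in enumerate(pi))
p_block = pi[K]  # probability a new request finds the queue full
print(f"mean queue length: {mean_queue:.3f}, blocking prob: {p_block:.4f}")
```

Raising the number of tokens corresponds to raising `mu` (or lowering the effective `rho`), which shrinks both the mean queue length and the blocking probability: the trade-off the queue management policy tunes.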

An Energy-Efficient Transmission Strategy for Wireless Sensor Networks (무선 센서 네트워크에서 에너지 효율적인 전송 방안에 관한 연구)

  • Phan, Van Ca;Kim, Jeong-Geun
    • Journal of Internet Computing and Services
    • /
    • v.10 no.3
    • /
    • pp.85-94
    • /
    • 2009
  • In this work we propose an energy-efficient transmission strategy for wireless sensor networks that operate in a strictly energy-constrained environment. Our transmission algorithm consists of two components: a binary-decision-based transmission and a channel-aware backoff adjustment. In the binary-decision-based transmission, we obtain the optimal threshold for successful transmission via a Markov decision process (MDP) formulation. The channel-aware backoff adjustment, the second component of our proposal, is introduced to give sensor nodes with better channel conditions higher transmission priority. Extensive simulations are performed to verify the performance of our proposal over fading wireless channels.

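The transmit-or-defer decision described above can be illustrated with a toy MDP solved by value iteration: in each channel state the node either transmits (paying an energy cost, with state-dependent success probability) or backs off. All numbers are invented; the paper derives its optimal threshold from its own MDP formulation, not from this sketch.

```python
import numpy as np

states = ["bad", "good"]
P = np.array([[0.8, 0.2],       # assumed channel-state transition matrix
              [0.3, 0.7]])
succ = np.array([0.2, 0.9])     # assumed success probability per state
cost, reward, gamma = 0.5, 1.0, 0.95  # energy cost, success reward, discount

V = np.zeros(2)
for _ in range(1000):  # value iteration
    q_tx = succ * reward - cost + gamma * (P @ V)  # transmit
    q_wait = gamma * (P @ V)                       # back off
    V_new = np.maximum(q_tx, q_wait)
    if np.max(np.abs(V_new - V)) < 1e-10:
        break
    V = V_new

policy = np.where(q_tx >= q_wait, "transmit", "backoff")
print(dict(zip(states, policy)))
```

With these numbers the optimal policy is a threshold rule: transmit only when the expected success reward exceeds the energy cost, i.e. in the good channel state; nodes seeing a bad channel defer, which is the priority effect the backoff adjustment exploits.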