• Title/Summary/Keyword: Poisson sampling

Marginal Likelihoods for Bayesian Poisson Regression Models

  • Kim, Hyun-Joong;Balgobin Nandram;Kim, Seong-Jun;Choi, Il-Su;Ahn, Yun-Kee;Kim, Chul-Eung
    • Communications for Statistical Applications and Methods
    • /
    • v.11 no.2
    • /
    • pp.381-397
    • /
    • 2004
  • The marginal likelihood has become an important tool for model selection in Bayesian analysis because it can be used to rank models. We discuss the marginal likelihood for Poisson regression models that are potentially useful in small area estimation. Computation in these models is intensive and requires Markov chain Monte Carlo (MCMC) methods. Using importance sampling and multivariate density estimation, we demonstrate how the marginal likelihood can be computed from the output of an MCMC sampler.
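
A minimal sketch of the kind of computation described above, assuming a Poisson regression with a log link and independent normal priors (both assumptions, not taken from the paper): a multivariate normal importance density is fitted to MCMC output and used to estimate the log marginal likelihood.

```python
# Sketch only: importance-sampling estimate of the log marginal likelihood
# for a Poisson regression, using a multivariate normal fitted to posterior
# MCMC draws as the importance density.  Prior and link are assumptions.
import numpy as np
from scipy.stats import multivariate_normal, norm, poisson

def log_lik(beta, X, y):
    """Poisson regression log-likelihood with a log link."""
    mu = np.exp(X @ beta)
    return poisson.logpmf(y, mu).sum()

def log_prior(beta, sd=10.0):
    """Independent N(0, sd^2) priors on the coefficients (an assumption)."""
    return norm.logpdf(beta, 0.0, sd).sum()

def log_marginal_likelihood_is(mcmc_draws, X, y, n_is=5000, seed=0):
    """Estimate log m(y) from an (S, p) array of posterior draws of beta."""
    rng = np.random.default_rng(seed)
    mean = mcmc_draws.mean(axis=0)
    cov = np.cov(mcmc_draws, rowvar=False)
    q = multivariate_normal(mean=mean, cov=cov)          # importance density
    draws = np.atleast_2d(q.rvs(size=n_is, random_state=rng))
    log_w = np.array([log_lik(b, X, y) + log_prior(b) - q.logpdf(b)
                      for b in draws])
    # Log-sum-exp average of the importance weights gives log m(y).
    return np.logaddexp.reduce(log_w) - np.log(len(log_w))
```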

Resolution Improvement of the Positron Computerized Tomography with a New Positron Camera Tomographic System (분해능 향상을 위한 새로운 양전자 단층 촬영기의 제안)

  • Hong, Ki-Sang;Ra, Jong-Beom
    • Journal of the Korean Institute of Telematics and Electronics
    • /
    • v.16 no.6
    • /
    • pp.22-30
    • /
    • 1979
  • A new circular-ring positron camera tomographic system, termed the "Oscillatory Dichotomic Ring" system, is proposed and its performance is simulated. It is basically a circular ring system composed of two half rings, with a scanning capability that allows arbitrary sampling intervals to be obtained. Since finer sampling means poorer photon statistics, simulations are made of various signal-dependent statistical noise effects, of ray sampling and arrangement, and of related artifacts peculiar to the proposed Dichotomic Ring system.
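
A small numerical illustration of the trade-off stated in the abstract ("finer sampling means poorer photon statistics"), under the assumption of a fixed total photon budget per view; the numbers are arbitrary and not taken from the paper's simulations.

```python
# Illustrative only: with a fixed photon budget, finer ray sampling leaves
# fewer counts per bin, so the relative Poisson noise per bin grows.
import numpy as np

rng = np.random.default_rng(1)
total_counts = 1.0e6                      # assumed photons per view

for n_bins in (64, 128, 256, 512):
    mean_per_bin = total_counts / n_bins
    counts = rng.poisson(mean_per_bin, size=n_bins)
    rel_noise = counts.std() / counts.mean()   # roughly 1/sqrt(mean_per_bin)
    print(f"{n_bins:4d} bins: mean {mean_per_bin:8.0f} counts, "
          f"relative noise {rel_noise:.4f}")
```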

A Modified Single Sampling Plan for the Inspection of Attribute Quality Characteristics

  • Subramani, J.;Balamurali, S.
    • Industrial Engineering and Management Systems
    • /
    • v.15 no.1
    • /
    • pp.41-48
    • /
    • 2016
  • In this manuscript, a modified single sampling plan is proposed for the inspection of products in which the nonconforming items can be classified into two categories, critical and non-critical, and is explained with the help of an industrial example. The operating procedure of the plan is given, and performance measures such as the probability of acceptance, average sample number, average total inspection, and average outgoing quality are derived. The optimal parameters yielding the minimum sample size are determined. The efficiency of the proposed plan over the conventional single sampling plan is also discussed. Extensive tables for selecting a modified single sampling plan based on AQL and LQL are provided for both the binomial and Poisson distributions and are explained with the help of industrial data.
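
The sketch below computes the standard performance measures named in the abstract for a conventional single sampling plan under the Poisson model; the plan parameters (n, c, lot size) are arbitrary examples, and the paper's modified plan is not reproduced here.

```python
# Conventional single sampling plan (n, c) under the Poisson model; the
# modified plan of the paper is not implemented here.
from scipy.stats import poisson

def oc_measures(p, n, c, N):
    """Probability of acceptance, AOQ and ATI for a plan (n, c) applied to
    lots of size N with incoming fraction nonconforming p."""
    pa = poisson.cdf(c, n * p)          # accept if <= c nonconforming found
    aoq = p * pa * (N - n) / N          # average outgoing quality
    ati = n + (1.0 - pa) * (N - n)      # average total inspection
    return pa, aoq, ati

# Example: n = 80, c = 2, lot size N = 2000, at 1% incoming nonconforming.
print(oc_measures(0.01, n=80, c=2, N=2000))
```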

On Some Distributions Generated by Riff-Shuffle Sampling

  • Son M.S.;Hamdy H.I.
    • International Journal of Contents
    • /
    • v.2 no.2
    • /
    • pp.17-24
    • /
    • 2006
  • The work presented in this paper is divided into two parts. The first part presents finite urn problems that generate truncated negative binomial random variables. Some combinatorial identities arising from negative binomial sampling and truncated negative binomial sampling are established; these identities serve important roles when we deal with these distributions and their characteristics. Other important results, including cumulants and moments of the distributions, are given in somewhat simple forms. In the second part, the distributions of the maximum of two chi-square variables and the distributions of the maximum of correlated F-variables are derived within the negative binomial sampling scheme. Although multinomial theory applied to order statistics and standard transformation techniques can be used to derive these distributions, the negative binomial sampling approach provides more information and deeper insight into the nature of the relationship between the sampling vehicle and the probability distributions of these functions of chi-square variables. We also provide an algorithm to compute the percentage points of these distributions. We supplement our findings with exact, simple computational methods in which no interpolation is involved.
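
One common formulation of riff-shuffle sampling (stated here as an assumption, since the paper's exact urn scheme is not reproduced) draws Bernoulli trials until either n successes or n failures have occurred; the Monte Carlo sketch below checks the support and mean of the resulting trial count.

```python
# Monte Carlo sketch of riff-shuffle sampling under the assumed scheme:
# Bernoulli(p) trials until either n successes or n failures occur.
import numpy as np

def riff_shuffle_trials(n, p, rng):
    """Number of Bernoulli(p) trials until n successes or n failures occur."""
    successes = failures = trials = 0
    while successes < n and failures < n:
        trials += 1
        if rng.random() < p:
            successes += 1
        else:
            failures += 1
    return trials

rng = np.random.default_rng(2)
sample = np.array([riff_shuffle_trials(5, 0.4, rng) for _ in range(20000)])
print(sample.min(), sample.max(), sample.mean())   # support is n .. 2n - 1
```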

The Role of Negative Binomial Sampling In Determining the Distribution of Minimum Chi-Square

  • Hamdy H.I.;Bentil Daniel E.;Son M.S.
    • International Journal of Contents
    • /
    • v.3 no.1
    • /
    • pp.1-8
    • /
    • 2007
  • The distribution of the minimum correlated F-variable arises in many applied statistical problems, including simultaneous analysis of variance (SANOVA), equality of variances, selection and ranking of populations, and reliability analysis. In this paper, a negative binomial sampling technique is employed to derive the distributions of the minimum of chi-square variables and hence the distributions of the minimum correlated F-variables. The work presented in this paper is divided into two parts. The first part is devoted to developing some combinatorial identities arising from negative binomial sampling; these identities are constructed and justified to serve important purposes when we deal with these distributions or their characteristics. Other important results, including cumulants and moments of these distributions, are also given in somewhat simple forms. In the second part, the distributions of the minimum of chi-square variables, and hence the distributions of the minimum correlated F-variables, are derived within the negative binomial sampling framework. Although multinomial theory applied to order statistics and standard transformation techniques can be used to derive these distributions, the negative binomial sampling approach provides more information regarding the nature of the relationship between the sampling vehicle and the probability distributions of these functions of chi-square variables. We also provide an algorithm to compute the percentage points of the distributions. The computational methods we adopt are exact, and no interpolation is involved.
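
As a sanity check that does not use the paper's negative binomial derivation, correlated F-variables can be simulated with a shared denominator chi-square (as in simultaneous ANOVA) and the percentage points of their minimum read off empirically.

```python
# Monte Carlo percentage point of the minimum of correlated F-variables that
# share a common denominator chi-square; a check, not the paper's method.
import numpy as np

def min_correlated_f_quantile(k, df1, df2, q=0.95, n_sim=200_000, seed=3):
    """q-th quantile of min(F_1, ..., F_k) with F_i = (U_i/df1) / (W/df2),
    independent numerator chi-squares U_i and a shared denominator W."""
    rng = np.random.default_rng(seed)
    U = rng.chisquare(df1, size=(n_sim, k)) / df1
    W = rng.chisquare(df2, size=(n_sim, 1)) / df2
    return np.quantile((U / W).min(axis=1), q)

print(min_correlated_f_quantile(k=3, df1=4, df2=30))
```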

The Bayesian Approach of Software Optimal Release Time Based on Log Poisson Execution Time Model (포아송 실행시간 모형에 의존한 소프트웨어 최적방출시기에 대한 베이지안 접근 방법에 대한 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • Journal of the Korea Society of Computer and Information
    • /
    • v.14 no.7
    • /
    • pp.1-8
    • /
    • 2009
  • In this paper, we study the decision problem of choosing an optimal release policy after testing a software system in the development phase and before transferring it to the user. The optimal software release policy that minimizes the total average software cost of development and maintenance, under the constraint of satisfying a software reliability requirement, is generally accepted. Bayesian parametric inference for the model based on log Poisson execution time employs Markov chain Monte Carlo tools (Gibbs sampling and the Metropolis algorithm). A numerical example using the T1 data is illustrated, and the optimal software release time is estimated by both maximum likelihood estimation and Bayesian parametric estimation.
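
A hedged sketch of the logarithmic (Musa-Okumoto) Poisson execution time model together with a simple release-time cost criterion; the cost coefficients, parameter values, and software life length below are hypothetical and are not the paper's cost model or T1 data.

```python
# Logarithmic Poisson execution time model m(t) = ln(lam0*theta*t + 1)/theta,
# with an illustrative cost criterion for the release time (all numbers are
# hypothetical, not the paper's).
import numpy as np
from scipy.optimize import minimize_scalar

def mean_value(t, lam0, theta):
    """Expected cumulative failures by execution time t."""
    return np.log(lam0 * theta * t + 1.0) / theta

def total_cost(t, lam0, theta, t_life, c_test, c_field, c_time):
    """Cost of faults fixed in testing vs. in the field, plus testing time."""
    m_t = mean_value(t, lam0, theta)
    m_life = mean_value(t_life, lam0, theta)
    return c_test * m_t + c_field * (m_life - m_t) + c_time * t

res = minimize_scalar(total_cost, bounds=(1e-6, 5000.0), method="bounded",
                      args=(0.5, 0.02, 5000.0, 1.0, 5.0, 0.01))
print("illustrative optimal release time:", res.x)
```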

Bayesian Conway-Maxwell-Poisson (CMP) regression for longitudinal count data

  • Morshed Alam;Yeongjin Gwon;Jane Meza
    • Communications for Statistical Applications and Methods
    • /
    • v.30 no.3
    • /
    • pp.291-309
    • /
    • 2023
  • Longitudinal count data have been widely collected in biomedical research, public health, and clinical trials. Repeated measurements over time on the same subjects require an appropriate dependency structure to be accounted for. The Poisson regression model is the first choice for modeling the expected count of interest; however, it may not be appropriate when the data exhibit over-dispersion or under-dispersion. Recently, the Conway-Maxwell-Poisson (CMP) distribution has become popular because it offers the flexibility to capture a wide range of dispersion in the data. In this article, we propose a Bayesian CMP regression model to accommodate over- and under-dispersion in modeling longitudinal count data. Specifically, we develop a regression model with random intercepts and slopes to capture subject heterogeneity and allow covariate effects to differ across subjects. We implement Bayesian computation via a Hamiltonian MCMC (HMCMC) algorithm for posterior sampling. We then compute Bayesian model assessment measures for model comparison. Simulation studies are conducted to assess the accuracy and effectiveness of our methodology. The usefulness of the proposed methodology is demonstrated with a well-known epilepsy data example.
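
The CMP probability mass function referenced above can be evaluated with a truncated normalizing constant, as in the sketch below; the truncation length is an assumption, and the paper's regression and HMC machinery are not reproduced.

```python
# CMP pmf with a truncated normalizing constant; nu < 1 gives over-dispersion,
# nu > 1 under-dispersion, and nu = 1 recovers the Poisson distribution.
import numpy as np
from scipy.special import gammaln

def cmp_logpmf(y, lam, nu, max_j=200):
    """log P(Y = y) for CMP(lam, nu), truncating the normalizer at max_j."""
    j = np.arange(max_j + 1)
    log_terms = j * np.log(lam) - nu * gammaln(j + 1)   # log(lam^j / (j!)^nu)
    log_z = np.logaddexp.reduce(log_terms)
    return y * np.log(lam) - nu * gammaln(y + 1) - log_z

# With nu = 1 this matches the ordinary Poisson(2.0) pmf.
print(np.exp(cmp_logpmf(np.arange(5), lam=2.0, nu=1.0)))
```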

The Comparison of Parameter Estimation for Nonhomogeneous Poisson Process Software Reliability Model (NHPP 소프트웨어 신뢰도 모형에 대한 모수 추정 비교)

  • Kim, Hee-Cheul;Lee, Sang-Sik;Song, Young-Jae
    • The KIPS Transactions:PartD
    • /
    • v.11D no.6
    • /
    • pp.1269-1276
    • /
    • 2004
  • Parameter estimation for existing software reliability models, the Goel-Okumoto and Yamada-Ohba-Osaki models, is reviewed, and a Rayleigh model based on the Rayleigh distribution is studied. In this paper, we compare parameter estimation by the maximum likelihood estimator with Bayesian estimation based on Gibbs sampling in order to analyze the patterns of the estimators. Model selection based on the sum of squared errors and the Braun statistic is employed to identify an efficient model. A numerical example is illustrated using real data. Superposition and mixture models are also noted as areas for future development.
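
A minimal sketch of maximum likelihood estimation for the Goel-Okumoto NHPP model mentioned in the abstract, fitted here to synthetic failure times rather than the paper's real data.

```python
# Goel-Okumoto NHPP: m(t) = a*(1 - exp(-b t)), intensity a*b*exp(-b t).
# Synthetic data stand in for the paper's real data set.
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, times, t_end):
    a, b = np.exp(params)                    # optimize on the log scale
    log_intensity = np.log(a * b) - b * times
    return -(log_intensity.sum() - a * (1.0 - np.exp(-b * t_end)))

rng = np.random.default_rng(4)
a_true, b_true, t_end = 100.0, 0.02, 100.0
n = rng.poisson(a_true * (1.0 - np.exp(-b_true * t_end)))
u = rng.random(n)
# Conditional on n failures, the times are iid with CDF m(t)/m(t_end).
times = np.sort(-np.log(1.0 - u * (1.0 - np.exp(-b_true * t_end))) / b_true)

fit = minimize(neg_loglik, x0=np.log([50.0, 0.05]), args=(times, t_end))
print("MLE of (a, b):", np.exp(fit.x))
```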

A Study on Poisson-lognormal Model (포아송-로그정규분포 모형에 관한 연구)

  • 김용철
    • The Korean Journal of Applied Statistics
    • /
    • v.13 no.1
    • /
    • pp.189-196
    • /
    • 2000
  • Conjugate prior density families were motivated by considerations of tractability in implementing the Bayesian paradigm. However, we consider the problem that a conjugate prior p($\Theta$) cannot be used when the parameter $\Theta$ is restricted. This article considers the non-conjugate prior problem for a hierarchical Poisson model. We demonstrate the use of latent variables for sampling the non-standard densities that arise in the Bayesian analysis of non-conjugate models using a Gibbs sampler.
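
The latent-variable Gibbs strategy described above can be sketched as Metropolis-within-Gibbs for a Poisson-lognormal hierarchy; the priors, proposal scale, and starting values here are illustrative assumptions, not the paper's specification.

```python
# Metropolis-within-Gibbs for y_i | lam_i ~ Poisson(lam_i),
# log lam_i ~ N(mu, sig2); priors and proposal scale are assumptions.
import numpy as np

def gibbs_poisson_lognormal(y, n_iter=2000, seed=5):
    rng = np.random.default_rng(seed)
    n = len(y)
    theta = np.log(y + 0.5)                  # latent log-rates, crude start
    mu, sig2 = theta.mean(), theta.var() + 0.1
    keep = []
    for _ in range(n_iter):
        # Metropolis update of each latent log-rate given (mu, sig2).
        prop = theta + 0.3 * rng.standard_normal(n)
        def log_cond(th):
            return y * th - np.exp(th) - 0.5 * (th - mu) ** 2 / sig2
        accept = np.log(rng.random(n)) < log_cond(prop) - log_cond(theta)
        theta = np.where(accept, prop, theta)
        # Updates of mu (flat prior) and sig2 (inverse-gamma(1, 1), assumed).
        mu = rng.normal(theta.mean(), np.sqrt(sig2 / n))
        rate = 1.0 + 0.5 * ((theta - mu) ** 2).sum()
        sig2 = 1.0 / rng.gamma(1.0 + n / 2.0, 1.0 / rate)
        keep.append((mu, sig2))
    return np.array(keep)

y = np.random.default_rng(0).poisson(4.0, size=50)
print(gibbs_poisson_lognormal(y)[-3:])       # last few (mu, sig2) draws
```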

A Bayesian joint model for continuous and zero-inflated count data in developmental toxicity studies

  • Hwang, Beom Seuk
    • Communications for Statistical Applications and Methods
    • /
    • v.29 no.2
    • /
    • pp.239-250
    • /
    • 2022
  • In many applications, we frequently encounter correlated multiple outcomes measured on the same subject. Joint modeling of such multiple outcomes can improve the efficiency of inference compared to independent modeling. For instance, in developmental toxicity studies, fetal weight and the number of malformed pups are measured on pregnant dams exposed to different levels of a toxic substance, and the association between such outcomes should be taken into account in the model. The number of malformations may have many zeros, which should be analyzed via zero-inflated count models. Motivated by applications in developmental toxicity studies, we propose a Bayesian joint modeling framework for continuous and count outcomes with excess zeros. In our model, a zero-inflated Poisson (ZIP) regression model is used to describe the count data, and subject-specific random effects account for the correlation across the two outcomes. We implement a Bayesian approach using an MCMC procedure with a data augmentation method and adaptive rejection sampling. We apply our proposed model to dose-response analysis in a developmental toxicity study to estimate the benchmark dose in a risk assessment.
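
A stand-alone sketch of the zero-inflated Poisson (ZIP) distribution used for the count outcome above, where pi is the extra-zero probability and lam the Poisson mean; the joint Bayesian model of the paper is not reproduced.

```python
# ZIP pmf: P(Y=0) = pi + (1-pi)*exp(-lam); P(Y=y) = (1-pi)*Poisson(y; lam), y >= 1.
import numpy as np
from scipy.stats import poisson

def zip_logpmf(y, pi, lam):
    """log P(Y = y) under ZIP(pi, lam)."""
    y = np.asarray(y)
    log_zero = np.log(pi + (1.0 - pi) * np.exp(-lam))   # structural + Poisson zeros
    log_pos = np.log(1.0 - pi) + poisson.logpmf(y, lam)
    return np.where(y == 0, log_zero, log_pos)

# Simulated ZIP counts: a structural zero with probability pi, else Poisson(lam).
rng = np.random.default_rng(6)
y_sim = np.where(rng.random(1000) < 0.3, 0, rng.poisson(2.5, size=1000))
print(np.exp(zip_logpmf([0, 1, 2], pi=0.3, lam=2.5)))
```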