• Title/Summary/Keyword: hierarchical estimation

SIMULTANEOUS ESTIMATION OF GAMMA SCALE PARAMETER UNDER ENTROPY LOSS: BAYESIAN APPROACH

  • Chung, Youn-Shik
    • Journal of applied mathematics & informatics
    • /
    • v.3 no.1
    • /
    • pp.55-64
    • /
    • 1996
  • Let $X_1, \ldots, X_p$ be $p\,(\geq 2)$ independent random variables, where each $X_i$ has a gamma distribution with shape parameter $k_i$ and scale parameter $\theta_i$. The problem is to simultaneously estimate the $p$ gamma scale parameters $\theta_i$ under entropy loss, where the parameters are believed a priori to be related. Hierarchical Bayes (HB) and empirical Bayes (EB) estimators are investigated. A computer simulation is then carried out to compute the risk percentage improvement of the HB and EB estimators and of the estimator of Dey et al. (1987) relative to the MVUE of $\theta$.
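
For orientation, one common form of the entropy loss for scale parameters, summed over the $p$ coordinates for simultaneous estimation, is sketched below; this is an assumption about the exact form used in the paper, since conventions differ in sign and scaling:

```latex
L(\hat{\theta}, \theta)
  \;=\; \sum_{i=1}^{p} \left( \frac{\hat{\theta}_i}{\theta_i}
        \;-\; \log \frac{\hat{\theta}_i}{\theta_i} \;-\; 1 \right),
\qquad L \ge 0, \quad L = 0 \iff \hat{\theta}_i = \theta_i \ \text{for all } i.
```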

Finite Population Total Estimation On Multistage Cluster Sampling

  • Geun-Shik Han;Yong-Chul Kim;Kiheon Choi
    • Communications for Statistical Applications and Methods
    • /
    • v.3 no.2
    • /
    • pp.161-168
    • /
    • 1996
  • Multistage hierarchical models and Bayesian inference about finite population total estimation are considered. A Gibbs sampling approach that can be used to predict the marginal posterior means needed for Bayesian inference is proposed.
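
As a rough illustration of the Gibbs sampling idea (a toy two-stage normal hierarchy, not the paper's model), the sketch below draws cluster means and variance components, then predicts the unsampled units in each cluster to estimate the finite population total; all names, priors, and data are assumptions of the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_total(y, N_i, n_iter=2000, burn=500):
    """Toy Gibbs sampler for y_ij ~ N(mu_i, sigma2), mu_i ~ N(theta, tau2),
    with vague priors. y: list of 1-D arrays of sampled units per cluster;
    N_i: total cluster sizes."""
    m = len(y)
    n = np.array([len(v) for v in y])
    ybar = np.array([v.mean() for v in y])
    mu, theta, sigma2, tau2 = ybar.copy(), ybar.mean(), 1.0, 1.0
    totals = []
    for it in range(n_iter):
        # mu_i | rest: normal-normal conjugacy
        prec = n / sigma2 + 1.0 / tau2
        mean = (n * ybar / sigma2 + theta / tau2) / prec
        mu = rng.normal(mean, np.sqrt(1.0 / prec))
        # theta | rest: flat prior
        theta = rng.normal(mu.mean(), np.sqrt(tau2 / m))
        # sigma2, tau2 | rest: scaled inverse chi-square draws
        sse = sum(((v - mu[i]) ** 2).sum() for i, v in enumerate(y))
        sigma2 = sse / rng.chisquare(n.sum())
        tau2 = ((mu - theta) ** 2).sum() / rng.chisquare(m)
        if it >= burn:
            # posterior-predictive draw of the unsampled units in each cluster
            pred = sum(rng.normal(mu[i], np.sqrt(sigma2), N_i[i] - n[i]).sum()
                       for i in range(m))
            totals.append(sum(v.sum() for v in y) + pred)
    return np.mean(totals)

# made-up data: 3 clusters, a handful of sampled units each
y = [rng.normal(10, 2, 5), rng.normal(12, 2, 4), rng.normal(8, 2, 6)]
print("posterior mean of the population total:", gibbs_total(y, [40, 35, 50]))
```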

Computational Latency Reduction via Simplified Soft-bit Estimation of Hierarchical Modulation (근사화된 계층 변조의 연판정 비트 검출을 통한 연산 지연시간 감소)

  • You, Dongho
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2022.06a
    • /
    • pp.175-178
    • /
    • 2022
  • 본 논문은 고차 계층 변조, 즉 계층 64QAM의 연판정 비트 검출을 위한 단순화된 연산 방법을 다룬다. 이는 기존 계층 변조의 연판정 비트, 즉 LLR(Log-Likelihood Ratio)값의 근사를 통해 불필요한 연산을 줄여 이에 필요한 지연시간을 줄일 수 있다. 또한 제안된 기법은 기존의 연판정 비트 검출 기법과 매우 유사한 비트 오류율(BER: Bit Error Rate) 성능을 유지하기 때문에 연판정 비트를 활용하는 방송 및 통신 시스템에 폭넓게 적용될 수 있을 것으로 기대한다.
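
The kind of simplification involved can be illustrated with the standard max-log LLR approximation on a one-dimensional constellation slice; this is a generic sketch under assumed Gaussian noise and a hypothetical bit labeling, not the paper's exact derivation for hierarchical 64QAM:

```python
import numpy as np

def llr_exact(r, pts0, pts1, noise_var):
    """Exact LLR for one bit: log-sum-exp over the constellation points
    labeled 1 minus those labeled 0, under Gaussian noise."""
    num = np.logaddexp.reduce(-(r - pts1) ** 2 / (2.0 * noise_var))
    den = np.logaddexp.reduce(-(r - pts0) ** 2 / (2.0 * noise_var))
    return num - den

def llr_maxlog(r, pts0, pts1, noise_var):
    """Max-log approximation: keep only the nearest point in each label set,
    trading a small BER loss for far fewer operations."""
    return (np.min((r - pts0) ** 2)
            - np.min((r - pts1) ** 2)) / (2.0 * noise_var)

# toy 4-PAM slice with a hypothetical bit labeling
pts0, pts1 = np.array([-3.0, -1.0]), np.array([1.0, 3.0])
print(llr_exact(0.4, pts0, pts1, 1.0), llr_maxlog(0.4, pts0, pts1, 1.0))
```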

Probabilistic assessment on the basis of interval data

  • Thacker, Ben H.;Huyse, Luc J.
    • Structural Engineering and Mechanics
    • /
    • v.25 no.3
    • /
    • pp.331-345
    • /
    • 2007
  • Uncertainties enter a complex analysis from a variety of sources: variability, lack of data, human error, model simplification, and incomplete understanding of the underlying physics. However, for many important engineering applications, insufficient data are available to justify the choice of a particular probability density function (PDF). Sometimes the only data available are in the form of interval estimates, which represent often-conflicting expert opinion. In this paper we demonstrate that Bayesian estimation techniques can successfully be used in applications where only vague interval measurements are available. The proposed approach is intended to fit within a probabilistic framework, which is established and widely accepted. To circumvent the problem of selecting a specific PDF when only scarce or vague data are available, a hierarchical model of a continuous family of PDFs is used. The classical Bayesian estimation methods are expanded to make use of imprecise interval data. Each expert opinion (interval datum) is interpreted as a random interval sample of a parent PDF. Consequently, a partial conflict between experts is automatically accounted for through the likelihood function.
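
A minimal sketch of the interval-likelihood idea, assuming a normal parent PDF with known spread and a flat prior on its mean (the paper instead places the uncertainty on a continuous family of PDFs in a hierarchical model); the intervals below are made-up expert opinions:

```python
import numpy as np
from scipy import stats

intervals = [(4.0, 6.0), (5.0, 8.0), (3.5, 5.0)]   # made-up expert intervals
sigma = 1.0                                         # assumed known spread
mu_grid = np.linspace(0.0, 10.0, 1001)
log_post = np.zeros_like(mu_grid)                   # flat prior on mu
for a, b in intervals:
    # interval likelihood: P(a <= X <= b | mu) = Phi(b; mu) - Phi(a; mu)
    like = (stats.norm.cdf(b, mu_grid, sigma)
            - stats.norm.cdf(a, mu_grid, sigma))
    log_post += np.log(np.clip(like, 1e-300, None))
post = np.exp(log_post - log_post.max())
dx = mu_grid[1] - mu_grid[0]
post /= post.sum() * dx                             # normalize on the grid
print("posterior mean of mu:", (mu_grid * post).sum() * dx)
```

Because each interval enters through its probability content rather than a point value, partially conflicting intervals simply flatten the likelihood rather than breaking the analysis.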

Bayes tests of independence for contingency tables from small areas

  • Jo, Aejung;Kim, Dal Ho
    • Journal of the Korean Data and Information Science Society
    • /
    • v.28 no.1
    • /
    • pp.207-215
    • /
    • 2017
  • In this paper we study pooling effects in Bayesian testing procedures of independence for contingency tables from small areas. In the small area estimation setup, we typically use a hierarchical Bayesian model to borrow strength across small areas. This technique of borrowing strength in small area estimation is used to construct a Bayes test of independence for contingency tables from small areas. Specifically, we consider methods of direct or indirect pooling in multinomial models through Dirichlet priors. We use the Bayes factor (or, equivalently, the ratio of the marginal likelihoods) to construct the Bayes test, and the marginal density is obtained by integrating the joint density function over all parameters. The Bayes test is computed by performing a Monte Carlo integration based on the method proposed by Nandram and Kim (2002).
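
For a single table the marginal likelihoods have closed forms under Dirichlet priors, which makes the structure of the Bayes factor easy to see; the pooled small-area versions in the paper require Monte Carlo integration instead. A hypothetical single-table 2x2 sketch (the multinomial coefficient cancels between the two models):

```python
import numpy as np
from scipy.special import gammaln

def log_dir_ratio(counts, alpha):
    """log of the closed-form integral  ∫ Π p_k^{counts_k} Dir(p | alpha) dp."""
    counts = np.asarray(counts, float)
    alpha = np.asarray(alpha, float)
    return (gammaln(alpha.sum()) - gammaln(alpha).sum()
            + gammaln(counts + alpha).sum()
            - gammaln(counts.sum() + alpha.sum()))

table = np.array([[18.0, 7.0], [6.0, 19.0]])            # made-up 2x2 table
log_bf = (log_dir_ratio(table.ravel(), np.ones(4))      # full (dependence)
          - log_dir_ratio(table.sum(axis=1), np.ones(2))  # independence: rows
          - log_dir_ratio(table.sum(axis=0), np.ones(2)))  # independence: cols
print("Bayes factor (dependence vs independence):", np.exp(log_bf))
```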

Search Range Reduction Algorithm with Motion Vectors of Upper Blocks for HEVC (상위 블록 움직임 벡터를 이용한 HEVC 움직임 예측 탐색 범위 감소 기법)

  • Lee, Kyujoong
    • Journal of Korea Multimedia Society
    • /
    • v.21 no.1
    • /
    • pp.18-25
    • /
    • 2018
  • In High Efficiency Video Coding (HEVC), integer motion estimation (IME) requires a large amount of computation because HEVC adopts highly flexible and hierarchical coding structures. In order to reduce the computational complexity of IME, this paper proposes a search range reduction algorithm that takes advantage of the similarity of motion vectors between different layers. It needs only a few modifications to the HEVC reference software. Based on the experimental results, the proposed algorithm reduces the processing time of IME by 28.1% on average, whereas its Bjøntegaard delta bitrate (BD-BR) increase is a negligible 0.15%.
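
The flavor of the idea, shown as a generic sketch rather than the paper's algorithm: center a much smaller IME search window on the motion vector already found for the co-located upper-layer block, falling back to the full window when no such vector exists (all names and ranges here are illustrative):

```python
def reduced_search_window(parent_mv, full_range=64, reduced_range=8):
    """Return (x_min, x_max, y_min, y_max) of the IME search window.
    Falls back to the full window when no upper-layer MV is available."""
    if parent_mv is None:
        return (-full_range, full_range, -full_range, full_range)
    cx, cy = parent_mv
    return (cx - reduced_range, cx + reduced_range,
            cy - reduced_range, cy + reduced_range)

print(reduced_search_window((12, -5)))   # small window centered on parent MV
print(reduced_search_window(None))       # no parent MV: full-range search
```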

Identification of Regression Outliers Based on Clustering of LMS-residual Plots

  • Kim, Bu-Yong;Oh, Mi-Hyun
    • Communications for Statistical Applications and Methods
    • /
    • v.11 no.3
    • /
    • pp.485-494
    • /
    • 2004
  • An algorithm is proposed to identify multiple outliers in linear regression. It is based on the clustering of residuals from the least median of squares estimation. A cut-height criterion for the hierarchical cluster tree is suggested, which yields the optimal clustering of the regression outliers. Comparisons of the effectiveness of the procedures are performed on classic and artificial data sets, and it is shown that the proposed algorithm is superior to one based on least squares estimation. In particular, the proposed algorithm deals very well with the masking and swamping effects, while the alternative does not.
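
A rough sketch of the clustering step, assuming the residuals have already been obtained from a least-median-of-squares fit; the fixed MAD-based cut height below is only a placeholder for the paper's cut-height criterion:

```python
import numpy as np
from scipy.cluster import hierarchy

def flag_outliers(lms_residuals, cut=None):
    """Single-linkage clustering of 1-D LMS residuals; everything outside
    the largest cluster at the chosen cut height is flagged as an outlier."""
    r = np.asarray(lms_residuals, float).reshape(-1, 1)
    Z = hierarchy.linkage(r, method="single")
    if cut is None:
        mad = np.median(np.abs(r - np.median(r)))
        cut = 3.0 * mad                      # placeholder cut-height rule
    labels = hierarchy.fcluster(Z, t=cut, criterion="distance")
    main = np.bincount(labels).argmax()      # largest cluster = clean bulk
    return labels != main

res = np.array([-0.4, 0.1, 0.3, -0.2, 0.0, 6.5, 7.1])  # two gross outliers
print(flag_outliers(res))
```

Because every residual is assigned a cluster in one pass, a tight group of large residuals cannot mask one another the way they can in sequential least-squares deletion.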

Constant Quality Motion Compensated Temporal Filtering Video Compression using Multi-block size Motion Estimation and SPECK (다중 블록 크기의 움직임 예측과 SPECK을 이용한 고정 화질 움직임 보상 시간영역 필터링 동영상 압축)

  • Park Sang-Ju
    • Journal of Broadcast Engineering
    • /
    • v.11 no.2 s.31
    • /
    • pp.153-163
    • /
    • 2006
  • We propose a new video compression method with constant quality based on motion-compensated temporal filtering (MCTF). SPECK is an efficient image coding method for encoding DWT coefficients; it is particularly efficient for coding motion-compensated residual images, which usually contain larger amounts of high-frequency components than natural images. The proposed multi-block-size hierarchical motion estimation technique is more efficient than the classical fixed-block-size block matching algorithm in both estimation precision and computational cost. The proposed MCTF-based video compression method can also support multiple frame rates in decoding with reasonable complexity. Simulation results show that the proposed method outperforms the H.263 video compression standard.
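
A bare-bones sketch of one temporal decomposition level (Haar lifting), with motion compensation replaced by an identity warp for brevity; in actual MCTF each even frame is first warped along the estimated motion field before the predict step:

```python
import numpy as np

def mctf_level(frames):
    """One Haar MCTF level: predict (high band), then update (low band).
    `frames` is a list of equally sized arrays; an even count is assumed."""
    even, odd = frames[0::2], frames[1::2]
    high = [o - e for e, o in zip(even, odd)]        # predict: residual
    low = [e + h / 2.0 for e, h in zip(even, high)]  # update: temporal average
    return low, high

frames = [np.full((2, 2), v, float) for v in (10, 12, 11, 15)]
low, high = mctf_level(frames)
print(high[0])   # residual of frame 1 against frame 0
```

Dropping the `high` bands of upper levels at the decoder is what makes multiple decoding frame rates possible.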

BAYES EMPIRICAL BAYES ESTIMATION OF A PROPORTION UNDER NONIGNORABLE NONRESPONSE

  • Choi, Jai-Won;Nandram, Balgobin
    • Journal of the Korean Statistical Society
    • /
    • v.32 no.2
    • /
    • pp.121-150
    • /
    • 2003
  • The National Health Interview Survey (NHIS) is one of the surveys used to assess the health status of the US population. One indicator of the nation's health is the total number of doctor visits made by household members in the past year. There is substantial nonresponse among the sampled households, and the main issue we address here is that the nonresponse mechanism should not be ignored because respondents and nonrespondents differ. It is standard practice to summarize the number of doctor visits by the binary variable of no doctor visit versus at least one doctor visit by a household, for each of the fifty states and the District of Columbia. We consider a nonignorable nonresponse model that expresses uncertainty about ignorability through the ratio of the odds of a household doctor visit among respondents to the odds of a doctor visit among all households. This is a hierarchical model in which a nonignorable nonresponse model is centered on an ignorable nonresponse model. Another feature of this model is that it permits us to "borrow strength" across states, as in small area estimation; this helps because some of the parameters are weakly identified. However, for simplicity we assume that the hyperparameters are fixed but unknown, and these hyperparameters are estimated by the EM algorithm, thereby making our method Bayes empirical Bayes. Our main result is that for some of the states the nonresponse mechanism can be considered nonignorable, and that 95% credible intervals for the probability of a household doctor visit and the probability that a household responds shed important light on the NHIS.
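
In the notation sketched below (symbols assumed; the paper's parameterization may differ), the uncertainty about ignorability enters through an odds ratio that equals one exactly when nonresponse is ignorable for this outcome:

```latex
\gamma \;=\; \frac{p_r / (1 - p_r)}{p / (1 - p)},
\qquad
\gamma = 1 \;\Longleftrightarrow\; \text{nonresponse is ignorable},
```

where $p_r$ is the probability of at least one doctor visit among responding households and $p$ is the same probability among all households.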

Bayesian Nonstationary Probability Rainfall Estimation using the Grid Method (Grid Method 기법을 이용한 베이지안 비정상성 확률강수량 산정)

  • Kwak, Dohyun;Kim, Gwangseob
    • Journal of Korea Water Resources Association
    • /
    • v.48 no.1
    • /
    • pp.37-44
    • /
    • 2015
  • A Bayesian nonstationary probability rainfall estimation model using the grid method is developed. A hierarchical Bayesian framework is constructed with prior and hyper-prior distributions for the parameters of the Gumbel distribution, which is selected for the rainfall extreme data. In this study, the grid method is adopted instead of the Metropolis-Hastings algorithm for random number generation, since it has the advantage of providing a thorough sampling of the parameter space. This method is well suited to situations where the best-fit parameter values are not easily inferred a priori and where there is a high probability of false minima. The developed model was applied to estimate target-year probability rainfall using hourly rainfall data from the Seoul station from 1973 to 2012. Results demonstrate that the target-year estimate under the nonstationarity assumption is about 5~8% larger than the estimate under the stationarity assumption.
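
A toy version of the grid method for a nonstationary Gumbel fit, with a linear trend in the location parameter and the scale fixed for brevity; the data, grid ranges, and flat prior are all assumptions of the sketch:

```python
import numpy as np

def gumbel_loglike(x, t, mu0, b, beta):
    """Gumbel log-likelihood with a linear trend in the location:
    mu_t = mu0 + b * t (the nonstationary part); beta is the scale."""
    z = (x - (mu0 + b * t)) / beta
    return np.sum(-np.log(beta) - z - np.exp(-z))

x = np.array([55.0, 62.0, 70.0, 58.0, 81.0, 77.0])   # made-up annual maxima
t = np.arange(x.size, dtype=float)
mu0s = np.linspace(40.0, 80.0, 81)
bs = np.linspace(-2.0, 4.0, 61)
beta = 10.0                                          # scale fixed for brevity
logp = np.array([[gumbel_loglike(x, t, m, b, beta) for b in bs] for m in mu0s])
post = np.exp(logp - logp.max())
post /= post.sum()                                   # flat prior on the grid
i, j = np.unravel_index(post.argmax(), post.shape)
print("posterior mode: mu0 =", mu0s[i], " trend b =", bs[j])
```

Evaluating the posterior on an exhaustive grid is what gives the method its robustness to false minima: no starting value or proposal tuning is needed, at the cost of exponential growth in grid size with the number of parameters.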