• Title/Summary/Keyword: Marginal approximation


ML estimation using Poisson HGLM approach in semi-parametric frailty models

  • Ha, Il Do
    • Journal of the Korean Data and Information Science Society / v.27 no.5 / pp.1389-1397 / 2016
  • The semi-parametric frailty model with a nonparametric baseline hazard has been widely used for the analysis of clustered survival-time data. Such frailty models can be fitted via an auxiliary Poisson hierarchical generalized linear model (HGLM). For inference in the frailty model, the marginal likelihood, which gives the MLE, is often used. The marginal likelihood is obtained by integrating out the random effects, but this integration is often intractable. In this paper, we propose to obtain the MLE via a Laplace approximation using the Poisson HGLM approach for the semi-parametric frailty model. The proposed HGLM approach uses the hierarchical likelihood (h-likelihood), which avoids the integration itself. The proposed method is illustrated with a numerical study.
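
The Laplace-approximation route described in the abstract above can be summarized with the standard adjusted-profile formula from the h-likelihood literature; this is a generic sketch in our own notation, not an equation taken from the paper:

```latex
% h-likelihood of the auxiliary Poisson HGLM (fixed effects beta, random effects v)
% and its Laplace-approximated marginal log-likelihood
h(\beta, v) = \sum_{i,j} \log f(y_{ij} \mid v_i; \beta) + \sum_i \log f(v_i),
\qquad
\ell_m(\beta) \;\approx\; \Big[\, h - \tfrac{1}{2} \log \det\!\Big\{ \tfrac{D(h, v)}{2\pi} \Big\} \Big]\Big|_{v = \hat{v}(\beta)},
\quad
D(h, v) = -\,\frac{\partial^2 h}{\partial v\, \partial v^{\mathsf T}} .
```

Maximizing the right-hand side over the fixed effects (and any dispersion parameters) gives the approximate MLE without ever evaluating the integral over the frailties.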

Maximum Likelihood Estimation Using Laplace Approximation in Poisson GLMMs

  • Ha, Il-Do
    • Communications for Statistical Applications and Methods / v.16 no.6 / pp.971-978 / 2009
  • Poisson generalized linear mixed models (GLMMs) have been widely used for the analysis of clustered or correlated count data. For inference, the marginal likelihood, which is obtained by integrating out the random effects, is often used. It gives the maximum likelihood (ML) estimator, but the integration is usually intractable. In this paper, we show how to obtain the ML estimator via a Laplace approximation based on the hierarchical-likelihood (h-likelihood) approach for Poisson GLMMs. In particular, the h-likelihood avoids the integration itself and gives a statistically efficient procedure for various random-effect models, including GLMMs. The proposed method is illustrated using two practical examples and simulation studies.
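
As a concrete illustration of the Laplace approximation described above, the following sketch computes one cluster's contribution to the marginal log-likelihood of a Poisson GLMM with a single normal random intercept; the model, notation, and use of SciPy are our assumptions, not code from the paper:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

def laplace_cluster_loglik(y, X, beta, sigma2):
    """Laplace-approximate marginal log-likelihood contribution of one cluster
    in a Poisson GLMM with a single normal random intercept v ~ N(0, sigma2)."""
    eta_fixed = X @ beta                       # fixed-effect linear predictor

    def neg_h(v):                              # negative joint (h-)log-likelihood in v
        mu = np.exp(eta_fixed + v)
        log_pois = np.sum(y * (eta_fixed + v) - mu - gammaln(y + 1))
        log_prior = -0.5 * (np.log(2 * np.pi * sigma2) + v**2 / sigma2)
        return -(log_pois + log_prior)

    opt = minimize_scalar(neg_h)               # v_hat maximizes h(beta, v)
    v_hat = opt.x
    curvature = np.sum(np.exp(eta_fixed + v_hat)) + 1.0 / sigma2   # -d2h/dv2 at v_hat
    # Laplace: log integral exp{h(v)} dv ~ h(v_hat) + 0.5*log(2*pi) - 0.5*log(-h''(v_hat))
    return -opt.fun + 0.5 * np.log(2 * np.pi) - 0.5 * np.log(curvature)
```

Summing this quantity over clusters gives an approximate marginal log-likelihood that can then be maximized over beta and sigma2 with any numerical optimizer.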

Approximation Algorithm for Multi Agents-Multi Tasks Assignment with Completion Probability (작업 완료 확률을 고려한 다수 에이전트-다수 작업 할당의 근사 알고리즘)

  • Kim, Gwang
    • Journal of Korea Society of Industrial Information Systems / v.27 no.2 / pp.61-69 / 2022
  • A multi-agent system is a system that aims to reach the best coordinated decision based on each agent's local decision. In this paper, we consider a multi-agent, multi-task assignment problem. Each agent is assigned to only one task, and there is a probability of completing the task it performs. The objective is to determine an assignment that maximizes the sum of the completion probabilities over all tasks. The problem, a combinatorial optimization with a non-linear objective function, is NP-hard, so an effective and efficient solution methodology is needed. This paper presents an approximation algorithm that exploits submodularity, i.e., diminishing marginal gains, and demonstrates the scalability and robustness of the algorithm both theoretically and experimentally.
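
The abstract does not spell out the algorithm, but a generic greedy scheme that exploits the diminishing-marginal-gain (submodularity) structure of this objective might look like the following sketch; the interface and the one-task-per-agent rule are assumptions based on the abstract:

```python
import numpy as np

def greedy_assignment(p):
    """Greedy assignment of agents to tasks using diminishing marginal gains.

    p[i, j] is the probability that agent i completes task j if assigned to it.
    Each agent takes at most one task; the objective is the sum over tasks of
    1 - prod(1 - p[i, j]) over the agents i assigned to task j, which is
    monotone submodular in the set of (agent, task) assignments."""
    n_agents, n_tasks = p.shape
    fail = np.ones(n_tasks)          # prod of (1 - p) over agents already on each task
    assignment = {}                  # agent -> task
    unassigned = set(range(n_agents))
    while unassigned:
        best_gain, best_pair = 0.0, None
        for i in unassigned:
            gains = fail * p[i]      # marginal gain of putting agent i on each task
            j = int(np.argmax(gains))
            if gains[j] > best_gain:
                best_gain, best_pair = gains[j], (i, j)
        if best_pair is None:        # no positive marginal gain remains
            break
        i, j = best_pair
        assignment[i] = j
        fail[j] *= 1.0 - p[i, j]
        unassigned.remove(i)
    return assignment
```

Each step adds the (agent, task) pair with the largest marginal gain fail[j] * p[i, j]; for a monotone submodular objective under a one-task-per-agent (partition matroid) constraint, greedy selection of this kind carries a constant-factor approximation guarantee.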

A Study on the analysis of marginal cost using DC load flow in transmission constraint and network partition (송전제약하에서 DC LOAD FLOW를 이용한 한계비용의 분석과 계통의 분할에 관한 연구)

  • Jang, Si-Jin; Jeong, Hae-Seong; Park, Jong-Keun
    • Proceedings of the KIEE Conference / 2000.07a / pp.373-375 / 2000
  • Using the DC load flow approximation, we analyze the marginal cost, an important price signal for network congestion management, and express it as a function of load. Under congestion, a large-scale electric network is partitioned into subnetworks to provide an effective price signal through zonal pricing. We propose a new network partition technique based on the sensitivity of marginal cost over a range of load levels.
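
For reference, the DC load flow approximation underlying this analysis linearizes the power flow equations around a flat voltage profile; a minimal sketch is given below, where the bus and line data in the example are illustrative and not taken from the paper:

```python
import numpy as np

def dc_load_flow(n_bus, lines, x, P, slack=0):
    """Minimal DC load flow sketch (lossless, linearized power flow).

    n_bus : number of buses.
    lines : list of (from_bus, to_bus) index pairs.
    x     : line reactances in p.u., same order as `lines`.
    P     : net injection (generation minus load) at every bus, in p.u.
    Returns bus angles (slack angle fixed at 0) and flows f_l = (theta_i - theta_j) / x_l."""
    B = np.zeros((n_bus, n_bus))                 # nodal susceptance matrix
    for (i, j), x_l in zip(lines, x):
        b = 1.0 / x_l
        B[i, i] += b; B[j, j] += b
        B[i, j] -= b; B[j, i] -= b
    keep = [k for k in range(n_bus) if k != slack]
    theta = np.zeros(n_bus)
    theta[keep] = np.linalg.solve(B[np.ix_(keep, keep)], np.asarray(P)[keep])
    flows = np.array([(theta[i] - theta[j]) / x_l for (i, j), x_l in zip(lines, x)])
    return theta, flows

# 3-bus example: a generator at bus 0 supplying loads at buses 1 and 2
theta, flows = dc_load_flow(3, [(0, 1), (1, 2), (0, 2)], [0.1, 0.1, 0.1],
                            P=[1.0, -0.4, -0.6])
```

The nodal marginal costs discussed in the paper would then come from an optimal dispatch built on these linear equations, with zone boundaries for zonal pricing drawn where the marginal-cost sensitivities to load change sharply.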


Sampling Based Approach to Hierarchical Bayesian Estimation of Reliability Function

  • Younshik Chung
    • Communications for Statistical Applications and Methods / v.2 no.2 / pp.43-51 / 1995
  • For the stress-strength function, hierarchical Bayes estimators are considered under squared error loss and entropy loss. In particular, the desired marginal posterior densities are obtained via the Gibbs sampler, an iterative Monte Carlo method, and via a normal approximation (by the delta method). A simulation study is presented.
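
The abstract does not give the model details, so the following is only a hedged sketch of a Gibbs sampler for the stress-strength reliability R = P(Y < X), assuming exponential strength X and stress Y with conjugate gamma priors and a shared gamma hyperprior; under these assumptions every full conditional is a gamma distribution:

```python
import numpy as np

rng = np.random.default_rng(1)

def gibbs_stress_strength(x, y, a=1.0, c=1.0, d=1.0, n_iter=5000, burn=1000):
    """Gibbs sampler sketch for R = P(Y < X) with X ~ Exp(lam1), Y ~ Exp(lam2),
    priors lam1, lam2 ~ Gamma(a, beta) and shared hyperprior beta ~ Gamma(c, d)."""
    sx, sy, n, m = x.sum(), y.sum(), len(x), len(y)
    lam1, lam2, beta = 1.0, 1.0, 1.0
    draws = []
    for t in range(n_iter):
        lam1 = rng.gamma(a + n, 1.0 / (beta + sx))       # numpy uses scale = 1 / rate
        lam2 = rng.gamma(a + m, 1.0 / (beta + sy))
        beta = rng.gamma(c + 2 * a, 1.0 / (d + lam1 + lam2))
        if t >= burn:
            draws.append(lam2 / (lam1 + lam2))           # R = P(Y < X) for exponentials
    draws = np.asarray(draws)
    return draws.mean(), draws                           # posterior mean = Bayes estimate under SEL
```

The posterior mean of the retained draws of R is the Bayes estimate under squared error loss; the entropy-loss estimate and the delta-method normal approximation discussed in the paper would be computed from the same draws.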


Efficiency and Robustness of Fully Adaptive Simulated Maximum Likelihood Method

  • Oh, Man-Suk; Kim, Dai-Gyoung
    • Communications for Statistical Applications and Methods / v.16 no.3 / pp.479-485 / 2009
  • When part of the data is unobserved, the marginal likelihood of the parameters given the observed data often involves an analytically intractable high-dimensional integral, and hence it is hard to find the maximum likelihood estimate of the parameters. The simulated maximum likelihood (SML) method, which estimates the marginal likelihood via Monte Carlo importance sampling and optimizes the estimated marginal likelihood, has been used in many applications. A key issue in SML is finding a good proposal density from which the Monte Carlo samples are generated. The optimal proposal density is the conditional density of the unobserved data given the parameters and the observed data, and various attempts have been made to approximate it well. Algorithms that adaptively improve the proposal density have been widely used because of their simplicity and efficiency. In this paper, we describe a fully adaptive algorithm that has been used by some practitioners but has not been well recognized in the statistical literature, and we evaluate its estimation performance and robustness in a simulation study. The simulation study shows an improvement of orders of magnitude in mean squared error compared with non-adaptive or partially adaptive SML methods. It also shows that fully adaptive SML is robust in the sense that it is insensitive to the starting points of the optimization routine.
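
As a toy illustration of simulated maximum likelihood with an adaptive importance-sampling proposal, consider a scalar latent variable z ~ N(0, 1) and y | z ~ N(theta + z, 1); the moment-matching adaptation rule below is only a generic stand-in for the fully adaptive scheme evaluated in the paper:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def sml_loglik(y, theta, n_draws=2000, n_adapt=5):
    """Importance-sampling estimate of log L(theta) = log ∫ f(y|z,theta) f(z) dz
    for the toy model z ~ N(0,1), y|z ~ N(theta + z, 1), with a normal proposal
    q(z) adapted toward the conditional density of z given y and theta."""
    mu_q, sd_q = 0.0, 1.0                        # initial proposal = prior of z
    for _ in range(n_adapt):
        z = rng.normal(mu_q, sd_q, n_draws)
        logw = (norm.logpdf(y, theta + z, 1.0)   # f(y | z, theta)
                + norm.logpdf(z, 0.0, 1.0)       # f(z)
                - norm.logpdf(z, mu_q, sd_q))    # q(z)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        mu_q = np.sum(w * z)                     # adapt proposal to the weighted draws
        sd_q = np.sqrt(np.sum(w * (z - mu_q) ** 2)) + 1e-8
    z = rng.normal(mu_q, sd_q, n_draws)          # final estimate with the adapted proposal
    logw = (norm.logpdf(y, theta + z, 1.0) + norm.logpdf(z, 0.0, 1.0)
            - norm.logpdf(z, mu_q, sd_q))
    return np.log(np.mean(np.exp(logw - logw.max()))) + logw.max()
```

The returned estimate of log L(theta) can then be maximized over theta with a standard optimizer; the quality of the proposal q determines the Monte Carlo error of that estimate, which is exactly what the paper's simulation study compares across non-adaptive, partially adaptive, and fully adaptive proposals.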

Bayes factors for accelerated life testing models

  • Smit, Neill; Raubenheimer, Lizanne
    • Communications for Statistical Applications and Methods / v.29 no.5 / pp.513-532 / 2022
  • In this paper, the use of Bayes factors and the deviance information criterion for model selection is compared in a Bayesian accelerated life testing setup. In Bayesian accelerated life testing, the most commonly used tool for model comparison is the deviance information criterion. An alternative and more formal approach is to use Bayes factors to compare models. However, Bayesian accelerated life testing models with more than one stressor often have mathematically intractable posterior distributions, and Markov chain Monte Carlo methods are employed to obtain posterior samples on which to base inference. The computation of the marginal likelihood is challenging for such complex models. In this paper, methods for approximating the marginal likelihood and their application in the accelerated life testing paradigm are explored for dual-stress models. A simulation study is also included, in which Bayes factors based on the different approximation methods are compared with the deviance information criterion.
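
The abstract does not name the specific marginal-likelihood approximations explored, so the following is only a baseline illustration of the quantity being approximated: a crude Monte Carlo estimate of m(y) = E_prior[f(y | theta)] obtained by averaging the likelihood over prior draws, with the Bayes factor as the ratio of two such estimates (the toy normal models are assumptions):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

def log_marginal_prior_mc(y, log_lik, prior_sampler, n_draws=20000):
    """Crude Monte Carlo estimate of log m(y), the log marginal likelihood,
    by averaging the likelihood over draws from the prior."""
    thetas = prior_sampler(n_draws)
    ll = np.array([log_lik(y, th) for th in thetas])
    return np.log(np.mean(np.exp(ll - ll.max()))) + ll.max()   # log-sum-exp for stability

# toy comparison: fixed N(0, 1) model vs N(theta, 1) with prior theta ~ N(0, 10^2)
y = rng.normal(0.3, 1.0, 50)
log_m0 = np.sum(norm.logpdf(y, 0.0, 1.0))                      # model 0 has no free parameter
log_m1 = log_marginal_prior_mc(
    y,
    log_lik=lambda data, th: np.sum(norm.logpdf(data, th, 1.0)),
    prior_sampler=lambda n: rng.normal(0.0, 10.0, n))
log_bf_10 = log_m1 - log_m0                                    # log Bayes factor of model 1 vs 0
```

Prior-sampling estimates like this are simple but very inefficient when the posterior is concentrated, which is one reason approximating the marginal likelihood is challenging for the dual-stress accelerated life testing models considered in the paper.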

The Competitive Time Guarantee Decisions Via Continuous Approximation of Logistics Systems (연속적 근사법에 의한 물류시스템의 경쟁적 시간보장 의사결정 최적화에 관한 연구)

  • Kim, Hyoungtae
    • Journal of Korean Society of Industrial and Systems Engineering / v.37 no.3 / pp.64-74 / 2014
  • We show how a supplier can peg cost measures to the reliability of his time guarantees via the penalty costs considered in the framework. The framework also enables us to study the connections between the logistics network and the market. In this context, we show that even when the market base increases significantly, the supplier can still use a logistics network designed for a lower demand density, with only a marginal reduction in profit. Finally, we show how the framework can be used to evaluate and compare various logistics system improvement strategies. The supplier can then easily choose the improvement strategy that increases his profit with the smallest increase in his logistics costs.

Effect of Adhesion layer on the Optical Scattering Properties of Plasmonic Au Nanodisc (접착층을 고려한 플라즈모닉 금 나노 디스크의 광산란 특성)

  • Kim, Jooyoung; Cho, Kyuman; Lee, Kyeong-Seok
    • Korean Journal of Metals and Materials / v.46 no.7 / pp.464-470 / 2008
  • Metallic nanostructures have great potential for bio-chemical sensor applications owing to the excitation of localized surface plasmons and their sensitive response to environmental change. Unlike the commonly explored absorption-based sensing, optical scattering provides a single-particle detection scheme. For localized surface plasmon resonance spectroscopy, metallic nanostructures with controlled shape and size are usually fabricated on transparent glass substrates pre-coated with an adhesion layer. In this study, we calculated the optical scattering properties of a plasmonic Au nanodisc using the discrete dipole approximation method and analyzed the effect of the adhesion layer on them. Our results indicate that there is a trade-off between surface plasmon damping and the ability to support the nanostructures when determining the optimal adhesion layer thickness. The marginal thickness of the Ti adhesion layer needed to support Au nanostructures fabricated on a silica glass substrate was analyzed experimentally by an adhesion strength test using a nano-indentation technique.
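
The paper's results come from full discrete dipole approximation (DDA) calculations of a nanodisc on a substrate, which are not reproduced here. As a much simpler, hedged illustration of the damping trade-off, the quasi-static (Rayleigh-limit) scattering cross-section of a small metal sphere with a Drude permittivity shows how extra plasmon damping suppresses the scattering peak; all parameter values below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def rayleigh_scattering_cross_section(lam_nm, radius_nm, eps_inf=9.0,
                                      wp_eV=9.0, gamma_eV=0.07, eps_medium=2.1):
    """Quasi-static (single-dipole) scattering cross-section, in nm^2, of a small
    metal sphere with a Drude permittivity embedded in a dielectric medium."""
    hbar_c = 1239.84                                        # eV*nm, so E[eV] = hbar_c / lam[nm]
    E = hbar_c / lam_nm
    eps = eps_inf - wp_eV**2 / (E**2 + 1j * gamma_eV * E)   # Drude metal permittivity
    m2 = eps / eps_medium                                   # relative to the host medium
    k = 2 * np.pi * np.sqrt(eps_medium) / lam_nm            # wavenumber in the medium
    return (8 * np.pi / 3) * k**4 * radius_nm**6 * np.abs((m2 - 1) / (m2 + 2))**2

lam = np.linspace(450, 750, 301)
sigma_clean = rayleigh_scattering_cross_section(lam, 30.0, gamma_eV=0.07)
sigma_damped = rayleigh_scattering_cross_section(lam, 30.0, gamma_eV=0.20)  # extra damping
```

Raising the damping rate, as a lossy adhesion layer effectively does, lowers and broadens the resonance in sigma_damped relative to sigma_clean; this is the qualitative trade-off the paper quantifies with DDA for the actual nanodisc geometry.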

Bayesian Inversion of Gravity and Resistivity Data: Detection of Lava Tunnel

  • Kwon, Byung-Doo; Oh, Seok-Hoon
    • Journal of the Korean earth science society / v.23 no.1 / pp.15-29 / 2002
  • Bayesian inversion of gravity and resistivity data was performed to investigate a cavity structure, a lava tunnel, on Cheju Island, Korea. Dipole-dipole DC resistivity data were used as prior information for the gravity data, and geostatistical techniques such as kriging and simulation algorithms were applied to provide prior model information and the covariance matrix in the data domain. The inverted resistivity section supported indicator variogram modeling for each threshold, and sequential indicator simulations provided the spatial uncertainty used to build a prior PDF. We also presented a more objective way to construct a data covariance matrix that reflects the state of the acquired field data, based on the geostatistical technique of cross-validation. A Gaussian approximation was then adopted to infer the characteristics of the marginal distributions of the model parameters, a Broyden update was used for simple calculation of the sensitivity matrix, and SVD was applied. In general, cavity investigation by geophysical exploration is difficult and success is hard to achieve. However, this combined interpretation of multiple data sets showed remarkable improvement and stability compared with data-fit-only results, and it suggests diverse applications of Bayesian inversion in geophysical inverse problems.
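
The Gaussian-approximation step mentioned in the abstract can be sketched for a linearized inverse problem as follows; this is a generic formulation (it omits the paper's kriging-based priors and Broyden sensitivity updates), with G standing for the linearized forward operator:

```python
import numpy as np

def gaussian_posterior(G, d, m0, Cd, Cm):
    """Gaussian (Laplace-type) approximation of the posterior for a linearized
    inverse problem d ≈ G m + e, with prior m ~ N(m0, Cm) and noise e ~ N(0, Cd).
    Returns the posterior mean and covariance; the marginal distribution of each
    parameter m_k is then N(mean[k], cov[k, k])."""
    Cd_inv = np.linalg.inv(Cd)
    Cm_inv = np.linalg.inv(Cm)
    precision = G.T @ Cd_inv @ G + Cm_inv        # posterior precision matrix
    cov = np.linalg.pinv(precision)              # pinv uses an SVD internally for stability
    mean = m0 + cov @ (G.T @ Cd_inv @ (d - G @ m0))
    return mean, cov
```

In the paper's workflow, G would be the sensitivity matrix maintained by Broyden updates, Cm the prior covariance built from the geostatistical analysis of the resistivity section, and Cd the data covariance obtained by cross-validation; the diagonal of the posterior covariance then characterizes the marginal distribution of each model parameter.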