Title/Summary/Keyword: prior estimate

Use of Training Data to Estimate the Smoothing Parameter for Bayesian Image Reconstruction

  • Lee, Soo-Jin
    • The Journal of Engineering Research
    • /
    • v.4 no.1
    • /
    • pp.47-54
    • /
    • 2002
  • We consider the problem of determining the smoothing parameters of Gibbs priors for Bayesian methods used in the medical imaging application of emission tomographic reconstruction. We address a simple smoothing prior (membrane) whose global hyperparameter (the smoothing parameter) controls the bias/variance tradeoff of the solution. We base our maximum-likelihood (ML) estimates of the hyperparameters on observed training data, and present the motivation for this approach. Good results are obtained with a simple ML estimate of the smoothing parameter for the membrane prior.
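
For a quadratic (membrane) Gibbs prior $p(x \mid \beta) \propto \beta^{r/2} \exp(-\beta U(x))$, with $U(x)$ the sum of squared neighbor differences, the ML estimate of the smoothing parameter from a training image has a closed form. The sketch below assumes this standard membrane energy on a 4-connected grid; the paper's exact estimator may differ.

```python
import numpy as np

def membrane_energy(img):
    """U(x) = sum of squared differences between 4-connected neighbors."""
    dx = np.diff(img, axis=0)  # vertical neighbor differences
    dy = np.diff(img, axis=1)  # horizontal neighbor differences
    return np.sum(dx**2) + np.sum(dy**2)

def ml_smoothing_parameter(train_img):
    """ML estimate of beta for p(x|beta) ~ beta^(r/2) exp(-beta U(x)).

    Setting d/dbeta [(r/2) log beta - beta U(x)] = 0 gives
    beta_hat = r / (2 U(x)); for a connected image graph the quadratic
    form has rank r = n - 1 (one flat direction for the constants).
    """
    n = train_img.size
    r = n - 1
    return r / (2.0 * membrane_energy(train_img))

# Example on a smooth synthetic "training" image.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=(64, 64)), axis=0)  # correlated field
print(f"estimated smoothing parameter: {ml_smoothing_parameter(x):.4g}")
```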

Efficient 3D Scene Labeling using Object Detectors & Location Prior Maps (물체 탐지기와 위치 사전 확률 지도를 이용한 효율적인 3차원 장면 레이블링)

  • Kim, Joo-Hee;Kim, In-Cheol
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.21 no.11
    • /
    • pp.996-1002
    • /
    • 2015
  • In this paper, we present an effective system for the 3D scene labeling of objects from RGB-D videos. Our system uses a Markov Random Field (MRF) over a voxel representation of the 3D scene. In order to estimate the correct label of each voxel, the probabilistic graphical model integrates scores both from sliding-window-based object detectors and from object location prior maps. Both the object detectors and the location prior maps are pre-trained from manually labeled RGB-D images. Additionally, the model incorporates geometric constraints between adjacent voxels into the label estimation. We show excellent experimental results on the RGB-D Scenes Dataset built by the University of Washington, in which each indoor scene contains tabletop objects.
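
A minimal sketch of how such an MRF might fuse the two score sources: log detector scores and log location priors form the unary term, and a Potts-style smoothness reward couples adjacent voxels, with plain ICM for inference. The array shapes and the `smooth` weight are hypothetical stand-ins, not the paper's exact model.

```python
import numpy as np

def icm_labeling(det_scores, loc_prior, n_iters=10, smooth=1.0):
    """Greedy ICM over a voxel grid.

    det_scores, loc_prior : (D, H, W, K) arrays of per-voxel class
    scores/probabilities (hypothetical inputs; in the paper these come
    from detectors and prior maps pre-trained on labeled RGB-D data).
    """
    unary = np.log(det_scores + 1e-9) + np.log(loc_prior + 1e-9)
    labels = unary.argmax(axis=-1)          # independent initialization
    D, H, W, K = unary.shape
    offsets = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    for _ in range(n_iters):
        for z in range(D):
            for y in range(H):
                for x in range(W):
                    score = unary[z, y, x].copy()
                    for dz, dy, dx in offsets:  # Potts smoothness reward
                        nz, ny, nx = z + dz, y + dy, x + dx
                        if 0 <= nz < D and 0 <= ny < H and 0 <= nx < W:
                            score[labels[nz, ny, nx]] += smooth
                    labels[z, y, x] = score.argmax()
    return labels
```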

Bayes Estimators in Group Testing

  • Kwon, Se-Hyug
    • Communications for Statistical Applications and Methods
    • /
    • v.11 no.3
    • /
    • pp.619-629
    • /
    • 2004
  • Binomial group testing, or composite sampling, is often used to estimate the proportion, p, of positives (infected or defective units) in a population when that proportion is known to be small; the potential benefits of group testing over one-at-a-time testing are well documented. The literature has focused on maximum likelihood estimation. We provide two Bayes estimators and compare them with the MLE. The first of our Bayes estimators uses an uninformative Uniform(0, 1) prior on p; the properties of this estimator are poor. Our second Bayes estimator uses a much more informative prior that recognizes and takes into account key aspects of the group testing context. This estimator compares very favorably with the MLE, having substantially lower mean squared errors in all of the wide range of cases we considered. The prior uses a Beta distribution, Beta($\alpha$, $\beta$), and some advice is provided for choosing the parameters $\alpha$ and $\beta$ of that distribution.
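
A minimal numerical sketch of a posterior-mean estimator under the group-testing likelihood: with n groups of size k, a group tests negative with probability (1-p)^k, so the Beta(α, β) prior combines with (1-(1-p)^k)^x ((1-p)^k)^(n-x). The grid integration below is illustrative; the paper's estimators may be derived differently.

```python
import numpy as np

def bayes_group_testing(x_pos, n_groups, k, a=1.0, b=1.0, grid=100001):
    """Posterior mean of p from binomial group testing.

    x_pos positive groups out of n_groups, each pooling k units.
    Likelihood: (1-(1-p)^k)^x * ((1-p)^k)^(n-x); prior Beta(a, b).
    a = b = 1 is the uninformative Uniform(0, 1) prior; the paper's
    informative prior corresponds to other (a, b) choices.
    """
    p = np.linspace(1e-8, 1 - 1e-8, grid)
    q = (1.0 - p) ** k                      # P(group negative)
    log_post = (x_pos * np.log1p(-q) + (n_groups - x_pos) * np.log(q)
                + (a - 1) * np.log(p) + (b - 1) * np.log(1 - p))
    w = np.exp(log_post - log_post.max())   # unnormalized posterior
    return np.sum(p * w) / np.sum(w)        # posterior mean

# 3 positive groups out of 20 groups of size 10:
print(bayes_group_testing(3, 20, 10))             # uniform prior
print(bayes_group_testing(3, 20, 10, a=1, b=50))  # informative: p small
```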

Image Blur Estimation Using Dark Channel Prior (Dark Channel Prior를 이용한 영상 블러 측정)

  • Park, Han-Hoon;Moon, Kwang-Seok
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.15 no.3
    • /
    • pp.80-84
    • /
    • 2014
  • The dark channel prior states that, for undistorted outdoor images, at least one color channel of a pixel or its neighbors has values close to 0, so the prior can be used to estimate the amount of distortion in a given distorted image. In other words, if an image is distorted by blur, its dark channel values are averaged with neighboring pixel values and thus increase. This paper proposes a method that estimates blur strength by analyzing the variation of dark channel values caused by blur. Through experiments with images distorted by Gaussian and horizontal motion blur of known strengths, the usefulness of the proposed method is verified.
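
A minimal sketch of a dark-channel-based blur indicator following the abstract's reasoning: take the per-pixel minimum over color channels and a local patch, and use its increase as a blur proxy. The patch size and the mean-value metric are illustrative assumptions, not necessarily the paper's exact measure.

```python
import numpy as np
from scipy.ndimage import minimum_filter, uniform_filter

def dark_channel(img, patch=15):
    """Dark channel: per-pixel min over color channels and a local patch.

    img : (H, W, 3) array with values in [0, 1].
    """
    min_rgb = img.min(axis=2)
    return minimum_filter(min_rgb, size=patch)

def blur_measure(img, patch=15):
    """Mean dark-channel value as a blur indicator: blur mixes dark
    pixels with brighter neighbors, so the dark channel rises with
    blur strength (a simple proxy for the paper's analysis)."""
    return dark_channel(img, patch).mean()

# Demo on a synthetic image: blurring raises the measure.
rng = np.random.default_rng(0)
img = rng.uniform(size=(64, 64, 3))
blurred = uniform_filter(img, size=(7, 7, 1))   # box blur per channel
print(blur_measure(img), blur_measure(blurred))
```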

Realistic Estimation Method of Compressive Strength in Concrete Structure (콘크리트 구조물의 합리적인 압축강도 추정기법 연구)

  • Oh, Byung-Hwan;Yang, In-Hwan
    • Magazine of the Korea Concrete Institute
    • /
    • v.11 no.2
    • /
    • pp.241-249
    • /
    • 1999
  • To estimate the compressive strength of concrete realistically, a relatively large amount of data is necessary. However, it is very common in practice that only limited data are available. The purpose of the present paper is therefore to propose a realistic method to estimate the compressive strength of concrete with limited data from an actual site. The Bayesian method of statistical analysis is applied to the problem of estimating the compressive strength of concrete. The mean compressive strength is treated as a random parameter, and a prior distribution is selected to enable updating of the Bayesian distribution of compressive strength to reflect both existing data and sampling observations. The updating of the Bayesian distribution with increasing data is illustrated in a numerical application. It is shown that by combining prior estimation with information from site observations, more precise estimation is possible with relatively small samples. It is also seen that the contribution of the prior to the posterior distribution depends on its sharpness or flatness relative to the sharpness or flatness of the likelihood function. The present paper allows a more realistic determination of concrete strength on site with limited data.
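
The described updating is the standard conjugate normal update of a mean with known observation variance. A minimal sketch (the paper's exact prior family is not stated here, so normal prior/normal likelihood is an assumption):

```python
def update_mean_strength(mu0, sigma0, xbar, sigma, n):
    """Conjugate normal update of the mean compressive strength.

    Prior: mu ~ N(mu0, sigma0^2); data: n site samples with mean xbar
    and known test s.d. sigma.  Returns posterior mean and s.d.
    Precision-weighted combination: a sharp prior (small sigma0)
    dominates a flat likelihood, and vice versa.
    """
    prec = 1.0 / sigma0**2 + n / sigma**2          # posterior precision
    mu_post = (mu0 / sigma0**2 + n * xbar / sigma**2) / prec
    return mu_post, prec ** -0.5

# Prior 30 MPa +/- 3; five cores averaging 27 MPa with sigma = 4 MPa:
print(update_mean_strength(30.0, 3.0, 27.0, 4.0, 5))  # ~ (27.8, 1.5)
```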

Region Growing Based Variable Window Size Decision Algorithm for Image Denoising (영상 잡음 제거를 위한 영역 확장 기반 가변 윈도우 크기 결정 알고리즘)

  • Eom, Il-Kyu;Kim, Yoo-Shin
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.41 no.5
    • /
    • pp.111-116
    • /
    • 2004
  • Noise reduction using Bayesian estimation in the wavelet domain requires knowledge of the prior model for the wavelet coefficients, the probability distribution of the noise, and the variance of the wavelet coefficients. In typical denoising methods, the signal variance is estimated from an appropriate prior model for the wavelet coefficients. In this paper, we propose a variable window size decision algorithm that estimates the signal variance adaptively for each image region. Simulation results show that the proposed method achieves better PSNRs than state-of-the-art denoising methods.
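
A simplified sketch of the idea: grow the local window around each coefficient until the local statistics stabilize, estimate the signal variance from that window, and apply MMSE-style shrinkage. The square-window growth rule and tolerance below are stand-ins for the paper's region-growing criterion.

```python
import numpy as np

def local_signal_variance(coeffs, i, j, noise_var, max_half=4, tol=0.2):
    """Grow a square window around (i, j) until the local second moment
    stabilizes, then return max(E[y^2] - noise_var, 0) as the signal
    variance (a simplified stand-in for the region-growing rule)."""
    prev = None
    for h in range(1, max_half + 1):
        win = coeffs[max(i - h, 0):i + h + 1, max(j - h, 0):j + h + 1]
        m2 = np.mean(win**2)
        if prev is not None and abs(m2 - prev) < tol * prev:
            break  # the outer ring changed little: stop growing
        prev = m2
    return max(m2 - noise_var, 0.0)

def denoise(coeffs, noise_var):
    """MMSE (Wiener-like) shrinkage with locally estimated variance."""
    out = np.empty_like(coeffs)
    for i in range(coeffs.shape[0]):
        for j in range(coeffs.shape[1]):
            sv = local_signal_variance(coeffs, i, j, noise_var)
            out[i, j] = coeffs[i, j] * sv / (sv + noise_var)
    return out

# Demo: MSE before vs. after shrinkage on a smooth field plus noise.
rng = np.random.default_rng(0)
clean = np.outer(np.sin(np.linspace(0, 3, 32)), np.cos(np.linspace(0, 3, 32)))
noisy = clean + 0.1 * rng.normal(size=clean.shape)
print(np.mean((noisy - clean)**2), np.mean((denoise(noisy, 0.01) - clean)**2))
```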

Impact of Heterogeneous Dispersion Parameter on the Expected Crash Frequency (이질적 과분산계수가 기대 교통사고건수 추정에 미치는 영향)

  • Shin, Kangwon
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.15 no.9
    • /
    • pp.5585-5593
    • /
    • 2014
  • This study tested the hypothesis that the significance of the heterogeneous dispersion parameter in the safety performance function (SPF) used to estimate expected crashes is affected by the endogenous heterogeneous prior distributions, and analyzed the impact of a mis-specified dispersion parameter on the evaluation results for traffic safety countermeasures. In particular, this study simulated Poisson means based on heterogeneous dispersion parameters and estimated SPFs using both the negative binomial (NB) model and the heterogeneous negative binomial (HNB) model to analyze the impact of model mis-specification on the mean and dispersion functions of the SPF. In addition, this study analyzed the characteristics of the errors in the crash reduction factors (CRFs) obtained when the two models are used to estimate the posterior means and variances, which are essentially determined by the estimated hyper-parameters of the heterogeneous prior distributions. The simulation results showed that mis-estimating the heterogeneous dispersion parameters with the NB model does not affect the coefficients of the mean function, but the variances of the prior distribution are seriously mis-estimated when the NB model is used to develop SPFs without considering the heterogeneity in dispersion. Consequently, when the NB model is erroneously used to estimate prior distributions with heterogeneous dispersion parameters, the mis-estimated posterior mean can produce errors in CRFs of up to 120%.
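
The posterior mean referred to here is the classic empirical Bayes combination for an NB safety performance function: with gamma prior shape k (the dispersion parameter) and SPF prediction μ, the weight on the prior is k/(k + μ). A minimal sketch showing why a wrong k shifts the posterior mean:

```python
def eb_expected_crashes(mu_spf, observed, k):
    """Empirical Bayes posterior mean of the expected crash frequency.

    With an NB SPF (gamma prior with shape k and mean mu_spf), the
    posterior mean given the observed count is a weighted average:
        w = k / (k + mu_spf),  EB = w * mu_spf + (1 - w) * observed.
    A site-specific (heterogeneous) k changes w, which is why a
    mis-specified dispersion parameter distorts the posterior mean
    and the resulting crash reduction factors.
    """
    w = k / (k + mu_spf)
    return w * mu_spf + (1.0 - w) * observed

# SPF predicts 4 crashes/yr, 9 observed; compare dispersion k=1 vs k=10:
print(eb_expected_crashes(4.0, 9.0, k=1.0))   # weak prior  -> 8.0
print(eb_expected_crashes(4.0, 9.0, k=10.0))  # strong prior -> ~5.4
```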

Field Experiment of a Multi-azimuth Inverse VSP for Investigating Velocity Anisotropy (속도 이방성 조사를 위한 다방위 역수직 탄성파 현장 실험)

  • Lee, Doo-Sung;Kim, Hyoun-Gyu
    • Geophysics and Geophysical Exploration
    • /
    • v.2 no.3
    • /
    • pp.137-141
    • /
    • 1999
  • In order to estimate the anisotropy of the medium, we deployed a series of 120 sources in a borehole and simultaneously recorded 3-component seismic data at 5 locations on the surface. We tried to estimate the directional velocities by comparing the first arrivals at different receivers. For that purpose, the receiver statics must be corrected prior to picking the first arrivals. However, in an IVSP with a limited number of receiver points, it may not be possible to estimate reliable receiver statics; therefore, instead of using individual first-arrival times, we estimated the move-out velocity of each record. From this analysis, we found that there exists a measurable difference in directional velocities, and confirmed that the velocity anisotropy agrees with the results of previous studies conducted in this area.
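
A minimal sketch of a move-out velocity fit of the kind the abstract describes: first-arrival times along a record follow t² = t0² + x²/v², so a straight-line fit of t² against x² yields the velocity without resolving per-receiver statics. The hyperbolic move-out form is a standard assumption here, not a detail taken from the paper.

```python
import numpy as np

def moveout_velocity(offsets, times):
    """Least-squares move-out velocity from first-arrival times.

    Fits t^2 = t0^2 + x^2 / v^2 as a straight line in (x^2, t^2);
    the slope is 1/v^2.  Using the move-out across a record, rather
    than absolute pick times, sidesteps the receiver statics that a
    handful of surface receivers cannot resolve.
    """
    slope, intercept = np.polyfit(np.asarray(offsets)**2,
                                  np.asarray(times)**2, 1)
    return 1.0 / np.sqrt(slope)

# Synthetic check: v = 2500 m/s, t0 = 0.2 s.
x = np.array([100.0, 200.0, 300.0, 400.0, 500.0])
t = np.sqrt(0.2**2 + x**2 / 2500.0**2)
print(moveout_velocity(x, t))  # ~2500
```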

Event date model: a robust Bayesian tool for chronology building

  • Philippe, Lanos;Anne, Philippe
    • Communications for Statistical Applications and Methods
    • /
    • v.25 no.2
    • /
    • pp.131-157
    • /
    • 2018
  • We propose a robust event date model to estimate the date of a target event from a combination of individual dates obtained from archaeological artifacts assumed to be contemporaneous. These dates are affected by errors of different types: laboratory and calibration curve errors, irreducible errors related to contamination, and taphonomic disturbances, hence the possible presence of outliers. Modeling based on a hierarchical Bayesian statistical approach provides a simple way to automatically penalize outlying data without having to remove them from the dataset. Prior information on the individual irreducible errors is introduced using a uniform shrinkage density with minimal assumptions about the Bayesian parameters. We show that the event date model is more robust than the models implemented in BCal or OxCal, although it generally yields less precise credibility intervals. The model is extended to stratigraphic sequences that involve several events with temporal order constraints (relative dating), or with duration and hiatus constraints. Calculations are based on Markov chain Monte Carlo (MCMC) techniques and can be performed using the ChronoModel software, which is free, open source, and cross-platform. Features of the software are presented in Vibet et al. (ChronoModel v1.5 user's manual, 2016). We finally compare our prior on event dates, implemented in ChronoModel, with the priors in BCal and OxCal, which involve supplementary parameters defined as boundaries of phases or sequences.
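
As a toy illustration of the robustness mechanism (not ChronoModel's actual model), the sketch below combines individual dates under a heavy-tailed Student-t likelihood, which penalizes outliers in a way qualitatively similar to marginalizing individual error variances under a shrinkage prior, and samples the event date with random-walk Metropolis:

```python
import numpy as np

def event_date_mh(dates, errors, nu=3.0, n_samples=20000, step=None, seed=0):
    """Toy random-walk Metropolis sampler for a robust event date.

    Each measured date = event date + error; a Student-t likelihood
    with nu degrees of freedom stands in for the hierarchical
    individual-variance model, so outlying dates are down-weighted
    rather than removed.  Illustrative only.
    """
    rng = np.random.default_rng(seed)
    dates, errors = np.asarray(dates, float), np.asarray(errors, float)
    step = step or errors.mean()

    def log_post(theta):               # flat prior on theta
        z = (dates - theta) / errors
        return np.sum(-(nu + 1) / 2 * np.log1p(z**2 / nu))

    theta, lp = np.median(dates), log_post(np.median(dates))
    samples = []
    for _ in range(n_samples):
        prop = theta + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta)
    return np.array(samples[n_samples // 2:])  # drop burn-in

# Four consistent dates plus one outlier (years, 1-sigma errors):
s = event_date_mh([1450, 1460, 1455, 1465, 1600], [10, 12, 8, 15, 10])
print(s.mean(), np.percentile(s, [2.5, 97.5]))
```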

Majorization-Minimization-Based Sparse Signal Recovery Method Using Prior Support and Amplitude Information for the Estimation of Time-varying Sparse Channels

  • Wang, Chen;Fang, Yong
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.12 no.10
    • /
    • pp.4835-4855
    • /
    • 2018
  • In this paper, we study sparse signal recovery using information on both the support and the amplitude of the sparse signal. A convergent iterative algorithm for sparse signal recovery is developed using Majorization-Minimization-based Non-convex Optimization (MM-NcO). Furthermore, it is shown that the sparse signals recovered by the proposed iterative algorithm are typically not globally optimal, and that the performance of the algorithm depends on the initial point. Therefore, a modified MM-NcO-based iterative algorithm is developed that uses prior information on both the support and the amplitude of the sparse signal to enhance recovery performance. Finally, the modified MM-NcO-based iterative algorithm is used to estimate time-varying sparse wireless channels with temporal correlation. The numerical results show that the new algorithm performs better than related algorithms.
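
A minimal MM/IRLS-style sketch of sparse recovery that uses prior support and amplitude information: each iteration majorizes a non-convex penalty by a weighted ridge term, prior-support entries get lighter penalties, and prior amplitudes provide the warm start that matters for a non-convex iteration. The weighting scheme and constants are illustrative, not the paper's algorithm.

```python
import numpy as np

def mm_sparse_recovery(A, y, prior_support=None, prior_amp=None,
                       lam=0.1, eps=1e-4, n_iters=50):
    """MM/IRLS sketch of sparse recovery with prior support/amplitude.

    Each step majorizes a non-convex sparsity penalty by a weighted
    ridge term and solves the resulting linear system.  Entries in
    the (assumed) prior support get down-weighted penalties; prior
    amplitudes seed the initial point.
    """
    m, n = A.shape
    x = np.zeros(n)
    support_scale = np.ones(n)
    if prior_support is not None:
        support_scale[prior_support] = 0.1   # trust the prior support
        if prior_amp is not None:
            x[prior_support] = prior_amp     # warm start
    for _ in range(n_iters):
        w = support_scale / (x**2 + eps)     # MM weights from current x
        x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ y)
    return x

# Toy test: 3-sparse signal in dimension 100 from 40 measurements.
rng = np.random.default_rng(1)
A = rng.normal(size=(40, 100))
x_true = np.zeros(100); x_true[[3, 30, 70]] = [2.0, -1.5, 1.0]
y = A @ x_true + 0.01 * rng.normal(size=40)
x_hat = mm_sparse_recovery(A, y, prior_support=[3, 30, 70],
                           prior_amp=[1.8, -1.2, 0.8])
print(np.round(x_hat[[3, 30, 70]], 2))  # should be close to true values
```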