• Title/Abstract/Keyword: bayesian approach


A Study on the Determination of a Minimum Cost Sampling Inspection Plan for Destructive Testing (파괴검사(破壞檢査)에 있어서의 최소비용(最少費用) 샘플링 검사방식(檢査方式)의 결정(決定)에 관한 연구(硏究) - 계수파괴(計數破壞) 1회검사(回檢査)를 중심(中心)으로 -)

  • Hwang, Ui-Cheol;Jeong, Yeong-Bae
    • Journal of Korean Society for Quality Management
    • /
    • v.8 no.2
    • /
    • pp.15-22
    • /
    • 1980
  • This paper deals with the problem of determining a minimum cost sampling inspection plan for single destructive testing by attribute. The cost for an inspection lot is constructed from the following three cost factors: (1) cost of inspection, (2) cost of an accepted defective, and (3) cost of a rejected lot. Using Hald's Bayesian approach for single non-destructive testing, procedures for finding the minimum cost single destructive sampling inspection plan by attribute are given. Assuming a uniform prior distribution and using numerical analysis by computer, minimum cost single destructive sampling inspection plans by attribute are given for several lot sizes, unit costs, destructive testing costs, and salvage costs.
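As a rough illustration of this kind of cost model, the sketch below minimizes an expected total cost averaged over a uniform prior on the lot defective fraction. The cost names (`s_insp` per destroyed sample unit, `a_def` per accepted defective, `r_lot` per rejected lot) and the grid approximation of the prior are assumptions for illustration, not Hald's exact formulation:

```python
import math

def accept_prob(p, n, c):
    # P(X <= c) for X ~ Binomial(n, p): lot accepted if at most c defectives found
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

def expected_cost(n, c, N, s_insp, a_def, r_lot, grid=100):
    # Average total cost over a uniform prior on the defective fraction p
    total = 0.0
    for i in range(grid):
        p = (i + 0.5) / grid
        pa = accept_prob(p, n, c)
        total += (n * s_insp                    # destroyed sample units
                  + pa * (N - n) * p * a_def    # defectives in the accepted remainder
                  + (1 - pa) * r_lot)           # cost of a rejected lot
    return total / grid

def best_plan(N, s_insp, a_def, r_lot, n_max=50):
    # Brute-force search over sample size n and acceptance number c
    return min(((n, c) for n in range(1, n_max + 1) for c in range(n + 1)),
               key=lambda nc: expected_cost(nc[0], nc[1], N, s_insp, a_def, r_lot))
```

The exhaustive search stands in for the numerical analysis the authors mention; for realistic lot sizes a smarter search would be needed.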


Methods and Techniques for Variance Component Estimation in Animal Breeding - Review -

  • Lee, C.
    • Asian-Australasian Journal of Animal Sciences
    • /
    • v.13 no.3
    • /
    • pp.413-422
    • /
    • 2000
  • In the class of models that include random effects, variance component estimates are important for obtaining accurate predictors and estimators. Variance component estimation is straightforward for balanced data but not for unbalanced data. Since orthogonality among factors is absent in unbalanced data, various methods for variance component estimation are available. REML estimation is the most widely used method in animal breeding because of its attractive statistical properties. Recently, the Bayesian approach has become feasible through Markov chain Monte Carlo methods and increasingly powerful computers. Furthermore, advances in variance component estimation with complicated models, such as generalized linear mixed models, have enabled animal breeders to analyze non-normal data.
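For the balanced case mentioned above, the classical ANOVA (method-of-moments) estimator is simple enough to sketch. The one-way random-effects layout below is an illustrative special case, not the general animal-breeding model:

```python
def variance_components(groups):
    # ANOVA (method-of-moments) estimates for the balanced one-way
    # random-effects model y_ij = mu + a_i + e_ij
    a = len(groups)                       # number of random groups
    n = len(groups[0])                    # observations per group (balanced)
    grand = sum(sum(g) for g in groups) / (a * n)
    means = [sum(g) / n for g in groups]
    msa = n * sum((m - grand) ** 2 for m in means) / (a - 1)           # between groups
    mse = sum((y - m) ** 2 for g, m in zip(groups, means) for y in g) / (a * (n - 1))
    sigma2_e = mse                        # residual variance estimate
    sigma2_a = max((msa - mse) / n, 0.0)  # group variance, truncated at zero
    return sigma2_a, sigma2_e
```

The truncation at zero hints at why REML, which respects the parameter space by construction, is preferred with unbalanced data.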

A K-Nearest Neighbor Algorithm for Categorical Sequence Data (범주형 시퀀스 데이터의 K-Nearest Neighbor알고리즘)

  • Oh Seung-Joon
    • Journal of the Korea Society of Computer and Information
    • /
    • v.10 no.2 s.34
    • /
    • pp.215-221
    • /
    • 2005
  • Recently, there has been enormous growth in the amount of commercial and scientific data, such as protein sequences, retail transactions, and web logs. Such datasets consist of sequence data that have an inherent sequential nature. In this paper, we study how to classify these sequence datasets. There are several kinds of techniques for data classification, such as decision tree induction, Bayesian classification, and K-NN. In our approach, we use a K-NN algorithm for classifying sequences. In addition, we propose a new similarity measure to compute the similarity between two sequences and an efficient method for measuring similarity.
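A minimal version of such a sequence K-NN can be sketched with a normalized edit-distance similarity. Note that this generic measure is a stand-in, not the similarity measure the paper proposes:

```python
def edit_distance(s, t):
    # Levenshtein distance via dynamic programming (one rolling row)
    prev = list(range(len(t) + 1))
    for i, a in enumerate(s, 1):
        cur = [i]
        for j, b in enumerate(t, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (a != b)))  # substitution
        prev = cur
    return prev[-1]

def similarity(s, t):
    # Normalized similarity in [0, 1]
    m = max(len(s), len(t)) or 1
    return 1.0 - edit_distance(s, t) / m

def knn_classify(query, train, k=3):
    # train is a list of (sequence, label) pairs; majority vote among k nearest
    neighbors = sorted(train, key=lambda sl: -similarity(query, sl[0]))[:k]
    labels = [lbl for _, lbl in neighbors]
    return max(set(labels), key=labels.count)
```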


Structural change and asymmetry analysis of petroleum product prices in Korea

  • Oh, Sun-Ah;Heo, Eun-Nyeong
    • Korean Society of Earth and Exploration Geophysicists: Conference Proceedings
    • /
    • 2003.11a
    • /
    • pp.669-675
    • /
    • 2003
  • This paper examines structural breaks and asymmetries in the prices of four domestic petroleum products, i.e., gasoline, kerosene, diesel, and bunker C, following the changes in Korean petroleum pricing policy from a government-controlled pricing system to a market pricing system. We use monthly wholesale market price data for the sample period between July 1988 and December 2001. Using four methods (the Chow test, the CUSUM/CUSUMQ tests, the Bayesian approach, and the Dufour test), the structural behavior of the petroleum product prices is examined. We found that a structural change occurred in all petroleum products, with the exception of kerosene, at the point of the pricing policy change from the government-controlled to the spot-price-related pricing system. We also conducted an asymmetry analysis using the model of Borenstein, Cameron, and Gilbert (1997) and found evidence of price asymmetry for all four product types, but with a different pattern for each product.
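Of the four tests listed, the Chow test for a known break point is easy to sketch. The simple one-regressor OLS setup below is an illustrative assumption, not the authors' specification:

```python
def rss(y, x):
    # Residual sum of squares from a simple OLS fit y = b0 + b1 * x
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b1 = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    b0 = my - b1 * mx
    return sum((yi - b0 - b1 * xi) ** 2 for xi, yi in zip(x, y))

def chow_statistic(y, x, split, k=2):
    # F statistic for a known break at index `split`; k regressors incl. intercept
    rss_pooled = rss(y, x)
    rss1, rss2 = rss(y[:split], x[:split]), rss(y[split:], x[split:])
    n = len(y)
    return ((rss_pooled - rss1 - rss2) / k) / ((rss1 + rss2) / (n - 2 * k))
```

A series with an obvious level shift at the split yields a very large F statistic, which is then compared against an F(k, n - 2k) critical value.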


Nonparametric Bayesian Approach for Multichannel based Semantic Segmentation of TV Dramas (멀티채널 기반 드라마 동영상 의미 분절화를 위한 비모수 베이지안 방법)

  • Seok, Ho-Sik;Lee, Ba-Do;Zhang, Byoung-Tak
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2012.06b
    • /
    • pp.474-476
    • /
    • 2012
  • This paper introduces a multichannel nonparametric Bayesian method for the semantic segmentation of TV drama videos. Existing approaches either attempt segmentation with only a very limited set of features or analyze the data with methods valid only for a single channel, such as the image channel or the audio channel, so they are difficult to apply to stream data with unpredictable variation such as TV dramas. To overcome these shortcomings, we split the given video into single-modality channels, segment each channel separately, and dynamically combine the per-channel segmentation results to approximate the semantic segmentation of the video. The proposed method was applied to the semantic segmentation of actual TV videos, and its performance was verified by comparison against semantic change points identified by human evaluators.
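The combination step can be caricatured as fusing per-channel boundary scores into a single boundary decision per frame. The fixed weighted average below is a deliberate simplification; the paper's combination is dynamic and nonparametric:

```python
def combine_channels(channel_scores, weights=None, threshold=0.5):
    # channel_scores: one list per channel of boundary scores in [0, 1], one per frame.
    # Fuse channels by a weighted average and mark frames as semantic boundaries.
    n_ch = len(channel_scores)
    if weights is None:
        weights = [1.0 / n_ch] * n_ch       # equal weights by default
    n_frames = len(channel_scores[0])
    fused = [sum(w * ch[t] for w, ch in zip(weights, channel_scores))
             for t in range(n_frames)]
    return [t for t, s in enumerate(fused) if s >= threshold]
```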

A Bayesian Approach to Stereo Matching via Merging Watershed Regions (워터쉐드 영역병합을 이용한 스테레오 정합의 베이지언 접근방법)

  • Kil, Woo-Sung;Kim, Shin-Hyung;Jang, Jong-Whan
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2005.05a
    • /
    • pp.809-812
    • /
    • 2005
  • This paper proposes a method that minimizes the mismatches that arise in segmentation-based stereo matching of complex scenes. To this end, features for matching are generated by applying watershed segmentation to the left image of the stereo pair, and a Bayesian framework is then applied to merge the regions into groups with similar disparity information. The resulting matching patches have little matching ambiguity, so mismatches are markedly reduced, and reliable disparity maps are produced even for images with low contrast between regions.
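The merging step can be sketched as a greedy union-find pass over adjacent watershed regions with similar mean disparity. This simple threshold rule is a stand-in for the paper's Bayesian merging criterion:

```python
def merge_regions(disparity, adjacency, tol=0.5):
    # Greedy union-find merge of adjacent regions with similar mean disparity.
    # disparity: mean disparity per region; adjacency: pairs of adjacent region ids.
    parent = list(range(len(disparity)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i, j in adjacency:
        ri, rj = find(i), find(j)
        if ri != rj and abs(disparity[ri] - disparity[rj]) <= tol:
            parent[rj] = ri                # merge the two regions
    return [find(i) for i in range(len(disparity))]
```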


On Estimation of HPD Interval for the Generalized Variance Using a Weighted Monte Carlo Method

  • Kim, Hea-Jung
    • Communications for Statistical Applications and Methods
    • /
    • v.9 no.2
    • /
    • pp.305-313
    • /
    • 2002
  • Regarding inference about a scalar measure of the internal scatter of a p-variate normal population, this paper considers interval estimation of the generalized variance, │$\Sigma$│. Due to the complicated sampling distribution, a fully parametric frequentist approach to the interval estimation is not available, and thus a Bayesian method is pursued to calculate the highest probability density (HPD) interval for the generalized variance. Since the marginal posterior distribution of the generalized variance is intractable, a weighted Monte Carlo method, a variant of the Chen and Shao (1999) method, is developed to calculate the HPD interval of the generalized variance. The necessary theory for the method and its computation is provided. Finally, a simulation study is given to illustrate and examine the proposed method.
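Given draws from the marginal posterior, the Chen–Shao idea of taking the shortest interval containing the desired mass can be sketched as follows. The weighted two-pointer scan below is a generic illustration, not the authors' exact algorithm:

```python
def hpd_interval(samples, weights=None, cred=0.95):
    # Shortest interval containing `cred` of the (possibly weighted) posterior mass
    if weights is None:
        weights = [1.0] * len(samples)
    pairs = sorted(zip(samples, weights))
    xs = [x for x, _ in pairs]
    ws = [w for _, w in pairs]
    target = cred * sum(ws)
    best = (float("inf"), xs[0], xs[-1])
    j, mass = 0, 0.0
    for i in range(len(xs)):
        while j < len(xs) and mass < target:   # grow window until it holds `cred` mass
            mass += ws[j]
            j += 1
        if mass >= target and xs[j - 1] - xs[i] < best[0]:
            best = (xs[j - 1] - xs[i], xs[i], xs[j - 1])
        mass -= ws[i]                          # slide the window's left edge
    return best[1], best[2]
```

With equal weights this reduces to the usual empirical HPD from MCMC output; the weights accommodate importance-weighted draws as in the weighted Monte Carlo setting.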

EM Algorithm-based Segmentation of Magnetic Resonance Image Corrupted by Bias Field (바이어스필드에 의해 왜곡된 MRI 영상자료분할을 위한 EM 알고리즘 기반 접근법)

  • 김승구
    • The Korean Journal of Applied Statistics
    • /
    • v.16 no.2
    • /
    • pp.305-319
    • /
    • 2003
  • This paper provides a non-Bayesian method, based on an expanded EM algorithm, for segmenting magnetic resonance images degraded by a bias field. For images with intensity as the pixel value, many segmentation methods often fail because of the bias field (low frequency) as well as noise (high frequency). Our contextual approach is designed for noise-corrective segmentation, using a normal mixture model incorporating a Markov random field, and for efficient bias-field correction, using a penalized likelihood to estimate the bias field.
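The normal-mixture core of such a segmentation can be sketched with plain EM. The two-component 1-D version below omits the Markov random field term and the penalized bias-field estimation that are the paper's contribution:

```python
import math

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_two_gaussians(data, iters=100):
    # EM for a two-component 1-D normal mixture of pixel intensities
    mu = [min(data), max(data)]            # initialize means from the data range
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each pixel
        resp = []
        for x in data:
            p = [pi[k] * normal_pdf(x, mu[k], var[k]) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: update mixing weights, means, and variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return pi, mu, var
```

Segmentation then assigns each pixel to the component with the larger responsibility; the paper's MRF prior would smooth these assignments spatially.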

A minimum cost sampling inspection plan for destructive testing (破壞檢査時의 最小費用 샘플링 檢査方式)

  • 趙星九;裵道善
    • Journal of the Korean Statistical Society
    • /
    • v.7 no.1
    • /
    • pp.27-43
    • /
    • 1978
  • This paper deals with the problem of obtaining a minimum cost acceptance sampling plan for destructive testing. The cost model is constructed under the assumption that the sampling procedure takes the following form: 1) lots rejected on the first sample are screened with non-destructive testing; 2) the screening is assumed to be imperfect, and therefore, after the screening, a second sample is taken to determine whether to accept the lot or to scrap it. The usual sampling procedures for destructive testing can be regarded as special cases of the above. Utilizing Hald's Bayesian approach, procedures for finding the globally optimal sampling plans are given. However, when the lot size is large, the global plan is very difficult to obtain even with the aid of an electronic computer, so a method of finding a suboptimal plan is suggested. An example with a uniform prior is also given.


Pullout capacity of small ground anchors: a relevance vector machine approach

  • Samui, Pijush;Sitharam, T.G.
    • Geomechanics and Engineering
    • /
    • v.1 no.3
    • /
    • pp.259-262
    • /
    • 2009
  • This paper examines the potential of the relevance vector machine (RVM) for predicting the pullout capacity of small ground anchors. RVM is based on a Bayesian formulation of a linear model with an appropriate prior that results in a sparse representation. The results are compared with a widely used artificial neural network (ANN) model. Overall, the RVM showed good performance and proved better than the ANN model; it also estimates the prediction variance. The plausibility of the RVM technique is shown by its superior performance in forecasting the pullout capacity of small ground anchors given exogenous knowledge.