• Title/Summary/Keyword: Adaptive Importance Sampling Method


Adaptive kernel method for evaluating structural system reliability

  • Wang, G.S.; Ang, A.H.S.; Lee, J.C.
    • Structural Engineering and Mechanics / v.5 no.2 / pp.115-126 / 1997
  • Importance sampling methods have been developed to reduce the computational cost inherent in Monte Carlo methods. This study proposes a new algorithm, called the adaptive kernel method, which combines and modifies concepts from adaptive sampling and the simple kernel method to evaluate the structural reliability of time-variant problems. The essence of the resulting algorithm is to select an appropriate starting point from which the importance sampling density can be generated efficiently. Numerical results show that the method is unbiased and substantially more efficient than other methods.
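The essence described above, picking a good starting point and building the importance sampling density around it, can be sketched in a few lines. This is a minimal illustration of the general idea rather than the paper's algorithm: the scalar limit state `g`, the presample size, and the shifted-normal sampling density are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # hypothetical scalar limit state: failure when g(x) < 0, X ~ N(0, 1)
    return 3.0 - x

# Stage 1: crude presampling locates a starting point in the failure region.
pre = rng.standard_normal(200_000)
start = pre[g(pre) < 0.0].mean()          # centre of the observed failures

# Stage 2: importance sampling from a unit normal shifted to that point.
n = 50_000
y = start + rng.standard_normal(n)
log_w = -0.5 * y**2 + 0.5 * (y - start) ** 2    # log[phi(y) / h(y)]
pf = np.where(g(y) < 0.0, np.exp(log_w), 0.0).mean()
# pf estimates P(g(X) < 0) = 1 - Phi(3), about 1.35e-3
```

With only a few hundred presample failures needed to place the density, the estimator concentrates its samples near the limit state instead of wasting them in the safe region.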

Reliability Analysis of Stochastic Finite Element Model by the Adaptive Importance Sampling Technique (적응적 중요표본추출법에 의한 확률유한요소모형의 신뢰성분석)

  • 김상효; 나경웅
    • Proceedings of the Computational Structural Engineering Institute Conference / 1999.10a / pp.351-358 / 1999
  • The structural responses of underground structures are examined probabilistically using an elasto-plastic stochastic finite element method in which the spatial distributions of material properties are modeled as stochastic fields. In addition, the adaptive importance sampling method using the response surface technique is used to improve simulation efficiency. The method is found to provide appropriate information even when the nonlinear limit state involves a large number of basic random variables and the failure probability is small. The probability of local plastic failure around an excavated area is effectively evaluated, and the reliability with respect to the limit displacement of the ground is investigated. It is demonstrated that the adaptive importance sampling method can be used very efficiently to evaluate the reliability of a large-scale stochastic finite element model, such as underground structures located in multi-layered ground.
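The response surface idea that makes such large models tractable can be sketched as follows: a handful of "expensive" limit-state evaluations fit a quadratic surface, and all subsequent sampling runs on the cheap surface. The limit state, the design of experiments, and the plain Monte Carlo on the surrogate are illustrative assumptions, not the authors' stochastic finite element setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def g_true(x):
    # hypothetical "expensive" limit state; failure when g < 0
    u = (x[:, 0] + x[:, 1]) / np.sqrt(2.0)
    v = (x[:, 0] - x[:, 1]) / np.sqrt(2.0)
    return 3.0 - u + 0.1 * v**2

def quad_basis(x):
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

# Small design of experiments: 9 "expensive" evaluations fit the surface.
pts = np.array([(a, b) for a in (-2.0, 0.0, 2.0) for b in (-2.0, 0.0, 2.0)])
coef, *_ = np.linalg.lstsq(quad_basis(pts), g_true(pts), rcond=None)

# All subsequent sampling runs on the cheap surface, not the true model.
x = rng.standard_normal((200_000, 2))
pf_surface = np.mean(quad_basis(x) @ coef < 0.0)
pf_direct = np.mean(g_true(x) < 0.0)            # reference, same samples
```

Here the true limit state happens to be quadratic, so the surrogate reproduces the direct estimate while replacing 200,000 expensive analyses with 9.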


A Reliability Analysis Application and Comparative Study on Probabilistic Structure Design for an Automatic Salt Collector (자동채염기의 확률론적 구조설계 구현을 위한 신뢰성 해석 응용과 비교연구)

  • Song, Chang Yong
    • Journal of the Korean Society of Manufacturing Process Engineers / v.19 no.12 / pp.70-79 / 2020
  • This paper describes a comparative study of the characteristics of probabilistic design using various reliability analysis methods in the structure design of an automatic salt collector. The thickness sizing variables of the main structural member were treated as random variables, including the uncertainty of corrosion, an inevitable hazard in the working environment of the automatic salt collector. Probabilistic performance functions were selected from the strength performances of the automatic salt collector structure. The first-order reliability method, second-order reliability method, mean-value reliability method, and adaptive importance sampling method were applied in the reliability analyses. The probabilistic design performances of each method, such as the estimated reliability and the numerical cost, were compared to Monte Carlo simulation results. The adaptive importance sampling method showed the most rational results for the probabilistic structure design of the automatic salt collector.
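Of the methods compared, the first-order reliability method is the simplest to sketch: the HL-RF iteration walks to the most probable failure point and converts its distance from the origin (the reliability index) into a failure probability. The limit state below is a hypothetical stand-in for the salt collector's strength performance functions, not the paper's model.

```python
import numpy as np
from math import erfc, sqrt

def g(u):
    # hypothetical limit state in standard normal space; failure when g < 0
    return 3.0 - u[0] + 0.05 * u[1] ** 2

def grad_g(u):
    return np.array([-1.0, 0.1 * u[1]])

# HL-RF iteration: walk towards the most probable failure point.
u = np.zeros(2)
for _ in range(50):
    gr = grad_g(u)
    u_new = (gr @ u - g(u)) / (gr @ gr) * gr
    if np.linalg.norm(u_new - u) < 1e-10:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)                     # reliability index
pf_form = 0.5 * erfc(beta / sqrt(2.0))       # Phi(-beta)

# Crude Monte Carlo reference on the same limit state.
rng = np.random.default_rng(2)
s = rng.standard_normal((2_000_000, 2))
pf_mc = np.mean(g(s.T) < 0.0)
```

The gap between `pf_form` and `pf_mc` is the linearization error of FORM, which is exactly what motivates comparing it against sampling-based methods.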

A novel reliability analysis method based on Gaussian process classification for structures with discontinuous response

  • Zhang, Yibo; Sun, Zhili; Yan, Yutao; Yu, Zhenliang; Wang, Jian
    • Structural Engineering and Mechanics / v.75 no.6 / pp.771-784 / 2020
  • Reliability analysis techniques combined with various surrogate models have attracted increasing attention because of their accuracy and great efficiency. However, they primarily focus on structures with continuous response, while very little research on reliability analysis for structures with discontinuous response has been carried out. Furthermore, existing adaptive reliability analysis methods based on importance sampling (IS) still have some intractable defects when dealing with small failure probabilities, and there is no related research on reliability analysis for structures involving both discontinuous response and small failure probability. Therefore, this paper proposes a novel reliability analysis method, AGPC-IS, for such structures, which combines adaptive Gaussian process classification (GPC) with adaptive-kernel-density-estimation-based IS. In AGPC-IS, an efficient adaptive strategy for the design of experiments (DoE), taking into consideration the classification uncertainty, the sampling uniformity, and the regional classification accuracy, is developed to improve the accuracy of the Gaussian process classifier. Adaptive kernel density estimation is introduced to construct the quasi-optimal density function of the IS. In addition, a novel and more precise stopping criterion is developed from the perspective of the stability of the failure probability estimate. The efficiency, superiority, and practicability of AGPC-IS are verified by three examples.
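The kernel-density-estimation side of the method can be sketched without the Gaussian process classifier: failure points found by presampling define a KDE that serves as the quasi-optimal IS density. The limit state, the deliberately wide bandwidth (chosen here to keep the IS weights stable in the tails), and the sample sizes are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)

def g(x):
    # hypothetical limit state; failure when g(x) < 0
    return 3.0 - (x[:, 0] + x[:, 1]) / np.sqrt(2.0)

# Presample to collect points in the failure region.
pre = rng.standard_normal((400_000, 2))
centers = pre[g(pre) < 0.0]               # roughly Phi(-3) * 400k points

# Quasi-optimal IS density: Gaussian KDE over the failure samples.
h = 0.8                                   # wide bandwidth for weight stability
n = 20_000
pick = centers[rng.integers(0, len(centers), size=n)]
x = pick + h * rng.standard_normal((n, 2))

def log_kde(x):
    # log pdf of the KDE mixture, evaluated stably via log-sum-exp
    d2 = ((x**2).sum(1)[:, None] + (centers**2).sum(1)[None, :]
          - 2.0 * x @ centers.T) / h**2
    m = -0.5 * d2
    mmax = m.max(axis=1, keepdims=True)
    return (mmax[:, 0] + np.log(np.exp(m - mmax).mean(axis=1))
            - 2.0 * np.log(h) - np.log(2.0 * np.pi))

log_phi = -0.5 * (x**2).sum(1) - np.log(2.0 * np.pi)   # target N(0, I) density
w = np.where(g(x) < 0.0, np.exp(log_phi - log_kde(x)), 0.0)
pf = w.mean()
```

Because every kernel sits on a failure point, nearly all IS samples land where they matter, while the importance weights correct the estimate back to the true probability, here 1 - Phi(3).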

Efficiency and Robustness of Fully Adaptive Simulated Maximum Likelihood Method

  • Oh, Man-Suk; Kim, Dai-Gyoung
    • Communications for Statistical Applications and Methods / v.16 no.3 / pp.479-485 / 2009
  • When part of the data is unobserved, the marginal likelihood of the parameters given the observed data often involves an analytically intractable high-dimensional integral, making it hard to find the maximum likelihood estimate of the parameters. The simulated maximum likelihood (SML) method, which estimates the marginal likelihood via Monte Carlo importance sampling and optimizes the estimated marginal likelihood, has been used in many applications. A key issue in SML is finding a good proposal density from which the Monte Carlo samples are generated. The optimal proposal density is the conditional density of the unobserved data given the parameters and the observed data, and attempts have been made to find good approximations to it. Algorithms that adaptively improve the proposal density have been widely used due to their simplicity and efficiency. In this paper, we describe a fully adaptive algorithm, which has been used by some practitioners but is not well recognized in the statistical literature, and evaluate its estimation performance and robustness via a simulation study. The simulation study shows an improvement of orders of magnitude in the mean squared error compared to non-adaptive or partially adaptive SML methods. It is also shown that fully adaptive SML is robust in the sense that it is insensitive to the starting points of the optimization routine.
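The role of the optimal proposal is easy to demonstrate on a toy conjugate model where the marginal likelihood is known in closed form. The model and sample size below are assumptions; the point is that proposing from the conditional density of the unobserved data given the observed data removes the Monte Carlo error entirely.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy missing-data model: observed y = b + e with unobserved b ~ N(0, 1)
# and e ~ N(0, 1).  The exact marginal likelihood is N(y; 0, 2).
y = 1.7
exact = np.exp(-y**2 / 4.0) / np.sqrt(4.0 * np.pi)

def lik(b):        # p(y | b)
    return np.exp(-0.5 * (y - b) ** 2) / np.sqrt(2.0 * np.pi)

def prior(b):      # p(b)
    return np.exp(-0.5 * b**2) / np.sqrt(2.0 * np.pi)

n = 2_000
# Non-adaptive SML: propose from the prior.
b0 = rng.standard_normal(n)
naive = lik(b0).mean()

# Fully adaptive SML: propose from the conditional density of the
# unobserved data given y, here N(y/2, 1/2) -- the optimal proposal.
b1 = y / 2.0 + np.sqrt(0.5) * rng.standard_normal(n)
q = np.exp(-((b1 - y / 2.0) ** 2)) / np.sqrt(np.pi)    # N(y/2, 1/2) pdf
adaptive = (lik(b1) * prior(b1) / q).mean()
```

With the optimal proposal every importance weight equals the marginal likelihood itself, so `adaptive` matches `exact` to machine precision, while `naive` carries ordinary Monte Carlo noise; adaptive algorithms chase this ideal by refitting the proposal as sampling proceeds.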

Probabilistic Structure Design of Automatic Salt Collector Using Reliability Based Robust Optimization (신뢰성 기반 강건 최적화를 이용한 자동채염기의 확률론적 구조설계)

  • Song, Chang Yong
    • Journal of the Korean Society of Industry Convergence / v.23 no.5 / pp.799-807 / 2020
  • This paper deals with the identification of a probabilistic design using reliability-based robust optimization in the structure design of an automatic salt collector. The thickness sizing variables of the main structural member of the automatic salt collector were treated as random design variables, including the uncertainty of corrosion, an inevitable hazard in the saltern working environment. The probabilistic constraint functions were selected from the strength performances of the automatic salt collector. The reliability-based robust optimum design problem was formulated such that the random design variables are determined by minimizing the weight of the automatic salt collector subject to probabilistic strength performance constraints evaluated by reliability analysis. The mean-value reliability method and the adaptive importance sampling method were applied to the reliability evaluation in the reliability-based robust optimization. Three-sigma-level quality was considered for robustness in the side constraints. The probabilistic optimum design results obtained with each reliability analysis method were compared to deterministic optimum design results. The reliability-based robust optimization using the mean-value reliability method showed the most rational results for the probabilistic optimum structure design of the automatic salt collector.
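The shape of such a formulation can be sketched on a one-variable toy problem: minimize thickness (a proxy for weight) subject to a mean-value (FOSM) reliability constraint at the three-sigma level. All numbers and the stress model are hypothetical, not the salt collector design.

```python
# Toy member sizing: plate thickness t (the design variable) carries a
# random load F ~ N(mu_F, sd_F); the stress F / (w * t) must stay below yield.
mu_F, sd_F = 50_000.0, 8_000.0       # N, assumed load statistics
w, yield_s = 100.0, 250.0            # mm width, MPa yield, assumed
target_beta = 3.0                    # three-sigma quality level

def beta_mv(t):
    # Mean-value (FOSM) reliability index of g = yield_s - F / (w * t).
    # g is linear in F, so beta = E[g] / sd[g] exactly.
    return (yield_s - mu_F / (w * t)) / (sd_F / (w * t))

# Minimise weight (proportional to t) subject to beta >= target, by bisection:
# beta_mv is monotone increasing in t, so the constraint is active at optimum.
lo, hi = 0.1, 50.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if beta_mv(mid) < target_beta else (lo, mid)
t_opt = hi
```

In this linear case the optimum is available in closed form, t* = (mu_F + 3 sd_F) / (yield_s * w) = 2.96 mm; the value of sampling-based reliability methods appears once the constraint functions stop being linear in the random variables.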

Adaptive Importance Sampling Method with Response Surface Technique (응답면기법을 이용한 적응적 중요표본추출법)

  • 나경웅; 김상효; 이상호
    • Computational Structural Engineering / v.11 no.4 / pp.309-320 / 1998
  • Among importance sampling techniques, adaptive importance sampling based on stratified sampling is generally regarded as the most rational. However, when the number of basic random variables is large, as in stochastic-field finite element problems, the number of sample points required by stratified sampling grows sharply and efficiency deteriorates. To overcome this limitation, this study develops a method that applies the stratified sampling not to the basic random variables but to the response, a new random variable that is a function of the basic random variables. Since the response is not expressed in a general functional form and a single response evaluation is computationally expensive, a stratified sampling method using a response surface equation was developed to resolve this difficulty. In the developed technique, the number of simulated basic random variables increases compared with conventional stratified sampling on the basic random variables, but the number of actual response analyses, which dominate the computational cost, is dramatically reduced by using the response surface equation. In particular, the analysis shows that the efficiency gain over existing methods increases as the number of basic random variables grows and as the failure probability of the target limit state decreases.
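The gain from stratifying on the response rather than on many basic variables can be illustrated in one dimension, where a monotone response lets equal-probability strata of the driving variable stratify the response exactly. The Exp(1) "response" below is a stand-in for an expensive structural response, not the paper's stochastic finite element model.

```python
import numpy as np

rng = np.random.default_rng(5)

def response(u):
    # Stand-in for an expensive structural response: the Exp(1) quantile
    # function, so P(response > 3) = exp(-3) exactly.
    return -np.log1p(-u)

# Stratified sampling on the response through its monotone driver:
# m equal-probability strata, k samples drawn inside each stratum.
m, k = 2_000, 5
offsets = rng.random((m, k))
u = (np.arange(m)[:, None] + offsets) / m    # row i stays inside stratum i
pf = np.mean(response(u) > 3.0)              # stratified estimate of P(R > 3)
```

Every stratum that lies entirely on one side of the threshold contributes zero variance; only the single boundary stratum is random, so the worst-case error is 1/m, far below the sampling error of crude Monte Carlo with the same 10,000 evaluations.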


Development of Face Tracking System Using Skin Color and Facial Shape (얼굴의 색상과 모양정보를 이용한 조명 변화에 강인한 얼굴 추적 시스템 구현)

  • Lee, Hyung-Soo
    • The KIPS Transactions:PartB / v.10B no.6 / pp.711-718 / 2003
  • In this paper, we propose a robust face tracking algorithm. It is based on the Condensation algorithm [7] and uses skin color and facial shape as observation measures. Because it is hard to integrate a color weight and a shape weight, we propose a method with two separate trackers that use skin color and facial shape as their respective observation measures: one tracker tracks the skin-colored region and the other tracks the facial shape. We use an importance sampling technique to limit the sampling regions of the two trackers. For the skin-colored-region tracker, we propose an adaptive color model to reduce the effect of illumination changes. The proposed face tracker performs robustly against cluttered backgrounds and illumination changes.
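A Condensation-style tracker is, at its core, a particle filter: propagate particles through a motion model, weight them by the observation measure, and resample. The 1-D drifting state below stands in for a face centre; the motion and observation models are assumptions for illustration, not the paper's color or shape measures.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy Condensation-style tracker: a 1-D state (e.g. face centre) observed
# with noise at each frame; propagate, weight by the observation, resample.
T, n = 30, 1_000
true_x = 5.0 + 0.2 * np.arange(T)                # target drifts rightwards
obs = true_x + 0.5 * rng.standard_normal(T)      # noisy measurements

particles = rng.uniform(0.0, 10.0, n)            # diffuse initial belief
est = np.empty(T)
for t in range(T):
    particles += 0.2 + 0.3 * rng.standard_normal(n)       # dynamics + diffusion
    w = np.exp(-0.5 * ((obs[t] - particles) / 0.5) ** 2)  # observation weight
    w /= w.sum()
    est[t] = w @ particles                       # weighted-mean state estimate
    idx = rng.choice(n, size=n, p=w)             # factored (re)sampling step
    particles = particles[idx]
```

Running two such filters, one weighted by color and one by shape, with each restricting the sampling region of the other, is the paper's way of combining the two cues without fusing their weights directly.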

Developing statistical models and constructing clinical systems for analyzing semi-competing risks data produced from medicine, public health, and epidemiology (의료, 보건, 역학 분야에서 생산되는 준경쟁적 위험자료를 분석하기 위한 통계적 모형의 개발과 임상분석시스템 구축을 위한 연구)

  • Kim, Jinheum
    • The Korean Journal of Applied Statistics / v.33 no.4 / pp.379-393 / 2020
  • A terminal event such as death may censor an intermediate event such as relapse, but not vice versa, in semi-competing risks data, which are often seen in medicine, public health, and epidemiology. We propose a Weibull regression model with a normal frailty to analyze semi-competing risks data when all three transition times of the illness-death model are possibly interval-censored. We construct the conditional likelihood separately depending on the type of subject: still alive with or without the intermediate event, dead with or without the intermediate event, and dead with the intermediate event missing. Optimal parameter estimates are obtained from an iterative quasi-Newton algorithm after marginalization of the full likelihood using adaptive importance sampling. We illustrate the proposed method with extensive simulation studies and the PAQUID (Personnes Agées Quid) data.

Additive hazards models for interval-censored semi-competing risks data with missing intermediate events (결측되었거나 구간중도절단된 중간사건을 가진 준경쟁적위험 자료에 대한 가산위험모형)

  • Kim, Jayoun; Kim, Jinheum
    • The Korean Journal of Applied Statistics / v.30 no.4 / pp.539-553 / 2017
  • We propose a multi-state model to analyze semi-competing risks data with interval-censored or missing intermediate events. This model is an extension of the illness-death model with its three states: healthy, diseased, and dead. The 'diseased' state can be considered the intermediate event. Two more states are added to the illness-death model to incorporate the missing events, which are caused by loss of follow-up before the end of a study. One of them is the lost-to-follow-up (LTF) state, and the other is an unobservable state representing an intermediate event experienced after the occurrence of LTF. Given covariates, we employ the Lin and Ying additive hazards model with a log-normal frailty and construct a conditional likelihood to estimate the transition intensities between the states of the multi-state model. Marginalization of the full likelihood is carried out using adaptive importance sampling, and the optimal solution for the regression parameters is obtained through an iterative quasi-Newton algorithm. Simulation studies are performed to investigate the finite-sample performance of the proposed estimation method in terms of the empirical coverage probability of the true regression parameters. The proposed method is also illustrated with a dataset adapted from Helmer et al. (2001).
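The marginalization step used here and in the preceding paper, integrating a frailty out of the likelihood with an adaptively tuned importance density, can be sketched for a single subject. The multiplicative-frailty hazard, the Newton-based adaptation, and the quadrature check below are illustrative assumptions, much simpler than the paper's additive hazards illness-death likelihood.

```python
import numpy as np

rng = np.random.default_rng(7)

# One subject with event time t; hazard exp(beta + b), frailty b ~ N(0, sig^2).
t, beta, sig = 1.3, 0.2, 0.8

def log_f(b):
    # log integrand: density of t given b, times the (unnormalised) frailty
    # density exp(-b^2 / (2 sig^2)); the 1/(sig sqrt(2 pi)) factor is applied
    # once at the end.
    return (beta + b) - np.exp(beta + b) * t - 0.5 * (b / sig) ** 2

# Adapt the proposal: Newton steps to the mode, curvature sets the spread.
b_hat = 0.0
for _ in range(50):
    lam = np.exp(beta + b_hat)
    grad = 1.0 - t * lam - b_hat / sig**2
    hess = -t * lam - 1.0 / sig**2           # always negative: log_f is concave
    b_hat -= grad / hess
s = np.sqrt(-1.0 / hess)

# Importance sampling from the adapted normal proposal N(b_hat, s^2).
n = 50_000
b = b_hat + s * rng.standard_normal(n)
log_q = -0.5 * ((b - b_hat) / s) ** 2 - np.log(s * np.sqrt(2.0 * np.pi))
lik_is = np.mean(np.exp(log_f(b) - log_q)) / (sig * np.sqrt(2.0 * np.pi))

# Deterministic quadrature reference on a wide, fine grid.
grid = np.linspace(-8.0, 8.0, 20_001)
step = grid[1] - grid[0]
lik_quad = np.exp(log_f(grid)).sum() * step / (sig * np.sqrt(2.0 * np.pi))
```

Because the proposal is refitted to the mode and curvature of the integrand, the importance weights stay nearly constant and the marginal likelihood estimate is stable enough to sit inside a quasi-Newton outer loop over the regression parameters.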