• Title/Summary/Keyword: estimation of probability density function


Estimation of Probability Density Function of Tidal Elevation Data using the Double Truncation Method (이중 절단 기법을 이용한 조위자료의 확률밀도함수 추정)

  • Jeong, Shin-Taek;Cho, Hong-Yeon;Kim, Jeong-Dae;Ko, Dong-Hui
    • Journal of Korean Society of Coastal and Ocean Engineers, v.20 no.3, pp.247-254, 2008
  • The double-peak normal distribution function (DPDF) suggested by Cho et al. (2004) gives excellent goodness-of-fit results, but because its upper and lower limits are unbounded, extremely high and low tidal elevations are frequently generated in Monte-Carlo simulation processes. In this study, a modified DPDF is suggested that introduces upper- and lower-limit parameters and re-scale parameters to remove these problems. The new parameters are estimated optimally with a non-linear optimization solver using the Levenberg-Marquardt scheme. The modified DPDF completely eliminates the unrealistically generated tidal elevations and gives a slightly better fit than the existing DPDF. Because its support is bounded, over- and under-estimation of the design factors is also automatically prevented.
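The truncation idea can be sketched with a toy Monte-Carlo sampler: a two-component normal mixture stands in for the DPDF (the component means, spreads, and truncation limits below are illustrative assumptions, not the paper's fitted values), and draws outside the upper/lower limits are rejected so no unrealistic elevations survive.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the double-peak distribution: an equal-weight
# mixture of two normal components (low-water and high-water peaks).
MU1, SIG1 = -1.0, 0.4
MU2, SIG2 = 1.0, 0.4
LO, HI = -2.0, 2.0   # upper/lower truncation parameters (assumed values)

def sample_truncated_dpdf(n):
    """Draw n samples from the mixture, rejecting values outside [LO, HI]."""
    out = np.empty(0)
    while out.size < n:
        comp = rng.integers(0, 2, size=n)
        x = np.where(comp == 0,
                     rng.normal(MU1, SIG1, size=n),
                     rng.normal(MU2, SIG2, size=n))
        out = np.concatenate([out, x[(x >= LO) & (x <= HI)]])
    return out[:n]

levels = sample_truncated_dpdf(10_000)
```

Every simulated level is bounded by construction, which is the property the modified DPDF enforces through its truncation parameters.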

Image Denoising Using Bivariate Gaussian Model In Wavelet Domain (웨이블릿 영역에서 이변수 가우스 모델을 이용한 영상 잡음 제거)

  • Eom, Il-Kyu
    • Journal of the Institute of Electronics Engineers of Korea SP, v.45 no.6, pp.57-63, 2008
  • In this paper, we present an efficient noise reduction method using a bivariate Gaussian density function in the wavelet domain. In our method, the interscale dependency of wavelet coefficients is modeled by a bivariate Gaussian function, and noise reduction is then performed by Bayesian estimation. The statistical parameter for the Bayesian estimation can be approximated using the Hölder inequality. Simulation results show that our method outperforms previous methods based on bivariate probability models.
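As a hedged illustration of Bayesian estimation under a jointly Gaussian model (made-up covariance and noise values, with random correlated pairs standing in for actual wavelet child/parent coefficients), the MMSE estimate reduces to the linear gain C(C + sigma_n^2 I)^-1 applied to each noisy pair:

```python
import numpy as np

rng = np.random.default_rng(1)

SIGMA_N = 0.5  # known noise standard deviation (assumed)

# Toy "wavelet coefficients": correlated child/parent pairs drawn from a
# bivariate Gaussian signal model (illustrative, not a real wavelet transform).
C_SIGNAL = np.array([[1.0, 0.6],
                     [0.6, 1.0]])
n = 5000
w = rng.multivariate_normal([0.0, 0.0], C_SIGNAL, size=n)   # clean pairs
y = w + rng.normal(0.0, SIGMA_N, size=(n, 2))               # noisy pairs

# Bayesian (MMSE) estimate under the jointly Gaussian model:
#   w_hat = C (C + sigma_n^2 I)^(-1) y
gain = C_SIGNAL @ np.linalg.inv(C_SIGNAL + SIGMA_N**2 * np.eye(2))
w_hat = y @ gain.T

mse_noisy = float(np.mean((y - w) ** 2))
mse_denoised = float(np.mean((w_hat - w) ** 2))
```

The off-diagonal covariance term is what exploits the interscale dependency; with it set to zero the estimator degenerates to independent per-coefficient Wiener shrinkage.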

Contingency Estimation Method based on Stochastic Earned Value Management System (추계적 EVMS 기반 예비비 산정 방법론)

  • Gwak, Han-Seong;Choi, Byung-Youn;Yi, Chang-Yong;Lee, Dong-Eun
    • Proceedings of the Korean Institute of Building Construction Conference, 2018.05a, pp.72-73, 2018
  • The accuracy of contingency estimation plays an important role in dealing with the uncertainty surrounding the financial success of a construction project. The estimate may serve various purposes, such as schedule control, emergency response, and quality expenses. This paper presents a contingency estimation method specific to schedule control. The method 1) implements a stochastic EVMS, 2) detects the specific timing for schedule compression, 3) identifies an optimal strategy for shortening the planned schedule, 4) finds the probability density function (PDF) of the project cost overrun, and 5) estimates the optimal contingency cost for a given level of confidence. The method facilitates expeditious decisions in project budgeting, and its validity is confirmed by a test case.

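Step 5, reading the contingency off the cost-overrun PDF at a chosen confidence level, reduces to taking a percentile of the simulated overrun distribution. A minimal sketch, with a lognormal overrun standing in for the PDF produced by the stochastic EVMS (all numbers are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative overrun distribution, standing in for the cost-overrun PDF
# obtained from the stochastic EVMS simulation.
n_sims = 20_000
overrun = rng.lognormal(mean=3.0, sigma=0.5, size=n_sims)

def contingency(confidence):
    """Contingency = overrun percentile at the chosen confidence level."""
    return float(np.percentile(overrun, 100.0 * confidence))

c80 = contingency(0.80)   # budget covering 80% of simulated outcomes
c95 = contingency(0.95)   # more conservative 95% budget
```

Raising the confidence level monotonically raises the contingency, which is the trade-off the decision-maker tunes at budgeting time.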

Probabilistic assessment on the basis of interval data

  • Thacker, Ben H.;Huyse, Luc J.
    • Structural Engineering and Mechanics, v.25 no.3, pp.331-345, 2007
  • Uncertainties enter a complex analysis from a variety of sources: variability, lack of data, human error, model simplification, and incomplete understanding of the underlying physics. For many important engineering applications, however, insufficient data are available to justify the choice of a particular probability density function (PDF). Sometimes the only data available are interval estimates representing, often conflicting, expert opinion. In this paper we demonstrate that Bayesian estimation techniques can be used successfully even when only vague interval measurements are available. The proposed approach is intended to fit within an established and widely accepted probabilistic framework. To circumvent the problem of selecting a specific PDF when only little or vague data are available, a hierarchical model of a continuous family of PDFs is used. Classical Bayesian estimation methods are extended to make use of imprecise interval data. Each expert opinion (interval datum) is interpreted as a random interval sample of a parent PDF, so a partial conflict between experts is automatically accounted for through the likelihood function.
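A minimal sketch of the interval-data likelihood idea: under an assumed normal parent PDF with known spread, each expert interval [a, b] contributes the probability mass Phi((b - mu)/sigma) - Phi((a - mu)/sigma), and a grid posterior over mu combines the (possibly conflicting) intervals. All numbers below are hypothetical, and the fixed-sigma normal parent is a simplification of the paper's hierarchical family.

```python
import numpy as np
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Expert opinions as interval data (hypothetical, partly conflicting).
intervals = [(4.0, 6.0), (5.0, 7.0), (3.5, 5.5)]
SIGMA = 1.0  # assumed known spread of the parent PDF

# Grid posterior over the mean mu with a flat prior: each interval [a, b]
# contributes the likelihood Phi((b - mu)/sigma) - Phi((a - mu)/sigma).
mu_grid = np.linspace(0.0, 10.0, 1001)
log_like = np.zeros_like(mu_grid)
for a, b in intervals:
    p = np.array([Phi((b - m) / SIGMA) - Phi((a - m) / SIGMA) for m in mu_grid])
    log_like += np.log(np.maximum(p, 1e-300))

post = np.exp(log_like - log_like.max())
post /= post.sum()                      # normalized over the grid
mu_map = float(mu_grid[np.argmax(post)])
```

Because each interval enters through its probability mass rather than a point value, overlap between conflicting intervals is weighed automatically by the likelihood.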

THE MINIMUM VARIANCE UNBIASED ESTIMATION OF SYSTEM RELIABILITY

  • Park, C.J.;Kim, Jae-Joo
    • Journal of Korean Institute of Industrial Engineers, v.4 no.1, pp.29-32, 1978
  • We obtain the minimum variance unbiased estimate of system reliability when a system consists of n components whose lifetimes are assumed to be independent and identically distributed, either negative exponential or geometric random variables. For the negative exponential case, we also obtain the minimum variance unbiased estimate of the probability density function of the i-th order statistic.

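For the exponential case, the classical minimum variance unbiased estimate of the reliability R(t) = exp(-t/theta), based on the complete sufficient statistic T = sum of the n lifetimes, is (1 - t/T)^(n-1) for t < T and 0 otherwise. A quick numerical check of its (approximate) unbiasedness:

```python
import numpy as np

def mvue_reliability(t, lifetimes):
    """MVUE of R(t) = exp(-t/theta) for iid exponential lifetimes:
    (1 - t/T)^(n-1) with T = sum of lifetimes, zero when t >= T."""
    n = len(lifetimes)
    T = float(np.sum(lifetimes))
    if t >= T:
        return 0.0
    return (1.0 - t / T) ** (n - 1)

rng = np.random.default_rng(3)
theta = 2.0
# Average the estimator over many samples to verify it centers on the
# true reliability exp(-1/theta).
est = [mvue_reliability(1.0, rng.exponential(theta, size=10))
       for _ in range(20_000)]
true_R = float(np.exp(-1.0 / theta))
approx = float(np.mean(est))
```

Unlike the plug-in estimate exp(-t / mean lifetime), this estimator is exactly unbiased for every sample size n.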

Time-Delay Estimation in the Multi-Path Channel based on Maximum Likelihood Criterion

  • Xie, Shengdong;Hu, Aiqun;Huang, Yi
    • KSII Transactions on Internet and Information Systems (TIIS), v.6 no.4, pp.1063-1075, 2012
  • To locate an object accurately in a wireless sensor network, distance measurement based on time-delay plays an important role. In this paper, we propose a maximum likelihood (ML) time-delay estimation algorithm for the multi-path wireless propagation channel. We form the joint probability density function by sampling the frequency-domain response of the multi-path channel, which can be obtained with a vector network analyzer. Based on the ML criterion, the time-delay values of the different paths are estimated. Because the ML function is non-linear with respect to the multi-path time-delays, we first obtain coarse values for the different paths using a subspace fitting algorithm, take them as the initial point, and then obtain the ML time-delay estimates with a pattern-search optimization method. Simulation results show that although the variance of the ML estimates does not reach the Cramer-Rao lower bound (CRLB), the method outperforms the subspace fitting algorithm and can serve as a fine-estimation stage.
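The two-stage estimator can be sketched on a synthetic two-path channel. The channel values below are made up; a coarse grid search stands in for the subspace-fitting step, and Nelder-Mead stands in for the pattern-search refinement. Under white Gaussian noise the ML criterion is the residual of a least-squares fit, with the path amplitudes concentrated out linearly:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)

# Synthetic frequency response of a two-path channel (hypothetical values):
#   H(f) = sum_p a_p * exp(-j*2*pi*f*tau_p) + noise
freqs = np.linspace(1e9, 1.1e9, 201)
taus_true = np.array([20e-9, 35e-9])      # path delays (s)
amps_true = np.array([1.0, 0.6])
H = sum(a * np.exp(-2j * np.pi * freqs * t)
        for a, t in zip(amps_true, taus_true))
H += rng.normal(0, 0.02, freqs.size) + 1j * rng.normal(0, 0.02, freqs.size)

def neg_log_like(taus):
    """Concentrated ML cost: amplitudes solved by linear LS, residual returned."""
    E = np.exp(-2j * np.pi * np.outer(freqs, taus))
    a, *_ = np.linalg.lstsq(E, H, rcond=None)
    return float(np.sum(np.abs(H - E @ a) ** 2))

# Stage 1: coarse delay pair by grid search (stand-in for subspace fitting).
grid = np.arange(5e-9, 50e-9, 1e-9)
coarse = min(((t1, t2) for t1 in grid for t2 in grid if t1 < t2),
             key=neg_log_like)

# Stage 2: local refinement (Nelder-Mead, stand-in for pattern search).
res = minimize(lambda t: neg_log_like(np.sort(t)), x0=np.array(coarse),
               method="Nelder-Mead")
taus_hat = np.sort(res.x)
```

The grid stage matters because the concentrated likelihood is highly multi-modal in the delays; the local search only works from a good starting point.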

Estimation of Frequency of Storm Surge Heights on the West and South Coasts of Korea Using Synthesized Typhoons (확률론적 합성태풍을 이용한 서남해안 빈도 해일고 산정)

  • Kim, HyeonJeong;Suh, SeungWon
    • Journal of Korean Society of Coastal and Ocean Engineers, v.31 no.5, pp.241-252, 2019
  • To choose appropriate countermeasures against potential coastal disaster damage caused by storm surges, it is necessary to estimate the frequency of storm surge heights. Because the historical sample of typhoons affecting the coast is small, the tropical cyclone risk model (TCRM) was used to generate 176,689 synthetic typhoons, with historical paths and central pressures incorporated as probability density functions. Moreover, to reproduce typhoons that re-intensify or decay after landfall on the southeast coast of China, the shift angles of historical typhoons were incorporated as a probability density function and applied with a damping parameter; this improved the passing rate of typhoons moving from the southeast coast of China to the south coast. Typhoon characteristics were analyzed from historical typhoon records using the correlations between central pressure, maximum wind speed ($V_{max}$), and radius of maximum wind ($R_{max}$), and then applied to the synthetic typhoons. Storm surges were calculated with the ADCIRC model, considering tides together with the synthetic typhoons, using an automated Perl script. The storm surges caused by the probabilistic synthetic typhoons are similar to the recorded storm surges, so the proposed scheme can be applied to storm surge simulation. Based on these results, extreme values were calculated using the Generalized Extreme Value (GEV) method, and the resulting 100-year return period storm surge compares satisfactorily with the empirical simulation value. The method proposed in this study can be applied to estimate the frequency of storm surges in coastal areas.
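The final GEV step can be sketched with scipy: fit a GEV distribution to annual-maximum surges (the synthetic sample below stands in for the maxima taken from the typhoon surge simulations) and read the 100-year return level off the fitted quantile function.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)

# Hypothetical annual-maximum surge heights (m); real use would take the
# annual maxima from the synthetic-typhoon ADCIRC surge results.
true_c, true_loc, true_scale = -0.1, 1.5, 0.3
annual_max = genextreme.rvs(true_c, loc=true_loc, scale=true_scale,
                            size=200, random_state=rng)

# Fit a GEV and compute the 100-year return level: the surge height
# exceeded with annual probability 1/100.
c, loc, scale = genextreme.fit(annual_max)
surge_100yr = float(genextreme.ppf(1.0 - 1.0 / 100.0, c, loc=loc, scale=scale))
```

The same call with `1 - 1/T` gives any other return period T; the fitted shape parameter controls how fast the return level grows with T.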

Bandwidth selections based on cross-validation for estimation of a discontinuity point in density (교차타당성을 이용한 확률밀도함수의 불연속점 추정의 띠폭 선택)

  • Huh, Jib
    • Journal of the Korean Data and Information Science Society, v.23 no.4, pp.765-775, 2012
  • Cross-validation is a popular method of bandwidth selection in all types of kernel estimation. Maximum likelihood cross-validation, least squares cross-validation, and biased cross-validation have been proposed for bandwidth selection in kernel density estimation. For the case in which the probability density function has a discontinuity point, Huh (2012) proposed a bandwidth selection method using maximum likelihood cross-validation. In this paper, two forms of cross-validation with a one-sided kernel function are proposed for selecting the bandwidths used to estimate the location and jump size of the discontinuity point of a density. These methods are motivated by least squares cross-validation and biased cross-validation. Simulated examples compare the finite-sample performance of the two proposed methods with that of Huh (2012).
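The standard two-sided least squares cross-validation that motivates the paper's one-sided variants can be sketched directly from its definition, LSCV(h) = integral of fhat^2 minus (2/n) times the sum of leave-one-out density estimates at the data points (Gaussian kernel and sample below are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(0.0, 1.0, size=300)   # sample from a smooth true density
n = x.size

def gauss_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def lscv(h):
    """Least squares cross-validation score:
    LSCV(h) = int fhat^2 - (2/n) * sum_i fhat_{-i}(x_i)."""
    grid = np.linspace(x.min() - 3.0, x.max() + 3.0, 400)
    fhat = gauss_kernel((grid[:, None] - x[None, :]) / h).sum(axis=1) / (n * h)
    integral = float(np.sum(fhat**2) * (grid[1] - grid[0]))   # rectangle rule
    K = gauss_kernel((x[:, None] - x[None, :]) / h)
    loo = (K.sum(axis=1) - gauss_kernel(0.0)) / ((n - 1) * h)  # leave-one-out
    return integral - 2.0 * float(np.mean(loo))

# Minimize the score over a bandwidth grid.
hs = np.linspace(0.05, 1.5, 30)
h_best = float(hs[np.argmin([lscv(h) for h in hs])])
```

The one-sided versions in the paper replace the symmetric kernel with left- and right-sided kernels so the criterion can be evaluated on each side of a candidate discontinuity point.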

Fingerprint Image Quality Assessment for On-line Fingerprint Recognition (온라인 지문 인식 시스템을 위한 지문 품질 측정)

  • Lee, Sang-Hoon
    • Journal of the Institute of Electronics Engineers of Korea SP, v.47 no.2, pp.77-85, 2010
  • Fingerprint image quality checking is one of the most important issues in on-line fingerprint recognition, because recognition performance is largely affected by the quality of the fingerprint images. Most previous quality checking methods have considered only the local quality of the fingerprint, yet the global quality must also be estimated to judge whether a fingerprint is usable in an on-line recognition system. Therefore, in this paper, we propose both local and global methods for calculating fingerprint quality. The local quality checking algorithm considers both the condition of the input fingerprint and orientation estimation errors: the 2D gradients of the fingerprint image are first separated into two sets of 1D gradients, and the shapes of the probability density functions (PDFs) of these gradients are measured to determine quality. The global quality checking method uses a neural network to estimate global fingerprint quality from the local quality values. We also analyze matching performance using the FVC2002 database. Experimental results show that the proposed quality checking method yields better matching performance than the NFIQ (NIST Fingerprint Image Quality) method.
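A loose sketch of the local-quality idea: separate the 2D gradient into two 1D gradient sets and summarize the shape of their empirical distributions. The plain standard deviation below is a crude stand-in for the paper's PDF-shape measure, and the blocks are synthetic:

```python
import numpy as np

def gradient_quality(block):
    """Toy local-quality score: split the 2D gradient into two 1D gradient
    sets and sum the spread of their empirical distributions. A clear ridge
    pattern yields wide gradient PDFs; a featureless block yields narrow ones.
    (Illustrative stand-in for the paper's PDF-shape criterion.)"""
    gy, gx = np.gradient(block.astype(float))
    return float(np.std(gx) + np.std(gy))

# A ridge-like block (sinusoidal stripes) vs. a flat, featureless block.
_, xx = np.mgrid[0:32, 0:32]
ridges = 0.5 + 0.5 * np.sin(2.0 * np.pi * xx / 8.0)
flat = np.full((32, 32), 0.5)

q_good = gradient_quality(ridges)
q_bad = gradient_quality(flat)
```

In the paper these local scores are then fed to a neural network to produce the global usability decision.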

Probabilistic distribution of displacement response of frictionally damped structures excited by seismic loads

  • Lee, S.H.;Youn, K.J.;Min, K.W.;Park, J.H.
    • Smart Structures and Systems, v.6 no.4, pp.363-372, 2010
  • Accurate peak response estimation of a seismically excited structure with a frictional damping system (FDS) is very difficult, since a structure with an FDS behaves nonlinearly depending on the structural period, the loading characteristics, and the relative magnitude of the frictional force and the excitation load. Previous studies estimated the peak response by replacing the nonlinear system with an equivalent linear one, or by employing response spectra obtained from nonlinear time history and statistical analyses. When the earthquake excitation is defined probabilistically, the corresponding response of the structure with an FDS also has a probabilistic distribution. In this study, nonlinear time history analyses were performed for a structure with an FDS subjected to artificial earthquake excitations generated using the Kanai-Tajimi filter. An equation for the probability density function (PDF) of the displacement response is proposed by adapting the PDF of the normal distribution, with its coefficients obtained by regression on the statistical distribution of the time history responses. Finally, the correlation between the resulting PDFs and the statistical response distribution is investigated.
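Kanai-Tajimi artificial excitation can be sketched by spectral representation: evaluate the filtered-white-noise spectral density and synthesize a sample accelerogram as a random-phase cosine sum. The ground parameters and the one-sided PSD convention below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(8)

# Kanai-Tajimi parameters (typical firm-soil values, assumed):
WG, ZG, S0 = 15.0, 0.6, 0.01   # ground frequency (rad/s), damping, intensity

def kanai_tajimi_psd(w):
    """One-sided Kanai-Tajimi power spectral density."""
    num = WG**4 + 4.0 * ZG**2 * WG**2 * w**2
    den = (WG**2 - w**2) ** 2 + 4.0 * ZG**2 * WG**2 * w**2
    return S0 * num / den

# Spectral-representation synthesis of one artificial accelerogram:
#   a(t) = sum_k sqrt(2*S(w_k)*dw) * cos(w_k*t + phi_k), random phases phi_k
dw = 0.1
w = np.arange(dw, 50.0, dw)
phi = rng.uniform(0.0, 2.0 * np.pi, size=w.size)
t = np.linspace(0.0, 20.0, 2001)
amp = np.sqrt(2.0 * kanai_tajimi_psd(w) * dw)
accel = (amp[None, :] * np.cos(t[:, None] * w[None, :] + phi[None, :])).sum(axis=1)
```

Feeding many such realizations through the nonlinear FDS model is what produces the statistical displacement distribution that the proposed PDF is regressed against.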