• Title/Summary/Keyword: Probability density estimation


Fatigue Strength Analysis of Propulsion Shafting System with Two Stroke Low Speed Diesel Engine by Torsional Vibration in Frequency Domain (주파수 영역에서 비틀림진동에 의한 저속 2행정 디젤엔진을 갖는 추진축계의 피로강도 해석)

  • Kim, S.H.; Lee, D.C.
    • Proceedings of the Korean Society for Noise and Vibration Engineering Conference / 2007.05a / pp.416-422 / 2007
  • Prime movers in most large merchant ships adopt two-stroke low-speed diesel engines, which offer higher efficiency, mobility, and durability. However, severe torsional vibration in these engines may be induced by large fluctuations of the combustion pressures, and it may sometimes lead to propulsion shafting failure due to accumulated fatigue stresses. Shaft fatigue strength analysis has traditionally been carried out in the time domain, but this approach is complicated and difficult to apply to bi-modal vibration systems such as the cylinder misfiring condition. In this paper the authors introduce an assessment method for estimating the fatigue strength of a propulsion shafting system with a two-stroke low-speed diesel engine in the frequency domain.
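To make the frequency-domain fatigue idea concrete, here is a minimal, hedged sketch (not the authors' method): given torsional stress amplitudes per harmonic order, accumulated damage can be approximated with an S-N curve and Miner's rule. The S-N constants, stress amplitudes, and cycle counts below are hypothetical.

```python
import numpy as np

# Hypothetical S-N curve: allowable cycles N = C / S**m for stress amplitude S (MPa).
m, C = 4.0, 1.0e15

# Hypothetical torsional stress amplitudes per harmonic order (MPa)
# and the number of cycles each order accumulates per year of operation.
stress_amplitudes = np.array([30.0, 22.0, 12.0])
cycles_per_year = np.array([3.0e8, 1.5e8, 7.5e7])

# Miner's rule: damage is the sum of applied cycles over allowable cycles.
allowable_cycles = C / stress_amplitudes**m
damage_per_year = np.sum(cycles_per_year / allowable_cycles)

print(f"accumulated damage per year: {damage_per_year:.3f}")
print(f"estimated fatigue life: {1.0 / damage_per_year:.1f} years")
```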


Mathematical representation to assess the wind resource by three parameter Weibull distribution

  • Sukkiramathi, K.; Rajkumar, R.; Seshaiah, C.V.
    • Wind and Structures / v.31 no.5 / pp.419-430 / 2020
  • The Weibull distribution is a well-known distribution valued for its accuracy and its widespread use in wind energy analysis. The two- and three-parameter Weibull distributions are adopted in this study to fit wind speed data. Daily mean wind speed data from Ennore, Tamil Nadu, India are used to validate the procedure. The parameters are estimated using the maximum likelihood method, the least squares method, and the method of moments. Four statistical tests, namely the root mean square error, the R² test, the Kolmogorov-Smirnov test, and the Anderson-Darling test, are employed to inspect the fit of the Weibull probability density functions. The shape factor, scale factor, wind speed, and wind power at a height of 100 m are determined by extrapolation, and the capacity factor is calculated mathematically. This study provides a way to evaluate feasible locations for wind energy assessment, which can be applied at any windy site throughout the world.
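As a rough illustration of the fit-and-test workflow (a two-parameter maximum likelihood fit only, not the paper's three-parameter estimation), the sketch below uses SciPy; the synthetic wind-speed sample stands in for the Ennore data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic daily mean wind speeds (m/s); real data would come from the site.
wind_speed = stats.weibull_min.rvs(2.0, loc=0, scale=6.5, size=365, random_state=rng)

# Maximum likelihood fit of a two-parameter Weibull (location fixed at 0).
shape, loc, scale = stats.weibull_min.fit(wind_speed, floc=0)

# Goodness of fit: Kolmogorov-Smirnov test against the fitted distribution.
ks_stat, p_value = stats.kstest(wind_speed, 'weibull_min', args=(shape, loc, scale))

# Mean wind power density (W/m^2) for air density rho, via E[v^3] of the fit.
rho = 1.225
mean_v3 = stats.weibull_min.moment(3, shape, loc=loc, scale=scale)
power_density = 0.5 * rho * mean_v3

print(f"shape k = {shape:.2f}, scale c = {scale:.2f} m/s")
print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
print(f"mean wind power density ≈ {power_density:.1f} W/m^2")
```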

FREQUENCY HISTOGRAM MODEL FOR LINE TRANSECT DATA WITH AND WITHOUT THE SHOULDER CONDITION

  • Eidous, Omar
    • Journal of the Korean Statistical Society / v.34 no.1 / pp.49-60 / 2005
  • In this paper we introduce a nonparametric method for estimating the probability density function of detection distances in line transect sampling. The estimator is obtained using a frequency histogram density estimation method. The asymptotic properties of the proposed estimator are derived and compared with those of the kernel estimator under the assumption that the collected data satisfy the shoulder condition. We find that the asymptotic mean square errors (AMSE) of the two estimators have about the same convergence rate. The formula for the optimal histogram bin width, which minimizes the AMSE, is derived. Moreover, the performances of the corresponding k-nearest-neighbor estimators are studied through simulation. When it is not known whether the shoulder condition is valid, a new semi-parametric model is suggested to fit the line transect data. The performances of the two proposed estimators are studied and compared with some existing nonparametric and semiparametric estimators using simulation techniques. The results demonstrate the superiority of the new estimators in most cases considered.
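A minimal sketch of the frequency-histogram estimator is given below; the bin width uses a generic n^(-1/3) reference rule as a stand-in for the paper's AMSE-optimal formula, and the detection distances are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic perpendicular detection distances (half-normal shaped).
distances = np.abs(rng.normal(0.0, 10.0, size=200))

n = distances.size
# Generic reference bin width of order n**(-1/3); the paper derives the
# AMSE-optimal width, for which this rule is only a stand-in.
h = 3.5 * distances.std(ddof=1) * n ** (-1.0 / 3.0)

edges = np.arange(0.0, distances.max() + h, h)
counts, edges = np.histogram(distances, bins=edges)

# Histogram density estimate: relative frequency divided by bin width.
f_hat = counts / (n * h)

# f(0) is the estimate in the first bin; in line transect sampling the
# animal density is then estimated as D = n * f(0) / (2 * L) for transect length L.
f0_hat = f_hat[0]
print(f"bin width h = {h:.2f}, estimated f(0) = {f0_hat:.4f}")
```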

Posterior density estimation of Kappa via Gibbs sampler in the beta-binomial model (베타-이항 분포에서 Gibbs sampler를 이용한 평가 일치도의 사후 분포 추정)

  • 엄종석; 최일수; 안윤기
    • The Korean Journal of Applied Statistics / v.7 no.2 / pp.9-19 / 1994
  • The beta-binomial model, reparametrized in terms of the mean probability $\mu$ of a positive diagnosis and the measure of agreement $\kappa$, is widely used in psychology. When $\mu$ is close to 0, inference about $\kappa$ becomes difficult because the likelihood function becomes nearly constant. We consider a Bayesian approach in this case. To apply the Bayesian analysis, a Gibbs sampler is used to overcome the difficulties of integration. Marginal posterior density functions are estimated and Bayesian estimates are derived using the Gibbs sampler, and the results are compared with those obtained by numerical integration.
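As a hedged sketch of the numerical-integration route mentioned for comparison (not the authors' Gibbs sampler), one can evaluate the reparametrized beta-binomial likelihood on a (μ, κ) grid with flat priors and marginalize over μ to approximate the posterior density of κ; the agreement data below are hypothetical.

```python
import numpy as np
from scipy.special import betaln, gammaln

# Hypothetical agreement data: y_i positives out of n_i ratings per subject.
y = np.array([0, 1, 0, 2, 0, 0, 1, 0, 3, 0])
n = np.array([3, 3, 3, 3, 3, 3, 3, 3, 3, 3])

def log_lik(mu, kappa):
    """Beta-binomial log likelihood under the (mu, kappa) parametrization."""
    a = mu * (1.0 - kappa) / kappa
    b = (1.0 - mu) * (1.0 - kappa) / kappa
    log_binom = gammaln(n + 1) - gammaln(y + 1) - gammaln(n - y + 1)
    return np.sum(log_binom + betaln(y + a, n - y + b) - betaln(a, b))

# Grid evaluation with flat priors on (mu, kappa); marginalize over mu.
mus = np.linspace(0.01, 0.99, 200)
kappas = np.linspace(0.01, 0.99, 200)
log_post = np.array([[log_lik(m, k) for m in mus] for k in kappas])
post = np.exp(log_post - log_post.max())

kappa_marginal = post.sum(axis=1)              # sum over mu
dk = kappas[1] - kappas[0]
kappa_marginal /= kappa_marginal.sum() * dk    # normalize to a density

print("posterior mean of kappa ≈", np.sum(kappas * kappa_marginal) * dk)
```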


A QUALITATIVE METHOD TO ESTIMATE HSI DISPLAY COMPLEXITY

  • Hugo, Jacques; Gertman, David
    • Nuclear Engineering and Technology / v.45 no.2 / pp.141-150 / 2013
  • There is mounting evidence that complex computer system displays in control rooms contribute to cognitive complexity and, thus, to the probability of human error. Research shows that reaction time increases and response accuracy decreases as the number of elements in the display screen increases. However, in terms of supporting the control room operator, approaches that address display complexity solely in terms of information density, location, and patterning will fall short of delivering a properly designed interface. This paper argues that information complexity and semantic complexity are mandatory components when considering display complexity, and that adding these concepts assists in understanding and resolving differences between designers and the preferences and performance of operators. The paper concludes that a number of simplified methods, when combined, can be used to estimate the impact that a particular display may have on the operator's ability to perform a function accurately and effectively. We present a mixed qualitative and quantitative approach and a method for complexity estimation.

Bayesian estimation for finite population proportions in multinomial data

  • Kwak, Sang-Gyu; Kim, Dal-Ho
    • Journal of the Korean Data and Information Science Society / v.23 no.3 / pp.587-593 / 2012
  • We study Bayesian estimates of finite population proportions in multinomial problems. To do this, we consider a three-stage hierarchical Bayesian model. As a prior, we use a Dirichlet density to model the cell probabilities in each cluster. Our method does not require complicated computation such as the Metropolis-Hastings algorithm to draw samples from each parameter density; we draw samples using a Gibbs sampler with a grid method. We apply this algorithm to simulated data under three scenarios and estimate the finite population proportions using two kinds of approaches. We compare the results with the point estimates of the finite population proportions and their standard deviations. Finally, we check the consistency of the computation using different samples drawn from distinct iterates.
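A hedged, single-cluster sketch of the Dirichlet building block is shown below (the paper's model is three-stage hierarchical and uses a Gibbs sampler with a grid method, which is not reproduced here): with a Dirichlet prior on the cell probabilities, the posterior is again Dirichlet, and finite population proportions follow from the posterior predictive for the unsampled units. The counts, population size, and prior are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical multinomial counts for one cluster (3 response categories).
counts = np.array([18, 7, 5])
N_sampled = counts.sum()       # sampled units in the cluster
N_population = 120             # assumed finite population size of the cluster

# Dirichlet prior on the cell probabilities; Dirichlet(1, 1, 1) is uniform.
prior = np.ones_like(counts, dtype=float)

# Conjugacy: posterior of the cell probabilities is Dirichlet(prior + counts).
theta_draws = rng.dirichlet(prior + counts, size=5000)

# Finite population proportions: combine observed cells with predicted
# categories for the (N_population - N_sampled) unobserved units.
unobserved = np.array([rng.multinomial(N_population - N_sampled, t)
                       for t in theta_draws])
fp_proportions = (counts + unobserved) / N_population

print("posterior means:", fp_proportions.mean(axis=0).round(3))
print("posterior std devs:", fp_proportions.std(axis=0).round(3))
```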

Detection and Assessment of Forest Cover Change in Gangwon Province, Inter-Korean, Based on Gaussian Probability Density Function (가우시안 확률밀도 함수기반 강원도 남·북한 지역의 산림면적 변화탐지 및 평가)

  • Lee, Sujong; Park, Eunbeen; Song, Cholho; Lim, Chul-Hee; Cha, Sungeun; Lee, Sle-gee; Lee, Woo-Kyun
    • Korean Journal of Remote Sensing / v.35 no.5_1 / pp.649-663 / 2019
  • A 2018 United Nations Development Programme (UNDP) report announced that deforestation in North Korea is among the most extreme cases and, in terms of climate change, is a global-scale issue. To respond to deforestation, various studies and projects have been conducted based on remote sensing, but access to public data in North Korea is limited and objectivity is difficult to guarantee. In this study, forest detection based on statistical density estimation using Landsat imagery was conducted in Gangwon Province, the only administrative district divided between South and North Korea. South Korean forest spatial data were used to label forest and non-forest in the Normalized Difference Vegetation Index (NDVI), and a threshold (0.6658) for forest detection was set by estimating a Gaussian probability density function (PDF) for each category. The results show that the forest area decreased until the 2000s in both Koreas but increased in the 2010s. It is also confirmed that the reduction of forest area at the local scale matches the urbanization and industrialization policy directions of the time. The Kappa values for validation indicated strong agreement (0.8) and moderate agreement (0.6), respectively. Detection based on Gaussian PDF estimation is considered a method for complementing the statistical limitations of existing detection methods using satellite imagery. This study can serve as baseline data on deforestation in North Korea, and based on the detection results, it is necessary to protect and restore forest resources.
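The thresholding step can be sketched as follows (a simplified stand-in for the paper's category-wise Gaussian PDF estimation; the NDVI samples are synthetic, whereas the published threshold of 0.6658 comes from the actual labelled data): fit a Gaussian to forest and non-forest NDVI values and take the point where the two densities intersect as the decision threshold.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(7)
# Synthetic NDVI samples for labelled forest / non-forest pixels.
ndvi_forest = rng.normal(0.78, 0.08, size=2000)
ndvi_nonforest = rng.normal(0.45, 0.12, size=2000)

# Gaussian PDF estimate for each category (mean and standard deviation).
mu_f, sd_f = ndvi_forest.mean(), ndvi_forest.std(ddof=1)
mu_n, sd_n = ndvi_nonforest.mean(), ndvi_nonforest.std(ddof=1)

# Threshold = NDVI value where the two fitted densities intersect.
def density_gap(x):
    return stats.norm.pdf(x, mu_f, sd_f) - stats.norm.pdf(x, mu_n, sd_n)

threshold = optimize.brentq(density_gap, mu_n, mu_f)
print(f"forest / non-forest NDVI threshold ≈ {threshold:.4f}")

# A pixel would then be classified as forest when its NDVI exceeds the threshold.
```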

Optimization of Gaussian Mixture in CDHMM Training for Improved Speech Recognition

  • Lee, Seo-Gu; Kim, Sung-Gil; Kang, Sun-Mee; Ko, Han-Seok
    • Speech Sciences / v.5 no.1 / pp.7-21 / 1999
  • This paper proposes an improved training procedure for speech recognition based on the continuous density hidden Markov model (CDHMM). Of the three parameters governing the CDHMM (initial state distribution probability, state transition probability, and output probability density function (p.d.f.) of each state), we focus on the third and propose an efficient algorithm that determines the p.d.f. of each state. It is known that the resulting CDHMM converges to a local maximum of the parameter estimate via the iterative Expectation-Maximization procedure. Specifically, we propose two independent algorithms that can be embedded in the segmental K-means training procedure by replacing relevant key steps: the adaptation of the number of Gaussian mixture components and the initialization using previously estimated CDHMM parameters. The proposed adaptation algorithm searches for the optimal number of Gaussian mixture components to ensure that the p.d.f. is consistently re-estimated, enabling the model to converge toward the global maximum point. By applying an appropriate threshold to the amount of collective change in the weighted variances, the optimized number of mixture components is determined. The initialization algorithm exploits the previously estimated CDHMM parameters and uses them as the basis for the current initial segmentation subroutine; it captures the trend of the previous training history, whereas uniform segmentation discards it. The recognition performance of the proposed adaptation procedure, along with the suggested initialization, is verified to be consistently better than that of the existing training procedure using a fixed number of Gaussian mixture components.
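A loose analogue of the mixture-count adaptation is sketched below (not the paper's segmental K-means embedding or its weighted-variance criterion): grow the number of Gaussian components until the relative improvement in data log-likelihood falls below an assumed threshold.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Synthetic 2-D frame features for one HMM state (three underlying clusters).
frames = np.vstack([rng.normal(m, 0.5, size=(300, 2)) for m in (-2.0, 0.0, 2.5)])

threshold = 0.01        # assumed stopping threshold on relative improvement
prev_ll = -np.inf
n_components = 0
for k in range(1, 9):
    gmm = GaussianMixture(n_components=k, covariance_type='diag',
                          random_state=0).fit(frames)
    ll = gmm.score(frames)                  # mean log-likelihood per frame
    if prev_ll != -np.inf and (ll - prev_ll) / abs(prev_ll) < threshold:
        break                               # improvement too small: stop growing
    n_components, prev_ll = k, ll

print(f"selected number of mixture components: {n_components}")
```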


Bayesian Parameter Estimation for Prognosis of Crack Growth under Variable Amplitude Loading (변동진폭하중 하에서 균열성장예지를 위한 베이지안 모델변수 추정법)

  • Leem, Sang-Hyuck; An, Da-Wn; Choi, Joo-Ho
    • Transactions of the Korean Society of Mechanical Engineers A / v.35 no.10 / pp.1299-1306 / 2011
  • In this study, the parameters of a crack-growth model under variable amplitude loading are estimated in the form of a probability distribution using Bayesian parameter estimation. Huang's model is employed to describe the retardation and acceleration of crack growth during the loadings. The Markov chain Monte Carlo (MCMC) method is used to obtain samples of the parameters following the probability distribution. Because the conventional MCMC method often fails to converge to the equilibrium distribution owing to the increased complexity of the model under variable amplitude loading, an improved MCMC method is introduced in which a marginal probability density function (PDF) is employed as the proposal density. The model parameters are estimated from data on several test specimens subjected to constant amplitude loading. Prediction is then made for the same specimen under variable amplitude loading using the estimated parameters and validated against the ground-truth data.
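A baseline random-walk Metropolis sketch for such a parameter posterior is given below; the paper's improvement replaces the proposal with a marginal PDF, which is not reproduced here, and the crack-growth law, data, and noise level are hypothetical stand-ins for Huang's model.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical crack lengths (mm) observed at these cycle counts.
cycles = np.array([0.0, 2e4, 4e4, 6e4, 8e4, 1e5])
a_obs = np.array([5.0, 5.8, 6.8, 8.0, 9.5, 11.5])
sigma_meas = 0.3       # assumed measurement noise (mm)
delta_sigma = 80.0     # assumed constant-amplitude stress range (MPa)

def predict(logC, m):
    """Euler-integrate a simple Paris-type law (a stand-in for Huang's model)."""
    dN = 500.0
    grid = np.arange(cycles[0], cycles[-1] + dN, dN)
    a = np.empty_like(grid)
    a[0] = a_obs[0] * 1e-3                      # mm -> m
    for i in range(1, grid.size):
        dK = delta_sigma * np.sqrt(np.pi * a[i - 1])
        a[i] = a[i - 1] + np.exp(logC) * dK**m * dN
    return np.interp(cycles, grid, a) * 1e3     # m -> mm

def log_post(theta):
    pred = predict(*theta)                      # flat priors assumed
    return -0.5 * np.sum((pred - a_obs) ** 2) / sigma_meas**2

# Random-walk Metropolis over (log C, m).
theta = np.array([-23.5, 3.0])
samples, lp = [], log_post(theta)
for _ in range(5000):
    prop = theta + rng.normal(0, [0.1, 0.05])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples)[1000:]              # discard burn-in

print("posterior mean of (log C, m):", samples.mean(axis=0).round(3))
```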

Research on improvement of target tracking performance of LM-IPDAF through improvement of clutter density estimation method (클러터밀도 추정 방법 개선을 통한 LM-IPDAF의 표적 추적 성능 향상 연구)

  • Yoo, In-Je; Park, Sung-Jae
    • Journal of the Korea Academia-Industrial cooperation Society / v.18 no.5 / pp.99-110 / 2017
  • Improving tracking performance by estimating the states of multiple targets with radar is important. In a cluttered environment, joint events arise between tracks and measurements when a tracking filter is used for multiple target tracking, and the number of joint events grows exponentially as the numbers of tracks and measurements increase. Two problems must be considered when designing a multiple target tracking filter for such environments. First, the filter should minimize the false track alarm rate by eliminating false tracks and quickly confirming target tracks; the purpose is to increase the FTD performance. Second, track maintenance performance should be improved by efficiently allocating each measurement to a track when a joint event occurs. With these two considerations, single target tracking data association techniques are extended to multiple target tracking filters; representative algorithms are the JIPDAF and the LM-IPDAF. This study adopts the LM-IPDAF algorithm, which avoids a probabilistic evaluation of the many hypotheses in the assignment of measurements, so that the amount of computation does not increase nonlinearly with the numbers of measurements and tracks, and which computes the track existence probability based on the track density. This paper also proposes a method to reduce the computational complexity by improving the clutter density estimation method used to calculate the track existence probability of the LM-IPDAF. The performance was verified by comparison with the existing algorithm through simulation. As a result, the simulation processing time was reduced by approximately 20% while achieving equivalent performance in terms of position RMSE and Confirmed True Track.
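The clutter-density ingredient can be illustrated with a standard non-parametric estimate (a generic textbook formula, not the improved method proposed in the paper): approximate the clutter spatial density by the number of validated measurements, less those expected from the target, divided by the validation-gate volume.

```python
import numpy as np

def gate_volume_2d(S, gamma):
    """Area of a 2-D elliptical validation gate with innovation covariance S."""
    return np.pi * gamma * np.sqrt(np.linalg.det(S))

def clutter_density(num_validated, S, gamma, expected_target_meas=1.0):
    """Generic non-parametric clutter density: measurements per unit gate area,
    after subtracting the measurements expected from the target itself."""
    V = gate_volume_2d(S, gamma)
    return max(num_validated - expected_target_meas, 0.0) / V

# Example: assumed innovation covariance in (x, y) and a ~99% gate threshold
# (chi-square quantile with 2 degrees of freedom).
S = np.array([[25.0, 0.0],
              [0.0, 16.0]])
gamma = 9.21
print(f"estimated clutter density: {clutter_density(7, S, gamma):.5f} per unit area")
```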