• Title/Summary/Keyword: Probability density estimate

Power Estimation by Using Testability (테스트 용이도를 이용한 전력소모 예측)

  • Lee, Jae-Hun; Min, Hyeong-Bok
    • The Transactions of the Korea Information Processing Society / v.6 no.3 / pp.766-772 / 1999
  • With the increase of portable systems and high-density ICs, the power consumption of VLSI circuits has become a very important factor in the design process, and power estimation is required to predict it. A simple and accurate approach to power estimation is circuit simulation, but it is very time-consuming and inefficient. Probabilistic methods have been proposed to overcome this problem. The transition-density approach, based on probability, is an efficient way to estimate power consumption using BDDs and the Boolean difference, but building the BDD and computing the complex Boolean difference is difficult. In this paper, we propose Propowest, which builds a digraph of the circuit and computes transition density easily and quickly using a modified COP algorithm. Propowest thus provides an efficient way to estimate power.
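
As a rough illustration of the probabilistic approach described above (not the authors' Propowest tool, whose modified COP rules are not given in the abstract), the following Python sketch propagates COP-style signal probabilities through a gate-level digraph in topological order and applies Najm's transition-density rule, assuming independent signals. The netlist and input statistics are hypothetical.

```python
# A minimal sketch of COP-style probability propagation plus Najm's
# transition-density rule on a gate-level digraph. Illustrative only; the
# paper's modified COP algorithm is not reproduced here. Signals are assumed
# independent, the usual COP approximation.

def propagate(netlist, p, d):
    """netlist: (output, gate_type, input_names) tuples in topological order.
    p: signal probabilities of primary inputs; d: their transition densities.
    Both dicts are extended in place with internal/output signals."""
    for out, kind, ins in netlist:
        if kind == "NOT":
            p[out] = 1.0 - p[ins[0]]
            d[out] = d[ins[0]]              # an inverter passes every transition
        elif kind in ("AND", "OR"):
            # For AND: P(y) = prod P(x); the Boolean difference dy/dx_i is 1
            # exactly when all other inputs are 1, so P(dy/dx_i) = prod_{j!=i} P(x_j).
            # OR is handled by De Morgan duality on the complemented inputs.
            q = [p[x] if kind == "AND" else 1.0 - p[x] for x in ins]
            prod = 1.0
            for v in q:
                prod *= v
            p[out] = prod if kind == "AND" else 1.0 - prod
            # Najm's rule: D(y) = sum_i P(dy/dx_i) * D(x_i)
            d[out] = sum((prod / q[i] if q[i] else 0.0) * d[x]
                         for i, x in enumerate(ins))
    return p, d

# Example: y = (a AND b) OR (NOT c), inputs switching with the given stats.
netlist = [("t1", "AND", ["a", "b"]), ("t2", "NOT", ["c"]),
           ("y", "OR", ["t1", "t2"])]
p = {"a": 0.5, "b": 0.5, "c": 0.5}
d = {"a": 2.0, "b": 1.0, "c": 4.0}          # transitions per unit time
p, d = propagate(netlist, p, d)
print("P(y) =", p["y"], " D(y) =", d["y"])
```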

THE MINIMUM VARIANCE UNBIASED ESTIMATION OF SYSTEM RELIABILITY

  • Park, C.J.; Kim, Jae-Joo
    • Journal of Korean Institute of Industrial Engineers / v.4 no.1 / pp.29-32 / 1978
  • We obtain the minimum variance unbiased estimate of system reliability when a system consists of n components whose lifetimes are assumed to be independent and identically distributed, following either a negative exponential or a geometric distribution. For the negative exponential lifetime case, we also obtain the minimum variance unbiased estimate of the probability density function of the i-th order statistic.
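
For context, the exponential case admits a well-known closed-form UMVUE of component reliability R(t) = P(X > t): with total observed lifetime T from n i.i.d. samples, the estimator is (1 - t/T)^(n-1) for t < T and 0 otherwise. The sketch below checks its unbiasedness on simulated data; the paper's system-level and order-statistic results are more general.

```python
# A sketch of the classical UMVUE of reliability R(t) = P(X > t) for i.i.d.
# exponential lifetimes: (1 - t/T)^(n-1) with T the total observed lifetime.
# Background illustration only; the paper's system-level estimator is richer.
import math
import random

def umvue_exp_reliability(lifetimes, t):
    n, total = len(lifetimes), sum(lifetimes)
    return (1.0 - t / total) ** (n - 1) if t < total else 0.0

random.seed(1)
true_mean, t, n = 10.0, 5.0, 8
estimates = [umvue_exp_reliability(
                 [random.expovariate(1 / true_mean) for _ in range(n)], t)
             for _ in range(20000)]
print("mean of UMVUE over replications:", sum(estimates) / len(estimates))
print("true reliability exp(-t/mean)  :", math.exp(-t / true_mean))  # ~0.6065
```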

Fingerprint Image Quality Assessment for On-line Fingerprint Recognition (온라인 지문 인식 시스템을 위한 지문 품질 측정)

  • Lee, Sang-Hoon
    • Journal of the Institute of Electronics Engineers of Korea SP / v.47 no.2 / pp.77-85 / 2010
  • Fingerprint image quality checking is one of the most important issues in on-line fingerprint recognition because recognition performance is strongly affected by the quality of the fingerprint images. Most previous fingerprint quality checking methods have considered only the local quality of the fingerprint. However, it is also necessary to estimate the global quality of a fingerprint to judge whether it is usable in an on-line recognition system. Therefore, in this paper, we propose both local and global methods to calculate fingerprint quality. The local quality checking algorithm considers both the condition of the input fingerprint and orientation estimation errors. The 2D gradients of the fingerprint image are first separated into two sets of 1D gradients; then the shapes of the probability density functions (PDFs) of these gradients are measured to determine fingerprint quality. The global quality checking method uses a neural network to estimate the global fingerprint quality from the local quality values. We also analyze matching performance using the FVC2002 database. Experimental results show that the proposed quality checking method yields better matching performance than the NFIQ (NIST Fingerprint Image Quality) method.
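
One plausible reading of the gradient-PDF step is sketched below in Python with hypothetical patches: compute the 2D gradient of a local block, split it into the two 1D gradient sets, and use the entropy of each gradient histogram as a shape measure (a flat, noise-like PDF scores high; a concentrated, ridge-like PDF scores low). The paper's exact shape measure and the neural-network stage are not reproduced.

```python
# Sketch: measuring the shape of gradient PDFs as a local quality cue.
# `block` is a grayscale fingerprint patch; lower entropy here is read as
# sharper, more oriented ridge structure. Illustrative interpretation only.
import numpy as np

def gradient_pdf_shape(block, bins=32):
    gy, gx = np.gradient(block.astype(float))   # 2D gradient -> two 1D sets
    scores = []
    for g in (gx.ravel(), gy.ravel()):
        hist, _ = np.histogram(g, bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]
        scores.append(-(p * np.log2(p)).sum())  # flat PDF -> high entropy
    return scores

rng = np.random.default_rng(0)
noisy = rng.normal(size=(16, 16))               # noise-like patch
x = np.linspace(0, 4 * np.pi, 16)
ridged = np.tile(np.sin(x), (16, 1))            # ridge-like patch
print("noisy :", gradient_pdf_shape(noisy))
print("ridged:", gradient_pdf_shape(ridged))
```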

Pattern Optimization of Intentional Blade Mistuning for the Reduction of the Forced Response Using Genetic Algorithm

  • Park, Byeong-Keun
    • Journal of Mechanical Science and Technology / v.17 no.7 / pp.966-977 / 2003
  • This paper investigates how intentional mistuning of bladed disks reduces their sensitivity to unintentional random mistuning. The class of intentionally mistuned disks considered here is limited, for cost reasons, to arrangements of two types of blades (say, A and B). A two-step procedure is then described to optimize the arrangement of these blades around the disk so as to reduce the effects of unintentional random mistuning. First, a pure optimization effort using a genetic algorithm seeks the pattern(s) of A and B blades that yield a small (ideally the smallest) value of the largest amplitude of response to a given excitation in the absence of unintentional random mistuning. Then, in the second step, a qualitative/quantitative estimate of the sensitivity of the optimized intentionally mistuned bladed disks to unintentional random mistuning is obtained by analyzing their amplification factor, probability density function, and passband/stopband structure. Examples of application with simple bladed-disk models demonstrate the significant benefits of this class of intentionally mistuned disks.
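
The first optimization step can be illustrated with a bare-bones genetic algorithm over binary A/B arrangements, as in the sketch below. The real objective, the peak forced-response amplitude, requires a structural model; the `fitness` function here is only a stand-in that penalizes long runs of identical blades.

```python
# Sketch: a minimal genetic algorithm over two-blade (A/B) arrangements,
# illustrating the optimization step the abstract describes. `fitness` is a
# placeholder objective (lower = better), not the forced-response amplitude.
import random

N_BLADES, POP, GENS = 24, 40, 60
random.seed(0)

def fitness(pattern):                      # penalize long runs of one blade type
    worst_run, run = 1, 1
    for i in range(1, len(pattern)):
        run = run + 1 if pattern[i] == pattern[i - 1] else 1
        worst_run = max(worst_run, run)
    return worst_run

def crossover(a, b):                       # one-point crossover
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(p, rate=0.05):                  # flip each gene with prob `rate`
    return tuple(g ^ 1 if random.random() < rate else g for g in p)

pop = [tuple(random.randint(0, 1) for _ in range(N_BLADES)) for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness)
    elite = pop[: POP // 4]                # keep the best quarter
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(POP - len(elite))]
best = min(pop, key=fitness)
print("best A/B pattern:", "".join("AB"[g] for g in best),
      "objective:", fitness(best))
```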

Modeling Quantization Error Using a Laplacian Probability Density Function (Laplacian 분포 함수를 이용한 양자화 잡음 모델링)

  • Choi, Ji-Eun; Lee, Byung-Uk
    • The Journal of Korean Institute of Communications and Information Sciences / v.26 no.11A / pp.1957-1962 / 2001
  • Image and video compression requires a quantization error model of the DCT coefficients for post-processing, restoration, or transcoding. Once the DCT coefficients are quantized, it is impossible to recover their original distribution. We assume that the original probability density function (pdf) is Laplacian, calculate the variance of the quantized variable, and estimate the variance of the DCT coefficients. We confirm that the proposed method improves the accuracy of quantization error estimation.
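
The following sketch illustrates the modeling setting: a Laplacian source passed through a uniform mid-tread quantizer of step q, with the empirical quantization-error variance compared against the naive uniform-noise rule q²/12. The paper's estimation of the Laplacian parameter from already-quantized coefficients is not reproduced; all constants are illustrative.

```python
# Sketch: quantization-error statistics under a Laplacian source with a
# uniform mid-tread quantizer of step q. Illustrative constants only.
import math
import random

def laplace_sample(scale):
    u = random.random() - 0.5                   # inverse-CDF sampling
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def quantize(x, q):
    return q * round(x / q)                     # uniform mid-tread quantizer

random.seed(0)
scale, q, n = 4.0, 8.0, 200_000
err2 = 0.0
for _ in range(n):
    x = laplace_sample(scale)
    e = x - quantize(x, q)
    err2 += e * e
print("empirical error variance :", err2 / n)
print("uniform-noise rule q^2/12:", q * q / 12)
# With q comparable to the source scale, the two depart, which is why a
# pdf-based (Laplacian) model of the error is more accurate than q^2/12.
```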

Wakeby Distribution and the Maximum Likelihood Estimation Algorithm in Which Probability Density Function Is Not Explicitly Expressed

  • Park, Jeong-Soo
    • Communications for Statistical Applications and Methods / v.12 no.2 / pp.443-451 / 2005
  • This paper studies a new algorithm for finding the maximum likelihood estimate (MLE) when the probability density function is not explicitly expressed. A Newton-Raphson root-finding routine and a constrained nonlinear numerical optimization algorithm (so-called feasible sequential quadratic programming) are used. The algorithm is applied to the Wakeby distribution, which is widely used in hydrology and water resources research for the analysis of extreme rainfall. The performance of maximum likelihood estimates and method-of-L-moments estimates (L-ME) is compared by Monte Carlo simulation; L-ME is recommended for samples of up to 300 observations and MLE for larger samples. Methods for speeding up the algorithm and for computing the variances of the estimates are also discussed.
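
The core trick, evaluating a likelihood when only the quantile function x(F) is explicit, can be sketched as follows: for each observation, invert x(F) numerically (bisection here; the paper uses Newton-Raphson) and use f(x) = 1/x'(F). Parameter values in the example are arbitrary.

```python
# Sketch: evaluating a Wakeby log-likelihood even though the pdf has no
# closed form. The density comes through the quantile function x(F):
# f(x) = 1 / x'(F) at the root F of x(F) = x, found here by bisection.
import math

def wakeby_quantile(F, xi, a, b, c, d):
    return xi + (a / b) * (1 - (1 - F) ** b) - (c / d) * (1 - (1 - F) ** (-d))

def wakeby_qprime(F, a, b, c, d):
    return a * (1 - F) ** (b - 1) + c * (1 - F) ** (-d - 1)

def log_likelihood(data, xi, a, b, c, d, tol=1e-12):
    ll = 0.0
    for x in data:
        lo, hi = 0.0, 1.0 - 1e-12               # x(F) is increasing in F
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if wakeby_quantile(mid, xi, a, b, c, d) < x:
                lo = mid
            else:
                hi = mid
        F = (lo + hi) / 2
        ll += -math.log(wakeby_qprime(F, a, b, c, d))  # log f(x) = -log x'(F)
    return ll

# Toy check: likelihood of synthetic quantiles under the same parameters.
params = dict(xi=0.0, a=3.0, b=0.5, c=1.0, d=0.2)
data = [wakeby_quantile(F, **params) for F in (0.1, 0.5, 0.9)]
print("log-likelihood at true parameters:", log_likelihood(data, **params))
```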

A QUALITATIVE METHOD TO ESTIMATE HSI DISPLAY COMPLEXITY

  • Hugo, Jacques; Gertman, David
    • Nuclear Engineering and Technology / v.45 no.2 / pp.141-150 / 2013
  • There is mounting evidence that complex computer system displays in control rooms contribute to cognitive complexity and, thus, to the probability of human error. Research shows that reaction time increases and response accuracy decreases as the number of elements in the display increases. However, in terms of supporting the control room operator, approaches that address display complexity solely in terms of information density, location, and patterning fall short of delivering a properly designed interface. This paper argues that information complexity and semantic complexity are mandatory components of display complexity, and that adding these concepts helps in understanding and resolving differences between designers' preferences and operators' performance. The paper concludes that a number of simplified methods, when combined, can be used to estimate the impact a particular display may have on the operator's ability to perform a function accurately and effectively. We present a mixed qualitative and quantitative approach and a method for complexity estimation.

Reliability Evaluation of Parameter Estimation Methods of Probability Density Function for Estimating Probability Rainfalls (확률강우량 추정을 위한 확률분포함수의 매개변수 추정법에 대한 신뢰성 평가)

  • Han, Jeong-Woo; Kwon, Hyun-Han; Kim, Tae-Woong
    • Journal of the Korean Society of Hazard Mitigation / v.9 no.6 / pp.143-151 / 2009
  • Extreme hydrologic events cause serious disasters such as floods and droughts, and many researchers have worked to estimate the design rainfalls or discharges used to guard against them. This study evaluated parameter estimation methods for estimating probability rainfalls with low uncertainty for use in design rainfall. Rainfall data were collected from the Incheon, Gangneung, Gwangju, Busan, and Chupungryong gauge stations, and synthetic rainfall data were generated with an ARMA model. The maximum likelihood method and Bayesian inference were employed to estimate the parameters of the Gumbel and GEV distributions. Using bootstrap resampling, confidence intervals of the estimated probability rainfalls were computed; based on a comparison of these confidence intervals, the study recommends a parameter estimation method that yields probability rainfalls with low uncertainty.
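
A minimal sketch of the bootstrap step, assuming a Gumbel fit by the method of moments (the paper itself uses maximum likelihood and Bayesian inference) and a hypothetical annual-maximum series:

```python
# Sketch: bootstrap confidence intervals for a T-year design rainfall under a
# Gumbel fit. `annual_max` stands in for an annual-maximum rainfall series;
# all numbers are hypothetical.
import numpy as np

def gumbel_quantile(sample, T):
    beta = sample.std(ddof=1) * np.sqrt(6) / np.pi   # moment estimates
    mu = sample.mean() - 0.5772 * beta               # Euler-Mascheroni constant
    return mu - beta * np.log(-np.log(1 - 1 / T))    # T-year return level

rng = np.random.default_rng(42)
annual_max = rng.gumbel(loc=120.0, scale=35.0, size=40)   # 40 years, mm/day
T, B = 100, 2000
boot = np.array([gumbel_quantile(rng.choice(annual_max, size=annual_max.size,
                                            replace=True), T)
                 for _ in range(B)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"{T}-year rainfall estimate: {gumbel_quantile(annual_max, T):.1f} mm")
print(f"95% bootstrap CI: ({lo:.1f}, {hi:.1f}) mm")  # CI width = uncertainty
```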

Efficient Deployment of RSUs in Smart Highway Environment

  • Ge, Mingzhu; Chung, Yeongjee
    • International journal of advanced smart convergence / v.8 no.4 / pp.179-187 / 2019
  • Vehicular density is usually low in a highway environment; consequently, the connectivity of vehicular ad hoc networks (VANETs) may be poor. We investigate the problem of deploying a near-optimal set of roadside units (RSUs) on a highway covered by a VANET so as to provide good connectivity. The goal is to estimate the minimal number of RSUs that guarantees a given connectivity probability threshold when the RSUs are allocated equidistantly. We apply an approximation algorithm to distribute RSU locations in the VANET, and then evaluate the performance of the proposed scheme by calculating the connectivity probability of the VANET. The simulation results show that there is a threshold value M of deployed RSUs for each vehicular network with N vehicles, and that the connectivity probability increases slowly as the number of RSUs grows.
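
A hedged sketch of the estimation problem, not the paper's approximation algorithm: place n equidistant RSUs, drop Poisson-distributed vehicles on the highway, call the network connected when no gap between consecutive nodes exceeds the radio range, and search for the smallest n meeting the target probability. All constants are illustrative.

```python
# Sketch: Monte Carlo search for the smallest number of equidistant RSUs that
# keeps a highway VANET connected with at least a target probability.
# Connectivity model and constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)

def connectivity_prob(n_rsu, length, veh_density, radio_range, trials=1000):
    rsus = [(i + 1) * length / (n_rsu + 1) for i in range(n_rsu)]
    ok = 0
    for _ in range(trials):
        n_veh = rng.poisson(veh_density * length)     # sparse highway traffic
        nodes = np.sort(np.concatenate([rng.uniform(0, length, n_veh),
                                        rsus, [0.0, length]]))
        ok += np.all(np.diff(nodes) <= radio_range)   # no gap exceeds range
    return ok / trials

length, density, radio, target = 10_000.0, 0.002, 300.0, 0.95  # 10 km, 2 veh/km
n = 0
while connectivity_prob(n, length, density, radio) < target:
    n += 1
print("minimal RSU count M:", n)
```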

Structural reliability estimation based on quasi ideal importance sampling simulation

  • Yonezawa, Masaaki; Okuda, Shoya; Kobayashi, Hiroaki
    • Structural Engineering and Mechanics / v.32 no.1 / pp.55-69 / 2009
  • A quasi-ideal importance sampling simulation method, combined with conditional expectation, is proposed for structural reliability estimation. The quasi-ideal importance sampling joint probability density function (p.d.f.) is constructed, on the basis of the ideal importance sampling concept, to be proportional to the conditional failure probability multiplied by the p.d.f. of the sampling variables. The marginal p.d.f.s of the ideal importance sampling joint p.d.f. are determined partly by numerical simulation and partly by piecewise integration. Quasi-ideal importance sampling simulations combined with conditional expectation are executed to estimate the failure probabilities of structures with multiple failure surfaces, and the proposed method is shown to give accurate estimates efficiently.
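
For orientation, the baseline mechanism that the quasi-ideal method refines is plain importance sampling of a failure probability, sketched below on a toy linear limit state with a known answer. The paper's quasi-ideal joint p.d.f. and conditional-expectation step are not reproduced.

```python
# Sketch: plain importance sampling for a failure probability P(g(X) <= 0):
# sample from a standard normal shifted to the design point and reweight by
# phi(x)/h(x). Baseline mechanism only, with a toy limit state.
import math
import numpy as np

rng = np.random.default_rng(3)

def g(x):                                     # limit state: failure when g <= 0
    return 4.0 - x[:, 0] - x[:, 1]

n = 20_000
shift = np.array([2.0, 2.0])                  # design point of this limit state
z = rng.standard_normal((n, 2))
x = z + shift                                 # draws from h = N(shift, I)
log_w = -0.5 * (x ** 2).sum(axis=1) + 0.5 * (z ** 2).sum(axis=1)
p_fail = np.mean((g(x) <= 0) * np.exp(log_w))
print("importance-sampling estimate:", p_fail)
exact = 0.5 * (1.0 - math.erf(2.0))           # P(N(0,2) >= 4) = 1 - Phi(2*sqrt(2))
print("exact failure probability  :", exact)  # ~2.3e-3
```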