• Title/Abstract/Keyword: entropy measure

203 search results (processing time 0.021 s)

Application of Discrimination Information (Cross Entropy) as Information-theoretic Measure to Safety Assessment in Manufacturing Processes

  • Choi, Gi-Heung;Ryu, Boo-Hyung
    • International Journal of Safety / Vol. 4, No. 2 / pp.1-5 / 2005
  • Design of a manufacturing process, in general, facilitates the creation of new processes that may potentially harm workers. Design of a safety-guaranteed manufacturing process is therefore very important, since it determines the ultimate outcomes of manufacturing activities involving the safety of workers. This study discusses the application of discrimination information (cross entropy) to the safety assessment of manufacturing processes. The idea is based on general design principles and their applications. An example of Cartesian robotic movement is given.
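The discrimination information (Kullback-Leibler divergence) the abstract refers to can be sketched as follows; the state distributions `observed` and `baseline` below are illustrative placeholders, not values from the paper:

```python
import math

def discrimination_information(p, q):
    """Discrimination information (cross entropy in Kullback's sense)
    D(p || q) = sum_i p_i * log2(p_i / q_i), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative process-state distributions (not taken from the paper):
observed = [0.7, 0.2, 0.1]   # observed state probabilities
baseline = [0.5, 0.3, 0.2]   # reference "safe" baseline
d = discrimination_information(observed, baseline)   # > 0: departure from baseline
```

A larger value indicates a greater departure of the observed process behaviour from the reference distribution; the measure is zero only when the two distributions coincide.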

Minimum Variance Unbiased Estimation for the Maximum Entropy of the Transformed Inverse Gaussian Random Variable by Y=X-1/2

  • Choi, Byung-Jin
    • Communications for Statistical Applications and Methods / Vol. 13, No. 3 / pp.657-667 / 2006
  • The concept of entropy, introduced in communication theory by Shannon (1948) as a measure of uncertainty, is of prime interest in information-theoretic statistics. This paper considers minimum variance unbiased estimation for the maximum entropy of the inverse Gaussian random variable transformed by $Y=X^{-1/2}$. The properties of the derived UMVU estimator are investigated.

A CODING THEOREM ON GENERALIZED R-NORM ENTROPY

  • Hooda, D.S.
    • Journal of applied mathematics & informatics / Vol. 8, No. 3 / pp.881-888 / 2001
  • Recently, Hooda and Ram [7] proposed and characterized a new generalized measure of R-norm entropy. In the present communication we study its application in coding theory. Various mean codeword lengths and their bounds are defined, and a coding theorem on lower and upper bounds of a generalized mean codeword length in terms of the generalized R-norm entropy is proved.
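The R-norm entropy of Boekee and van der Lubbe, which the generalized measure extends, is $H_R(P) = \frac{R}{R-1}\left(1 - \left(\sum_i p_i^R\right)^{1/R}\right)$ for $R > 0$, $R \neq 1$. A minimal sketch (the extra parameters of the generalization in [7] are not reproduced here):

```python
def r_norm_entropy(p, r):
    """R-norm entropy H_R(P) = r/(r-1) * (1 - (sum p_i^r)^(1/r)),
    defined for r > 0, r != 1 (Boekee and van der Lubbe)."""
    s = sum(pi ** r for pi in p) ** (1.0 / r)
    return r / (r - 1.0) * (1.0 - s)

h_uniform = r_norm_entropy([0.25] * 4, r=2.0)     # 2 * (1 - 0.5) = 1.0
h_degenerate = r_norm_entropy([1.0, 0.0], r=2.0)  # 0.0: no uncertainty
```

Like Shannon entropy, the measure vanishes for a degenerate distribution and is maximized by the uniform one.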

Some Characterization Results Based on Dynamic Survival and Failure Entropies

  • Abbasnejad, Maliheh
    • Communications for Statistical Applications and Methods / Vol. 18, No. 6 / pp.787-798 / 2011
  • In this paper, we develop some characterization results in terms of the survival entropy of the first order statistic. In addition, we generalize the cumulative entropy recently proposed by Di Crescenzo and Longobardi (2009) to a new measure of information (called the failure entropy) and study some properties of it and of its dynamic version. Furthermore, the power distribution is characterized based on the dynamic failure entropy.
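The cumulative entropy of Di Crescenzo and Longobardi (2009), which the failure entropy generalizes, is $\mathcal{CE}(X) = -\int F(x)\,\ln F(x)\,dx$. A numeric sketch (the helper below and its midpoint-rule integration are illustrative, not the authors' derivation):

```python
import math

def cumulative_entropy(cdf, lo, hi, n=100000):
    """Midpoint-rule approximation of CE(X) = -integral of F(x) ln F(x) dx
    over [lo, hi], given the distribution function cdf."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        f = cdf(lo + (i + 0.5) * h)
        if 0.0 < f < 1.0:
            total -= f * math.log(f) * h
    return total

# Uniform(0, 1): CE = -integral of x ln x on [0, 1] = 1/4
ce_uniform = cumulative_entropy(lambda x: x, 0.0, 1.0)
```

The closed-form value 1/4 for the uniform distribution gives a quick sanity check on the approximation.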

ONLINE TEST BASED ON MUTUAL INFORMATION FOR TRUE RANDOM NUMBER GENERATORS

  • Kim, Young-Sik;Yeom, Yongjin;Choi, Hee Bong
    • Journal of the Korean Mathematical Society / Vol. 50, No. 4 / pp.879-897 / 2013
  • Shannon entropy is one of the most widely used randomness measures, especially for cryptographic applications. However, conventional entropy tests are less sensitive to inter-bit dependency in random samples. In this paper, we propose new online randomness test schemes for true random number generators (TRNGs) based on the mutual information between consecutive $\kappa$-bit output blocks, for testing inter-bit dependency in random samples. By estimating the block entropies of distinct lengths at the same time, it is possible to measure the mutual information, which is closely related to the amount of statistical dependency between two consecutive data blocks. In addition, we propose a new estimation method for entropies, which accumulates intermediate values of the frequency counts. The proposed method can estimate entropy with fewer samples than the Maurer-Coron type entropy test. Numerical simulations show that the proposed scheme can be used as a reliable online entropy estimator for TRNGs used in cryptographic modules.
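The block mutual information underlying the test can be sketched with a plug-in estimator; this is a simplified illustration using the identity $I(X;Y) = H(X) + H(Y) - H(X,Y)$, not the paper's accumulating estimation scheme:

```python
import math
import random
from collections import Counter

def block_mutual_information(bits, k):
    """Plug-in estimate of the mutual information I(X; Y) between
    consecutive non-overlapping k-bit blocks, via I = H(X) + H(Y) - H(X, Y)."""
    blocks = [tuple(bits[i:i + k]) for i in range(0, len(bits) - k + 1, k)]
    pairs = list(zip(blocks, blocks[1:]))

    def entropy(samples):
        n = len(samples)
        return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

    return entropy(blocks[:-1]) + entropy(blocks[1:]) - entropy(pairs)

random.seed(0)
iid_bits = [random.randint(0, 1) for _ in range(4096)]
mi_iid = block_mutual_information(iid_bits, k=2)   # near 0 for i.i.d. bits

patterned = [0, 0, 1, 1] * 512                     # strongly dependent blocks
mi_dep = block_mutual_information(patterned, k=2)  # close to 1 bit
```

For an ideal TRNG the estimate stays near zero (up to the small positive bias of the plug-in estimator), while inter-block dependency drives it up, which is what the online test monitors.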

기울기 벡터장과 조건부 엔트로피 결합에 의한 의료영상 정합 (Medical Image Registration by Combining Gradient Vector Flow and Conditional Entropy Measure)

  • 이명은;김수형;김선월;임준식
    • The KIPS Transactions: Part B / Vol. 17B, No. 4 / pp.303-308 / 2010
  • In this paper, we propose a medical image registration method that combines gradient vector flow and conditional entropy. The registration uses a measure based on the entropy of conditional probabilities. First, to obtain spatial information, we compute the gradient vector flow, gradient information that provides the direction of edge contours. Next, we combine the pixel intensity information and edge information of the two given images to compute a joint histogram, obtain the conditional entropy from it, and use this as the registration measure between the two images. To evaluate the performance of the proposed method, we run comparative experiments on magnetic resonance images and transformed computed tomography images against the existing mutual-information-based measure and a measure using conditional entropy alone. The experimental results show that the proposed method registers faster and more accurately than the existing optimization methods.
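The conditional entropy computed from a joint histogram can be sketched as follows; the flattened intensity lists stand in for images and are purely illustrative, not the authors' implementation (which additionally folds in gradient vector flow edge information):

```python
import math
from collections import Counter

def conditional_entropy(a, b):
    """H(B | A) = H(A, B) - H(A), computed from the joint histogram
    of paired intensity samples from two images."""
    n = len(a)
    h_joint = -sum(c / n * math.log2(c / n) for c in Counter(zip(a, b)).values())
    h_a = -sum(c / n * math.log2(c / n) for c in Counter(a).values())
    return h_joint - h_a

# Toy "images" flattened to intensity lists (illustrative only):
ref = [0, 0, 1, 1, 2, 2, 3, 3]
aligned = [10, 10, 11, 11, 12, 12, 13, 13]   # intensities map one-to-one
shifted = [13, 10, 10, 11, 11, 12, 12, 13]   # the same image shifted by one pixel
ce_aligned = conditional_entropy(ref, aligned)   # 0.0: perfectly predictable
ce_shifted = conditional_entropy(ref, shifted)   # > 0: alignment lost
```

When the images are aligned, each reference intensity predicts the other image's intensity exactly and the conditional entropy drops to zero, which is why it can serve as a registration measure to minimize.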

On Information Theoretic Index for Measuring the Stochastic Dependence Among Sets of Variates

  • Kim, Hea-Jung
    • Journal of the Korean Statistical Society / Vol. 26, No. 1 / pp.131-146 / 1997
  • In this paper the problem of measuring the stochastic dependence among sets of random variates is considered, and attention is specifically directed to forming a single well-defined measure of the dependence among sets of normal variates. A new information-theoretic measure of the dependence, called the dependence index (DI), is introduced and several of its properties are studied. The development of DI is based on the generalization and normalization of the mutual information introduced by Kullback (1968). For data analysis, a minimum cross entropy estimator of DI is suggested, and its asymptotic distribution is obtained for testing the existence of the dependence. Monte Carlo simulations demonstrate the performance of the estimator and show that it is useful not only for evaluating the dependence, but also for testing independence models.
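For normal variates, the mutual information that DI generalizes has a closed form; in the simplest bivariate case with correlation $\rho$ it reduces to $I = -\tfrac{1}{2}\ln(1-\rho^2)$. A minimal sketch (the paper's normalization into a bounded index is not reproduced here):

```python
import math

def normal_mutual_information(rho):
    """Mutual information (in nats) between two standard normal variates
    with correlation rho: I = -0.5 * ln(1 - rho^2)."""
    return -0.5 * math.log(1.0 - rho ** 2)

mi_indep = normal_mutual_information(0.0)   # independence -> 0
mi_strong = normal_mutual_information(0.9)  # grows without bound as |rho| -> 1
```

The unboundedness as $|\rho| \to 1$ is one reason a normalized index such as DI is convenient for comparing dependence across sets of variates.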


Maximum entropy test for infinite order autoregressive models

  • Lee, Sangyeol;Lee, Jiyeon;Noh, Jungsik
    • Journal of the Korean Data and Information Science Society / Vol. 24, No. 3 / pp.637-642 / 2013
  • In this paper, we consider the maximum entropy test in infinite order autoregressive models. Its asymptotic distribution is derived under the null hypothesis. A bootstrap version of the test is discussed, and its performance is evaluated through Monte Carlo simulations.

$L_{1:1}^{\beta}(t)$ IN TERMS OF A GENERALIZED MEASURE OF ENTROPY

  • Hooda, D.S.;Ram, Anant
    • Journal of applied mathematics & informatics / Vol. 5, No. 1 / pp.201-212 / 1998
  • In the present paper we define the codes which assign a D-alphabet one-one codeword to each outcome of a random variable, and the functions which represent possible transformations from one-one codes of size D to suitable codes. Using these functions, we obtain lower bounds on the exponentiated mean codeword length for one-one codes in terms of the generalized entropy of order $\alpha$ and type $\beta$, and study particular cases as well.