• Title/Summary/Keyword: Kullback-Leibler Information

A Test for Weibull Distribution and Extreme Value Distribution Based on Kullback-Leibler Information

  • Kim, Jong-Tae;Lee, Woo-Dong
    • The Korean Journal of Applied Statistics
    • /
    • v.11 no.2
    • /
    • pp.351-362
    • /
    • 1998
  • In this paper, a test of fit for the Weibull distribution based on the estimated Kullback-Leibler information is proposed. The test uses the Vasicek entropy estimate, so a window size m must first be fixed to compute it, and critical values are then obtained by Monte Carlo simulations. The power of the proposed test under various alternatives is compared with that of other well-known tests, and the use of the test is shown in an illustrative example; a code sketch of the underlying construction is given below.

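The construction above can be sketched in a few lines. The following is a minimal illustration, not the authors' exact procedure: it assumes the standard Vasicek (1976) spacing-based entropy estimate with window size m, estimates the KL distance to a maximum-likelihood Weibull null as minus the entropy estimate minus the mean fitted log-density, and uses illustrative function names and parameter values throughout.

```python
# Minimal sketch of a Vasicek-entropy-based KL goodness-of-fit statistic.
# Assumptions (not from the paper): standard Vasicek spacing estimator,
# Weibull null fitted by maximum likelihood via scipy; names are illustrative.
import numpy as np
from scipy.stats import weibull_min

def vasicek_entropy(x, m):
    """Vasicek (1976) entropy estimate with window size m."""
    x = np.sort(x)
    n = len(x)
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]   # X_(i+m), truncated at X_(n)
    lower = x[np.maximum(idx - m, 0)]       # X_(i-m), truncated at X_(1)
    return np.mean(np.log(n * (upper - lower) / (2 * m)))

def kl_gof_statistic(x, m):
    """Estimated KL distance between the data density and the fitted
    Weibull null: KL(f : f0) ~ -H_mn - mean(log f0(X_i; theta_hat))."""
    c, loc, scale = weibull_min.fit(x, floc=0)  # fix location at 0
    return -vasicek_entropy(x, m) - np.mean(weibull_min.logpdf(x, c, loc, scale))

# Under H0 the statistic should be near zero; large values indicate lack of fit.
rng = np.random.default_rng(1)
sample = weibull_min.rvs(1.5, scale=2.0, size=30, random_state=rng)
print(kl_gof_statistic(sample, m=3))
```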

Generalized Kullback-Leibler information and its extensions to censored and discrete cases

  • Park, Sangun
    • Journal of the Korean Data and Information Science Society
    • /
    • v.23 no.6
    • /
    • pp.1223-1229
    • /
    • 2012
  • In this paper, we propose a generalized Kullback-Leibler (KL) information for measuring the distance between two distribution functions where the extension to the censored case is immediate. The generalized KL information has the nonnegativity and characterization properties, and its censored version has the additional property of monotonic increase. We also extend the discussion to the discrete case and propose a generalized censored measure which is comparable to Pearson's chi-square statistic.
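
One standard way to realize the properties listed above (my notation; whether it matches the paper's exact definition is an assumption) is to add the correction term $g - f$ to the KL integrand:

$$ \mathrm{KL}^{*}(f:g)=\int\Big[f(x)\log\frac{f(x)}{g(x)}-f(x)+g(x)\Big]\,dx \;\ge\; 0. $$

The integrand equals $g(x)\,[u\log u-u+1]$ with $u=f(x)/g(x)$, and $u\log u-u+1\ge 0$ for all $u\ge 0$; hence the integrand is pointwise nonnegative, the measure is nonnegative even when $f$ and $g$ do not integrate to one, and the censored version obtained by integrating only up to a censoring point is nonnegative and monotonically increasing in that point, matching the properties claimed in the abstract.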

A View on Extension of Utility-Based on Links with Information Measures

  • Hoseinzadeh, A.R.;Borzadaran, G.R.Mohtashami;Yari, G.H.
    • Communications for Statistical Applications and Methods
    • /
    • v.16 no.5
    • /
    • pp.813-820
    • /
    • 2009
  • In this paper, we review the utility-based generalizations of the Shannon entropy and the Kullback-Leibler information measure, namely the U-entropy and the U-relative entropy introduced by Friedman et al. (2007). Then, we derive some relations between the U-relative entropy and other information measures based on a parametric family of utility functions.

On scaled cumulative residual Kullback-Leibler information

  • Hwang, Insung;Park, Sangun
    • Journal of the Korean Data and Information Science Society
    • /
    • v.24 no.6
    • /
    • pp.1497-1501
    • /
    • 2013
  • Cumulative residual Kullback-Leibler (CRKL) information is well defined on the empirical distribution function (EDF) and allows us to construct an EDF-based goodness-of-fit test statistic. However, a scaled CRKL needs to be considered because CRKL is not scale invariant (see the definition recalled below). In this paper, we consider several criteria for estimating the scale parameter in the scaled CRKL and compare the performances of the estimated CRKL in terms of both power and unbiasedness.
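
For orientation, the definition commonly used in this line of work (e.g., Park, Rao and Shin, 2012; cited here as background, not quoted from the entry) is, for nonnegative variables with survival functions $\bar F=1-F$ and $\bar G=1-G$:

$$ \mathrm{CRKL}(F:G)=\int_{0}^{\infty}\bar F(x)\log\frac{\bar F(x)}{\bar G(x)}\,dx-\big(E[X]-E[Y]\big), $$

which equals $\int_0^\infty[\bar F\log(\bar F/\bar G)-\bar F+\bar G]\,dx\ge 0$ because $\int_0^\infty\bar F(x)\,dx = E[X]$. Since only survival functions enter, the EDF can be plugged in directly; but multiplying the data by a constant rescales the integral, which is why a scaled version is needed.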

DIRECTIONAL LOG-DENSITY ESTIMATION

  • Huh, Jib;Kim, Peter T.;Koo, Ja-Yong;Park, Jin-Ho
    • Journal of the Korean Statistical Society
    • /
    • v.33 no.3
    • /
    • pp.255-269
    • /
    • 2004
  • This paper develops log-density estimation for directional data. The methodology is to use expansions with respect to spherical harmonics followed by estimating the unknown parameters by maximum likelihood. Minimax rates of convergence in terms of the Kullback-Leibler information divergence are obtained.
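
Schematically (my notation; an exponential-family form consistent with, but not quoted from, the abstract), the log-density is expanded in spherical harmonics $\psi_1,\dots,\psi_J$ and normalized:

$$ \log f_{\theta}(x)=\sum_{j=1}^{J}\theta_j\,\psi_j(x)-c(\theta),\qquad c(\theta)=\log\int_{\mathbb{S}^{d}}\exp\Big(\sum_{j=1}^{J}\theta_j\,\psi_j(x)\Big)\,dx, $$

so that $f_\theta$ is a proper density for every $\theta$. The coefficients $\theta_j$ are estimated by maximum likelihood, and the number of terms $J$ grows with the sample size to balance approximation error against variance, which is what drives the minimax rates in Kullback-Leibler divergence.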

Code-Reuse Attack Detection Using Kullback-Leibler Divergence in IoT

  • Ho, Jun-Won
    • International journal of advanced smart convergence
    • /
    • v.5 no.4
    • /
    • pp.54-56
    • /
    • 2016
  • Code-reuse attacks are dangerous in various systems because they do not inject malicious code into target systems but instead reuse instruction sequences already present in executable files or libraries. Moreover, code-reuse attacks can be more harmful to IoT systems, since it may not be easy to devise efficient and effective detection mechanisms for resource-restricted IoT devices. In this paper, we propose a detection scheme that uses Kullback-Leibler (KL) divergence to combat code-reuse attacks in IoT. Specifically, we compute the KL divergence between the distribution of code-region memory addresses in packets generated by IoT devices and the distribution of code-region memory addresses in packets arriving at IoT devices, and flag an attack when the computed KL divergence is abnormal; a sketch of this check appears below.
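
A minimal sketch of such a check, with binning, smoothing, and threshold all being illustrative assumptions rather than the paper's parameters: histogram the code-region addresses seen in outgoing and incoming packets, smooth the bins to keep the ratio defined, and alarm when the KL divergence exceeds a threshold.

```python
# Minimal sketch: flag a possible code-reuse attack when the KL divergence
# between outgoing and incoming code-region address distributions is large.
# Binning, smoothing, and threshold are illustrative assumptions.
import numpy as np

def address_histogram(addresses, lo, hi, bins=64, eps=1e-6):
    """Smoothed probability distribution of addresses over [lo, hi)."""
    counts, _ = np.histogram(addresses, bins=bins, range=(lo, hi))
    probs = counts + eps            # additive smoothing: keep all bins positive
    return probs / probs.sum()

def kl_divergence(p, q):
    """KL(p || q) for two discrete distributions on the same bins."""
    return float(np.sum(p * np.log(p / q)))

def is_code_reuse_suspect(out_addrs, in_addrs, lo, hi, threshold=0.5):
    p = address_histogram(out_addrs, lo, hi)
    q = address_histogram(in_addrs, lo, hi)
    return kl_divergence(p, q) > threshold

# Toy usage: incoming packets referencing an unusual part of the code region.
rng = np.random.default_rng(0)
code_lo, code_hi = 0x1000, 0x9000
outgoing = rng.integers(0x1000, 0x3000, size=500)   # normal traffic
incoming = rng.integers(0x7000, 0x9000, size=500)   # skewed toward gadget area
print(is_code_reuse_suspect(outgoing, incoming, code_lo, code_hi))
```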

Kullback-Leibler Information-Based Tests of Fit for Inverse Gaussian Distribution

  • Choi, Byung-Jin
    • The Korean Journal of Applied Statistics
    • /
    • v.24 no.6
    • /
    • pp.1271-1284
    • /
    • 2011
  • The entropy-based test of fit for the inverse Gaussian distribution presented by Mudholkar and Tian (2002) can only be applied to the composite hypothesis that a sample is drawn from an inverse Gaussian distribution with both the location and scale parameters unknown. In application, however, a researcher may want a test of fit either for an inverse Gaussian distribution with one parameter known or for an inverse Gaussian distribution with both parameters known. In this paper, we introduce tests of fit for the inverse Gaussian distribution based on the Kullback-Leibler information as an extension of the entropy-based test. A window size should be chosen to implement the proposed tests. By means of Monte Carlo simulations, window sizes are determined for a wide range of sample sizes and the corresponding critical values of the test statistics are estimated; the sketch below illustrates this kind of calibration. The results of power analysis for various alternatives report that the Kullback-Leibler information-based goodness-of-fit tests have good power.
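
The following is a minimal sketch of Monte Carlo calibration of critical values, not the paper's tables or exact statistic: it reuses the Vasicek-type KL statistic from the Weibull sketch above, now with an inverse Gaussian null, and the choices of n, m, shape, level, and replication count are illustrative.

```python
# Minimal sketch: calibrate the critical value of a KL-based test by
# simulating the statistic under the inverse Gaussian null hypothesis.
import numpy as np
from scipy.stats import invgauss

def vasicek_entropy(x, m):
    """Vasicek (1976) entropy estimate with window size m."""
    x = np.sort(x)
    n = len(x)
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]
    lower = x[np.maximum(idx - m, 0)]
    return np.mean(np.log(n * (upper - lower) / (2 * m)))

def kl_statistic(x, m):
    """Estimated KL distance to a maximum-likelihood inverse Gaussian null."""
    mu, loc, scale = invgauss.fit(x, floc=0)
    return -vasicek_entropy(x, m) - np.mean(invgauss.logpdf(x, mu, loc, scale))

rng = np.random.default_rng(0)
n, m, level, reps = 20, 3, 0.05, 2000
null_stats = [kl_statistic(invgauss.rvs(0.5, size=n, random_state=rng), m)
              for _ in range(reps)]
critical_value = np.quantile(null_stats, 1 - level)  # reject H0 above this
print(critical_value)
```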

SOME INEQUALITIES FOR THE CSISZÁR Φ-DIVERGENCE

  • Dragomir, S.S.
    • Journal of the Korean Society for Industrial and Applied Mathematics
    • /
    • v.7 no.1
    • /
    • pp.63-77
    • /
    • 2003
  • Some inequalities for the Csiszár Φ-divergence and applications to the Kullback-Leibler, Rényi, Hellinger and Bhattacharyya distances in Information Theory are given; the definition of the Φ-divergence is recalled below.

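For reference, the Csiszár $\Phi$-divergence of discrete distributions $p$ and $q$ is defined, for convex $\Phi$ with $\Phi(1)=0$, as

$$ I_{\Phi}(p,q)=\sum_{i} q_i\,\Phi\!\Big(\frac{p_i}{q_i}\Big). $$

Taking $\Phi(t)=t\log t$ gives the Kullback-Leibler distance; $\Phi(t)=(\sqrt{t}-1)^2$ gives $\sum_i(\sqrt{p_i}-\sqrt{q_i})^2$, which is twice the squared Hellinger distance in the convention $H^2=\frac{1}{2}\sum_i(\sqrt{p_i}-\sqrt{q_i})^2$; and $\Phi(t)=-\sqrt{t}$ gives the negative Bhattacharyya coefficient $-\sum_i\sqrt{p_i q_i}$ (conventions for constants and signs vary across the literature).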

Kullback-Leibler Information of the Equilibrium Distribution Function and its Application to Goodness of Fit Test

  • Park, Sangun;Choi, Dongseok;Jung, Sangah
    • Communications for Statistical Applications and Methods
    • /
    • v.21 no.2
    • /
    • pp.125-134
    • /
    • 2014
  • Kullback-Leibler (KL) information is a measure of discrepancy between two probability density functions. However, several nonparametric density function estimators have been considered in estimating KL information because KL information is not well-defined on the empirical distribution function. In this paper, we consider the KL information of the equilibrium distribution function, which is well defined on the empirical distribution function (EDF), and propose an EDF-based goodness of fit test statistic. We evaluate the performance of the proposed test statistic for an exponential distribution with Monte Carlo simulation. We also extend the discussion to the censored case.
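
For context, the equilibrium distribution in question is the standard renewal-theory object (background fact, not quoted from the entry): for a nonnegative variable with distribution $F$, survival function $\bar F = 1-F$, and finite mean $\mu=E[X]=\int_0^\infty \bar F(x)\,dx$, its density is

$$ f_{e}(x)=\frac{\bar F(x)}{\mu},\qquad x\ge 0. $$

Because $f_e$ involves $F$ only through $\bar F$, substituting the empirical distribution function yields a well-defined plug-in estimate with no density estimation, which is what makes an EDF-based KL test statistic feasible here.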

A Goodness-of-Fit Test Based on the Partial Kullback-Leibler Information with Type II Censored Data

  • Park, Sang-Un;Lim, Jong-Gun
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2003.10a
    • /
    • pp.233-238
    • /
    • 2003
  • Goodness-of-fit test statistics based on the information discrepancy have been shown to perform very well (Vasicek, 1976; Dudewicz and van der Meulen, 1981; Chandra et al., 1982; Gokhale, 1983; Arizono and Ohta, 1989; Ebrahimi et al., 1992, among others). Although the test is well defined for the non-censored case, the censored case has not been discussed in the literature. Therefore, we consider a goodness-of-fit test based on the partial Kullback-Leibler (KL) information with Type II censored data. We derive the partial KL information between the null distribution function and a nonparametric distribution function, and establish a goodness-of-fit test statistic. We consider the exponential and normal distributions and perform Monte Carlo simulations to compare the test statistics with some existing tests.
