• Title/Summary/Keywords: Kullback-Leibler Information


쿨백-레이블러 정보함수에 기초한 와이블분포와 극단값 분포에 대한 적합도 검정 (A Test for Weibull Distribution and Extreme Value Distribution Based on Kullback-Leibler Information)

  • 김종태; 이우동
    • The Korean Journal of Applied Statistics (응용통계연구) / Vol. 11, No. 2 / pp. 351-362 / 1998
  • In reliability and life testing, an applied field of engineering, the Weibull distribution has played a very important role. However, the influence of its shape parameter has made goodness-of-fit testing for the Weibull distribution difficult. This paper resolves this problem by proposing a test statistic, based on Kullback-Leibler information, that is not affected by the parameters of the Weibull distribution, and analyzes the asymptotic properties and power of the proposed statistic. In power comparisons, the proposed statistic shows better power than existing test statistics, and an example of a goodness-of-fit test on real data is given.

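Entropy-difference constructions of this kind typically estimate the KL information with Vasicek's m-spacing entropy estimator. A minimal sketch of that general construction follows (not the paper's exact statistic; the window size m and the exponential null are illustrative choices):

```python
import math
import random

def vasicek_entropy(x, m):
    """Vasicek (1976) m-spacing estimate of the differential entropy H(f).
    Assumes a continuous sample (no ties)."""
    n = len(x)
    s = sorted(x)
    total = 0.0
    for i in range(n):
        lo = s[max(i - m, 0)]          # truncated window at the boundaries
        hi = s[min(i + m, n - 1)]
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n

def kl_gof_statistic(x, log_pdf, m=5):
    """Estimated KL information between the unknown data density and a
    fitted null density f0: KL ~ -H_mn - (1/n) * sum(log f0(x_i)).
    Values near 0 support the null; large values argue against it."""
    n = len(x)
    return -vasicek_entropy(x, m) - sum(log_pdf(xi) for xi in x) / n

# Usage sketch: an exponential null with the rate fitted by maximum likelihood.
random.seed(0)
data = [random.expovariate(1.0) for _ in range(200)]
rate = len(data) / sum(data)
stat = kl_gof_statistic(data, lambda t: math.log(rate) - rate * t)
```

Under the matching null the statistic stays near zero (up to the estimator's bias), while a badly misspecified null inflates it.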

Generalized Kullback-Leibler information and its extensions to censored and discrete cases

  • Park, Sangun
    • Journal of the Korean Data and Information Science Society / Vol. 23, No. 6 / pp. 1223-1229 / 2012
  • In this paper, we propose a generalized Kullback-Leibler (KL) information for measuring the distance between two distribution functions where the extension to the censored case is immediate. The generalized KL information has the nonnegativity and characterization properties, and its censored version has the additional property of monotonic increase. We also extend the discussion to the discrete case and propose a generalized censored measure which is comparable to Pearson's chi-square statistic.
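The nonnegativity property cited above can be checked directly in the plain discrete case. A minimal sketch of ordinary discrete KL information (the paper's generalized and censored versions are not reproduced here):

```python
import math

def kl_divergence(p, q):
    """Discrete Kullback-Leibler information KL(p || q) = sum_i p_i*log(p_i/q_i).
    Convention: terms with p_i = 0 contribute 0; q must be positive wherever
    p is positive for the ratio to be defined."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
```

KL(p || p) is exactly zero, KL(p || q) is strictly positive for p != q (nonnegativity with equality iff the distributions coincide), and the measure is not symmetric in its arguments.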

A View on Extension of Utility-Based on Links with Information Measures

  • Hoseinzadeh, A.R.; Borzadaran, G.R. Mohtashami; Yari, G.H.
    • Communications for Statistical Applications and Methods / Vol. 16, No. 5 / pp. 813-820 / 2009
  • In this paper, we review the utility-based generalization of the Shannon entropy and Kullback-Leibler information measure as the U-entropy and the U-relative entropy that was introduced by Friedman et al. (2007). Then, we derive some relations between the U-relative entropy and other information measures based on a parametric family of utility functions.

On scaled cumulative residual Kullback-Leibler information

  • Hwang, Insung; Park, Sangun
    • Journal of the Korean Data and Information Science Society / Vol. 24, No. 6 / pp. 1497-1501 / 2013
  • Cumulative residual Kullback-Leibler (CRKL) information is well defined on the empirical distribution function (EDF) and allows us to construct an EDF-based goodness-of-fit test statistic. However, a scaled CRKL needs to be considered because CRKL is not scale invariant. In this paper, we consider several criteria for estimating the scale parameter in the scaled CRKL and compare the performances of the estimated CRKL in terms of both power and unbiasedness.
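CRKL replaces densities with survival functions, which makes it computable straight from the EDF. A rough numerical sketch under the common definition CRKL(F:G) = ∫[F̄ log(F̄/Ḡ) − F̄ + Ḡ]dt, with an exponential null as an illustrative choice; note that the value changes when the data are rescaled, which is the scale-invariance issue the abstract addresses:

```python
import math
import random

def edf_survival(sample, t):
    """Empirical survival function Fbar_n(t) = #{x_i > t} / n."""
    return sum(1 for x in sample if x > t) / len(sample)

def crkl(sample, null_sf, upper, steps=2000):
    """Midpoint-rule approximation of the cumulative residual KL information
    between the empirical survival function and a null survival function Gbar:
    CRKL = integral of [Fbar*log(Fbar/Gbar) - Fbar + Gbar] over [0, upper].
    The integrand is pointwise nonnegative, so CRKL >= 0."""
    h = upper / steps
    total = 0.0
    for k in range(steps):
        t = (k + 0.5) * h
        fbar = edf_survival(sample, t)
        gbar = null_sf(t)
        term = gbar - fbar
        if fbar > 0 and gbar > 0:
            term += fbar * math.log(fbar / gbar)
        total += term * h
    return total

random.seed(1)
sample = [random.expovariate(1.0) for _ in range(300)]
d_true = crkl(sample, lambda t: math.exp(-t), upper=10.0)      # matching null
d_wrong = crkl(sample, lambda t: math.exp(-3 * t), upper=10.0) # misspecified null
```

The matching null yields a value near zero and the misspecified one a clearly larger value.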

DIRECTIONAL LOG-DENSITY ESTIMATION

  • Huh, Jib; Kim, Peter T.; Koo, Ja-Yong; Park, Jin-Ho
    • Journal of the Korean Statistical Society / Vol. 33, No. 3 / pp. 255-269 / 2004
  • This paper develops log-density estimation for directional data. The methodology is to use expansions with respect to spherical harmonics followed by estimating the unknown parameters by maximum likelihood. Minimax rates of convergence in terms of the Kullback-Leibler information divergence are obtained.

Code-Reuse Attack Detection Using Kullback-Leibler Divergence in IoT

  • Ho, Jun-Won
    • International Journal of Advanced Smart Convergence / Vol. 5, No. 4 / pp. 54-56 / 2016
  • Code-reuse attacks are very dangerous to various systems because they do not inject malicious code into target systems but instead reuse instruction sequences already present in the executable files or libraries of the target systems. Moreover, code-reuse attacks can be more harmful to IoT systems because it may not be easy to devise an efficient and effective detection mechanism for resource-restricted IoT devices. In this paper, we propose a detection scheme that uses Kullback-Leibler (KL) divergence to counter code-reuse attacks in IoT. Specifically, we detect code-reuse attacks by computing the KL divergence between the probability distribution of code-region memory addresses in packets generated by IoT devices and the corresponding distribution in packets arriving at IoT devices, and checking whether the computed KL divergence is abnormal.
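A toy sketch of this detection idea, with hypothetical integer address bins and an illustrative threshold (the paper does not specify concrete bins, smoothing, or threshold values):

```python
import math
from collections import Counter

def address_distribution(addresses, support, eps=1e-6):
    """Smoothed empirical distribution of observed code-region addresses over
    a common support; eps keeps every probability positive so the KL ratio
    is always defined."""
    counts = Counter(addresses)
    raw = [counts.get(a, 0) + eps for a in support]
    total = sum(raw)
    return [r / total for r in raw]

def kl(p, q):
    """Discrete KL divergence between two positive distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def looks_like_code_reuse(outgoing, incoming, support, threshold=1.0):
    """Flag an anomaly when the KL divergence between the address
    distributions of outgoing and incoming packets exceeds a threshold."""
    p = address_distribution(outgoing, support)
    q = address_distribution(incoming, support)
    return kl(p, q) > threshold

support = list(range(10))        # hypothetical code-region address bins
normal_out = [0, 1, 2, 3] * 25   # addresses seen in outgoing packets
normal_in = [0, 1, 2, 3] * 25    # benign incoming traffic: a similar mix
attack_in = [7] * 100            # incoming traffic fixated on one gadget address
```

Benign traffic with a similar address mix stays below the threshold, while traffic concentrated on a single unexpected address drives the divergence far above it.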

역가우스분포에 대한 쿨백-라이블러 정보 기반 적합도 검정 (Kullback-Leibler Information-Based Tests of Fit for Inverse Gaussian Distribution)

  • 최병진
    • The Korean Journal of Applied Statistics (응용통계연구) / Vol. 24, No. 6 / pp. 1271-1284 / 2011
  • This paper introduces Kullback-Leibler information-based goodness-of-fit tests for the inverse Gaussian distribution with unknown location and scale parameters, extending previously developed entropy-based tests. Four types of test statistics for testing simple or composite null hypotheses about the inverse Gaussian distribution are presented, and the window sizes and critical values by sample size needed to compute the statistics, determined by simulation, are provided in tables. In the simulation study conducted for the power analysis, the KL information-based goodness-of-fit test for the inverse Gaussian distribution with both location and scale parameters known shows better power than the EDF tests for all alternative distributions and sample sizes. For the inverse Gaussian distribution with only the location parameter or only the scale parameter known, the power of the KL information-based tests tends to increase with sample size for all alternative distributions. For the inverse Gaussian distribution with both parameters unknown, the KL information-based test generally shows power comparable to the entropy-based test, confirming that the two tests are equivalent.

SOME INEQUALITIES FOR THE CSISZÁR Φ-DIVERGENCE

  • Dragomir, S.S.
    • Journal of the Korean Society for Industrial and Applied Mathematics / Vol. 7, No. 1 / pp. 63-77 / 2003
  • Some inequalities for the Csiszár Φ-divergence and applications for the Kullback-Leibler, Rényi, Hellinger and Bhattacharyya distances in Information Theory are given.

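For reference, the Csiszár Φ-divergence in question is commonly written as follows; the special cases noted in the comments are standard identifications, not results from the paper itself:

```latex
% Csiszár Φ-divergence for a convex Φ with Φ(1) = 0 and densities p, q:
\[
  D_{\Phi}(p,q) \;=\; \int q(x)\,\Phi\!\left(\frac{p(x)}{q(x)}\right)dx .
\]
% Standard special cases:
%   Φ(t) = t log t          gives the Kullback-Leibler distance;
%   Φ(t) = (√t - 1)^2       gives the squared Hellinger distance;
%   Φ(t) = -√t              recovers (up to sign and an affine adjustment,
%                           since Φ(1) ≠ 0 here) the Bhattacharyya coefficient;
%   Rényi-type distances follow from monotone transformations of D_Φ
%   with Φ(t) = t^α.
```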

Kullback-Leibler Information of the Equilibrium Distribution Function and its Application to Goodness of Fit Test

  • Park, Sangun; Choi, Dongseok; Jung, Sangah
    • Communications for Statistical Applications and Methods / Vol. 21, No. 2 / pp. 125-134 / 2014
  • Kullback-Leibler (KL) information is a measure of the discrepancy between two probability density functions. Because KL information is not well defined on the empirical distribution function (EDF), several nonparametric density estimators have been considered for estimating it. In this paper, we consider the KL information of the equilibrium distribution function, which is well defined on the EDF, and propose an EDF-based goodness-of-fit test statistic. We evaluate the performance of the proposed test statistic for an exponential distribution with Monte Carlo simulation. We also extend the discussion to the censored case.
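Because the equilibrium density is F̄(t)/μ, it remains well defined when F̄ is replaced by the empirical survival function. A rough numerical sketch (the exponential null is an illustrative choice, using the fact that the exponential distribution is its own equilibrium distribution by memorylessness):

```python
import math
import random

def equilibrium_density(sample):
    """Equilibrium density f_e(t) = Fbar_n(t) / mean(sample), built
    directly from the EDF."""
    n = len(sample)
    mu = sum(sample) / n
    def f_e(t):
        return sum(1 for x in sample if x > t) / (n * mu)
    return f_e

def kl_equilibrium(sample, null_equilibrium_pdf, upper, steps=2000):
    """Midpoint-rule approximation of KL(f_e || g_e) between the EDF-based
    equilibrium density and a null model's equilibrium density."""
    f_e = equilibrium_density(sample)
    h = upper / steps
    total = 0.0
    for k in range(steps):
        t = (k + 0.5) * h
        fe, ge = f_e(t), null_equilibrium_pdf(t)
        if fe > 0 and ge > 0:
            total += fe * math.log(fe / ge) * h
    return total

# Exp(1) is its own equilibrium distribution, so its equilibrium density
# is exp(-t); the equilibrium density of an Exp(3) null is 3*exp(-3t).
random.seed(2)
sample = [random.expovariate(1.0) for _ in range(300)]
d_true = kl_equilibrium(sample, lambda t: math.exp(-t), upper=10.0)
d_wrong = kl_equilibrium(sample, lambda t: 3.0 * math.exp(-3.0 * t), upper=10.0)
```

As with the CRKL sketch earlier, the matching null gives a value near zero and a misspecified null a clearly larger one.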

A Goodness of Fit Test Based on the Partial Kullback-Leibler Information with Type II Censored Data

  • Park, Sang-Un; Lim, Jong-Gun
    • Proceedings of the 2003 Fall Conference of the Korean Statistical Society / pp. 233-238 / 2003
  • Goodness-of-fit test statistics based on information discrepancy have been shown to perform very well (Vasicek 1976; Dudewicz and van der Meulen 1981; Chandra et al. 1982; Gokhale 1983; Arizono and Ohta 1989; Ebrahimi et al. 1992, etc.). Although the test is well defined for the non-censored case, the censored case has not been discussed in the literature. Therefore, we consider a goodness-of-fit test based on the partial Kullback-Leibler (KL) information with Type II censored data. We derive the partial KL information between the null distribution function and a nonparametric distribution function, and establish a goodness-of-fit test statistic. We consider the exponential and normal distributions and perform Monte Carlo simulations to compare the proposed test statistics with some existing tests.

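A minimal numerical sketch of a KL integral restricted to the region observed before Type II censoring, for fully specified densities (the paper's statistic is built from a nonparametric estimate on the order statistics, which is not reproduced here). The example also shows that the truncated integral, unlike full KL information, need not be nonnegative:

```python
import math

def partial_kl(f, g, upper, lower=0.0, steps=5000):
    """Midpoint-rule approximation of the KL integral
    ∫_{lower}^{upper} f(x) log(f(x)/g(x)) dx, i.e. KL information
    restricted to the uncensored region."""
    h = (upper - lower) / steps
    total = 0.0
    for k in range(steps):
        x = lower + (k + 0.5) * h
        fx, gx = f(x), g(x)
        if fx > 0 and gx > 0:
            total += fx * math.log(fx / gx) * h
    return total

# Type II censoring that keeps r of n observations corresponds to
# integrating up to the p = r/n quantile of the null; for p = 0.5 under
# an Exp(1) null the cutoff is log 2.
f = lambda x: math.exp(-x)               # Exp(1) density
g = lambda x: 2.0 * math.exp(-2.0 * x)   # Exp(2) density
val = partial_kl(f, g, upper=math.log(2.0))
# Analytically this restricted integral equals 1/2 - log 2, which is
# negative even though the full KL(Exp(1) || Exp(2)) = 1 - log 2 > 0.
```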