• Title/Abstract/Keyword: Divergence Measure


A NEW EXPONENTIAL DIRECTED DIVERGENCE INFORMATION MEASURE

  • JAIN, K.C.;CHHABRA, PRAPHULL
    • Journal of applied mathematics & informatics
    • /
    • Vol. 34, No. 3-4
    • /
    • pp.295-308
    • /
    • 2016
  • Depending upon the nature of the problem, different divergence measures are suitable, so it is always desirable to develop new divergence measures. In the present work, a new information divergence measure, which is exponential in nature, is introduced and characterized. Bounds on this new measure are obtained in terms of various symmetric and non-symmetric measures, together with numerical verification using two discrete distributions: binomial and Poisson. A fuzzy information measure and a useful information measure corresponding to the new exponential divergence measure are also introduced.

Bayesian Model Selection in the Unbalanced Random Effect Model

  • Kim, Dal-Ho;Kang, Sang-Gil;Lee, Woo-Dong
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 15, No. 4
    • /
    • pp.743-752
    • /
    • 2004
  • In this paper, we develop a Bayesian model selection procedure using the reference prior for comparing two nested models, the independent and intraclass models, using the distance or divergence between the two as the basis of comparison. A suitable criterion for this is the power divergence measure introduced by Cressie and Read (1984). Such a measure includes the Kullback-Leibler divergence measures and the Hellinger divergence measure as special cases. For this problem, the power divergence measure turns out to be a function solely of $\rho$, the intraclass correlation coefficient. Moreover, this function is convex, and the minimum is attained at $\rho=0$. We use the reference prior for $\rho$. Owing to the duality between hypothesis tests and set estimation, the hypothesis testing problem can also be solved by solving a corresponding set estimation problem. The present paper develops a Bayesian method based on the Kullback-Leibler and Hellinger divergence measures, rejecting $H_0:\rho=0$ when the specified divergence measure exceeds some number d. This number d is chosen so that the resulting credible interval for the divergence measure has specified coverage probability $1-{\alpha}$. The length of such an interval is compared with the equal two-tailed credible interval and the HPD credible interval for $\rho$ with the same coverage probability, which can also be inverted into acceptance regions of $H_0:\rho=0$. An example is considered where the HPD interval based on the one-at-a-time reference prior turns out to be the shortest credible interval having the same coverage probability.
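The Cressie-Read power divergence family described in this abstract is easy to sketch directly from its definition. The following minimal illustration (the function name and parameterization are ours, not from the paper) shows how the Kullback-Leibler and Hellinger divergences arise as special cases:

```python
import math

def power_divergence(p, q, lam):
    """Cressie-Read power divergence I_lambda(p : q) between two discrete
    distributions p and q (sequences of probabilities).

    lam -> 0 recovers the Kullback-Leibler divergence, and lam = -1/2
    gives four times the squared Hellinger distance (under the
    convention H^2 = (1/2) * sum (sqrt(p_i) - sqrt(q_i))^2), matching
    the special cases named in the abstract.
    """
    if abs(lam) < 1e-12:
        # KL limit of the family as lam -> 0
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    c = 1.0 / (lam * (lam + 1.0))
    return c * sum(pi * ((pi / qi) ** lam - 1.0)
                   for pi, qi in zip(p, q) if pi > 0)
```

For example, `power_divergence(p, q, 1e-7)` is numerically indistinguishable from the KL divergence, while `power_divergence(p, q, -0.5)` reproduces the Hellinger case.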


분포유사도를 이용한 문헌클러스터링의 성능향상에 대한 연구 (Improving the Performance of Document Clustering with Distributional Similarities)

  • Lee, Jae-Yun
    • Journal of the Korean Society for Information Management
    • /
    • Vol. 24, No. 4
    • /
    • pp.267-283
    • /
    • 2007
  • This study explored the possibility of replacing the traditional cosine similarity formula in document clustering with distributional similarity measures. Three formulas derived from the representative distributional similarity, the KL divergence, were adapted for document vectors: the Jensen-Shannon divergence, the symmetric skew divergence, and the minimum skew divergence. To verify the document clustering performance of these distributional similarities, two experiments were carried out on three test collections. In the first document clustering experiment, the minimum skew divergence clearly outperformed not only the cosine similarity but also the other divergence formulas. In the second experiment, document clustering was performed with second-order distributional similarities, computed by applying the Pearson correlation coefficient to the first-order similarity matrix. The results showed that the second-order distributional similarities generally yielded better clustering performance. Considering both processing time and clustering performance, the minimum skew divergence formula proposed in this study is recommended; when only clustering performance matters, the second-order distributional similarity approach is preferable.
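The divergence formulas compared in this study follow directly from the KL divergence. A minimal sketch for probability-normalized document vectors (function names and the default skew parameter are our illustrative choices, not taken from the paper):

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: symmetrized KL against the midpoint."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def skew_divergence(p, q, alpha=0.99):
    """Skew divergence KL(p || alpha*q + (1-alpha)*p); mixing a little of
    p into q keeps the second argument positive wherever p is, so the
    divergence stays finite on sparse document vectors."""
    mix = [alpha * qi + (1 - alpha) * pi for pi, qi in zip(p, q)]
    return kl(p, mix)
```

A symmetric skew divergence can then be formed by averaging `skew_divergence(p, q)` and `skew_divergence(q, p)`, and a minimum variant by taking their minimum.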

PATTERSON-SULLIVAN MEASURE AND GROUPS OF DIVERGENCE TYPE

  • Hong, Sungbok
    • Bulletin of the Korean Mathematical Society
    • /
    • Vol. 30, No. 2
    • /
    • pp.223-228
    • /
    • 1993
  • In this paper, we use the Patterson-Sullivan measure and results of [H] to show that for a nonelementary discrete group of divergence type, the conical limit set $\Lambda_c$ has positive Patterson-Sullivan measure. The definition of the Patterson-Sullivan measure for groups of divergence type is reviewed in Section 2. The Patterson-Sullivan measure can also be defined for groups of convergence type, and the details for that case can be found in [N]. Necessary definitions and results from [H] are given in Section 3, and in Section 4 we prove our main result.


분산 기반의 Gradient Based Fuzzy c-means에 의한 MPEG VBR 비디오 데이터의 모델링과 분류 (Modeling and Classification of MPEG VBR Video Data using Gradient-based Fuzzy c-means with Divergence Measure)

  • Park, Dong-Chul;Kim, Bong-Ju
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • Vol. 29, No. 7C
    • /
    • pp.931-936
    • /
    • 2004
  • This paper proposes the GBFCM(DM) (Gradient-Based Fuzzy c-means with Divergence Measure) algorithm, which can efficiently cluster GPDFs (Gaussian Probability Density Functions). The proposed GBFCM(DM) is a new form of FCM that adopts a divergence measure as the distance between data, building on the conventional GBFCM. In this paper, MPEG VBR video data are modeled as multidimensional data in GPDF form, and the modeled MPEG VBR video data are classified as movie or sports content. In the experiments, the conventional FCM and GBFCM and the newly proposed GBFCM(DM) were used for modeling and classification, and the results were compared. The comparison shows that GBFCM(DM) improves the misclassification rate by about 5-15% over the other algorithms.
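The abstract does not spell out the divergence used between GPDFs, but a common choice for diagonal-covariance Gaussians is the symmetrized KL divergence. A minimal, purely illustrative sketch of such a distance (not necessarily the paper's exact formula):

```python
def gaussian_divergence(mu1, var1, mu2, var2):
    """Symmetrized KL-type divergence between two diagonal-covariance
    Gaussians, given per-dimension means and variances.  Illustrative
    only: the abstract does not state the exact form used in GBFCM(DM).
    Zero when the two Gaussians coincide; symmetric by construction."""
    d = 0.0
    for m1, v1, m2, v2 in zip(mu1, var1, mu2, var2):
        d += 0.5 * ((v1 + (m1 - m2) ** 2) / v2
                    + (v2 + (m1 - m2) ** 2) / v1
                    - 2.0)
    return d
```

In a fuzzy c-means variant, this quantity would simply replace the squared Euclidean distance in the membership and prototype-update steps.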

Deriving a New Divergence Measure from Extended Cross-Entropy Error Function

  • Oh, Sang-Hoon;Wakuya, Hiroshi;Park, Sun-Gyu;Noh, Hwang-Woo;Yoo, Jae-Soo;Min, Byung-Won;Oh, Yong-Sun
    • International Journal of Contents
    • /
    • Vol. 11, No. 2
    • /
    • pp.57-62
    • /
    • 2015
  • Relative entropy is a divergence measure between two probability density functions of a random variable. Assuming that the random variable has only two alphabets, the relative entropy becomes the cross-entropy error function, which can accelerate the training convergence of multilayer perceptron neural networks. Also, the n-th order extension of the cross-entropy (nCE) error function exhibits improved performance from the viewpoints of learning convergence and generalization capability. In this paper, we derive a new divergence measure between two probability density functions from the nCE error function. The new divergence measure is then compared with the relative entropy through the use of three-dimensional plots.
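The two-alphabet reduction mentioned in the abstract is a one-line identity: the KL divergence between Bernoulli distributions (t, 1-t) and (y, 1-y) equals the cross-entropy error minus the entropy of the target, which is constant in the network output y. A minimal sketch (function names are ours):

```python
import math

def bernoulli_kl(t, y):
    """Relative entropy KL((t, 1-t) || (y, 1-y)) for a two-alphabet
    random variable, with t, y strictly inside (0, 1)."""
    return t * math.log(t / y) + (1 - t) * math.log((1 - t) / (1 - y))

def cross_entropy(t, y):
    """Cross-entropy error between target t and network output y."""
    return -t * math.log(y) - (1 - t) * math.log(1 - y)
```

Since `bernoulli_kl(t, y) = cross_entropy(t, y) - H(t)` with `H(t)` independent of y, minimizing the cross-entropy over y is exactly minimizing the relative entropy.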

ON THE GOODNESS OF FIT TEST FOR DISCRETELY OBSERVED SAMPLE FROM DIFFUSION PROCESSES: DIVERGENCE MEASURE APPROACH

  • Lee, Sang-Yeol
    • Journal of the Korean Mathematical Society
    • /
    • Vol. 47, No. 6
    • /
    • pp.1137-1146
    • /
    • 2010
  • In this paper, we study the divergence-based goodness-of-fit test for partially observed samples from diffusion processes. In order to derive the limiting distribution of the test, we study the asymptotic behavior of the residual empirical process based on the observed sample. It is shown that the residual empirical process converges weakly to a Brownian bridge and that the associated phi-divergence test has a chi-square limiting null distribution.

NEW INFORMATION INEQUALITIES ON ABSOLUTE VALUE OF THE FUNCTIONS AND ITS APPLICATION

  • CHHABRA, PRAPHULL
    • Journal of applied mathematics & informatics
    • /
    • Vol. 35, No. 3-4
    • /
    • pp.371-385
    • /
    • 2017
  • Jain and Saraswat (2012) introduced a new generalized f-information divergence measure, from which many well-known and new information divergences are obtained. In this work, we introduce new information inequalities in absolute form on this new generalized divergence by considering convex normalized functions. Further, we apply these inequalities to obtain new relations among well-known divergences, together with numerical verification. An application to mutual information is also presented. Asymptotic approximation in terms of the Chi-square divergence is carried out as well.

Void Formation Induced by the Divergence of the Diffusive Ionic Fluxes in Metal Oxides Under Chemical Potential Gradients

  • Maruyama, Toshio;Ueda, Mitsutoshi
    • Journal of the Korean Ceramic Society
    • /
    • Vol. 47, No. 1
    • /
    • pp.8-18
    • /
    • 2010
  • When metal oxides are exposed to chemical potential gradients, ions are driven into diffusive mass transport. During this transport process, the divergence of the ionic fluxes gives rise to the formation or annihilation of oxide. Therefore, the divergence of the ionic flux may play an important role in void formation in oxides. Kinetic equations were derived to describe the chemical potential distribution, the ionic fluxes, and their divergence in oxides. The divergence was found to be a measure of void formation. The defect chemistry of scales is directly related to the sign of the divergence and gives an indication of void formation behavior. The quantitative estimation of void formation was successfully applied to a growing magnetite scale in the high-temperature oxidation of iron at 823 K.

대규모 분할표 분석 (Analysis of Large Tables)

  • Choi, Hyun-Jip
    • The Korean Journal of Applied Statistics
    • /
    • Vol. 18, No. 2
    • /
    • pp.395-410
    • /
    • 2005
  • For the analysis of large contingency tables formed by many categorical variables, an analysis method exploiting the collapsibility property is proposed. A method of determining groups of completely associated variables using the Kullback-Leibler divergence measure is proposed, and the associations among the variables in the groups obtained by the proposed method can be identified by marginal log-linear models. As an application of the proposed method, an analysis of market-basket data on consumer purchase behavior, a typical large contingency table encountered in data mining, is presented.
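A KL-based association measure between two categorical variables, of the kind this abstract builds on, can be read off a two-way contingency table as the divergence between the joint distribution and the product of its marginals (i.e., the mutual information). A minimal sketch, with our own function name:

```python
import math

def table_kl_association(table):
    """Mutual information of a two-way contingency table of counts:
    the Kullback-Leibler divergence between the joint distribution and
    the product of its marginals.  Zero exactly when the two variables
    are independent, growing with the strength of the association."""
    total = sum(sum(row) for row in table)
    row_p = [sum(row) / total for row in table]
    col_p = [sum(table[i][j] for i in range(len(table))) / total
             for j in range(len(table[0]))]
    mi = 0.0
    for i, row in enumerate(table):
        for j, n in enumerate(row):
            if n > 0:
                p = n / total
                mi += p * math.log(p / (row_p[i] * col_p[j]))
    return mi
```

Variables whose pairwise association is (near) zero can then be placed in separate groups, which is the kind of grouping the collapsibility-based analysis exploits.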