• Title/Abstract/Keyword: divergence measures

Search results: 31 (processing time: 0.025 s)

SOME NEW MEASURES OF FUZZY DIRECTED DIVERGENCE AND THEIR GENERALIZATION

  • PARKASH OM;SHARMA P. K.
    • The Pure and Applied Mathematics (Journal of the Korean Society of Mathematical Education, Series B) / Vol. 12, No. 4 / pp.307-315 / 2005
  • Many measures of fuzzy directed divergence exist corresponding to existing probabilistic measures. Some new measures of fuzzy divergence are proposed which correspond to well-known probabilistic measures. The essential properties of the proposed measures are developed, and the resulting family contains many existing measures of fuzzy directed divergence.

A NEW EXPONENTIAL DIRECTED DIVERGENCE INFORMATION MEASURE

  • JAIN, K.C.;CHHABRA, PRAPHULL
    • Journal of applied mathematics & informatics / Vol. 34, No. 3-4 / pp.295-308 / 2016
  • Depending upon the nature of the problem, different divergence measures are suitable, so it is always desirable to develop new divergence measures. In the present work, a new information divergence measure, exponential in nature, is introduced and characterized. Bounds of this new measure are obtained in terms of various symmetric and non-symmetric measures, together with numerical verification using two discrete distributions: binomial and Poisson. The fuzzy information measure and useful information measure corresponding to the new exponential divergence measure are also introduced.

Local Sensitivity Analysis using Divergence Measures under Weighted Distribution

  • Chung, Younshik;Dey, Dipak K.
    • Journal of the Korean Statistical Society / Vol. 30, No. 3 / pp.467-480 / 2001
  • This paper considers the use of local $\phi$-divergence measures between posterior distributions under classes of perturbations in order to investigate the inherent robustness of certain classes. A smaller value of the limiting local $\phi$-divergence implies more robustness for the prior or the likelihood. We consider the case when the likelihood comes from the class of weighted distributions. Two kinds of perturbations are considered for the local sensitivity analysis. In addition, some numerical examples are considered which provide measures of robustness.

INEQUALITIES FOR QUANTUM f-DIVERGENCE OF CONVEX FUNCTIONS AND MATRICES

  • Dragomir, Silvestru Sever
    • Korean Journal of Mathematics / Vol. 26, No. 3 / pp.349-371 / 2018
  • Some inequalities for the quantum f-divergence of matrices are obtained. It is shown that for normalised convex functions it is nonnegative. Some upper bounds for quantum f-divergence in terms of variational and $\chi^2$-distance are provided. Applications to some classes of divergence measures, such as the Umegaki and Tsallis relative entropies, are also given.
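
The Umegaki relative entropy named in the abstract above is $S(\rho\|\sigma) = \mathrm{Tr}[\rho(\log\rho - \log\sigma)]$ for density matrices. A minimal numerical sketch (assuming full-rank Hermitian density matrices; the helper name is mine, not from the paper):

```python
import numpy as np

def umegaki_relative_entropy(rho, sigma):
    """Umegaki relative entropy S(rho||sigma) = Tr[rho (log rho - log sigma)]
    for Hermitian, trace-one, positive-definite density matrices."""
    def logm_hermitian(a):
        # matrix logarithm via eigendecomposition: V diag(log w) V^dagger
        w, v = np.linalg.eigh(a)
        return (v * np.log(w)) @ v.conj().T
    return float(np.trace(rho @ (logm_hermitian(rho) - logm_hermitian(sigma))).real)

# diagonal density matrices reduce to the classical Kullback-Leibler divergence
rho = np.diag([0.7, 0.3])
sigma = np.diag([0.5, 0.5])
```

For commuting (e.g. diagonal) states this reduces to the classical KL divergence of the eigenvalue distributions, which gives a quick sanity check.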

GOODNESS OF FIT TESTS BASED ON DIVERGENCE MEASURES

  • Pasha, Eynollah;Kokabi, Mohsen;Mohtashami, Gholam Reza
    • Journal of applied mathematics & informatics / Vol. 26, No. 1-2 / pp.177-189 / 2008
  • In this paper, we investigate goodness-of-fit tests based on divergence measures. In the case of categorical data, under certain regularity conditions, we obtain the asymptotic distribution of these tests. We also propose a modified test that improves the rate of convergence. In the continuous case, we use our modified entropy estimator [10] for Kullback-Leibler information estimation. A comparative study based on simulation results is also discussed.

Test for Parameter Change based on the Estimator Minimizing Density-based Divergence Measures

  • Na, Ok-Young;Lee, Sang-Yeol;Park, Si-Yun
    • Proceedings of the Korean Statistical Society Conference / Proceedings of the 2003 Spring Conference of the Korean Statistical Society / pp.287-293 / 2003
  • In this paper we consider the problem of parameter change based on the cusum test proposed by Lee et al. (2003). The cusum test statistic is constructed using the estimator minimizing density-based divergence measures. It is shown that, under regularity conditions, the test statistic has the limiting distribution of the supremum of a standard Brownian bridge. Simulation results demonstrate that the cusum test is robust when there are outliers.

Improving the Performance of Document Clustering with Distributional Similarities (분포유사도를 이용한 문헌클러스터링의 성능향상에 대한 연구)

  • 이재윤
    • Journal of the Korean Society for Information Management / Vol. 24, No. 4 / pp.267-283 / 2007
  • This study explores the possibility of replacing the traditional cosine similarity formula with distributional similarity in document clustering. Three formulas derived from the representative distributional similarity, the KL divergence, were adapted for document vectors: the Jensen-Shannon divergence, the symmetric skew divergence, and the minimum skew divergence. To verify the clustering performance of these distributional similarities, two experiments were prepared and run on three test collections. In the first document clustering experiment, the minimum skew divergence clearly outperformed not only cosine similarity but also the other divergence formulas. In the second experiment, second-order distributional similarities were computed from the first-order similarity matrix using the Pearson correlation coefficient, and document clustering was performed with them. The results show that the second-order similarities generally yield better clustering performance. When both processing time and clustering performance are considered, the minimum skew divergence formula proposed in this study is recommended; when only clustering performance matters, the second-order similarity approach is preferable.
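
The divergence formulas named in the abstract above can be sketched for smoothed term-probability vectors as follows. This is a minimal illustration, not the paper's implementation: the function names, the smoothing constant, the skew parameter α = 0.99, and the `min_skew` symmetrization (taking the smaller of the two skew directions) are all assumptions of this sketch.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """KL divergence for term-probability vectors; zeros smoothed by eps."""
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return np.sum(p * np.log(p / q))

def jensen_shannon(p, q):
    """Symmetric: average KL of each vector against their midpoint."""
    m = (np.asarray(p, float) + np.asarray(q, float)) / 2.0
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def skew(p, q, alpha=0.99):
    """Skew divergence: KL of p against a q/p mixture, avoiding zero support."""
    q_mix = alpha * np.asarray(q, float) + (1 - alpha) * np.asarray(p, float)
    return kl(p, q_mix)

def min_skew(p, q, alpha=0.99):
    # hypothetical symmetrization: the smaller of the two skew directions
    return min(skew(p, q, alpha), skew(q, p, alpha))

p = np.array([0.8, 0.2])
q = np.array([0.2, 0.8])
```

All three variants stay finite on sparse document vectors, which is the practical reason they are preferred over raw KL divergence.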

Empirical Comparisons of Disparity Measures for Partial Association Models in Three Dimensional Contingency Tables

  • Jeong, D.B.;Hong, C.S.;Yoon, S.H.
    • Communications for Statistical Applications and Methods / Vol. 10, No. 1 / pp.135-144 / 2003
  • This work compares recently developed disparity measures for the partial association model in three-dimensional categorical data. Data are generated by simulation from each term in the log-linear model equation based on the partial association model, which is the method proposed in this paper. These alternative Monte Carlo methods are explored to study the behavior, for moderate sample sizes, of disparity measures such as the power divergence statistic I(λ), the Pearson chi-square statistic X$^2$, the likelihood ratio statistic G$^2$, the blended weight chi-square statistic BWCS(λ), the blended weight Hellinger distance statistic BWHD(λ), and the negative exponential disparity statistic NED(λ). We find that the power divergence statistic I(2/3) and the blended weight Hellinger distance family BWHD(1/9) are the best tests with respect to size and power.
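
The Cressie-Read power divergence family I(λ) compared in the abstract above interpolates between the classical chi-square statistics: λ = 1 gives Pearson's X², the λ → 0 limit gives the likelihood ratio G², and λ = 2/3 is the compromise Cressie and Read recommend. A minimal sketch (the observed/expected counts are illustrative, not from the paper):

```python
import numpy as np

def power_divergence(obs, exp, lam):
    """Cressie-Read power divergence statistic
    I(lam) = 2 / (lam*(lam+1)) * sum(O * ((O/E)**lam - 1)).
    lam = 1 -> Pearson X^2; lam -> 0 -> likelihood ratio G^2."""
    obs = np.asarray(obs, float)
    exp = np.asarray(exp, float)
    if abs(lam) < 1e-12:  # continuous limit at lam = 0
        return 2.0 * np.sum(obs * np.log(obs / exp))
    return 2.0 / (lam * (lam + 1.0)) * np.sum(obs * ((obs / exp) ** lam - 1.0))

# illustrative goodness-of-fit data: 100 observations over 3 cells
obs = np.array([30.0, 50.0, 20.0])
exp = np.array([33.3, 33.3, 33.4])
```

Under the null, each member of the family has the same limiting chi-square distribution, so λ is chosen for finite-sample behavior rather than asymptotics.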

Bayesian Model Selection in the Unbalanced Random Effect Model

  • Kim, Dal-Ho;Kang, Sang-Gil;Lee, Woo-Dong
    • Journal of the Korean Data and Information Science Society / Vol. 15, No. 4 / pp.743-752 / 2004
  • In this paper, we develop a Bayesian model selection procedure using the reference prior for comparing two nested models, namely the independent and intraclass models, using the distance or divergence between the two as the basis of comparison. A suitable criterion for this is the power divergence measure introduced by Cressie and Read (1984). Such a measure includes the Kullback-Leibler divergence measures and the Hellinger divergence measure as special cases. For this problem, the power divergence measure turns out to be a function solely of $\rho$, the intraclass correlation coefficient. This function is convex, and the minimum is attained at $\rho=0$. We use the reference prior for $\rho$. Due to the duality between hypothesis tests and set estimation, the hypothesis testing problem can also be solved by solving a corresponding set estimation problem. The present paper develops a Bayesian method based on the Kullback-Leibler and Hellinger divergence measures, rejecting $H_0:\rho=0$ when the specified divergence measure exceeds some number d, chosen so that the resulting credible interval for the divergence measure has specified coverage probability $1-{\alpha}$. The length of such an interval is compared with the equal two-tailed credible interval and the HPD credible interval for $\rho$ with the same coverage probability, which can also be inverted into acceptance regions of $H_0:\rho=0$. An example is considered where the HPD interval based on the one-at-a-time reference prior turns out to be the shortest credible interval having the same coverage probability.

Centroid-model based music similarity with alpha divergence (알파 다이버전스를 이용한 무게중심 모델 기반 음악 유사도)

  • 서진수;김정현;박지현
    • The Journal of the Acoustical Society of Korea / Vol. 35, No. 2 / pp.83-91 / 2016
  • Music similarity computation is the most important component in building music retrieval and classification systems. This paper reviews the recently proposed centroid-model based music retrieval method, performs music retrieval using the probability-distribution similarity of the centroid model, and evaluates its performance. The distance between probability distributions, which measures how close two given distributions are under a particular criterion, is also called a divergence. In this paper, the alpha divergence is used when comparing distances between probability distributions in the centroid model. The alpha divergence takes various forms depending on the value of alpha, and includes the widely used KLD (Kullback-Leibler divergence) and BD (Bhattacharyya distance) as special cases. Retrieval experiments were conducted on music genre and artist datasets, comparing the retrieval performance of probability-distribution-based similarity against vector-distance-based similarity. The results show that the alpha divergence can improve the performance of centroid-model based music retrieval.
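
An alpha-divergence family of the kind described in the abstract above can be sketched for discrete distributions as follows. Note this uses one common parameterization (Amari's), under which α → 1 recovers KL divergence and α = 1/2 is a monotone function of the Bhattacharyya coefficient; the abstract does not specify which parameterization the paper uses, so this is an assumption of the sketch.

```python
import numpy as np

def alpha_divergence(p, q, alpha):
    """Amari alpha divergence between discrete distributions p and q:
    D_alpha = (1 - sum(p**alpha * q**(1-alpha))) / (alpha * (1 - alpha)).
    alpha -> 1 recovers KL(p||q); alpha -> 0 recovers KL(q||p);
    alpha = 1/2 equals 4 * (1 - Bhattacharyya coefficient)."""
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    if abs(alpha - 1.0) < 1e-9:   # limiting case: KL(p||q)
        return np.sum(p * np.log(p / q))
    if abs(alpha) < 1e-9:         # limiting case: KL(q||p)
        return np.sum(q * np.log(q / p))
    return (1.0 - np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha * (1.0 - alpha))

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
```

Sweeping α then amounts to tuning how heavily the similarity penalizes regions where one model assigns mass and the other does not.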