• Title/Summary/Keyword: divergence measures


SOME NEW MEASURES OF FUZZY DIRECTED DIVERGENCE AND THEIR GENERALIZATION

  • PARKASH OM;SHARMA P. K.
    • The Pure and Applied Mathematics, v.12 no.4 s.30, pp.307-315, 2005
  • Many measures of fuzzy directed divergence exist that correspond to existing probabilistic measures. Some new measures of fuzzy divergence are proposed which correspond to well-known probabilistic measures, and the essential properties of the proposed measures are established. The resulting generalization contains many existing measures of fuzzy directed divergence as special cases.
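
The abstract does not reproduce the proposed measures; for orientation, here is a minimal sketch of one classical measure of this type, the Bhandari-Pal fuzzy analogue of Kullback-Leibler divergence, computed over membership grades (the clipping constant is an implementation choice, not from the paper):

```python
import numpy as np

def fuzzy_kl_divergence(mu_a, mu_b, eps=1e-12):
    """Bhandari-Pal fuzzy directed divergence D(A, B) between two
    fuzzy sets given as arrays of membership grades in (0, 1)."""
    mu_a = np.clip(np.asarray(mu_a, dtype=float), eps, 1 - eps)
    mu_b = np.clip(np.asarray(mu_b, dtype=float), eps, 1 - eps)
    return float(np.sum(mu_a * np.log(mu_a / mu_b)
                        + (1 - mu_a) * np.log((1 - mu_a) / (1 - mu_b))))

# The divergence is zero iff the membership grades coincide.
a = [0.2, 0.7, 0.9]
b = [0.3, 0.6, 0.8]
print(fuzzy_kl_divergence(a, b))   # > 0
print(fuzzy_kl_divergence(a, a))   # 0.0
```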


A NEW EXPONENTIAL DIRECTED DIVERGENCE INFORMATION MEASURE

  • JAIN, K.C.;CHHABRA, PRAPHULL
    • Journal of Applied Mathematics & Informatics, v.34 no.3_4, pp.295-308, 2016
  • Depending upon the nature of the problem, different divergence measures are suitable, so it is always desirable to develop new ones. In the present work, a new information divergence measure, which is exponential in nature, is introduced and characterized. Bounds on this new measure are obtained in terms of various symmetric and non-symmetric measures, together with numerical verification using two discrete distributions: binomial and Poisson. A fuzzy information measure and a useful information measure corresponding to the new exponential divergence measure are also introduced.
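
The new measure's formula is not given in the abstract. As a hedged illustration of the general construction, here is a Csiszár f-divergence with a hypothetical exponential generator f(t) = e^(t-1) - 1 (convex with f(1) = 0; not the paper's measure), checked numerically on binomial and Poisson distributions as the abstract describes:

```python
import numpy as np
from scipy.stats import binom, poisson

def f_divergence(p, q, f, eps=1e-12):
    """Csiszar f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i)
    for discrete distributions on a common support."""
    p = np.clip(np.asarray(p, float), eps, None)
    q = np.clip(np.asarray(q, float), eps, None)
    return float(np.sum(q * f(p / q)))

# Hypothetical exponential generator: convex on (0, inf) with f(1) = 0.
exp_generator = lambda t: np.exp(t - 1.0) - 1.0

# Numerical check in the spirit of the abstract: Binomial(n, p) against
# a Poisson with the same mean, truncated to 0..n for illustration.
n, pr = 20, 0.1
k = np.arange(n + 1)
print(f_divergence(binom.pmf(k, n, pr), poisson.pmf(k, n * pr),
                   exp_generator))
```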

Local Sensitivity Analysis using Divergence Measures under Weighted Distribution

  • Chung, Younshik;Dey, Dipak K.
    • Journal of the Korean Statistical Society, v.30 no.3, pp.467-480, 2001
  • This paper considers the use of local $\phi$-divergence measures between posterior distributions under classes of perturbations in order to investigate the inherent robustness of certain classes. A smaller value of the limiting local $\phi$-divergence implies more robustness for the prior or the likelihood. We consider the case where the likelihood comes from the class of weighted distributions. Two kinds of perturbations are considered for the local sensitivity analysis. In addition, numerical examples are presented that provide measures of robustness.
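
A minimal sketch of the idea, under assumptions of this illustration only (a toy binomial likelihood on a grid and an ε-contamination class of priors, neither taken from the paper): the χ²-type φ-divergence between the baseline and perturbed posteriors shrinks like ε², and the rescaled value is the kind of local quantity used to compare robustness:

```python
import numpy as np

def phi_divergence(p, q, phi, eps=1e-12):
    """D_phi(P, Q) = sum_i q_i * phi(p_i / q_i) on a discrete grid."""
    p = np.clip(np.asarray(p, float), eps, None)
    q = np.clip(np.asarray(q, float), eps, None)
    return float(np.sum(q * phi(p / q)))

phi_chi2 = lambda t: (t - 1.0) ** 2  # chi-square-type phi, phi(1) = 0

theta = np.linspace(0.01, 0.99, 99)
like = theta**3 * (1 - theta)**7       # toy likelihood: 3 successes in 10
prior = np.ones_like(theta)            # baseline flat prior density
contam = 2 * theta                     # Beta(2, 1) contaminating prior
for eps_c in (0.2, 0.1, 0.05):
    pert = (1 - eps_c) * prior + eps_c * contam   # eps-contamination
    post0 = like * prior; post0 /= post0.sum()
    post1 = like * pert;  post1 /= post1.sum()
    d = phi_divergence(post1, post0, phi_chi2)
    print(eps_c, d, d / eps_c**2)      # d ~ eps^2; the ratio stabilizes
```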


INEQUALITIES FOR QUANTUM f-DIVERGENCE OF CONVEX FUNCTIONS AND MATRICES

  • Dragomir, Silvestru Sever
    • Korean Journal of Mathematics, v.26 no.3, pp.349-371, 2018
  • Some inequalities for the quantum f-divergence of matrices are obtained. It is shown that for normalised convex functions the quantum f-divergence is nonnegative. Some upper bounds for quantum f-divergence in terms of the variational and $\chi^2$-distances are provided. Applications to some classes of divergence measures, such as the Umegaki and Tsallis relative entropies, are also given.
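
As a concrete member of the class treated here, a minimal numerical sketch of the Umegaki relative entropy, the quantum f-divergence with f(t) = t log t, assuming full-rank density matrices:

```python
import numpy as np
from scipy.linalg import logm

def umegaki_relative_entropy(rho, sigma):
    """Umegaki relative entropy S(rho||sigma) = Tr[rho (log rho - log sigma)]
    for full-rank density matrices (trace one, positive definite)."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

def random_density_matrix(n, rng):
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    rho = a @ a.conj().T + 1e-3 * np.eye(n)   # Hermitian positive definite
    return rho / np.trace(rho).real

rng = np.random.default_rng(0)
rho, sigma = random_density_matrix(3, rng), random_density_matrix(3, rng)
print(umegaki_relative_entropy(rho, sigma))   # nonnegative, as the paper shows
print(umegaki_relative_entropy(rho, rho))     # ~ 0
```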

GOODNESS OF FIT TESTS BASED ON DIVERGENCE MEASURES

  • Pasha, Eynollah;Kokabi, Mohsen;Mohtashami, Gholam Reza
    • Journal of Applied Mathematics & Informatics, v.26 no.1_2, pp.177-189, 2008
  • In this paper, we investigate goodness-of-fit tests based on divergence measures. In the case of categorical data, under certain regularity conditions, we obtain the asymptotic distribution of these tests. We also propose a modified test that improves the rate of convergence. In the continuous case, we use our modified entropy estimator [10] for Kullback-Leibler information estimation. A comparative study based on simulation results is also discussed.
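
For categorical data, divergence-based goodness-of-fit statistics of this kind are available in SciPy as the Cressie-Read power divergence family; a small sketch with made-up counts (the paper's modified test is not reproduced here):

```python
import numpy as np
from scipy.stats import power_divergence

# Observed counts vs. expected counts under a uniform null.
observed = np.array([18, 25, 21, 16, 20])
expected = np.full(5, observed.sum() / 5)

# Several classical goodness-of-fit statistics are special cases
# of the power divergence family, indexed by lambda.
for lam, name in [(1.0, "Pearson X^2"),
                  (0.0, "likelihood ratio G^2"),
                  (2 / 3, "Cressie-Read")]:
    stat, pval = power_divergence(observed, expected, lambda_=lam)
    print(f"{name:22s} stat={stat:.3f}  p={pval:.3f}")
```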

  • PDF

Test for Parameter Change based on the Estimator Minimizing Density-based Divergence Measures

  • Na, Ok-Young;Lee, Sang-Yeol;Park, Si-Yun
    • Proceedings of the Korean Statistical Society Conference, 2003.05a, pp.287-293, 2003
  • In this paper we consider the problem of parameter change based on the cusum test proposed by Lee et al. (2003). The cusum test statistic is constructed using the estimator that minimizes density-based divergence measures. It is shown that under regularity conditions, the test statistic has the limiting distribution of the supremum of a standard Brownian bridge. Simulation results demonstrate that the cusum test is robust when there are outliers.
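
The cusum statistic itself is not reproduced in the abstract. A minimal sketch of its main ingredient as described, an estimator minimizing a density-based divergence, here taken to be the density power divergence of Basu et al. (1998) for a normal location model with made-up contaminated data, illustrating the robustness to outliers that the simulations report:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def dpd_objective(theta, x, alpha=0.5):
    """Density power divergence objective of Basu et al. (1998) for a
    N(theta, 1) location model; its minimizer is a robust estimator."""
    integral = (2 * np.pi) ** (-alpha / 2) / np.sqrt(1 + alpha)
    emp = np.mean(norm.pdf(x, loc=theta) ** alpha)
    return integral - (1 + 1 / alpha) * emp

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 95), np.full(5, 8.0)])  # outliers

mdpde = minimize_scalar(dpd_objective, args=(x,), bounds=(-5, 5),
                        method="bounded").x
print(f"sample mean  = {x.mean():.3f}")   # dragged toward the outliers
print(f"MDPDE (a=.5) = {mdpde:.3f}")      # stays near 0
```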


Improving the Performance of Document Clustering with Distributional Similarities (분포유사도를 이용한 문헌클러스터링의 성능향상에 대한 연구)

  • Lee, Jae-Yun
    • Journal of the Korean Society for Information Management, v.24 no.4, pp.267-283, 2007
  • In this study, measures of distributional similarity such as KL-divergence are applied to document clustering in place of the cosine measure, the most prevalent vector similarity measure for document clustering. Three variations of KL-divergence are investigated: Jensen-Shannon divergence, symmetric skew divergence, and minimum skew divergence. To verify the contribution of distributional similarities to document clustering, two experiments were designed and carried out on three test collections. In the first experiment the clustering performance of the three divergence measures was compared to that of the cosine measure; minimum skew divergence outperformed the other divergence measures as well as the cosine measure. In the second experiment, second-order distributional similarities were calculated with the Pearson correlation coefficient from the first-order similarity matrices, and were found to improve the overall performance of document clustering. These results suggest that minimum skew divergence should be selected as the document vector similarity measure when considering both time and accuracy, and that second-order similarity is a good choice when considering clustering accuracy only.
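
A minimal sketch of the divergence variants on toy term distributions; the smoothing, the skew parameter α = 0.99, and reading "minimum skew divergence" as the smaller of the two directed skew divergences are assumptions of this illustration, not details from the paper:

```python
import numpy as np

def kl(p, q, eps=1e-12):
    p = np.clip(p, eps, None); q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

def jensen_shannon(p, q):
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def skew(p, q, alpha=0.99):
    """Skew divergence: KL from p to a mixture leaning toward q,
    which stays finite when q has zero entries (Lee, 1999)."""
    return kl(p, alpha * q + (1 - alpha) * p)

def min_skew(p, q, alpha=0.99):
    return min(skew(p, q, alpha), skew(q, p, alpha))

# Term distributions of two toy documents (normalized term counts).
d1 = np.array([3, 1, 0, 2, 0], float); d1 /= d1.sum()
d2 = np.array([2, 0, 1, 3, 1], float); d2 /= d2.sum()
print(jensen_shannon(d1, d2), min_skew(d1, d2))
```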

Empirical Comparisons of Disparity Measures for Partial Association Models in Three Dimensional Contingency Tables

  • Jeong, D.B.;Hong, C.S.;Yoon, S.H.
    • Communications for Statistical Applications and Methods, v.10 no.1, pp.135-144, 2003
  • This work compares recently developed disparity measures for the partial association model in three-dimensional categorical data. Data are generated by simulation from each term of the log-linear model equation based on the partial association model, a method proposed in this paper. This alternative Monte Carlo method is explored to study the behavior, for moderate sample sizes, of disparity measures such as the power divergence statistic $I(\lambda)$, the Pearson chi-square statistic $X^2$, the likelihood ratio statistic $G^2$, the blended weight chi-square statistic $BWCS(\lambda)$, the blended weight Hellinger distance statistic $BWHD(\lambda)$, and the negative exponential disparity statistic $NED(\lambda)$. We find that the power divergence statistic $I(2/3)$ and the blended weight Hellinger distance family $BWHD(1/9)$ are the best tests with respect to size and power.
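
Of the statistics compared, only the power divergence family $I(\lambda)$ is sketched below, with made-up counts (the blended-weight families' exact forms are not given in the abstract); Pearson $X^2$ appears at λ = 1 and $G^2$ in the λ → 0 limit:

```python
import numpy as np

def power_divergence_stat(obs, exp, lam):
    """Cressie-Read power divergence statistic 2n*I(lambda); the
    lambda -> 0 limit gives the likelihood ratio statistic G^2."""
    obs = np.asarray(obs, float); exp = np.asarray(exp, float)
    if abs(lam) < 1e-10:
        return 2.0 * np.sum(obs * np.log(obs / exp))   # G^2
    return (2.0 / (lam * (lam + 1))) * np.sum(obs * ((obs / exp) ** lam - 1))

obs = np.array([22, 18, 25, 15])
exp = np.full(4, obs.sum() / 4)
for lam in (1.0, 2 / 3, 0.0):   # Pearson X^2, I(2/3), G^2
    print(lam, power_divergence_stat(obs, exp, lam))
```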

Bayesian Model Selection in the Unbalanced Random Effect Model

  • Kim, Dal-Ho;Kang, Sang-Gil;Lee, Woo-Dong
    • Journal of the Korean Data and Information Science Society, v.15 no.4, pp.743-752, 2004
  • In this paper, we develop a Bayesian model selection procedure using the reference prior for comparing two nested models, the independent and intraclass models, using the distance or divergence between the two as the basis of comparison. A suitable criterion for this is the power divergence measure introduced by Cressie and Read (1984), which includes the Kullback-Leibler and Hellinger divergence measures as special cases. For this problem, the power divergence measure turns out to be a function solely of $\rho$, the intraclass correlation coefficient; this function is convex, with its minimum attained at $\rho=0$. We use a reference prior for $\rho$. Owing to the duality between hypothesis tests and set estimation, the hypothesis testing problem can also be solved by solving a corresponding set estimation problem. The present paper develops a Bayesian method based on the Kullback-Leibler and Hellinger divergence measures, rejecting $H_0:\rho=0$ when the specified divergence measure exceeds some number d, chosen so that the resulting credible interval for the divergence measure has specified coverage probability $1-\alpha$. The length of such an interval is compared with the equal two-tailed credible interval and the HPD credible interval for $\rho$ with the same coverage probability, which can also be inverted into acceptance regions of $H_0:\rho=0$. An example is considered where the HPD interval based on the one-at-a-time reference prior turns out to be the shortest credible interval with the same coverage probability.
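
The power divergence family referred to here contains Kullback-Leibler (λ → 0) and a multiple of squared Hellinger distance (λ = -1/2) as special cases. A grid-based sketch between two normal densities, an illustrative stand-in for the independent-versus-intraclass comparison rather than the paper's computation:

```python
import numpy as np
from scipy.stats import norm

def power_div(f, g, lam, dx):
    """Cressie-Read power divergence between densities on a grid;
    lam -> 0 recovers Kullback-Leibler, and lam = -1/2 equals
    twice the squared Hellinger distance 2*int (sqrt f - sqrt g)^2."""
    if abs(lam) < 1e-10:
        return float(np.sum(f * np.log(f / g)) * dx)
    return float(np.sum(f * ((f / g) ** lam - 1)) * dx / (lam * (lam + 1)))

x = np.linspace(-10, 10, 20001); dx = x[1] - x[0]
f, g = norm.pdf(x, 0, 1), norm.pdf(x, 1, 1)
print(power_div(f, g, 0.0, dx))    # KL: (mu1 - mu2)^2 / 2 = 0.5
print(power_div(f, g, -0.5, dx))   # 4*(1 - exp(-1/8)) ~ 0.470
```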


Centroid-model based music similarity with alpha divergence (알파 다이버전스를 이용한 무게중심 모델 기반 음악 유사도)

  • Seo, Jin Soo;Kim, Jeonghyun;Park, Jihyun
    • The Journal of the Acoustical Society of Korea, v.35 no.2, pp.83-91, 2016
  • Music-similarity computation is crucial in developing music information retrieval systems for browsing and classification. This paper overviews the recently-proposed centroid-model based music retrieval method and applies the distributional similarity measures to the model for retrieval-performance evaluation. Probabilistic distance measures (also called divergence) compute the distance between two probability distributions in a certain sense. In this paper, we consider the alpha divergence in computing distance between two centroid models for music retrieval. The alpha divergence includes the widely-used Kullback-Leibler divergence and Bhattacharyya distance depending on the values of alpha. Experiments were conducted on both genre and singer datasets. We compare the music-retrieval performance of the distributional similarity with that of the vector distances. The experimental results show that the alpha divergence improves the performance of the centroid-model based music retrieval.