• Title/Abstract/Keyword: Relative Entropy

Search results: 77 items

A RELATIVE RÉNYI OPERATOR ENTROPY

  • MIRAN JEONG;SEJONG KIM
    • Journal of Applied Mathematics & Informatics, Vol. 41, No. 1, pp. 123-132, 2023
  • We define an operator version of the relative Rényi entropy as a generalization of the relative von Neumann entropy, and provide its fundamental properties and bounds for its trace value. Moreover, we examine the behavior of the relative Rényi entropy under tensor products, and show its sub-additivity for density matrices.
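
The abstract does not spell out the operator formula, so as a rough orientation only, the sketch below computes the familiar trace-form (Petz-type) Rényi relative entropy, D_α(ρ‖σ) = (α−1)⁻¹ log Tr(ρ^α σ^{1−α}), together with the relative von Neumann entropy it approaches as α → 1. The paper's operator-valued version is not reproduced here, and the density matrices are made up for illustration.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power, logm

def petz_renyi_relative_entropy(rho, sigma, alpha):
    """Trace-form (Petz-type) Renyi relative entropy:
    D_alpha(rho || sigma) = log Tr(rho^alpha sigma^(1-alpha)) / (alpha - 1)."""
    r_a = fractional_matrix_power(rho, alpha)
    s_1a = fractional_matrix_power(sigma, 1.0 - alpha)
    return np.log(np.real(np.trace(r_a @ s_1a))) / (alpha - 1.0)

def relative_von_neumann_entropy(rho, sigma):
    """S(rho || sigma) = Tr[rho (log rho - log sigma)]."""
    return np.real(np.trace(rho @ (logm(rho) - logm(sigma))))

# Two 2x2 density matrices (positive definite, unit trace), for illustration only.
rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])

for a in (0.5, 0.9, 0.99):
    print(a, petz_renyi_relative_entropy(rho, sigma, a))
print("limit alpha -> 1 ~", relative_von_neumann_entropy(rho, sigma))
```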

A View on Extension of Utility-Based on Links with Information Measures

  • Hoseinzadeh, A.R.;Borzadaran, G.R.Mohtashami;Yari, G.H.
    • Communications for Statistical Applications and Methods, Vol. 16, No. 5, pp. 813-820, 2009
  • In this paper, we review the utility-based generalizations of the Shannon entropy and the Kullback-Leibler information measure, namely the U-entropy and the U-relative entropy introduced by Friedman et al. (2007). We then derive some relations between the U-relative entropy and other information measures based on a parametric family of utility functions.
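
As a hedged reference point for this entry, the snippet below computes the two baseline quantities being generalized, the Shannon entropy and the Kullback-Leibler (relative) entropy; the utility-based U-entropy and U-relative entropy themselves are defined in Friedman et al. (2007) and are not reproduced here.

```python
import numpy as np

# Baseline quantities only: the U-entropy and U-relative entropy generalize
# these via a utility function (exact forms in Friedman et al., 2007).
def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def relative_entropy(p, q):
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.6, 0.3, 0.1]          # a discrete distribution
q = [1/3, 1/3, 1/3]          # uniform reference
print(shannon_entropy(p), relative_entropy(p, q))
```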

RELATIVE SEQUENCE ENTROPY PAIRS FOR A MEASURE AND RELATIVE TOPOLOGICAL KRONECKER FACTOR

  • AHN YOUNG-HO;LEE JUNGSEOB;PARK KYEWON KOH
    • Journal of the Korean Mathematical Society, Vol. 42, No. 4, pp. 857-869, 2005
  • Let $(X, \mathcal{B}, \mu, T)$ be a dynamical system and $(Y, \mathcal{A}, \nu, S)$ be a factor. We investigate the relative sequence entropy of a partition of $X$ via the maximal compact extension of $(Y, \mathcal{A}, \nu, S)$. We define relative sequence entropy pairs and, using them, we find the relative topological $\mu$-Kronecker factor over $(Y, \nu)$, which is the maximal topological factor having relative discrete spectrum over $(Y, \nu)$. We also describe the topological Kronecker factor, which is the maximal factor having discrete spectrum for any invariant measure.
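
For orientation, one common way to write the relative sequence entropy referred to in the abstract is sketched below; this is a standard textbook-style form conditioned on the factor σ-algebra, and the paper's precise notation may differ.

```latex
% Relative sequence entropy (hedged sketch; notation may differ from the paper).
% A = {a_1 < a_2 < ...} is a sequence of positive integers, P a finite partition
% of X, and \mathcal{A}_Y the sigma-algebra pulled back from the factor (Y, A, nu, S).
\[
  h^{A}_{\mu}(T, P \mid \mathcal{A}_Y)
    = \limsup_{n \to \infty} \frac{1}{n}\,
      H_{\mu}\!\left( \bigvee_{i=1}^{n} T^{-a_i} P \;\middle|\; \mathcal{A}_Y \right),
  \qquad
  h^{A}_{\mu}(T \mid \mathcal{A}_Y) = \sup_{P}\, h^{A}_{\mu}(T, P \mid \mathcal{A}_Y).
\]
```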

Characterization of Surface Roughness Using the Concept of Entropy in Machining

  • 최기홍;최기상
    • Transactions of the Korean Society of Mechanical Engineers, Vol. 18, No. 12, pp. 3118-3126, 1994
  • This paper describes the use of the concept of (relative) entropy for the effective characterization of the amplitude and frequency distributions of the surface profile formed in machining operations. For this purpose, a theoretical model of surface texture formation in turning operations is developed first. The concept of (relative) entropy is then reviewed and its effectiveness is examined based on simulation and experimental results. The results also suggest that, under random tool vibration, the effect of geometrical factors on surface texture formation can be successfully decomposed and therefore identified by applying the concept of (relative) entropy.
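
To make the idea concrete, the sketch below builds a toy surface profile (periodic feed marks plus random tool vibration), histograms its amplitudes, and computes the relative entropy of that empirical distribution against a Gaussian reference; the paper's turning-process model and experimental profiles are not reproduced.

```python
import numpy as np

# Hedged sketch: characterize a profile's amplitude distribution by its relative
# entropy to a Gaussian reference. All signal parameters are illustrative only.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 2000)                      # feed direction (arbitrary units)
profile = 2.0 * np.sin(2 * np.pi * 1.5 * x) + rng.normal(0.0, 0.5, x.size)  # feed marks + random vibration

bins = np.linspace(profile.min(), profile.max(), 41)
p, _ = np.histogram(profile, bins=bins, density=True)
p = p / p.sum()                                        # empirical amplitude distribution

centers = 0.5 * (bins[:-1] + bins[1:])
q = np.exp(-0.5 * ((centers - profile.mean()) / profile.std()) ** 2)
q = q / q.sum()                                        # Gaussian reference with matched mean/std

mask = p > 0
rel_entropy = np.sum(p[mask] * np.log(p[mask] / q[mask]))
print(f"relative entropy of amplitude distribution vs Gaussian: {rel_entropy:.4f}")
```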

THE RELATIVE ENTROPY UNDER THE R-CGMY PROCESSES

  • Kwon, YongHoon;Lee, Younhee
    • Journal of the Chungcheong Mathematical Society, Vol. 28, No. 1, pp. 109-117, 2015
  • We consider the relative entropy between two R-CGMY processes, which are CGMY processes with Y equal to 1, in order to choose an equivalent martingale measure (EMM) when the underlying asset of a derivative follows an R-CGMY process in the financial market. Since the R-CGMY process leads to an incomplete market, a suitable criterion is needed to choose an EMM among the many available. In this paper, we derive a closed-form expression of the relative entropy for R-CGMY processes.
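
For readers unfamiliar with the criterion, the relative entropy used to compare candidate measures is the standard quantity below; typically the EMM minimizing it (the minimal entropy martingale measure) is selected. The closed-form expression derived in the paper for R-CGMY processes is not reproduced here.

```latex
% Relative entropy of an equivalent martingale measure Q with respect to the
% physical measure P (standard definition; the paper's closed form for
% R-CGMY processes is not reproduced here).
\[
  H(\mathbb{Q} \mid \mathbb{P})
    = \mathbb{E}_{\mathbb{Q}}\!\left[\ln \frac{d\mathbb{Q}}{d\mathbb{P}}\right]
    = \mathbb{E}_{\mathbb{P}}\!\left[\frac{d\mathbb{Q}}{d\mathbb{P}}
        \ln \frac{d\mathbb{Q}}{d\mathbb{P}}\right].
\]
```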

Deriving a New Divergence Measure from Extended Cross-Entropy Error Function

  • Oh, Sang-Hoon;Wakuya, Hiroshi;Park, Sun-Gyu;Noh, Hwang-Woo;Yoo, Jae-Soo;Min, Byung-Won;Oh, Yong-Sun
    • International Journal of Contents, Vol. 11, No. 2, pp. 57-62, 2015
  • Relative entropy is a divergence measure between two probability density functions of a random variable. Assuming that the random variable takes only two values, the relative entropy reduces to the cross-entropy error function, which can accelerate the training convergence of multi-layer perceptron neural networks. The n-th order extension of the cross-entropy (nCE) error function also exhibits improved performance in terms of learning convergence and generalization capability. In this paper, we derive a new divergence measure between two probability density functions from the nCE error function, and compare it with the relative entropy through three-dimensional plots.
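
As a hedged baseline for this entry, the snippet below computes the standard binary cross-entropy error and the binary relative entropy it derives from; the n-th order extension (nCE) and the new divergence measure are defined in the paper and are not reproduced here.

```python
import numpy as np

# Baseline quantities only; the nCE error function is not reproduced here.
def binary_kl(t, y, eps=1e-12):
    """Relative entropy D(t || y) between Bernoulli distributions."""
    t, y = np.clip(t, eps, 1 - eps), np.clip(y, eps, 1 - eps)
    return t * np.log(t / y) + (1 - t) * np.log((1 - t) / (1 - y))

def binary_cross_entropy(t, y, eps=1e-12):
    """Cross-entropy error: binary_kl(t, y) plus the entropy of t."""
    y = np.clip(y, eps, 1 - eps)
    return -(t * np.log(y) + (1 - t) * np.log(1 - y))

t, y = 1.0, 0.8       # target and network output
print(binary_cross_entropy(t, y), binary_kl(t, y))
```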

PSS Evaluation Based on Vague Assessment Big Data: Hybrid Model of Multi-Weight Combination and Improved TOPSIS by Relative Entropy

  • Lianhui Li
    • Journal of Information Processing Systems, Vol. 20, No. 3, pp. 285-295, 2024
  • Driven by vague assessment big data, a product service system (PSS) evaluation method is developed based on a hybrid model of multi-weight combination and a TOPSIS improved by relative entropy. The index values of the PSS alternatives are obtained by integrating the stakeholders' vague assessment comments, which are presented as trapezoidal fuzzy numbers. A multi-weight combination method is proposed to determine the index weights for PSS evaluation decision-making. A TOPSIS improved by relative entropy (RE) is presented to overcome the shortcomings of the traditional TOPSIS and related modified versions, and the PSS alternatives are then evaluated with it. A PSS evaluation case from a printer company is given to test and verify the proposed model. The RE closeness values of the seven PSS alternatives are 0.3940, 0.5147, 0.7913, 0.3719, 0.2403, 0.4959, and 0.6332, and the alternative with the highest RE closeness is selected as the best one. Comparison examples show that the presented model can compensate for the shortcomings of existing traditional methods.
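
A minimal sketch of the general idea, using relative entropy as the separation measure inside TOPSIS, is given below; the decision matrix, weights, and normalization are invented for illustration, and the paper's multi-weight combination and exact improved TOPSIS are not reproduced.

```python
import numpy as np

# Hedged sketch of a TOPSIS variant whose separation measure is relative entropy
# rather than Euclidean distance. All input data are made up for illustration.
def rel_entropy(p, q):
    """D(p || q) for probability vectors with strictly positive entries."""
    return float(np.sum(p * np.log(p / q)))

X = np.array([[0.7, 0.5, 0.9],   # alternatives (rows) x benefit criteria (columns)
              [0.4, 0.8, 0.6],
              [0.9, 0.6, 0.7]])
w = np.array([0.5, 0.3, 0.2])    # criterion weights (assumed)

V = X * w                         # weighted decision matrix
ideal = V.max(axis=0)             # positive ideal solution
anti = V.min(axis=0)              # negative ideal solution

def as_dist(v):
    return v / v.sum()            # treat a weighted row as a distribution

closeness = []
for v in V:
    d_plus = rel_entropy(as_dist(v), as_dist(ideal))   # divergence from ideal
    d_minus = rel_entropy(as_dist(v), as_dist(anti))   # divergence from anti-ideal
    closeness.append(d_minus / (d_plus + d_minus))     # RE closeness in [0, 1]
print(np.round(closeness, 4), "best alternative:", int(np.argmax(closeness)) + 1)
```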

Mutual Information Analysis with Similarity Measure

  • Wang, Hong-Mei;Lee, Sang-Hyuk
    • International Journal of Fuzzy Logic and Intelligent Systems, Vol. 10, No. 3, pp. 218-223, 2010
  • Discussion and analysis of relative mutual information are carried out through fuzzy entropy and similarity measures. The fuzzy relative mutual information measure (FRIM) plays an important role as a measure of the information shared between two fuzzy pattern vectors. The FRIM is analyzed and explained through the similarity measure between two fuzzy sets. Furthermore, a comparison between the two measures is also carried out.
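
As loose context only, the snippet below computes a classical fuzzy entropy (De Luca-Termini form) and a simple distance-based similarity measure between membership vectors; the FRIM itself is defined in the paper and is not reproduced here.

```python
import numpy as np

# Classical ingredients only; the paper's FRIM is not reproduced here.
def fuzzy_entropy(mu, eps=1e-12):
    """De Luca-Termini fuzzy entropy of a membership vector."""
    mu = np.clip(np.asarray(mu, dtype=float), eps, 1 - eps)
    return float(-np.sum(mu * np.log(mu) + (1 - mu) * np.log(1 - mu)))

def similarity(mu_a, mu_b):
    """Simple distance-based similarity between two membership vectors."""
    mu_a, mu_b = np.asarray(mu_a, dtype=float), np.asarray(mu_b, dtype=float)
    return 1.0 - float(np.mean(np.abs(mu_a - mu_b)))

A = [0.2, 0.7, 0.9, 0.4]   # membership vectors of two fuzzy pattern vectors
B = [0.3, 0.6, 0.8, 0.5]
print(fuzzy_entropy(A), fuzzy_entropy(B), similarity(A, B))
```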

Evaluation of Raingauge Networks in the Soyanggang Dam River Basin

  • 김재복;배영대;박봉진;김재한
    • Proceedings of the 2007 Korea Water Resources Association Annual Conference, pp. 178-182, 2007
  • In this study, we evaluated the current raingauge network of the Soyanggang Dam region by applying spatial-correlation analysis and entropy theory in order to recommend an optimized raingauge network. In the analysis, the correlation distance between raingauge stations is estimated and evaluated with both the spatial-correlation method and the entropy method, and from these correlation distances the influence radius for each dataset and each method is assessed. The correlation and entropy analyses yield a correlation distance of 25.546 km and an influence radius of 7.206 km, corresponding to a decrease in network density from 224.53 km² to 122.47 km² per gauge, which satisfies the recommended minimum density of 250 km² for mountainous regions (WMO, 1994), and an increase in basin coverage from 59.3% to 86.8%. For the elevation analysis, the relative evaluation ratio increased from 0.59 (current) to 0.92 (optimized), showing a clear improvement.
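
To illustrate the kind of computation the abstract describes, the sketch below correlates synthetic station rainfall series against inter-station distance and fits an exponential decay whose e-folding length plays the role of a correlation distance; the actual Soyanggang data, the entropy-based analysis, and the WMO density checks are not reproduced.

```python
import numpy as np

# Hedged sketch: correlation distance from pairwise station correlations.
# Station layout and rainfall are synthetic; the study's entropy-based
# variant is not reproduced.
rng = np.random.default_rng(1)
n_st, n_days, d0_true = 10, 365, 20.0
coords = rng.uniform(0, 40, size=(n_st, 2))                        # station coords (km, synthetic)
D = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
C = np.exp(-D / d0_true)                                           # assumed spatial covariance model
L = np.linalg.cholesky(C + 1e-9 * np.eye(n_st))
rain = (L @ rng.normal(size=(n_st, n_days))) ** 2                  # correlated, nonnegative series

dists, corrs = [], []
for i in range(n_st):
    for j in range(i + 1, n_st):
        dists.append(D[i, j])
        corrs.append(np.corrcoef(rain[i], rain[j])[0, 1])
dists, corrs = np.array(dists), np.clip(np.array(corrs), 1e-6, None)

# Fit r(d) ~ exp(-d / d0): the slope of log-correlation vs distance gives -1/d0.
slope = np.polyfit(dists, np.log(corrs), 1)[0]
print(f"estimated correlation distance: {-1.0 / slope:.1f} km")
```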
