• Title/Summary/Keyword: entropy measure

Fuzzy Entropy Construction for Non-Convex Fuzzy Membership Function

  • Lee, Sang-H; Kim, Jae-Hyung; Kim, Sang-Jin
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2008.04a / pp.21-22 / 2008
  • A fuzzy entropy is designed for non-convex fuzzy membership functions using the well-known Hamming distance measure. The design procedure for a convex fuzzy membership function is presented through the distance measure, and a characteristic analysis of the non-convex case is also illustrated. A proof of the proposed fuzzy entropy is given, and its computation is illustrated.
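
As a rough illustration of this construction, the Python sketch below computes a fuzzy entropy as twice the normalized Hamming distance between a membership vector and its nearest crisp set, evaluated on a two-hump (non-convex) membership function. The specific entropy form and all names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def hamming_distance(mu_a, mu_b):
    """Normalized Hamming distance between two membership vectors."""
    mu_a, mu_b = np.asarray(mu_a, float), np.asarray(mu_b, float)
    return np.abs(mu_a - mu_b).mean()

def fuzzy_entropy(mu):
    """Fuzzy entropy as twice the Hamming distance to the nearest crisp set.

    Assumed form: 0 for a crisp set, 1 when every membership value is 0.5.
    The construction does not require the membership function to be convex.
    """
    mu = np.asarray(mu, float)
    nearest_crisp = (mu >= 0.5).astype(float)
    return 2.0 * hamming_distance(mu, nearest_crisp)

# A non-convex membership function: two separated humps on [0, 10].
x = np.linspace(0.0, 10.0, 201)
mu = np.maximum(np.exp(-(x - 2.5) ** 2), np.exp(-(x - 7.5) ** 2))
print(f"entropy of non-convex fuzzy set : {fuzzy_entropy(mu):.4f}")
print(f"entropy of its nearest crisp set: {fuzzy_entropy((mu >= 0.5).astype(float)):.4f}")  # 0.0
```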


Measure of Maximal Entropy for Star Multimodal Maps

  • Attarzadeh, Fatemeh; Tajbakhsh, Khosro
    • Journal of the Chungcheong Mathematical Society / v.34 no.1 / pp.77-84 / 2021
  • Let f : [0, 1] → [0, 1] be a multimodal map with positive topological entropy. The dynamics of the renormalization operator for multimodal maps have been investigated by Daniel Smania. It is proved that the measure of maximal entropy for a specific category of C^r interval maps is unique.

Deriving a New Divergence Measure from Extended Cross-Entropy Error Function

  • Oh, Sang-Hoon; Wakuya, Hiroshi; Park, Sun-Gyu; Noh, Hwang-Woo; Yoo, Jae-Soo; Min, Byung-Won; Oh, Yong-Sun
    • International Journal of Contents / v.11 no.2 / pp.57-62 / 2015
  • Relative entropy is a divergence measure between two probability density functions of a random variable. Assuming that the random variable has only two alphabets, the relative entropy becomes the cross-entropy error function, which can accelerate the training convergence of multi-layer perceptron neural networks. The n-th order extension of the cross-entropy (nCE) error function further improves learning convergence and generalization capability. In this paper, we derive a new divergence measure between two probability density functions from the nCE error function, and compare it with the relative entropy through three-dimensional plots.
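
A minimal sketch of the binary case described above: for a 0/1 target, the Bernoulli relative entropy coincides with the familiar cross-entropy error minimized when training a multi-layer perceptron. The nCE error function itself is not reproduced, since its exact form is not given in the abstract.

```python
import numpy as np

def kl_divergence_bernoulli(p, q, eps=1e-12):
    """Relative entropy D(p || q) between two Bernoulli distributions."""
    p, q = np.clip(p, eps, 1 - eps), np.clip(q, eps, 1 - eps)
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

def cross_entropy_error(t, y, eps=1e-12):
    """Cross-entropy error between a target t and a network output y."""
    y = np.clip(y, eps, 1 - eps)
    return -(t * np.log(y) + (1 - t) * np.log(1 - y))

t, y = 1.0, 0.8
# For a 0/1 target the Bernoulli entropy term vanishes, so the
# relative entropy equals the cross-entropy error (~0.2231 here).
print(kl_divergence_bernoulli(t, y))
print(cross_entropy_error(t, y))
```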

Information Management by Data Quantification with Fuzzy Entropy and Similarity Measure

  • Siang, Chua Hong; Lee, Sanghyuk
    • Journal of the Korea Convergence Society / v.4 no.2 / pp.35-41 / 2013
  • Data management with fuzzy entropy and similarity measure is discussed and verified by application to a reliable data selection problem. Fuzzy entropy and similarity measures for calculating the certainty or uncertainty of data are designed and proved. The proposed fuzzy entropy and similarity are treated as a dissimilarity measure and a similarity measure, respectively, and the relation between the two measures is explained through graphical illustration. The obtained measures are useful in decision theory and mutual information analysis problems. Extensions of the data quantification results based on the proposed measures are applicable to decision making and fuzzy game theory.
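
One common way to realize the dissimilarity/similarity pairing mentioned above is to take a normalized Hamming distance as the dissimilarity and its complement as the similarity, so that the two measures sum to one. This is an assumed construction for illustration, not necessarily the paper's exact design:

```python
import numpy as np

def dissimilarity(mu_a, mu_b):
    """Normalized Hamming distance used as a dissimilarity measure."""
    return np.abs(np.asarray(mu_a, float) - np.asarray(mu_b, float)).mean()

def similarity(mu_a, mu_b):
    """Similarity as the complement of the dissimilarity: s = 1 - d."""
    return 1.0 - dissimilarity(mu_a, mu_b)

mu_a = np.array([0.1, 0.4, 0.8, 1.0, 0.6])
mu_b = np.array([0.2, 0.5, 0.7, 0.9, 0.5])
d, s = dissimilarity(mu_a, mu_b), similarity(mu_a, mu_b)
print(f"d = {d:.2f}, s = {s:.2f}, d + s = {d + s:.2f}")  # d + s = 1.00
```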

Information Quantification Application to Management with Fuzzy Entropy and Similarity Measure

  • Wang, Hong-Mei; Lee, Sang-Hyuk
    • International Journal of Fuzzy Logic and Intelligent Systems / v.10 no.4 / pp.275-280 / 2010
  • The efficiency of data management with fuzzy entropy and similarity measure is verified by applying them to a reliable data selection problem and to numerical data similarity evaluation. Fuzzy entropy and similarity measures for calculating certainty or uncertainty are designed and proved. The designed fuzzy entropy and similarity are treated as a dissimilarity measure and a similarity measure, respectively, and the relation between the two measures is explained through graphical illustration. The obtained measures are useful in decision theory and mutual information analysis problems. Extensions of the data quantification results based on the proposed measures are applicable to decision making and fuzzy game theory.

A View on Extension of Utility-Based on Links with Information Measures

  • Hoseinzadeh, A.R.; Borzadaran, G.R. Mohtashami; Yari, G.H.
    • Communications for Statistical Applications and Methods / v.16 no.5 / pp.813-820 / 2009
  • In this paper, we review the utility-based generalizations of the Shannon entropy and the Kullback-Leibler information measure, namely the U-entropy and the U-relative entropy introduced by Friedman et al. (2007). Then, we derive some relations between the U-relative entropy and other information measures based on a parametric family of utility functions.
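
The U-entropy and U-relative entropy depend on a chosen utility function, which the abstract does not spell out, so they are not reproduced here. As a reference point, the sketch below computes the classical Shannon entropy and Kullback-Leibler information that they generalize (for a logarithmic utility the U-measures are reported to reduce to these classical quantities):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p log p, in nats."""
    p = np.asarray(p, float)
    p = p[p > 0]  # 0 log 0 is taken to be 0
    return -(p * np.log(p)).sum()

def relative_entropy(p, q):
    """Kullback-Leibler information D(p || q), in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return (p[mask] * np.log(p[mask] / q[mask])).sum()

p = np.array([0.5, 0.3, 0.2])
q = np.array([1 / 3, 1 / 3, 1 / 3])  # uniform reference distribution
print(f"H(p)      = {shannon_entropy(p):.4f}")
print(f"D(p || q) = {relative_entropy(p, q):.4f}")
```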

On some properties of distance measures and fuzzy entropy

  • Lee, Sang-Hyuk; Kim, Sungshin
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2002.12a / pp.9-12 / 2002
  • Representation and quantification of fuzziness are required for uncertain system modelling and controller design. Conventional results show that the entropy of a fuzzy set represents its fuzziness. In this paper, the relations among fuzzy entropy, distance measure, and similarity measure are discussed, and a distance measure is proposed. Using these relations, fuzzy entropy is represented by the newly proposed distance measure, and an example with a simple fuzzy set is illustrated.

Calculation of Data Reliability with Entropy for Fuzzy Sets

  • Wang, Hongmei; Lee, Sang-Hyuk
    • International Journal of Fuzzy Logic and Intelligent Systems / v.9 no.4 / pp.269-274 / 2009
  • Measuring the uncertainty of fuzzy sets has been carried out by calculating fuzzy entropy. The fuzzy entropy of a fuzzy set is derived with the help of a distance measure: a value proportional to the distance between the fuzzy set and its corresponding crisp set is designed as the fuzzy entropy, and its usefulness is verified by proof. Generally, fuzzy entropy has the complementary characteristic that a fuzzy set and its complement have the same entropy. The discrepancy that low fuzzy entropy did not guarantee data certainty was overcome by modifying the fuzzy entropy formulation. The obtained fuzzy entropy is analyzed and discussed through a simple example.
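
The complementary characteristic mentioned above is easy to check numerically. The sketch below uses a standard distance-based entropy (an assumed form, not the paper's modified formulation) and shows that a fuzzy set and its complement receive the same entropy value:

```python
import numpy as np

def fuzzy_entropy(mu):
    """Entropy as twice the normalized Hamming distance to the nearest crisp set."""
    mu = np.asarray(mu, float)
    nearest_crisp = (mu >= 0.5).astype(float)
    return 2.0 * np.abs(mu - nearest_crisp).mean()

mu = np.array([0.1, 0.3, 0.5, 0.7, 0.95])
# The complement of a fuzzy set has membership 1 - mu(x).
print(f"e(A)   = {fuzzy_entropy(mu):.4f}")
print(f"e(A^c) = {fuzzy_entropy(1.0 - mu):.4f}")  # same value: e(A) = e(A^c)
```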

A Study on the Analysis of Part Commonality and Redundancy in a Product Line by Entropy Measure

  • Ro, Jae-Ho
    • Journal of Industrial Technology / v.3 / pp.39-46 / 1983
  • This paper presents a quantitative measure of the degree of part commonality and redundancy in a product line, based on the entropy measure of information theory. Several possible methods of analysis are discussed, together with the use of the entropy measure. These commonality and redundancy measures can be applied to analyze the usage pattern of parts across a product line and to determine which parts have the broadest usage across the firm's product lines. An analysis of the results by entropy statistics is compared with practical part usage in a simulation of several types of part-usage distributions.
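
As a hypothetical illustration of the idea (the paper's exact commonality index is not reproduced), the sketch below scores each part by the Shannon entropy of its usage distribution across the products in a line: usage spread evenly over all products gives maximal entropy (a common part), while usage concentrated in one product gives zero entropy (a specialized part).

```python
import numpy as np

def usage_entropy(counts):
    """Shannon entropy (bits) of a part's usage counts across products.

    0 when the part appears in a single product; log2(k) when its
    usage is spread evenly over k products.
    """
    counts = np.asarray(counts, float)
    p = counts / counts.sum()
    p = p[p > 0]
    return (p * np.log2(1.0 / p)).sum()

# Hypothetical usage counts of two parts across four products.
common_part = [5, 5, 5, 5]    # even usage -> maximal entropy
special_part = [20, 0, 0, 0]  # single product -> zero entropy
print(f"common part : {usage_entropy(common_part):.2f} bits")   # 2.00
print(f"special part: {usage_entropy(special_part):.2f} bits")  # 0.00
```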


Similarity Measure Construction of the Fuzzy Set for the Reliable Data Selection

  • Lee, Sang-Hyuk
    • The Journal of Korean Institute of Communications and Information Sciences / v.30 no.9C / pp.854-859 / 2005
  • We construct a fuzzy entropy for measuring uncertainty with the help of the relation between distance measure and similarity measure. The proposed fuzzy entropy is constructed through a distance measure; in this study, the Hamming distance measure is used. Also, for measuring the similarity between fuzzy sets or crisp sets, we construct a similarity measure through the distance measure, and the proposed fuzzy entropies and similarity measures are proved.