• Title/Summary/Keyword: entropy theory

Maximum Entropy Principle for Queueing Theory

  • SungJin Ahn;DongHoon Lim;SooTaek Kim
    • Communications for Statistical Applications and Methods, v.4 no.2, pp.497-505, 1997
  • We attempt to obtain a probabilistic model of a queueing system under the maximum entropy condition. Applying the maximum entropy principle to the queueing system, we obtain the most uncertain (least biased) probability model compatible with the available information, expressed as moments. (See the sketch below.)

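A hedged aside for orientation: the textbook special case of this program is easy to compute. Among all distributions on the non-negative integers with a fixed mean queue length, entropy is maximized by a geometric distribution, which coincides with the M/M/1 queue-length law. A minimal Python sketch (the function name and truncation level are illustrative, not from the paper):

```python
import numpy as np

# Maximum-entropy distribution on {0, 1, 2, ...} with a fixed mean queue length.
# The classical solution is geometric: p(n) = (1 - q) * q**n with q = m / (1 + m),
# which coincides with the M/M/1 queue-length law at utilization rho = q.
def max_entropy_queue_dist(mean_len, n_max=200):   # n_max: illustrative truncation
    q = mean_len / (1.0 + mean_len)
    n = np.arange(n_max)
    return (1.0 - q) * q ** n

p = max_entropy_queue_dist(mean_len=3.0)
print(f"mean    = {np.sum(np.arange(p.size) * p):.3f}")
print(f"entropy = {-np.sum(p * np.log(p)):.3f} nats")
```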

Shifting Paradigms in Polymer Crystallization

  • Muthukumar, M.
    • Proceedings of the Polymer Society of Korea Conference, 2006.10a, pp.108-108, 2006
  • The role of the conformational entropy of polymer chains in polymer crystallization is investigated by molecular modeling and theory. The entropy of folded loops dominates at experimentally relevant temperatures, dictating short equilibrium lamellar thicknesses that are much smaller than the extended-chain thickness. Entropic barriers also control the kinetics of polymer crystallization. These chain-entropy-based results differ from the classical views of how polymer chains crystallize.

A NOTE ON THE MAXIMUM ENTROPY WEIGHTING FUNCTION PROBLEM

  • Hong, Dug-Hun;Kim, Kyung-Tae
    • Journal of Applied Mathematics & Informatics, v.23 no.1_2, pp.547-552, 2007
  • In this note, we extend some of the results of Liu [Fuzzy Sets and Systems 157 (2006) 869-878]. The extension consists of a simple proof involving weighting functions and their preference index. We also give an elementary proof of the maximum entropy weighting function problem with a given preference index value, without resorting to advanced machinery such as variational principles or Lagrange multiplier methods. (A numerical sketch follows.)
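
Liu's exact weighting-function setting is not reproduced in the abstract, but the generic phenomenon is easy to compute numerically: maximizing differential entropy on $[0,1]$ under a mean (preference-index-style) constraint yields an exponential-family density. A hedged sketch under those assumptions (the function name and root-bracketing interval are illustrative; SciPy is used for root finding):

```python
import numpy as np
from scipy.optimize import brentq

def max_entropy_weight_density(alpha, xs):
    """Maximum-entropy density on [0,1] with prescribed mean alpha.

    Maximizing differential entropy under a mean constraint yields the
    exponential-family form f(x) = lam * exp(lam*x) / (exp(lam) - 1);
    alpha = 0.5 recovers the uniform density (lam -> 0).
    """
    def mean_of(lam):
        if abs(lam) < 1e-9:                       # uniform limit
            return 0.5
        return 1.0 / (1.0 - np.exp(-lam)) - 1.0 / lam

    lam = brentq(lambda l: mean_of(l) - alpha, -50.0, 50.0)
    if abs(lam) < 1e-9:
        return np.ones_like(xs, dtype=float)
    return lam * np.exp(lam * xs) / (np.exp(lam) - 1.0)

xs = np.linspace(0.0, 1.0, 5)
print(max_entropy_weight_density(0.7, xs))        # density tilted toward x = 1
```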

AN EXTENSION OF JENSEN-MERCER INEQUALITY WITH APPLICATIONS TO ENTROPY

  • Sayyari, Yamin
    • Honam Mathematical Journal, v.44 no.4, pp.513-520, 2022
  • The Jensen and Mercer inequalities are important inequalities in information theory. This article provides a generalization of Mercer's inequality for convex functions on line segments, which contains Mercer's inequality as a particular case. We also investigate bounds for Shannon's entropy and give some new applications to the zeta function and in analysis. (The classical inequality is restated below for reference.)
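
For reference, the classical Jensen-Mercer inequality that the article generalizes reads as follows (this is the standard statement, not the paper's extension): for a convex $f$ on $[a,b]$, points $x_i \in [a,b]$, and weights $w_i \ge 0$ with $\sum_{i=1}^{n} w_i = 1$,

$$f\left(a + b - \sum_{i=1}^{n} w_i x_i\right) \;\le\; f(a) + f(b) - \sum_{i=1}^{n} w_i f(x_i).$$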

A REFINEMENT OF THE JENSEN-SIMIC-MERCER INEQUALITY WITH APPLICATIONS TO ENTROPY

  • Sayyari, Yamin
    • The Pure and Applied Mathematics, v.29 no.1, pp.51-57, 2022
  • The Jensen, Simic, and Mercer inequalities are important inequalities in the theory of inequalities, and many results are devoted to them. In this paper, we first establish an extension of the Jensen-Simic-Mercer inequality. We then investigate bounds for the Shannon entropy of a probability distribution. Finally, we give some new applications in analysis. (A numerical check of the underlying inequality follows.)
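
The paper's refinement itself is not reproduced here; as a sanity check, the underlying Jensen-Mercer inequality can be verified numerically for a convex function of the kind that appears in entropy bounds. All values below are arbitrary test data:

```python
import numpy as np

def jensen_mercer_gap(f, a, b, w, x):
    """Numeric check of the Jensen-Mercer inequality for a convex f.

    Returns RHS - LHS, which must be >= 0:
      f(a + b - sum w_i x_i) <= f(a) + f(b) - sum w_i f(x_i).
    """
    lhs = f(a + b - np.dot(w, x))
    rhs = f(a) + f(b) - np.dot(w, f(x))
    return rhs - lhs

f = lambda t: t * np.log(t)              # convex on (0, inf); used in entropy bounds
w = np.array([0.2, 0.5, 0.3])            # weights summing to 1
x = np.array([0.3, 0.6, 0.9])            # points inside [a, b] = [0.1, 1.0]
print(jensen_mercer_gap(f, 0.1, 1.0, w, x))   # nonnegative gap, here ~0.38
```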

Development of an Item Selection Method for Test-Construction by using a Relationship Structure among Abilities

  • Kim, Sung-Ho;Jeong, Mi-Sook;Kim, Jung-Ran
    • Communications for Statistical Applications and Methods, v.8 no.1, pp.193-207, 2001
  • When designing a test set, we need to consider constraints on items that are deemed important by item developers or test specialists. The constraints essentially concern the components of the test domain, that is, the abilities relevant to a given test set, so if the test domain can be represented in a more refined form, test construction can proceed more efficiently. We assume that the relationships among task abilities are representable by a causal model and that item response theory (IRT) is not fully available for them. In such a case, traditional IRT-based item selection methods cannot be applied. In this paper, we use entropy as an uncertainty measure for making inferences on task abilities and develop an optimal item selection algorithm that, at each step, selects from the item pool the item that most reduces the entropy of the task abilities. (An illustrative greedy sketch follows.)

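The paper's causal-model machinery is not reproduced here; the sketch below only shows the generic greedy idea of picking the item whose (binary) response minimizes the expected posterior entropy over latent ability states. All names and numbers are illustrative assumptions:

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def select_item(prior, likelihoods):
    """Greedy entropy-reduction item selection (illustrative, not the paper's exact algorithm).

    prior:        P(ability state), shape (S,)
    likelihoods:  P(correct | state, item), shape (S, K)
    Picks the item whose answer minimizes the expected posterior entropy.
    """
    best_item, best_exp_H = None, np.inf
    for k in range(likelihoods.shape[1]):
        p_correct = likelihoods[:, k] * prior          # joint P(state, correct)
        p_wrong = (1.0 - likelihoods[:, k]) * prior    # joint P(state, wrong)
        exp_H = 0.0
        for branch in (p_correct, p_wrong):
            mass = branch.sum()                        # P(answer)
            if mass > 0:
                exp_H += mass * entropy(branch / mass) # P(answer) * H(posterior)
        if exp_H < best_exp_H:
            best_item, best_exp_H = k, exp_H
    return best_item

prior = np.array([0.25, 0.25, 0.25, 0.25])             # four latent ability states
likes = np.array([[0.2, 0.9], [0.4, 0.8], [0.6, 0.3], [0.8, 0.1]])
print(select_item(prior, likes))                       # item with largest expected entropy drop
```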

Deriving a New Divergence Measure from Extended Cross-Entropy Error Function

  • Oh, Sang-Hoon;Wakuya, Hiroshi;Park, Sun-Gyu;Noh, Hwang-Woo;Yoo, Jae-Soo;Min, Byung-Won;Oh, Yong-Sun
    • International Journal of Contents, v.11 no.2, pp.57-62, 2015
  • Relative entropy is a divergence measure between two probability density functions of a random variable. When the random variable has a two-letter alphabet, relative entropy reduces to the cross-entropy error function, which can accelerate the training convergence of multilayer perceptron neural networks. The n-th order extension of the cross-entropy (nCE) error function further improves learning convergence and generalization capability. In this paper, we derive a new divergence measure between two probability density functions from the nCE error function and compare it with relative entropy using three-dimensional plots. (The base quantities are sketched below.)
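
The nCE error function itself is defined in the paper and is not reproduced here; for orientation, the two base quantities the abstract builds on, relative entropy and the two-alphabet cross-entropy error, can be sketched as:

```python
import numpy as np

def kl_divergence(p, q):
    """Relative entropy D(p||q) between two discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                                  # 0 * log 0 treated as 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def binary_cross_entropy(t, y):
    """Cross-entropy error for a two-letter alphabet: target t, output y in (0,1)."""
    return -(t * np.log(y) + (1 - t) * np.log(1 - y))

print(kl_divergence([0.7, 0.3], [0.5, 0.5]))   # ~0.0823 nats
print(binary_cross_entropy(1.0, 0.8))          # ~0.2231
```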

Rough Entropy-based Knowledge Reduction using Rough Set Theory

  • Park, In-Kyoo
    • Journal of Digital Convergence, v.12 no.6, pp.223-229, 2014
  • To retrieve information useful for efficient decision-making in a large knowledge system, refined feature selection is generally necessary and important. Rough sets have difficulty generating optimal reducts and classifying boundary objects. In this paper, we propose a quick reduction algorithm that generates optimal features by rough entropy analysis of condition and decision attributes, improving on these limitations. We define a new conditional information entropy for efficient feature extraction and describe a feature selection procedure that ranks the significance of features. Through simulations on five datasets from the UCI repository, we compare our rough-set-based feature selection approach with other selection methods. The results show that our method achieves better classification accuracy for feature selection than the previous approaches. (An entropy-based reduct sketch follows.)
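
The paper's exact rough-entropy definition is not given in the abstract; the sketch below shows only the generic pattern of a conditional-entropy-driven quick reduct (greedy forward selection). The data layout and all names are assumptions:

```python
from collections import Counter, defaultdict
import math

def conditional_entropy(rows, attrs, decision):
    """H(decision | attrs) over a decision table given as a list of dicts."""
    groups = defaultdict(list)
    for r in rows:
        groups[tuple(r[a] for a in attrs)].append(r[decision])
    n = len(rows)
    h = 0.0
    for vals in groups.values():
        w = len(vals) / n                         # P(attribute block)
        for c in Counter(vals).values():
            p = c / len(vals)                     # P(decision | block)
            h -= w * p * math.log2(p)
    return h

def quick_reduct(rows, attrs, decision):
    """Greedy forward selection: repeatedly add the attribute that most
    lowers H(decision | selected) until the full-table entropy is reached."""
    reduct, remaining = [], list(attrs)
    target = conditional_entropy(rows, attrs, decision)
    while conditional_entropy(rows, reduct, decision) > target + 1e-12:
        best = min(remaining,
                   key=lambda a: conditional_entropy(rows, reduct + [a], decision))
        reduct.append(best)
        remaining.remove(best)
    return reduct

rows = [{"a": 0, "b": 1, "d": "yes"}, {"a": 1, "b": 1, "d": "no"},
        {"a": 0, "b": 0, "d": "yes"}, {"a": 1, "b": 0, "d": "no"}]
print(quick_reduct(rows, ["a", "b"], "d"))   # ['a'] already determines d
```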

Shannon's Information Theory and Document Indexing

  • Chung Young Mee
    • Journal of the Korean Society for Library and Information Science, v.6, pp.87-103, 1979
  • Information storage and retrieval is part of the general communication process. In Shannon's information theory, the information contained in a message is a measure of uncertainty about the information source, and the amount of information is measured by entropy. Indexing is a process of reducing the entropy of the information source, since the document collection is divided into many smaller groups according to the subjects the documents deal with. Significant concepts contained in every document are mapped into the set of all sets of index terms; the index itself is thus formed by paired sets of index terms and documents. Without indexing, the entropy of a document collection consisting of $N$ documents is $\log_2 N$, whereas the average entropy of the smaller groups $W_1, W_2, \ldots, W_m$ is as small as $\left(\sum_{i=1}^{m} H(W_i)\right)/m$. Retrieval efficiency is a measure of an information system's performance, which is largely affected by the goodness of the index. If all and only the documents judged relevant to a user's query can be retrieved, the information system is said to be 100% efficient. A document file $W$ may potentially be classified into two sets of documents, relevant and non-relevant to a specific query. After retrieval, the document file $W'$ is reclassified into four sets: relevant-retrieved, relevant-not retrieved, non-relevant-retrieved, and non-relevant-not retrieved. The paper shows that the difference between the two entropies of document file $W$ and document file $W'$ is a proper measure of retrieval efficiency. (A minimal numeric illustration follows.)

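The abstract's arithmetic is easy to reproduce: indexing replaces the entropy $\log_2 N$ of the whole collection with the average entropy of the groups. A minimal sketch, taking each group entropy as $\log_2$ of the group size, per the abstract:

```python
import math

def collection_entropy(n_docs):
    """Entropy of an unindexed collection of N equally likely documents."""
    return math.log2(n_docs)

def avg_group_entropy(group_sizes):
    """Average entropy (sum_i H(W_i)) / m after indexing splits the
    collection into m groups, with H(W_i) = log2 |W_i| as in the abstract."""
    return sum(math.log2(s) for s in group_sizes) / len(group_sizes)

print(collection_entropy(1024))                 # 10.0 bits without indexing
print(avg_group_entropy([256, 256, 256, 256]))  # 8.0 bits after indexing
```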