• Title/Summary/Keyword: Shannon's Information Theory

Search Results: 22

A Study on the Relative Motivation of Shannon's Information Theory (샤논 정보이론의 상관성 동기에 관한 연구)

  • Lee, Moon-Ho;Kim, Jeong-Su
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.21 no.3
    • /
    • pp.51-57
    • /
    • 2021
  • In this paper, the relevance between Einstein's special theory of relativity (1905) and Bernoulli's fluid mechanics (1738), which motivates Shannon's theorem (1948), was derived from the AB=A/A=I dimension, and a channel code for Shannon's theorem was simulated. When Bernoulli's hydrostatic relation ΔP=ρgh was applied to the magma eruption of the Hallasan volcano, the dimensions and heights matched the measured values. The relationship between Einstein's special theory of relativity, Shannon's information theory, and the stack-effect theory of fluid mechanics was analyzed, and its relationship to volcanic eruptions was proven mathematically. Einstein's and Bernoulli's conservation of energy and conservation of mass were shown to be equivalent to the bandwidth and power efficiency in Shannon's theorem.
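The hydrostatic relation ΔP = ρgh cited in the abstract is straightforward to evaluate. A minimal sketch, with illustrative values only (the magma density and column height below are assumptions, not figures taken from the paper):

```python
# Hydrostatic pressure difference, Delta_P = rho * g * h (Bernoulli, 1738).
# Density and height values are illustrative assumptions only.
RHO_MAGMA = 2700.0   # kg/m^3, assumed basaltic magma density
G = 9.81             # m/s^2, gravitational acceleration
H = 1950.0           # m, roughly the summit height of Hallasan (assumption)

delta_p = RHO_MAGMA * G * H  # pressure difference in pascals
print(f"Delta P = {delta_p:.3e} Pa")
```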

A Meeting of Euler and Shannon (오일러(Euler)와 샤논(Shannon)의 만남)

  • Lee, Moon-Ho
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.17 no.1
    • /
    • pp.59-68
    • /
    • 2017
  • Flowers and women are beautiful, but Euler's theorem and symmetry are the most beautiful of all. Shannon applied Euler's theorem to information and communication; his theorem is the root of wireless communication and information theory and the principle behind today's smartphones. The meeting point of the two is the $e^{-SNR}$ term of MIMO (multiple-input multiple-output) antenna diversity. In this paper, Euler, who discovered the most beautiful formula in the world ($e^{\pi i}+1=0$), briefly guides us to Shannon's formula ($C=B\log_2(1+\frac{S}{N})$) to reveal the origin of wireless and information communication, and the two masters are shown to meet at the Shannon limit. The hidden secret behind this meeting, we find, is symmetry and the element-wise inverse in algebraic coding theory and trigonometric functions.
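Shannon's formula quoted in the abstract, C = B log2(1 + S/N), can be evaluated directly. A minimal sketch with assumed example values:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon channel capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Example (assumed values): 1 MHz bandwidth at a linear SNR of 15 (~11.8 dB).
c = shannon_capacity(1e6, 15.0)
print(f"C = {c:.0f} bit/s")  # log2(16) = 4 bit/s/Hz -> 4,000,000 bit/s
```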

MODEL BASED DIAGNOSTICS FOR A GEARBOX USING INFORMATION THEORY

  • Choi, J.;Bryant, M.D.
    • Proceedings of the Korean Society of Tribologists and Lubrication Engineers Conference
    • /
    • 2002.10b
    • /
    • pp.459-460
    • /
    • 2002
  • This article discusses a diagnostics method based on models and information theory. From an extensive system-dynamics bond-graph model of a gearbox [1], various cases germane to this diagnostics approach were simulated, including the response of an ideal gearbox, which functions perfectly to the designer's specifications, and degraded gearboxes with tooth-root cracking. By comparing these cases and constructing a signal-flow analogy between the gearbox and a communication channel, Shannon's information theory [2], including its theorems, was applied to the gearbox to assess system health in terms of its ability to function.


Calculation of the Entropy and Free Energy of a Polymer during Polymer Collapse for EDISON App Development and Education (EDISON 앱 개발 및 교육을 위한 Polymer Collapse 중 Polymer의 Entropy 및 Free Energy 계산)

  • Park, Yun-Jae;Jang, Rak-U
    • Proceeding of EDISON Challenge
    • /
    • 2017.03a
    • /
    • pp.75-81
    • /
    • 2017
  • Much research has been carried out on the polymer collapse transition, but the entropy and free energy of each individual microstate could not be calculated. Recently, a paper on local nonequilibrium thermodynamics was published: it identified and characterized a physical quantity that determines the probability distribution over individual microstates in a nonequilibrium state, and showed that the quantity of "information" carried by a particular state is correlated with the internal energy and the entropy. It also used the Shannon entropy of information theory to define entropy and to calculate quantities such as the free energy. Accordingly, using the Shannon entropy of information theory and the free energy defined from it, we calculate the entropy and free energy during polymer collapse.
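The Shannon-entropy-based free energy the abstract describes can be sketched for a toy set of microstates. The distribution, energies, and temperature below are purely illustrative assumptions (with k_B = 1), not data from the paper:

```python
import math

def shannon_entropy(probs):
    """S = -sum p_i ln p_i (k_B = 1), skipping zero-probability states."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Toy microstate distribution and energies; purely illustrative values.
probs = [0.5, 0.3, 0.2]
energies = [0.0, 1.0, 2.0]   # arbitrary energy units
T = 1.0                      # temperature (k_B = 1)

U = sum(p * e for p, e in zip(probs, energies))  # mean internal energy
S = shannon_entropy(probs)                       # Shannon entropy in nats
F = U - T * S                                    # Helmholtz-style free energy
print(U, S, F)
```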


Application of information theory to wireless cooperative communications (정보이론의 협력무선통신에의 응용)

  • Kim, Yoon-Hyun;Yang, Jae-Soo;Ha, Kwang-Jun;Kim, Jin-Young
    • Conference Proceedings of the Korean Institute of Information and Communication Facilities Engineering (한국정보통신설비학회 학술대회논문집)
    • /
    • 2009.08a
    • /
    • pp.337-343
    • /
    • 2009
  • Information theory is a field of applied mathematics that quantifies data in order to store as much data as possible in a medium or to communicate it through a channel. It was introduced in Shannon's 1948 paper. Building on that paper, dramatic developments have been achieved in communication, signal processing, and data processing and transmission in networks. In this paper, the basic concepts of information theory are described, covering the meaning of information, entropy, and channel capacity. We also show how information theory is applied to sensor networks, relay channels, MIMO systems, and other fields.


Shannon's Information Theory and Document Indexing (Shannon의 정보이론과 문헌정보)

  • Chung Young Mee
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.6
    • /
    • pp.87-103
    • /
    • 1979
  • Information storage and retrieval is part of the general communication process. In Shannon's information theory, the information contained in a message is a measure of uncertainty about the information source, and the amount of information is measured by entropy. Indexing is a process of reducing the entropy of an information source, since the document collection is divided into many smaller groups according to the subjects the documents deal with. Significant concepts contained in every document are mapped into the set of all sets of index terms; thus the index itself is formed by paired sets of index terms and documents. Without indexing, the entropy of a document collection consisting of N documents is $\log_2 N$, whereas the average entropy of the smaller groups $(W_1, W_2, \ldots, W_m)$ is as small as $(\sum_{i=1}^{m} H(W_i))/m$. Retrieval efficiency is a measure of an information system's performance, which is largely affected by the goodness of the index. If all and only the documents evaluated as relevant to a user's query can be retrieved, the information system is said to be 100% efficient. A document file W may potentially be classified into two sets of documents, relevant and non-relevant to a specific query. After retrieval, the document file W' is reclassified into four sets: relevant-retrieved, relevant-not retrieved, non-relevant-retrieved, and non-relevant-not retrieved. It is shown in the paper that the difference between the two entropies of document file W and document file W' is a proper measure of retrieval efficiency.
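The entropy reduction that the abstract attributes to indexing can be illustrated numerically. The collection size and grouping below are assumed toy values; with uniform document probabilities, the entropy of an n-document set is log2 n:

```python
import math

def entropy_uniform(n: int) -> float:
    """Entropy of picking one document uniformly from n: log2(n) bits."""
    return math.log2(n)

# Assumed toy collection: 16 documents, unindexed -> entropy log2(16) = 4 bits.
N = 16
h_whole = entropy_uniform(N)

# Indexed into m = 4 equal subject groups of 4 documents each:
groups = [4, 4, 4, 4]
h_avg = sum(entropy_uniform(g) for g in groups) / len(groups)  # log2(4) = 2 bits

print(h_whole, h_avg)  # indexing reduces the average entropy from 4 to 2 bits
```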


Shannon's Function (Shannon의 함수)

  • Yi, Beom-Jun
    • Journal of the Korean Society of Oceanography (한국해양학회지)
    • /
    • v.14 no.1
    • /
    • pp.32-38
    • /
    • 1979
  • The original concept and theory of Shannon's function $H=-\sum_{i=1}^{n} p_i \log_2 p_i$ and the domains in which it is applicable in ecology are discussed. The confusions that exist in the use and interpretation of this function are due to: 1. mixing the idea of ecological diversity proper with that of Shannon's information theory; 2. confusing physical or thermodynamical systems with ecological systems; 3. confusing the system from which function H was calculated with the system for which function H is interpreted. It is proposed to use function H for comparing community structure and thus for distinguishing the steps of a community's evolution (succession).
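Shannon's function as used for community comparison is a one-liner over species proportions. A minimal sketch with two hypothetical communities (the counts are illustrative assumptions):

```python
import math

def shannon_diversity(counts):
    """H = -sum p_i * log2(p_i) over species proportions p_i = n_i / N."""
    total = sum(counts)
    return -sum((n / total) * math.log2(n / total) for n in counts if n > 0)

# Two hypothetical 4-species communities (illustrative counts only).
even   = [25, 25, 25, 25]   # perfectly even community -> H = log2(4) = 2 bits
skewed = [85, 5, 5, 5]      # dominated by one species -> lower H

print(shannon_diversity(even), shannon_diversity(skewed))
```

The even community attains the maximum H = log2 n, which is why H is usable for comparing community structure across succession steps.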


Characterization of New Two Parametric Generalized Useful Information Measure

  • Bhat, Ashiq Hussain;Baig, M. A. K.
    • Journal of Information Science Theory and Practice
    • /
    • v.4 no.4
    • /
    • pp.64-74
    • /
    • 2016
  • In this paper we define a new two-parametric generalized useful average code-word length $L_{\alpha}^{\beta}(P;U)$, and its relationship with a new two-parametric generalized useful information measure $H_{\alpha}^{\beta}(P;U)$ is discussed. The lower and upper bounds of $L_{\alpha}^{\beta}(P;U)$ in terms of $H_{\alpha}^{\beta}(P;U)$ are derived for a discrete noiseless channel. The measures defined in this communication are not only new, but some well-known measures that already exist in the literature of useful information theory are particular cases of our proposed measures. The noiseless coding theorems for a discrete channel proved in this paper are verified by applying Huffman and Shannon-Fano coding schemes to empirical data. We also study the monotonic behavior of $H_{\alpha}^{\beta}(P;U)$ with respect to the parameters ${\alpha}$ and ${\beta}$, and the important properties of $H_{\alpha}^{\beta}(P;U)$.
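The classical noiseless coding bound that such generalized measures extend, H(P) ≤ L < H(P) + 1, can be checked with Shannon-Fano-style code lengths l_i = ⌈-log2 p_i⌉. A minimal sketch over an assumed source distribution (not the paper's empirical data):

```python
import math

def shannon_code_lengths(probs):
    """Shannon-Fano style lengths l_i = ceil(-log2 p_i); satisfies Kraft's inequality."""
    return [math.ceil(-math.log2(p)) for p in probs]

probs = [0.4, 0.3, 0.2, 0.1]                     # illustrative source distribution
lengths = shannon_code_lengths(probs)

H = -sum(p * math.log2(p) for p in probs)        # source entropy in bits
L = sum(p * l for p, l in zip(probs, lengths))   # average code-word length

assert H <= L < H + 1                            # classical noiseless coding bound
print(H, L, lengths)
```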

AN EXTENSION OF JENSEN-MERCER INEQUALITY WITH APPLICATIONS TO ENTROPY

  • Yamin, Sayyari
    • Honam Mathematical Journal
    • /
    • v.44 no.4
    • /
    • pp.513-520
    • /
    • 2022
  • The Jensen and Mercer inequalities are very important inequalities in information theory. This article provides a generalization of Mercer's inequality for convex functions on line segments, which contains Mercer's inequality as a particular case. We also investigate bounds for Shannon's entropy and give some new applications to the zeta function and analysis.
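One standard entropy bound of the kind the abstract mentions follows from Jensen's inequality applied to the convex function x log2 x: H(p) ≤ log2 n, with equality only for the uniform distribution. A quick numeric check on an assumed distribution:

```python
import math

def entropy(probs):
    """Shannon entropy H(p) = -sum p_i * log2(p_i) in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Jensen-type bound: H(p) <= log2(n) for any distribution over n symbols.
p = [0.7, 0.1, 0.1, 0.1]   # illustrative distribution, n = 4
assert entropy(p) <= math.log2(len(p))
print(entropy(p), math.log2(len(p)))
```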

Encounter of Lattice-type coding with Wiener's MMSE and Shannon's Information-Theoretic Capacity Limits in Quantity and Quality of Signal Transmission (신호 전송의 양과 질에서 위너의 MMSE와 샤논의 정보 이론적 정보량 극한 과 격자 코드 와의 만남)

  • Park, Daechul;Lee, Moon Ho
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.50 no.8
    • /
    • pp.83-93
    • /
    • 2013
  • By comparing Wiener's MMSE for stochastic signal transmission with Shannon's mutual information, first proved by C. E. Shannon in terms of information theory, the connections between the two approaches were investigated. What Wiener wanted to capture in signal transmission over a noisy channel were the fundamental limits of signal quality in signal estimation. Shannon, on the other hand, was interested in finding the fundamental limits of signal quantity by maximizing the uncertainty in the mutual information, using the entropy concept, over a noisy channel. The first concern of this paper is to show that, in deriving the limits of Shannon's fundamental point-to-point channel capacity, Shannon's mutual information obtained by exploiting an MMSE combiner and the Wiener filter's MMSE are interrelated by an integro-differential equation. Then, at the meeting point of Wiener's MMSE and Shannon's mutual information, the upper bound on spectral efficiency and the lower bound on energy efficiency are computed. Choosing a proper lattice-type code for a mod-${\Lambda}$ AWGN channel model together with MMSE estimation of ${\alpha}$ is confirmed to lead to the fundamental Shannon capacity limits.
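The MMSE/mutual-information connection the abstract discusses has a standard scalar instance, the I-MMSE identity for a Gaussian channel Y = sqrt(snr)·X + N with X, N ~ N(0,1): I(snr) = ½ ln(1+snr) nats, mmse(snr) = 1/(1+snr), and dI/d(snr) = mmse(snr)/2. This is a sketch of that textbook identity, not the paper's integro-differential derivation:

```python
import math

def mutual_info(snr: float) -> float:
    """I(snr) = 0.5 * ln(1 + snr) nats, Gaussian input on a scalar Gaussian channel."""
    return 0.5 * math.log(1.0 + snr)

def mmse(snr: float) -> float:
    """MMSE of estimating X from Y = sqrt(snr)*X + N: 1 / (1 + snr)."""
    return 1.0 / (1.0 + snr)

# I-MMSE relation: dI/dsnr = mmse(snr) / 2, checked by central finite differences.
snr, h = 3.0, 1e-6
dI = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)
assert abs(dI - mmse(snr) / 2) < 1e-6
print(dI, mmse(snr) / 2)
```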