• Title/Summary/Keyword: Shannon entropy

ONLINE TEST BASED ON MUTUAL INFORMATION FOR TRUE RANDOM NUMBER GENERATORS

  • Kim, Young-Sik;Yeom, Yongjin;Choi, Hee Bong
    • Journal of the Korean Mathematical Society / v.50 no.4 / pp.879-897 / 2013
  • Shannon entropy is one of the most widely used randomness measures, especially for cryptographic applications. However, conventional entropy tests are not very sensitive to inter-bit dependency in random samples. In this paper, we propose new online randomness test schemes for true random number generators (TRNGs) based on the mutual information between consecutive $\kappa$-bit output blocks, which test for inter-bit dependency in random samples. By estimating the block entropies of distinct lengths at the same time, it is possible to measure the mutual information, which is closely related to the amount of statistical dependency between two consecutive data blocks. In addition, we propose a new entropy estimation method that accumulates intermediate frequency counts. The proposed method can estimate entropy with fewer samples than the Maurer-Coron type entropy test. Numerical simulations show that the proposed scheme can serve as a reliable online entropy estimator for TRNGs used in cryptographic modules.
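The block-wise mutual information statistic described above can be sketched with a simple plug-in estimator. This is a minimal illustration, not the paper's method: the function names are ours, and the paper's accumulated-frequency estimator is more sample-efficient than the direct frequency counting used here.

```python
import math
from collections import Counter

def block_entropy(bits, k, stride=None):
    """Plug-in (maximum-likelihood) Shannon entropy of k-bit blocks, in bits."""
    stride = stride or k
    blocks = [tuple(bits[i:i + k]) for i in range(0, len(bits) - k + 1, stride)]
    counts = Counter(blocks)
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def block_mutual_information(bits, k):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for consecutive k-bit blocks X and Y.

    For an ideal TRNG this is close to 0; clearly positive values indicate
    inter-block statistical dependency.
    """
    h_k = block_entropy(bits, k)                 # marginal entropy of k-bit blocks
    h_2k = block_entropy(bits, 2 * k, stride=k)  # joint entropy of consecutive pairs
    return 2 * h_k - h_2k
```

For a perfectly alternating bit stream the estimate approaches 1 bit (each block fully determines the next), while a constant stream gives 0.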

Use of gaze entropy to evaluate situation awareness in emergency accident situations of nuclear power plant

  • Lee, Yejin;Jung, Kwang-Tae;Lee, Hyun-Chul
    • Nuclear Engineering and Technology / v.54 no.4 / pp.1261-1270 / 2022
  • This study investigated the possibility of using gaze entropy to evaluate an operator's situation awareness in an emergency accident situation of a nuclear power plant. Gaze entropy can be an effective measure for evaluating an operator's situation awareness because it expresses gaze movement as a single comprehensive number. To determine the relationship between situation awareness and gaze entropy, an experiment was conducted that measured both using simulators of the emergency accident situations LOCA, SGTR, SLB, and LOV; the task was to judge the accident situation presented in the simulator. The results showed that situation awareness had a significant negative correlation with Shannon entropy, dwell-time entropy, and Markov entropy, while visual attention entropy (VAE) showed no significant correlation with situation awareness. These results indicate that Shannon entropy, dwell-time entropy, and Markov entropy can be used as measures to evaluate situation awareness.
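Two of the gaze-entropy measures named above can be sketched as follows. This is a generic illustration under common definitions, not the study's exact computation; the AOI labels are hypothetical, and dwell-time entropy (not shown) would additionally weight each area of interest by fixation duration.

```python
import math
from collections import Counter

def stationary_gaze_entropy(fixations):
    """Shannon (stationary) gaze entropy: entropy of the distribution of
    fixations over areas of interest (AOIs), in bits."""
    counts = Counter(fixations)
    n = len(fixations)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def markov_gaze_entropy(fixations):
    """Markov (transition) gaze entropy: entropy of the next AOI given the
    current AOI, averaged over the observed transitions."""
    transitions = Counter(zip(fixations, fixations[1:]))
    source_counts = Counter(src for src, _ in transitions.elements())
    n = len(fixations) - 1
    h = 0.0
    for (src, dst), c in transitions.items():
        h -= (c / n) * math.log2(c / source_counts[src])
    return h

# Hypothetical AOI sequence: systematic scanning between two displays gives
# maximal stationary entropy but zero transition entropy.
scanpath = ["A", "B"] * 20
```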

Estimation for the Variation of the Concentration of Greenhouse Gases with Modified Shannon Entropy (변형된 샤논 엔트로피식을 이용한 온실가스 농도변화량 예측)

  • Kim, Sang-Mok;Lee, Do-Haeng;Choi, Eol;Koh, Mi-Sol;Yang, Jae-Kyu
    • Journal of Environmental Science International / v.22 no.11 / pp.1473-1479 / 2013
  • Entropy is a measure of disorder or uncertainty. The term has been used qualitatively in environmental science to describe correlations with pollution. In this research, three distinct entropies were defined and characterized in order to quantify that qualitative usage. We deal with newly defined entropies $E_1$, $E_2$, and $E_3$, derived from Shannon entropy in information theory, with the concentrations of the three major greenhouse gases $CO_2$, $N_2O$, and $CH_4$ treated as the probability variables. First, $E_1$ evaluates the total entropy from the concentration difference of each greenhouse gas over three periods: the industrial revolution, the post-industrial revolution, and the information revolution. Next, $E_2$ evaluates the entropy with a logarithm base that increases with the accumulated time unit. Lastly, $E_3$ evaluates the entropy with the logarithm base fixed at 2 as a function of time. The analytical results are as follows. $E_1$ indicates the degree of prediction reliability with respect to the variation of greenhouse gases: as $E_1$ increases, the concentration variation stabilizes, following a linear correlation. $E_2$ is a valid indicator for mutual comparison among the greenhouse gases. Although $E_3$ varies locally within specific periods, it eventually follows a logarithmic curve, a pattern similar to that observed for thermodynamic entropy.
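The base-change idea behind the $E_2$/$E_3$ distinction can be sketched in a few lines. The abstract does not fully specify the paper's $E_1$, $E_2$, $E_3$ formulas, so this only shows the generic mechanism with made-up concentration shares: changing the logarithm base rescales Shannon entropy by a constant factor.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy of a discrete distribution for an arbitrary log base."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Hypothetical normalized concentration shares of CO2, N2O, and CH4
shares = [0.7, 0.2, 0.1]
h_base2 = shannon_entropy(shares, base=2)  # E3-style: base fixed at 2
h_base3 = shannon_entropy(shares, base=3)  # E2-style: a larger base over time
# Changing the base only rescales the value: H_b(p) = H_2(p) / log2(b)
```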

Identifying Suspended Particulate Matters in an Urban Coastal System: Significance and Application of Particle Size Analysis

  • Ahn, Jong-Ho
    • Environmental Engineering Research / v.17 no.3 / pp.167-174 / 2012
  • In situ particle size spectra were obtained from two sequential cruises in order to evaluate the physical consequences of suspended particulate matter delivered by episodic storm runoff from the Santa Ana River watershed, an urbanized coastal watershed. Suspended particles from various sources, including surface runoff, near-bed resuspension, and phytoplankton, are identified by empirical orthogonal function (EOF) analysis and an entropy-based parameterization (Shannon entropy). The first EOF mode is associated with high turbidity and fine particles, as indicated by the elevated beam attenuation near the Santa Ana River and Newport Bay outlets, and the second EOF mode explains the suspended sediment dispersal and particle coarsening in the near-surface plume. Chlorophyll particles are also distinguished by negative magnitudes of the first EOF mode, which is supported by the relationship between fluorescence and beam attenuation. The combined use of the first EOF mode and the Shannon entropy index accentuates the characteristics of two different structures and/or sources of sediment particles: the near-surface plumes originate from runoff outflow, while the near-bottom particles are resuspended by increased wave heights or mobilized bottom turbidity currents. In a coastal pollution context, these methods may offer a useful means of characterizing particle-associated pollutants for source tracking and environmental interpretation.
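The two analysis tools named above, EOF decomposition and a Shannon entropy index per size spectrum, can be sketched generically. This is not the study's processing chain; it only shows the standard SVD route to EOF modes and a plug-in entropy over normalized size bins.

```python
import numpy as np

def eof_modes(spectra):
    """EOF analysis of a (stations x size-bins) matrix: remove the column
    means and take the SVD of the anomalies. Returns the mode amplitudes
    (u * s) and the mode patterns (rows of vt)."""
    anomalies = spectra - spectra.mean(axis=0)
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    return u * s, vt

def spectral_shannon_entropy(spectrum):
    """Shannon entropy (bits) of one normalized particle size spectrum:
    low when volume concentrates in a few size bins, high for a broad,
    well-mixed spectrum."""
    p = np.asarray(spectrum, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())
```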

Shannon's Information Theory and Document Indexing (Shannon의 정보이론과 문헌정보)

  • Chung Young Mee
    • Journal of the Korean Society for Library and Information Science / v.6 / pp.87-103 / 1979
  • Information storage and retrieval is part of the general communication process. In Shannon's information theory, the information contained in a message is a measure of uncertainty about the information source, and the amount of information is measured by entropy. Indexing is a process of reducing the entropy of the information source, since the document collection is divided into many smaller groups according to the subjects the documents deal with. Significant concepts contained in every document are mapped into the set of all sets of index terms; thus the index itself is formed by paired sets of index terms and documents. Without indexing, the entropy of a document collection consisting of N documents is $\log_2 N$, whereas the average entropy of the smaller groups $(W_1, W_2, \ldots, W_m)$ is as small as $(\sum_{i=1}^{m} H(W_i))/m$. Retrieval efficiency is a measure of an information system's performance, which is largely affected by the goodness of the index. If all and only the documents evaluated as relevant to a user's query can be retrieved, the information system is said to be $100\%$ efficient. A document file $W$ may be potentially classified into two sets: documents relevant and non-relevant to a specific query. After retrieval, the document file $W'$ is reclassified into four sets: relevant-retrieved, relevant-not retrieved, non-relevant-retrieved, and non-relevant-not retrieved. It is shown in the paper that the difference between the two entropies of document file $W$ and document file $W'$ is a proper measure of retrieval efficiency.
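The entropy-reduction argument above can be checked numerically under the simplest uniform model, in which every document in a group is equally likely. The function names and the example sizes are ours, for illustration only.

```python
import math

def collection_entropy(n_docs):
    """Entropy of identifying one document in an unindexed collection of
    N equally likely documents: log2(N) bits."""
    return math.log2(n_docs)

def average_group_entropy(group_sizes):
    """Average entropy after indexing splits the collection into m groups:
    (sum of H(W_i)) / m, with H(W_i) = log2(|W_i|) under the uniform model."""
    return sum(math.log2(g) for g in group_sizes) / len(group_sizes)

# 16 documents unindexed: log2(16) = 4 bits of uncertainty.
# Indexed into 4 subject groups of 4: average group entropy log2(4) = 2 bits.
```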

The RNA Base Over 95% of Onju Citrus and Coffee Genes Cut & Paste Based on The BCJM Matrix with Chargaff-Shannon Entropy (BCJM 행렬 및 Chargaff 법칙과 Shannon Entropy에 의한 RNA 유전자 비율이 95%이상인 온주감귤과 귤의 유전자 조합)

  • Lee, Sung Kook;Kim, Jeong Su;Lee, Moon Ho
    • The Journal of the Convergence on Culture Technology / v.8 no.4 / pp.415-422 / 2022
  • The heterogeneous Onju citrus genes (A=20.57, C=32.71, G=30.01, U=16.71%) and coffee genes (A=20.66, C=31.76, G=30.187, U=16.71%) share the same genetic ratio of 95% or more. Gene compatibility is generally considered impossible for this group. However, grafting becomes possible if the conditions of Chargaff's rule and Shannon entropy are met with a gene functional similarity of more than 95%, yielding a new breed, Coffrange. We calculated the world's first BCJM matrix for DNA-RNA and published it in US patents and international journals. All animals and viruses are similar to human genes; on this basis, the genetic characteristics of COVID-19 and the human body were solved and announced in June in a British matrix textbook. In plants, BCJM-Transposon treatment, a technique that easily changes gene location, is applied. Simulation predicted that the matrix could succeed with Cut & Paste and Transpose.

Entropy-Constrained Sample-Adaptive Product Quantizer Design for the High Bit-Rate Quantization (고 전송률 양자화를 위한 엔트로피 제한 표본 적응 프로덕트 양자기 설계)

  • Kim, Dong-Sik
    • Journal of the Institute of Electronics Engineers of Korea SP / v.49 no.1 / pp.11-18 / 2012
  • In this paper, an entropy-constrained vector quantizer for high bit rates is proposed. The sample-adaptive product quantizer (SAPQ), which is based on product codebooks, is employed, and a design algorithm for the entropy-constrained sample-adaptive product quantizer (ECSAPQ) is proposed. The performance of the proposed ECSAPQ is better than that of the entropy-constrained vector quantizer by 0.5 dB. It is also shown that the ECSAPQ distortion curve, which is based on the scalar quantizer, lies below the high-rate theoretical curve of the entropy-constrained scalar quantizer, where the theoretical curve has a 1.53 dB difference from Shannon's lower bound.
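The 1.53 dB figure quoted above can be checked from the standard high-rate analysis. This is a back-of-envelope sketch, not the paper's derivation: an entropy-constrained uniform scalar quantizer with step size $\Delta$ achieves distortion $\Delta^2/12$ at rate $R \approx h(X) - \log_2 \Delta$, while the Shannon lower bound gives $2^{2(h(X)-R)}/(2\pi e)$; the ratio is $\pi e/6$, independent of the source and the rate.

```python
import math

# Ratio of high-rate ECSQ distortion (Delta^2 / 12) to the Shannon lower
# bound (2**(2*(h - R)) / (2*pi*e)) at equal rate: pi*e/6.
gap_ratio = math.pi * math.e / 6
gap_db = 10 * math.log10(gap_ratio)    # distortion gap in dB (~1.53)
gap_bits = 0.5 * math.log2(gap_ratio)  # equivalent rate gap in bits/sample
```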

An Analysis of Quality Efficiency of Loan Consultants in a Bank using Shannon's Entropy and PCA-DEA Model (Entropy와 PCA-DEA 모형을 이용한 은행 대출상담사의 서비스 품질 효율성 분석)

  • Choi, Jang Ki;Kim, Kyeongtaek;Suh, Jae Joon
    • Journal of Korean Society of Industrial and Systems Engineering / v.40 no.3 / pp.7-17 / 2017
  • Loan consultants assist clients with loan application processing and loan decisions. Their duties may include contacting people to ask whether they want a loan, meeting with loan applicants, and explaining different loan options. We studied the efficiency of the service quality of loan consultants contracted to a bank in Korea. They work independently rather than as a team, and since a consultant is not an employee of the bank, he or she is paid solely in proportion to the loans he or she sells. In this study, each consultant is treated as a decision making unit (DMU) in the DEA (Data Envelopment Analysis) model. We use a principal component analysis-data envelopment analysis (PCA-DEA) model integrated with Shannon's entropy to evaluate the quality efficiency of the consultants, adopting a three-stage process. In the first stage, we use PCA to obtain 6 synthetic indicators, 4 input and 2 output, from the results of a survey whose questionnaire items are constructed on the basis of the SERVQUAL model. In the second stage, 3 DEA models allowing negative values are used to calculate the relative efficiency of each DMU. In the third stage, the weight of each result is calculated on the basis of Shannon's entropy theory, and a comprehensive efficiency score is generated from it. An example illustrates the proposed process of evaluating the relative quality efficiency of the loan consultants and how to use the result to improve their service quality.
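The third stage, entropy-based weighting of the three DEA results, can be sketched with one common formulation of the entropy-weight method. The paper's exact weighting may differ in detail, and the scores below are made up: columns whose scores vary more across DMUs have lower normalized entropy and therefore receive larger weights.

```python
import math

def entropy_weights(scores):
    """Entropy-weight method: rows are DMUs, columns are the results of
    different DEA models. Returns normalized weights per column."""
    m, n = len(scores), len(scores[0])
    raw = []
    for j in range(n):
        col = [row[j] for row in scores]
        total = sum(col)
        # normalized Shannon entropy of the column (base m)
        e = -sum((x / total) * math.log(x / total)
                 for x in col if x > 0) / math.log(m)
        raw.append(1 - e)  # degree of diversification
    s = sum(raw)
    return [w / s for w in raw]

def composite_scores(scores):
    """Comprehensive efficiency: entropy-weighted sum of the model scores."""
    w = entropy_weights(scores)
    return [sum(x * wj for x, wj in zip(row, w)) for row in scores]
```

A column that is identical across DMUs carries no discriminating information and gets zero weight under this formulation.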

Minimum Variance Unbiased Estimation for the Maximum Entropy of the Transformed Inverse Gaussian Random Variable by $Y=X^{-1/2}$

  • Choi, Byung-Jin
    • Communications for Statistical Applications and Methods / v.13 no.3 / pp.657-667 / 2006
  • The concept of entropy, introduced into communication theory by Shannon (1948) as a measure of uncertainty, is of prime interest in information-theoretic statistics. This paper considers minimum variance unbiased estimation of the maximum entropy of the inverse Gaussian random variable transformed by $Y=X^{-1/2}$. The properties of the derived UMVU estimator are investigated.

An Adaptive Data Compression Algorithm for Video Data (사진데이타를 위한 한 Adaptive Data Compression 방법)

  • 김재균
    • Journal of the Korean Institute of Telematics and Electronics / v.12 no.2 / pp.1-10 / 1975
  • This paper presents an adaptive data compression algorithm for video data. The coding complexity due to the high correlation in the given data sequence is alleviated by coding the difference data sequence rather than the data sequence itself. The adaptation to the nonstationary statistics of the data is confined within a code set, which consists of two constant-length codes and six modified Shannon-Fano codes. It is assumed that the probability distributions of the difference data sequence and of the data entropy are Laplacian and Gaussian, respectively. The adaptive coding performance is compared for two code selection criteria: entropy and $P_r$[difference value = 0] = $P_0$. It is shown that a data compression ratio of 2:1 is achievable with adaptive coding. The gain of adaptive coding over fixed coding is about 10% in compression ratio and 15% in code efficiency. In addition, $P_0$ is found to be not only a convenient criterion for code selection, but also a parameter nearly as efficient as entropy.
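The core idea above, coding the difference sequence instead of the data itself, is easy to demonstrate. The signal below is synthetic and the names are ours; for a smoothly varying sample line, successive differences cluster near zero and have a much lower empirical entropy than the raw samples, which is what makes the 2:1 compression plausible.

```python
import math
from collections import Counter

def entropy(symbols):
    """Empirical Shannon entropy (bits/symbol) of a sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A smoothly varying "video line": neighboring samples are highly correlated.
line = [round(50 + 40 * math.sin(i / 8)) for i in range(512)]
diffs = [b - a for a, b in zip(line, line[1:])]
# entropy(diffs) is substantially lower than entropy(line), so fewer
# bits/sample suffice once the differences are entropy coded.
```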