• Title/Summary/Keyword: entropy condition


Uncertainty Improvement of Incomplete Decision System using Bayesian Conditional Information Entropy (베이지언 정보엔트로피에 의한 불완전 의사결정 시스템의 불확실성 향상)

  • Choi, Gyoo-Seok;Park, In-Kyu
    • The Journal of the Institute of Internet, Broadcasting and Communication, v.14 no.6, pp.47-54, 2014
  • Under the indiscernibility relation of rough set theory, the unavoidable overlap and inconsistency of data make attribute reduction very important in an information system. Rough set theory, however, has difficulty handling the difference in attribute reduction between consistent and inconsistent information systems. In this paper, we propose a new uncertainty measure and an attribute reduction algorithm that use the Bayesian posterior probability to analyze the correlation between condition and decision attributes. We compare the proposed method with conditional information entropy in addressing the uncertainty of an inconsistent information system. As a result, our method handles uncertainty more accurately than conditional information entropy by exploiting the mutual information between the condition and decision attributes of the information system.
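
As a rough, hypothetical illustration of the kind of conditional-entropy measurement this abstract refers to (not the authors' Bayesian algorithm itself), the sketch below computes H(D | C), the conditional information entropy of a decision attribute given a set of condition attributes, from a toy decision table; a value of zero means the condition attributes determine the decision consistently. The table contents and attribute names are invented for illustration.

```python
from collections import Counter, defaultdict
from math import log2

def conditional_entropy(rows, condition_cols, decision_col):
    """H(D | C): entropy of the decision attribute within each
    equivalence class induced by the condition attributes."""
    groups = defaultdict(list)
    for row in rows:
        key = tuple(row[c] for c in condition_cols)   # equivalence class of C
        groups[key].append(row[decision_col])
    n = len(rows)
    h = 0.0
    for decisions in groups.values():
        p_class = len(decisions) / n                   # P(C = key)
        counts = Counter(decisions)
        h_class = -sum((c / len(decisions)) * log2(c / len(decisions))
                       for c in counts.values())       # H(D | C = key)
        h += p_class * h_class
    return h

# Hypothetical toy decision table (attribute names are illustrative only)
table = [
    {"headache": "yes", "temp": "high",   "flu": "yes"},
    {"headache": "yes", "temp": "normal", "flu": "no"},
    {"headache": "no",  "temp": "high",   "flu": "yes"},
    {"headache": "no",  "temp": "normal", "flu": "no"},
]
print(conditional_entropy(table, ["temp"], "flu"))      # 0.0 -> consistent
print(conditional_entropy(table, ["headache"], "flu"))  # 1.0 -> uninformative
```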

A Study on the Spray Characteristics of Swirl Injector for Use a HCCI Engine using Entropy Analysis and PIV Technique (엔트로피 해석과 PIV를 이용한 HCCI 엔진용 스월 인젝터의 분무 특성 해석에 관한 연구)

  • 안용흠;이창희;이기형;이창식
    • Transactions of the Korean Society of Automotive Engineers, v.12 no.1, pp.39-47, 2004
  • The objective of this study is to analyze the spray characteristics according to injection duration under ambient pressure conditions and to investigate the relationship between vorticity and entropy for controlling the diffusion process, which is the most important factor during intake-stroke injection. The spray velocity was obtained with the PIV method, a useful optical diagnostic technique, and the vorticity was calculated from the spray velocity components with a vorticity algorithm. In addition, the homogeneous diffusion rate of the spray was quantified by an entropy analysis based on Boltzmann's statistical thermodynamics. From these methods, we found that as the injection duration increases, the spray velocity increases and the vortex location moves toward the downstream region of the spray. Under the same conditions, as the entropy decreases, the mean vorticity increases. This means that the concentration of spray droplets caused by the longer injection duration has a stronger effect than the increase in momentum dissipation.
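
The following is a minimal sketch, under assumed grid data, of how a Boltzmann/Shannon-style entropy can quantify how homogeneously spray droplets are distributed over a measurement grid, in the spirit of the entropy analysis mentioned above; it is not the authors' formulation, and the droplet counts are hypothetical.

```python
import numpy as np

def spray_entropy(cell_counts):
    """Shannon/Boltzmann-style entropy of droplet occupancy over grid cells,
    normalized so that 1.0 means a perfectly homogeneous (uniform) spray."""
    counts = np.asarray(cell_counts, dtype=float).ravel()
    p = counts / counts.sum()                 # occupancy probability per cell
    p = p[p > 0]                              # treat 0 * log(0) as 0
    h = -np.sum(p * np.log(p))
    return h / np.log(counts.size)            # normalize by max entropy log(N)

# Hypothetical droplet counts on a 3x3 measurement grid
concentrated = [[90, 5, 0], [5, 0, 0], [0, 0, 0]]
diffused     = [[11, 10, 12], [9, 11, 10], [12, 9, 11]]
print(spray_entropy(concentrated))   # low value -> droplets clustered
print(spray_entropy(diffused))       # near 1.0  -> well-mixed charge
```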

An Approach to Constructing an Efficient Entropy Source on Multicore Processor (멀티코어 환경에서 효율적인 엔트로피 원의 설계 기법)

  • Kim, SeongGyeom;Lee, SeungJoon;Kang, HyungChul;Hong, Deukjo;Sung, Jaechul;Hong, Seokhie
    • Journal of the Korea Institute of Information Security & Cryptology, v.28 no.1, pp.61-71, 2018
  • In the Internet of Things, in which a great many devices are connected to each other, cryptographically secure Random Number Generators (RNGs) are essential. In particular, the entropy source, the only non-deterministic part of random number generation, has to be equipped with one or more unpredictable noise sources for the required security strength. This may require additional hardware for extracting the noise. Although additional hardware resources give better performance, existing resources should be used as fully as possible to avoid extra costs such as area and power consumption. In this paper, we suggest an entropy source that uses a multi-threaded program without any additional hardware, which reduces the difficulty of implementation on lightweight, low-power devices. In addition, according to NIST's entropy estimation test suite, the suggested entropy source is assessed to be secure enough as a source of entropy input.
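
The snippet below is a hedged sketch of the general idea of harvesting timing jitter from concurrently running threads instead of dedicated hardware; it is not the authors' entropy source, the names are invented, and any real deployment would still need min-entropy estimation (e.g. with NIST SP 800-90B tools) before the output could be trusted as entropy input.

```python
import threading, time, hashlib

def jitter_samples(n_threads=4, iters=2000):
    """Collect raw timing samples whose low-order bits vary with
    scheduler interleaving and clock jitter on a multicore CPU."""
    samples = []
    lock = threading.Lock()

    def worker():
        local = []
        for _ in range(iters):
            local.append(time.perf_counter_ns() & 0xFF)  # keep noisy low bits
        with lock:
            samples.extend(local)

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads: t.start()
    for t in threads: t.join()
    return bytes(samples)

# Condition the raw noise into a fixed-size seed. This is conditioning only;
# the min-entropy of jitter_samples() must be assessed separately.
seed = hashlib.sha256(jitter_samples()).digest()
print(seed.hex())
```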

Efficient Learning of Bayesian Networks using Entropy (효율적인 베이지안망 학습을 위한 엔트로피 적용)

  • Heo, Go-Eun;Jung, Yong-Gyu
    • The Journal of the Institute of Internet, Broadcasting and Communication, v.9 no.3, pp.31-36, 2009
  • Bayesian networks are known as one of the best tools for expressing and predicting domain knowledge in uncertain environments. However, Bayesian learning can make effective and reliable structure search very difficult. To keep the search time manageable, the nodes should be ordered so that effective structure learning becomes possible. This paper suggests a classification learning model that reduces errors under the independence condition when many variables exist, and that increases reliability by calculating the entropy of the probabilities under each circumstance. An efficient learning model is also suggested that determines the node ordering for the K2 algorithm by computing the entropy of each node and placing the node with the lowest entropy first. Consequently, the most suitable Bayesian network model can be constructed as quickly as possible.
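
As a small illustration of the ordering step described above (hypothetical data and variable names, not the paper's implementation), the sketch below computes the marginal entropy of each node from observed values and sorts the nodes by ascending entropy, producing the kind of node ordering that the K2 algorithm consumes.

```python
from collections import Counter
from math import log2

def node_entropy(values):
    """Marginal Shannon entropy of one discrete variable."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def entropy_ordering(dataset):
    """Order nodes by ascending entropy for use as a K2 node ordering."""
    return sorted(dataset, key=lambda var: node_entropy(dataset[var]))

# Hypothetical discrete dataset: variable name -> observed values
data = {
    "Sprinkler": ["on", "off", "off", "off", "off", "off"],
    "Rain":      ["yes", "yes", "no", "no", "yes", "no"],
    "WetGrass":  ["yes", "yes", "no", "no", "yes", "no"],
}
print(entropy_ordering(data))   # ['Sprinkler', 'Rain', 'WetGrass']
```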


Context-Based Minimum MSE Prediction and Entropy Coding for Lossless Image Coding

  • Musik-Kwon;Kim, Hyo-Joon;Kim, Jeong-Kwon;Kim, Jong-Hyo;Lee, Choong-Woong
    • Proceedings of the Korean Society of Broadcast Engineers Conference, 1999.06a, pp.83-88, 1999
  • In this paper, a novel gray-scale lossless image coder combining context-based minimum mean squared error (MMSE) prediction and entropy coding is proposed. To obtain the prediction context, this paper first defines directional differences according to the sharpness of edges and the gradients of local image data. Classifying the four directional differences forms a "geometry context" model that characterizes two-dimensional image behavior such as directional edge regions, smooth regions, or texture. Based on this context model, adaptive DPCM prediction coefficients are calculated in the MMSE sense and the prediction is performed. The MMSE method on a context-by-context basis agrees more closely with the minimum entropy condition, which is one of the major objectives of predictive coding. The context modeling method is also useful in the entropy coding stage. To reduce the statistical redundancy of the residual image, many contexts are preset to take full advantage of conditional probability in entropy coding and are then merged into a small number of contexts in an efficient way to reduce complexity. The proposed lossless coding scheme slightly outperforms CALIC, the state of the art, in compression ratio.
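
The sketch below is a simplified, assumed illustration of context-based MMSE prediction rather than the paper's coder: pixels are grouped into a few toy contexts by a quantized gradient, and least-squares (MMSE) prediction coefficients over the causal neighbours West, North, and North-West are fitted per context; the residuals would then be entropy coded context by context.

```python
import numpy as np

def fit_context_predictors(img, n_contexts=4):
    """Per-context least-squares (MMSE) predictors from causal neighbours W, N, NW."""
    img = img.astype(float)
    feats, targets, ctx_ids = [], [], []
    for y in range(1, img.shape[0]):
        for x in range(1, img.shape[1]):
            w, n, nw = img[y, x - 1], img[y - 1, x], img[y - 1, x - 1]
            # Toy context: quantized strength of the horizontal-vs-vertical gradient
            ctx = min(n_contexts - 1, int(abs(w - n) // 32))
            feats.append([w, n, nw]); targets.append(img[y, x]); ctx_ids.append(ctx)
    feats, targets, ctx_ids = map(np.array, (feats, targets, ctx_ids))
    coeffs = {}
    for c in range(n_contexts):
        mask = ctx_ids == c
        if mask.sum() >= 3:                   # need enough samples to fit
            coeffs[c], *_ = np.linalg.lstsq(feats[mask], targets[mask], rcond=None)
    return coeffs

# Hypothetical 8-bit image patch
rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(16, 16))
print(fit_context_predictors(patch))
```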

The Slip-Wall Boundary Conditions Effects and the Entropy Characteristics of the Multi-Species GH Solver (다화학종 GH 방정식의 정확성 향상을 위한 벽면 경계조건 연구 및 GH 방정식의 엔트로피 특성 고찰)

  • Ahn, Jae-Wan;Kim, Chong-Am
    • Journal of the Korean Society for Aeronautical & Space Sciences, v.37 no.10, pp.947-954, 2009
  • Starting from Eu's GH (Generalized Hydrodynamic) theory, a multi-species GH numerical solver is developed in this research, and its computational behavior is examined for hypersonic rarefied flow over an axisymmetric body. To improve the accuracy of the developed multi-species GH solver, various slip-wall boundary conditions are tested and the computed results are compared. Additionally, in order to validate the entropy characteristics of the GH equations, the entropy production and entropy generation rates of the GH equations are investigated in a one-dimensional normal shock structure test at a high Knudsen number.
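
For orientation only, the classical first-order Maxwell velocity-slip condition is reproduced below as one common example of a slip-wall boundary condition; the specific slip-wall formulations tested for the GH solver in the paper are not necessarily this one.

```latex
% Classical first-order Maxwell velocity-slip condition at a wall
% (reference form only; the GH solver's tested conditions may differ):
u_{\text{slip}} \;=\; u_{\text{gas}} - u_{\text{wall}}
  \;=\; \frac{2-\sigma_v}{\sigma_v}\,\lambda\,
        \left.\frac{\partial u}{\partial n}\right|_{w},
\qquad
\lambda = \text{mean free path},\quad
\sigma_v = \text{tangential momentum accommodation coefficient}.
```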

THE SECOND CENTRAL LIMIT THEOREM FOR MARTINGALE DIFFERENCE ARRAYS

  • Bae, Jongsig;Jun, Doobae;Levental, Shlomo
    • Bulletin of the Korean Mathematical Society, v.51 no.2, pp.317-328, 2014
  • In Bae et al. [2], we considered the uniform CLT for martingale difference arrays under the uniformly integrable entropy condition. In this paper, we prove the same result under the bracketing entropy condition. The proofs are based on Freedman's inequality combined with a chaining argument that utilizes majorizing measures. The results of the present paper generalize those for a sequence of stationary martingale differences, and they also generalize the corresponding results for the independent case.
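
As background (and not necessarily the paper's exact hypothesis), a standard form of the bracketing entropy condition for a function class F with respect to the L2(P) norm is:

```latex
% L_2(P) bracketing entropy integral condition (standard background form):
\int_0^1 \sqrt{\log N_{[\,]}\bigl(\varepsilon,\ \mathcal{F},\ L_2(P)\bigr)}\; d\varepsilon \;<\; \infty
```

Here N_[ ](ε, F, L2(P)) denotes the minimal number of ε-brackets in L2(P) needed to cover F.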

THE UNIFORM CLT FOR MARTINGALE DIFFERENCE ARRAYS UNDER THE UNIFORMLY INTEGRABLE ENTROPY

  • Bae, Jong-Sig;Jun, Doo-Bae;Levental, Shlomo
    • Bulletin of the Korean Mathematical Society, v.47 no.1, pp.39-51, 2010
  • In this paper we consider the uniform central limit theorem for a martingale-difference array of a function-indexed stochastic process under the uniformly integrable entropy condition. We prove a maximal inequality for martingale-difference arrays of processes indexed by a class of measurable functions, following the method Ziegler [19] used for triangular arrays of rowwise independent processes. The main tools are Freedman's inequality for martingale differences and a sub-Gaussian inequality based on restricted chaining. The results of the present paper generalize those of Ziegler [19] and related results for the independent case, and they also generalize those of Bae and Choi [3] to a martingale-difference array of a function-indexed stochastic process. Finally, an application to classes of functions changing with n is given.
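
For background, the classical uniform entropy integral condition from empirical process theory is shown below as orientation only; the "uniformly integrable entropy" condition used in the paper is a related but not identical notion, so this should be read as an assumed reference form rather than the paper's hypothesis.

```latex
% Uniform entropy integral condition with envelope F (background form only):
\int_0^\infty \sup_{Q}\sqrt{\log N\bigl(\varepsilon\,\|F\|_{Q,2},\ \mathcal{F},\ L_2(Q)\bigr)}\; d\varepsilon \;<\; \infty
```

where the supremum is taken over finitely discrete probability measures Q and F is a measurable envelope of the class.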

A Study on the Optimal Image for Precise measurement (정밀측정을 위한 최적영상에 관한 연구)

  • 유봉환
    • Journal of the Korean Society of Manufacturing Technology Engineers, v.7 no.3, pp.126-131, 1998
  • In the computer vision systems of modern industry, precise measurement is difficult because of measurement errors caused by distortion. Among these difficulties, edge distortion is regarded as the dominant problem; it is caused by blurred images, which appear when the camera cannot find its precise focus. It is therefore very important to determine the lens focus and to develop an algorithm that corrects the distortion, and discrimination criteria obtained from the image information at precise focus must be fixed in advance. The gray-level histogram of an image acquired from a blurred edge tends toward a uniform distribution, whereas a bimodal intensity histogram is related to a well-focused condition, so a good focus condition can be found by using the entropy of the histogram.
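
As a hypothetical sketch of the focus criterion described above (not the paper's exact algorithm), the snippet below scores each frame of a focus sweep by the Shannon entropy of its gray-level histogram: a blurred edge yields a flatter, higher-entropy histogram, while a well-focused edge yields a more bimodal, lower-entropy one, so the frame with minimum histogram entropy is taken as best focused.

```python
import numpy as np

def histogram_entropy(gray_img, bins=256):
    """Shannon entropy of the gray-level histogram; lower values tend to mean
    a more bimodal (better-focused) intensity distribution at an edge."""
    hist, _ = np.histogram(gray_img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def best_focus(images):
    """Index of the focus position with the lowest histogram entropy."""
    return int(np.argmin([histogram_entropy(img) for img in images]))

# Hypothetical focus sweep: sharp (bimodal) vs. blurred (spread-out) edge patches
sharp   = np.concatenate([np.full(500, 30), np.full(500, 220)])
blurred = np.linspace(30, 220, 1000).astype(int)
print(best_focus([blurred, sharp]))   # -> 1 (the sharp, bimodal image)
```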

Rough Entropy-based Knowledge Reduction using Rough Set Theory (러프집합 이론을 이용한 러프 엔트로피 기반 지식감축)

  • Park, In-Kyoo
    • Journal of Digital Convergence, v.12 no.6, pp.223-229, 2014
  • To retrieve useful information for efficient decisions in a large knowledge system, refined feature selection is generally necessary and important. Rough set theory has difficulty generating optimal reducts and classifying boundary objects. In this paper, we propose a quick reduction algorithm that generates optimal features by rough entropy analysis of the condition and decision attributes in order to overcome these restrictions. We define a new conditional information entropy for efficient feature extraction and describe a feature selection procedure that ranks the significance of features. Through simulations on 5 datasets from the UCI repository, we compare our rough-set-based feature selection approach with other selection theories. As a result, our modeling method achieves higher classification accuracy for feature selection than the previous theories.
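
As a rough sketch of an entropy-driven quick-reduct style feature selection (a generic greedy pattern under assumed data, not the paper's specific rough-entropy measure), the snippet below repeatedly adds the condition attribute that most lowers the conditional entropy of the decision attribute, stopping when no attribute gives a further reduction.

```python
from collections import Counter, defaultdict
from math import log2

def cond_entropy(rows, cols, decision):
    """H(decision | cols) over the equivalence classes induced by cols."""
    groups = defaultdict(list)
    for r in rows:
        groups[tuple(r[c] for c in cols)].append(r[decision])
    n = len(rows)
    h = 0.0
    for ds in groups.values():
        probs = [c / len(ds) for c in Counter(ds).values()]
        h += (len(ds) / n) * -sum(p * log2(p) for p in probs)
    return h

def quick_reduct(rows, condition_cols, decision, tol=1e-12):
    """Greedy forward selection: add the attribute that lowers
    H(decision | selected) the most, until no attribute helps."""
    selected, current = [], cond_entropy(rows, [], decision)
    remaining = list(condition_cols)
    while remaining:
        best = min(remaining,
                   key=lambda a: cond_entropy(rows, selected + [a], decision))
        new_h = cond_entropy(rows, selected + [best], decision)
        if current - new_h <= tol:
            break
        selected.append(best); remaining.remove(best); current = new_h
    return selected

# Hypothetical decision table (attribute names illustrative only)
rows = [
    {"a": 0, "b": 1, "c": 0, "d": "yes"},
    {"a": 0, "b": 0, "c": 1, "d": "no"},
    {"a": 1, "b": 1, "c": 0, "d": "yes"},
    {"a": 1, "b": 0, "c": 1, "d": "no"},
]
print(quick_reduct(rows, ["a", "b", "c"], "d"))   # -> ['b']
```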