• Title/Summary/Keyword: entropy test


Time Series Analysis of Engine Test Data (엔진 시험 데이터에 대한 시계열 분석)

  • Kim, Il-Doo;Yoon, Hyun-Gull;Lim, Jin-Shik
    • Proceedings of the Korean Society of Propulsion Engineers Conference
    • /
    • 2011.11a
    • /
    • pp.241-245
    • /
    • 2011
  • In an engine test, data are collected in the form of a time series. Usually only the time average of the series is of interest to engineers, while its stochastic fluctuation is ignored. In this paper, we collect pressure and fuel-flux data from an air-breathing engine test and analyze their fluctuations using multiscale sample entropy analysis, which has been suggested as a measure of the complexity of a time series. It is shown that different physical quantities indeed have different complexities at each timescale, suggesting the possibility of an instantaneous tool for evaluating the engine test. (A minimal sketch of multiscale sample entropy follows this entry.)

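As context for the multiscale sample entropy analysis mentioned above, here is a minimal Python sketch under common default choices (function names, scales and the tolerance r = 0.2·std are illustrative, not taken from the paper): the series is coarse-grained at each scale and the sample entropy of the coarse-grained series is computed.

    import numpy as np

    def sample_entropy(x, m=2, r=None):
        # Sample entropy SampEn(m, r) of a 1-D series, naive O(N^2) version.
        x = np.asarray(x, dtype=float)
        if r is None:
            r = 0.2 * np.std(x)          # common default tolerance
        n = len(x)

        def count_matches(dim):
            # Templates of length `dim`; both counts use the first n-m templates
            # so the m- and (m+1)-dimensional counts are comparable.
            templates = np.array([x[i:i + dim] for i in range(n - m)])
            count = 0
            for i in range(len(templates)):
                dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                count += int(np.sum(dist <= r))
            return count

        b, a = count_matches(m), count_matches(m + 1)
        return float("inf") if a == 0 or b == 0 else -np.log(a / b)

    def multiscale_sample_entropy(x, scales=range(1, 6), m=2, r=None):
        # Coarse-grain at each scale tau (non-overlapping averages),
        # then compute the sample entropy of the coarse-grained series.
        x = np.asarray(x, dtype=float)
        out = []
        for tau in scales:
            n = (len(x) // tau) * tau
            coarse = x[:n].reshape(-1, tau).mean(axis=1)
            out.append(sample_entropy(coarse, m=m, r=r))
        return out

    # Example: complexity profile of a white-noise series over three scales.
    rng = np.random.default_rng(0)
    print(multiscale_sample_entropy(rng.standard_normal(1000), scales=[1, 2, 3]))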

GOODNESS OF FIT TESTS BASED ON DIVERGENCE MEASURES

  • Pasha, Eynollah;Kokabi, Mohsen;Mohtashami, Gholam Reza
    • Journal of applied mathematics & informatics
    • /
    • v.26 no.1_2
    • /
    • pp.177-189
    • /
    • 2008
  • In this paper, we investigate goodness-of-fit tests based on divergence measures. In the case of categorical data, under certain regularity conditions, we obtain the asymptotic distribution of these tests. We also propose a modified test that improves the rate of convergence. In the continuous case, we use our modified entropy estimator [10] for Kullback-Leibler information estimation. A comparative study based on simulation results is also discussed. (A sketch of a divergence-based test statistic follows this entry.)

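To illustrate the kind of divergence-based statistic involved (the paper's specific divergence family and its modified test are not reproduced here), the following Python sketch computes the likelihood-ratio statistic 2n·KL(p̂ ‖ p0) for categorical data, which under the null hypothesis and standard regularity conditions is asymptotically chi-square with k−1 degrees of freedom.

    import numpy as np
    from scipy.stats import chi2

    def kl_gof_test(counts, p0):
        # Likelihood-ratio (G^2) goodness-of-fit statistic, 2*n*KL(p_hat || p0).
        counts = np.asarray(counts, dtype=float)
        p0 = np.asarray(p0, dtype=float)
        n = counts.sum()
        p_hat = counts / n
        mask = p_hat > 0                      # 0*log(0/q) is taken as 0
        g2 = 2.0 * n * np.sum(p_hat[mask] * np.log(p_hat[mask] / p0[mask]))
        p_value = chi2.sf(g2, df=len(p0) - 1)  # asymptotic chi-square, k-1 df
        return g2, p_value

    # Example: test observed die rolls against a fair six-sided die.
    print(kl_gof_test([18, 22, 15, 25, 10, 30], [1/6] * 6))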

Image Thresholding based on the Entropy Using Variance of the Gray Levels (그레이 레벨의 분산을 이용한 엔트로피에 기반한 영상 임계화)

  • Kwon, Soon-H.
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.21 no.5
    • /
    • pp.543-548
    • /
    • 2011
  • Entropy, which measures the richness of detail in an image, is generally obtained from the histogram of the gray levels and has been widely used as an index for image thresholding. In this paper, we propose an entropy-based thresholding method in which the entropy is obtained not from the histogram but from the variance of the gray levels, in order to binarize a given image. The effectiveness of the proposed method is demonstrated by thresholding experiments on nine test images and by comparison with two conventional thresholding methods, namely Otsu's method and the histogram-based entropy method. (A sketch of the conventional histogram-based approach follows this entry.)
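
For reference, the following Python sketch implements the conventional histogram-based maximum-entropy (Kapur-style) thresholding used as a comparison baseline in the abstract above; the proposed variance-based entropy is not reproduced here.

    import numpy as np

    def kapur_entropy_threshold(image, bins=256):
        # Pick the gray level t that maximizes the sum of the entropies of the
        # two classes (background/foreground) induced by t.
        hist, _ = np.histogram(image, bins=bins, range=(0, bins))
        p = hist / hist.sum()
        best_t, best_h = 0, -np.inf
        for t in range(1, bins - 1):
            w0, w1 = p[:t].sum(), p[t:].sum()
            if w0 == 0 or w1 == 0:
                continue
            p0, p1 = p[:t] / w0, p[t:] / w1          # class-conditional distributions
            h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
            h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
            if h0 + h1 > best_h:
                best_t, best_h = t, h0 + h1
        return best_t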

Entropy-Constrained Temporal Decomposition (엔트로피 제한 조건을 갖는 시간축 분할)

  • Lee Ki-Seung
    • The Journal of the Acoustical Society of Korea
    • /
    • v.24 no.5
    • /
    • pp.262-270
    • /
    • 2005
  • In this paper, a new temporal decomposition method is proposed in which not only distortion but also entropy is involved in segmentation. The interpolation functions and the target feature vectors are determined by a dynamic programming technique in which distortion and entropy are minimized simultaneously. The interpolation functions are built from a training speech corpus. An iterative method, in which segmentation and estimation are performed alternately, finds the locally optimal points in the sense of minimizing both distortion and entropy. Simulation results show that, in terms of both distortion and entropy, the proposed temporal decomposition method produces results superior to the conventional split vector-quantization method widely employed in current speech coding. According to the results of a subjective listening test, the proposed method also shows superior performance in terms of quality compared with the previous vector quantization method. (A simplified dynamic-programming segmentation sketch follows this entry.)
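
The following Python sketch illustrates the dynamic-programming idea in a much simplified form: it segments a 1-D sequence by minimizing squared distortion plus a fixed per-segment penalty standing in for the entropy (rate) term. The paper's interpolation functions and speech feature vectors are not modeled; names and the cost model are illustrative only.

    import numpy as np

    def segment_dp(x, max_segments, lam=1.0):
        # Optimal segmentation: total within-segment squared error plus
        # lam per segment (a crude stand-in for the entropy constraint).
        x = np.asarray(x, dtype=float)
        n = len(x)
        s1 = np.concatenate(([0.0], np.cumsum(x)))        # prefix sums
        s2 = np.concatenate(([0.0], np.cumsum(x * x)))     # prefix sums of squares

        def sse(i, j):
            # Squared error of x[i:j] around its mean, in O(1).
            m = j - i
            return (s2[j] - s2[i]) - (s1[j] - s1[i]) ** 2 / m

        INF = float("inf")
        cost = np.full((max_segments + 1, n + 1), INF)
        back = np.zeros((max_segments + 1, n + 1), dtype=int)
        cost[0, 0] = 0.0
        for k in range(1, max_segments + 1):
            for j in range(k, n + 1):
                for i in range(k - 1, j):
                    c = cost[k - 1, i] + sse(i, j) + lam
                    if c < cost[k, j]:
                        cost[k, j], back[k, j] = c, i
        # Backtrack the segment boundaries of the best k-segment solution.
        bounds, j = [n], n
        for k in range(max_segments, 0, -1):
            j = back[k, j]
            bounds.append(j)
        return bounds[::-1]

    # Example: three flat regions are recovered as boundaries [0, 3, 6, 8].
    print(segment_dp([0, 0, 0, 5, 5, 5, 1, 1], max_segments=3))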

An Approach to Constructing an Efficient Entropy Source on Multicore Processor (멀티코어 환경에서 효율적인 엔트로피 원의 설계 기법)

  • Kim, SeongGyeom;Lee, SeungJoon;Kang, HyungChul;Hong, Deukjo;Sung, Jaechul;Hong, Seokhie
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.28 no.1
    • /
    • pp.61-71
    • /
    • 2018
  • In the Internet of Things, where plenty of devices are connected to each other, cryptographically secure random number generators (RNGs) are essential. In particular, the entropy source, which is the only non-deterministic part of random number generation, has to be equipped with one or more unpredictable noise sources to reach the required security strength. This may require additional hardware for extracting the noise. Although additional hardware gives better performance, it is preferable to make the best use of existing resources in order to avoid extra costs such as area and power consumption. In this paper, we suggest an entropy source that uses a multi-threaded program without any additional hardware, which reduces the difficulty of implementation on lightweight, low-power devices. Additionally, according to NIST's entropy estimation test suite, the suggested entropy source is estimated to be secure enough as a source of entropy input. (A toy software-only noise-collection sketch follows this entry.)
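
A toy Python sketch of the general idea, software-only noise collection from thread-scheduling jitter, is given below. It is not the paper's construction; a real design would also require conditioning and entropy estimation (e.g. against NIST SP 800-90B) before the output could be used as entropy input.

    import threading
    import time

    def collect_jitter_bits(n_bits=256, n_threads=4):
        # Worker threads hammer a shared counter while the main thread samples
        # inter-iteration timing jitter and keeps the LSB of each delta.
        stop = threading.Event()
        state = {"counter": 0}

        def worker():
            while not stop.is_set():
                state["counter"] += 1      # deliberately unsynchronized update

        threads = [threading.Thread(target=worker, daemon=True) for _ in range(n_threads)]
        for t in threads:
            t.start()

        bits = []
        prev = time.perf_counter_ns()
        while len(bits) < n_bits:
            _ = state["counter"]           # racy read adds scheduling jitter
            now = time.perf_counter_ns()
            bits.append((now - prev) & 1)  # keep the LSB of the timing delta
            prev = now

        stop.set()
        return bits

    print("".join(map(str, collect_jitter_bits(64))))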

Estimation of entropy of the inverse weibull distribution under generalized progressive hybrid censored data

  • Lee, Kyeongjun
    • Journal of the Korean Data and Information Science Society
    • /
    • v.28 no.3
    • /
    • pp.659-668
    • /
    • 2017
  • The inverse Weibull distribution (IWD) can readily be applied to a wide range of situations, including applications in medicine, reliability and ecology. It is generally known that the lifetimes of test items may not be recorded exactly. In this paper, therefore, we consider maximum likelihood estimation (MLE) and Bayes estimation of the entropy of an IWD under a generalized progressive hybrid censoring (GPHC) scheme. It is observed that the MLE of the entropy cannot be obtained in closed form, so two non-linear equations have to be solved simultaneously. Further, the Bayes estimators of the entropy of the IWD based on the squared error loss function (SELF), the precautionary loss function (PLF) and the linex loss function (LLF) are derived. Since the Bayes estimators cannot be obtained in closed form, the Bayes estimates are derived by invoking the Tierney and Kadane approximation method. Monte Carlo simulations are carried out to compare the classical and Bayes estimators. In addition, two real data sets based on the GPHC scheme are also analysed for illustrative purposes. (The IWD density and the entropy being estimated are recalled after this entry.)
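
For reference, under one common parameterization (the paper's may differ) the IWD density is $f(x;\lambda,\beta)=\lambda\beta\,x^{-(\beta+1)}\exp\!\left(-\lambda x^{-\beta}\right)$ for $x>0$, $\lambda,\beta>0$, and the quantity being estimated is the differential entropy $H(f)=-\int_{0}^{\infty} f(x;\lambda,\beta)\,\ln f(x;\lambda,\beta)\,dx$; the censored-sample estimators themselves are not reproduced here.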

Vocal Effort Detection Based on Spectral Information Entropy Feature and Model Fusion

  • Chao, Hao;Lu, Bao-Yun;Liu, Yong-Li;Zhi, Hui-Lai
    • Journal of Information Processing Systems
    • /
    • v.14 no.1
    • /
    • pp.218-227
    • /
    • 2018
  • Vocal effort detection is important for both robust speech recognition and speaker recognition. In this paper, a spectral information entropy feature, which carries more salient information about the vocal effort level, is first proposed. Then, a model fusion method based on complementary models is presented to recognize the vocal effort level. Experiments are conducted on an isolated-words test set, and the results show that spectral information entropy performs best among the three kinds of features evaluated. Meanwhile, the recognition accuracy over all vocal effort levels reaches 81.6%, demonstrating the potential of the proposed method. (A minimal per-frame spectral entropy computation follows this entry.)
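
A minimal Python sketch of a spectral entropy feature for a single frame is shown below; the paper's exact framing, windowing and normalization choices are not reproduced.

    import numpy as np

    def spectral_entropy(frame, eps=1e-12):
        # Treat the normalized power spectrum of one frame as a probability
        # distribution over frequency bins and return its Shannon entropy.
        spectrum = np.abs(np.fft.rfft(frame)) ** 2
        p = spectrum / (spectrum.sum() + eps)
        return float(-np.sum(p * np.log2(p + eps)))

    # Example: entropy of a noisy 200 Hz tone sampled at 16 kHz.
    t = np.arange(400) / 16000.0
    frame = np.sin(2 * np.pi * 200 * t) + 0.05 * np.random.randn(400)
    print(spectral_entropy(frame))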

PSS Evaluation Based on Vague Assessment Big Data: Hybrid Model of Multi-Weight Combination and Improved TOPSIS by Relative Entropy

  • Lianhui Li
    • Journal of Information Processing Systems
    • /
    • v.20 no.3
    • /
    • pp.285-295
    • /
    • 2024
  • Driven by vague assessment big data, a product service system (PSS) evaluation method is developed based on a hybrid model of multi-weight combination and TOPSIS improved by relative entropy. The index values of the PSS alternatives are obtained by integrating the stakeholders' vague assessment comments, expressed as trapezoidal fuzzy numbers. A multi-weight combination method is proposed to determine the index weights for the PSS evaluation decision. A TOPSIS improved by relative entropy (RE) is presented to overcome the shortcomings of traditional TOPSIS and related modified TOPSIS variants, and the PSS alternatives are then evaluated. A PSS evaluation case from a printer company is used to test and verify the proposed model. The RE closeness values of the seven PSS alternatives are 0.3940, 0.5147, 0.7913, 0.3719, 0.2403, 0.4959, and 0.6332, and the alternative with the highest RE closeness is selected as the best one. Comparison examples show that the presented model compensates for the shortcomings of existing traditional methods. (A hedged sketch of a relative-entropy closeness computation follows this entry.)
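
As a hedged illustration only (the paper's exact normalization, weighting scheme and relative-entropy construction are not reproduced), the following Python sketch computes a relative-entropy closeness in a TOPSIS-like setting, where a larger closeness is better; all function names are illustrative.

    import numpy as np

    def relative_entropy(p, q, eps=1e-12):
        # KL divergence D(p || q) between two non-negative vectors (normalized).
        p = np.asarray(p, float); q = np.asarray(q, float)
        p = p / p.sum(); q = q / q.sum()
        return float(np.sum(p * np.log((p + eps) / (q + eps))))

    def re_closeness(decision_matrix, weights):
        # Rows are alternatives, columns benefit-type criteria. Each weighted,
        # normalized row is compared with the positive and negative ideal rows;
        # closeness = D_neg / (D_pos + D_neg).
        m = np.asarray(decision_matrix, float)
        m = m / np.sqrt((m ** 2).sum(axis=0))          # vector normalization
        v = m * np.asarray(weights, float)             # weighted normalized matrix
        pis, nis = v.max(axis=0), v.min(axis=0)        # positive / negative ideals
        d_pos = np.array([relative_entropy(row, pis) for row in v])
        d_neg = np.array([relative_entropy(row, nis) for row in v])
        return d_neg / (d_pos + d_neg)

    # Example with three alternatives and three criteria.
    print(re_closeness([[7, 8, 6], [9, 6, 7], [8, 9, 9]], [0.4, 0.3, 0.3]))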

Linear Relationships between Thermodynamic Parameters (Part III) Application to Solvolysis Reaction (熱力學函數間의 直線關係 (第3報) Solvolysis反應에의 應用)

  • Ikchoon Lee
    • Journal of the Korean Chemical Society
    • /
    • v.7 no.4
    • /
    • pp.264-270
    • /
    • 1963
  • The general equation for the substituent effect test, derived in the previous paper, has been extended to correlate the thermodynamic parameters of solvolysis reactions by modifying the potential energy term to represent the effect of changes in solvent composition. The linear fits of the new equation, $\Delta\Delta H^{\neq} = a'Y + b\,\Delta\Delta S^{\neq}$, were tested against 35 examples from the literature, and an average correlation coefficient of 0.977 was obtained. Examination of the results showed that the equation is generally applicable to solvolysis reactions and helps elucidate some of the difficulties experienced with the Grunwald-Winstein equation. It is stressed that the linear enthalpy-entropy effect exists only between the external enthalpy and entropy of activation, so that, strictly speaking, it is a linear external enthalpy-entropy effect. (The Grunwald-Winstein relation is recalled after this entry.)

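For context, the Grunwald-Winstein relation referred to above is $\log\left(k/k_{0}\right) = mY$, where $k_{0}$ is the rate constant in the reference solvent (80% aqueous ethanol), $Y$ is the solvent ionizing power, and $m$ measures the substrate's sensitivity to it.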

Design of the ICMEP Algorithm for the Highly Efficient Entropy Encoding (고효율 엔트로피 부호화를 위한 ICMEP 알고리즘 설계)

  • 이선근;임순자;김환용
    • Journal of the Institute of Electronics Engineers of Korea SD
    • /
    • v.41 no.4
    • /
    • pp.75-82
    • /
    • 2004
  • The channel transmission rate can be increased by combining the Huffman algorithm, a lossy-transform model scheme with minimum average code length for the image information and good instantaneous decoding capability, with the Lempel-Ziv algorithm, which offers fast processing during compression. To increase the processing speed of compression, the ICMEP algorithm is proposed and an entropy encoder for HDTV is designed and verified. The ICMEP entropy encoder was designed top-down and consists of source code and test benches written as behavioural descriptions in VHDL. Simulation results confirm that the implemented ICMEP entropy encoder improves overall system efficiency by preventing memory saturation and increasing the compression ratio. (A generic Huffman entropy-coding sketch follows this entry.)
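
As a generic illustration of entropy encoding (not the ICMEP design, and in Python rather than VHDL), the following sketch builds a Huffman code table, assigning shorter, instantaneously decodable codes to more frequent symbols so that the average code length is minimized.

    import heapq
    from collections import Counter

    def huffman_code(data):
        # Build a Huffman code table {symbol: bit string} for the symbols in data.
        freq = Counter(data)
        # Heap entries: (frequency, tie-breaker, [(symbol, code), ...]).
        heap = [(f, i, [(sym, "")]) for i, (sym, f) in enumerate(freq.items())]
        heapq.heapify(heap)
        tie = len(heap)
        if len(heap) == 1:                       # degenerate single-symbol input
            return {heap[0][2][0][0]: "0"}
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)    # two least frequent subtrees
            f2, _, right = heapq.heappop(heap)
            merged = [(s, "0" + c) for s, c in left] + [(s, "1" + c) for s, c in right]
            heapq.heappush(heap, (f1 + f2, tie, merged))
            tie += 1
        return dict(heap[0][2])

    # Example: encode a short message and report its length in bits.
    message = "entropy encoding test"
    table = huffman_code(message)
    encoded = "".join(table[ch] for ch in message)
    print(table, len(encoded), "bits")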