• Title/Summary/Keyword: Sample Entropy

Improving Sample Entropy Based on Nonparametric Quantile Estimation

  • Park, Sang-Un; Park, Dong-Ryeon
    • Communications for Statistical Applications and Methods, v.18 no.4, pp.457-465, 2011
  • Sample entropy (Vasicek, 1976) performs poorly, and several nonparametric entropy estimators have been proposed as alternatives. In this paper, we consider a piecewise uniform density function based on quantiles, which enables us to evaluate the entropy over each interval, and we attribute the poor performance of the sample entropy to the poor estimation of the lower and upper quantiles. We then propose improved entropy estimators obtained by simply modifying the quantile estimators, and compare their performance with that of some existing estimators.
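For context, the baseline referred to above is Vasicek's spacings estimator, which replaces the derivative of the quantile function by its slope over a window of $2m$ order statistics. A minimal NumPy sketch of that baseline (not the quantile-modified estimators proposed in the paper; the function name and the default window choice are illustrative) is:

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek (1976) spacings-based estimate of differential entropy:
    H_mn = (1/n) * sum_i log( n/(2m) * (X_(i+m) - X_(i-m)) ),
    with order statistics clipped at the sample minimum and maximum.
    Assumes a continuous sample (ties give zero spacings and -inf terms)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(np.sqrt(n)))                      # illustrative choice, 1 <= m < n/2
    upper = x[np.minimum(np.arange(n) + m, n - 1)]       # X_(i+m), clipped to X_(n)
    lower = x[np.maximum(np.arange(n) - m, 0)]           # X_(i-m), clipped to X_(1)
    return float(np.mean(np.log(n / (2.0 * m) * (upper - lower))))
```

For a standard normal sample the estimate should approach $\tfrac{1}{2}\log(2\pi e)\approx 1.419$ nats as $n$ grows; the boundary clipping is one source of the bias that quantile-based modifications aim to reduce.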

Modified Mass-Preserving Sample Entropy

  • Kim, Chul-Eung; Park, Sang-Un
    • Communications for Statistical Applications and Methods, v.9 no.1, pp.13-19, 2002
  • In nonparametric entropy estimation, both the mass- and mean-preserving maximum entropy distribution (Theil, 1980) and the distribution underlying the sample entropy (Vasicek, 1976), the most widely used entropy estimator, consist of mass-preserving densities defined on disjoint intervals whose endpoints are simple averages of two adjacent order statistics. In this paper, we note that these nonparametric density functions do not actually satisfy the mass-preserving constraint, and propose a modified sample entropy that uses the generalized O-statistics (Kaigh and Driscoll, 1987) when averaging two adjacent order statistics. We apply the proposed estimator in a goodness-of-fit test for normality and compare its performance with that of the sample entropy.
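Both spacings-based constructions above rest on the quantile-function representation of differential entropy; as a reminder (standard material, not specific to either paper):

$$
H(f) \;=\; -\int_{-\infty}^{\infty} f(x)\log f(x)\,dx
\;=\; \int_{0}^{1}\log\!\Big(\tfrac{d}{dp}F^{-1}(p)\Big)\,dp ,
$$

so an estimator is obtained by replacing $\tfrac{d}{dp}F^{-1}(p)$ with a finite difference of order statistics such as $\tfrac{n}{2m}\big(X_{(i+m)}-X_{(i-m)}\big)$. The mass-preserving question concerns whether the implied piecewise density actually integrates to one over the intervals formed by the averaged adjacent order statistics.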

Entropy-Constrained Sample-Adaptive Product Quantizer Design for High Bit-Rate Quantization

  • Kim, Dong-Sik
    • Journal of the Institute of Electronics Engineers of Korea SP, v.49 no.1, pp.11-18, 2012
  • In this paper, an entropy-constrained vector quantizer for high bit rates is proposed. The sample-adaptive product quantizer (SAPQ), which is based on product codebooks, is employed, and a design algorithm for the entropy-constrained sample-adaptive product quantizer (ECSAPQ) is proposed. The proposed ECSAPQ outperforms the entropy-constrained vector quantizer by 0.5 dB. It is also shown that the ECSAPQ distortion curve, which is based on a scalar quantizer, lies below the high-rate theoretical curve of the entropy-constrained scalar quantizer, which in turn is 1.53 dB above Shannon's lower bound.
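The 1.53 dB figure quoted above is the classical high-rate gap (the well-known $\pi e/6$ factor) between entropy-constrained scalar quantization and the Shannon lower bound; stated here for reference, not as a derivation from the paper:

$$
D_{\mathrm{ECSQ}}(R) \;\approx\; \tfrac{1}{12}\,2^{2h(X)}\,2^{-2R},
\qquad
D_{\mathrm{SLB}}(R) \;=\; \tfrac{1}{2\pi e}\,2^{2h(X)}\,2^{-2R},
\qquad
10\log_{10}\!\frac{\pi e}{6} \;\approx\; 1.53\ \mathrm{dB}.
$$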

Goodness-of-fit Tests for the Weibull Distribution Based on the Sample Entropy

  • Kang, Suk-Bok; Lee, Hwa-Jung
    • Journal of the Korean Data and Information Science Society, v.17 no.1, pp.259-268, 2006
  • For Type-II censored samples, we propose three modified entropy estimators based on Vasicek's estimator, van Es' estimator, and Correa's estimator. We also propose goodness-of-fit tests of the Weibull distribution based on the modified entropy estimators. We simulate the mean squared errors (MSE) of the proposed entropy estimators and the powers of the proposed tests, and compare the proposed tests with the modified Kolmogorov-Smirnov and Cramér-von Mises tests proposed by Kang et al. (2003).
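Entropy-based goodness-of-fit tests of this kind typically contrast a nonparametric entropy estimate with the entropy of the fitted parametric model; for reference (the censored-sample modifications in the paper are more involved), the differential entropy of a Weibull distribution with shape $k$ and scale $\lambda$ is

$$
H(k,\lambda) \;=\; \gamma\Big(1-\tfrac{1}{k}\Big) \;+\; \ln\frac{\lambda}{k} \;+\; 1,
$$

where $\gamma \approx 0.5772$ is the Euler-Mascheroni constant.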

Crack identification with parametric optimization of entropy & wavelet transformation

  • Wimarshana, Buddhi; Wu, Nan; Wu, Christine
    • Structural Monitoring and Maintenance, v.4 no.1, pp.33-52, 2017
  • A cantilever beam with a breathing crack is studied to improve breathing-crack identification sensitivity through parametric optimization of sample entropy and wavelet transformation. Crack breathing is a bi-linear phenomenon exhibited by fatigue cracks under dynamic loading. Entropy is a measure that quantifies the complexity or irregularity of system dynamics, and it is therefore employed here to quantify the bi-linearity/irregularity of the vibration response induced by the breathing of a fatigue crack. To improve the sensitivity of the entropy measurement for crack identification, wavelet transformation is combined with entropy. Crack identification is studied under different sinusoidal excitation frequencies of the cantilever beam. It is found that, for excitation frequencies close to the first modal frequency of the beam, the method can detect a crack only once its depth reaches 22% of the beam thickness. With parametric optimization of sample entropy and wavelet transformation, this detection threshold is improved to 8%. Experimental studies are carried out, and the experimental results validate the numerical parametric optimization process.
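In this structural-dynamics context, "sample entropy" is the Richman-Moorman complexity measure SampEn(m, r) rather than Vasicek's estimator. A compact NumPy sketch of the standard definition (parameter defaults are illustrative; the paper optimizes these parameters together with the wavelet settings) is:

```python
import numpy as np

def sampen(signal, m=2, r=None):
    """Sample entropy SampEn(m, r): -log of the conditional probability that
    sequences matching for m points (within tolerance r, Chebyshev distance)
    also match for m + 1 points. Self-matches are excluded."""
    x = np.asarray(signal, dtype=float)
    n = x.size
    if r is None:
        r = 0.2 * np.std(x)                              # common default tolerance

    def match_count(length):
        # one template per admissible start index, same index set for m and m + 1
        t = np.array([x[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)  # Chebyshev distance
            count += np.sum(d <= r)
        return count

    b = match_count(m)        # matches of length m
    a = match_count(m + 1)    # matches of length m + 1
    return float(-np.log(a / b)) if a > 0 and b > 0 else float("inf")
```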

Three-dimensional structural health monitoring based on multiscale cross-sample entropy

  • Lin, Tzu Kang; Tseng, Tzu Chi; Lainez, Ana G.
    • Earthquakes and Structures, v.12 no.6, pp.673-687, 2017
  • A three-dimensional structural health monitoring (SHM) system based on multiscale entropy (MSE) and multiscale cross-sample entropy (MSCE) is proposed in this paper. The damage condition of a structure is rapidly screened through MSE analysis by measuring the ambient vibration signal on the roof of the structure. Subsequently, the vertical damage location is evaluated by analyzing individual signals on different floors through vertical MSCE analysis, and the results are quantified using the vertical damage index (DI). Planar MSCE analysis is then applied to detect the damage orientation of damaged floors by analyzing the biaxial signals in four directions on each damaged floor, and the results are quantified using the planar DI. With these progressive vertical and planar analyses, damaged floors and damage locations can be accurately and efficiently diagnosed. To demonstrate the performance of the proposed system, a performance evaluation was conducted on a three-dimensional seven-story steel structure. According to the results, the damage condition and elevation were reliably detected, and the damage location was efficiently quantified by the DI. Average accuracy rates of 93% (vertical) and 91% (planar) were achieved with the proposed DI method. A reference measurement of the current state is sufficient to launch the SHM system; therefore, structural damage can be reliably detected after major earthquakes.
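The "multiscale" part of both MSE and MSCE is obtained by coarse-graining each measured signal before computing (cross-)sample entropy at every scale. A minimal sketch of that coarse-graining step (standard multiscale-entropy practice; the DI quantification is specific to the paper and not reproduced here) is:

```python
import numpy as np

def coarse_grain(signal, scale):
    """Coarse-grain a signal by averaging non-overlapping windows of length `scale`,
    as done before computing entropy at each scale in multiscale (cross-)sample entropy."""
    x = np.asarray(signal, dtype=float)
    n = (x.size // scale) * scale          # drop the incomplete trailing window
    return x[:n].reshape(-1, scale).mean(axis=1)

# An MSE curve is then {SampEn(coarse_grain(x, s)) for s = 1, 2, ...}, and MSCE
# compares two coarse-grained signals (e.g., from two floors) at each scale.
```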

Predicting the FTSE China A50 Index Movements Using Sample Entropy

  • AKEEL, Hatem
    • The Journal of Asian Finance, Economics and Business, v.9 no.3, pp.1-10, 2022
  • This research proposes a novel trading method based on sample entropy for the FTSE China A50 Index. The approach is used to determine the points at which the index should be bought and sold for various holding durations. The findings are then compared with three other trading strategies: buying and holding the index for the entire period, using the Relative Strength Index (RSI), and using the Moving Average Convergence Divergence (MACD) as buy/sell signaling tools. The entropy-based trading method with 90-day holding periods, denoted StEn(90), produced the highest cumulative return, 25.66 percent, outperforming buy and hold, RSI, and MACD. In fact, when applied to the same periods, RSI and MACD produced negative returns for the FTSE China A50 Index: buy and hold yielded a 6 percent positive return, whereas RSI yielded a 28.56 percent negative return and MACD yielded a 33.33 percent negative return.

Rationale of the Maximum Entropy Probability Density

  • Park, B. S.
    • Journal of the Korean Statistical Society, v.13 no.2, pp.87-106, 1984
  • If $\{X_t\}$ is a sequence of independent, identically distributed normal random variables, then the conditional probability density of $X_1, X_2, \cdots, X_n$ given the first $p+1$ sample autocovariances converges to the maximum entropy probability density satisfying the corresponding covariance constraints as the length of the sample sequence tends to infinity. This establishes that the maximum entropy probability density and the associated Gaussian autoregressive process arise naturally as the answers to conditional limit problems.
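The underlying fact is the classical one that, subject to second-moment constraints, entropy is maximized by a Gaussian law; for a covariance matrix $\Sigma$ the maximizing density and its entropy are (standard results, stated for reference)

$$
f^{*}(x) \;=\; \frac{1}{(2\pi)^{n/2}\,|\Sigma|^{1/2}}\exp\!\Big(-\tfrac{1}{2}\,x^{\top}\Sigma^{-1}x\Big),
\qquad
H(f^{*}) \;=\; \tfrac{1}{2}\log\big((2\pi e)^{n}|\Sigma|\big),
$$

and constraining only the first $p+1$ autocovariances of a stationary sequence leads to the Gaussian AR($p$) process, as the abstract states.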

A Comparison on the Differential Entropy

  • Kim, Dae-Hak
    • Journal of the Korean Data and Information Science Society, v.16 no.3, pp.705-712, 2005
  • Entropy is a basic concept of information theory. It is well defined for random variables with a known probability density function (pdf). For data with an unknown pdf, the entropy must be estimated, and such estimation is usually based on approximations. In this paper, we consider a kernel-based approximation and compare it with the cumulant approximation method for several distributions. Monte Carlo simulations for various sample sizes are conducted.
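As a concrete instance of a kernel-based approximation of the kind compared here, a resubstitution (plug-in) estimate with a Gaussian KDE can be written as follows (this is one common choice, not necessarily the exact estimator used in the paper):

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_entropy(x):
    """Plug-in (resubstitution) estimate of differential entropy:
    fit a Gaussian kernel density to the sample, then average -log f_hat
    over the same sample."""
    kde = gaussian_kde(np.asarray(x, dtype=float))
    return float(-np.mean(np.log(kde(x))))

# Example: for N(0, 1) data the true value is 0.5 * log(2 * pi * e) ~ 1.419 nats.
```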

Sample Entropy in Estimating the Box-Cox Transformation

  • Rahman, Mezbahur; Pearson, Larry M.
    • Journal of the Korean Data and Information Science Society, v.12 no.1, pp.103-125, 2001
  • The Box-Cox transformation is a well-known family of power transformations that brings a set of data into agreement with the normality assumption of the residuals, and hence of the response variable, in a postulated regression model. This paper proposes a new method for estimating the Box-Cox transformation by maximizing the sample entropy statistic, which forces the transformed data as close to normality as possible. A comparative study of the proposed procedure with the maximum likelihood procedure, the procedure via artificial regression estimation, and the recently introduced procedure that maximizes the Shapiro-Francia W' statistic is given. In addition, we provide a table of the optimal spacings parameter for computing the sample entropy statistic.
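For reference, the Box-Cox family and the reason an entropy criterion points toward normality can be summarized as follows (the exact form of the sample entropy statistic maximized in the paper, and its spacings parameter, follow Vasicek's construction and are tabulated there):

$$
x^{(\lambda)} =
\begin{cases}
\dfrac{x^{\lambda}-1}{\lambda}, & \lambda \neq 0,\\[4pt]
\log x, & \lambda = 0,
\end{cases}
\qquad\text{and}\qquad
\frac{e^{H(f)}}{\sigma} \;\le\; \sqrt{2\pi e},
$$

with equality only for the normal distribution, so choosing $\lambda$ to maximize a standardized sample entropy statistic pushes the transformed data toward normality.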
