• Title/Summary/Keyword: minimum entropy

Search Results: 75

Design of the ICMEP Algorithm for the Highly Efficient Entropy Encoding (고효율 엔트로피 부호화를 위한 ICMEP 알고리즘 설계)

  • 이선근;임순자;김환용
    • Journal of the Institute of Electronics Engineers of Korea SD / v.41 no.4 / pp.75-82 / 2004
  • The channel transmission rate is increased by combining the Huffman algorithm, a model that provides minimum average code lengths for image information and good instantaneous decoding capability, with the Lempel-Ziv algorithm, which offers fast processing during compression. To further raise the processing speed of the compression stage, the ICMEP algorithm is proposed and an entropy encoder for HDTV is designed and verified. The ICMEP entropy encoder was designed top-down, with the source code and test benches written as behavioral descriptions in VHDL. Simulation results confirm that the implemented ICMEP entropy encoder improves overall system efficiency by preventing memory saturation and increasing the compression ratio.
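
Since the abstract leans on the minimum-average-code-length property of Huffman coding, the short Python sketch below shows a plain Huffman code construction. It only illustrates that property; it is not the paper's ICMEP design or its VHDL implementation, and the function name and sample string are invented for the example.

```python
import heapq
from collections import Counter

def huffman_code(symbols):
    """Build a prefix-free code with minimum average code length for the given symbols."""
    freq = Counter(symbols)
    if len(freq) == 1:                      # degenerate single-symbol input
        return {s: "0" for s in freq}
    # Heap entries: (cumulative frequency, tie-breaker, {symbol: partial code}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)     # merge the two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

print(huffman_code("entropy encoding"))     # more frequent letters get shorter codes
```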

Accurate Detection of a Defective Area by Adopting a Divide and Conquer Strategy in Infrared Thermal Imaging Measurement

  • Jiangfei, Wang;Lihua, Yuan;Zhengguang, Zhu;Mingyuan, Yuan
    • Journal of the Korean Physical Society / v.73 no.11 / pp.1644-1649 / 2018
  • Aiming at infrared thermal images with defects at different buried depths, we study a variety of threshold-based image segmentation algorithms with respect to their global search ability and their ability to find the defect area accurately. First, the iterative thresholding method, the maximum entropy method, the minimum error method, the Otsu method, and the minimum skewness method are applied to segment the same infrared thermal image. The study shows that the maximum entropy method and the minimum error method have strong global search capability and can simultaneously extract defects at different depths. However, none of these five methods can accurately calculate the defect area at different depths. To solve this problem, we put forward a "divide and conquer" strategy: the infrared thermal image is divided into several local thermal maps, each containing only one defect, and the defect area is calculated after local image processing of the buried defects one by one. The results show that, under the "divide and conquer" strategy, the iterative thresholding method and the Otsu method have the advantage of high precision and can accurately extract the area of defects at different depths, with an error of less than 5%.
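
As a point of reference for the thresholding methods compared above, here is a short, self-contained sketch of Otsu's method applied to a synthetic "thermal" image. It is an illustrative re-implementation in Python, not the authors' code, and the synthetic data are invented for the example.

```python
import numpy as np

def otsu_threshold(image):
    """Return the gray level that maximizes the between-class variance."""
    hist, _ = np.histogram(image.ravel(), bins=256, range=(0, 256))
    p = hist / hist.sum()                    # normalized histogram
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * p[:t]).sum() / w0
        mu1 = (levels[t:] * p[t:]).sum() / w1
        between_var = w0 * w1 * (mu0 - mu1) ** 2
        if between_var > best_var:
            best_t, best_var = t, between_var
    return best_t

# Synthetic two-level "thermal" image with noise: background near 80, defect near 170.
img = np.clip(np.r_[np.random.normal(80, 10, 5000),
                    np.random.normal(170, 10, 5000)], 0, 255)
print(otsu_threshold(img))
```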

Time Series Estimation of Relative Groundwater Recharge Using Minimum Entropy Deconvolution (Minimum Entropy Deconvolution을 이용한 지하수 상대 재충진양의 시계열 추정법)

  • 김태희;이강근
    • Proceedings of the Korean Society of Soil and Groundwater Environment Conference / 2003.09a / pp.574-578 / 2003
  • There are many methods to estimate groundwater recharge, and they can be categorized into four groups. The first group is related to water balance analysis, the second is concerned with baseflow/springflow recession, and the third uses tracers such as environmental tracers and/or temperature profiles. The limitation of these methods is that the estimated recharge is presented as an average over some time period. The fourth group takes a different approach: it uses time series data of hydraulic head together with a specific yield evaluated from field tests, and the estimates are produced in sequential form. However, this approach has a serious problem: the results are generally underestimated because the discharge phase of water table fluctuation, coupled with the recharge phase, is not considered. Ketchum et al. (2000) proposed a calibrated method that considers recharge- and discharge-coupled water table fluctuation, but discharge is treated only as an areal average discharge rate. On the other hand, many methods exist in geophysics/signal processing for estimating a source wavelet from observed data, yet such geophysical methods are rarely applied to the estimation of groundwater recharge. The purpose of this study is to evaluate the applicability of one such geophysical method, minimum entropy deconvolution (MED), to the estimation of a sequential recharge rate. For this purpose, numerical modeling with the linearized Boussinesq equation was applied. Using hydraulic heads synthesized by the numerical model, the relative sequence of recharge is calculated inversely. The estimated results agree closely with the applied recharge sequence: cross-correlations between the applied and estimated sequences are above 0.985 in all study cases. Through this numerical test, the availability of MED for estimating the groundwater recharge sequence was demonstrated.
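
For readers unfamiliar with MED itself, the sketch below implements the classic Wiggins-style iteration, which looks for a filter that maximizes the varimax (spikiness) of its output. The toy "hydraulic head" series and all names are invented, and the sketch is not the authors' Boussinesq-based workflow.

```python
import numpy as np
from scipy.linalg import toeplitz, solve

def med(x, flen=21, n_iter=30):
    """Wiggins-style minimum entropy deconvolution of the series x."""
    n = len(x)
    r = np.correlate(x, x, mode="full")[n - 1:n - 1 + flen]   # autocorrelation lags 0..flen-1
    R = toeplitz(r) + 1e-8 * np.eye(flen)                     # small ridge for stability
    f = np.zeros(flen)
    f[flen // 2] = 1.0                                        # start from a delayed spike
    for _ in range(n_iter):
        y = np.convolve(f, x)                                 # current filter output
        g = np.correlate(y ** 3, x, mode="valid")             # cross-correlation of y^3 with x
        f = solve(R, g)
        f /= np.linalg.norm(f)
    return f, np.convolve(f, x)

# Toy use: sharpen a spiky "recharge" series smeared by an exponential response.
rng = np.random.default_rng(0)
recharge = np.zeros(300)
recharge[rng.choice(300, 8, replace=False)] = rng.uniform(0.5, 1.0, 8)
head = np.convolve(recharge, np.exp(-np.arange(40) / 10.0))[:300]
f, estimate = med(head)
```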

The Applicability of Minimum Entropy Deconvolution Considering Spatial Distribution of Sampling Points (지하수 함양량 추정시 공간상에서의 자료 Sampling 방법에 따른 Minimum Entropy Deconvolution의 적용성에 관한 검토)

  • Kim Tae-Hee;Kim Yong-Je;Lee Kang-Keun
    • Journal of Soil and Groundwater Environment / v.11 no.3 / pp.52-58 / 2006
  • Kim and Lee (2005) suggested Minimum Entropy Deconvolution (MED) for estimating the temporal sequence of relative recharge, but that study only verified the conceptual approach of MED. In this study, we characterize the applicability of MED in the case of spatially heterogeneous recharge (i.e., varying distance from the recharge area). Simulated results were recorded at specific sampling points. The estimates from this study show cross-correlations higher than 0.8 with the original recharge sequence. In addition, the physical and mathematical meaning of the applied filter length was investigated. It was revealed that the filter length is strongly related to the spatial distance between the recharge area and the monitoring site, and to the apparent shape of the hydraulic head change.
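
The evaluation reported above compares the estimate against the original recharge sequence by cross-correlation. A small, hedged sketch of that comparison (with invented data and names) is given below.

```python
import numpy as np

def max_normalized_xcorr(a, b):
    """Peak of the normalized cross-correlation over all lags (1.0 = perfect match)."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.max(np.correlate(a, b, mode="full"))

applied = np.random.default_rng(1).random(200)
estimated = np.roll(applied, 3) + 0.05 * np.random.default_rng(2).normal(size=200)
print(max_normalized_xcorr(applied, estimated))   # close to 1 for a good estimate
```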

A Goodness of Fit Tests Based on the Partial Kullback-Leibler Information with the Type II Censored Data

  • Park, Sang-Un;Lim, Jong-Gun
    • Proceedings of the Korean Statistical Society Conference / 2003.10a / pp.233-238 / 2003
  • Goodness of fit test statistics based on information discrepancy have been shown to perform very well (Vasicek 1976; Dudewicz and van der Meulen 1981; Chandra et al. 1982; Gokhale 1983; Arizono and Ohta 1989; Ebrahimi et al. 1992, etc.). Although the test is well defined for the non-censored case, the censored case has not been discussed in the literature. We therefore consider a goodness of fit test based on the partial Kullback-Leibler (KL) information with Type II censored data. We derive the partial KL information between the null distribution function and a nonparametric distribution function, and establish a goodness of fit test statistic. We consider the exponential and normal distributions and perform Monte Carlo simulations to compare the test statistic with some existing tests.
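
The entropy-difference idea behind these tests is easy to state for the uncensored case: estimate the sample entropy with spacings and compare it with the entropy of the fitted null model. The sketch below does this for the normal distribution; it is a hedged illustration of the classical full-sample statistic, not the paper's partial-KL version for Type II censored data, and all function names are invented.

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Spacing-based estimate of differential entropy (Vasicek-type estimator)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    m = m or max(1, int(round(np.sqrt(n))))
    lo = np.clip(np.arange(n) - m, 0, n - 1)     # window edges, clamped at the ends
    hi = np.clip(np.arange(n) + m, 0, n - 1)
    return np.mean(np.log(n * (x[hi] - x[lo]) / (2 * m)))

def kl_to_normal(x):
    """Estimated KL divergence from the data to a fitted normal distribution."""
    s2 = np.var(x)
    return 0.5 * np.log(2 * np.pi * np.e * s2) - vasicek_entropy(x)

x = np.random.default_rng(3).normal(size=200)
print(kl_to_normal(x))     # near 0 for normal data, larger for a poor fit
```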

Shadow Removal Based on Chromaticity and Entropy for Efficient Moving Object Tracking (효과적인 이동물체 추적을 위한 색도 영상과 엔트로피 기반의 그림자 제거)

  • Park, Ki-Hong
    • Journal of Advanced Navigation Technology / v.18 no.4 / pp.387-392 / 2014
  • Recently, various studies of intelligent video surveillance systems have been proposed, but existing monitoring systems are inefficient because all situational awareness is judged by a human operator. In this paper, a shadow-removal-based moving object tracking method using chromaticity and entropy images is proposed. A background subtraction model, effective in context-aware environments, is applied for moving object detection. After detecting the region of a moving object, the shadow candidate region is estimated and removed using RGB-based chromaticity and minimum cross entropy images. To validate the proposed method, highway video is used in the experiments. The results show that shadow removal and moving object tracking are performed well.
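
The minimum cross entropy criterion mentioned above can be written down compactly for a gray-level histogram. The sketch below follows Li's single-channel formulation; the chromaticity step from the paper is omitted, and the function name and test image are invented for the example.

```python
import numpy as np

def min_cross_entropy_threshold(image):
    """Return the threshold minimizing the cross entropy between the image and its binarization."""
    hist, _ = np.histogram(image.ravel(), bins=256, range=(0, 256))
    g = np.arange(1, 257, dtype=float)         # shift gray levels by 1 to avoid log(0)
    h = hist.astype(float)
    best_t, best_eta = 1, np.inf
    for t in range(1, 256):
        w0, w1 = h[:t].sum(), h[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (g[:t] * h[:t]).sum() / w0       # mean gray level of each side
        mu1 = (g[t:] * h[t:]).sum() / w1
        eta = -(g[:t] * h[:t]).sum() * np.log(mu0) - (g[t:] * h[t:]).sum() * np.log(mu1)
        if eta < best_eta:
            best_t, best_eta = t, eta
    return best_t - 1                          # undo the level shift

img = np.clip(np.random.default_rng(6).normal(120, 40, (64, 64)), 0, 255)
print(min_cross_entropy_threshold(img))
```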

Adaptive Multi-class Segmentation Model of Aggregate Image Based on Improved Sparrow Search Algorithm

  • Mengfei Wang;Weixing Wang;Sheng Feng;Limin Li
    • KSII Transactions on Internet and Information Systems (TIIS) / v.17 no.2 / pp.391-411 / 2023
  • Aggregates play a skeletal and supporting role in construction, so high-precision measurement and high-efficiency analysis of aggregates are frequently employed to evaluate project quality. To address the imbalance between running time and segmentation accuracy in multi-class segmentation of aggregate images, a Chaotic Sparrow Search Algorithm (CSSA) is put forward. In this algorithm, a chaotic map is combined with a sinusoidal dynamic weight and elite mutation strategies, improving the SSA's optimization accuracy and stability without reducing its speed. The CSSA is used to optimize a popular multi-class segmentation approach, Multiple Entropy Thresholding (MET). Taking three METs as objective functions, i.e., Kapur entropy, minimum cross entropy, and Renyi entropy, the CSSA quickly and automatically finds the extreme value of each function and the corresponding thresholds. The resulting adaptive multi-class segmentation model is called CSSA-MET. To evaluate it comprehensively, a new index I based on segmentation accuracy and processing speed is constructed. The results reveal that the CSSA outperforms the other seven optimization methods, that the aggregate images segmented by CSSA-MET are of good quality, and that speed and accuracy are balanced. In particular, the highest I value is obtained when the CSSA is applied to optimize the Renyi entropy, indicating that this combination is more suitable for segmenting aggregate images.
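
To make the MET objective concrete, the sketch below evaluates the Kapur-entropy criterion for a pair of thresholds and picks the best pair by a coarse brute-force search. The chaotic sparrow search itself is not reproduced here, and all names and the random test image are invented for the example.

```python
import numpy as np
from itertools import combinations

def kapur_entropy(p, thresholds):
    """Sum of Shannon entropies of the histogram classes induced by the thresholds."""
    edges = [0, *sorted(thresholds), len(p)]
    total = 0.0
    for a, b in zip(edges[:-1], edges[1:]):
        w = p[a:b].sum()
        if w <= 0:
            return -np.inf                   # an empty class makes the split invalid
        q = p[a:b][p[a:b] > 0] / w
        total += -(q * np.log(q)).sum()
    return total

img = np.random.default_rng(4).integers(0, 256, size=(64, 64))
hist, _ = np.histogram(img, bins=256, range=(0, 256))
p = hist / hist.sum()
# Coarse stand-in for the optimizer: try threshold pairs on an 8-level grid.
best = max(combinations(range(8, 248, 8), 2), key=lambda t: kapur_entropy(p, t))
print(best)
```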

Enhancing seismic reflection signal (탄성파 반사 신호 향상)

  • Hien, D.H.;Jang, Seong-Hyung;Kim, Young-Wan;Suh, Sang-Yong
    • Proceedings of the Korean Society for New and Renewable Energy Conference / 2008.05a / pp.606-609 / 2008
  • Deconvolution is one of the most widely used techniques for processing seismic reflection data. It is applied to improve temporal resolution through wavelet shaping and removal of short-period reverberations. Several deconvolution algorithms, such as predictive, spiking, and minimum entropy deconvolution, have been proposed for these purposes. Among them, the $\ell_1$-norm method proposed by Taylor et al. (1979) and compared with minimum entropy deconvolution by Sacchi et al. (1994) offers advantages in computing time and efficiency. Theoretically, deconvolution can be considered an inversion technique that inverts a single seismic trace for the reflectivity, but it has not been widely successful because real data are noisy and the source wavelet is unknown. After stacking, the seismic traces are moved to zero offset, so each trace can be treated as a single trace created by convolving the seismic source wavelet with the reflectivity. In this paper, the fundamentals of the $\ell_1$-norm deconvolution method are introduced. The method is tested on synthetic data and applied to improve a stacked section from a gas hydrate survey.
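
As a rough illustration of the $\ell_1$-norm idea, the sketch below builds a synthetic trace by convolving a Ricker wavelet with a sparse reflectivity and then recovers the reflectivity with a few iterations of an $\ell_1$-regularized least-squares (ISTA) solve. Taylor et al. (1979) use a different, linear-programming formulation, and the wavelet here is assumed known, so this is only a hedged toy version with invented names.

```python
import numpy as np

def ricker(n=64, f=25.0, dt=0.002):
    """Ricker wavelet, a common synthetic seismic source."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

def l1_deconv(trace, wavelet, lam=0.05, n_iter=500):
    """Minimize 0.5*||W r - trace||^2 + lam*||r||_1 by ISTA (soft-thresholded gradient steps)."""
    n = len(trace)
    padded = np.pad(wavelet, (0, n - len(wavelet)))
    # Circulant convolution matrix whose column k is the wavelet centered at sample k.
    W = np.column_stack([np.roll(padded, k - len(wavelet) // 2) for k in range(n)])
    step = 1.0 / np.linalg.norm(W, 2) ** 2            # 1 / Lipschitz constant of the gradient
    r = np.zeros(n)
    for _ in range(n_iter):
        z = r - step * (W.T @ (W @ r - trace))        # gradient step on the misfit
        r = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)   # soft threshold (l1 proximal)
    return r

refl = np.zeros(256)
refl[[40, 90, 150, 200]] = [1.0, -0.6, 0.8, -0.4]
trace = np.convolve(refl, ricker(), mode="same")
trace += 0.01 * np.random.default_rng(5).normal(size=trace.size)
estimate = l1_deconv(trace, ricker())
```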

Clustering Algorithm for Data Mining using Posterior Probability-based Information Entropy (데이터마이닝을 위한 사후확률 정보엔트로피 기반 군집화알고리즘)

  • Park, In-Kyoo
    • Journal of Digital Convergence
    • /
    • v.12 no.12
    • /
    • pp.293-301
    • /
    • 2014
  • In this paper, we propose a new measure based on the confidence of the Bayesian posterior probability so as to reduce unimportant information in the clustering process. Because clustering performance depends on selecting the important attributes within the database, the concept of information entropy is added to the posterior probability to assess attribute discernibility. Hence, for attributes with identical values, the confidence given by the proposed measure is considerably lower due to the natural logarithm. The posterior probability-based clustering algorithm therefore selects a minimal attribute reduct and improves clustering efficiency. Comparative validation against other algorithms shows the proposed method's discernibility as well as its ability to handle uncertainty in clustering the ACME categorical data.
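
A small, generic sketch of the kind of quantity the abstract combines, the entropy of the posterior P(class | attribute value) averaged over values and used to rank attributes, is shown below. It is an illustrative reading in Python, not the paper's exact measure, and the toy data are invented.

```python
import numpy as np
from collections import Counter

def posterior_entropy(attribute_values, classes):
    """Average entropy of P(class | value); lower means the attribute discerns classes better."""
    n = len(classes)
    total = 0.0
    for v in set(attribute_values):
        idx = [i for i, a in enumerate(attribute_values) if a == v]
        post = np.array(list(Counter(classes[i] for i in idx).values()), dtype=float)
        post /= post.sum()                         # posterior over classes given this value
        total += (len(idx) / n) * -(post * np.log(post)).sum()
    return total

colors = ["red", "red", "blue", "blue", "red", "blue"]
labels = ["A", "A", "B", "B", "A", "A"]
print(posterior_entropy(colors, labels))
```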

Creation of Approximate Rules based on Posterior Probability (사후확률에 기반한 근사 규칙의 생성)

  • Park, In-Kyu;Choi, Gyoo-Seok
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.15 no.5
    • /
    • pp.69-74
    • /
    • 2015
  • In this paper, the patterns of an information system are reduced so that control rules can guarantee fast response to queries in a database. Generally, an information system includes both necessary and unnecessary attributes; in particular, an inconsistent information system is less likely to yield accurate responses. Hence, we are interested in simple and understandable rules that can represent useful patterns by means of rough entropy and Bayesian posterior probability. We propose an algorithm which reduces control rules to a minimum without inadequate patterns, where the implication between condition attributes and decision attributes is measured through the framework of rough entropy. Subsequently, the validity of the proposed algorithm is shown using a test information system of new employee appointments.
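
In the same hedged spirit, the sketch below extracts decision rules from a toy decision table and keeps only those whose confidence (the posterior probability of the decision given the condition values) clears a threshold. The paper's rough-entropy measure itself is not reproduced, and all names and data in the example are invented.

```python
from collections import Counter, defaultdict

def approximate_rules(rows, condition_keys, decision_key, min_confidence=0.8):
    """Return rules (condition values -> decision) whose confidence exceeds the threshold."""
    groups = defaultdict(list)
    for row in rows:
        cond = tuple(row[k] for k in condition_keys)
        groups[cond].append(row[decision_key])
    rules = []
    for cond, decisions in groups.items():
        decision, count = Counter(decisions).most_common(1)[0]
        confidence = count / len(decisions)        # posterior P(decision | condition)
        if confidence >= min_confidence:
            rules.append((dict(zip(condition_keys, cond)), decision, confidence))
    return rules

table = [
    {"degree": "MSc", "experience": "high", "hired": "yes"},
    {"degree": "MSc", "experience": "high", "hired": "yes"},
    {"degree": "BSc", "experience": "low",  "hired": "no"},
    {"degree": "BSc", "experience": "low",  "hired": "yes"},
]
print(approximate_rules(table, ["degree", "experience"], "hired"))
```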