Title/Summary/Keyword: Kullback-Leibler divergence


Analysis of Large Tables (대규모 분할표 분석)

  • Choi, Hyun-Jip
    • The Korean Journal of Applied Statistics / v.18 no.2 / pp.395-410 / 2005
  • For the analysis of large tables formed by many categorical variables, we suggest a method to group the variables into several disjoint groups such that the variables within each group are completely associated. We use a simple function of the Kullback-Leibler divergence as a similarity measure to find the groups. Since the groups are complete hierarchical sets, we can identify the association structure of the large tables by marginal log-linear models. Examples are introduced to illustrate the suggested method.
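
The paper's exact similarity function is not reproduced in the abstract; one simple function of the Kullback-Leibler divergence that measures association between two categorical variables is mutual information (the KL divergence between the joint table and the independence model), sketched here purely as an illustration:

```python
import numpy as np

def mutual_information(counts):
    """Mutual information of a two-way table: the KL divergence between
    the joint distribution and the product of its marginals."""
    p = counts / counts.sum()
    px = p.sum(axis=1, keepdims=True)   # row marginals, shape (r, 1)
    py = p.sum(axis=0, keepdims=True)   # column marginals, shape (1, c)
    indep = px @ py                     # independence model, shape (r, c)
    mask = p > 0
    return float((p[mask] * np.log(p[mask] / indep[mask])).sum())

# A strongly associated table scores higher than a near-independent one,
# so thresholding a value like this can group variables by association.
strong = np.array([[40.0, 5.0], [5.0, 50.0]])
weak = np.array([[25.0, 25.0], [24.0, 26.0]])
print(mutual_information(strong), mutual_information(weak))
```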

A study on the active sonar reverberation suppression method based on non-negative matrix factorization with beta-divergence function (베타-발산 함수를 활용한 비음수 행렬 분해 기반의 능동 소나 잔향 제거 기법에 대한 연구)

  • Seokjin Lee;Geunhwan Kim
    • The Journal of the Acoustical Society of Korea / v.43 no.4 / pp.369-382 / 2024
  • To suppress reverberation in active sonar systems, non-negative matrix factorization-based reverberation suppression methods have been researched recently. An estimation loss function, which makes the product of the basis matrices match the input signals, has to be chosen when designing non-negative matrix factorization methods, but the conventional method simply adopts the Kullback-Leibler divergence as the loss function without further consideration. In this paper, we examined whether the Kullback-Leibler divergence is the best loss function or whether another loss function can enhance performance. First, we derived a modified reverberation suppression algorithm using the generalized beta-divergence function, which includes the Kullback-Leibler divergence as a special case. Then, we performed Monte Carlo simulations with synthesized reverberation for the modified reverberation suppression method. The results showed that the Kullback-Leibler divergence (β = 1) performs well in high signal-to-reverberation environments, but the intermediate function (β = 1.25) between the Kullback-Leibler divergence and the Euclidean distance performs better in low signal-to-reverberation environments.
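
For context, a hedged sketch of the generic machinery rather than the paper's sonar-specific algorithm: the standard multiplicative updates for NMF under the generalized beta-divergence, where β = 1 recovers the Kullback-Leibler case and β = 2 the Euclidean case (function and parameter names here are illustrative):

```python
import numpy as np

def nmf_beta(V, rank, beta=1.25, n_iter=200, eps=1e-9, seed=0):
    """NMF with the beta-divergence via the standard multiplicative
    updates. beta=1 is Kullback-Leibler, beta=2 is Euclidean; the
    paper's intermediate case corresponds to beta=1.25."""
    rng = np.random.default_rng(seed)
    F, T = V.shape
    W = rng.random((F, rank)) + eps
    H = rng.random((rank, T)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (WH ** (beta - 2) * V)) / (W.T @ WH ** (beta - 1))
        WH = W @ H + eps
        W *= ((WH ** (beta - 2) * V) @ H.T) / (WH ** (beta - 1) @ H.T)
    return W, H

# Toy magnitude-spectrogram-like input; the beta-divergence between V
# and W @ H decreases over the iterations.
V = np.abs(np.random.default_rng(1).random((64, 128)))
W, H = nmf_beta(V, rank=8)
```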

Code-Reuse Attack Detection Using Kullback-Leibler Divergence in IoT

  • Ho, Jun-Won
    • International journal of advanced smart convergence / v.5 no.4 / pp.54-56 / 2016
  • Code-reuse attacks are very dangerous to various systems because they do not inject malicious code into target systems but instead reuse instruction sequences already present in executable files or libraries of the target systems. Moreover, code-reuse attacks can be more harmful to IoT systems in the sense that it may not be easy to devise an efficient and effective detection mechanism for resource-restricted IoT devices. In this paper, we propose a detection scheme using the Kullback-Leibler (KL) divergence to combat code-reuse attacks in IoT. Specifically, we detect code-reuse attacks by computing the KL divergence between the probability distribution of code-region memory addresses in packets generated by IoT devices and the corresponding distribution in packets arriving at IoT devices, and checking whether the computed KL divergence is abnormal.
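
A minimal sketch of the divergence test the abstract describes, with hypothetical histograms and threshold (the paper does not specify binning, smoothing, or the threshold, so those are assumptions here):

```python
import numpy as np

def kl_divergence(p_counts, q_counts, alpha=1.0):
    """KL(P || Q) between two empirical histograms, with additive
    (Laplace) smoothing so unseen bins do not yield infinities."""
    p = (p_counts + alpha) / (p_counts + alpha).sum()
    q = (q_counts + alpha) / (q_counts + alpha).sum()
    return float((p * np.log(p / q)).sum())

# Hypothetical histograms over binned code-region addresses: packets
# generated by the device vs. packets arriving at the device.
outgoing = np.array([120, 80, 60, 30, 10.0])
incoming = np.array([5, 10, 15, 90, 200.0])
THRESHOLD = 0.5  # illustrative; a real deployment would calibrate this
if kl_divergence(incoming, outgoing) > THRESHOLD:
    print("possible code-reuse attack")
```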

SOME INEQUALITIES FOR THE CSISZÁR Φ-DIVERGENCE

  • Dragomir, S.S.
    • Journal of the Korean Society for Industrial and Applied Mathematics / v.7 no.1 / pp.63-77 / 2003
  • Some inequalities for the Csiszár Φ-divergence and applications for the Kullback-Leibler, Rényi, Hellinger and Bhattacharyya distances in Information Theory are given.
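
For context, the Csiszár Φ-divergence has the standard form below; the listed choices of Φ, which recover some of the distances named in the abstract, are standard facts rather than content quoted from the paper:

```latex
% Csiszár Φ-divergence of densities p, q, for convex Φ with Φ(1) = 0:
D_{\Phi}(p \,\|\, q) = \int q(x)\, \Phi\!\left(\frac{p(x)}{q(x)}\right) dx
% Standard specializations:
%   Φ(t) = t \log t          → Kullback-Leibler divergence
%   Φ(t) = (\sqrt{t} - 1)^2  → squared Hellinger distance
%   Φ(t) = 1 - \sqrt{t}      → 1 − Bhattacharyya coefficient
```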


A Kullback-Leibler divergence based comparison of approximate Bayesian estimations of ARMA models

  • Amin, Ayman A
    • Communications for Statistical Applications and Methods / v.29 no.4 / pp.471-486 / 2022
  • Autoregressive moving average (ARMA) models involve nonlinearity in the model coefficients because of unobserved lagged errors, which complicates the likelihood function and makes the posterior density analytically intractable. To overcome this problem of posterior analysis, several approximation methods have been proposed in the literature. In this paper we first review the main analytic approximations proposed to make the posterior density of ARMA models analytically tractable, namely the Newbold, Zellner-Reynolds, and Broemeling-Shaarawy approximations. We then use the Kullback-Leibler divergence to study the relations among these three analytic approximations and to measure the distance between their derived approximate posteriors for ARMA models. In addition, we evaluate the impact of the distance between the approximate posteriors on the Bayesian estimates of the mean and precision of the model coefficients by generating a large number of Monte Carlo simulations from the approximate posteriors. The simulation study shows that the approximate posteriors of Newbold and Zellner-Reynolds are very close to each other, and their estimates have higher precision than those of the Broemeling-Shaarawy approximation. The same results are obtained in the application to real-world time series datasets.
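
As a hedged illustration of how such a distance can be computed (the estimator and names below are ours, not the paper's), the KL divergence between two posteriors can be estimated by Monte Carlo when it has no convenient closed form:

```python
import numpy as np
from scipy.stats import norm

def kl_monte_carlo(log_p, log_q, sampler_p, n=100_000, seed=0):
    """Monte Carlo estimate of KL(P || Q) = E_P[log p(X) - log q(X)],
    given log-density functions and a sampler for P."""
    rng = np.random.default_rng(seed)
    x = sampler_p(rng, n)
    return float(np.mean(log_p(x) - log_q(x)))

# Illustration with two univariate normal "approximate posteriors";
# with equal unit variances the exact value is 0.5 * (mu0 - mu1)^2.
est = kl_monte_carlo(norm(0, 1).logpdf, norm(0.5, 1).logpdf,
                     lambda rng, n: rng.normal(0.0, 1.0, n))
print(est)  # ≈ 0.125
```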

Malicious User Suppression Based on Kullback-Leibler Divergence for Cognitive Radio

  • Van, Hiep-Vu;Koo, In-Soo
    • KSII Transactions on Internet and Information Systems (TIIS) / v.5 no.6 / pp.1133-1146 / 2011
  • Cognitive radio (CR) is considered one of the most promising next-generation communication systems; it has the ability to sense and make use of vacant channels that are unused by licensed users. Reliable detection of the licensed users' signals is an essential element for a CR network. Cooperative spectrum sensing (CSS) is able to offer better sensing performance as compared to individual sensing. The presence of malicious users who falsify sensing data can severely degrade the sensing performance of the CSS scheme. In this paper, we investigate a secure CSS scheme, based on the Kullback-Leibler Divergence (KL-divergence) theory, in order to identify malicious users and mitigate their harmful effect on the sensing performance of CSS in a CR network. The simulation results prove the effectiveness of the proposed scheme.
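
A minimal sketch of KL-based outlier screening in this spirit, assuming each user reports a sequence of sensing energies; the statistic, binning, and threshold are illustrative, not the paper's scheme:

```python
import numpy as np

def flag_malicious(reports, bins=20, threshold=0.3):
    """Flag users whose reported-energy histogram diverges from the
    pooled histogram of all users, measured by KL divergence."""
    pooled = np.concatenate(list(reports.values()))
    edges = np.histogram_bin_edges(pooled, bins=bins)
    q = np.histogram(pooled, bins=edges)[0] + 1.0  # smoothed pooled hist
    q = q / q.sum()
    flagged = []
    for user, obs in reports.items():
        p = np.histogram(obs, bins=edges)[0] + 1.0
        p = p / p.sum()
        if (p * np.log(p / q)).sum() > threshold:
            flagged.append(user)
    return flagged
```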

DIRECTIONAL LOG-DENSITY ESTIMATION

  • Huh, Jib;Kim, Peter T.;Koo, Ja-Yong;Park, Jin-Ho
    • Journal of the Korean Statistical Society / v.33 no.3 / pp.255-269 / 2004
  • This paper develops log-density estimation for directional data. The methodology is to use expansions with respect to spherical harmonics followed by estimating the unknown parameters by maximum likelihood. Minimax rates of convergence in terms of the Kullback-Leibler information divergence are obtained.
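
For orientation, a sketch of the general form such an expansion takes (our notation, not a quotation of the paper):

```latex
% Log-density expansion in spherical harmonics Y_{lm} on the sphere,
% with ψ(θ) normalizing the density:
\log f_\theta(\omega)
  = \sum_{l=1}^{L} \sum_{m=-l}^{l} \theta_{lm}\, Y_{lm}(\omega) - \psi(\theta),
\qquad
\psi(\theta)
  = \log \int_{S^2} \exp\Bigl(\textstyle\sum_{l,m} \theta_{lm} Y_{lm}(\omega)\Bigr)\, d\omega
% Estimation error is measured by the Kullback-Leibler divergence
% K(f, \hat f) = \int_{S^2} f \log(f / \hat f)\, d\omega.
```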

A study on the performance improvement of learning based on consistency regularization and unlabeled data augmentation (일치성규칙과 목표값이 없는 데이터 증대를 이용하는 학습의 성능 향상 방법에 관한 연구)

  • Kim, Hyunwoong;Seok, Kyungha
    • The Korean Journal of Applied Statistics / v.34 no.2 / pp.167-175 / 2021
  • Semi-supervised learning uses both labeled and unlabeled data. Consistency regularization has recently become very popular in semi-supervised learning, and unsupervised data augmentation (UDA), which augments unlabeled data, is also based on consistency regularization. In UDA training, the Kullback-Leibler divergence is used as the loss for unlabeled data and cross-entropy as the loss for labeled data. UDA also employs techniques such as training signal annealing (TSA) and confidence-based masking to improve performance. In this study, we propose using the Jensen-Shannon divergence instead of the Kullback-Leibler divergence, reversing TSA, and dropping confidence-based masking. Through experiments, we show that the proposed techniques yield better performance than UDA.
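
A minimal PyTorch sketch of the proposed substitution, assuming the consistency target is the model's prediction on the un-augmented example; function and variable names are ours:

```python
import torch
import torch.nn.functional as F

def js_consistency_loss(logits_orig, logits_aug):
    """Jensen-Shannon divergence between the model's predictions on the
    original and augmented unlabeled examples, used in place of the
    KL consistency term: JS(p, q) = 0.5*KL(p||m) + 0.5*KL(q||m)."""
    p = F.softmax(logits_orig, dim=-1)
    q = F.softmax(logits_aug, dim=-1)
    m = 0.5 * (p + q)
    # F.kl_div(input, target) computes KL(target || input), where input
    # is given as log-probabilities.
    kl_pm = F.kl_div(m.log(), p, reduction="batchmean")
    kl_qm = F.kl_div(m.log(), q, reduction="batchmean")
    return 0.5 * (kl_pm + kl_qm)
```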

On a Balanced Classification Rule

  • Kim, Hea-Jung
    • Journal of the Korean Statistical Society / v.24 no.2 / pp.453-470 / 1995
  • We describe a constrained optimal classification rule for the case when the prior probability of an observation belonging to one of the two populations is unknown. This is done by suggesting a balanced design for the classification experiment and constructing the optimal rule under the balanced design condition. The rule is characterized by a constrained minimization of the total risk of misclassification; the constraint is constructed by equating the Kullback-Leibler directed divergence measures obtained from the two population conditional densities. The efficacy of the suggested rule is examined through two-group normal classification, which indicates that, when little is known about the relative population sizes, dramatic gains in classification accuracy can be achieved.
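
For reference, the two directed divergence measures the abstract refers to have the standard forms below; treating the constraint as equating quantities built from them is our reading of the abstract, not a quotation of the paper:

```latex
% Directed Kullback-Leibler divergences between the population
% conditional densities f_1 and f_2:
I(1\!:\!2) = \int f_1(x)\,\log\frac{f_1(x)}{f_2(x)}\,dx,
\qquad
I(2\!:\!1) = \int f_2(x)\,\log\frac{f_2(x)}{f_1(x)}\,dx
```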


Visualizing a Multi-Dimensional Data Set in a Lower Dimensional Space (저차원 영역에서 고차원 데이터 집합의 표현 방법)

  • Dong-Hun Seo;Kolesnikova Anastasiya;Won Don Lee
    • Proceedings of the Korea Information Processing Society Conference / 2008.11a / pp.40-43 / 2008
  • In this paper, we propose a method for representing a high-dimensional data set in a lower-dimensional space; in particular, we experimented with mapping to a two-dimensional space. The proposed method lets a person intuitively perceive the distances and relationships between data objects. The Kullback-Leibler divergence is used to compute these distances and relationships; it measures the distance between vectors that carry probability distributions. The distance values computed with the Kullback-Leibler divergence are then used to compute the coordinates of the objects in the lower-dimensional space, using an optimization technique called simulated annealing. The experimental results show that the two-dimensional representation of the multi-dimensional data is sufficiently intuitive.
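
A rough sketch of the pipeline under stated assumptions: D is a precomputed distance matrix (KL divergence is asymmetric, so a symmetrized version such as (KL(p‖q) + KL(q‖p))/2 would typically be used), and a simple simulated-annealing loop places the objects in 2-D so that Euclidean distances approximate D; all names and the cooling schedule are illustrative:

```python
import numpy as np

def embed_2d(D, n_iter=20_000, t0=1.0, seed=0):
    """Place n objects in 2-D by simulated annealing so that pairwise
    Euclidean distances approximate the given distance matrix D."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    X = rng.normal(size=(n, 2))

    def stress(X):
        diff = X[:, None, :] - X[None, :, :]
        E = np.sqrt((diff ** 2).sum(-1))      # current pairwise distances
        return ((E - D) ** 2).sum()

    cur = stress(X)
    for it in range(n_iter):
        t = t0 * (1 - it / n_iter) + 1e-3     # linear cooling schedule
        i = rng.integers(n)
        cand = X.copy()
        cand[i] += rng.normal(scale=0.1, size=2)
        new = stress(cand)
        # Accept downhill moves always, uphill moves with Boltzmann prob.
        if new < cur or rng.random() < np.exp((cur - new) / t):
            X, cur = cand, new
    return X
```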