Title/Summary/Keyword: Kernel Methods

Incomplete Cholesky Decomposition based Kernel Cross Modal Factor Analysis for Audiovisual Continuous Dimensional Emotion Recognition

  • Li, Xia;Lu, Guanming;Yan, Jingjie;Li, Haibo;Zhang, Zhengyan;Sun, Ning;Xie, Shipeng
    • KSII Transactions on Internet and Information Systems (TIIS) / v.13 no.2 / pp.810-831 / 2019
  • Recently, continuous dimensional emotion recognition from audiovisual cues has attracted increasing attention in both theory and practice. The large amount of data involved in the recognition process decreases the efficiency of most bimodal information fusion algorithms. In this paper, a novel algorithm, namely incomplete Cholesky decomposition based kernel cross-modal factor analysis (ICDKCFA), is presented and employed for continuous dimensional audiovisual emotion recognition. After the ICDKCFA feature transformation, two basic fusion strategies, feature-level fusion and decision-level fusion, are explored to combine the transformed visual and audio features for emotion recognition. Finally, extensive experiments are conducted to evaluate the ICDKCFA approach on the AVEC 2016 Multimodal Affect Recognition Sub-Challenge dataset. The experimental results show that ICDKCFA is faster than the original kernel cross-modal factor analysis while achieving comparable performance. Moreover, ICDKCFA outperforms other common information fusion methods, such as fusion based on canonical correlation analysis, kernel canonical correlation analysis, and cross-modal factor analysis.
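
For context on the factorization named in this entry's title: below is a minimal sketch of pivoted incomplete Cholesky for low-rank approximation of a kernel Gram matrix, the generic building block rather than the authors' full ICDKCFA pipeline. The RBF kernel and the names `kernel`, `tol`, and `max_rank` are illustrative assumptions.

```python
import numpy as np

def incomplete_cholesky(X, kernel, tol=1e-6, max_rank=None):
    """Pivoted incomplete Cholesky: K ~= G @ G.T with G of shape (n, m),
    built column by column without forming the full n x n Gram matrix."""
    n = len(X)
    max_rank = max_rank or n
    d = np.array([kernel(x, x) for x in X])        # residual diagonal of K
    G = np.zeros((n, max_rank))
    for m in range(max_rank):
        i = int(np.argmax(d))                      # pivot on largest residual
        if d[i] <= tol:                            # approximation good enough
            return G[:, :m]
        col = np.array([kernel(x, X[i]) for x in X])   # i-th column of K
        G[:, m] = (col - G[:, :m] @ G[i, :m]) / np.sqrt(d[i])
        d = np.maximum(d - G[:, m] ** 2, 0.0)      # update residual diagonal
    return G

# Example: for a smooth kernel, a rank much smaller than n usually suffices.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
rbf = lambda a, b: np.exp(-0.5 * np.sum((a - b) ** 2))
G = incomplete_cholesky(X, rbf, tol=1e-4)          # G.shape[1] << 300
```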

Efficient Kernel Based 3-D Source Localization via Tensor Completion

  • Lu, Shan;Zhang, Jun;Ma, Xianmin;Kan, Changju
    • KSII Transactions on Internet and Information Systems (TIIS) / v.13 no.1 / pp.206-221 / 2019
  • Source localization in three-dimensional (3-D) wireless sensor networks (WSNs) is becoming a major research focus. Due to the complicated air-ground environments in 3-D positioning, many traditional localization methods, such as those based on received signal strength (RSS), may have relatively poor accuracy. Benefiting from prior learning mechanisms, fingerprinting-based localization methods are less sensitive to complex conditions and can provide relatively accurate localization. However, fingerprinting-based methods require training data at each grid point to construct the fingerprint database, the overhead of which is very high, particularly for 3-D localization. In addition, some measured data may be unavailable due to interference from the complicated environment. In this paper, we propose an efficient kernel based 3-D localization algorithm via tensor completion. We first exploit the spatial correlation of the RSS data and demonstrate the low-rank property of the RSS data matrix. Based on this, a new training scheme is proposed that uses tensor completion to recover the missing data of the fingerprint database. Finally, we propose a kernel based learning technique in the matching phase to improve the sensitivity and accuracy of the final source position estimation. Simulation results show that the new method effectively eliminates the impairment caused by incomplete sensing data and improves localization performance.
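
The completion step described in this abstract is easiest to see in the matrix case; the paper itself works with a tensor. A minimal SoftImpute-style sketch, assuming a fingerprint matrix `M` with missing entries flagged by a boolean `mask` (both illustrative names):

```python
import numpy as np

def soft_impute(M, mask, lam=1.0, n_iter=100):
    """Fill missing entries of an (approximately) low-rank matrix by
    iterating soft-thresholded SVD; mask is True where M is observed."""
    Z = np.where(mask, M, 0.0)                  # start with zeros in the gaps
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s = np.maximum(s - lam, 0.0)            # shrink singular values
        low_rank = (U * s) @ Vt                 # low-rank reconstruction
        Z = np.where(mask, M, low_rank)         # keep observed entries fixed
    return Z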

Study on the ensemble methods with kernel ridge regression

  • Kim, Sun-Hwa;Cho, Dae-Hyeon;Seok, Kyung-Ha
    • Journal of the Korean Data and Information Science Society / v.23 no.2 / pp.375-383 / 2012
  • The purpose of ensemble methods is to increase prediction accuracy by combining many classifiers. Recent studies have shown that random forests and forward stagewise regression achieve good accuracy in classification problems. However, they incur large prediction errors near separation boundary points because they use decision trees as base learners. In this study, we use kernel ridge regression instead of decision trees in random forests and boosting. The usefulness of the proposed ensemble methods is demonstrated by simulation results on the prostate cancer and Boston housing data.
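
The core idea, swapping the tree base learner for kernel ridge regression, is sketched below for plain bagging; the paper's random-forest and boosting variants add further machinery. The RBF kernel and all parameter values are illustrative assumptions.

```python
import numpy as np

def krr_fit(X, y, lam=1.0, gamma=0.5):
    """Kernel ridge regression with an RBF kernel; returns a predictor."""
    K = np.exp(-gamma * np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1))
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    def predict(Xnew):
        Kn = np.exp(-gamma * np.sum((Xnew[:, None, :] - X[None, :, :]) ** 2,
                                    axis=-1))
        return Kn @ alpha
    return predict

def bagged_krr(X, y, n_models=25, lam=1.0, gamma=0.5, seed=0):
    """Average the predictions of KRR models fit on bootstrap resamples."""
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(X), size=len(X))   # bootstrap sample
        models.append(krr_fit(X[idx], y[idx], lam, gamma))
    return lambda Xnew: np.mean([m(Xnew) for m in models], axis=0)
```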

A selective review of nonlinear sufficient dimension reduction

  • Sehun Jang;Jun Song
    • Communications for Statistical Applications and Methods / v.31 no.2 / pp.247-262 / 2024
  • In this paper, we explore nonlinear sufficient dimension reduction (SDR) methods, with a primary focus on establishing a foundational framework that integrates various nonlinear SDR methods. We illustrate generalized sliced inverse regression (GSIR) and generalized sliced average variance estimation (GSAVE), both of which fit within this framework. Further, we delve into nonlinear extensions of inverse moments through the kernel trick, specifically examining kernel sliced inverse regression (KSIR) and kernel canonical correlation analysis (KCCA), and explore their relationships within the established framework. We also briefly explain nonlinear SDR for functional data. In addition, we present practical aspects such as algorithmic implementations. The paper concludes with remarks on the dimensionality problem of the target function class.
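
Of the methods this review surveys, KCCA is the quickest to sketch. One common regularized formulation reduces to an eigenproblem on the two centered Gram matrices; the reduction and the regularization constant below follow the usual textbook simplification, not necessarily the exact estimator the review analyzes.

```python
import numpy as np

def center_gram(K):
    """Double-center a Gram matrix (i.e., center the data in feature space)."""
    n = len(K)
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kcca_leading_correlation(Kx, Ky, reg=1e-2):
    """Leading canonical correlation of regularized kernel CCA via the
    eigenproblem (Kx + rI)^-1 Ky (Ky + rI)^-1 Kx a = rho^2 a."""
    n = len(Kx)
    Kx, Ky = center_gram(Kx), center_gram(Ky)
    A = np.linalg.solve(Kx + reg * n * np.eye(n), Ky)
    B = np.linalg.solve(Ky + reg * n * np.eye(n), Kx)
    rho2 = np.max(np.linalg.eigvals(A @ B).real)   # largest squared correlation
    return float(np.sqrt(np.clip(rho2, 0.0, 1.0)))
```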

A Non-linear Variant of Improved Robust Fuzzy PCA (잡음 민감성이 향상된 주성분 분석 기법의 비선형 변형)

  • Heo, Gyeong-Yong;Seo, Jin-Seok;Lee, Im-Geun
    • Journal of the Korea Society of Computer and Information / v.16 no.4 / pp.15-22 / 2011
  • Principal component analysis (PCA) is a well-known method for dimensionality reduction and feature extraction that retains most of the variation in the data. Although PCA has been applied successfully in many areas, it is sensitive to outliers and only valid for Gaussian distributions. Several variants of PCA have been proposed to resolve the noise sensitivity and, among them, improved robust fuzzy PCA (RF-PCA2) has demonstrated promising results. RF-PCA2, however, is still a linear algorithm and cannot accommodate non-Gaussian distributions. In this paper, a non-linear algorithm that combines RF-PCA2 and kernel PCA (K-PCA), called improved robust kernel fuzzy PCA (RKF-PCA2), is introduced. The kernel trick makes it possible to accommodate non-Gaussian distributions. RKF-PCA2 inherits noise robustness from RF-PCA2 and non-linearity from K-PCA, and it outperforms previous methods in handling non-Gaussian distributions in a noise-robust way. Experimental results support this.
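
RKF-PCA2 grafts robust fuzzy memberships onto kernel PCA; the sketch below covers only the standard K-PCA half (RBF kernel assumed), to show where the kernel trick enters.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Standard kernel PCA: projections of the training points onto the
    leading principal components in RBF-kernel feature space."""
    K = np.exp(-gamma * np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1))
    n = len(X)
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H                               # center in feature space
    evals, evecs = np.linalg.eigh(Kc)            # ascending eigenvalues
    idx = np.argsort(evals)[::-1][:n_components]
    lam, V = np.maximum(evals[idx], 0.0), evecs[:, idx]
    return V * np.sqrt(lam)                      # row j = projection of x_j
```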

A New Adaptive Kernel Estimation Method for Correntropy Equalizers (코렌트로피 이퀄라이져를 위한 새로운 커널 사이즈 적응 추정 방법)

  • Kim, Namyong
    • Journal of the Korea Academia-Industrial cooperation Society / v.22 no.3 / pp.627-632 / 2021
  • Information-theoretic learning (ITL) has been applied successfully to adaptive signal processing and machine learning applications, but choosing the kernel size, which has a great impact on system performance, is difficult. The correntropy algorithm, one of the ITL methods, has the superior properties of impulsive-noise robustness and channel-distortion compensation. On the other hand, it is also sensitive to the kernel size, which can lead to system instability. In this paper, noting that the kernel size appears cubed in the denominator of the cost-function slope, a new adaptive kernel-size estimation method based on the rate of change of the error power with respect to the kernel-size variation is proposed for the correntropy algorithm. The performance of the proposed kernel-adjusted correntropy algorithm was examined in a distortion-compensation experiment with an impulsive-noise and multipath-distorted channel. The proposed method converges twice as fast as the conventional algorithm with a fixed kernel size. In addition, it converged appropriately for initial kernel sizes ranging from 2.0 to 6.0, so the proposed method has a wide acceptable margin of initial kernel sizes.
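
The fixed-kernel-size baseline that the paper's adaptive rule improves on is the correntropy (MCC) variant of an LMS-style equalizer; a minimal sketch, with the tap count, step size, and sigma as illustrative values:

```python
import numpy as np

def mcc_equalizer(x, d, n_taps=8, mu=0.05, sigma=2.0):
    """Adaptive FIR equalizer under the maximum correntropy criterion with
    a fixed Gaussian kernel size sigma. x: received samples, d: training
    symbols (same length); returns the taps and the error sequence."""
    w = np.zeros(n_taps)
    e = np.zeros(len(d))
    for k in range(n_taps, len(d)):
        u = x[k - n_taps:k][::-1]                # tap-input vector
        e[k] = d[k] - w @ u
        # the Gaussian weight suppresses impulsive (large-error) updates
        w += mu * np.exp(-e[k] ** 2 / (2 * sigma ** 2)) * e[k] * u
    return w, e
```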

Daily Dose of Apricot Kernel in Treatise on Cold Damage Diseases (상한론(傷寒論) 탕제에서 행인(杏仁) 1 일 복용량)

  • Kim, In-Rak
    • The Korea Journal of Herbology / v.32 no.6 / pp.17-22 / 2017
  • Objectives : The daily dose of apricot kernel in the Treatise on Cold Damage Diseases is usually written as a number of kernels, sometimes as a volume. The seed coat and acute end of the apricot kernel must be removed before use, so the author sought to determine the daily dose and the proportions of seed coat and acute end. Methods : Dosages were estimated from editions of the Treatise on Cold Damage Diseases and compared with the measured weight of apricot kernels distributed in the market. Results : Ten prescriptions include apricot kernel; eight are decoctions and two are pill prescriptions. Two of the decoctions are made by reducing and uniting other prescriptions. The daily doses of the six remaining decoctions are 70, 47, or 35 kernels. Seventy apricot kernels with the seed coat and acute end removed measure 1/2 Sheong (33 mL) in volume and 3 Ryang (19.5 g) in weight. The most common apricot kernels in the market weigh 0.28~0.38 g each. Seventy such kernels weigh 23.10 g, of which the seed coats account for 1.15 g and the acute ends for 2.43 g, leaving 19.5 g after removal. Thus the seed coat is 5% and the acute end 10% by proportion, matching the values assumed from the writings. Conclusions : Seventy apricot kernels with seed coat and acute end removed amount to 1/2 Sheong and 3 Ryang, that is, 33 mL and 19.5 g respectively, which also corresponds with current market goods.
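
The reported figures can be cross-checked with a few lines of arithmetic (all weights in grams, taken directly from the abstract):

```python
whole = 23.10                        # 70 market kernels as weighed
seed_coat, acute_end = 1.15, 2.43    # parts removed before use
print(round(whole - seed_coat - acute_end, 2))  # 19.52 -> the stated 19.5 g (3 Ryang)
print(round(100 * seed_coat / whole, 1))        # 5.0  -> the stated 5%
print(round(100 * acute_end / whole, 1))        # 10.5 -> roughly the stated 10%
```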

Development and Performance of a Jatropha Seed Shelling Machine Based on Seed Moisture Content

  • Aremu, A.K.;Adeniyi, A.O.;Fadele, O.K.
    • Journal of Biosystems Engineering / v.40 no.2 / pp.137-144 / 2015
  • Purpose: The high energy requirement of extracting oil from jatropha seed, and the loss in oil content between the whole seed and the kernel, necessitate seed shelling. The purpose of this study is to develop and evaluate the performance of a jatropha seed shelling machine based on seed moisture content. Methods: A shelling machine was designed and constructed for jatropha seed. Its components are the frame, hopper, shelling chamber, concave, and blower with discharge units. The performance of the machine was evaluated by determining parameters such as the percentage of whole kernel recovered, percentage of broken kernel recovered, percentage of partially shelled seed, percentage of unshelled seed, machine capacity, machine efficiency, and shelling efficiency. All parameters were evaluated at five moisture levels: 8.00%, 9.37%, 10.77%, 12.21%, and 13.68% (w.b.). Results: The shelling efficiency of the machine increased with seed moisture content; the percentage of whole kernel recovered and the percentage of partially shelled seed decreased with increasing moisture content; and the percentage of broken kernel, machine efficiency, and percentage of unshelled seed followed a sinusoidal trend with moisture content. Conclusion: The best operating condition for the shelling machine was at a moisture content of 8.00% w.b., at which the maximum percentage of whole kernel recovered was 23.23% at a shelling efficiency of 73.95%.
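
The abstract names the performance parameters but not their formulas. The sketch below uses common definitions from the shelling literature as stand-ins; they are assumptions, not the authors' exact expressions.

```python
def shelling_metrics(m_whole, m_broken, m_partial, m_unshelled, runtime_h):
    """Illustrative shelling-performance metrics from output masses (kg)
    and run time (h); all definitions are assumed, not taken from the paper."""
    total = m_whole + m_broken + m_partial + m_unshelled
    return {
        "whole_kernel_pct": 100 * m_whole / total,
        "broken_kernel_pct": 100 * m_broken / total,
        "partially_shelled_pct": 100 * m_partial / total,
        "unshelled_pct": 100 * m_unshelled / total,
        "shelling_efficiency_pct": 100 * (1 - m_unshelled / total),  # assumed form
        "capacity_kg_per_h": total / runtime_h,
    }
```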

On Practical Efficiency of Locally Parametric Nonparametric Density Estimation Based on Local Likelihood Function

  • Kang, Kee-Hoon;Han, Jung-Hoon
    • Communications for Statistical Applications and Methods / v.10 no.2 / pp.607-617 / 2003
  • This paper offers a practical comparison of efficiency between the local likelihood approach and the conventional kernel approach in density estimation. The local likelihood estimation procedure maximizes a kernel-smoothed log-likelihood function with respect to a polynomial approximation of the log-likelihood function. We use two types of data-driven bandwidths for each method and compare the mean integrated squared errors for several densities. Numerical results reveal that the local log-linear approach with a simple plug-in bandwidth performs better than the standard kernel approach for heavy-tailed distributions. For normal mixture densities, the standard kernel estimator with the bandwidth of Sheather and Jones (1991) dominates the others at moderately large sample sizes.
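
The "conventional kernel approach" benchmarked here is the fixed-bandwidth Gaussian kernel density estimator. A minimal sketch follows; the normal-reference (Silverman) rule below is a simple stand-in for the Sheather and Jones (1991) plug-in bandwidth used in the paper.

```python
import numpy as np

def gaussian_kde(data, grid, h=None):
    """Fixed-bandwidth Gaussian kernel density estimate evaluated on grid."""
    n = len(data)
    if h is None:                                   # normal-reference rule
        h = 1.06 * np.std(data, ddof=1) * n ** (-1 / 5)
    u = (grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (n * h * np.sqrt(2 * np.pi))
```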

A note on SVM estimators in RKHS for the deconvolution problem

  • Lee, Sungho
    • Communications for Statistical Applications and Methods / v.23 no.1 / pp.71-83 / 2016
  • In this paper we discuss a deconvolution density estimator obtained using support vector machines (SVM) and Tikhonov's regularization method for solving ill-posed problems in a reproducing kernel Hilbert space (RKHS). A remarkable property of the SVM is that it leads to sparse solutions, but the support vector deconvolution density estimator does not preserve sparsity as well as expected. Thus, in Section 3, we propose another support vector deconvolution estimator (method II) that leads to a very sparse solution. The performance of the deconvolution density estimators based on the support vector method is compared with the classical kernel deconvolution density estimator for the important cases of Gaussian and Laplacian measurement error by means of a simulation study. In the case of Gaussian error, the proposed support vector deconvolution estimator shows the same performance as the classical kernel deconvolution density estimator.
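
The classical kernel deconvolution estimator used as the baseline has a closed form when the measurement error is Laplacian, one of the two error models in the simulation study. A minimal sketch with a Gaussian kernel; the error scale `b` and bandwidth `h` are illustrative parameters.

```python
import numpy as np

def deconv_kde_laplace(w, grid, h, b):
    """Deconvoluting kernel density estimate for observations w = X + eps,
    eps ~ Laplace(0, b), using a Gaussian kernel. The deconvoluting kernel
    is L(u) = phi(u) * (1 + (b/h)^2 * (1 - u^2)); the estimate can dip
    slightly below zero, a known property of deconvoluting estimators."""
    u = (grid[:, None] - w[None, :]) / h
    phi = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
    L = phi * (1 + (b / h) ** 2 * (1 - u ** 2))
    return L.sum(axis=1) / (len(w) * h)
```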