• Title/Summary/Keyword: Kernel function

Search results: 628

Self-adaptive Online Sequential Learning Radial Basis Function Classifier Using Multi-variable Normal Distribution Function

  • Dong, Keming;Kim, Hyoung-Joong;Suresh, Sundaram
    • Korea Institute of Information and Communication Facilities Engineering: Conference Proceedings
    • /
    • 2009.08a
    • /
    • pp.382-386
    • /
    • 2009
  • Online, or sequential, learning is one of the most basic and powerful methods for training neural networks, and it has been widely used in disease detection, weather prediction, and other practical classification problems. At present there are many algorithms in this area, such as MRAN, GAP-RBFN, OS-ELM, SVM, and SMC-RBF. Among them, SMC-RBF has the best performance: it uses fewer hidden neurons and achieves the best efficiency. However, all the existing algorithms use a single normal distribution as the kernel function, which means the output of the kernel function is the same in every direction. In this paper, we use a multivariate normal distribution as the kernel function and derive EKF learning formulas for this kernel. From the experimental results, we can deduce that the proposed method has better efficiency and is not sensitive to the order of the data sequence.

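The abstract above contrasts an isotropic (single-sigma) Gaussian kernel, whose response is identical in every direction, with a multivariate normal kernel whose full covariance matrix makes the response direction-dependent. A minimal sketch of that distinction (the paper's EKF update formulas are not reproduced; all names and values here are illustrative):

```python
import numpy as np

def isotropic_kernel(x, c, sigma):
    # Single-sigma Gaussian kernel: the same response in every
    # direction around the center c
    return np.exp(-np.sum((x - c) ** 2) / (2.0 * sigma**2))

def multivariate_kernel(x, c, cov):
    # Multivariate normal kernel: a full covariance matrix makes the
    # response elliptical rather than spherical
    d = x - c
    return np.exp(-0.5 * d @ np.linalg.inv(cov) @ d)

c = np.zeros(2)
cov = np.array([[1.0, 0.0], [0.0, 0.25]])  # tighter along the second axis
x1 = np.array([1.0, 0.0])
x2 = np.array([0.0, 1.0])
# the isotropic kernel gives x1 and x2 identical values;
# the multivariate kernel responds more strongly along the wider axis
```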

Estimating Variance Function with Kernel Machine

  • Kim, Jong-Tae;Hwang, Chang-Ha;Park, Hye-Jung;Shim, Joo-Yong
    • Communications for Statistical Applications and Methods
    • /
    • v.16 no.2
    • /
    • pp.383-388
    • /
    • 2009
  • In this paper we propose a variance function estimation method based on the kernel trick for replicated data or data consisting of sample variances. The Newton-Raphson method is used to obtain the associated parameter vector. Furthermore, the generalized approximate cross-validation function is introduced to select the hyperparameters, which affect the performance of the proposed variance function estimation method. Experimental results are then presented to illustrate the performance of the proposed procedure.
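The idea of fitting a variance function with a kernel machine can be sketched as follows. This is a plain kernel ridge fit to log sample variances, not the paper's Newton-Raphson/GACV procedure; all names and settings are assumptions:

```python
import numpy as np

def rbf_gram(X, Z, sigma):
    # Gram matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))
    d2 = (np.sum(X**2, axis=1)[:, None] + np.sum(Z**2, axis=1)[None, :]
          - 2.0 * X @ Z.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_variance_function(X, s2, sigma=1.0, lam=1e-2):
    # Ridge-regularized kernel fit to log sample variances;
    # exponentiating the fit keeps every variance estimate positive
    K = rbf_gram(X, X, sigma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), np.log(s2))
    return lambda Xnew: np.exp(rbf_gram(Xnew, X, sigma) @ alpha)

rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 30)[:, None]
true_var = 0.1 + X.ravel() ** 2                # variance grows with x
s2 = true_var * rng.chisquare(5, 30) / 5.0     # simulated sample variances
vhat = fit_variance_function(X, s2)
```

Fitting on the log scale is a common device for variance estimation, since it turns the positivity constraint into an unconstrained regression.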

Kernel Poisson regression for mixed input variables

  • Shim, Jooyong
    • Journal of the Korean Data and Information Science Society
    • /
    • v.23 no.6
    • /
    • pp.1231-1239
    • /
    • 2012
  • An estimation procedure is introduced for kernel Poisson regression when the input variables consist of numerical and categorical variables; it is based on the penalized negative log-likelihood and the component-wise product of two different types of kernel functions. The proposed procedure provides estimates of the mean function of the response variable, where the canonical parameter is linearly and/or nonlinearly related to the input variables. Experimental results are then presented to indicate the performance of the proposed kernel Poisson regression.
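The component-wise product of two kernel types for mixed inputs can be sketched as follows. The specific RBF and matching kernels, and all names here, are illustrative choices rather than necessarily those of the paper:

```python
import numpy as np

def rbf_kernel(Xn, Zn, sigma=1.0):
    # RBF kernel on the numerical variables
    d2 = (np.sum(Xn**2, axis=1)[:, None] + np.sum(Zn**2, axis=1)[None, :]
          - 2.0 * Xn @ Zn.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def overlap_kernel(Xc, Zc):
    # Matching kernel on the categorical variables:
    # the fraction of categories on which two observations agree
    return np.mean(Xc[:, None, :] == Zc[None, :, :], axis=2)

def mixed_kernel(Xn, Xc, Zn, Zc, sigma=1.0):
    # Component-wise (Hadamard) product of the two kernels,
    # which is again a positive semidefinite kernel
    return rbf_kernel(Xn, Zn, sigma) * overlap_kernel(Xc, Zc)

Xn = np.array([[0.0], [1.0], [2.0]])
Xc = np.array([["a", "x"], ["a", "y"], ["b", "y"]])
K = mixed_kernel(Xn, Xc, Xn, Xc)
```

The Hadamard product of two valid kernels is itself a valid kernel (Schur product theorem), which is what licenses combining the two input types this way.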

A NEW BIHARMONIC KERNEL FOR THE UPPER HALF PLANE

  • Abkar, Ali
    • Journal of the Korean Mathematical Society
    • /
    • v.43 no.6
    • /
    • pp.1169-1181
    • /
    • 2006
  • We introduce a new biharmonic kernel for the upper half plane, and then study the properties of its relevant potentials, such as the convergence in the mean and the boundary behavior. Among other things, we shall see that Fatou's theorem is valid for these potentials, so that the biharmonic Poisson kernel resembles the usual Poisson kernel for the upper half plane.

A Study on Kernel Type Discontinuity Point Estimations

  • Huh, Jib
    • Journal of the Korean Data and Information Science Society
    • /
    • v.14 no.4
    • /
    • pp.929-937
    • /
    • 2003
  • Kernel-type estimators of a discontinuity point at an unknown location in a regression function or its derivatives have been developed. It is known that the discontinuity point estimator based on the Gasser-Müller regression estimator with a one-sided kernel function that vanishes at the point 0 exhibits poor asymptotic behavior. Further, the asymptotic variance of the Gasser-Müller regression estimator in the random design case is 1.5 times larger than that in the corresponding fixed design case, while the two are identical for the local polynomial regression estimator. Even when the Gasser-Müller regression estimator is modified to use a one-sided kernel function with a non-zero value at the point 0, computer simulations show that this phenomenon also appears in discontinuity point estimation.

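The one-sided-kernel idea behind discontinuity point estimation can be sketched with uniform one-sided windows: scan candidate locations and pick the one maximizing the contrast between the local means just left and just right. This is a simplified, noise-free illustration, not the Gasser-Müller estimator studied in the paper; all names are illustrative:

```python
import numpy as np

def jump_contrast(x, y, t, h):
    # Difference of one-sided local means just right and just left of t,
    # using uniform one-sided kernels of bandwidth h
    left = (x < t) & (x >= t - h)
    right = (x >= t) & (x < t + h)
    if left.sum() == 0 or right.sum() == 0:
        return 0.0
    return y[right].mean() - y[left].mean()

x = np.linspace(0.0, 1.0, 400)
y = np.where(x < 0.6, 0.0, 1.0)   # noise-free jump of size 1 at x = 0.6
grid = np.linspace(0.1, 0.9, 161)
t_hat = grid[int(np.argmax([abs(jump_contrast(x, y, t, 0.05))
                            for t in grid]))]
```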

MONOTONICITY PROPERTIES OF THE BESSEL-STRUVE KERNEL

  • Baricz, Arpad;Mondal, Saiful R.;Swaminathan, Anbhu
    • Bulletin of the Korean Mathematical Society
    • /
    • v.53 no.6
    • /
    • pp.1845-1856
    • /
    • 2016
  • In this paper our aim is to study the classical Bessel-Struve kernel. Monotonicity and log-convexity properties for the Bessel-Struve kernel, and the ratio of the Bessel-Struve kernel and the Kummer confluent hypergeometric function, are investigated. Moreover, lower and upper bounds are given for the Bessel-Struve kernel in terms of the exponential function, and some Turán-type inequalities are deduced.

AN ELIGIBLE PRIMAL-DUAL INTERIOR-POINT METHOD FOR LINEAR OPTIMIZATION

  • Cho, Gyeong-Mi;Lee, Yong-Hoon
    • East Asian mathematical journal
    • /
    • v.29 no.3
    • /
    • pp.279-292
    • /
    • 2013
  • It is well known that each kernel function defines a primal-dual interior-point method (IPM). Most polynomial-time interior-point algorithms for linear optimization (LO) are based on the logarithmic kernel function ([2, 11]). In this paper we define a new eligible kernel function and propose a new search direction and proximity function based on this function for LO problems. We show that the new algorithm has $\mathcal{O}\big((\log p)\sqrt{n}\,\log n\,\log\frac{n}{\epsilon}\big)$ and $\mathcal{O}\big((q\log p)^{\frac{3}{2}}\sqrt{n}\,\log\frac{n}{\epsilon}\big)$ iteration bounds for large- and small-update methods, respectively. These are currently the best known complexity results.
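The abstract does not specify the new eligible kernel function, but the classical logarithmic kernel function it is compared against can be sketched; a minimal illustration, with function names our own:

```python
import math

def log_kernel(t):
    # Classical logarithmic kernel function of primal-dual IPMs:
    # psi(t) = (t^2 - 1)/2 - ln(t); psi(1) = 0 is its minimum, and
    # psi blows up both as t -> 0+ (barrier term) and as t -> infinity
    return (t * t - 1.0) / 2.0 - math.log(t)

def log_kernel_deriv(t):
    # psi'(t) = t - 1/t, zero exactly at t = 1
    return t - 1.0 / t
```

A kernel function in this sense is applied coordinate-wise to the scaled complementarity vector; its minimum at 1 corresponds to a point on the central path, and the barrier term keeps iterates strictly feasible.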

CONFORMAL MAPPING AND CLASSICAL KERNEL FUNCTIONS

  • CHUNG, YOUNG-BOK
    • Honam Mathematical Journal
    • /
    • v.27 no.2
    • /
    • pp.195-203
    • /
    • 2005
  • We show that the exact Bergman kernel function associated to a $C^{\infty}$ bounded domain in the plane relates the derivatives of the Ahlfors map in an explicit way, and we find several formulas relating the exact Bergman kernel to classical kernel functions in potential theory.


Support vector quantile regression for autoregressive data

  • Hwang, Hyungtae
    • Journal of the Korean Data and Information Science Society
    • /
    • v.25 no.6
    • /
    • pp.1539-1547
    • /
    • 2014
  • In this paper we apply the autoregressive process to nonlinear quantile regression in order to infer nonlinear quantile regression models for autocorrelated data. We propose a kernel method for autoregressive data which estimates the nonlinear quantile regression function by kernel machines. Artificial and real examples are provided to indicate the usefulness of the proposed method for estimating the quantile regression function in the presence of autocorrelation between data.
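Quantile regression methods such as the one above rest on the check (pinball) loss, which makes the minimizer a conditional quantile rather than a conditional mean. A minimal sketch of that loss; this is illustrative, not the paper's support vector formulation:

```python
import numpy as np

def pinball_loss(y, f, tau):
    # Check (pinball) loss of quantile regression: residuals above the
    # fitted quantile are weighted by tau, those below by (1 - tau)
    r = y - f
    return np.mean(np.where(r >= 0, tau * r, (tau - 1.0) * r))

# For tau = 0.9, under-prediction (y > f) costs nine times more
# than over-prediction of the same magnitude
y = np.array([1.0, 0.0])
f = np.array([0.0, 1.0])
```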

ALGEBRAICITY OF PROPER HOLOMORPHIC MAPPINGS

  • CHUNG, YOUNG-BOK
    • Honam Mathematical Journal
    • /
    • v.21 no.1
    • /
    • pp.105-113
    • /
    • 1999
  • Suppose that $\Omega$ is a bounded domain with $C^{\infty}$ smooth boundary in the plane whose associated Bergman kernel, exact Bergman kernel, or Szegő kernel function is an algebraic function. We shall prove that any proper holomorphic mapping of $\Omega$ onto the unit disc is algebraic.
