• Title/Summary/Keyword: Kernel function

Search Results: 621

A Development of Nonparametric Kernel Function Suitable for Extreme Value (극치값 추정에 적합한 비매개변수적 핵함수 개발)

  • Cha Young-Il;Kim Soon-Bum;Moon Young-Il
    • Journal of Korea Water Resources Association
    • /
    • v.39 no.6 s.167
    • /
    • pp.495-502
    • /
    • 2006
  • The importance of bandwidth selection has been emphasized more than kernel function selection for nonparametric frequency analysis, since interpolation is more reliable than extrapolation. However, when extrapolation is applied (i.e., recurrence intervals longer than the length of the data, or extreme probabilities such as 200~500 years), the selection of the kernel function is as important as the selection of the bandwidth. So far, the existing kernel functions have had difficulties with extreme value estimation because the values they extrapolate are either too small or too large. As an improvement, this paper suggests a Modified Cauchy kernel function that is suitable for both interpolation and extrapolation.
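As an illustration of the idea only (not the paper's actual Modified Cauchy kernel, whose form is defined in the paper), the sketch below estimates an exceedance probability and an approximate 200-year quantile with a plain Cauchy-kernel density estimate; the bandwidth rule and the synthetic flood-peak sample are placeholders.

```python
import numpy as np

def cauchy_kernel_survival(u):
    # Survival function of the standard Cauchy density used as a heavy-tailed
    # kernel (illustrative only; the paper's Modified Cauchy kernel differs).
    return 0.5 - np.arctan(u) / np.pi

def kernel_exceedance_prob(x, data, h):
    # P(X > x) from the kernel density estimate, integrating each kernel's tail.
    return np.mean(cauchy_kernel_survival((x - data) / h))

rng = np.random.default_rng(0)
data = rng.gumbel(loc=100.0, scale=20.0, size=40)   # hypothetical annual maxima
h = 1.06 * data.std() * len(data) ** (-0.2)         # rule-of-thumb bandwidth

target = 1.0 / 200.0                                 # 200-year exceedance
lo, hi = data.min(), data.max() + 1000.0 * h         # wide bracket for bisection
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if kernel_exceedance_prob(mid, data, h) > target:
        lo = mid
    else:
        hi = mid
# The plain Cauchy tail is very heavy, so this quantile comes out very large --
# exactly the kind of over-extrapolation the paper sets out to correct.
print("~200-year quantile:", round(0.5 * (lo + hi), 1))
```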

Determining Kernel Function of Apparent Earth Resistivity Using Linearization (선형화를 이용한 대지저항률의 커널함수 결정)

  • Kang, Min-Jae;Boo, Chang-Jin;Lee, Jung-Hoon;Kim, Ho-Chan
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.22 no.4
    • /
    • pp.454-459
    • /
    • 2012
  • A kernel function of apparent earth resistivity can be estimated from the apparent earth resistivity measured with Wenner's four-point method. Estimating this kernel function amounts to solving a nonlinear system, but obtaining the solution of a nonlinear system with many unknown variables is not simple. This paper suggests a method of estimating the kernel function by linearizing the nonlinear system. Finally, various examples of earth structures are simulated to evaluate the proposed method.
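The abstract does not spell out the linearization scheme. One common way to linearize such a nonlinear inversion is an iterative Gauss-Newton step, sketched below with a purely hypothetical forward model standing in for the Wenner kernel integral; whether the paper uses this exact scheme is not stated in the abstract.

```python
import numpy as np

def forward(params, spacings):
    # Hypothetical stand-in for the Wenner forward model rho_a = F(kernel params);
    # the real model involves an integral of the resistivity kernel and is
    # defined in the paper, not here.
    a, b = params
    return a * np.exp(-spacings / b) + 1.0

def gauss_newton(measured, spacings, params, iters=20):
    # Linearize F around the current estimate and solve the resulting
    # linear least-squares system at each step.
    for _ in range(iters):
        r = measured - forward(params, spacings)
        J = np.empty((len(spacings), len(params)))   # numerical Jacobian
        eps = 1e-6
        for j in range(len(params)):
            p = params.copy()
            p[j] += eps
            J[:, j] = (forward(p, spacings) - forward(params, spacings)) / eps
        delta, *_ = np.linalg.lstsq(J, r, rcond=None)
        params = params + delta
    return params

spacings = np.linspace(1.0, 50.0, 10)                # electrode spacings (m)
measured = forward(np.array([3.0, 12.0]), spacings)  # synthetic "measurements"
print(gauss_newton(measured, spacings, np.array([1.0, 5.0])))
```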

Weighted Kernel and Its Learning Method for Cancer Diagnosis System (암진단시스템을 위한 Weighted Kernel 및 학습방법)

  • Choi, Gyoo-Seok;Park, Jong-Jin;Jeon, Byoung-Chan;Park, In-Kyu;Ahn, Ihn-Seok;Nguyen, Ha-Nam
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.9 no.2
    • /
    • pp.1-6
    • /
    • 2009
  • One of the most important problems in bioinformatics is how to extract useful information from a huge amount of data and make decisions in diagnosis, prognosis, and medical treatment applications. This paper proposes a weighted kernel function for support vector machines and a learning method for it with fast convergence and good classification performance. We define the weighted kernel function as the weighted sum of a set of basis kernel functions of different types, such as neural, radial, and polynomial kernels, which are trained by a learning method based on a genetic algorithm. The weights of the basis kernel functions in the proposed kernel are determined in the learning phase and used as parameters of the decision model in the classification phase. Experiments on several clinical datasets, such as colon cancer, indicate that our weighted kernel function yields higher and more stable classification performance than other kernel functions.
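A minimal sketch of such a weighted kernel for an SVM, assuming scikit-learn and using its RBF, polynomial, and sigmoid kernels as stand-ins for the radial, polynomial, and neural basis kernels; the genetic-algorithm weight search described in the paper is omitted, and the weights below are fixed by hand.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel, sigmoid_kernel

def weighted_kernel(weights):
    # Weighted sum of basis kernels; in the paper the weights would come from
    # a genetic-algorithm search, here they are plain inputs.
    w1, w2, w3 = weights
    def k(X, Y):
        return (w1 * rbf_kernel(X, Y)
                + w2 * polynomial_kernel(X, Y, degree=3)
                + w3 * sigmoid_kernel(X, Y))
    return k

# Toy two-class data standing in for a clinical dataset such as colon cancer.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

clf = SVC(kernel=weighted_kernel([0.6, 0.3, 0.1]), C=1.0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```

Since a nonnegative weighted sum of positive semidefinite kernels is again positive semidefinite, any nonnegative weight vector produced by the search yields a valid SVM kernel.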


A LARGE-UPDATE INTERIOR POINT ALGORITHM FOR $P_*(\kappa)$ LCP BASED ON A NEW KERNEL FUNCTION

  • Cho, You-Young;Cho, Gyeong-Mi
    • East Asian mathematical journal
    • /
    • v.26 no.1
    • /
    • pp.9-23
    • /
    • 2010
  • In this paper we generalize the large-update primal-dual interior point methods for linear optimization problems in [2] to $P_*(\kappa)$ linear complementarity problems, based on a new kernel function that includes the kernel function in [2] as a special case. The kernel function is neither self-regular nor eligible. Furthermore, we improve the complexity result in [2] from $O(\sqrt{n}(\log n)^2\log\frac{n\mu^0}{\epsilon})$ to $O(\sqrt{n}(\log n)\log(\log n)\log\frac{n\mu^0}{\epsilon})$.
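For context, kernel-function-based interior point methods typically measure proximity to the central path with a barrier built from a univariate kernel function $\psi$; the particular non-self-regular, non-eligible $\psi$ introduced in the paper is not reproduced here.

$$\Psi(v)=\sum_{i=1}^{n}\psi(v_i),\qquad v_i=\sqrt{\frac{x_i s_i}{\mu}},$$

where a large-update iteration drives $\Psi$ below a fixed threshold before the barrier parameter $\mu$ is reduced by a constant factor; the complexity bound above counts the total inner iterations this requires.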

Sparse Representation Learning of Kernel Space Using the Kernel Relaxation Procedure (커널 이완 절차에 의한 커널 공간의 저밀도 표현 학습)

  • 류재홍;정종철
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.11 no.9
    • /
    • pp.817-821
    • /
    • 2001
  • In this paper, a new learning methodology for kernel methods is suggested that yields a sparse representation of the kernel space from the training patterns for classification problems. Among the traditional algorithms for linear discriminant functions, this paper shows that the relaxation procedure can obtain the maximum-margin separating hyperplane of a linearly separable pattern classification problem, as the SVM (Support Vector Machine) classifier does. The original relaxation method gives only a necessary condition for SV patterns; we suggest a sufficient condition to identify the SV patterns during the learning epochs. For sequential learning of kernel methods, an extended SVM and a kernel discriminant function are defined, and a systematic derivation of the learning algorithm is introduced. Experimental results show that the new methods have higher or equivalent performance compared to the conventional approach.
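A minimal sketch of a kernelized single-sample relaxation update, assuming a precomputed Gram matrix K and labels in {-1, +1}; the paper's sufficient condition for identifying SV patterns and its extended SVM formulation are not reproduced here.

```python
import numpy as np

def kernel_relaxation(K, labels, margin=1.0, eta=1.2, epochs=50):
    # Patterns are kept in the expansion f(x) = sum_j c_j y_j k(x_j, x);
    # a pattern whose margin falls below `margin` has its coefficient
    # increased by the classical relaxation step scaled by 1 / k(x_i, x_i).
    n = K.shape[0]
    c = np.zeros(n)
    for _ in range(epochs):
        for i in range(n):
            f_i = labels[i] * np.sum(c * labels * K[:, i])
            if f_i <= margin:                      # margin violation
                c[i] += eta * (margin - f_i) / K[i, i]
    return c

# Toy usage with an RBF Gram matrix.
rng = np.random.default_rng(2)
X = rng.normal(size=(30, 2))
y = np.where(X[:, 0] > 0, 1.0, -1.0)
K = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
c = kernel_relaxation(K, y)
margins = y * ((c * y) @ K)
print("patterns still violating the margin:", int((margins <= 1.0).sum()))
```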


REPRODUCING KERNEL KREIN SPACES

  • Yang, Mee-Hyea
    • Journal of applied mathematics & informatics
    • /
    • v.8 no.2
    • /
    • pp.659-668
    • /
    • 2001
  • Let S(z) be a power series with operator coefficients such that multiplication by S(z) is an everywhere-defined transformation in the space C(z) of square summable power series. In this paper we show that there exists a reproducing kernel Krein space which is the state space of an extended canonical linear system with transfer function S(z). We also characterize the reproducing kernel function of the state space of a linear system.
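For reference, in the scalar-valued case the defining property of a reproducing kernel Krein space $\mathcal{K}$ with kernel $K(z,w)$ is the reproducing identity with respect to the (indefinite) Krein-space inner product,

$$f(w)=\langle f,\,K(\cdot,w)\rangle_{\mathcal{K}}\qquad\text{for all } f\in\mathcal{K};$$

the operator-valued setting of the paper, where the elements are power series with vector coefficients, uses the analogous identity with the kernel applied to a fixed vector.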

ASYMPTOTIC APPROXIMATION OF KERNEL-TYPE ESTIMATORS WITH ITS APPLICATION

  • Kim, Sung-Kyun;Kim, Sung-Lai;Jang, Yu-Seon
    • Journal of applied mathematics & informatics
    • /
    • v.15 no.1_2
    • /
    • pp.147-158
    • /
    • 2004
  • Sufficient conditions are given under which a generalized class of kernel-type estimators allows an asymptotic approximation on the modulus of continuity. This generalized class includes the sample distribution function, the kernel-type estimator of the density function, and an estimator that may apply to the censored case. In addition, an application is given to the asymptotic normality of recursive estimators of the density function at an unknown point.
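For concreteness, the basic kernel-type density estimator covered by such a class is

$$\hat f_n(x)=\frac{1}{n h_n}\sum_{i=1}^{n} K\!\left(\frac{x-X_i}{h_n}\right),$$

with the recursive variant replacing $h_n$ by $h_i$ inside the sum; the exact generalized class and the modulus-of-continuity conditions are those stated in the paper.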

Kernel Machine for Poisson Regression

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.18 no.3
    • /
    • pp.767-772
    • /
    • 2007
  • A kernel machine is proposed as an estimation procedure for linear and nonlinear Poisson regression, based on the penalized negative log-likelihood. The proposed kernel machine provides an estimate of the mean function of the response variable, where the canonical parameter is related to the input vector in a nonlinear form. A generalized cross validation (GCV) function of MSE type is introduced to determine the hyperparameters that affect the performance of the machine. Experimental results are presented which indicate the performance of the proposed machine.
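A minimal sketch of a kernel machine for Poisson regression under the canonical log link, assuming an RBF Gram matrix and an L-BFGS solver; the MSE-type GCV selection of hyperparameters described in the paper is omitted, and the penalty weight is fixed by hand.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_gram(X, gamma=1.0):
    # Gram matrix of the RBF kernel for the kernel expansion eta = K @ alpha.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fit_kernel_poisson(K, y, lam=0.1):
    # Minimize the penalized Poisson negative log-likelihood with the
    # canonical log link: eta = K @ alpha, mean = exp(eta).
    def obj(alpha):
        eta = K @ alpha
        return np.sum(np.exp(eta) - y * eta) + 0.5 * lam * alpha @ K @ alpha
    def grad(alpha):
        eta = K @ alpha
        return K @ (np.exp(eta) - y) + lam * (K @ alpha)
    return minimize(obj, np.zeros(len(y)), jac=grad, method="L-BFGS-B").x

# Toy counts whose log-mean is a nonlinear function of the input.
rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(80, 1))
y = rng.poisson(np.exp(1.0 + np.sin(2 * X[:, 0]))).astype(float)
K = rbf_gram(X)
alpha = fit_kernel_poisson(K, y)
print("fitted mean at first point:", float(np.exp(K[0] @ alpha)))
```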


Sparse kernel classification using IRWLS procedure

  • Kim, Dae-Hak
    • Journal of the Korean Data and Information Science Society
    • /
    • v.20 no.4
    • /
    • pp.749-755
    • /
    • 2009
  • Support vector classification (SVC) provides a more complete description of the linear and nonlinear relationships between input vectors and classifiers. In this paper, we propose a sparse kernel classifier to solve the classification optimization problem with a modified hinge loss function and an absolute loss function, which provides efficient computation and sparsity. We also introduce a generalized cross validation function to select the hyperparameters that affect the classification performance of the proposed method. Experimental results are presented which illustrate the performance of the proposed procedure for classification.
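A minimal IRWLS sketch for a kernel expansion under the absolute loss, one of the two losses mentioned in the abstract; the paper's modified hinge loss, its sparsity mechanism, and the GCV-based hyperparameter selection are not reproduced, and the ridge penalty here is a placeholder.

```python
import numpy as np

def irwls_kernel_absolute(K, y, lam=0.1, iters=30, eps=1e-6):
    # Iteratively reweighted least squares for f = K @ alpha under the
    # absolute loss: each step solves a weighted ridge system whose weights
    # 1/|residual| reproduce the L1 objective at convergence.
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(iters):
        r = y - K @ alpha
        W = np.diag(1.0 / np.maximum(np.abs(r), eps))
        A = K @ W @ K + lam * K + 1e-8 * np.eye(n)   # jitter for stability
        alpha = np.linalg.solve(A, K @ W @ y)
    return alpha

# Toy binary classification with +/-1 targets and an RBF Gram matrix.
rng = np.random.default_rng(4)
X = rng.uniform(-2, 2, size=(60, 1))
y = np.where(np.sin(2 * X[:, 0]) > 0, 1.0, -1.0)
K = np.exp(-((X - X.T) ** 2))
alpha = irwls_kernel_absolute(K, y)
print("training accuracy:", (np.sign(K @ alpha) == y).mean())
```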
