• Title/Summary/Keyword: kernel regression

Kernel Poisson regression for mixed input variables

  • Shim, Jooyong
    • Journal of the Korean Data and Information Science Society / v.23 no.6 / pp.1231-1239 / 2012
  • An estimating procedure is introduced for kernel Poisson regression when the input variables consist of numerical and categorical variables, which is based on the penalized negative log-likelihood and the component-wise product of two different types of kernel functions. The proposed procedure provides the estimates of the mean function of the response variables, where the canonical parameter is linearly and/or nonlinearly related to the input variables. Experimental results are then presented which indicate the performance of the proposed kernel Poisson regression.
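
A minimal sketch of such an estimating procedure, assuming a Gaussian kernel on the numerical inputs, a simple matching kernel on the categorical inputs, and plain gradient descent on the penalized negative log-likelihood (function names, the learning rate, and the penalty value are illustrative, not the paper's exact implementation):

```python
import numpy as np

def mixed_kernel(Xnum1, Xcat1, Xnum2, Xcat2, gamma=1.0):
    """Component-wise product of a Gaussian kernel (numerical part)
    and a simple matching kernel (categorical part)."""
    d2 = ((Xnum1[:, None, :] - Xnum2[None, :, :]) ** 2).sum(-1)
    k_num = np.exp(-gamma * d2)
    k_cat = (Xcat1[:, None] == Xcat2[None, :]).astype(float)
    return k_num * k_cat

def fit_kernel_poisson(Xnum, Xcat, y, lam=1e-2, lr=1e-3, steps=2000):
    """Minimize sum_i (mu_i - y_i * eta_i) + lam * alpha' K alpha
    by gradient descent, where eta = K alpha is the canonical
    parameter and mu = exp(eta) is the estimated mean function."""
    K = mixed_kernel(Xnum, Xcat, Xnum, Xcat)
    alpha = np.zeros(len(y))
    for _ in range(steps):
        mu = np.exp(K @ alpha)
        grad = K @ (mu - y) + 2.0 * lam * (K @ alpha)
        alpha -= lr * grad
    return alpha, K
```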

Ensemble approach for improving prediction in kernel regression and classification

  • Han, Sunwoo; Hwang, Seongyun; Lee, Seokho
    • Communications for Statistical Applications and Methods / v.23 no.4 / pp.355-362 / 2016
  • Ensemble methods often improve prediction in various predictive models by combining multiple weak learners and reducing the variability of the final predictive model. In this work, we demonstrate that ensemble methods also enhance prediction accuracy for kernel ridge regression and kernel logistic regression classification. We apply bagging and random forests to these two kernel-based predictive models and present the procedure by which bagging and random forests can be embedded in them. Our proposals are tested on numerous synthetic and real datasets and compared with the plain kernel-based predictive models and their subsampling approach. Numerical studies demonstrate that the ensemble approach outperforms the plain kernel-based predictive models.

A Fast Kernel Regression Framework for Video Super-Resolution

  • Yu, Wen-Sen; Wang, Ming-Hui; Chang, Hua-Wen; Chen, Shu-Qing
    • KSII Transactions on Internet and Information Systems (TIIS) / v.8 no.1 / pp.232-248 / 2014
  • A series of kernel regression (KR) algorithms, such as the classic kernel regression (CKR) and the 2- and 3-D steering kernel regression (SKR), have been proposed for image and video super-resolution. In existing KR frameworks, a single algorithm is usually adopted and applied to a whole image/video, regardless of region characteristics. However, their performance and computational efficiency can differ across regions with different characteristics. To take full advantage of the KR algorithms and avoid their disadvantages, this paper proposes a kernel regression framework for video super-resolution. In this framework, each video frame is first analyzed and divided into three types of regions: flat, non-flat-stationary, and non-flat-moving regions. A different KR algorithm is then selected according to the region type. The CKR and 2-D SKR algorithms are applied to flat and non-flat-stationary regions, respectively. For non-flat-moving regions, this paper proposes a similarity-assisted steering kernel regression (SASKR) algorithm, which gives better performance and higher computational efficiency than the 3-D SKR algorithm. Experimental results demonstrate that the computational efficiency of the proposed framework is greatly improved without apparent degradation in performance.
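
The per-region dispatch can be illustrated with a toy classifier that labels each block by local variance (flat vs. non-flat) and inter-frame difference (stationary vs. moving); the thresholds below are illustrative, not the paper's values:

```python
import numpy as np

def classify_region(block, prev_block, var_thresh=25.0, motion_thresh=10.0):
    """Decide which KR algorithm a block should get:
    flat -> CKR, non-flat-stationary -> 2-D SKR,
    non-flat-moving -> SASKR."""
    if block.var() < var_thresh:
        return "flat"
    if np.abs(block - prev_block).mean() < motion_thresh:
        return "non-flat-stationary"
    return "non-flat-moving"
```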

Variable selection in the kernel Cox regression

  • Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society / v.22 no.4 / pp.795-801 / 2011
  • In machine learning and statistics it is often the case that some variables are not important, while some variables are more important than others. We propose a novel algorithm for selecting such relevant variables in kernel Cox regression. We employ a weighted version of the ANOVA decomposition kernels to choose an optimal subset of relevant variables in kernel Cox regression. Experimental results are then presented which indicate the performance of the proposed method.
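
One way to realize selection with weighted ANOVA decomposition kernels is to score each binary weight vector by the leave-one-out error of the induced kernel ridge smoother. The sketch below uses 0/1 weights, Gaussian base kernels, and a squared-error LOO shortcut — illustrative choices, not necessarily the paper's (which works in the Cox partial-likelihood setting):

```python
import itertools
import numpy as np

def anova_kernel(X1, X2, theta, gamma=1.0):
    """First-order ANOVA decomposition kernel with per-variable weights:
    K = sum_j theta_j * k_j(x_j, z_j)."""
    K = 0.0
    for j, th in enumerate(theta):
        if th:
            K = K + th * np.exp(-gamma * (X1[:, [j]] - X2[:, [j]].T) ** 2)
    return K

def loo_error(K, y, lam=1e-1):
    """Leave-one-out error of the linear smoother H = K (K + lam I)^-1."""
    H = K @ np.linalg.inv(K + lam * np.eye(len(y)))
    resid = (y - H @ y) / (1.0 - np.diag(H))
    return (resid ** 2).mean()

def select_variables(X, y, lam=1e-1):
    """Exhaustive search over 0/1 weight vectors (small p only)."""
    best, best_err = None, np.inf
    for theta in itertools.product([0, 1], repeat=X.shape[1]):
        if not any(theta):
            continue
        err = loo_error(anova_kernel(X, X, theta), y, lam)
        if err < best_err:
            best, best_err = theta, err
    return best
```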

Kernel method for autoregressive data

  • Shim, Joo-Yong; Lee, Jang-Taek
    • Journal of the Korean Data and Information Science Society / v.20 no.5 / pp.949-954 / 2009
  • The autoregressive process is applied in this paper to kernel regression in order to infer nonlinear models for predicting responses. We propose a kernel method for autoregressive data which estimates the mean function by kernel machines. We also present a model selection method which employs cross-validation techniques for choosing the hyper-parameters that affect the performance of kernel regression. Artificial and real examples are provided to indicate the usefulness of the proposed method for estimating the mean function in the presence of autocorrelation between data.
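
A hedged sketch of the overall recipe: embed the series into lagged input/response pairs, then pick kernel ridge regression hyper-parameters by k-fold cross validation (the lag-1 embedding, Gaussian kernel, and grids are illustrative; the paper's estimator and CV scheme may differ):

```python
import numpy as np

def embed_ar(series, lag=1):
    """Turn a series into (lagged inputs, responses) pairs."""
    X = np.column_stack([series[i:len(series) - lag + i] for i in range(lag)])
    return X, series[lag:]

def krr(X, y, Xnew, lam, gamma):
    K = np.exp(-gamma * ((X[:, None] - X[None, :]) ** 2).sum(-1))
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    Kn = np.exp(-gamma * ((Xnew[:, None] - X[None, :]) ** 2).sum(-1))
    return Kn @ alpha

def cv_select(X, y, lams, gammas, k=5):
    """Grid search over hyper-parameters by k-fold cross validation."""
    folds = np.array_split(np.arange(len(y)), k)
    best, best_err = None, np.inf
    for lam in lams:
        for gamma in gammas:
            err = 0.0
            for f in folds:
                mask = np.ones(len(y), dtype=bool)
                mask[f] = False
                pred = krr(X[mask], y[mask], X[f], lam, gamma)
                err += ((pred - y[f]) ** 2).sum()
            if err < best_err:
                best, best_err = (lam, gamma), err
    return best
```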

On Predicting with Kernel Ridge Regression

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society / v.14 no.1 / pp.103-111 / 2003
  • Kernel machines are widely used in real-world regression tasks. Kernel ridge regression (KRR) and support vector machines (SVM) are typical kernel machines. Here, we focus on two types of KRR: inductive KRR and transductive KRR. In this paper, we study how differently they work in the interpolation and extrapolation regions. Furthermore, we study a prediction interval estimation method for KRR. This turns out to be a reliable and practical measure of the prediction interval and is essential in real-world tasks.
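
Inductive KRR with one common (Gaussian-process-style) construction of a prediction interval can be sketched as follows — the paper's interval estimator may differ; the variance formula here is an illustrative assumption, and intervals widen naturally in the extrapolation region where the test point is far from the training data:

```python
import numpy as np

def krr_predict_interval(Xtr, ytr, Xnew, lam=1e-1, gamma=1.0, z=1.96):
    """Inductive KRR mean plus a GP-style approximate prediction interval."""
    def k(A, B):
        return np.exp(-gamma * ((A[:, None] - B[None, :]) ** 2).sum(-1))
    K = k(Xtr, Xtr)
    Kinv = np.linalg.inv(K + lam * np.eye(len(ytr)))
    Kn = k(Xnew, Xtr)
    mean = Kn @ (Kinv @ ytr)
    # noise level estimated from training residuals
    sigma2 = ((ytr - K @ (Kinv @ ytr)) ** 2).mean()
    # predictive variance of f grows as Kn -> 0 (extrapolation)
    var_f = np.clip(1.0 - np.einsum('ij,jk,ik->i', Kn, Kinv, Kn), 0.0, None)
    half = z * np.sqrt(sigma2 * (1.0 + var_f))
    return mean, mean - half, mean + half
```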

Censored Kernel Ridge Regression

  • Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society / v.16 no.4 / pp.1045-1052 / 2005
  • This paper deals with the estimation of kernel ridge regression when the responses are subject to random right censoring. Weighted data are formed by redistributing the weights of the censored data to the uncensored data; kernel ridge regression can then be carried out with the weighted data. The hyperparameters of the model, which affect the performance of the proposed procedure, are selected by a generalized approximate cross-validation (GACV) function. Experimental results are then presented which indicate the performance of the proposed procedure.
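
A hedged sketch of one common redistribute-to-the-right weighting (Kaplan-Meier jump weights) followed by weighted kernel ridge regression — this particular weight scheme and the Gaussian kernel are illustrative assumptions, not necessarily the paper's exact construction:

```python
import numpy as np

def km_weights(t, delta):
    """Weights after sorting by observed time: censored points get
    weight 0 and their mass moves to later uncensored points."""
    order = np.argsort(t)
    d = delta[order]
    n = len(t)
    w = np.zeros(n)
    surv = 1.0
    for i in range(n):
        if d[i]:
            w[i] = surv / (n - i)
            surv *= (n - i - 1) / (n - i)
    return order, w

def censored_krr(X, t, delta, lam=1e-1, gamma=1.0):
    """Weighted KRR: solve (W K + lam I) alpha = W t."""
    order, w = km_weights(t, delta)
    Xs, ts = X[order], t[order]
    K = np.exp(-gamma * ((Xs[:, None] - Xs[None, :]) ** 2).sum(-1))
    alpha = np.linalg.solve(w[:, None] * K + lam * np.eye(len(ts)), w * ts)
    return Xs, alpha
```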

A note on nonparametric density deconvolution by weighted kernel estimators

  • Lee, Sungho
    • Journal of the Korean Data and Information Science Society / v.25 no.4 / pp.951-959 / 2014
  • Recently, Hazelton and Turlach (2009) proposed a weighted kernel density estimator for the deconvolution problem. In the case of Gaussian kernels and measurement error, they argued that the weighted kernel density estimator is competitive with the classical deconvolution kernel estimator. In this paper, we consider weighted kernel density estimators when sample observations are contaminated by double-exponentially distributed errors. The performance of the weighted kernel density estimators is compared with the classical deconvolution kernel estimator and the kernel density estimator based on the support vector regression method by means of a simulation study. The weighted density estimator with the Gaussian kernel shows numerical instability in the practical implementation of the optimization function. The weighted density estimates with the double exponential kernel follow patterns very similar to the classical kernel density estimates in the simulations, but their shape is less satisfactory than that of the classical kernel density estimator with the Gaussian kernel.
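
For context, the classical deconvolution kernel estimator that the weighted estimators are compared against has a closed form when the error is double exponential (Laplace) and the kernel is Gaussian: the deconvoluting kernel is L(u) = φ(u)·(1 + (b/h)²(1 − u²)), where φ is the standard normal density, b the Laplace scale, and h the bandwidth. A sketch (bandwidth choice is illustrative):

```python
import numpy as np

def deconv_kde(w_obs, grid, h, b):
    """Classical deconvolution KDE for W = X + e, e ~ Laplace(scale b),
    with a Gaussian kernel; uses the closed-form deconvoluting kernel
    L(u) = phi(u) * (1 + (b/h)^2 * (1 - u^2))."""
    u = (grid[:, None] - w_obs[None, :]) / h
    phi = np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)
    L = phi * (1.0 + (b / h) ** 2 * (1.0 - u ** 2))
    return L.mean(axis=1) / h
```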

Nonparametric Kernel Regression Model for Rating Curve (수위-유량곡선을 위한 비매개 변수적 Kernel 회귀모형)

  • Moon, Young-Il; Cho, Sung-Jin; Chun, Si-Young
    • Journal of Korea Water Resources Association / v.36 no.6 / pp.1025-1033 / 2003
  • In common with workers in other fields, scientists and engineers in hydrology relate one variable to two or more other variables for purposes of prediction, optimization, and control, and statistical methods have been developed to establish such relationships. Regression is the most commonly used statistical technique in hydrologic fields, e.g., for the relationship between the monitored stage and the corresponding discharge (the rating curve). Regression methods expressed as mathematical equations with parameters are called parametric methods; at times, the estimation of those parameters is complicated and uncertain. Many nonparametric regression methods, which do not require a parametric form, have been proposed and studied, the most popular of which is kernel regression. Kernel regression offers a way of estimating the regression function without specifying a parametric model. This paper compares several bandwidth selection methods based on least squares and cross-validation.
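
The core of such a study can be sketched as a Nadaraya-Watson estimator with the bandwidth chosen by leave-one-out least-squares cross validation (the Gaussian kernel and the candidate bandwidth grid are illustrative choices):

```python
import numpy as np

def nw_estimate(x, y, xnew, h):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    w = np.exp(-0.5 * ((xnew[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

def loo_cv_bandwidth(x, y, hs):
    """Pick the bandwidth minimizing leave-one-out squared error."""
    best_h, best_err = None, np.inf
    for h in hs:
        w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
        np.fill_diagonal(w, 0.0)          # leave each point out
        pred = (w * y).sum(axis=1) / w.sum(axis=1)
        err = ((pred - y) ** 2).mean()
        if err < best_err:
            best_h, best_err = h, err
    return best_h
```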

M-quantile regression using kernel machine technique

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society / v.21 no.5 / pp.973-981 / 2010
  • Quantile regression investigates the quantiles of the conditional distribution of a response variable given a set of covariates. M-quantile regression extends this idea through a "quantile-like" generalization of regression based on influence functions. In this paper, we propose a new method of estimating M-quantile regression functions which uses the kernel machine technique. Simulation studies are presented that show the finite-sample properties of the proposed M-quantile regression.
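
One plausible realization of a kernel M-quantile fit is iteratively reweighted kernel ridge regression with an asymmetrically tilted Huber influence function; the sketch below is an illustrative assumption (Gaussian kernel, fixed iteration count), not necessarily the paper's exact algorithm:

```python
import numpy as np

def kernel_mquantile(X, y, q=0.5, c=1.345, lam=1e-1, gamma=1.0, iters=30):
    """IRLS for a kernel M-quantile: Huber psi tilted by q for positive
    residuals and (1-q) for negative ones; each step solves the
    weighted KRR system (W K + lam I) alpha = W y."""
    n = len(y)
    K = np.exp(-gamma * ((X[:, None] - X[None, :]) ** 2).sum(-1))
    alpha = np.linalg.solve(K + lam * np.eye(n), y)   # KRR warm start
    for _ in range(iters):
        r = y - K @ alpha
        asym = np.where(r > 0, q, 1.0 - q)            # quantile-like tilt
        huber = np.clip(r, -c, c)                     # Huber psi
        w = asym * np.where(np.abs(r) > 1e-8, huber / r, 1.0)
        alpha = np.linalg.solve(w[:, None] * K + lam * np.eye(n), w * y)
    return K @ alpha
```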