
Doubly penalized kernel method for heteroscedastic autoregressive data

Cho, Dae-Hyeon (Department of Data Science, Institute of Statistical Information, Inje University)
Shim, Joo-Yong (Department of Applied Statistics, Catholic University of Daegu)
Seok, Kyung-Ha (Department of Data Science, Institute of Statistical Information, Inje University)
Publication Information
Journal of the Korean Data and Information Science Society / v.21, no.1, 2010, pp. 155-162
Abstract
In this paper we propose a doubly penalized kernel method which estimates both the mean function and the variance function simultaneously by kernel machines for heteroscedastic autoregressive data. We also present a model selection method which employs cross validation techniques for choosing the hyper-parameters that affect the performance of the proposed method. Simulated examples are provided to indicate the usefulness of the proposed method for the estimation of mean and variance functions.
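The paper's exact formulation and algorithm are not reproduced on this page, so the following is only a minimal illustrative sketch of the general idea: a Gaussian (RBF) kernel, a weighted kernel-ridge (LS-SVM style) step for the mean function, and a second kernel-ridge step on the log squared residuals for the log-variance function, iterated until the two estimates stabilize. The lag order p, the kernel parameter gamma, and the two penalties lam_mean and lam_var are illustrative names, not the authors' notation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Gaussian (RBF) kernel matrix between the rows of X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit_mean_variance(y, p=1, gamma=1.0, lam_mean=1.0, lam_var=1.0, n_iter=10):
    # Lagged design: x_t = (y_{t-1}, ..., y_{t-p}), response z_t = y_t
    X = np.column_stack([y[p - j - 1:len(y) - j - 1] for j in range(p)])
    z = y[p:]
    n = len(z)
    K = rbf_kernel(X, X, gamma)
    w = np.ones(n)                              # working precisions 1 / sigma_t^2
    for _ in range(n_iter):
        # Mean step: weighted kernel ridge, f_hat = K @ alpha
        W = np.diag(w)
        alpha = np.linalg.solve(W @ K + lam_mean * np.eye(n), W @ z)
        f_hat = K @ alpha
        # Variance step: kernel ridge on log squared residuals, log sigma^2 = K @ beta
        r = np.log((z - f_hat) ** 2 + 1e-8)
        beta = np.linalg.solve(K + lam_var * np.eye(n), r)
        g_hat = K @ beta
        w = np.exp(-g_hat)                      # refresh the precisions for the next pass
    return f_hat, np.exp(g_hat)

# Example: an AR(1) series with level-dependent (heteroscedastic) noise
rng = np.random.default_rng(0)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.8 * y[t - 1] + np.sqrt(0.1 + 0.3 * y[t - 1] ** 2) * rng.standard_normal()
mean_hat, var_hat = fit_mean_variance(y, p=1, gamma=0.5, lam_mean=1.0, lam_var=1.0)
```

In the paper the hyper-parameters are chosen by a cross validation function; in this sketch they would simply be grid-searched against a cross validation or hold-out criterion.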
Keywords
Autoregressive process; cross validation function; heteroscedasticity; hyper-parameters; kernel function