http://dx.doi.org/10.7465/jkdi.2012.23.6.1045

Smoothing Kaplan-Meier estimate using monotone support vector regression  

Hwang, Changha (Department of Statistics, Dankook University)
Shim, Jooyong (Department of Data Science, Inje University)
Publication Information
Journal of the Korean Data and Information Science Society / v.23, no.6, 2012, pp. 1045-1054
Abstract
The support vector machine is known to be a very useful statistical method for classification and nonlinear function estimation. In this paper we propose a monotone support vector regression (SVR) for estimating monotonically decreasing functions. The proposed monotone SVR is applied to smooth the Kaplan-Meier estimate of the survival function. Experimental results based on survival functions generated from the exponential distribution are presented to illustrate the performance of the proposed monotone SVR.
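The idea can be illustrated with a minimal sketch: compute the Kaplan-Meier step estimate from right-censored exponential data, then fit a Gaussian-kernel least-squares smoother whose fitted values are constrained to be non-increasing. This is not the authors' exact dual formulation; the kernel width, ridge penalty, exponential rates, and the use of scipy's SLSQP solver are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 60
lifetime = rng.exponential(scale=1.0, size=n)   # true survival: S(t) = exp(-t)
censor = rng.exponential(scale=2.0, size=n)     # independent right censoring
time = np.minimum(lifetime, censor)
event = (lifetime <= censor).astype(int)

# Kaplan-Meier estimate at the distinct event times
t_ev = np.unique(time[event == 1])
surv, s = [], 1.0
for t in t_ev:
    d = np.sum((time == t) & (event == 1))      # deaths at time t
    r = np.sum(time >= t)                       # number at risk just before t
    s *= 1.0 - d / r
    surv.append(s)
surv = np.array(surv)

# Gaussian-kernel least-squares smoother with a monotone-decreasing constraint
sigma, gam = 0.5, 1.0                           # assumed kernel width and ridge penalty
K = np.exp(-(t_ev[:, None] - t_ev[None, :]) ** 2 / (2 * sigma ** 2))
grid = np.linspace(t_ev.min(), t_ev.max(), 50)
Kg = np.exp(-(grid[:, None] - t_ev[None, :]) ** 2 / (2 * sigma ** 2))

def objective(alpha):
    fit = K @ alpha
    return np.sum((surv - fit) ** 2) + gam * alpha @ K @ alpha

def monotone(alpha):
    f = Kg @ alpha
    return f[:-1] - f[1:]                       # require f(grid[i]) >= f(grid[i+1])

res = minimize(objective, x0=np.zeros(len(t_ev)), method="SLSQP",
               constraints=[{"type": "ineq", "fun": monotone}])
smoothed = Kg @ res.x                           # smooth, non-increasing estimate of S(t)
print(smoothed[0], smoothed[-1])                # high near the first event time, low near the last
```

The smoothed curve replaces the jumps of the Kaplan-Meier step function with a continuous, non-increasing fit; in the paper the same goal is achieved within the SVR framework by adding monotonicity constraints to the usual quadratic program.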
Keywords
Kernel parameter; kernel technique; monotonicity; randomly right censored data; smoothing; support vector machine; survival function