http://dx.doi.org/10.5351/CKSS.2008.15.1.125

Using Support Vector Regression for Optimization of Black-box Objective Functions  

Kwak, Min-Jung (Department of Computer Science & Statistics, Pyongtaek University)
Yoon, Min (Department of Applied Statistics, Konkuk University)
Publication Information
Communications for Statistical Applications and Methods, v.15, no.1, 2008, pp. 125-136
Abstract
In many practical engineering design problems, the objective functions are not given explicitly in terms of the design variables. In such circumstances, the value of an objective function at given design variables is obtained by real or computational experiments such as structural analysis, fluid mechanics analysis, thermodynamic analysis, and so on. These experiments are, in general, considerably expensive. In order to keep the number of experiments as small as possible, optimization is performed in parallel with predicting the form of the objective functions; Response Surface Methods (RSM) are well known along these lines. This paper suggests applying Support Vector Machines (SVM) to predict the objective functions. One of the most important tasks in this approach is to allocate sample points appropriately so that the number of experiments remains small. It will be shown that the information carried by the support vectors can be used effectively to this end. The effectiveness of the suggested method is demonstrated through a numerical example well known in engineering design.
Keywords
Support vector regression; genetic algorithm; global and local information; optimization
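The approach outlined in the abstract — fit a surrogate model to expensive black-box evaluations, then choose the next experiment where the surrogate looks promising — can be illustrated with a minimal sketch. This is not the authors' algorithm (their sampling criterion additionally exploits support-vector information and a genetic algorithm); it greedily samples the surrogate minimizer, uses scikit-learn's SVR, and a made-up one-dimensional objective, with all parameter values illustrative.

```python
# Minimal sketch of surrogate-based optimization with SVR.
# NOT the paper's exact method: here the next sample is simply the
# surrogate minimizer on a grid. The objective f and all settings
# (C, epsilon, grid size, loop length) are illustrative assumptions.
import numpy as np
from sklearn.svm import SVR

def f(x):
    # Stand-in "expensive" black-box objective (one real experiment per call).
    return (x - 2.0) ** 2 + np.sin(3.0 * x)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 5.0, size=8).reshape(-1, 1)   # initial design points
y = f(X).ravel()                                   # initial experiments

grid = np.linspace(0.0, 5.0, 501).reshape(-1, 1)   # candidate points
for _ in range(10):                                # sequential sampling loop
    model = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(X, y)
    x_new = grid[np.argmin(model.predict(grid))]   # surrogate minimizer
    X = np.vstack([X, x_new.reshape(1, -1)])       # allocate one new sample
    y = np.append(y, f(x_new)[0])                  # one more real experiment

best = X[np.argmin(y), 0]
print(f"best x = {best:.3f}, f(x) = {y.min():.3f}")
```

After fitting, `model.support_` gives the indices of the support vectors; a criterion in the spirit of the paper would consult these points — where the surrogate fits least tightly — when deciding where to place the next experiment, rather than sampling the surrogate minimizer alone.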