http://dx.doi.org/10.9766/KIMST.2022.25.4.412

An Application of Surrogate and Resampling for the Optimization of Success Probability from Binary-Response Type Simulation  

Lee, Donghoon (The 1st Research and Development Institute, Agency for Defense Development)
Hwang, Kunchul (The 1st Research and Development Institute, Agency for Defense Development)
Lee, Sangil (The 1st Research and Development Institute, Agency for Defense Development)
Yun, Won-young (Department of Industrial Engineering, Pusan National University)
Publication Information
Journal of the Korea Institute of Military Science and Technology, v.25, no.4, 2022, pp. 412-424
Abstract
Since traditional derivative-based optimization performs poorly on noisy simulations, evolutionary algorithms are considered as substitutes. Especially when the outputs are binary, more simulation trials are needed to reach a near-optimal solution because the outputs are discrete and have high, heterogeneous variance. In this paper, we propose a genetic algorithm called SARAGA, which adopts dynamic resampling and surrogate-based fitness approximation. SARAGA reduces the number of unnecessary expensive simulations used to estimate success probabilities from binary simulation outputs. It allocates the number of samples to each solution dynamically and sometimes approximates the fitness without additional expensive experiments. Experimental results show that this novel approach is effective and that a proper choice of surrogate and resampling hyperparameters can improve the performance of the algorithm.
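To make the abstract's description concrete, the sketch below illustrates, under stated assumptions, a genetic algorithm loop that estimates a Bernoulli success probability with a variable number of replications per candidate and skips simulation in favor of a simple distance-weighted surrogate when close archived evaluations exist. The toy simulator, the surrogate, the sampling rule, and all function names are illustrative choices, not the authors' SARAGA implementation.

```python
# Minimal sketch (not the paper's SARAGA code) of a GA that combines
# dynamic resampling of a binary-response simulator with surrogate-based
# fitness approximation. All parameters and helper names are assumptions.
import math
import random

def simulate(x):
    """Toy binary-response simulator: returns 1 with probability p(x)."""
    p = math.exp(-sum(v * v for v in x))          # unknown success probability
    return 1 if random.random() < p else 0

def surrogate_estimate(x, archive, k=5):
    """Inverse-distance-weighted average of the k nearest archived estimates."""
    if len(archive) < k:
        return None, float("inf")
    dists = sorted((math.dist(x, xi), pi) for xi, pi in archive)[:k]
    weights = [1.0 / (d + 1e-9) for d, _ in dists]
    est = sum(w * p for w, (_, p) in zip(weights, dists)) / sum(weights)
    return est, dists[0][0]                        # estimate and nearest distance

def fitness(x, archive, n_min=5, n_max=50, trust_radius=0.05):
    """Use the surrogate if a close neighbour exists; otherwise resample.

    The number of extra Bernoulli replications grows with the estimated
    variance p*(1-p), so uncertain candidates receive more budget.
    """
    est, nearest = surrogate_estimate(x, archive)
    if est is not None and nearest < trust_radius:
        return est                                 # fitness approximation, no simulation
    hits = sum(simulate(x) for _ in range(n_min))
    p_hat = hits / n_min
    extra = int((n_max - n_min) * 4 * p_hat * (1 - p_hat))  # dynamic resampling
    hits += sum(simulate(x) for _ in range(extra))
    p_hat = hits / (n_min + extra)
    archive.append((tuple(x), p_hat))
    return p_hat

def ga(dim=2, pop_size=20, generations=30, sigma=0.2):
    """Plain GA loop: truncation selection, blend crossover, Gaussian mutation."""
    archive = []
    pop = [[random.uniform(-2, 2) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = [(fitness(x, archive), x) for x in pop]
        scored.sort(reverse=True)
        parents = [x for _, x in scored[: pop_size // 2]]
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = [(ai + bi) / 2 + random.gauss(0, sigma) for ai, bi in zip(a, b)]
            children.append(child)
        pop = children
    return max((fitness(x, archive), x) for x in pop)

if __name__ == "__main__":
    best_p, best_x = ga()
    print(f"estimated success probability {best_p:.3f} at x = {best_x}")
```

In this sketch the extra-replication count scales with the estimated variance p̂(1 − p̂), so candidates whose success probability is close to 0 or 1 consume little simulation budget; the allocation and surrogate rules actually used in the paper may differ.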
Keywords
Noisy Optimization; Genetic Algorithm; Bernoulli Simulation; Dynamic Resampling; Surrogate;