DOI: 10.1145/3321707.3321861

Landscape analysis of Gaussian process surrogates for the covariance matrix adaptation evolution strategy

Published: 13 July 2019

ABSTRACT

The Gaussian process modeling technique has been shown to be a valuable surrogate model for the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) in continuous single-objective black-box optimization tasks where the optimized function is expensive to evaluate. In this paper, we investigate how different Gaussian process settings influence the error between the predicted and the genuine population ordering, in connection with features representing the fitness landscape. Apart from using landscape-analysis features known from the literature, we propose a new set of features based on CMA-ES state variables. We perform the landscape analysis on a large set of data generated by runs of a surrogate-assisted version of the CMA-ES on the noiseless part of the Comparing Continuous Optimisers (COCO) benchmark function testbed.
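The central quantity studied here is the disagreement between how the Gaussian process surrogate orders a CMA-ES population and how the true (expensive) fitness orders it. The following minimal Python sketch is only an illustration of one such ranking error, not the exact measure used by the authors; the sphere function and the added noise are hypothetical stand-ins for the true fitness and the surrogate's predictions.

    import numpy as np

    def ranking_error(y_true, y_pred):
        # Rank the population members by true and by predicted fitness (0 = best).
        rank_true = np.argsort(np.argsort(y_true))
        rank_pred = np.argsort(np.argsort(y_pred))
        n = len(y_true)
        # Normalized sum of absolute rank differences:
        # 0 for identical orderings, close to 1 for a reversed ordering.
        return np.abs(rank_true - rank_pred).sum() / (n * n / 2)

    # Hypothetical example: a population of 10 points in 2-D, with the sphere
    # function standing in for the expensive fitness and additive noise
    # standing in for the Gaussian process predictions.
    rng = np.random.default_rng(0)
    population = rng.standard_normal((10, 2))
    y_true = np.sum(population**2, axis=1)
    y_pred = y_true + 0.1 * rng.standard_normal(10)
    print(ranking_error(y_true, y_pred))

In the setting of the paper, such an ordering error is then related to landscape features computed from the evaluated points and to the proposed features derived from CMA-ES state variables.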



Published in

GECCO '19: Proceedings of the Genetic and Evolutionary Computation Conference
July 2019, 1545 pages
ISBN: 9781450361118
DOI: 10.1145/3321707
Copyright © 2019 ACM


Publisher

Association for Computing Machinery, New York, NY, United States

          Qualifiers

          • research-article

Acceptance Rates

Overall Acceptance Rate: 1,669 of 4,410 submissions, 38%

