
2018 | OriginalPaper | Chapter

A Comparative Study of Large-Scale Variants of CMA-ES

Authors: Konstantinos Varelas, Anne Auger, Dimo Brockhoff, Nikolaus Hansen, Ouassim Ait ElHara, Yann Semet, Rami Kassab, Frédéric Barbaresco

Published in: Parallel Problem Solving from Nature – PPSN XV

Publisher: Springer International Publishing


Abstract

The CMA-ES is one of the most powerful stochastic numerical optimizers for difficult black-box problems. Its intrinsic time and space complexity is quadratic, which limits its applicability as problem dimensionality increases. To circumvent this limitation, different large-scale variants of CMA-ES with subquadratic complexity have been proposed over the past ten years. To date, however, these variants have been tested and compared only in rather restrictive settings, due to the lack of a comprehensive large-scale testbed to assess their performance. In this context, we introduce a new large-scale testbed with dimensions up to 640, implemented within the COCO benchmarking platform. We use this testbed to assess the performance of several promising variants of CMA-ES and of the standard limited-memory L-BFGS. In all tested dimensions, the best CMA-ES variant solves more problems than L-BFGS for larger budgets, while L-BFGS outperforms the best CMA-ES variant for smaller budgets. Over all functions, however, the cumulative runtime distributions of L-BFGS and the best CMA-ES variants are close (within a factor of 4 in high dimension).
Our results illustrate the different scaling behaviors of the methods, expose a few defects of the algorithms, and reveal that for dimensions larger than 80, LM-CMA solves more problems than VkD-CMA, while in the cumulative runtime distribution over all functions VkD-CMA dominates LM-CMA for budgets up to \(10^4\) times the dimension, and for all budgets up to dimension 80.
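The quadratic cost discussed in the abstract comes from storing and updating an explicit n-by-n covariance matrix; limited-memory variants replace it with a few direction vectors. The following numpy sketch is purely illustrative (the sizes n and m, and the simplified rank-one update, are assumptions for the example, not the implementations benchmarked in the chapter):

```python
import numpy as np

# Hypothetical sizes, chosen only for illustration.
n = 640          # problem dimension (the largest in the new testbed)
m = 10           # direction vectors kept by a limited-memory scheme

# Standard CMA-ES: explicit n-by-n covariance matrix -> O(n^2) memory,
# with per-iteration update and sampling costs that are also quadratic.
C = np.eye(n)
full_floats = C.size                     # n * n floats

# Limited-memory variants (LM-CMA-style) store only m vectors and
# represent correlations implicitly -> O(m * n) memory.
V = np.zeros((m, n))
lm_floats = V.size                       # m * n floats

# The core ingredient both families share, a rank-one covariance update:
# C <- (1 - c1) * C + c1 * p p^T, with learning rate c1 and search path p.
c1 = 0.01
p = np.random.randn(n)
C = (1 - c1) * C + c1 * np.outer(p, p)   # O(n^2) work in the full model

print(full_floats, lm_floats, full_floats // lm_floats)  # 409600 6400 64
```

Even at the moderate dimension 640, the explicit matrix needs 64 times the storage of the 10-vector representation, and the gap grows linearly with n.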

Footnotes
1
All raw datasets are available for download at http://coco.gforge.inria.fr/doku.php?id=algorithms, while already postprocessed results are available (without the need to install COCO) at http://coco.gforge.inria.fr/ppdata-archive.
 
2
The source code of the new test suite (incl. adaptations in COCO’s postprocessing) can be found in the devel-LS-development branch of the COCO Github page.
 
3
Except L-BFGS, where the factr parameter was set to 1.0 for very high precision.
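In SciPy's L-BFGS-B interface, factr scales the relative function-value reduction that triggers termination (threshold roughly factr times machine epsilon), so factr=1.0 asks for near machine-precision convergence; the default is the much looser 1e7. A minimal sketch on a sphere function, assuming the SciPy interface (the chapter does not state which L-BFGS implementation was used):

```python
import numpy as np
from scipy.optimize import fmin_l_bfgs_b

def sphere(x):
    """Convex quadratic test function f(x) = x^T x."""
    return float(np.dot(x, x))

def sphere_grad(x):
    """Analytic gradient of the sphere function."""
    return 2.0 * x

x0 = np.ones(10)
# factr=1.0: terminate only when the relative reduction in f falls
# below ~1.0 * machine epsilon, i.e. run to very high precision.
x_opt, f_opt, info = fmin_l_bfgs_b(sphere, x0, fprime=sphere_grad, factr=1.0)
print(f_opt)
```

With the default factr=1e7 the run would stop far earlier at a correspondingly lower precision, which matters when benchmarking against the tight target values used in COCO.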
 
Literature
1. Ait ElHara, O., Auger, A., Hansen, N.: Permuted orthogonal block-diagonal transformation matrices for large scale optimization benchmarking. In: Genetic and Evolutionary Computation Conference (GECCO 2016), pp. 189–196. ACM (2016)
3. Akimoto, Y., Hansen, N.: Projection-based restricted covariance matrix adaptation for high dimension. In: Genetic and Evolutionary Computation Conference (GECCO 2016), pp. 197–204. Denver, USA, July 2016
4. Hansen, N., Auger, A., Mersmann, O., Tušar, T., Brockhoff, D.: COCO: a platform for comparing continuous optimizers in a black-box setting (2016). arXiv:1603.08785
5. Hansen, N., Finck, S., Ros, R., Auger, A.: Real-parameter black-box optimization benchmarking 2009: noiseless functions definitions. Research Report RR-6829, INRIA (2009)
6. Hansen, N., Ostermeier, A.: Completely derandomized self-adaptation in evolution strategies. Evol. Comput. 9(2), 159–195 (2001)
7. Knight, J.N., Lunacek, M.: Reducing the space-time complexity of the CMA-ES. In: Genetic and Evolutionary Computation Conference (GECCO 2007), pp. 658–665. ACM (2007)
8. Krause, O., Arbonès, D.R., Igel, C.: CMA-ES with optimal covariance update and storage complexity. In: NIPS Proceedings (2016)
9. Li, Z., Zhang, Q.: A simple yet efficient evolution strategy for large scale black-box optimization. IEEE Trans. Evol. Comput. (2017, accepted)
10. Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Math. Program. 45(3), 503–528 (1989)
11. Loshchilov, I.: LM-CMA: an alternative to L-BFGS for large scale black-box optimization. Evol. Comput. 25, 143–171 (2017)
12. Loshchilov, I.: A computationally efficient limited memory CMA-ES for large scale optimization. In: Genetic and Evolutionary Computation Conference (GECCO 2014), pp. 397–404 (2014)
13. Loshchilov, I., Glasmachers, T., Beyer, H.: Limited-memory matrix adaptation for large scale black-box optimization. CoRR abs/1705.06693 (2017)
15. Sun, Y., Gomez, F.J., Schaul, T., Schmidhuber, J.: A linear time natural evolution strategy for non-separable functions. CoRR abs/1106.1998 (2011)
16. Suttorp, T., Hansen, N., Igel, C.: Efficient covariance matrix update for variable metric evolution strategies. Mach. Learn. 75(2), 167–197 (2009)
17. Tang, K., et al.: Benchmark functions for the CEC 2008 special session and competition on large scale global optimization (2007)
Metadata
Copyright Year
2018
DOI
https://doi.org/10.1007/978-3-319-99253-2_1
