
2019 | Original Paper | Book Chapter

Constraint Exploration of Convolutional Network Architectures with Neuroevolution

Authors: Jonas Dominik Homburg, Michael Adams, Michael Thies, Timo Korthals, Marc Hesse, Ulrich Rückert

Published in: Advances in Computational Intelligence

Publisher: Springer International Publishing


Abstract

The effort required to adapt existing networks to new applications has motivated automated architecture search. Network structures discovered by evolutionary and other search algorithms have surpassed hand-crafted image classifiers in accuracy. However, these approaches do not constrain characteristics such as network size, which leads to unnecessary computational effort. This work therefore shows that generational evolutionary algorithms can be used for a constrained exploration of convolutional network architectures, yielding a selection of networks tailored to a specific application or target architecture.
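The core idea of the abstract, a generational evolutionary algorithm that rejects candidate architectures violating a resource constraint such as network size, can be illustrated with a minimal sketch. The genome encoding (filters per convolutional layer), the toy fitness function, and all names (`MAX_PARAMS`, `evolve`, etc.) are illustrative assumptions, not the authors' actual method.

```python
import random

# Sketch of a constrained generational evolutionary search over
# CNN architectures. Genome = list of filter counts per conv layer.
MAX_PARAMS = 50_000  # assumed parameter budget (the "constraint")
random.seed(0)

def random_genome():
    """Sample a random stack of 1-4 convolutional layers."""
    depth = random.randint(1, 4)
    return [random.choice([8, 16, 32, 64]) for _ in range(depth)]

def param_count(genome, kernel=3, in_ch=1):
    """Rough parameter count for a stack of 3x3 convolutions."""
    total, prev = 0, in_ch
    for filters in genome:
        total += kernel * kernel * prev * filters + filters  # weights + bias
        prev = filters
    return total

def satisfies_constraint(genome):
    return param_count(genome) <= MAX_PARAMS

def fitness(genome):
    """Toy stand-in for validation accuracy: favors capacity."""
    return sum(genome) + 10 * len(genome)

def mutate(genome):
    child = list(genome)
    child[random.randrange(len(child))] = random.choice([8, 16, 32, 64])
    if random.random() < 0.3 and len(child) < 6:
        child.append(random.choice([8, 16, 32]))  # occasionally grow deeper
    return child

def evolve(pop_size=20, generations=10):
    # Constrained initialisation: only admit genomes within the budget.
    population = []
    while len(population) < pop_size:
        g = random_genome()
        if satisfies_constraint(g):
            population.append(g)
    # Generational replacement: children fully replace the parents.
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        children = []
        while len(children) < pop_size:
            child = mutate(random.choice(parents))
            if satisfies_constraint(child):  # reject oversized networks
                children.append(child)
        population = children
    return max(population, key=fitness)

best = evolve()
print("best genome:", best, "params:", param_count(best))
```

Because infeasible children are rejected before they enter the next generation, every surviving architecture respects the size budget by construction; in the actual paper the fitness would come from training each candidate network.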


Metadata
Title
Constraint Exploration of Convolutional Network Architectures with Neuroevolution
Authors
Jonas Dominik Homburg
Michael Adams
Michael Thies
Timo Korthals
Marc Hesse
Ulrich Rückert
Copyright Year
2019
DOI
https://doi.org/10.1007/978-3-030-20518-8_61
