
2013 | Original Paper | Book Chapter

The Importance of Topology Evolution in NeuroEvolution: A Case Study Using Cartesian Genetic Programming of Artificial Neural Networks

Authors: Andrew James Turner, Julian Francis Miller

Published in: Research and Development in Intelligent Systems XXX

Publisher: Springer International Publishing

Abstract

NeuroEvolution (NE) is the application of evolutionary algorithms to Artificial Neural Networks (ANNs). This paper reports on an investigation into the relative importance of weight evolution and topology evolution when training ANNs using NE. The investigation used the NE technique Cartesian Genetic Programming of Artificial Neural Networks (CGPANN). The results presented show that the choice of topology has a dramatic impact on the effectiveness of NE when only weights are evolved; an issue not faced when both weights and topology are manipulated. This paper also presents the surprising result that topology evolution alone is far more effective at training ANNs than weight evolution alone. This is a significant result, as many methods which train ANNs manipulate only weights.
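
To make the comparison concrete, the following is a minimal, runnable sketch of the three training regimes examined (weight evolution only, topology evolution only, and both), using a CGP-style feed-forward genome and the (1 + lambda) evolutionary strategy commonly used with CGP. It is not the authors' CGPANN implementation: the genome layout, the toy XOR task, the mutation operators and all parameter values (N_NODES, ARITY, the mutation rate, etc.) are illustrative assumptions.

# Minimal sketch of CGP-encoded ANN evolution under three mutation regimes.
# NOT the authors' implementation; all names and parameters are assumptions.
import math
import random

N_INPUTS, N_NODES, ARITY = 2, 10, 2   # toy sizes, chosen arbitrarily

def random_genome():
    """Each node: ARITY connection genes (indices of earlier nodes or inputs)
    and ARITY weight genes. The last node is taken as the network output."""
    genome = []
    for i in range(N_NODES):
        conns = [random.randrange(N_INPUTS + i) for _ in range(ARITY)]
        weights = [random.uniform(-1.0, 1.0) for _ in range(ARITY)]
        genome.append((conns, weights))
    return genome

def evaluate(genome, inputs):
    values = list(inputs)
    for conns, weights in genome:
        s = sum(w * values[c] for c, w in zip(conns, weights))
        values.append(math.tanh(s))       # neuron transfer function
    return values[-1]

def fitness(genome):
    # Toy XOR task, used only to make the sketch runnable.
    cases = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    return -sum((evaluate(genome, x) - y) ** 2 for x, y in cases)

def mutate(genome, rate, weights=True, topology=True):
    child = []
    for i, (conns, ws) in enumerate(genome):
        conns, ws = list(conns), list(ws)
        for j in range(ARITY):
            if topology and random.random() < rate:
                conns[j] = random.randrange(N_INPUTS + i)   # re-wire connection
            if weights and random.random() < rate:
                ws[j] = random.uniform(-1.0, 1.0)           # resample weight
        child.append((conns, ws))
    return child

def evolve(weights=True, topology=True, gens=2000, lam=4, rate=0.1):
    """(1 + lambda) evolutionary strategy; listing children first means an
    offspring of equal fitness replaces the parent (neutral drift)."""
    parent = random_genome()
    for _ in range(gens):
        children = [mutate(parent, rate, weights, topology) for _ in range(lam)]
        parent = max(children + [parent], key=fitness)
    return fitness(parent)

if __name__ == "__main__":
    random.seed(0)
    print("weights only :", evolve(topology=False))
    print("topology only:", evolve(weights=False))
    print("both         :", evolve())

Setting topology=False leaves the randomly initialised wiring untouched throughout the run, which is exactly the situation in which the paper reports that the choice of topology dominates the effectiveness of weight-only evolution.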

Footnotes
1
Both CGP and ANNs can also be structured in a recurrent form.
 
2
Fully connected between layers, i.e. a node in hidden layer two receives an input from every node in hidden layer one.
 
3
If the arity is set high enough, however, all topologies are possible, as each node can lower its own effective arity by utilizing only the first of multiple connections between the same two nodes (see the sketch below).
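
A minimal sketch of this mechanism, assuming the stated rule that only the first of multiple connections between the same two nodes contributes; the function, node indices and values are purely illustrative and not taken from the authors' code.

# Illustrative sketch of footnote 3: a node with a high fixed arity can behave
# as a lower-arity node by duplicating connection genes, under the assumed rule
# that later duplicates of a connection are ignored.
import math

def node_output(sources, weights, values):
    used, total = set(), 0.0
    for src, w in zip(sources, weights):
        if src in used:
            continue            # later duplicates of a connection are ignored
        used.add(src)
        total += w * values[src]
    return math.tanh(total)

values = [0.7, -0.2]                                      # earlier node/input outputs
arity3 = node_output([0, 1, 1], [0.5, 0.3, 0.9], values)  # arity 3, one duplicate
arity2 = node_output([0, 1],    [0.5, 0.3],      values)  # effective arity 2
assert abs(arity3 - arity2) < 1e-12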
 
Metadata
Title
The Importance of Topology Evolution in NeuroEvolution: A Case Study Using Cartesian Genetic Programming of Artificial Neural Networks
Authors
Andrew James Turner
Julian Francis Miller
Copyright year
2013
DOI
https://doi.org/10.1007/978-3-319-02621-3_15
