
2015 | Original Paper | Book Chapter

Gravitation Search Training Algorithm for Asynchronous Distributed Multilayer Perceptron Model

Authors: Natalya P. Plotnikova, Sergey A. Fedosin, Valery V. Teslya

Published in: New Trends in Networking, Computing, E-learning, Systems Sciences, and Engineering

Publisher: Springer International Publishing


Abstract

The most popular approaches to distributing neural network computation are parallel processing of the training set, parallel matrix operations, and parallel processing of collections of neural networks. This paper describes the architecture of an asynchronous distributed system, based on the actor model, for training and simulating a multilayer perceptron. Conventional backpropagation training algorithms for multilayer perceptrons, such as gradient descent, Levenberg-Marquardt, and RPROP, share a common disadvantage: all of them are at least partially synchronous. Global optimization algorithms (genetic training, simulated annealing, etc.) are better suited to distributed processing, but most of them are slow and inefficient. The gravitational search algorithm is a recent global optimization procedure that combines good convergence, efficiency, and dispensability of individual algorithm steps. The paper develops an asynchronous distributed modification of this algorithm and presents experimental results. The proposed architecture shows a performance increase for distributed systems with different environment parameters (a high-performance cluster and a local network with a slow interconnection bus).


Metadata
Copyright year: 2015
DOI: https://doi.org/10.1007/978-3-319-06764-3_52
