Published in: Optical Memory and Neural Networks 1/2020

01.01.2020

Multi-Start Method with Cutting for Solving Problems of Unconditional Optimization

Author: V. A. Kostenko

Abstract

In this paper, we present a multi-start method with dynamic cutting of "unpromising" starts of local optimization algorithms for solving continuous unconstrained optimization problems. We also present the results of testing the proposed method on problems of feed-forward neural network training.
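The abstract gives only the idea of the method. As a rough illustration, here is a minimal Python sketch of multi-start local search with periodic cutting of unpromising starts. The local step (plain gradient descent), the cut-worst-half rule, and all names and parameters below are illustrative assumptions, not the paper's actual algorithm.

import numpy as np

def multi_start_with_cutting(f, grad, dim, n_starts=16, n_rounds=10,
                             steps_per_round=50, cut_fraction=0.5,
                             lr=0.005, seed=0):
    """Illustrative multi-start local search with periodic cutting.

    Assumptions (not from the paper): the local method is fixed-step
    gradient descent, and after each round the worst-scoring fraction
    of the surviving starts is discarded.
    """
    rng = np.random.default_rng(seed)
    # One random initial point per start.
    points = rng.uniform(-1.0, 1.0, size=(n_starts, dim))

    for _ in range(n_rounds):
        # Advance every surviving start by a fixed budget of local steps.
        for _ in range(steps_per_round):
            points -= lr * np.array([grad(x) for x in points])
        # Cut: keep only the best-scoring fraction of starts.
        scores = np.array([f(x) for x in points])
        keep = max(1, int(len(points) * (1.0 - cut_fraction)))
        points = points[np.argsort(scores)[:keep]]

    best = min(points, key=f)
    return best, f(best)

# Usage on a standard multimodal test function (Rastrigin).
if __name__ == "__main__":
    def rastrigin(x):
        return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

    def rastrigin_grad(x):
        return 2 * x + 20 * np.pi * np.sin(2 * np.pi * x)

    x_best, f_best = multi_start_with_cutting(rastrigin, rastrigin_grad, dim=5)
    print(f_best)

The point of cutting after each fixed budget of local steps is that the remaining computation is reallocated to the starts whose current objective values look most promising, instead of spending an equal budget on every start as in the classical multi-start scheme.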


Metadata
Title
Multi-Start Method with Cutting for Solving Problems of Unconditional Optimization
Author
V. A. Kostenko
Publication date
01.01.2020
Publisher
Pleiades Publishing
Published in
Optical Memory and Neural Networks / Issue 1/2020
Print ISSN: 1060-992X
Electronic ISSN: 1934-7898
DOI
https://doi.org/10.3103/S1060992X20010099
