
2018 | Original Paper | Book Chapter

10. An Overview of Different Neural Network Architectures

Author: Sandro Skansi

Published in: Introduction to Deep Learning

Publisher: Springer International Publishing


Abstract

Energy-based models are a specific class of neural networks. The simplest energy-based model is the Hopfield network, which dates from the 1980s (Hopfield, Proc. Nat. Acad. Sci. USA 79(8):2554–2558, 1982, [1]). Hopfield networks are often thought to be very simple, but they are quite different from the networks we have seen so far.
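To make the idea concrete, here is a minimal, illustrative sketch (not from the chapter) of a Hopfield network: bipolar patterns are stored with the Hebbian rule, and recall iterates the sign update until the state stops changing. Function names and the example pattern are my own choices.

```python
import numpy as np

def train(patterns):
    """Store bipolar (+1/-1) row patterns via the Hebbian rule
    W = sum_p x_p x_p^T, with the diagonal zeroed (no self-connections)."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W

def recall(W, state, steps=10):
    """Synchronously update s <- sign(W s) until a fixed point is reached."""
    s = state.copy()
    for _ in range(steps):
        new = np.sign(W @ s)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, s):
            break
        s = new
    return s

# Store one pattern, then recover it from a corrupted copy.
pattern = np.array([1, -1, 1, -1, 1, -1])
W = train(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -1  # flip one bit
restored = recall(W, noisy)
print(restored)
```

With a single stored pattern the corrupted input falls back into the stored attractor after one update, which is the associative-memory behaviour the chapter alludes to.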


Footnotes
1
For a fully detailed view, see the blog entry of one of the creators of the NTM, https://medium.com/aidangomez/the-neural-turing-machine-79f6e806c0a1.
 
2
By default, memory networks make one hop, but it has been shown that multiple hops are beneficial, especially in natural language processing.
 
3
Winograd sentences are sentences of a particular form, where the computer must resolve the coreference of a pronoun. They were proposed as an alternative to the Turing test, since the Turing test has some deep flaws (deceptive behaviour is encouraged), and it is hard to quantify its results and evaluate it on a large scale. Winograd sentences take the form 'I tried to put the book in the drawer but it was too [big/small]', and they are named after Terry Winograd, who first considered them in the 1970s [13].
 
References
1.
J.J. Hopfield, Neural networks and physical systems with emergent collective computational abilities. Proc. Nat. Acad. Sci. U.S.A. 79(8), 2554–2558 (1982)
2.
D.H. Ackley, G.E. Hinton, T. Sejnowski, A learning algorithm for Boltzmann machines. Cogn. Sci. 9(1), 147–169 (1985)
3.
P. Smolensky, Information processing in dynamical systems: foundations of harmony theory, in Parallel Distributed Processing: Explorations in the Microstructure of Cognition, ed. by D.E. Rumelhart, J.L. McClelland, the PDP Research Group (MIT Press, Cambridge)
4.
G.E. Hinton, S. Osindero, Y.-W. Teh, A fast learning algorithm for deep belief nets. Neural Comput. 18(7), 1527–1554 (2006)
5.
Y. Bengio, P. Lamblin, D. Popovici, H. Larochelle, Greedy layer-wise training of deep networks, in Proceedings of the 19th International Conference on Neural Information Processing Systems (MIT Press, Cambridge, 2006), pp. 153–160
7.
I. Goodfellow, Y. Bengio, A. Courville, Deep Learning (MIT Press, Cambridge, 2016)
8.
W. Bechtel, A. Abrahamsen, Connectionism and the Mind: Parallel Processing, Dynamics and Evolution in Networks (Blackwell, Oxford, 2002)
12.
J. Weston, A. Bordes, S. Chopra, A.M. Rush, B. van Merriënboer, A. Joulin, T. Mikolov, Towards AI-complete question answering: a set of prerequisite toy tasks, in ICLR (2016), arXiv:1502.05698
13.
T. Winograd, Understanding Natural Language (Academic Press, New York, 1972)
Metadata
Title
An Overview of Different Neural Network Architectures
Author
Sandro Skansi
Copyright year
2018
DOI
https://doi.org/10.1007/978-3-319-73004-2_10
