
1999 | Original Paper | Book Chapter

Universality of Sigmoidal Networks

Author: Hava T. Siegelmann

Published in: Neural Networks and Analog Computation

Publisher: Birkhäuser Boston


Up to this point we considered only the saturated linear activation function. In this chapter, we investigate the computational power of networks with sigmoid activation functions, such as those widely considered in the neural network literature, e.g.,
$$ \varrho(x) = \frac{1}{1 + e^{-x}} \quad \text{or} \quad \varrho(x) = \frac{2}{1 + e^{-x}} - 1. \tag{7.1} $$
In Chapter 10 we will see that a large class of activation functions, which also includes the sigmoid, yields networks whose computational power is bounded from above by P/poly. In this chapter we obtain a lower bound on the computational power of sigmoidal networks. We prove that there exists a universal architecture of sigmoidal neurons that can be used to compute any recursive function, with exponential slowdown. Our proof techniques can be applied to a much more general class of “sigmoidal-like” activation functions, suggesting that Turing universality is a common property of recurrent neural network models. In conclusion, the computational capabilities of sigmoidal networks are located in between Turing machines and advice Turing machines.
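To make the two activation functions of Eq. (7.1) concrete, here is a minimal sketch in Python (the function names are illustrative, not from the text). The second variant maps onto the range (−1, 1) and is equivalent to tanh(x/2):

```python
import math

def logistic(x: float) -> float:
    # Standard logistic sigmoid: rho(x) = 1 / (1 + e^(-x)), range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def bipolar_sigmoid(x: float) -> float:
    # Second form of Eq. (7.1): rho(x) = 2 / (1 + e^(-x)) - 1, range (-1, 1)
    # Algebraically identical to tanh(x / 2)
    return 2.0 / (1.0 + math.exp(-x)) - 1.0
```

Both functions are smooth and monotone, in contrast to the piecewise-linear saturated activation considered in earlier chapters.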

Metadata
Title
Universality of Sigmoidal Networks
Author
Hava T. Siegelmann
Copyright Year
1999
Publisher
Birkhäuser Boston
DOI
https://doi.org/10.1007/978-1-4612-0707-8_7
