2013 | Original Paper | Book Chapter
Introduction
Authors: Sundaram Suresh, Narasimhan Sundararajan, Ramasamy Savitha
Published in: Supervised Learning with Complex-valued Neural Networks
Publisher: Springer Berlin Heidelberg
Signals occurring in applications like medical imaging and telecommunications are inherently complex-valued, and processing them in their natural form preserves the physical characteristics of these signals. Therefore, there is widespread research interest in developing efficient complex-valued neural networks along with their learning algorithms. However, operating in the complex domain presents new challenges, foremost among them being the choice of an appropriate complex-valued activation function. Basically, an activation function for a neural network is required to be nonlinear, bounded, and differentiable at every point of the considered plane [1]. This implies that in the complex domain, the function has to be nonlinear, bounded, and entire. However, Liouville's theorem states that a function that is entire and bounded in the complex domain must be a constant [2]. Since neither analyticity nor boundedness can be compromised, and a constant function is unacceptable as an activation function because it cannot project the input space to a nonlinear higher-dimensional space, the choices of activation functions for complex-valued neural networks are limited. In this chapter, the different complex-valued neural networks existing in the literature are discussed in detail, along with their limitations.
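As an illustrative sketch (not taken from the book), the trade-off imposed by Liouville's theorem can be seen numerically. The fully complex tanh is entire but unbounded, with singularities at z = i(π/2 + kπ); a common compromise is a "split" activation that applies the real-valued tanh to the real and imaginary parts separately, which is bounded but no longer analytic:

```python
import numpy as np

def split_tanh(z):
    # Split-type activation: real tanh applied to the real and
    # imaginary parts separately. Bounded (magnitude < sqrt(2))
    # but not analytic in the sense of the Cauchy-Riemann equations.
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

# The fully complex tanh is analytic but blows up near its
# singularity at z = i*pi/2, illustrating Liouville's theorem:
# an activation cannot be nonlinear, bounded, and entire at once.
z_near_pole = 1e-6 + 1j * (np.pi / 2)
print(abs(np.tanh(z_near_pole)))    # very large magnitude
print(abs(split_tanh(z_near_pole))) # bounded, below sqrt(2)
```

The split activation here is one of the standard workarounds discussed in the complex-valued neural network literature; the chapter surveys this and other choices together with their limitations.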