Signals occurring in applications such as medical imaging and telecommunications are inherently complex-valued, and processing them in their natural form preserves their physical characteristics. There is therefore widespread research interest in developing efficient complex-valued neural networks along with their learning algorithms. However, operating in the complex domain presents new challenges, foremost among them the choice of an appropriate complex-valued activation function. An activation function for a neural network is required to be nonlinear, bounded, and differentiable at every point of the considered plane. In the complex domain, this implies that the function must be nonlinear, bounded, and entire. However, Liouville's theorem states that a function that is both entire and bounded on the complex plane is constant. Since neither analyticity nor boundedness can be compromised, and a constant function is unacceptable as an activation function because it cannot project the input space to a nonlinear higher-dimensional space, the choices of activation functions for complex-valued neural networks are limited. In this chapter, the different complex-valued neural networks existing in the literature are discussed in detail, along with their limitations.