
Analog Current-Mode Computational Circuits for Artificial Neural Networks

  • 2025
  • Book

About this book

This book discusses in detail low-voltage low-power designs for minimizing the hardware resources required by neural network implementations. The novel method presented in this book for an accurate realization of activation functions for artificial neural networks (ANNs) is based on specific superior-order approximation functions. The author describes analog implementations in CMOS technology that increase the speed of operation while reducing the hardware resources required for obtaining these approximation functions. The original architectures presented in this book, used for implementing the CMOS computational structures, allow for operation independent of technological errors and temperature variations. SPICE simulations confirm the theoretically estimated results for the presented CMOS computational structures, developed for ANNs and artificial intelligence applications.

Table of Contents

  1. Frontmatter

  2. Chapter 1. Superior-Order Approximation Functions for Generating Sigmoidal Activation Functions

    Cosmin Radu Popa
    Abstract
    Sigmoidal activation functions are frequently used in neural networks for introducing nonlinearity (Popa Electronics 12:24, 2023). Having a sigmoidal (S-shaped) characteristic, they have the important advantage of squashing input values into a limited range (usually between 0 and 1 or between −1 and 1). There exists a multitude of sigmoidal activation functions, such as the unipolar sigmoidal activation function, the bipolar sigmoidal activation function, Einstein activation functions, the hyperbolic tangent activation function and continuous-log sigmoidal activation functions.
  3. Chapter 2. Superior-Order Approximation Functions for Generating Radial Basis Activation Functions

    Cosmin Radu Popa
    Abstract
    Neural networks based on radial basis functions (RBF) represent a class of neural networks that is especially used for function approximation, interpolation, regression and classification. They are feedforward networks, but their operation is different. RBF neural networks have a multitude of applications: pattern recognition, medical classification (very useful for establishing a diagnosis), fuzzy systems and function interpolation.
  4. Chapter 3. Superior-Order Approximation Functions for Artificial Neural Networks Applications

    Cosmin Radu Popa
    Abstract
    In addition to the functions from the previous two chapters (sigmoidal activation functions, presented in Chap. 1, and radial basis activation functions, described in Chap. 2), there exist other important mathematical functions with a multitude of applications in analog signal processing and artificial neural networks (Popa EURASIP J Adv Sig Process 2012:129, 2012; Popa Electronics 12:24, 2023). These functions are analyzed in detail in this chapter, and the most accurate approximation functions for generating them are proposed. Two important objectives are considered: the best possible approximation accuracy and reasonable hardware resources required for implementation in CMOS technology using fundamental CMOS computational circuits (further described in Chap. 4).
  5. Chapter 4. Low-Voltage Low-Power Current-Mode CMOS Computational Circuits for Implementing Activation Functions

    Cosmin Radu Popa
    Abstract
    In order to implement in CMOS technology all the previous approximation functions, analyzed in detail in Chaps. 1, 2 and 3, the most convenient fundamental CMOS computational circuits must be designed. A multitude of very important objectives must be taken into account for a proper choice of these fundamental computational circuits (Popa EURASIP J Adv Signal Process 2012:129, 2012; Popa Electronics 12:24, 2023).
  6. Chapter 5. Analysis and Design of Analog Function Synthesizers for Generating Sigmoidal Activation Functions

    Cosmin Radu Popa
    Abstract
    This chapter presents a multitude of analog function synthesizers (Popa EURASIP Journal on Advances in Signal Processing 2012:129, 2012; Popa Electronics 12:24, 2023), developed for implementing in CMOS technology the sigmoidal activation functions analyzed in detail in Chap. 1 (the unipolar and bipolar sigmoidal activation functions, Einstein activation functions, the hyperbolic tangent activation function and continuous-log sigmoid activation functions).
  7. Chapter 6. Analysis and Design of Analog Function Synthesizers for Generating Radial Basis Activation Functions

    Cosmin Radu Popa
    Abstract
    This chapter presents a multitude of analog function synthesizers (Popa EURASIP Journal on Advances in Signal Processing 2012:129, 2012; Popa Electronics 12:24, 2023), developed for implementing in CMOS technology the radial basis activation functions analyzed in detail in Chap. 2 (the cubic activation function, the Gaussian activation function, the quadratic and inverse quadratic activation functions, and the generalized multiquadratic and generalized inverse multiquadratic activation functions).
  8. Chapter 7. Analysis and Design of Analog Function Synthesizers for Artificial Neural Networks Applications

    Cosmin Radu Popa
    Abstract
    This chapter presents a multitude of analog function synthesizers (Popa EURASIP Journal on Advances in Signal Processing 2012:129, 2012; Popa Electronics 12:24, 2023), developed for implementing in CMOS technology important functions with a multitude of applications in analog signal processing and artificial neural networks, analyzed in Chap. 3 (the exponential function, the exponential ramp function, the hyperbolic sine function, the hyperbolic cosine function and, also, any mathematical function with continuous behavior useful for the previously mentioned applications).
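For orientation, a few of the activation functions named in these abstracts can be written in their standard textbook forms. These are general mathematical conventions, not formulas reproduced from the book itself; the symbols c and β below are assumed center and width parameters of the Gaussian radial basis function:

```latex
% Standard textbook definitions (assumed conventions, not from the book)
\begin{align*}
  \sigma(x)  &= \frac{1}{1 + e^{-x}}
             && \text{unipolar sigmoid, output range } (0,1) \\
  \tanh(x)   &= \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}
             && \text{hyperbolic tangent, output range } (-1,1) \\
  \varphi(x) &= e^{-\beta\,(x - c)^{2}}
             && \text{Gaussian radial basis function}
\end{align*}
```

The bipolar sigmoid is related to the unipolar one by a simple affine rescaling, \(2\sigma(x) - 1 = \tanh(x/2)\), which is why both shapes appear throughout the sigmoidal-function chapters.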
Title
Analog Current-Mode Computational Circuits for Artificial Neural Networks
Author
Cosmin Radu Popa
Copyright Year
2025
Electronic ISBN
978-3-032-03989-7
Print ISBN
978-3-032-03988-0
DOI
https://doi.org/10.1007/978-3-032-03989-7

PDF files of this book have been created in accordance with the PDF/UA-1 standard to enhance accessibility, including screen reader support, described non-text content (images, graphs), bookmarks for easy navigation, keyboard-friendly links and forms, and searchable, selectable text. We recognize the importance of accessibility, and we welcome queries about accessibility for any of our products. If you have a question or an access need, please get in touch with us at accessibilitysupport@springernature.com.