Published in: Neural Computing and Applications 11/2019

31.07.2018 | Original Article

Adaptive learning rule for hardware-based deep neural networks using electronic synapse devices

Authors: Suhwan Lim, Jong-Ho Bae, Jai-Ho Eum, Sungtae Lee, Chul-Heung Kim, Dongseok Kwon, Byung-Gook Park, Jong-Ho Lee



Abstract

In this paper, we propose a learning rule based on the back-propagation (BP) algorithm that can be applied to a hardware-based deep neural network built from electronic devices with discrete and limited conductance characteristics. This adaptive learning rule, which enables forward propagation, backward propagation, and weight updates to be performed in hardware, helps in implementing power-efficient and high-speed deep neural networks. In simulations using a three-layer perceptron network, we evaluate the learning performance for various conductance responses of electronic synapse devices and various weight-updating methods. The learning accuracy is comparable to that obtained with a software-based BP algorithm when the electronic synapse device has a linear conductance response with a high dynamic range. Furthermore, the proposed unidirectional weight-updating method is well suited to electronic synapse devices that have nonlinear and finite conductance responses. Because this weight-updating method compensates for the drawback of asymmetric weight updates, it achieves better accuracy than the other methods. The adaptive learning rule, which can be applied to a full hardware implementation, can also compensate for the degradation of learning accuracy caused by device-to-device variation in actual electronic synapse devices.
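The abstract's key ideas can be illustrated with a small sketch. The model below is a common phenomenological description of an electronic synapse whose conductance changes in discrete pulses between a minimum and maximum value, with an exponentially saturating (nonlinear) response; a weight is stored as the difference of two device conductances, and the "unidirectional" scheme only ever potentiates one of the two devices, sidestepping asymmetric potentiation/depression responses. All parameter names and values here (`G_MIN`, `G_MAX`, `N_PULSES`, `NU`) are illustrative assumptions, not the authors' exact formulation.

```python
import math

# Finite dynamic range and discrete pulse count of the synapse device
# (arbitrary units; illustrative values, not from the paper).
G_MIN, G_MAX = 0.1, 1.0
N_PULSES = 64   # number of discrete conductance levels spanned by pulses
NU = 2.0        # nonlinearity factor of the conductance response

def potentiate(g: float) -> float:
    """One potentiation pulse: the step shrinks as g approaches G_MAX,
    modeling a saturating (nonlinear) conductance response."""
    step = (G_MAX - G_MIN) / N_PULSES
    return min(G_MAX, g + step * math.exp(-NU * (g - G_MIN) / (G_MAX - G_MIN)))

def depress(g: float) -> float:
    """One depression pulse: the step shrinks as g approaches G_MIN.
    In general this response is asymmetric to potentiation."""
    step = (G_MAX - G_MIN) / N_PULSES
    return max(G_MIN, g - step * math.exp(-NU * (G_MAX - g) / (G_MAX - G_MIN)))

def update_unidirectional(g_plus: float, g_minus: float, delta_w: float):
    """Unidirectional weight update for a weight w = g_plus - g_minus:
    only potentiation pulses are applied (to one device or the other),
    so the asymmetric depression response never enters the update."""
    if delta_w > 0:
        g_plus = potentiate(g_plus)
    elif delta_w < 0:
        g_minus = potentiate(g_minus)
    return g_plus, g_minus
```

For example, starting both devices at mid-range, a positive weight-update request potentiates only `g_plus`, while a negative one potentiates only `g_minus`; a refresh step (re-mapping the weight after the devices saturate) would be needed in a full training loop but is omitted here.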


Metadata
Title
Adaptive learning rule for hardware-based deep neural networks using electronic synapse devices
Authors
Suhwan Lim
Jong-Ho Bae
Jai-Ho Eum
Sungtae Lee
Chul-Heung Kim
Dongseok Kwon
Byung-Gook Park
Jong-Ho Lee
Publication date
31.07.2018
Publisher
Springer London
Published in
Neural Computing and Applications / Issue 11/2019
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-018-3659-y
