
2018 | OriginalPaper | Chapter

Gating Sensory Noise in a Spiking Subtractive LSTM

Authors : Isabella Pozzi, Roeland Nusselder, Davide Zambrano, Sander Bohté

Published in: Artificial Neural Networks and Machine Learning – ICANN 2018

Publisher: Springer International Publishing


Abstract

Spiking neural networks are being investigated both as biologically plausible models of neural computation and as a potentially more efficient type of neural network. Recurrent neural networks in the form of networks of gating memory cells have been central to state-of-the-art solutions in problem domains that involve sequence recognition or generation. Here, we design an analog Long Short-Term Memory (LSTM) cell whose neurons can be substituted with efficient spiking neurons, using subtractive gating (following the subLSTM in [1]) instead of multiplicative gating. Subtractive gating yields a less sensitive gating mechanism, which is critical when using spiking neurons. Using fast-adapting spiking neurons with a smoothed Rectified Linear Unit (ReLU)-like effective activation function, we show that an accurate conversion from an analog subLSTM to a continuous-time spiking subLSTM is possible. This architecture results in memory networks that compute very efficiently, with low average firing rates comparable to those of biological neurons, while operating in continuous time.
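To make the contrast with a standard LSTM concrete, the following is a minimal NumPy sketch of one step of a subtractive-gated cell in the style of the subLSTM of Costa et al. [1], where the input and output gates are subtracted from (rather than multiplied with) the squashed activations. The function name `sublstm_step`, the weight layout, and the exact gate equations are illustrative assumptions, not the authors' implementation; the paper's spiking version further replaces the analog units with adapting spiking neurons.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sublstm_step(x, h_prev, c_prev, W, b):
    """One step of a subtractive-gated LSTM (subLSTM) cell (sketch).

    Gating is subtractive rather than multiplicative:
        c_t = f * c_prev + sigmoid(z) - i
        h_t = sigmoid(c_t) - o
    where z is the cell input and i, f, o are sigmoid gates.

    W has shape (4*n_hidden, n_input + n_hidden): the stacked
    pre-activations for z, i, f, o; b has shape (4*n_hidden,).
    """
    v = W @ np.concatenate([x, h_prev]) + b   # stacked pre-activations
    z, i, f, o = np.split(v, 4)               # split into the four parts
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c_prev + sigmoid(z) - i           # input gate subtracts
    h = sigmoid(c) - o                        # output gate subtracts
    return h, c
```

Because `sigmoid(c)` and `o` both lie in (0, 1), the output `h` is bounded in (-1, 1); small errors in the gate values shift the output additively rather than rescaling it, which is the reduced gating sensitivity the abstract refers to.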


Literature
1. Costa, R., Assael, I.A., Shillingford, B., de Freitas, N., Vogels, T.: Cortical microcircuits as gated-recurrent neural networks. In: Advances in Neural Information Processing Systems, pp. 272–283 (2017)
2. Attwell, D., Laughlin, S.: An energy budget for signaling in the grey matter of the brain. J. Cereb. Blood Flow Metab. 21(10), 1133–1145 (2001)
3. Esser, S., et al.: Convolutional networks for fast, energy-efficient neuromorphic computing. In: PNAS, p. 201604850, September 2016
4. Neil, D., Pfeiffer, M., Liu, S.C.: Learning to be efficient: algorithms for training low-latency, low-compute deep spiking neural networks (2016)
5. Diehl, P., Neil, D., Binas, J., Cook, M., Liu, S.C., Pfeiffer, M.: Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing. In: IEEE IJCNN, pp. 1–8, July 2015
6. O’Connor, P., Neil, D., Liu, S.C., Delbruck, T., Pfeiffer, M.: Real-time classification and sensor fusion with a spiking deep belief network. Front. Neurosci. 7, 178 (2013)
8. Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9(8), 1735–1780 (1997)
9. Shrestha, A., et al.: A spike-based long short-term memory on a neurosynaptic processor (2017)
10. Davies, M., Srinivasa, N., Lin, T.H., Chinya, G., Cao, Y., Choday, S.H., Dimou, G., Joshi, P., Imam, N., Jain, S.: Loihi: a neuromorphic manycore processor with on-chip learning. IEEE Micro 38(1), 82–99 (2018)
11. Zambrano, D., Bohte, S.: Fast and efficient asynchronous neural computation with adapting spiking neural networks. arXiv preprint arXiv:1609.02053 (2016)
12. Bohte, S.: Efficient spike-coding with multiplicative adaptation in a spike response model. In: NIPS, vol. 25, pp. 1844–1852 (2012)
13. Gers, F.A., Schraudolph, N.N., Schmidhuber, J.: Learning precise timing with LSTM recurrent networks. J. Mach. Learn. Res. 3(Aug), 115–143 (2002)
14. Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neurosci. 19(3), 375–382 (2016)
15. Bakker, B.: Reinforcement learning with long short-term memory. In: NIPS, vol. 14, pp. 1475–1482 (2002)
16. Harmon, M., Baird III, L.: Multi-player residual advantage learning with general function approximation. Wright Laboratory, 45433–7308 (1996)
17. Rombouts, J., Bohte, S., Roelfsema, P.: Neurally plausible reinforcement learning of working memory tasks. In: NIPS, vol. 25, pp. 1871–1879 (2012)
18. Cho, K., et al.: Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078 (2014)
19. Greff, K., Srivastava, R.K., Koutník, J., Steunebrink, B.R., Schmidhuber, J.: LSTM: a search space odyssey. IEEE Trans. Neural Netw. Learn. Syst. 28(10), 2222–2232 (2017)
20. Jozefowicz, R., Zaremba, W., Sutskever, I.: An empirical exploration of recurrent network architectures. In: International Conference on Machine Learning, pp. 2342–2350 (2015)
DOI
https://doi.org/10.1007/978-3-030-01418-6_28
