Published in: Neural Processing Letters, Issue 2/2022

Published: 10.11.2021

BS4NN: Binarized Spiking Neural Networks with Temporal Coding and Learning

Authors: Saeed Reza Kheradpisheh, Maryam Mirsadeghi, Timothée Masquelier

Abstract

We recently proposed the S4NN algorithm, essentially an adaptation of backpropagation to multilayer spiking neural networks that use simple non-leaky integrate-and-fire neurons and a form of temporal coding known as time-to-first-spike coding. With this coding scheme, neurons fire at most once per stimulus, but the firing order carries information. Here, we introduce BS4NN, a modification of S4NN in which the synaptic weights are constrained to be binary (+1 or −1), in order to decrease the memory footprint (ideally, one bit per synapse) and the computation footprint. This was done using two sets of weights: real-valued weights, updated by gradient descent and used in the backward pass of backpropagation, and their signs, used in the forward pass. Similar strategies have been used to train (non-spiking) binarized neural networks. The main difference is that BS4NN operates in the time domain: spikes are propagated sequentially, and different neurons may reach their threshold at different times, which increases computational power. We validated BS4NN on two popular benchmarks, MNIST and Fashion-MNIST, and obtained reasonable accuracies for this sort of network (97.0% and 87.3%, respectively), with a negligible accuracy drop with respect to real-valued weights (0.4% and 0.7%, respectively). We also demonstrated that BS4NN outperforms a simple BNN with the same architectures on those two datasets (by 0.2% and 0.9%, respectively), presumably because it leverages the temporal dimension.
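The abstract describes the core trick: a real-valued "shadow" copy of the weights is updated by gradient descent and used in the backward pass, while only its signs (+1/−1) enter the spiking forward pass. The following Python sketch illustrates how such a single layer with non-leaky integrate-and-fire neurons and time-to-first-spike coding might look; all names, threshold values, and time constants are hypothetical illustrations, not the authors' code, and the gradient computation itself (with respect to firing times, as in S4NN) is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out, T = 784, 10, 256       # T: number of discrete time steps (assumed)
threshold, lr = 50.0, 0.01          # illustrative values, not from the paper

# Two weight sets, as the abstract describes: real-valued weights updated by
# gradient descent (backward pass), and their signs used in the forward pass.
w_real = rng.normal(0.0, 1.0, size=(n_out, n_in))

def binarize(w):
    """Constrain weights to +1 / -1 for the forward pass."""
    return np.where(w >= 0.0, 1.0, -1.0)

def forward(spike_times):
    """Time-to-first-spike forward pass with non-leaky IF neurons.
    spike_times[i] is the firing time of input neuron i (at most one spike)."""
    w_bin = binarize(w_real)
    v = np.zeros(n_out)                       # membrane potentials
    out_times = np.full(n_out, T)             # T encodes "did not fire"
    for t in range(T):
        active = spike_times == t             # inputs that spike at this step
        if active.any():
            v += w_bin[:, active].sum(axis=1) # integrate binary-weighted spikes
        newly = (v >= threshold) & (out_times == T)
        out_times[newly] = t                  # each neuron fires at most once
    return out_times

def apply_gradient(grad_w):
    """Backward pass: the gradient (computed w.r.t. firing times, as in S4NN)
    updates only the real-valued weights; the binary weights are re-derived
    from their signs at the next forward pass."""
    global w_real
    w_real -= lr * grad_w
```

At inference time only the binary weights are needed, which is where the memory saving (ideally one bit per synapse) and the cheaper accumulate-only arithmetic come from.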

Metadata
Title
BS4NN: Binarized Spiking Neural Networks with Temporal Coding and Learning
Authors
Saeed Reza Kheradpisheh
Maryam Mirsadeghi
Timothée Masquelier
Publication date
10.11.2021
Publisher
Springer US
Published in
Neural Processing Letters / Issue 2/2022
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-021-10680-x
