Published in: Neural Computing and Applications 15/2024

25.02.2024 | Original Article

MSAT: biologically inspired multistage adaptive threshold for conversion of spiking neural networks

Authors: Xiang He, Yang Li, Dongcheng Zhao, Qingqun Kong, Yi Zeng

Abstract

Spiking neural networks (SNNs) can perform inference with low power consumption thanks to their spike sparsity. Although SNNs can be combined with neuromorphic hardware for efficient inference, they are often difficult to train directly because of their discrete, non-differentiable spikes. As an alternative, ANN-SNN conversion is an efficient way to obtain deep SNNs by converting well-trained artificial neural networks (ANNs). However, existing methods commonly use a constant threshold for conversion. A high constant threshold prevents neurons from rapidly delivering spikes to deeper layers and causes high time delay. In addition, producing the same response for different inputs may lead to information loss during transmission. Inspired by the biological adaptive threshold mechanism, we propose a multistage adaptive threshold (MSAT) method to alleviate these problems. Instead of using a single, constant value, the threshold is adjusted in multiple stages, adapting to each neuron's firing history and input properties. Specifically, for each neuron, the dynamic threshold is positively correlated with the average membrane potential and negatively correlated with the rate of depolarization. Adapting to the membrane potential and input allows a timely adjustment of the threshold, so that neurons fire spikes faster and transmit more information. Moreover, we analyze the spikes of inactivated neurons error, which is pervasive in early time steps. We accordingly propose a spike confidence measure that quantifies confidence about the neurons that correctly deliver spikes. This spike confidence in early time steps is used to decide whether to elicit a spike, alleviating the spikes of inactivated neurons error. Combining the proposed methods, we evaluate performance on the CIFAR-10, CIFAR-100, and ImageNet datasets. We also conduct sentiment classification and speech recognition experiments on the IMDB and Google Speech Commands datasets, respectively.
Experiments show that our methods achieve near-lossless, lower-latency ANN-SNN conversion. In summary, we build a biologically inspired multistage adaptive threshold for converted SNNs, with performance comparable to state-of-the-art methods while improving energy efficiency.
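The threshold rule described in the abstract can be illustrated with a toy integrate-and-fire neuron whose dynamic threshold rises with the running average membrane potential and falls with the rate of depolarization. This is a minimal sketch only: the coefficients `alpha` and `beta`, the soft-reset rule, and all names are illustrative assumptions, not the paper's exact MSAT formulation.

```python
def run_adaptive_threshold_neuron(inputs, v_base=1.0, alpha=0.1, beta=0.1):
    """Toy integrate-and-fire neuron with a dynamic threshold.

    The threshold is positively correlated with the average membrane
    potential and negatively correlated with the depolarization rate,
    loosely following the correlations stated in the abstract.
    Coefficients and the reset rule are placeholders.
    """
    v = 0.0        # membrane potential
    v_sum = 0.0    # running sum, used for the average membrane potential
    spikes = []
    for t, x in enumerate(inputs, start=1):
        v_prev = v
        v += x                      # integrate the input current
        v_sum += v
        v_avg = v_sum / t           # average membrane potential so far
        dv_dt = v - v_prev          # depolarization rate (per time step)
        # dynamic threshold: up with v_avg, down with dv/dt
        theta = v_base + alpha * v_avg - beta * dv_dt
        if v >= theta:
            spikes.append(1)
            v -= theta              # soft reset: subtract the threshold
        else:
            spikes.append(0)
    return spikes

print(run_adaptive_threshold_neuron([0.6, 0.6, 0.6, 0.6]))
```

With a constant input current, the neuron fires as soon as the accumulated potential exceeds the (slowly drifting) threshold; lowering the threshold for fast depolarization lets strongly driven neurons spike earlier, which is the latency benefit the abstract claims.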


Metadata
Title
MSAT: biologically inspired multistage adaptive threshold for conversion of spiking neural networks
Authors
Xiang He
Yang Li
Dongcheng Zhao
Qingqun Kong
Yi Zeng
Publication date
25.02.2024
Publisher
Springer London
Published in
Neural Computing and Applications / Issue 15/2024
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-024-09529-w
