Published in: Optical Memory and Neural Networks | Special Issue 2/2023

01.12.2023

Implementation Challenges and Strategies for Hebbian Learning in Convolutional Neural Networks

Authors: A. V. Demidovskij, M. S. Kazyulina, I. G. Salnikov, A. M. Tugaryov, A. I. Trutnev, S. V. Pavlov


Abstract

Given the unprecedented growth of deep learning applications, training acceleration is becoming a subject of strong academic interest. Hebbian learning, a training strategy alternative to backpropagation, is a promising optimization approach due to its locality, lower computational complexity and parallelization potential. Nevertheless, because Hebbian learning is difficult to optimize, there is no widely accepted approach to implementing such mixed training strategies. The current paper overviews the 4 main strategies for updating weights with the Hebbian rule, including its widely used modifications, Oja's and Instar rules. Additionally, the paper analyses 21 industrial implementations of Hebbian learning, discusses the merits and shortcomings of Hebbian rules, and presents the results of computational experiments on 4 convolutional networks. Experiments show that the most efficient implementation strategy of Hebbian learning achieves a \(1.66\times\) speed-up and a \(3.76\times\) reduction in memory consumption when updating DenseNet121 weights compared to backpropagation. Finally, a comparative analysis of the implementation strategies is carried out and grounded recommendations for applying Hebbian learning are formulated.
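To make the rules named in the abstract concrete, below is a minimal sketch of the three classical local update rules it mentions (the plain Hebbian rule and its Oja and Instar modifications) applied to a single linear layer. The function names, tensor shapes, and PyTorch setting are illustrative assumptions for exposition, not the authors' implementation or the strategies benchmarked in the paper.

```python
import torch

def hebbian_update(w, x, y, lr=0.01):
    """Plain Hebbian rule: dw_ij = lr * y_i * x_j (weights grow without bound)."""
    return w + lr * torch.outer(y, x)

def oja_update(w, x, y, lr=0.01):
    """Oja's rule: dw_ij = lr * y_i * (x_j - y_i * w_ij); keeps weight norms bounded."""
    return w + lr * (torch.outer(y, x) - (y ** 2).unsqueeze(1) * w)

def instar_update(w, x, y, lr=0.01):
    """Instar rule: dw_ij = lr * y_i * (x_j - w_ij); pulls weights toward the input."""
    return w + lr * y.unsqueeze(1) * (x.unsqueeze(0) - w)

# Usage: one purely local update for a layer with 4 inputs and 3 outputs.
x = torch.randn(4)        # presynaptic activations
w = torch.randn(3, 4)     # weights, shape (out_features, in_features)
y = w @ x                 # postsynaptic activations
w = oja_update(w, x, y)   # no loss function or backward pass required
```

The locality the abstract refers to is visible here: each update depends only on the layer's own input and output, which is what makes such rules cheap to compute and easy to parallelize relative to backpropagation.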


Metadata
Title
Implementation Challenges and Strategies for Hebbian Learning in Convolutional Neural Networks
Authors
A. V. Demidovskij
M. S. Kazyulina
I. G. Salnikov
A. M. Tugaryov
A. I. Trutnev
S. V. Pavlov
Publication date
01.12.2023
Publisher
Pleiades Publishing
Published in
Optical Memory and Neural Networks / Special Issue 2/2023
Print ISSN: 1060-992X
Electronic ISSN: 1934-7898
DOI
https://doi.org/10.3103/S1060992X23060048
