
Implementation Challenges and Strategies for Hebbian Learning in Convolutional Neural Networks


Abstract

Given the unprecedented growth of deep learning applications, training acceleration is becoming a subject of strong academic interest. Hebbian learning, a training strategy alternative to backpropagation, presents a promising optimization approach due to its locality, lower computational complexity, and parallelization potential. Nevertheless, because Hebbian learning is difficult to optimize, there is no widely accepted approach to implementing such mixed training strategies. The current paper overviews the four main strategies for updating weights with the Hebbian rule, including its widely used modifications, Oja's and Instar rules. Additionally, the paper analyses 21 industrial implementations of Hebbian learning, discusses the merits and shortcomings of Hebbian rules, and presents the results of computational experiments on four convolutional networks. The experiments show that the most efficient implementation strategy of Hebbian learning achieves a \(1.66 \times \) speed-up and a \(3.76 \times \) reduction in memory consumption when updating DenseNet121 weights, compared to backpropagation. Finally, a comparative analysis of the implementation strategies is carried out and grounded recommendations for applying Hebbian learning are formulated.
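For orientation, the three local rules named above can be stated compactly. The sketch below is a minimal PyTorch illustration, not the authors' implementation: the function names, tensor shapes, and learning rate are assumptions made for the example. Each rule updates a weight matrix of shape (outputs × inputs) from the pre-synaptic input x and the post-synaptic response y alone, with no backpropagated gradient, which is the locality property the abstract refers to.

```python
import torch

def hebbian_update(w, x, y, lr=0.01):
    # Plain Hebbian rule: dw_ij = lr * y_i * x_j (outer product of
    # post-synaptic and pre-synaptic activity); weights grow without bound.
    return w + lr * torch.outer(y, x)

def oja_update(w, x, y, lr=0.01):
    # Oja's rule: dw_ij = lr * y_i * (x_j - y_i * w_ij).
    # The decay term keeps the weight norm bounded and drives each
    # neuron toward a principal component of its input.
    return w + lr * (torch.outer(y, x) - (y ** 2).unsqueeze(1) * w)

def instar_update(w, x, y, lr=0.01):
    # Instar (Grossberg) rule: dw_ij = lr * y_i * (x_j - w_ij).
    # Each active neuron's weight vector moves toward the current input.
    return w + lr * y.unsqueeze(1) * (x.unsqueeze(0) - w)

# Illustrative usage on a single fully connected layer.
x = torch.rand(64)             # pre-synaptic input
w = torch.rand(10, 64) * 0.1   # weight matrix, 10 outputs x 64 inputs
y = torch.relu(w @ x)          # post-synaptic response
w = oja_update(w, x, y)        # local update: no loss, no backward pass
```

Note that none of these updates requires a loss value or a backward pass, which is what makes such rules attractive for the parallel and memory-efficient implementation strategies the paper compares.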



Author information

Correspondence to A. V. Demidovskij.

Ethics declarations

The authors of this work declare that they have no conflicts of interest.

About this article


Cite this article

Demidovskij, A.V., Kazyulina, M.S., Salnikov, I.G. et al. Implementation Challenges and Strategies for Hebbian Learning in Convolutional Neural Networks. Opt. Mem. Neural Networks 32 (Suppl 2), S252–S264 (2023). https://doi.org/10.3103/S1060992X23060048

