Temporal-Sequential Learning with Columnar-Structured Spiking Neural Networks

  • Conference paper
  • First Online:
Neural Information Processing (ICONIP 2022)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1791)


Abstract

Humans can memorize complex temporal sequences, such as music, indicating that the brain has a mechanism for storing the time intervals between elements. However, most existing sequential memory models can only handle sequences that lack temporal information between elements, such as sentences. In this paper, we propose a columnar-structured model that can memorize sequences with variable time intervals. Each column is composed of several spiking neurons with dendritic structures and synaptic delays. Dendrites allow a neuron to represent the same element in different contexts, while transmission delays between two spiking neurons preserve the time intervals between sequence elements. Moreover, the proposed model can remember sequence information after a single presentation of a new input sample, i.e., it is capable of one-shot learning. Experimental results demonstrate that the proposed model can memorize complex temporal sequences such as musical pieces and recall an entire sequence with high accuracy given an extremely short sub-sequence. Its significance lies not only in its superiority over comparable methods, but also in providing a reference for the development of neuromorphic memory systems.
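The mechanism sketched in the abstract can be illustrated with a toy model: each symbol owns a column; a one-shot presentation of a sequence creates, per transition, a dendrite-like branch whose synaptic delay equals the inter-element time interval, and recall replays the chain from a short cue. This is a minimal, non-spiking sketch under assumed names (`ColumnarSequenceMemory`, `learn`, `recall` are hypothetical), not the authors' actual spiking implementation.

```python
from collections import defaultdict

class ColumnarSequenceMemory:
    """Toy sketch of one-shot temporal-sequence memory.

    Each symbol acts as a 'column'; storing a sequence creates one
    branch per transition, whose delay equals the inter-element time
    interval. The seq_id plays the role of a dendrite selecting which
    stored sequence a shared symbol belongs to.
    """

    def __init__(self):
        # column symbol -> list of (context, next_symbol, delay)
        self.branches = defaultdict(list)

    def learn(self, seq_id, events):
        """events: list of (symbol, time); stored after one presentation."""
        for (sym, t), (nxt, t_next) in zip(events, events[1:]):
            self.branches[sym].append((seq_id, nxt, t_next - t))

    def recall(self, seq_id, cue_symbol, cue_time, max_len=100):
        """Replay the stored sequence from a single cue element,
        reconstructing onset times from the stored delays."""
        out = [(cue_symbol, cue_time)]
        sym, t = cue_symbol, cue_time
        while len(out) < max_len:
            matches = [(n, d) for ctx, n, d in self.branches[sym]
                       if ctx == seq_id]
            if not matches:
                break
            n, d = matches[0]
            sym, t = n, t + d
            out.append((sym, t))
        return out

mem = ColumnarSequenceMemory()
# A short melody as (note, onset-time-in-ms) pairs, learned in one shot.
mem.learn("melody", [("C4", 0), ("E4", 250), ("G4", 500), ("C5", 1000)])
print(mem.recall("melody", "C4", 0))
# -> [('C4', 0), ('E4', 250), ('G4', 500), ('C5', 1000)]
```

Because delays, not absolute times, are stored, a cue given at any onset reproduces the same rhythm shifted in time, which is the property the paper's delay-based encoding is designed to provide.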




Acknowledgement

This work is supported in part by the National Key Research and Development Program of China under Grant 2018AAA0100202, and in part by the National Science Foundation of China under Grant 61976043.

Author information

Corresponding author

Correspondence to Hong Qu.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Luo, X., Liu, H., Chen, Y., Zhang, M., Qu, H. (2023). Temporal-Sequential Learning with Columnar-Structured Spiking Neural Networks. In: Tanveer, M., Agarwal, S., Ozawa, S., Ekbal, A., Jatowt, A. (eds) Neural Information Processing. ICONIP 2022. Communications in Computer and Information Science, vol 1791. Springer, Singapore. https://doi.org/10.1007/978-981-99-1639-9_13

  • DOI: https://doi.org/10.1007/978-981-99-1639-9_13

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-1638-2

  • Online ISBN: 978-981-99-1639-9

  • eBook Packages: Computer Science, Computer Science (R0)
