Abstract
Humans can memorize complex temporal sequences, such as music, indicating that the brain has a mechanism for storing the time intervals between elements. However, most existing sequential memory models can only handle sequences that lack temporal information between elements, such as sentences. In this paper, we propose a columnar-structured model that can memorize sequences with variable time intervals. Each column is composed of several spiking neurons that have dendritic structures and synaptic delays. Dendrites allow a neuron to represent the same element in different contexts, while transmission delays between two spiking neurons preserve the time intervals between sequence elements. Moreover, the proposed model can remember sequence information even after a single presentation of a new input sample, i.e., it is capable of one-shot learning. Experimental results demonstrate that the proposed model can memorize complex temporal sequences such as musical pieces and can recall the entire sequence with high accuracy given an extremely short sub-sequence. Its significance lies not only in its superiority over comparable methods, but also in providing a reference for the development of neuromorphic memory systems.
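To make the mechanism described above concrete, the following is a minimal Python sketch of the ideas the abstract names: one column per sequence element, several neurons per column so the same element can be represented in different contexts (standing in for dendritic branches), and learned connections whose synaptic delays store the time intervals between consecutive elements, created in a single pass (one-shot learning). All names here (Column, SequenceMemory, Connection, learn, recall) are hypothetical illustrations, not the authors' implementation; the paper's actual model uses spiking-neuron dynamics that this sketch abstracts away.

# A minimal sketch of a columnar sequence memory with synaptic delays.
# Assumptions: one column per element (e.g. a musical pitch); a fresh
# neuron in a column is committed for each new context; a connection's
# delay equals the time interval between two consecutive elements.

from dataclasses import dataclass, field


@dataclass
class Connection:
    target_column: str
    target_neuron: int
    delay: float  # synaptic delay = inter-element time interval


@dataclass
class Column:
    name: str
    num_neurons: int = 8
    used: int = 0  # neurons already committed to a context
    # outgoing[i] lists connections from neuron i of this column
    outgoing: dict = field(default_factory=dict)

    def allocate_neuron(self) -> int:
        """Commit a fresh neuron to a new context."""
        if self.used >= self.num_neurons:
            raise RuntimeError(f"column {self.name} is full")
        self.used += 1
        return self.used - 1


class SequenceMemory:
    def __init__(self):
        self.columns = {}

    def _column(self, name: str) -> Column:
        return self.columns.setdefault(name, Column(name))

    def learn(self, sequence) -> None:
        """One-shot storage of [(element, onset_time), ...]."""
        prev = None
        for element, t in sequence:
            col = self._column(element)
            neuron = col.allocate_neuron()  # new context for this element
            if prev is not None:
                p_col, p_neuron, p_t = prev
                # the stored delay preserves the interval between elements
                p_col.outgoing.setdefault(p_neuron, []).append(
                    Connection(col.name, neuron, t - p_t)
                )
            prev = (col, neuron, t)

    def recall(self, element: str, t0: float = 0.0):
        """Replay the stored sequence from a cue element onward.

        A spiking model would select the context neuron via dendritic
        matching; here we simply follow the single stored transition.
        """
        out = [(element, t0)]
        col, neuron, t = self.columns[element], 0, t0
        while col.outgoing.get(neuron):
            conn = col.outgoing[neuron][0]
            t += conn.delay
            out.append((conn.target_column, t))
            col, neuron = self.columns[conn.target_column], conn.target_neuron
        return out


if __name__ == "__main__":
    mem = SequenceMemory()
    # a toy melody: (note, onset time in seconds)
    mem.learn([("C4", 0.0), ("E4", 0.5), ("G4", 0.75), ("C5", 1.5)])
    print(mem.recall("C4"))  # replays the notes with the original intervals

Cued with only the first note, the sketch reproduces the full sequence together with its inter-element intervals, mirroring the recall-from-short-sub-sequence behavior reported in the abstract.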
Acknowledgement
This work is supported in part by the National Key Research and Development Program of China under Grant 2018AAA0100202, and in part by the National Natural Science Foundation of China under Grant 61976043.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Luo, X., Liu, H., Chen, Y., Zhang, M., Qu, H. (2023). Temporal-Sequential Learning with Columnar-Structured Spiking Neural Networks. In: Tanveer, M., Agarwal, S., Ozawa, S., Ekbal, A., Jatowt, A. (eds) Neural Information Processing. ICONIP 2022. Communications in Computer and Information Science, vol 1791. Springer, Singapore. https://doi.org/10.1007/978-981-99-1639-9_13
DOI: https://doi.org/10.1007/978-981-99-1639-9_13
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-1638-2
Online ISBN: 978-981-99-1639-9
eBook Packages: Computer Science, Computer Science (R0)