Abstract
This paper presents a new memory for the Hopfield model that addresses several of the model's drawbacks, including limited loading capacity, limit cycles, and poor error tolerance. The memory is derived from the hairy model [15]. The paper also constructs a training process that further reinforces the vulnerable parts of the memory and improves its performance.
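For context, the following is a minimal sketch of the classical Hopfield associative memory with Hebbian loading, not the paper's hairy model; it illustrates the recall dynamics whose drawbacks (limited capacity, limit cycles under synchronous update) the paper targets. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian outer-product rule; diagonal zeroed (no self-connections)."""
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns) / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, sweeps=50):
    """Asynchronous sign updates; avoids the 2-cycles that synchronous
    updates can fall into (one of the limit-cycle issues noted above)."""
    s = state.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Two bipolar patterns stored, then recall from a one-bit-corrupted probe.
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])
W = train_hebbian(patterns)
noisy = patterns[0].copy()
noisy[0] *= -1  # flip one bit
restored = recall(W, noisy)
```

With only two stored patterns on six neurons the probe is restored exactly; Hebbian loading degrades sharply as the pattern count approaches roughly 0.14 times the number of neurons, which is the capacity limitation the abstract refers to.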
References
Ackley, D.H., Hinton, G.E., Sejnowski, T.J.: A learning algorithm for Boltzmann machines. Cognitive Science 9, 147–169 (1985)
Amari, S.I., Maginu, K.: Statistical Neurodynamics of Associative Memory. Neural Networks 1(1), 63–73 (1988)
Amit, D.J.: Modeling brain function: The world of attractor neural networks. Cambridge University Press, Cambridge (1989)
Gardner, E., Derrida, B.: Optimal storage properties of neural network models. Journal of Physics A 21, 271–284 (1988)
Gardner, E.: Optimal basins of attraction in randomly sparse neural network models. Journal of Physics A 22(12), 1969–1974 (1989)
Hartwell, L.H., Hopfield, J.J., Leibler, S., Murray, A.W.: From molecular to modular cell biology. Nature, Suppl. 402, C47–C52 (1999)
Hopfield, J.J.: Neural networks and physical systems with emergent collective computational ability. Proceedings of the National Academy of Sciences of the United States of America 79, 2554–2558 (1982)
Kanter, I., Sompolinsky, H.: Associative recall of memory without errors. Physical Review A 35(1), 380–392 (1987)
Kauffman, S.A.: Antichaos and adaptation. Scientific American, 64–70 (August 1991)
Li, J., Michel, A.N., Porod, W.: Analysis and synthesis of a class of neural networks: linear systems operating on a closed hypercube. IEEE Transactions on Circuits and Systems 36(11), 1405–1422 (1989)
Liou, C.-Y., Lin, S.-L.: The other variant Boltzmann machine. In: Proceedings of International Joint Conference on Neural Networks, Washington DC, pp. 449–454 (1989)
Liou, C.-Y., Wu, J.-M.: Self-organization using Potts models. Neural Networks 9(4), 671–684 (1996)
Liou, C.-Y., Yuan, S.-K.: Error tolerant associative memory. Biological Cybernetics 81, 331–342 (1999)
Liou, C.-Y., Yang, H.-C.: Selective feature-to-feature adhesion for recognition of cursive handprinted characters. IEEE Transactions on Pattern Analysis and Machine Intelligence 21(2), 184–191 (1999)
Liou, C.-Y., Lin, S.-L.: Finite memory loading in hairy neurons. Natural Computing 5(1), 15–42 (2006)
Little, W.A.: The existence of persistent states in the brain. Mathematical Biosciences 19, 101–120 (1974)
Personnaz, L., Guyon, I., Dreyfus, G.: Information storage and retrieval in spin-glass like neural networks. Journal de Physique Lettres 46, 359–365 (1985)
Weisbuch, G., Fogelman-Soulie, F.: Scaling laws for the attractors of Hopfield networks. Journal de Physique Lettres 46, 623–630 (1985)
Widrow, B., Hoff Jr., M.E.: Adaptive switching circuits. IRE WESCON Convention Record, pp. 96–104 (1960)
© 2006 Springer-Verlag Berlin Heidelberg
Liou, CY. (2006). Backbone Structure of Hairy Memory. In: Kollias, S.D., Stafylopatis, A., Duch, W., Oja, E. (eds) Artificial Neural Networks – ICANN 2006. ICANN 2006. Lecture Notes in Computer Science, vol 4131. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11840817_72
Print ISBN: 978-3-540-38625-4
Online ISBN: 978-3-540-38627-8