Abstract
For some problems, the back-propagation learning rule often used for training multilayer feedforward networks appears to have serious limitations. In this paper we describe BP-SOM, an alternative training procedure. In BP-SOM the traditional back-propagation learning rule is combined with unsupervised learning in self-organizing maps. While the multilayer feedforward network is trained, the hidden-unit activations of the feedforward network are used as training material for the accompanying self-organizing maps. After a few training cycles, the maps develop a degree of self-organization. The information in the maps is used in updating the connection weights of the feedforward network. The effect is that during BP-SOM learning, hidden-unit activations of patterns associated with the same class become more similar to each other. Results on two hard-to-learn classification tasks show that the BP-SOM architecture and learning rule offer a strong alternative for training multilayer feedforward networks with back-propagation.
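The interplay described above can be sketched in code. The following is a minimal illustration, not the authors' exact rule: a small MLP is trained with back-propagation on XOR while a flat SOM (here a 3x3 grid, flattened) clusters the hidden-unit activations, and the best-matching SOM vector contributes an extra pull term to the hidden-layer error so that similar patterns develop similar hidden activations. The network sizes, learning rates, the pull-term weight `alpha`, and the omission of the original rule's class-label bookkeeping are all assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class task (XOR); sizes are illustrative, not from the paper.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0.], [1.], [1.], [0.]])

n_hid, n_map = 4, 9                       # hidden units; 3x3 SOM, flattened
W1 = rng.normal(0, .5, (2, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0, .5, (n_hid, 1)); b2 = np.zeros(1)
som = rng.normal(0, .5, (n_map, n_hid))   # SOM prototype vectors

sig = lambda z: 1 / (1 + np.exp(-z))
lr, som_lr, alpha = 0.5, 0.1, 0.1         # hypothetical hyperparameters

def mse():
    """Mean squared error of the feedforward net over the whole set."""
    H = sig(X @ W1 + b1)
    return float(((sig(H @ W2 + b2) - y) ** 2).mean())

err0 = mse()
for epoch in range(2000):
    for x, t in zip(X, y):
        h = sig(x @ W1 + b1)              # hidden activations
        o = sig(h @ W2 + b2)              # network output
        # SOM side: find the best-matching unit and move it toward h.
        bmu = int(np.argmin(((som - h) ** 2).sum(1)))
        diff = h - som[bmu]
        som[bmu] += som_lr * diff
        # Backprop side: the usual delta, plus a term pulling h toward
        # the winning SOM vector (the map information feeding back).
        d_o = (o - t) * o * (1 - o)
        d_h = (d_o @ W2.T + alpha * diff) * h * (1 - h)
        W2 -= lr * np.outer(h, d_o); b2 -= lr * d_o
        W1 -= lr * np.outer(x, d_h); b1 -= lr * d_h
err1 = mse()
```

In the published rule the pull term is applied selectively, based on class labels attached to map elements and their reliability; the unconditional pull above is a deliberate simplification to keep the sketch short.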
Weijters, A.J.M.M. The BP-SOM architecture and learning rule. Neural Process Lett 2, 13–16 (1995). https://doi.org/10.1007/BF02309010