Hyperdimensional Computing: An Introduction to Computing in Distributed Representation with High-Dimensional Random Vectors

Author: Pentti Kanerva

Published in: Cognitive Computation, Issue 2/2009 | 01-06-2009

Abstract

The 1990s saw the emergence of cognitive models that depend on very high dimensionality and randomness. They include Holographic Reduced Representations, Spatter Code, Semantic Vectors, Latent Semantic Analysis, Context-Dependent Thinning, and Vector-Symbolic Architecture. They represent things in high-dimensional vectors that are manipulated by operations that produce new high-dimensional vectors in the style of traditional computing, in what is called here hyperdimensional computing on account of the very high dimensionality. The paper presents the main ideas behind these models, written as a tutorial essay in hopes of making the ideas accessible and even provocative. A sketch of how we have arrived at these models, with references and pointers to further reading, is given at the end. The thesis of the paper is that hyperdimensional representation has much to offer to students of cognitive science, theoretical neuroscience, computer science and engineering, and mathematics.
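To make the style of computing concrete, here is a minimal sketch in Python (not code from the paper) in the spirit of the binary variant it discusses: 10,000-bit random vectors, binding by elementwise XOR, bundling by bitwise majority rule, and similarity measured by Hamming distance. The record roles and fillers (NAME, ALICE, and so on) are hypothetical names chosen purely for illustration.

```python
import numpy as np

D = 10_000                      # very high dimensionality is the point
rng = np.random.default_rng(1)

def random_hv():
    """Draw a random dense binary hypervector. Any two independently
    drawn vectors are nearly orthogonal: their normalized Hamming
    distance concentrates tightly around 0.5."""
    return rng.integers(0, 2, D, dtype=np.uint8)

def bind(a, b):
    """Bind two hypervectors with elementwise XOR. The result is
    dissimilar to both inputs, and binding is its own inverse:
    bind(bind(a, b), b) == a."""
    return a ^ b

def bundle(*vs):
    """Superpose (bundle) hypervectors by a bitwise majority vote.
    The result remains similar to every one of its inputs."""
    return (np.sum(np.stack(vs), axis=0) * 2 > len(vs)).astype(np.uint8)

def distance(a, b):
    """Normalized Hamming distance: ~0.0 for identical vectors,
    ~0.5 for unrelated ones."""
    return np.count_nonzero(a != b) / D

# Encode a tiny record as a single hypervector by binding each role
# to its filler and bundling the role-filler pairs (all hypothetical).
NAME, AGE, CITY = random_hv(), random_hv(), random_hv()        # roles
ALICE, THIRTY, PARIS = random_hv(), random_hv(), random_hv()   # fillers
record = bundle(bind(NAME, ALICE), bind(AGE, THIRTY), bind(CITY, PARIS))

# Probing the record with a role yields a noisy copy of its filler.
noisy_alice = bind(record, NAME)
print(distance(noisy_alice, ALICE))   # well below 0.5 (about 0.25)
print(distance(noisy_alice, PARIS))   # about 0.5: unrelated
```

Because any two random hypervectors are nearly orthogonal, the noisy result of a probe can be identified by comparing it against the known item vectors, which is the role the paper assigns to an item (clean-up) memory.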

Literature
1. Anderson JA. A simple neural network generating an interactive memory. Math Biosci. 1972;14:197–220.
2. Kohonen T. Correlation matrix memories. IEEE Trans Comput. 1972;C-21(4):353–9.
3. Hopfield JJ. Neural networks and physical systems with emergent collective computational abilities. Proc Natl Acad Sci USA. 1982;79(8):2554–8.
4. Kanerva P. Sparse distributed memory. Cambridge, MA: MIT Press; 1988.
5. Karlsson R. A fast activation mechanism for the Kanerva SDM memory. In: Uesaka Y, Kanerva P, Asoh H, editors. Foundations of real-world computing. Stanford: CSLI; 2001. p. 289–93.
6. Aleksander I, Stonham TJ, Wilkie BA. Computer vision systems for industry: WISARD and the like. Digit Syst Ind Autom. 1982;1:305–23.
7. Hinton GE, Anderson JA, editors. Parallel models of associative memory. Hillsdale, NJ: Erlbaum; 1981.
8. Hassoun MH, editor. Associative neural memories: theory and implementation. New York, Oxford: Oxford University Press; 1993.
9. Kohonen T. Self-organization and associative memory. 3rd ed. Berlin: Springer; 1989.
10. Palm G. Neural assemblies: an alternative approach to artificial intelligence. Heidelberg: Springer; 1982.
11. Hinton GE. Mapping part–whole hierarchies into connectionist networks. Artif Intell. 1990;46(1–2):47–75.
12. Smolensky P. Tensor product variable binding and the representation of symbolic structures in connectionist networks. Artif Intell. 1990;46(1–2):159–216.
13. Plate T. Holographic Reduced Representations: convolution algebra for compositional distributed representations. In: Mylopoulos J, Reiter R, editors. Proc. 12th int'l joint conference on artificial intelligence (IJCAI). San Mateo, CA: Kaufmann; 1991. p. 30–35.
14. Plate TA. Holographic reduced representation: distributed representation of cognitive structure. Stanford: CSLI; 2003.
15. Kanerva P. Binary spatter-coding of ordered K-tuples. In: von der Malsburg C, von Seelen W, Vorbruggen JC, Sendhoff B, editors. Artificial neural networks – ICANN 96 proceedings (Lecture notes in computer science, vol. 1112). Berlin: Springer; 1996. p. 869–73.
16. Gayler RW. Multiplicative binding, representation operators, and analogy. Poster abstract. In: Holyoak K, Gentner D, Kokinov B, editors. Advances in analogy research. Sofia: New Bulgarian University; 1998. p. 405. Full poster: http://cogprints.org/502/. Accessed 15 Nov 2008.
17. Rachkovskij DA, Kussul EM. Binding and normalization of binary sparse distributed representations by context-dependent thinning. Neural Comput. 2001;13(2):411–52.
18. Kussul EM, Baidyk TN. On information encoding in associative–projective neural networks. Report 93-3. Kiev, Ukraine: V.M. Glushkov Inst. of Cybernetics; 1993 (in Russian).
19. Landauer T, Dumais S. A solution to Plato's problem: the Latent Semantic Analysis theory of acquisition, induction and representation of knowledge. Psychol Rev. 1997;104(2):211–40.
20. Kanerva P, Kristoferson J, Holst A. Random Indexing of text samples for latent semantic analysis. Poster abstract. In: Gleitman LR, Josh AK, editors. Proc. 22nd annual conference of the Cognitive Science Society. Mahwah, NJ: Erlbaum; 2000. p. 1036. Full poster: http://www.rni.org/kanerva/cogsci2k-poster.txt. Accessed 23 Nov 2008.
21. Papadimitriou C, Raghavan P, Tamaki H, Vempala S. Latent semantic indexing: a probabilistic analysis. Proc. 17th ACM symposium on the principles of database systems. New York: ACM Press; 1998. p. 159–68.
22. Kaski S. Dimensionality reduction by random mapping: fast similarity computation for clustering. Proc. int'l joint conference on neural networks, IJCNN'98. Piscataway, NJ: IEEE Service Center; 1998. p. 413–8.
24. Schütze H. Word space. In: Hanson SJ, Cowan JD, Giles CL, editors. Advances in neural information processing systems 5. San Mateo, CA: Kaufmann; 1993. p. 895–902.
25. Lund K, Burgess C, Atchley R. Semantic and associative priming in high-dimensional semantic space. Proc. 17th annual conference of the Cognitive Science Society. Mahwah, NJ: Erlbaum; 1995. p. 660–5.
27. Jones MN, Mewhort DJK. Representing word meaning and order information in a composite holographic lexicon. Psychol Rev. 2007;114(1):1–37.
28. Sahlgren M, Holst A, Kanerva P. Permutations as a means to encode order in word space. Proc. 30th annual conference of the Cognitive Science Society. Austin, TX: Cognitive Science Society; 2008. p. 1300–5.
29. Widdows D. Geometry and meaning. Stanford: CSLI; 2004.
Metadata

Title: Hyperdimensional Computing: An Introduction to Computing in Distributed Representation with High-Dimensional Random Vectors
Author: Pentti Kanerva
Publication date: 01-06-2009
Publisher: Springer-Verlag
Published in: Cognitive Computation, Issue 2/2009
Print ISSN: 1866-9956
Electronic ISSN: 1866-9964
DOI: https://doi.org/10.1007/s12559-009-9009-8
