Vector Symbolic Architectures as a Computing Framework for Emerging Hardware.
Denis Kleyko, Mike Davies, E Paxon Frady, Pentti Kanerva, Spencer J Kent, Bruno A Olshausen, Evgeny Osipov, Jan M Rabaey, Dmitri A Rachkovskij, Abbas Rahimi, Friedrich T Sommer
Author Information
Denis Kleyko: Redwood Center for Theoretical Neuroscience at the University of California at Berkeley, CA 94720, USA and also with the Intelligent Systems Lab at Research Institutes of Sweden, 16440 Kista, Sweden.
Mike Davies: Neuromorphic Computing Lab, Intel Labs, Santa Clara, CA 95054, USA.
E Paxon Frady: Neuromorphic Computing Lab, Intel Labs, Santa Clara, CA 95054, USA.
Pentti Kanerva: Redwood Center for Theoretical Neuroscience at the University of California at Berkeley, CA 94720, USA.
Spencer J Kent: Redwood Center for Theoretical Neuroscience at the University of California at Berkeley, CA 94720, USA.
Bruno A Olshausen: Redwood Center for Theoretical Neuroscience at the University of California at Berkeley, CA 94720, USA.
Evgeny Osipov: Department of Computer Science, Electrical and Space Engineering, Luleå University of Technology, 97187 Luleå, Sweden.
Jan M Rabaey: Department of Electrical Engineering and Computer Sciences at the University of California at Berkeley, CA 94720, USA.
Dmitri A Rachkovskij: International Research and Training Center for Information Technologies and Systems, 03680 Kyiv, Ukraine, and with the Department of Computer Science, Electrical and Space Engineering, Luleå University of Technology, 97187 Luleå, Sweden.
Abbas Rahimi: IBM Research - Zurich, 8803 Rüschlikon, Switzerland.
Friedrich T Sommer: Neuromorphic Computing Lab, Intel Labs, Santa Clara, CA 95054, USA and also with the Redwood Center for Theoretical Neuroscience at the University of California at Berkeley, CA 94720, USA.
This article reviews recent progress in the development of the computing framework Vector Symbolic Architectures (also known as Hyperdimensional Computing). This framework is well suited for implementation in stochastic, emerging hardware, and it naturally expresses the types of cognitive operations required for Artificial Intelligence (AI). We demonstrate in this article that the field-like algebraic structure of Vector Symbolic Architectures offers simple but powerful operations on high-dimensional vectors that can support all data structures and manipulations relevant to modern computing. In addition, we illustrate the distinguishing feature of Vector Symbolic Architectures, "computing in superposition," which sets this framework apart from conventional computing. It also opens the door to efficient solutions to the difficult combinatorial search problems inherent in AI applications. We sketch ways of demonstrating that Vector Symbolic Architectures are computationally universal. We see them acting as a framework for computing with distributed representations that can play the role of an abstraction layer for emerging computing hardware. This article serves as a reference for computer architects by illustrating the philosophy behind Vector Symbolic Architectures, techniques of distributed computing with them, and their relevance to emerging computing hardware, such as neuromorphic computing.
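To make the abstract's claims about "simple but powerful operations on high-dimensional vectors" concrete, the following is a minimal sketch of one common VSA flavor (bipolar hypervectors with multiplication as binding and majority-sum as superposition). The dimension, item names, and helper functions are illustrative assumptions, not material from the article itself.

```python
# Minimal VSA sketch: binding, superposition, and a key-value record.
# All names and parameters here are illustrative choices (assumptions).
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # high dimensionality makes random vectors quasi-orthogonal

def random_hv():
    """Random bipolar hypervector with entries in {-1, +1}."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding: element-wise multiplication (self-inverse for bipolar vectors)."""
    return a * b

def bundle(*vs):
    """Superposition (bundling): element-wise majority via sign of the sum."""
    return np.sign(np.sum(vs, axis=0)).astype(int)

def sim(a, b):
    """Normalized dot product: ~0 for unrelated vectors, 1 for identical ones."""
    return float(a @ b) / D

# Item memory: atomic hypervectors for roles (keys) and fillers (values).
items = {name: random_hv() for name in
         ["name", "age", "city", "Alice", "33", "Berkeley"]}

# Encode the record {name: Alice, age: 33, city: Berkeley} as one hypervector:
# bind each role to its filler, then superpose the bound pairs.
record = bundle(bind(items["name"], items["Alice"]),
                bind(items["age"], items["33"]),
                bind(items["city"], items["Berkeley"]))

# Query "what is the city?" by unbinding with the role vector; the result is a
# noisy copy of the filler, cleaned up by nearest-neighbor search in item memory.
noisy = bind(record, items["city"])
best = max(items, key=lambda name: sim(noisy, items[name]))
print(best)                                      # -> Berkeley
print(round(sim(noisy, items["Berkeley"]), 2))   # noticeably above chance, e.g. ~0.5
```

The query step also hints at "computing in superposition": a single unbind-and-compare pass operates on all bound pairs held in the record at once, rather than iterating over them one by one.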