Computing on Functions Using Randomized Vector Representations

09/08/2021
by E. Paxon Frady, et al.

Vector space models for symbolic processing that encode symbols by random vectors have been proposed in cognitive science and connectionist communities under the names Vector Symbolic Architecture (VSA) and, synonymously, Hyperdimensional (HD) computing. In this paper, we generalize VSAs to function spaces by mapping continuous-valued data into a vector space such that the inner product between the representations of any two data points represents a similarity kernel. By analogy to VSA, we call this new function encoding and computing framework Vector Function Architecture (VFA). In VFAs, vectors can represent individual data points as well as elements of a function space (a reproducing kernel Hilbert space). The algebraic vector operations, inherited from VSA, correspond to well-defined operations in function space. Furthermore, we study a previously proposed method for encoding continuous data, fractional power encoding (FPE), which uses exponentiation of a random base vector to produce randomized representations of data points and fulfills the kernel properties for inducing a VFA. We show that the distribution from which elements of the base vector are sampled determines the shape of the FPE kernel, which in turn induces a VFA for computing with band-limited functions. In particular, VFAs provide an algebraic framework for implementing large-scale kernel machines with random features, extending the approach of Rahimi and Recht (2007). Finally, we demonstrate several applications of VFA models to problems in image recognition, density estimation, and nonlinear regression. Our analyses and results suggest that VFAs constitute a powerful new framework for representing and manipulating functions in distributed neural systems, with myriad applications in artificial intelligence.
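The fractional power encoding described in the abstract can be sketched in a few lines of NumPy. This is an illustrative sketch, not the paper's implementation: it assumes a phasor base vector whose phases are drawn uniformly from (-π, π), for which the induced FPE similarity is approximately the sinc kernel, and the helper names (`fpe`, `sim`) are ours.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # vector dimension

# Random base vector of unit-modulus phasors: z = exp(i*theta),
# with phases theta sampled uniformly from (-pi, pi).
theta = rng.uniform(-np.pi, np.pi, size=D)

def fpe(x):
    """Fractional power encoding: elementwise power z**x of the base vector."""
    return np.exp(1j * theta * x)

def sim(a, b):
    """Normalized inner product (real part) between two encodings."""
    return np.real(np.vdot(a, b)) / D

# The inner product between encodings of x and y approximates a
# shift-invariant kernel of (x - y); for uniform phases it is sinc(x - y).
x, y = 0.3, 0.8
approx = sim(fpe(x), fpe(y))
exact = np.sinc(x - y)  # np.sinc(t) = sin(pi*t) / (pi*t)
```

Note that elementwise (Hadamard) binding of two encodings equals the encoding of the sum, `fpe(x) * fpe(y) == fpe(x + y)` exactly, which is the algebraic property that lets VSA binding act as translation in the induced function space.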


Related research:

Shift-Equivariant Similarity-Preserving Hypervector Representations of Sequences (12/31/2021)
Vector Symbolic Architectures as a Computing Framework for Nanoscale Hardware (06/09/2021)
A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part I: Models and Data Transformations (11/11/2021)
Residual and Attentional Architectures for Vector-Symbols (07/18/2022)
Recursive Binding for Similarity-Preserving Hypervector Representations of Sequences (01/27/2022)
A comparison of Vector Symbolic Architectures (01/31/2020)
Performance Analysis of Linear Algebraic Functions using Reconfigurable Computing (04/10/2019)
