A Digital Neuromorphic Architecture Efficiently Facilitating Complex Synaptic Response Functions Applied to Liquid State Machines

03/21/2017
by   Michael R. Smith, et al.

Information in neural networks is stored as weighted connections, or synapses, between neurons. This poses a problem because the primary computational bottleneck for neural networks is the vector-matrix multiply in which inputs are multiplied by the network weights. Conventional processing architectures are not well suited for simulating neural networks, often requiring large amounts of energy and time. Additionally, synapses in biological neural networks are not binary connections but exhibit a nonlinear response function as neurotransmitters are emitted and diffuse between neurons. Inspired by neuroscience principles, we present a digital neuromorphic architecture, the Spiking Temporal Processing Unit (STPU), capable of modeling arbitrarily complex synaptic response functions without requiring additional hardware components. We consider the paradigm of spiking neurons with temporally coded information, as opposed to the non-spiking, rate-coded neurons used in most neural networks. In this paradigm we examine liquid state machines applied to speech recognition and show how a liquid state machine with temporal dynamics maps onto the STPU, demonstrating the flexibility and efficiency of the STPU for instantiating neural algorithms.
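To make the contrast with binary connections concrete, the sketch below models a synapse as a temporal kernel rather than a single instantaneous weight. This is an illustration, not the STPU implementation: the double-exponential kernel and the time constants `tau_rise` and `tau_decay` are assumed values chosen to mimic the rise and decay of neurotransmitter concentration described above.

```python
import numpy as np

def double_exponential_kernel(t, tau_rise=1.0, tau_decay=5.0):
    """Postsynaptic response to a single spike at t = 0.

    Rises with time constant tau_rise and decays with tau_decay,
    roughly following neurotransmitter release and diffusion.
    (Illustrative parameters, not taken from the paper.)
    """
    return np.where(t >= 0, np.exp(-t / tau_decay) - np.exp(-t / tau_rise), 0.0)

def synaptic_response(spike_train, weight, dt=0.1, tau_rise=1.0, tau_decay=5.0):
    """Convolve a binary spike train with the weighted synaptic kernel."""
    t = np.arange(0.0, 10 * tau_decay, dt)
    kernel = weight * double_exponential_kernel(t, tau_rise, tau_decay)
    return np.convolve(spike_train, kernel)[: len(spike_train)]

# A single presynaptic spike at step 10: instead of an instantaneous jump
# (as a scalar weight would produce), the response rises and then decays.
spikes = np.zeros(200)
spikes[10] = 1.0
response = synaptic_response(spikes, weight=0.5)
```

A spiking architecture that supports kernels like this one can shape how each input spike influences downstream neurons over time, which is the kind of complex synaptic response function the STPU is designed to model efficiently.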

