Power Optimizations in MTJ-based Neural Networks through Stochastic Computing

by Ankit Mondal, et al.

Artificial Neural Networks (ANNs) have found widespread application in tasks such as pattern recognition and image classification. However, hardware implementations of ANNs using conventional binary arithmetic units are computationally expensive, energy-intensive, and incur large area overheads. Stochastic Computing (SC) is an emerging paradigm that replaces these conventional units with simple logic circuits and is particularly suitable for fault-tolerant applications. Spintronic devices, such as Magnetic Tunnel Junctions (MTJs), are capable of replacing CMOS in memory and logic circuits. In this work, we propose an energy-efficient use of MTJs, which exhibit probabilistic switching behavior, as Stochastic Number Generators (SNGs); this forms the basis of our NN implementation in the SC domain. Further, the error resilience of target NN applications allows us to introduce Approximate Computing, a framework in which the accuracy of computation is traded off for substantial reductions in power consumption. We propose approximating the synaptic weights in our MTJ-based NN implementation, in ways enabled by the properties of our MTJ-SNG, to achieve energy efficiency. We design an algorithm that performs such approximations within a given error tolerance in a single-layer NN optimally, owing to the convexity of the problem formulation. We then use this algorithm to develop a heuristic approach for approximating multi-layer NNs. To give a perspective on the effectiveness of our approach, we quantify the power savings brought about by the proposed algorithm.
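The SC primitive underlying this approach can be illustrated with a short simulation: an SNG converts a value in [0, 1] into a bitstream whose fraction of 1s approximates that value, after which a single AND gate multiplies two such streams, replacing a full binary multiplier. The sketch below is illustrative only; the function names are ours, and a Bernoulli sample stands in for the probabilistic switching of a real MTJ device.

```python
import random

rng = random.Random(42)  # fixed seed so the simulation is reproducible

def mtj_sng(p, n):
    """Generate an n-bit unipolar stochastic bitstream encoding p in [0, 1].
    Each bit mimics one probabilistic switching trial of an MTJ with
    switching probability p (a software stand-in, not a device model)."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def sc_multiply(a, b, n=10000):
    """Multiply two unipolar values with one AND gate per bit pair.
    The two streams must be statistically independent; correlated
    streams would compute min(a, b) instead of the product."""
    sa, sb = mtj_sng(a, n), mtj_sng(b, n)
    return sum(x & y for x, y in zip(sa, sb)) / n

print(sc_multiply(0.6, 0.5))  # ~0.30, within stochastic estimation noise
```

The accuracy of the result scales with stream length n, which is the energy/accuracy trade-off that makes SC attractive for error-resilient NN workloads.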



