Neural Network Training with Approximate Logarithmic Computations

10/22/2019
by Arnab Sanyal, et al.

The high computational complexity associated with training deep neural networks limits online and real-time training on edge devices. This paper proposes an end-to-end training and inference scheme that replaces multiplications with approximate operations in the log-domain, which has the potential to significantly reduce implementation complexity. We implement the entire training procedure in the log-domain, with fixed-point data representations. This training procedure is inspired by hardware-friendly approximations of log-domain addition which are based on look-up tables and bit-shifts. We show that our 16-bit log-based training can achieve classification accuracy within approximately 1% of floating-point baselines for a number of commonly used datasets.
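The paper's exact approximation and data-format details are in the full text; as a rough sketch of the general technique (not the authors' implementation), the following Python illustrates why log-domain computation is attractive: multiplication of positive values becomes exact addition of their logarithms, while addition requires a correction term that can be approximated cheaply, here read from a hypothetical 16-entry look-up table. The LUT size, step, and pure-Python floating-point flow are all illustrative assumptions; a hardware version would use fixed-point logs and bit-shifts.

```python
import numpy as np

def log_mul(a_log, b_log):
    # Multiplication is exact in the log-domain:
    # log2(x * y) = log2(x) + log2(y).
    return a_log + b_log

# Hypothetical 16-entry look-up table for the correction term
# phi(d) = log2(1 + 2^-d), sampled at d = 0, 0.25, ..., 3.75.
LUT_STEP = 0.25
LUT = np.log2(1.0 + 2.0 ** -(np.arange(16) * LUT_STEP))

def log_add(a_log, b_log):
    # Addition is approximated via the identity
    # log2(2^a + 2^b) = max(a, b) + log2(1 + 2^-|a - b|),
    # with the correction phi read from the small LUT.
    # phi ~ 0 once |a - b| exceeds the table range.
    d = abs(a_log - b_log)
    idx = int(d / LUT_STEP)
    phi = LUT[idx] if idx < len(LUT) else 0.0
    return max(a_log, b_log) + phi

# Example: 6 * 12 = 72 and 6 + 12 = 18, computed entirely in log space.
a, b = np.log2(6.0), np.log2(12.0)
print(2 ** log_mul(a, b))   # ~72.0 (exact)
print(2 ** log_add(a, b))   # ~18.0, up to LUT quantization error
```

This sketch handles only positive magnitudes; sign-magnitude log number systems track the sign bit separately and reduce subtraction to a second, similar correction table.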

Related research

12/19/2018 · Training Deep Neural Networks with 8-bit Floating Point Numbers
The state-of-the-art hardware platforms for training Deep Neural Network...

02/19/2023 · Fixflow: A Framework to Evaluate Fixed-point Arithmetic in Light-Weight CNN Inference
Convolutional neural networks (CNN) are widely used in resource-constrai...

05/25/2019 · LUTNet: speeding up deep neural network inferencing via look-up tables
We consider the use of look-up tables (LUT) to speed up and simplify the...

12/07/2020 · Deep Neural Network Training without Multiplications
Is multiplication really necessary for deep neural networks? Here we pro...

09/29/2019 · AdaptivFloat: A Floating-point based Data Type for Resilient Deep Learning Inference
Conventional hardware-friendly quantization methods, such as fixed-point...

05/26/2023 · Hardware-Efficient Transformer Training via Piecewise Affine Operations
Multiplications are responsible for most of the computational cost invol...

07/30/2023 · An Efficient Approach to Mitigate Numerical Instability in Backpropagation for 16-bit Neural Network Training
In this research, we delve into the intricacies of the numerical instabi...
