Hardware Synthesis of State-Space Equations; Application to FPGA Implementation of Shallow and Deep Neural Networks

by Amir-Hossein Kiamarzi et al.

Nowadays, shallow and deep Neural Networks (NNs) have vast applications, including biomedical engineering, image processing, computer vision, and speech recognition. Many researchers have developed hardware accelerators, including field-programmable gate arrays (FPGAs), for implementing high-performance and energy-efficient NNs. However, the hardware architecture design process is specific to each NN and time-consuming, so a systematic way to design, implement, and optimize NNs is in high demand. This paper presents a systematic approach to implementing state-space models at the register transfer level (RTL), with particular emphasis on NN implementation. The proposed design flow is based on the iterative nature of state-space models and the analogy between state-space formulations and finite-state machines. The method applies to linear or nonlinear and time-varying or time-invariant systems. It can be used to implement either intrinsically iterative systems (widely used in domains such as signal processing, numerical analysis, computer arithmetic, and control engineering) or systems that can be rewritten in equivalent iterative forms. The implementation of recurrent NNs such as long short-term memory (LSTM) networks, which have intrinsic state-space forms, is another major application of this framework. As a case study, it is shown that state-space systems can be used for the systematic implementation and optimization of NNs (as nonlinear, time-varying dynamic systems). RTL code-generation software is also provided online, which simplifies the automatic generation of NNs of arbitrary size.
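To make the state-space view of an LSTM concrete, the following is a minimal software sketch (not the paper's tool or its RTL output): the hidden state h and cell state c together form the state x_k, the current input sample is u_k, and each call to `step` realizes one iteration of the nonlinear state-space map x_{k+1} = f(x_k, u_k), which is the same per-clock update a finite-state-machine-style datapath would compute in hardware. All names and the weight layout here are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class LSTMStateSpace:
    """An LSTM cell viewed as a nonlinear time-invariant state-space system
    (illustrative sketch; weight layout is an assumption, not the paper's)."""

    def __init__(self, n_in, n_hid, seed=0):
        rng = np.random.default_rng(seed)
        # One stacked weight matrix for the four gates (input, forget, cell, output).
        self.W = rng.standard_normal((4 * n_hid, n_in + n_hid)) * 0.1
        self.b = np.zeros(4 * n_hid)

    def step(self, state, u):
        """One state-space iteration: ((h, c), u) -> (h', c')."""
        h, c = state
        z = self.W @ np.concatenate([u, h]) + self.b
        i, f, g, o = np.split(z, 4)
        c_next = sigmoid(f) * c + sigmoid(i) * np.tanh(g)  # cell-state update
        h_next = sigmoid(o) * np.tanh(c_next)              # hidden-state output
        return h_next, c_next

# Iterating the map over an input sequence, as hardware would per clock cycle:
cell = LSTMStateSpace(n_in=3, n_hid=4)
h, c = np.zeros(4), np.zeros(4)
for u in np.ones((5, 3)):
    h, c = cell.step((h, c), u)
print(h.shape)  # (4,)
```

Because the whole computation is a single repeated update of a fixed-size state, it maps naturally onto an FSM-plus-datapath RTL structure: one register bank for (h, c) and one combinational block for f, clocked once per input sample.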




