Just-in-Time Dynamic-Batching

04/16/2019
by Sheng Zha, et al.

Batching is an essential technique for improving computational efficiency in deep learning frameworks. While batch processing for models with static feed-forward computation graphs is straightforward to implement, batching for dynamic computation graphs such as syntax trees or social network graphs is challenging because the graph structure varies across samples. Through simulation and analysis of a Tree-LSTM model, we show the key trade-off in dynamic batching between graph analysis time and batching effectiveness. Based on this finding, we propose a dynamic batching method as an extension to MXNet Gluon's just-in-time compilation (JIT) framework. We show empirically that our method yields up to a 6.25 times speed-up on a common dynamic workload, a Tree-LSTM model for the semantic relatedness task.
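
To make the trade-off concrete, here is a minimal sketch of the batching idea the abstract describes: nodes of a tree-structured computation whose inputs are already available are grouped and evaluated with one batched matrix multiply per level, rather than one small multiply per node. This is a simplified NumPy illustration of depth-wise batching under a toy tree-RNN recurrence, not the paper's Tree-LSTM cell or its MXNet Gluon JIT implementation; the `Node`, `levels_bottom_up`, and `eval_batched` names are illustrative only.

```python
import numpy as np

class Node:
    """A node of a dynamic (per-sample) computation tree."""
    def __init__(self, feature, children=()):
        self.feature = np.asarray(feature, dtype=np.float32)
        self.children = list(children)
        self.h = None  # hidden state, filled in during evaluation

def levels_bottom_up(root):
    """Graph analysis step: group nodes by depth so each level can be batched."""
    levels, frontier = [], [root]
    while frontier:
        levels.append(frontier)
        frontier = [c for n in frontier for c in n.children]
    return reversed(levels)  # leaves first, root last

def eval_batched(root, W, U):
    """Evaluate each level of the tree with a single batched matrix multiply."""
    for level in levels_bottom_up(root):
        # Stack per-node inputs into one (num_nodes, dim) matrix.
        x = np.stack([n.feature for n in level])
        # Sum of children's hidden states (zeros for leaves), also stacked.
        cs = np.stack([
            np.sum([c.h for c in n.children], axis=0) if n.children
            else np.zeros_like(n.feature)
            for n in level
        ])
        h = np.tanh(x @ W + cs @ U)  # one batched op instead of len(level) small ops
        for n, hn in zip(level, h):  # scatter results back to the nodes
            n.h = hn
    return root.h

# Toy example: a small binary tree with 2-d node features.
leaf1, leaf2 = Node([1.0, 0.0]), Node([0.0, 1.0])
root = Node([0.5, 0.5], children=[leaf1, leaf2])
print(eval_batched(root, np.eye(2, dtype=np.float32), np.eye(2, dtype=np.float32)))
```

The sketch makes the trade-off visible: `levels_bottom_up` is per-sample graph analysis whose cost grows with graph size and must be paid at runtime, while the batched multiplies in `eval_batched` are where the speed-up comes from; a practical system has to balance the two.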

