Leapfrogging for parallelism in deep neural networks

01/15/2018
by Yatin Saraiya et al.

We present a technique, which we term leapfrogging, to parallelize backpropagation in deep neural networks. We show that this technique saves a fraction 1 - 1/k of a dominant term in backpropagation, where k is the number of threads (or GPUs).
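The 1 - 1/k figure is consistent with simple arithmetic, assuming the work of the dominant term is divided evenly across the k threads (the abstract does not spell out the derivation, so this reading is our assumption): if that term originally costs T, each thread performs T/k of it, and the fraction of the term saved is

\frac{T - T/k}{T} = 1 - \frac{1}{k}.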
