Predictive Exit: Prediction of Fine-Grained Early Exits for Computation- and Energy-Efficient Inference

06/09/2022
by   Xiangjie Li, et al.
By adding exit layers to deep neural networks, early exit can terminate inference earlier while still producing accurate results. However, the conventional decision of whether to exit or continue to the next layer is made passively: the inference has to pass through every pre-placed exit layer until it exits. In addition, it is hard to adjust the configurations of the computing platform as the inference proceeds. By incorporating a low-cost prediction engine, we propose Predictive Exit, a framework for computation- and energy-efficient deep learning applications. Predictive Exit forecasts where the network will exit (i.e., it establishes the number of remaining layers needed to finish the inference), which effectively reduces the network's computation cost by exiting on time without running every pre-placed exit layer. Moreover, according to the number of remaining layers, proper computing configurations (i.e., frequency and voltage) are selected to execute the network and further save energy. Extensive experimental results demonstrate that Predictive Exit achieves up to 96.2% computation reduction and 72.9% energy saving compared with classic deep learning networks, and 12.8% computation reduction compared with early exit under state-of-the-art exiting strategies, given the same inference accuracy and latency.
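The contrast between passive early exit and the predicted exit described above can be sketched in a few lines. This is an illustrative toy model, not the paper's implementation: the names (`classic_early_exit`, `predictive_exit`, `predict_exit_layer`, `pick_frequency`) and the confidence-threshold exit criterion are assumptions made for the sketch, and the "layers" and "exit heads" are plain Python callables standing in for network stages.

```python
def classic_early_exit(x, layers, exit_heads, threshold=0.9):
    """Passive strategy: evaluate every pre-placed exit head until one
    is confident enough, paying the exit-head cost at each layer."""
    for i, (layer, head) in enumerate(zip(layers, exit_heads)):
        x = layer(x)
        pred, conf = head(x)  # extra computation at every exit point
        if conf >= threshold:
            return pred, i    # exit at layer i
    return pred, len(layers) - 1  # fell through to the last layer

def predictive_exit(x, layers, exit_heads, predict_exit_layer, threshold=0.9):
    """Predicted strategy: a low-cost engine forecasts the exit layer k,
    so intermediate exit heads before layer k are skipped entirely."""
    k = predict_exit_layer(x)
    pred = None
    for i, layer in enumerate(layers):
        x = layer(x)
        if i >= k:  # only run exit heads from the predicted layer onward
            pred, conf = exit_heads[i](x)
            if conf >= threshold:
                return pred, i
    if pred is None:  # prediction overshot: fall back to the final head
        pred, _ = exit_heads[-1](x)
    return pred, len(layers) - 1

def pick_frequency(remaining_layers, deadline_s, cycles_per_layer,
                   freqs_hz=(0.5e9, 1.0e9, 2.0e9)):
    """Given the predicted number of remaining layers, pick the lowest
    frequency (hence voltage level) that still meets the latency deadline."""
    needed_cycles = remaining_layers * cycles_per_layer
    for f in sorted(freqs_hz):
        if needed_cycles / f <= deadline_s:
            return f
    return max(freqs_hz)  # no setting meets the deadline; run at max speed
```

In this sketch, knowing `k` in advance buys two things: the exit heads before layer `k` are never executed, and `pick_frequency` can scale the platform down when many cycles of slack remain before the deadline.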
