AdaBin: Improving Binary Neural Networks with Adaptive Binary Sets

08/17/2022
by Zhijun Tu, et al.

This paper studies Binary Neural Networks (BNNs), in which both weights and activations are binarized to 1-bit values, greatly reducing memory usage and computational complexity. Since modern deep neural networks adopt sophisticated designs with complex architectures to achieve high accuracy, the distributions of their weights and activations are highly diverse. The conventional sign function therefore cannot effectively binarize full-precision values in BNNs. To this end, we present a simple yet effective approach called AdaBin that adaptively obtains the optimal binary set {b_1, b_2} (b_1, b_2 ∈ ℝ) of weights and activations for each layer, instead of using a fixed set (i.e., {-1, +1}). In this way, the proposed method better fits different distributions and increases the representation ability of binarized features. In practice, we use the center position and distance of the 1-bit values to define a new binary quantization function. For the weights, we propose an equalization method that aligns the symmetric center of the binary distribution with that of the real-valued distribution and minimizes the Kullback-Leibler divergence between them. Meanwhile, we introduce a gradient-based optimization method to obtain these two parameters for the activations, which are jointly trained in an end-to-end manner. Experimental results on benchmark models and datasets demonstrate that the proposed AdaBin achieves state-of-the-art performance. For instance, we obtain 66.4% Top-1 accuracy on ImageNet with a ResNet-18 architecture and 69.4 mAP on PASCAL VOC with SSD300.
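The abstract describes the core mechanism in enough detail to sketch it: each layer's binary set {b_1, b_2} is parameterized by a center β and a distance α, so b_1 = β - α and b_2 = β + α. Below is a minimal, hypothetical PyTorch sketch of this idea. The weight statistics used here (mean for the center, standard deviation for the distance) and the straight-through estimator for activations are assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

def binarize_weights(w: torch.Tensor) -> torch.Tensor:
    """Sketch of AdaBin-style weight binarization.

    The estimators below (mean and standard deviation) are assumptions;
    the paper derives its own closed-form equalization that minimizes
    the KL divergence to the real-valued distribution.
    """
    beta = w.mean()                          # center position of the binary set
    alpha = (w - beta).pow(2).mean().sqrt()  # distance (scale) of the binary set
    # Map each weight to the nearer of {beta - alpha, beta + alpha}
    return beta + alpha * torch.sign(w - beta)

class AdaBinActivation(nn.Module):
    """Sketch of activation binarization with learnable center/distance."""

    def __init__(self):
        super().__init__()
        self.beta = nn.Parameter(torch.zeros(()))   # learnable center
        self.alpha = nn.Parameter(torch.ones(()))   # learnable distance

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        centered = (x - self.beta) / self.alpha
        # Straight-through estimator: sign() in the forward pass,
        # identity gradient in the backward pass.
        binary = (torch.sign(centered) - centered).detach() + centered
        return self.alpha * binary + self.beta
```

Because both α and β appear in the differentiable path, they receive gradients and can be trained jointly with the network, matching the abstract's end-to-end description; e.g., `AdaBinActivation()(torch.randn(4, 8))` produces outputs in the layer's learned two-valued set.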


