Differentiable Product Quantization for End-to-End Embedding Compression

08/26/2019
by   Ting Chen, et al.

Embedding layers are commonly used to map discrete symbols into continuous embedding vectors that reflect their semantic meanings. As the number of symbols increases, the number of embedding parameters, and hence the size of the embedding table, grows linearly and becomes problematically large. In this work, we aim to reduce the size of the embedding layer by learning discrete codes and composing embedding vectors from those codes. More specifically, we propose a differentiable product quantization framework with two instantiations, which can serve as an efficient drop-in replacement for an existing embedding layer. Empirically, we evaluate the proposed method on three different language tasks and show that it enables end-to-end training of embedding compression, achieving significant compression ratios (14-238×) at almost no performance cost (and sometimes even a gain).
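To make the idea concrete, below is a minimal NumPy sketch (not the authors' code; all sizes V, K, D, d are illustrative) of product-quantization-based embedding composition: each symbol is assigned D discrete codes, one per group; each group has its own codebook of K centroid sub-vectors, and the full embedding is the concatenation of the selected centroids. The softmax relaxation shown is one common way to make code selection differentiable during training; the paper's two specific instantiations may differ in detail.

```python
import numpy as np

rng = np.random.default_rng(0)

V, K, D, d = 1000, 32, 4, 64               # vocab size, codes per group, groups, embed dim
sub = d // D                               # dimensionality of each sub-vector

codebooks = rng.normal(size=(D, K, sub))   # D codebooks, K centroids each
codes = rng.integers(0, K, size=(V, D))    # discrete codes per symbol (learned in practice)

def compose_embedding(symbol_ids):
    """Inference-time 'hard' path: concatenate the chosen centroid
    from each group's codebook to reconstruct the embedding."""
    c = codes[symbol_ids]                              # (batch, D)
    parts = [codebooks[g, c[:, g]] for g in range(D)]  # D x (batch, sub)
    return np.concatenate(parts, axis=-1)              # (batch, d)

def soft_compose(logits, tau=1.0):
    """Training-time relaxation: a temperature-controlled softmax over
    code logits yields a differentiable convex combination of centroids
    per group, so gradients can flow to both logits and codebooks."""
    w = np.exp(logits / tau)
    w /= w.sum(axis=-1, keepdims=True)                 # (batch, D, K)
    return np.einsum('bgk,gks->bgs', w, codebooks).reshape(len(logits), d)

emb = compose_embedding(np.array([3, 17, 999]))
print(emb.shape)  # (3, 64)
```

As a rough illustration of where the compression comes from: a dense table costs V·d floats, whereas the quantized form stores V·D·⌈log2 K⌉ bits of codes plus a small shared codebook of D·K·(d/D) floats, which shrinks rapidly as V grows.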


