Scaling Deep Contrastive Learning Batch Size with Almost Constant Peak Memory Usage

01/18/2021
by Luyu Gao, et al.

Contrastive learning has been applied successfully to learn numerical vector representations of various forms of data, such as texts and images. The learned encoders exhibit versatile transfer capabilities to many downstream tasks, and representation-based search is highly efficient with state-of-the-art performance. Previous research has demonstrated that learning high-quality representations requires a large number of negatives in the contrastive loss. In practice, the in-batch negatives technique is used: for each example in a batch, the positives of the other batch examples are taken as its negatives, which avoids encoding extra negatives. This, however, still conditions each example's loss on all batch examples and requires fitting the entire large batch into GPU memory. This paper introduces a re-computation technique that decouples backpropagation between the contrastive loss and the encoder, removing the encoder's backward-pass data dependency along the batch dimension. As a result, gradients can be computed for one subset of the batch at a time, leading to almost constant peak GPU memory usage for batches of different sizes.
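Below is a minimal sketch of the idea described in the abstract, assuming a shared PyTorch encoder and a standard in-batch-negative (InfoNCE-style) contrastive loss; the function and parameter names (cached_gradient_step, chunk_size, tau) are illustrative, not taken from the paper. Representations are first computed for all chunks without building a graph, the full-batch loss is backpropagated only to those representations to cache per-example gradients, and each chunk is then re-encoded with the graph enabled so the cached gradients can be pushed through the encoder.

import torch
import torch.nn.functional as F

def cached_gradient_step(encoder, optimizer, queries, keys, chunk_size=8, tau=0.05):
    """One optimizer update over a large batch while only ever holding
    `chunk_size` examples in the encoder's computation graph (illustrative sketch)."""
    optimizer.zero_grad()

    # Step 1: graph-free forward pass over all chunks to collect representations.
    with torch.no_grad():
        q_reps = torch.cat([encoder(c) for c in queries.split(chunk_size)])
        k_reps = torch.cat([encoder(c) for c in keys.split(chunk_size)])

    # Step 2: full-batch contrastive loss on the detached representations;
    # backprop through the loss alone to cache d(loss)/d(representation).
    q_reps.requires_grad_()
    k_reps.requires_grad_()
    scores = q_reps @ k_reps.T / tau                       # in-batch negatives
    labels = torch.arange(scores.size(0), device=scores.device)
    loss = F.cross_entropy(scores, labels)
    loss.backward()
    q_cache, k_cache = q_reps.grad, k_reps.grad

    # Step 3: re-encode one chunk at a time with the graph enabled and inject
    # the cached representation gradients; encoder gradients accumulate across chunks.
    for data, cache in ((queries, q_cache), (keys, k_cache)):
        for chunk, grad_chunk in zip(data.split(chunk_size), cache.split(chunk_size)):
            reps = encoder(chunk)                          # small per-chunk graph
            reps.backward(gradient=grad_chunk)             # chain rule through encoder
    optimizer.step()
    return loss.detach()

In this sketch the encoder's activation memory scales with chunk_size rather than the full batch size; only the representations and the batch-by-batch score matrix grow with the batch, which is why peak GPU memory stays nearly constant as the batch is scaled up.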


