DPFormer: Learning Differentially Private Transformer on Long-Tailed Data

by Youlong Ding, et al.

The Transformer has emerged as a versatile and effective architecture with broad applications. However, how to efficiently train a high-utility Transformer model with differential privacy guarantees remains an open problem. In this paper, we identify two key challenges in learning differentially private Transformers: heavy computational overhead due to per-sample gradient clipping, and unintentional attention distraction within the attention mechanism. In response, we propose DPFormer, equipped with Phantom Clipping and a Re-Attention Mechanism, to address these challenges. Our theoretical analysis shows that DPFormer reduces the computational cost of gradient clipping and effectively mitigates attention distraction (which can obstruct training and lead to a significant performance drop, especially in the presence of long-tailed data). This analysis is further corroborated by empirical results on two real-world datasets, demonstrating the efficiency and effectiveness of the proposed DPFormer.
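The per-sample gradient clipping the abstract refers to is the standard DP-SGD recipe: clip each example's gradient to a fixed norm bound, average, then add Gaussian noise calibrated to that bound. The sketch below is a minimal NumPy illustration of that generic recipe, not the paper's Phantom Clipping; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One DP-SGD aggregation step (generic sketch, not Phantom Clipping).

    Each example's gradient is rescaled so its L2 norm is at most
    `clip_norm`; the clipped gradients are averaged and Gaussian noise
    with standard deviation `noise_multiplier * clip_norm` (divided by
    the batch size after averaging) is added.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down only if the gradient exceeds the clipping bound.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    clipped = np.stack(clipped)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=clipped.shape[1:])
    return clipped.mean(axis=0) + noise / len(per_example_grads)
```

Materializing every per-example gradient, as this sketch does, is exactly the memory and compute overhead the paper targets: the batch of clipped gradients must exist before aggregation, unlike ordinary SGD where gradients are summed on the fly.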



Differentially Private Estimation of Hawkes Process

Point process models are of great importance in real world applications....

Differentially Private Precision Matrix Estimation

In this paper, we study the problem of precision matrix estimation when ...

Private Quantiles Estimation in the Presence of Atoms

We address the differentially private estimation of multiple quantiles (...

Oneshot Differentially Private Top-k Selection

Being able to efficiently and accurately select the top-k elements witho...

DP-TBART: A Transformer-based Autoregressive Model for Differentially Private Tabular Data Generation

The generation of synthetic tabular data that preserves differential pri...

Differentially Private Accelerated Optimization Algorithms

We present two classes of differentially private optimization algorithms...
