Reweighted Proximal Pruning for Large-Scale Language Representation

09/27/2019
by Fu-Ming Guo, et al.

Pre-trained language representations such as BERT have recently become the mainstay of the natural language understanding community, achieving state-of-the-art results on a wide range of downstream tasks. Alongside these continued performance gains, the size and complexity of the pre-trained models keep growing rapidly. Can these large-scale language representation models be compressed? And how does pruning the language representation affect downstream multi-task transfer learning? In this paper, we propose Reweighted Proximal Pruning (RPP), a new pruning method designed specifically for large-scale language representation models. Through experiments on SQuAD and the GLUE benchmark suite, we show that proximally pruned BERT retains high accuracy on both the pre-training task and the multiple downstream fine-tuning tasks at high pruning ratios. RPP offers a new perspective for analyzing what large-scale language representations might learn. It also makes it possible to deploy a large state-of-the-art language representation model such as BERT on a range of devices (e.g., online servers, mobile phones, and edge devices).
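The abstract does not spell out the RPP update itself. As a rough, illustrative sketch only (not the paper's exact algorithm), a reweighted proximal pruning step can be viewed as soft-thresholding applied after an ordinary gradient step, with per-weight thresholds derived from a reweighted L1 penalty that pushes already-small weights toward exactly zero. The function name reweighted_l1_proximal_step and the parameters base_lambda, lr, and eps below are hypothetical names introduced for illustration.

import numpy as np

def reweighted_l1_proximal_step(w, base_lambda, lr, eps=1e-6):
    """Illustrative reweighted-L1 proximal (soft-thresholding) update.

    w           : weight array after an ordinary gradient step
    base_lambda : base strength of the sparsity-inducing penalty
    lr          : learning rate used for the gradient step
    eps         : small constant keeping the reweighting finite
    """
    # Reweighted L1: smaller-magnitude weights receive a larger penalty,
    # so they are driven to exactly zero and can be pruned.
    per_weight_lambda = base_lambda / (np.abs(w) + eps)
    threshold = lr * per_weight_lambda
    # Proximal operator of the weighted L1 norm: soft-thresholding.
    return np.sign(w) * np.maximum(np.abs(w) - threshold, 0.0)

# Toy usage: sparsify a small random weight matrix.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.05, size=(4, 4))
w_pruned = reweighted_l1_proximal_step(w, base_lambda=1e-4, lr=1e-3)
print("fraction of zero weights:", np.mean(w_pruned == 0.0))

In practice such a step would be interleaved with the usual pre-training or fine-tuning updates of the full model; the snippet only shows the proximal thresholding itself on a toy array.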
