Improving Formality Style Transfer with Context-Aware Rule Injection

06/01/2021
by   Zonghai Yao, et al.
UMass Lowell
University of Massachusetts Amherst

Models pre-trained on large-scale regular text corpora often do not work well for user-generated data, whose language style differs significantly from mainstream text. Here we present Context-Aware Rule Injection (CARI), an innovative method for formality style transfer (FST). CARI injects multiple rules into an end-to-end BERT-based encoder-decoder model and learns to select the optimal rules based on context. In our intrinsic evaluation, CARI achieved new state-of-the-art performance on the FST benchmark dataset. In our extrinsic evaluation, CARI substantially improved regular pre-trained models' performance on several tweet sentiment analysis tasks.
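To make the injection step concrete, the sketch below shows one way multiple rule outputs could be concatenated onto the encoder input, so that the model's attention can select among candidates by context rather than applying rules blindly. The regex rules, the [RULE] separator token, and all function names here are illustrative assumptions for this sketch, not the authors' implementation.

```python
import re

# Illustrative informal-to-formal rewrite rules (regex pattern, replacement).
# The real CARI rule set is larger and not necessarily regex-based.
RULES = [
    (r"\bu\b", "you"),           # chat abbreviation
    (r"\bgonna\b", "going to"),  # informal contraction
    (r"\bcuz\b", "because"),     # slang spelling
]

def apply_rule(text: str, pattern: str, repl: str) -> str:
    """Apply one rewrite rule; returns the text unchanged if it doesn't match."""
    return re.sub(pattern, repl, text)

def build_cari_input(informal: str, rules=RULES, sep: str = "[RULE]") -> str:
    """Concatenate the original sentence with every rule-rewritten candidate.

    The encoder sees all candidates at once, so attention can learn which
    rule output fits the surrounding context instead of committing to a
    single rule up front.
    """
    candidates = [apply_rule(informal, pattern, repl) for pattern, repl in rules]
    return informal + "".join(f" {sep} {c}" for c in candidates)

if __name__ == "__main__":
    src = build_cari_input("u gonna regret it cuz it's too late")
    # In the full pipeline, this injected string would be tokenized and fed
    # to a BERT-based encoder-decoder fine-tuned on informal/formal pairs.
    print(src)
```

The design point this sketch illustrates is that rule selection is deferred to the model: because every candidate appears in the encoder input, conflicting or inapplicable rules can simply be ignored by attention, which is what makes the injection context-aware.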


Related research:

09/16/2021 · Transductive Learning for Unsupervised Text Style Transfer
Unsupervised style transfer models are mainly based on an inductive lear...

09/18/2021 · Text Detoxification using Large Pre-trained Neural Models
We present two novel unsupervised methods for eliminating toxicity in te...

05/14/2021 · Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer
Scarcity of parallel data causes formality style transfer models to have...

10/22/2020 · Multi-dimensional Style Transfer for Partially Annotated Data using Language Models as Discriminators
Style transfer has been widely explored in natural language generation w...

09/13/2022 · Exploring Code Style Transfer with Neural Networks
Style is a significant component of natural language text, reflecting a ...

07/25/2019 · Style Conditioned Recommendations
We propose Style Conditioned Recommendations (SCR) and introduce style i...

03/18/2020 · Collaborative Distillation for Ultra-Resolution Universal Style Transfer
Universal style transfer methods typically leverage rich representations...
