Improving Formality Style Transfer with Context-Aware Rule Injection

by Zonghai Yao, et al.
UMass Lowell
University of Massachusetts Amherst

Models pre-trained on large-scale regular text corpora often do not work well for user-generated data, where the language style differs significantly from mainstream text. Here we present Context-Aware Rule Injection (CARI), an innovative method for formality style transfer (FST). CARI injects multiple rules into an end-to-end BERT-based encoder-decoder model and learns to select the optimal rules based on context. Intrinsic evaluation shows that CARI achieves new state-of-the-art performance on the FST benchmark dataset. Extrinsic evaluation shows that CARI substantially improves regular pre-trained models' performance on several tweet sentiment analysis tasks.
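The idea of injecting multiple rules and letting the model select among them by context can be sketched as follows. This is a minimal illustration, not the paper's implementation: the rule set, rule names, and the `[SEP]` separator convention are assumptions made here for demonstration. Each rule produces a candidate rewrite of the informal input, and all candidates are concatenated with the original sentence so the encoder can attend over them and decide, per context, which rewrites to trust.

```python
# Minimal sketch of CARI-style input construction (illustrative only).
# The rules below are hypothetical informal-to-formal rewrites; a real
# system would use a larger, curated rule set.
RULES = {
    "expand_u": lambda s: s.replace(" u ", " you "),
    "expand_gonna": lambda s: s.replace("gonna", "going to"),
}

def build_cari_input(sentence: str, rules=RULES, sep: str = "[SEP]") -> str:
    """Concatenate the original sentence with each rule's output.

    The downstream encoder-decoder sees all candidate rewrites at once
    and learns which rule applications are appropriate in context,
    rather than applying every rule blindly.
    """
    variants = [apply_rule(sentence) for apply_rule in rules.values()]
    return f" {sep} ".join([sentence] + variants)

augmented = build_cari_input("i am gonna tell u later")
print(augmented)
# → i am gonna tell u later [SEP] i am gonna tell you later [SEP] i am going to tell u later
```

The augmented string would then be fed to the encoder in place of the raw input; the decoder generates the formal sentence conditioned on both the original text and all rule outputs.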


