Fine-Tuning Transformers: Vocabulary Transfer

12/29/2021
by Igor Samenko, et al.

Transformers are responsible for the vast majority of recent advances in natural language processing. Most practical applications of these models are enabled through transfer learning. This paper studies whether corpus-specific tokenization used for fine-tuning improves the performance of the resulting model. Through a series of experiments, we demonstrate that such tokenization, combined with an initialization and fine-tuning strategy for the vocabulary tokens, speeds up the transfer and boosts the performance of the fine-tuned model. We call this aspect of transfer facilitation vocabulary transfer.
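
The abstract does not spell out the initialization strategy, but one common vocabulary-transfer scheme is to train a corpus-specific tokenizer and then initialize each new token's embedding from the pretrained embeddings of the old tokenizer's sub-pieces. The sketch below illustrates this idea with Hugging Face `transformers`; the mean-pooling initialization, the `corpus_lines` placeholder, and the vocabulary size are illustrative assumptions, not the paper's exact method.

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Pretrained model and its original tokenizer.
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
old_tok = AutoTokenizer.from_pretrained("bert-base-uncased")

# Corpus-specific tokenizer trained on the downstream corpus
# (corpus_lines is a placeholder for the real fine-tuning data).
corpus_lines = ["domain-specific text ..."]
new_tok = old_tok.train_new_from_iterator(corpus_lines, vocab_size=1000)

old_emb = model.get_input_embeddings().weight.detach().clone()
new_emb = torch.empty(len(new_tok), old_emb.size(1))

for token, new_id in new_tok.get_vocab().items():
    # Re-tokenize each new token with the old tokenizer and average the
    # embeddings of the resulting pieces (mean init is an assumption here).
    piece = token[2:] if token.startswith("##") else token
    old_ids = old_tok.convert_tokens_to_ids(old_tok.tokenize(piece))
    if old_ids:
        new_emb[new_id] = old_emb[old_ids].mean(dim=0)
    else:
        new_emb[new_id] = old_emb.mean(dim=0)  # fallback for empty pieces

# Swap in the new vocabulary and its initialized embeddings,
# then fine-tune on the downstream task as usual.
model.resize_token_embeddings(len(new_tok))
model.get_input_embeddings().weight.data.copy_(new_emb)
```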
