Efficient Transfer Learning Schemes for Personalized Language Modeling using Recurrent Neural Network

by Seunghyun Yoon, et al.
Seoul National University

In this paper, we propose efficient transfer learning methods for training a personalized language model using a recurrent neural network with a long short-term memory architecture. With our proposed fast transfer learning schemes, a general language model is updated into a personalized language model using a small amount of user data and limited computing resources. These methods are especially useful in a mobile device environment, where the data must not be transferred off the device for privacy reasons. Through experiments on dialogue data from a drama, we verify that our transfer learning methods successfully generate a personalized language model whose output is more similar to the personal language style, in both qualitative and quantitative terms.
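The abstract's core idea is to adapt a general language model to one user cheaply, with little data and little compute. A minimal sketch of that idea, assuming (this is an assumption for illustration, not the paper's exact scheme) that adaptation freezes the pretrained recurrent body and updates only the output projection on the user's small corpus:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, hidden, n = 20, 8, 50

# Hypothetical stand-ins: hidden states produced by a frozen, pretrained
# LSTM over the user's small corpus, and the next-word targets.
H = rng.normal(size=(n, hidden))           # frozen features (not updated)
targets = rng.integers(0, vocab, size=n)   # next-word ids

# Only the output layer is personalized -- cheap enough for a mobile device.
W = rng.normal(scale=0.1, size=(hidden, vocab))
b = np.zeros(vocab)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll(W, b):
    """Mean negative log-likelihood of the user data under the model."""
    p = softmax(H @ W + b)
    return -np.log(p[np.arange(n), targets]).mean()

loss_before = nll(W, b)
lr = 0.5
for _ in range(200):
    # Gradient of the cross-entropy w.r.t. logits is (p - one_hot) / n;
    # only W and b receive updates, the recurrent body stays fixed.
    p = softmax(H @ W + b)
    p[np.arange(n), targets] -= 1.0
    g = p / n
    W -= lr * (H.T @ g)
    b -= lr * g.sum(axis=0)
loss_after = nll(W, b)

print(loss_before, loss_after)  # loss on the user data should drop
```

Because only a single linear layer is trained, the update is fast and the user's text never needs to leave the device, which is the constraint the paper targets.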


