ToD-BERT: Pre-trained Natural Language Understanding for Task-Oriented Dialogues

04/15/2020
by Chien-Sheng Wu, et al.

The use of pre-trained language models has emerged as a promising direction for improving dialogue systems. However, the underlying difference in linguistic patterns between conversational data and general text makes existing pre-trained language models less effective in practice. Recently, some pre-training approaches based on open-domain dialogues have been proposed, leveraging large-scale social media data such as Twitter or Reddit. Pre-training for task-oriented dialogues, on the other hand, is rarely discussed because of the long-standing and crucial problem of data scarcity. In this work, we combine nine English-based, human-human, multi-turn, and publicly available task-oriented dialogue datasets to conduct language model pre-training. The experimental results show that our pre-trained task-oriented dialogue BERT (ToD-BERT) surpasses BERT and other strong baselines on four downstream task-oriented dialogue applications: intent detection, dialogue state tracking, dialogue act prediction, and response selection. Moreover, in simulated limited-data experiments, we show that ToD-BERT has a stronger few-shot ability that can mitigate the data scarcity problem in task-oriented dialogues.
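To make the downstream usage concrete, below is a minimal sketch of encoding a multi-turn dialogue with ToD-BERT through the Hugging Face transformers library. The checkpoint id TODBERT/TOD-BERT-JNT-V1 and the [SYS]/[USR] speaker-token convention follow the authors' public release; treat both as assumptions here, since this page does not specify them, and substitute whatever checkpoint you actually use.

import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "TODBERT/TOD-BERT-JNT-V1"  # assumed checkpoint id from the authors' release

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

# Flatten a multi-turn dialogue into one sequence, prefixing each turn with a
# speaker token ([SYS]/[USR]); this mirrors ToD-BERT's pre-training input format
# as described in the authors' release (an assumption, not stated on this page).
dialogue = [
    ("[SYS]", "hello, how can i help you today?"),
    ("[USR]", "i am looking for a cheap hotel in the centre."),
]
flat = " ".join(f"{speaker} {utterance}" for speaker, utterance in dialogue)

inputs = tokenizer(flat, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The [CLS] hidden state serves as a dialogue-level representation that a small
# task head can consume, e.g. a linear classifier for intent detection or a
# dot-product scorer against candidate response vectors for response selection.
dialogue_vec = outputs.last_hidden_state[:, 0, :]
print(dialogue_vec.shape)  # e.g. torch.Size([1, 768])

The same encoder output underlies all four downstream tasks in the paper; only the lightweight head on top of the [CLS] (or token-level) representations changes per task.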

Related research

06/17/2023 · FutureTOD: Teaching Future Knowledge to Pre-trained Language Model for Task-Oriented Dialogue
Pre-trained language models based on general text enable huge success in...

09/28/2020 · DialoGLUE: A Natural Language Understanding Benchmark for Task-Oriented Dialogue
A long-standing goal of task-oriented dialogue research is the ability t...

09/29/2021 · Multi-Task Pre-Training for Plug-and-Play Task-Oriented Dialogue System
Pre-trained language models have been recently shown to benefit task-ori...

04/04/2020 · Pre-Trained and Attention-Based Neural Networks for Building Noetic Task-Oriented Dialogue Systems
The NOESIS II challenge, as the Track 2 of the 8th Dialogue System Techn...

05/27/2021 · Leveraging Linguistic Coordination in Reranking N-Best Candidates For End-to-End Response Selection Using BERT
Retrieval-based dialogue systems select the best response from many cand...

10/08/2022 · On Task-Adaptive Pretraining for Dialogue Response Selection
Recent advancements in dialogue response selection (DRS) are based on th...

09/20/2021 · PLATO-XL: Exploring the Large-scale Pre-training of Dialogue Generation
To explore the limit of dialogue generation pre-training, we present the...
