Supervised Domain Enablement Attention for Personalized Domain Classification

12/18/2018
by   Joo-Kyung Kim, et al.

In large-scale domain classification for natural language understanding, leveraging each user's domain enablement information, i.e., the set of domains the user has enabled or authenticated, through an attention mechanism has been shown to improve overall domain classification performance. In this paper, we propose a supervised enablement attention mechanism that uses sigmoid activation for the attention weighting, so the attention can be computed with more expressive power, free of the sum-to-one constraint of softmax attention. The attention weights are explicitly encouraged, via supervised attention, to match the corresponding elements of the ground truth's one-hot vector, and the attention information of the other enabled domains is leveraged through self-distillation. Evaluating on actual utterances from a large-scale intelligent personal digital assistant (IPDA), we show that our approach significantly improves domain classification performance.
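The core idea of the abstract can be sketched in a few lines: sigmoid attention assigns each enabled domain an independent weight in (0, 1) rather than a softmax distribution, and a supervised attention loss pushes those weights toward the ground-truth domain's one-hot vector. The sketch below is a minimal NumPy illustration under assumed shapes; the function names and the use of per-domain binary cross-entropy are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_enablement_attention(utterance_vec, enabled_domain_embs):
    """Sigmoid attention over a user's enabled domains.

    Each enabled domain gets an independent weight in (0, 1):
    unlike softmax attention, the weights need not sum to one.
    Shapes (assumed): utterance_vec (d,), enabled_domain_embs (n, d).
    """
    scores = enabled_domain_embs @ utterance_vec   # one score per enabled domain
    weights = sigmoid(scores)                      # independent weights in (0, 1)
    summary = weights @ enabled_domain_embs        # weighted sum of domain embeddings
    return weights, summary

def supervised_attention_loss(weights, target_one_hot, eps=1e-9):
    """Per-domain binary cross-entropy encouraging the attention
    weights to match the ground-truth domain's one-hot vector."""
    w = np.clip(weights, eps, 1.0 - eps)
    return -np.mean(target_one_hot * np.log(w)
                    + (1.0 - target_one_hot) * np.log(1.0 - w))
```

Because each weight is an independent sigmoid, the supervised signal can drive the ground-truth domain's weight toward 1 and the others toward 0 without the weights competing through a shared normalizer.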


Related research

04/22/2018
Efficient Large-Scale Domain Classification with Personalized Attention
In this paper, we explore the task of mapping spoken language utterances...

06/29/2018
Joint Learning of Domain Classification and Out-of-Domain Detection with Dynamic Class Weighting for Satisficing False Acceptance Rates
In domain classification for spoken dialog systems, correct detection of...

03/08/2020
Pseudo Labeling and Negative Feedback Learning for Large-scale Multi-label Domain Classification
In large-scale domain classification, an utterance can be handled by mul...

04/22/2018
A Scalable Neural Shortlisting-Reranking Approach for Large-Scale Domain Classification in Natural Language Understanding
Intelligent personal digital assistants (IPDAs), a popular real-life app...

05/02/2019
Continuous Learning for Large-scale Personalized Domain Classification
Domain classification is the task of mapping spoken language utterances ...

09/19/2016
A Cheap Linear Attention Mechanism with Fast Lookups and Fixed-Size Representations
The softmax content-based attention mechanism has proven to be very bene...

08/30/2023
Denoising Attention for Query-aware User Modeling in Personalized Search
The personalization of search results has gained increasing attention in...
