Knowledge as a Teacher: Knowledge-Guided Structural Attention Networks

09/12/2016
by   Yun-Nung Chen, et al.

Natural language understanding (NLU) is a core component of a spoken dialogue system. Recently, recurrent neural networks (RNNs) have obtained strong results on NLU due to their superior ability to preserve sequential information over time. Traditionally, the NLU module tags semantic slots for utterances as if they had flat structures, because the underlying RNN structure is a linear chain. However, natural language exhibits linguistic properties that provide rich, structured information for better understanding. This paper introduces a novel model, knowledge-guided structural attention networks (K-SAN), a generalization of RNNs that additionally incorporates non-flat network topologies guided by prior knowledge. The model has two key characteristics: 1) important substructures can be captured from small training data, allowing the model to generalize to previously unseen test data; 2) the model automatically identifies the salient substructures that are essential for predicting the semantic tags of the given sentences, thereby improving understanding performance. Experiments on the benchmark Air Travel Information System (ATIS) data show that the proposed K-SAN architecture can effectively extract salient knowledge from substructures with an attention mechanism, and outperforms state-of-the-art neural network based frameworks.
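To make the attention idea concrete, the following is a minimal sketch of attention over knowledge-guided substructure representations (e.g. embeddings of dependency subtrees). It is an illustrative assumption, not the paper's exact formulation: dot-product scoring is used, and all function and variable names are hypothetical.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def knowledge_guided_attention(sentence_vec, substructure_vecs):
    """Score each substructure vector against the sentence representation
    and return the attention-weighted knowledge vector plus the weights.

    sentence_vec:      shape (d,)   -- utterance encoding (e.g. from an RNN)
    substructure_vecs: shape (K, d) -- one embedding per knowledge substructure
    """
    scores = substructure_vecs @ sentence_vec   # (K,) dot-product relevance
    weights = softmax(scores)                   # attention distribution over K
    knowledge_vec = weights @ substructure_vecs # (d,) weighted summary
    return knowledge_vec, weights

# Toy example: 3 substructures in a 4-dimensional embedding space.
rng = np.random.default_rng(0)
subs = rng.normal(size=(3, 4))
sent = rng.normal(size=4)
knowledge_vec, attn = knowledge_guided_attention(sent, subs)
```

In the full model, a vector like `knowledge_vec` would be combined with the RNN's hidden states before slot tagging, letting the tagger attend to whichever substructures are most salient for the current utterance.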
