Joint Embedding of Words and Labels for Text Classification

05/10/2018
by Guoyin Wang, et al.

Word embeddings are effective intermediate representations for capturing semantic regularities between words when learning the representations of text sequences. We propose to view text classification as a label-word joint embedding problem: each label is embedded in the same space as the word vectors. We introduce an attention framework that measures the compatibility of embeddings between text sequences and labels. The attention is learned on a training set of labeled samples to ensure that, given a text sequence, relevant words are weighted more highly than irrelevant ones. Our method maintains the interpretability of word embeddings and enjoys a built-in ability to leverage alternative sources of information in addition to input text sequences. Extensive results on several large text datasets show that the proposed framework outperforms state-of-the-art methods by a large margin, in terms of both accuracy and speed.
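To make the compatibility idea concrete, here is a minimal PyTorch sketch of one way such a label-word attention could look. All names are hypothetical and the model is deliberately simplified relative to the full method described in the paper: words and labels share one embedding space, each word is scored by its best cosine match against the label embeddings, and the resulting attention weights pool the word vectors into a single text representation for classification.

```python
# Minimal sketch of label-word joint embedding attention (hypothetical
# names; simplified relative to the paper's full architecture).
import torch
import torch.nn.functional as F


class LabelWordAttention(torch.nn.Module):
    def __init__(self, vocab_size: int, num_labels: int, dim: int):
        super().__init__()
        # Words and labels are embedded in the same dim-dimensional space.
        self.word_emb = torch.nn.Embedding(vocab_size, dim)
        self.label_emb = torch.nn.Embedding(num_labels, dim)
        self.classifier = torch.nn.Linear(dim, num_labels)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        v = self.word_emb(token_ids)                           # (B, T, d) word vectors
        c = self.label_emb.weight                              # (K, d) label vectors
        # Cosine compatibility between every word and every label.
        g = F.normalize(v, dim=-1) @ F.normalize(c, dim=-1).T  # (B, T, K)
        # A word is relevant if it matches some label strongly.
        scores = g.max(dim=-1).values                          # (B, T)
        beta = torch.softmax(scores, dim=-1)                   # attention over words
        z = (beta.unsqueeze(-1) * v).sum(dim=1)                # (B, d) attended text vector
        return self.classifier(z)                              # (B, K) label logits


model = LabelWordAttention(vocab_size=10_000, num_labels=5, dim=64)
logits = model(torch.randint(0, 10_000, (2, 12)))  # batch of two 12-token sequences
print(logits.shape)  # torch.Size([2, 5])
```

Because the labels live in the same space as the words, the attention weights `beta` stay interpretable: the highly weighted tokens are exactly those whose embeddings lie closest to some class embedding.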

Related research

- 04/03/2018 · Incorporating Word Embeddings into Open Directory Project based Large-scale Classification
  Recently, implicit representation models, such as embedding or deep lear...

- 08/29/2018 · Improved Semantic-Aware Network Embedding with Fine-Grained Word Alignment
  Network embeddings, which learn low-dimensional representations for each...

- 12/12/2016 · FastText.zip: Compressing text classification models
  We consider the problem of producing compact architectures for text clas...

- 06/03/2020 · Exploiting Class Labels to Boost Performance on Embedding-based Text Classification
  Text classification is one of the most frequent tasks for processing tex...

- 09/13/2021 · Embedding Convolutions for Short Text Extreme Classification with Millions of Labels
  Automatic annotation of short-text data to a large number of target labe...

- 05/18/2020 · Text Classification with Few Examples using Controlled Generalization
  Training data for text classification is often limited in practice, espe...

- 12/27/2019 · Encoding word order in complex embeddings
  Sequential word order is important when processing text. Currently, neur...
