CatE: Category-Name Guided Word Embedding

08/20/2019
by Yu Meng, et al.

Unsupervised word embedding has benefited a wide spectrum of NLP tasks due to its effectiveness in encoding word semantics in distributed word representations. However, unsupervised word embeddings are generic representations, not optimized for specific tasks. In this work, we propose a weakly-supervised word embedding framework, CatE. It uses category names to guide word embedding and effectively selects category representative words to regularize the embedding space so that the categories are well separated. Experiments show that our model significantly outperforms unsupervised word embedding models on both document classification and category representative word retrieval tasks.
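To make the category representative word retrieval task concrete, the sketch below (not the authors' implementation) shows the basic idea: when category names and words live in the same embedding space, each category's representative words can be ranked by cosine similarity to the category-name vector. The toy vocabulary, random placeholder vectors, and the helper name `top_representative_words` are illustrative assumptions; in CatE the embeddings themselves would be the category-guided vectors learned by the model.

```python
# Minimal sketch of category representative word retrieval (assumptions noted above).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vocabulary with 50-d embeddings; random placeholders stand in
# for the category-guided embeddings a trained model would produce.
vocab = ["game", "player", "election", "senate", "movie", "actor", "ballot", "coach"]
word_vecs = rng.normal(size=(len(vocab), 50))

# Category names are embedded in the same space as the words.
categories = {"sports": rng.normal(size=50), "politics": rng.normal(size=50)}

def top_representative_words(cat_vec, word_vecs, vocab, k=3):
    """Return the k words whose embeddings are most cosine-similar to the category vector."""
    w = word_vecs / np.linalg.norm(word_vecs, axis=1, keepdims=True)
    c = cat_vec / np.linalg.norm(cat_vec)
    scores = w @ c
    top = np.argsort(-scores)[:k]
    return [(vocab[i], float(scores[i])) for i in top]

for name, vec in categories.items():
    print(name, top_representative_words(vec, word_vecs, vocab))
```

With real category-guided embeddings, the retrieved words for each category would be its most discriminative terms, which is also how a well-separated embedding space can be inspected qualitatively.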



Related research:

10/24/2022 · Subspace-based Set Operations on a Pre-trained Word Embedding Space
Word embedding is a fundamental technology in natural language processin...

09/21/2023 · Word Embedding with Neural Probabilistic Prior
To improve word representation learning, we propose a probabilistic prio...

09/02/2020 · On SkipGram Word Embedding Models with Negative Sampling: Unified Framework and Impact of Noise Distributions
SkipGram word embedding models with negative sampling, or SGN in short, ...

01/16/2020 · Document Network Projection in Pretrained Word Embedding Space
We present Regularized Linear Embedding (RLE), a novel method that proje...

02/24/2017 · Consistent Alignment of Word Embedding Models
Word embedding models offer continuous vector representations that can c...

11/06/2019 · Word Embedding Algorithms as Generalized Low Rank Models and their Canonical Form
Word embedding algorithms produce very reliable feature representations ...

11/05/2015 · Comparing Writing Styles using Word Embedding and Dynamic Time Warping
The development of plot or story in novels is reflected in the content a...
