Toward Incorporation of Relevant Documents in word2vec

07/20/2017
by Navid Rekabsaz, et al.

Recent advances in neural word embedding provide significant benefits to various information retrieval tasks. However, as recent studies have shown, adapting embedding models to the needs of IR tasks can bring considerable further improvements. Embedding models generally define term relatedness by exploiting the terms' co-occurrences in short-window contexts. An alternative (and well-studied) approach in IR for finding terms related to a query uses local information, i.e., a set of top-retrieved documents. Considering these two views of term relatedness, in this work we report our study on incorporating the local information of the query into word embeddings. One main challenge in this direction is that the dense vectors of word embeddings, and their estimates of term-to-term relatedness, remain difficult to interpret and analyze. As an alternative, explicit word representations provide vectors whose dimensions are easily interpretable, and recent methods show performance competitive with dense vectors. We introduce a neural-based explicit representation, rooted in the conceptual ideas of the word2vec Skip-Gram model. The method provides interpretable explicit vectors while retaining the effectiveness of the Skip-Gram model. Evaluation of various explicit representations on word association collections shows that the newly proposed method outperforms state-of-the-art explicit representations when tasked with ranking highly similar terms. Based on the introduced explicit representation, we describe our approaches to integrating local documents into globally trained embedding models and present preliminary results.
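As a rough illustration of the idea behind such a neural-based explicit representation (a sketch, not the authors' code), the snippet below derives an interpretable vector from trained Skip-Gram parameters: dimension j of a term's explicit vector scores how strongly the model predicts vocabulary word j in that term's context, via the sigmoid used in the negative-sampling objective. The names `explicit_vector`, `W_in`, and `W_out` are hypothetical, standing in for a trained model's input and output embedding tables (in gensim these roughly correspond to `model.wv.vectors` and `model.syn1neg`).

```python
import numpy as np

def explicit_vector(term, vocab, W_in, W_out, top_k=10):
    """Interpretable explicit vector for `term` from Skip-Gram weights.

    Dimension j is sigmoid(W_in[term] . W_out[j]): the model's estimate
    that vocabulary word j appears in the context of `term`, mirroring
    negative-sampling training. Hypothetical sketch, not the paper's
    exact formulation.
    """
    scores = W_out @ W_in[vocab[term]]        # one score per vocab word
    explicit = 1.0 / (1.0 + np.exp(-scores))  # sigmoid -> values in (0, 1)
    top = np.argsort(explicit)[::-1][:top_k]  # most associated dimensions
    return explicit, top
```

Because every dimension corresponds to a vocabulary word, the top-scoring dimensions of such a vector can be read off directly as the term's most related words, which is what makes the representation easy to inspect and, in principle, to re-weight with the query-local (top-retrieved) documents the abstract discusses.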


Related research

12/13/2018 · An Unbiased Approach to Quantification of Gender Inclination using Interpretable Word Representations
Recent advances in word embedding provide significant benefit to various...

05/09/2017 · Relevance-based Word Embedding
Learning a high-dimensional dense representation for vocabulary terms, a...

09/04/2019 · Affect Enriched Word Embeddings for News Information Retrieval
Distributed representations of words have shown to be useful to improve ...

05/25/2016 · Dimension Projection among Languages based on Pseudo-relevant Documents for Query Translation
Using top-ranked documents in response to a query has been shown to be a...

01/07/2019 · Vector representations of text data in deep learning
In this dissertation we report results of our research on dense distribu...

08/14/2017 · Improved Answer Selection with Pre-Trained Word Embeddings
This paper evaluates existing and newly proposed answer selection method...

09/27/2018 · Consistency and Variation in Kernel Neural Ranking Model
This paper studies the consistency of the kernel-based neural ranking mo...
