Evaluating the Utility of Document Embedding Vector Difference for Relation Learning

07/18/2019
by Jingyuan Zhang, et al.

Recent work has demonstrated that vector offsets obtained by subtracting pretrained word embedding vectors can be used to predict lexical relations with surprising accuracy. Inspired by this finding, we extend the idea to the document level in this paper: we generate document-level embeddings, calculate the distance between them, and use a linear classifier to classify the relation between the documents. In the context of duplicate detection and dialogue act tagging tasks, we show that document-level difference vectors have utility in assessing document-level similarity, but perform less well in multi-relational classification.
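The pipeline described in the abstract can be illustrated with a minimal sketch. The following example assumes averaged pretrained word vectors as the document embedding and a logistic-regression model as the linear classifier; the helper names (embed, difference_features, train_relation_classifier) are illustrative and not taken from the paper, whose actual embedding methods and classifier settings may differ.

```python
# Minimal sketch: classify the relation between two documents from the
# difference of their embedding vectors.
import numpy as np
from sklearn.linear_model import LogisticRegression


def embed(doc_tokens, word_vectors, dim=300):
    """Document embedding as the mean of pretrained word vectors.

    word_vectors: dict mapping token -> np.ndarray of shape (dim,),
    e.g. loaded from GloVe or word2vec elsewhere.
    """
    vecs = [word_vectors[t] for t in doc_tokens if t in word_vectors]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)


def difference_features(doc_pairs, word_vectors):
    """One feature vector per document pair: embedding(a) - embedding(b)."""
    return np.stack([embed(a, word_vectors) - embed(b, word_vectors)
                     for a, b in doc_pairs])


def train_relation_classifier(doc_pairs, labels, word_vectors):
    """Fit a linear classifier on document-level difference vectors.

    doc_pairs: list of (tokens_a, tokens_b); labels: relation label per pair.
    """
    X = difference_features(doc_pairs, word_vectors)
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, labels)
    return clf
```

In this sketch, the difference vector plays the same role as the word-level vector offset in the lexical-relation work the paper builds on; the classifier sees only the offset, so any relational signal must be encoded in the direction and magnitude of that difference.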
