Learning sentence embeddings using Recursive Networks

05/22/2018
by Anson Bastos, et al.

Learning sentence vectors that generalise well is a challenging task. In this paper we compare three methods of learning phrase embeddings: 1) using LSTMs, 2) using recursive networks, and 3) a variant of method 2 that uses the part-of-speech (POS) information of the phrase. We train our models on dictionary definitions of words to obtain a reverse-dictionary application similar to Felix et al. [1]. To see whether our embeddings transfer to a new task, we also train and test on the Rotten Tomatoes dataset [2]. We train both with the sentence embeddings kept fixed and with fine-tuning.
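As a rough illustration of method 2, the sketch below (not the authors' implementation; the dimensions, toy parse tree, and vocabulary are all hypothetical assumptions) composes word vectors bottom-up over a binary parse tree with a single learned composition layer, then performs a reverse-dictionary lookup by cosine similarity between the definition's root embedding and candidate word embeddings.

```python
# Illustrative sketch only, not the paper's code: a recursive network
# composes word embeddings bottom-up over a binary parse tree; the root
# vector serves as the phrase/definition embedding.
import torch
import torch.nn as nn

class RecursiveCell(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.compose = nn.Linear(2 * dim, dim)  # merge two child vectors

    def forward(self, left, right):
        return torch.tanh(self.compose(torch.cat([left, right], dim=-1)))

def embed_tree(tree, word_vecs, cell):
    """tree is either a word (str) or a (left, right) pair of subtrees."""
    if isinstance(tree, str):
        return word_vecs[tree]
    left, right = tree
    return cell(embed_tree(left, word_vecs, cell),
                embed_tree(right, word_vecs, cell))

# Toy reverse-dictionary lookup: embed a definition, then return the
# vocabulary word whose embedding is nearest by cosine similarity.
# (Random vectors stand in for trained embeddings.)
dim = 8
vocab = ["feline", "canine", "animal", "pet", "small"]
word_vecs = {w: torch.randn(dim) for w in vocab + ["a", "domesticated"]}
cell = RecursiveCell(dim)

definition = (("a", "small"), ("domesticated", "animal"))  # toy parse
query = embed_tree(definition, word_vecs, cell)

scores = {w: torch.cosine_similarity(query, word_vecs[w], dim=0).item()
          for w in vocab}
print(max(scores, key=scores.get))  # best reverse-dictionary candidate
```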


Related research

11/03/2017
Fine-tuning Tree-LSTM for phrase-level sentiment classification on a Polish dependency treebank. Submission to PolEval task 2
We describe a variant of Child-Sum Tree-LSTM deep neural network (Tai et...

05/31/2016
Implementing a Reverse Dictionary, based on word definitions, using a Node-Graph Architecture
In this paper, we outline an approach to build graph-based reverse dicti...

07/06/2023
Efficient Domain Adaptation of Sentence Embeddings using Adapters
Sentence embeddings enable us to capture the semantic similarity of shor...

04/30/2017
Revisiting Recurrent Networks for Paraphrastic Sentence Embeddings
We consider the problem of learning general-purpose, paraphrastic senten...

08/24/2017
A dependency look at the reality of constituency
A comment on "Neurophysiological dynamics of phrase-structure building d...

06/10/2016
Unsupervised Learning of Word-Sequence Representations from Scratch via Convolutional Tensor Decomposition
Unsupervised text embeddings extraction is crucial for text understandin...

10/30/2022
Actionable Phrase Detection using NLP
Actionable sentences are terms that, in the most basic sense, imply the ...
