Rethinking Skip-thought: A Neighborhood based Approach

06/09/2017
by Shuai Tang, et al.

We study the skip-thought model with neighborhood information as weak supervision. More specifically, we propose a skip-thought neighbor model that treats the adjacent sentences as a neighborhood. We train our skip-thought neighbor model on a large corpus of continuous sentences, and then evaluate the trained model on 7 tasks, which include semantic relatedness, paraphrase detection, and classification benchmarks. Both quantitative comparison and qualitative investigation are conducted. We empirically show that our skip-thought neighbor model performs as well as the skip-thought model on the evaluation tasks. In addition, we found that incorporating an autoencoder path into our model did not improve its performance, while it hurt the performance of the skip-thought model.
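The abstract does not spell out the training objective, but the core idea of decoding the adjacent sentences from an encoding of the center sentence can be sketched as follows. This is a minimal illustrative sketch only, assuming a GRU encoder and a single decoder shared across all sentences in the neighborhood; the class name, hyperparameters, and toy data are assumptions made for illustration, not the authors' implementation.

```python
# Illustrative sketch of a skip-thought-style neighbor objective.
# Assumptions not taken from the abstract: GRU encoder, one decoder shared
# across all neighboring sentences, teacher forcing, toy vocabulary sizes.
import torch
import torch.nn as nn


class SkipThoughtNeighbor(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=300, hid_dim=600):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # One decoder reused for every sentence in the neighborhood.
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, center, neighbors):
        # center: (batch, T) token ids of the center sentence
        # neighbors: list of (batch, T) token ids of adjacent sentences
        _, h = self.encoder(self.embed(center))  # h: (1, batch, hid_dim)
        loss = 0.0
        for nb in neighbors:
            # Condition the decoder on the encoder state and predict
            # each token of the neighboring sentence (teacher forcing).
            dec_in = self.embed(nb[:, :-1])
            dec_out, _ = self.decoder(dec_in, h)
            logits = self.out(dec_out)           # (batch, T-1, vocab)
            loss = loss + nn.functional.cross_entropy(
                logits.reshape(-1, logits.size(-1)),
                nb[:, 1:].reshape(-1),
                ignore_index=0,
            )
        return loss / len(neighbors)


# Toy usage with random token ids.
model = SkipThoughtNeighbor()
center = torch.randint(1, 10000, (8, 12))
neighbors = [torch.randint(1, 10000, (8, 12)) for _ in range(2)]
loss = model(center, neighbors)
loss.backward()
```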

Related research

06/09/2017  Trimming and Improving Skip-thought Vectors
The skip-thought model has been proven to be effective at learning sente...

11/19/2015  Skip-Thought Memory Networks
Question Answering (QA) is fundamental to natural language processing in...

06/22/2015  Skip-Thought Vectors
We describe an approach for unsupervised learning of a generic, distribu...

11/19/2015  Multi-task Sequence to Sequence Learning
Sequence to sequence learning has recently emerged as a new paradigm in ...

12/22/2000  Creativity and Delusions: A Neurocomputational Approach
Thinking is one of the most interesting mental processes. Its complexity...

07/26/2021  Thought Flow Nets: From Single Predictions to Trains of Model Thought
When humans solve complex problems, they rarely come up with a decision ...

04/30/2020  A more secure IPv6 neighborhood process
The process of neighborhood establishment in an IPv6 network is made out...
