Long Short-Term Memory-Networks for Machine Reading

01/25/2016
by Jianpeng Cheng, et al.

In this paper we address the question of how to render sequence-level networks better at handling structured input. We propose a machine reading simulator which processes text incrementally from left to right and performs shallow reasoning with memory and attention. The reader extends the Long Short-Term Memory architecture with a memory network in place of a single memory cell. This enables adaptive memory usage during recurrence with neural attention, offering a way to weakly induce relations among tokens. The system is initially designed to process a single sequence but we also demonstrate how to integrate it with an encoder-decoder architecture. Experiments on language modeling, sentiment analysis, and natural language inference show that our model matches or outperforms the state of the art.
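The core idea of the abstract — replacing the LSTM's single memory cell with a tape of past states that is read via neural attention at every step — can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch rendering of such an intra-attention recurrent cell (the class name `LSTMNCell` and all parameter names are illustrative assumptions, not the authors' released code):

```python
# Minimal sketch (not the authors' implementation) of an LSTMN-style cell:
# instead of conditioning on only the previous (h, c) pair, each step attends
# over the full tape of past hidden and cell states.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LSTMNCell(nn.Module):  # hypothetical name
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # attention scoring over the tape of past hidden states
        self.attn_h = nn.Linear(hidden_size, hidden_size, bias=False)
        self.attn_x = nn.Linear(input_size, hidden_size, bias=False)
        self.attn_htilde = nn.Linear(hidden_size, hidden_size, bias=False)
        self.attn_v = nn.Linear(hidden_size, 1, bias=False)
        # standard LSTM gates, driven by the attentively summarized state
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)

    def forward(self, x_t, h_tape, c_tape, h_tilde_prev):
        # x_t: (batch, input_size); h_tape, c_tape: (t, batch, hidden_size)
        scores = self.attn_v(torch.tanh(
            self.attn_h(h_tape) + self.attn_x(x_t) + self.attn_htilde(h_tilde_prev)
        ))                                  # (t, batch, 1)
        alpha = F.softmax(scores, dim=0)    # attention weights over past tokens
        h_tilde = (alpha * h_tape).sum(0)   # adaptive hidden-state summary
        c_tilde = (alpha * c_tape).sum(0)   # adaptive memory summary
        i, f, o, g = self.gates(torch.cat([x_t, h_tilde], dim=-1)).chunk(4, dim=-1)
        c_t = torch.sigmoid(f) * c_tilde + torch.sigmoid(i) * torch.tanh(g)
        h_t = torch.sigmoid(o) * torch.tanh(c_t)
        return h_t, c_t, h_tilde
```

In this reading, the attention weights over the tape are what "weakly induce relations among tokens": each new token softly selects which earlier tokens' memories to carry forward, rather than compressing everything into one fixed-size cell.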


