The Lifted Matrix-Space Model for Semantic Composition

by WooJin Chung et al.

Recent advances in tree-structured sentence encoding models have shown that explicitly modeling syntax can help handle compositionality. More specifically, recent work by Socher et al. (2012; 2013) and Chen et al. (2013) has shown that using more powerful composition functions with multiplicative interactions within tree-structured models can yield significant improvements in model performance. However, existing compositional approaches that use these multiplicative interactions usually have to learn task-specific matrix-shaped word embeddings or rely on third-order tensors, which can be very costly. This paper introduces the Lifted Matrix-Space model, which improves on these predecessors in this respect. The model learns a single global transformation from pre-trained word embeddings into matrices, which can then be composed via matrix multiplication. The upshot is that we can capture the multiplicative interaction without learning matrix-valued word representations from scratch. In addition, our composition function effectively transmits a larger number of activations across layers with comparably few model parameters. We evaluate our model on the Stanford NLI corpus and the Multi-Genre NLI corpus and find that the Lifted Matrix-Space model outperforms tree-structured long short-term memory networks.
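The core mechanism described in the abstract can be sketched in a few lines: a single global linear map "lifts" each pre-trained word vector into a matrix, and constituents are then composed by matrix multiplication. The sketch below is a minimal illustration, not the paper's full model; the dimensions, the purely linear lift, and the random stand-in embeddings are all assumptions for demonstration (the actual model adds nonlinearities and LSTM-style components).

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 300, 10  # illustrative sizes: d-dim embeddings lifted to n x n matrices

# Hypothetical learned "lift" parameters: one global linear map shared by
# all words, so no matrix-valued embedding is learned per word.
W_lift = rng.normal(scale=0.01, size=(n * n, d))
b_lift = np.zeros(n * n)

def lift(v):
    """Map a pre-trained word embedding (d,) to an n x n matrix."""
    return (W_lift @ v + b_lift).reshape(n, n)

def compose(A, B):
    """Compose two constituent matrices via matrix multiplication,
    giving a multiplicative (and order-sensitive) interaction."""
    return A @ B

# Random stand-ins for two pre-trained word vectors
v_old, v_dog = rng.normal(size=d), rng.normal(size=d)

phrase = compose(lift(v_old), lift(v_dog))
print(phrase.shape)  # the phrase representation is again an n x n matrix
```

Because matrix multiplication is non-commutative, `compose(lift(v_old), lift(v_dog))` differs from the reversed order, which is one way word order can influence the composed representation.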




