Cats climb entails mammals move: preserving hyponymy in compositional distributional semantics

05/28/2020
by Gemma De las Cuevas, et al.

To give vector-based representations of meaning more structure, one approach is to use positive semidefinite (psd) matrices. These allow us to model similarity of words as well as the hyponymy or is-a relationship. Psd matrices can be learnt relatively easily in a given vector space M ⊗ M^*, but to compose words to form phrases and sentences, we need representations in larger spaces. In this paper, we introduce a generic way of composing the psd matrices corresponding to words. We propose that psd matrices for verbs, adjectives, and other functional words be lifted to completely positive (CP) maps that match their grammatical type. This lifting is carried out by our composition rule called Compression, Compr. In contrast to previous composition rules like Fuzz and Phaser (a.k.a. KMult and BMult), Compr preserves hyponymy. Mathematically, Compr is itself a CP map, and is therefore linear and generally non-commutative. We give a number of proposals for the structure of Compr, based on spiders, cups and caps, and generate a range of composition rules. We test these rules on a small sentence entailment dataset, and see some improvements over the performance of Fuzz and Phaser.
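
The following is a minimal numpy sketch of the ingredients described in the abstract, not the paper's implementation of Compr: word meanings as psd matrices, hyponymy as the Loewner order (A is-a B when B - A is psd), and a verb modelled as a CP map in Kraus form, which preserves both positivity and the ordering. All matrices, dimensions, and function names below are illustrative assumptions.

```python
# Minimal illustration (not the paper's Compr): psd word matrices,
# hyponymy as the Loewner order, and composition by a completely
# positive (CP) map written in Kraus form. All names, dimensions,
# and operators below are toy assumptions.
import numpy as np

def random_psd(dim, rank, rng):
    """Random psd matrix of the given rank, e.g. a word built as a
    sum of outer products of context vectors."""
    v = rng.standard_normal((rank, dim))
    return v.T @ v

def is_psd(a, tol=1e-9):
    """Positive semidefiniteness check via the eigenvalue spectrum."""
    return bool(np.all(np.linalg.eigvalsh((a + a.T) / 2) >= -tol))

def hyponym_of(a, b, tol=1e-9):
    """Loewner-order test: 'a is-a b' when b - a is psd."""
    return is_psd(b - a, tol)

def apply_cp_map(kraus_ops, rho):
    """Apply a CP map in Kraus form: rho -> sum_i K_i rho K_i^T.
    CP maps send psd matrices to psd matrices, so composing with them
    keeps sentence representations psd."""
    return sum(k @ rho @ k.T for k in kraus_ops)

rng = np.random.default_rng(0)
dim = 4

# Toy nouns: force 'cats' below 'mammals' in the Loewner order.
mammals = random_psd(dim, rank=3, rng=rng)
cats = 0.5 * mammals
print(hyponym_of(cats, mammals))   # True: cats is-a mammals

# A toy verb as a CP map, applied to both nouns to show that the
# ordering survives composition:
# Phi(mammals) - Phi(cats) = Phi(mammals - cats), which is again psd.
kraus = [rng.standard_normal((dim, dim)) for _ in range(2)]
print(hyponym_of(apply_cp_map(kraus, cats),
                 apply_cp_map(kraus, mammals)))   # True
```

The only point illustrated here is that CP maps are monotone for the Loewner order; the paper's Compr additionally constructs the CP map from the verb's own psd matrix so that its grammatical type is respected.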


Related research

12/14/2015 · Sentence Entailment in Compositional Distributional Semantics
Distributional semantic models provide vector representations for words ...

06/22/2015 · Distributional Sentence Entailment Using Density Matrices
Categorical compositional distributional model of Coecke et al. (2010) s...

05/11/2020 · Towards logical negation for compositional distributional semantics
The categorical compositional distributional model of meaning gives the ...

08/26/2021 · The cone of 5 × 5 completely positive matrices
We study the cone of completely positive (cp) matrices for the first int...

07/11/2019 · No Word is an Island -- A Transformation Weighting Model for Semantic Composition
Composition models of distributional semantics are used to construct phr...

06/08/2016 · Learning Semantically and Additively Compositional Distributional Representations
This paper connects a vector-based composition model to a formal semanti...

10/14/2016 · Distributional Inclusion Hypothesis for Tensor-based Composition
According to the distributional inclusion hypothesis, entailment between...
