Compositional Distributional Semantics with Compact Closed Categories and Frobenius Algebras

05/01/2015
by Dimitri Kartsaklis, et al.

This thesis contributes to ongoing research on the categorical compositional model for natural language of Coecke, Sadrzadeh and Clark in three ways. Firstly, I propose a concrete instantiation of the abstract framework based on Frobenius algebras (joint work with Sadrzadeh). The theory addresses shortcomings of previous proposals, extends the coverage of the language, and is supported by experimental work that improves on existing results. The proposed framework describes a new class of compositional models that provide intuitive interpretations for a number of linguistic phenomena. Secondly, I propose and evaluate in practice a new compositional methodology that explicitly deals with the different levels of lexical ambiguity (joint work with Pulman). A concrete algorithm is presented, based on separating vector disambiguation from composition in an explicit prior step. Extensive experimental work shows that the proposed methodology indeed produces more accurate composite representations, for the framework of Coecke et al. in particular and for other classes of compositional models in general. As a last contribution, I formalize the explicit treatment of lexical ambiguity within the categorical framework by resorting to categorical quantum mechanics (joint work with Coecke). In the proposed extension, the concept of a distributional vector is replaced with that of a density matrix, which compactly represents a probability distribution over the potential meanings of a word. Composition takes the form of quantum measurements, leading to interesting analogies between quantum physics and linguistics.
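As a rough illustration of the prior-disambiguation idea, the sketch below (a minimal Python example with made-up sense vectors and dimensions, and a Frobenius-style "copy-object" composition; none of the names or numbers come from the thesis itself) first selects the sense vector closest to the context and only then composes it with the rest of the sentence:

```python
import numpy as np

def disambiguate(sense_vectors, context_vectors):
    """Return the sense vector closest (by cosine similarity) to the
    centroid of the context vectors."""
    centroid = np.mean(context_vectors, axis=0)
    cosine = lambda a, b: a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(sense_vectors, key=lambda s: cosine(s, centroid))

def copy_object(verb_matrix, subj, obj):
    """One Frobenius-style composition for a transitive sentence
    ('copy-object'): the transposed verb matrix acts on the subject and
    the result is merged with the object by element-wise multiplication,
    the Frobenius copying/merging map in a fixed basis."""
    return obj * (verb_matrix.T @ subj)

# Toy 4-dimensional example with an ambiguous noun (hypothetical data).
rng = np.random.default_rng(0)
noun_senses = [rng.random(4), rng.random(4)]   # e.g. two senses of 'bank'
context = [rng.random(4), rng.random(4)]       # vectors of co-occurring words
subj = disambiguate(noun_senses, context)      # step 1: disambiguate first

verb = rng.random((4, 4))                      # relational verb as a matrix
obj = rng.random(4)
print(copy_object(verb, subj, obj))            # step 2: then compose
```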
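The density-matrix extension can be sketched in the same hedged spirit: an ambiguous word becomes a mixed state over its sense vectors (here assumed given, with invented probabilities), and a toy Born-rule computation stands in for the measurement-style composition described in the abstract:

```python
import numpy as np

def density_matrix(sense_vectors, probs):
    """Mixed state over word senses: rho = sum_i p_i |s_i><s_i|,
    with each sense vector normalised to unit length."""
    dim = len(sense_vectors[0])
    rho = np.zeros((dim, dim))
    for v, p in zip(sense_vectors, probs):
        v = v / np.linalg.norm(v)
        rho += p * np.outer(v, v)
    return rho

rng = np.random.default_rng(1)
senses = [rng.random(4), rng.random(4)]        # hypothetical sense vectors
rho = density_matrix(senses, [0.7, 0.3])       # hypothetical sense probabilities

# Sanity checks: a density matrix has unit trace and is positive semidefinite.
assert np.isclose(np.trace(rho), 1.0)
assert np.all(np.linalg.eigvalsh(rho) >= -1e-9)

# Toy Born-rule 'measurement': the probability assigned to a context
# represented by a unit vector c is <c| rho |c>.
c = rng.random(4)
c /= np.linalg.norm(c)
print(c @ rho @ c)
```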


