Two Discourse Driven Language Models for Semantics

06/17/2016
by   Haoruo Peng, et al.

Natural language understanding often requires deep semantic knowledge. Expanding on previous proposals, we suggest that some important aspects of semantic knowledge can be modeled as a language model if done at an appropriate level of abstraction. We develop two distinct models that capture semantic frame chains and discourse information while abstracting over the specific mentions of predicates and entities. For each model, we investigate four implementations: a "standard" N-gram language model and three discriminatively trained "neural" language models that generate embeddings for semantic frames. The quality of the semantic language models (SemLM) is evaluated both intrinsically, using perplexity and a narrative cloze test, and extrinsically: we show that our SemLM helps improve performance on semantic natural language processing tasks such as coreference resolution and discourse parsing.
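To make the "standard" N-gram variant concrete, the sketch below trains a bigram language model over abstracted frame chains and scores a chain by perplexity. The frame labels, the toy training chains, and the add-one smoothing are all illustrative assumptions, not the paper's actual data or smoothing method:

```python
import math
from collections import Counter

# Hypothetical frame chains: predicates abstracted to frame labels,
# entity mentions dropped -- a stand-in for the SemLM input format.
train_chains = [
    ["arrest.01", "charge.05", "convict.01"],
    ["arrest.01", "charge.05", "acquit.01"],
    ["charge.05", "convict.01", "sentence.01"],
]

BOS, EOS = "<s>", "</s>"

def train_bigram(chains):
    """Collect unigram and bigram counts over frame sequences."""
    uni, bi = Counter(), Counter()
    for chain in chains:
        seq = [BOS] + chain + [EOS]
        uni.update(seq)
        bi.update(zip(seq, seq[1:]))
    return uni, bi

def perplexity(chain, uni, bi, vocab_size):
    """Add-one smoothed bigram perplexity of one frame chain."""
    seq = [BOS] + chain + [EOS]
    logp = 0.0
    for prev, cur in zip(seq, seq[1:]):
        p = (bi[(prev, cur)] + 1) / (uni[prev] + vocab_size)
        logp += math.log(p)
    n = len(seq) - 1  # number of predicted transitions
    return math.exp(-logp / n)

uni, bi = train_bigram(train_chains)
V = len(uni)
ppl_seen = perplexity(["arrest.01", "charge.05", "convict.01"], uni, bi, V)
ppl_unseen = perplexity(["sentence.01", "arrest.01"], uni, bi, V)
```

A chain whose transitions appear in training (`ppl_seen`) scores lower perplexity than one made of unseen transitions (`ppl_unseen`), which is the property the intrinsic evaluation measures.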


