High-risk learning: acquiring new word vectors from tiny data

07/20/2017
by Aurélie Herbelot, et al.

Distributional semantic models are known to struggle with small data: it is generally accepted that learning 'a good vector' for a word requires many examples of its usage. This is at odds with the fact that humans can guess the meaning of a word from only a few occurrences. In this paper, we show that a neural language model such as Word2Vec requires only minor modifications to its standard architecture to learn new terms from tiny data, using background knowledge from a previously learnt semantic space. We test our model on word definitions and on a nonce task involving 2-6 sentences' worth of context, showing a large gain in performance over state-of-the-art models on the definitional task.
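To make the idea concrete, below is a minimal sketch (Python/NumPy, not the authors' released code) of the kind of modification the abstract describes: the new word's vector is initialised additively from a frozen background space and then updated with an aggressive, decaying learning rate while everything else stays fixed. The function and variable names (learn_nonce_vector, background_in, background_out, vocab_probs) are illustrative assumptions.

# Minimal sketch of learning one new word's vector from a few sentences,
# in the spirit of a high-risk, high-learning-rate variant of skip-gram.
# All names below are illustrative assumptions, not the authors' code.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def learn_nonce_vector(contexts, background_in, background_out, vocab_probs,
                       dim=100, lr=1.0, decay=0.7, negatives=5, seed=0):
    """contexts: list of token lists surrounding the new word.
    background_in / background_out: dicts mapping known words to frozen
    input / output vectors from a previously trained skip-gram model.
    vocab_probs: (words, probs) pair used for negative sampling."""
    rng = np.random.default_rng(seed)
    words, probs = vocab_probs
    # Initialise the new vector as the sum of its context words' input
    # vectors (additive initialisation), instead of a random vector.
    nonce = np.zeros(dim)
    for sent in contexts:
        for w in sent:
            if w in background_in:
                nonce += background_in[w]
    # Gradient updates: only the nonce vector moves; the background space
    # stays frozen. The learning rate starts very high and decays with
    # each sentence of exposure (the "risk-taking" on tiny data).
    for sent in contexts:
        for w in sent:
            if w not in background_out:
                continue
            # one positive pair (nonce, w) plus a few sampled negatives
            samples = [(background_out[w], 1.0)]
            for neg in rng.choice(words, size=negatives, p=probs):
                if neg in background_out and neg != w:
                    samples.append((background_out[neg], 0.0))
            for ctx_vec, label in samples:
                grad = (sigmoid(nonce @ ctx_vec) - label) * ctx_vec
                nonce -= lr * grad
        lr *= decay  # per-sentence decay of the learning rate
    return nonce

The two departures from standard Word2Vec in this sketch are the additive initialisation and the per-word, rapidly decaying learning rate, which let the model take large steps on the handful of available contexts without disturbing the previously learnt background space.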
