Temporal Random Indexing of Context Vectors Applied to Event Detection

08/28/2020
by   Yashank Singh, et al.

In this paper we explore new representations for encoding language data. The standard one-hot encoding grows linearly in space with the size of the word corpus. We address this by using Random Indexing (RI) of context vectors with sparse nonzero entries. We propose a novel RI representation that exploits the effect of imposing a probability distribution on the number of randomized entries, which leads to a class of RI representations. We also propose an algorithm to track the semantic relationship of a keyword to other words over time, and from it an algorithm for suggesting events that could happen relevant to the word in question. Finally, we run simulations with the novel RI representations and the proposed algorithms on tweets relevant to the word “iPhone” and present the results. The RI representation is shown to be faster and more space-efficient than BoW embeddings.
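To make the idea concrete, below is a minimal sketch of Random Indexing of context vectors in Python. It assumes sparse ternary (±1) index vectors and, as a stand-in for the paper's class of representations, draws the number of nonzero entries from a Poisson distribution; the dimension, window size, and distribution are illustrative assumptions, not the authors' exact settings.

```python
# Sketch of Random Indexing (RI) of context vectors.
# Assumptions (not from the paper): d=1000, Poisson-distributed sparsity,
# a symmetric co-occurrence window of 2 tokens.
import numpy as np
from collections import defaultdict

def index_vector(d, rng, nonzero_dist=lambda r: r.poisson(10) + 2):
    """Sparse random index vector with +-1 entries; the number of nonzero
    entries is drawn from a probability distribution (Poisson here)."""
    v = np.zeros(d)
    k = min(nonzero_dist(rng), d)
    positions = rng.choice(d, size=k, replace=False)
    v[positions] = rng.choice([-1.0, 1.0], size=k)
    return v

def build_context_vectors(tokenized_docs, d=1000, window=2, seed=0):
    """Each word's context vector is the running sum of the index vectors
    of the words co-occurring with it inside the window."""
    rng = np.random.default_rng(seed)
    index = {}                                   # word -> fixed index vector
    context = defaultdict(lambda: np.zeros(d))   # word -> context vector
    for doc in tokenized_docs:
        for i, w in enumerate(doc):
            for j in range(max(0, i - window), min(len(doc), i + window + 1)):
                if i == j:
                    continue
                c = doc[j]
                if c not in index:
                    index[c] = index_vector(d, rng)
                context[w] += index[c]
    return context

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

# Toy usage: compare "iphone" against another word in a tiny tweet corpus.
tweets = [["new", "iphone", "launch", "event"],
          ["apple", "announces", "iphone", "release"]]
ctx = build_context_vectors(tweets)
print(cosine(ctx["iphone"], ctx["launch"]))
```

Because the index vectors are fixed and sparse, the memory cost is governed by the chosen dimension d rather than the vocabulary size, which is the space advantage over one-hot or BoW representations that the abstract refers to.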
