Multi-view and Multi-source Transfers in Neural Topic Modeling with Pretrained Topic and Word Embeddings

09/14/2019
by   Pankaj Gupta, et al.

Though word embeddings and topics are complementary representations, several past works have used only pre-trained word embeddings in (neural) topic modeling to address the data sparsity problem in short texts or small collections of documents. However, no prior work has employed (pre-trained latent) topics in a transfer learning paradigm. In this paper, we propose an approach to (1) perform knowledge transfer using latent topics obtained from a large source corpus, and (2) jointly transfer knowledge via the two representations (or views) in neural topic modeling to improve topic quality and to better deal with polysemy and data sparsity in a target corpus. In doing so, we first accumulate topic and word representations from one or many source corpora to build a pool of topics and word vectors. We then identify one or more relevant source domains and exploit the corresponding topics and word features from the respective pools to guide meaningful learning in the sparse target domain. We quantify the quality of topic and document representations via generalization (perplexity), interpretability (topic coherence) and information retrieval (IR) on short-text, long-text, small and large document collections from the news and medical domains, and demonstrate state-of-the-art results on topic modeling with the proposed framework.
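The abstract's transfer pipeline, identifying a relevant source domain from a pool of pre-trained topic-word matrices and using its topics to guide the target model, can be illustrated with a minimal sketch. Everything here is illustrative: the pool contents, the cosine-similarity relevance criterion, and the convex-combination initialization are assumptions for exposition, not the paper's exact mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pool: one pre-trained topic-word matrix (n_topics x vocab)
# per source domain, rows normalized to probability distributions.
source_pools = {
    "news":    np.array([[0.6, 0.2, 0.1, 0.1],
                         [0.5, 0.3, 0.1, 0.1]]),
    "medical": np.array([[0.1, 0.1, 0.2, 0.6],
                         [0.1, 0.1, 0.3, 0.5]]),
}

def cosine(a, b):
    # cosine similarity with a small epsilon against zero vectors
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def select_relevant_source(target_word_dist, pools):
    # Rank source domains by similarity between the target corpus's
    # empirical word distribution and each source's mean topic-word vector.
    scores = {name: cosine(target_word_dist, topics.mean(axis=0))
              for name, topics in pools.items()}
    return max(scores, key=scores.get)

def transfer_init(random_topics, source_topics, mix=0.5):
    # Guide the target model by initializing its topic-word parameters as a
    # convex combination of a random init and the relevant source's topics.
    return (1 - mix) * random_topics + mix * source_topics

# Target corpus dominated by the first two vocabulary terms, so the
# "news" source should be selected as most relevant.
target_dist = np.array([0.5, 0.3, 0.1, 0.1])
best = select_relevant_source(target_dist, source_pools)
init = transfer_init(rng.dirichlet(np.ones(4), size=2), source_pools[best])
```

In the paper's full setting this guidance happens inside a neural topic model (and jointly with pre-trained word embeddings as the second view), rather than as a one-shot initialization; the sketch only shows the source-selection and topic-transfer idea.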

