Diversity-Aware Coherence Loss for Improving Neural Topic Models

05/25/2023
by Raymond Li, et al.

The standard approach to neural topic modeling uses a variational autoencoder (VAE) framework that jointly minimizes the reconstruction loss and the KL divergence between the estimated posterior and the prior. Because neural topic models are trained to reconstruct individual input documents, they do not explicitly capture corpus-level coherence between topic words. In this work, we propose a novel diversity-aware coherence loss that encourages the model to learn corpus-level coherence scores while maintaining high diversity between topics. Experimental results on multiple datasets show that our method significantly improves the performance of neural topic models without requiring any pretraining or additional parameters.
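To make the objective concrete, below is a minimal sketch of a VAE topic-model loss augmented with corpus-level coherence and diversity terms. The abstract does not give the exact formulation, so the NPMI-weighted coherence reward, the pairwise-similarity diversity penalty, and the weight `lam` are illustrative assumptions, not the paper's loss.

```python
# Illustrative sketch only: the paper's exact diversity-aware coherence loss
# is not reproduced here. `npmi` (a precomputed corpus-level word
# co-occurrence matrix) and the weighting `lam` are assumptions.
import torch
import torch.nn.functional as F

def vae_topic_loss(recon_logits, bow, mu, logvar, beta, npmi, lam=1.0):
    """
    recon_logits: (batch, vocab) reconstruction logits over the vocabulary
    bow:          (batch, vocab) bag-of-words counts for each document
    mu, logvar:   (batch, n_topics) variational posterior parameters
    beta:         (n_topics, vocab) topic-word weight matrix
    npmi:         (vocab, vocab) corpus-level NPMI co-occurrence scores
    """
    # Standard VAE objective: reconstruction + KL to a standard normal prior.
    recon = -(bow * F.log_softmax(recon_logits, dim=-1)).sum(-1).mean()
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()

    # Corpus-level coherence: reward topics whose high-weight words co-occur.
    topic_dist = F.softmax(beta, dim=-1)  # (K, V)
    coherence = torch.einsum('kv,vw,kw->k', topic_dist, npmi, topic_dist).mean()

    # Diversity: penalize pairwise cosine similarity between topics.
    unit = F.normalize(topic_dist, dim=-1)
    sim = unit @ unit.T
    off_diag = sim - torch.diag_embed(torch.diagonal(sim))
    redundancy = off_diag.abs().mean()

    return recon + kl + lam * (redundancy - coherence)
```

Since an NPMI matrix like `npmi` would be estimated once from corpus co-occurrence statistics before training, a regularizer of this shape injects corpus-level signal without adding trainable parameters, which is one way a method could satisfy the abstract's "no pretraining or additional parameters" claim.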


Related research:

09/07/2018 · Coherence-Aware Neural Topic Modeling
Topic models are evaluated based on their ability to describe documents ...

05/23/2023 · Contextualized Topic Coherence Metrics
The recent explosion in work on neural topic modeling has been criticize...

11/24/2017 · Continuous Semantic Topic Embedding Model Using Variational Autoencoder
This paper proposes the continuous semantic topic embedding model (CSTEM...

07/24/2019 · Topic Modeling with Wasserstein Autoencoders
We propose a novel neural topic model in the Wasserstein autoencoders (W...

02/16/2019 · TopicEq: A Joint Topic and Mathematical Equation Model for Scientific Texts
Scientific documents rely on both mathematics and text to communicate id...

09/18/2018 · Better Conversations by Modeling, Filtering, and Optimizing for Coherence and Diversity
We present three enhancements to existing encoder-decoder models for ope...

05/08/2023 · Reinforcement Learning for Topic Models
We apply reinforcement learning techniques to topic modeling by replacin...
