Semi-supervised Nonnegative Matrix Factorization for Document Classification

by Jamie Haddock, et al.

We propose new semi-supervised nonnegative matrix factorization (SSNMF) models for document classification and motivate them as maximum likelihood estimators. The proposed SSNMF models simultaneously provide both a topic model and a model for classification, thereby offering highly interpretable classification results. We derive training methods using multiplicative updates for each new model and demonstrate their application to single-label and multi-label document classification, though the models extend naturally to other supervised learning tasks such as regression. We illustrate the promise of these models and training methods on document classification datasets (e.g., 20 Newsgroups, Reuters).
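To make the joint topic-model/classifier idea concrete, the following is a minimal NumPy sketch of a Frobenius-norm SSNMF trained with multiplicative updates. It is an illustrative baseline in the spirit of the abstract, not a reproduction of the paper's specific models: the objective ||X − AS||² + λ||Y − BS||², the variable names, and the hyperparameters here are all assumptions for exposition.

```python
import numpy as np

def ssnmf(X, Y, k, lam=1.0, iters=200, eps=1e-10, seed=0):
    """Illustrative SSNMF sketch (not the paper's exact models).

    Minimizes ||X - A S||_F^2 + lam * ||Y - B S||_F^2, where
    X (features x documents) is the data matrix, Y (classes x documents)
    holds one-hot labels, A is the topic dictionary, B is the linear
    classifier, and S is the shared topic representation.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    c = Y.shape[0]
    A = rng.random((m, k))
    B = rng.random((c, k))
    S = rng.random((k, n))
    for _ in range(iters):
        # Standard Lee-Seung-style multiplicative updates; eps avoids
        # division by zero and keeps all factors nonnegative.
        A *= (X @ S.T) / (A @ S @ S.T + eps)
        B *= (Y @ S.T) / (B @ S @ S.T + eps)
        S *= (A.T @ X + lam * B.T @ Y) / (A.T @ A @ S + lam * B.T @ B @ S + eps)
    return A, B, S
```

Because A is a topic dictionary and B maps topics to classes, the learned factors are directly interpretable: each column of A is a topic over features, and B shows how topics vote for each class. A new document x can be classified by solving a nonnegative least-squares problem for its topic representation s and taking the argmax of B @ s.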




