Differentiable Clustering with Perturbed Spanning Forests

05/25/2023
by Lawrence Stewart, et al.
École Normale Supérieure

We introduce a differentiable clustering method based on minimum-weight spanning forests, a variant of spanning trees with several connected components. Our method relies on stochastic perturbations of the solutions of linear programs, which provide smoothing and efficient gradient computations. This allows us to include clustering in end-to-end trainable pipelines. We show that our method performs well even in difficult settings, such as datasets with high noise and challenging geometries. We also formulate an ad hoc loss to efficiently learn from partial clustering data using this operation. We demonstrate its performance on several real-world datasets for supervised and semi-supervised tasks.
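To make the core idea concrete, the sketch below shows one way such a perturbed spanning-forest step could look: the discrete solver is a Kruskal-style minimum-weight spanning forest with k connected components, and the smoothed output is a Monte Carlo average of cluster-connectivity matrices under Gaussian perturbations of the edge weights. This is a minimal NumPy illustration of the general recipe described in the abstract, not the authors' implementation; the function names, the Gaussian noise model, and the hyperparameters are assumptions.

```python
import numpy as np


def spanning_forest_clusters(weights, k):
    """Minimum-weight spanning forest with k components via Kruskal's algorithm.

    weights: (n, n) symmetric dissimilarity matrix.
    Returns an (n, n) 0/1 connectivity matrix M with M[i, j] = 1 iff points i
    and j end up in the same tree, i.e. the same cluster.
    """
    n = weights.shape[0]
    parent = list(range(n))

    def find(i):
        # Union-find root lookup with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Greedily merge the cheapest edges, stopping once k components remain.
    edges = sorted((weights[i, j], i, j) for i in range(n) for j in range(i + 1, n))
    components = n
    for _, i, j in edges:
        if components == k:
            break
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            components -= 1

    roots = np.array([find(i) for i in range(n)])
    return (roots[:, None] == roots[None, :]).astype(float)


def perturbed_connectivity(weights, k, sigma=0.1, n_samples=100, seed=None):
    """Monte Carlo estimate of E[M(weights + sigma * Z)] with Gaussian Z.

    Averaging the discrete forest solutions over random perturbations yields a
    smoothed connectivity matrix, following the perturbed-optimizer recipe.
    """
    rng = np.random.default_rng(seed)
    n = weights.shape[0]
    out = np.zeros((n, n))
    for _ in range(n_samples):
        noise = rng.normal(size=(n, n)) * sigma
        noise = (noise + noise.T) / 2.0  # keep perturbed weights symmetric
        out += spanning_forest_clusters(weights + noise, k)
    return out / n_samples
```

Under this smoothing, gradients with respect to the input edge weights can be estimated from the same Monte Carlo samples (the perturbed-optimizers trick referenced in the related work below), which is what allows the clustering step to sit inside an end-to-end trainable pipeline.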

Related research:

- 02/20/2020, Learning with Differentiable Perturbed Optimizers: Machine learning pipelines often rely on optimization procedures to make...
- 10/17/2019, Smoothing graph signals via random spanning forests: Another facet of the elegant link between random processes on graphs and...
- 12/20/2018, Reliable Agglomerative Clustering: We analyze the general behavior of agglomerative clustering methods, and...
- 05/28/2013, Matrices of forests, analysis of networks, and ranking problems: The matrices of spanning rooted forests are studied as a tool for analys...
- 06/05/2023, End-to-end Differentiable Clustering with Associative Memories: Clustering is a widely used unsupervised learning technique involving an...
- 03/10/2023, Clustering with minimum spanning trees: How good can it be?: Minimum spanning trees (MSTs) provide a convenient representation of dat...
