
Zonotope hit-and-run for efficient sampling from projection DPPs

by Guillaume Gautier, et al.

Determinantal point processes (DPPs) are distributions over sets of items that model diversity using kernels. Their applications in machine learning include summary extraction and recommendation systems. Yet the cost of sampling from a DPP is prohibitive in large-scale applications, which has triggered an effort towards efficient approximate samplers. We build a novel MCMC sampler that combines ideas from combinatorial geometry, linear programming, and Monte Carlo methods to sample from DPPs with fixed sample cardinality, also called projection DPPs. Our sampler leverages the ability of the hit-and-run MCMC kernel to move efficiently across convex bodies. Previous theoretical results yield a fast mixing time for our chain when it targets a distribution that is close to a projection DPP, but not a DPP in general. Our empirical results demonstrate that this extends to sampling projection DPPs, i.e., our sampler is more sample-efficient than previous approaches, which in turn translates to faster convergence when dealing with costly-to-evaluate functions, such as summary extraction in our experiments.
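The hit-and-run kernel mentioned in the abstract is a generic MCMC move over a convex body: from the current point, draw a uniformly random direction, intersect that line with the body to get a chord, and jump to a uniform point on the chord. The sketch below illustrates this kernel on an axis-aligned box; it is a minimal illustration of hit-and-run itself, not the paper's zonotope construction, and the function and variable names are invented for this example.

```python
import numpy as np

def hit_and_run_step(x, lo, hi, rng):
    """One hit-and-run move for the uniform distribution on the box [lo, hi]^d."""
    # Uniform random direction on the unit sphere.
    d = rng.normal(size=x.shape)
    d /= np.linalg.norm(d)
    # Find the chord {x + t*d} that stays inside the box:
    # each coordinate gives an interval of admissible t; intersect them.
    with np.errstate(divide="ignore"):
        t1 = (lo - x) / d
        t2 = (hi - x) / d
    t_min = np.max(np.minimum(t1, t2))
    t_max = np.min(np.maximum(t1, t2))
    # Jump to a uniform point on the chord.
    t = rng.uniform(t_min, t_max)
    return x + t * d

rng = np.random.default_rng(0)
x = np.full(5, 0.5)  # start at the centre of the unit box in dimension 5
for _ in range(1000):
    x = hit_and_run_step(x, 0.0, 1.0, rng)
```

The appeal of this kernel, which the paper exploits, is that a single step can traverse the entire body along the chord, rather than taking small local moves.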




The Coordinate Sampler: A Non-Reversible Gibbs-like MCMC Sampler

In this article, we derive a novel non-reversible, continuous-time Marko...

SpreadNUTS – Moderate Dynamic Extension of Paths for No-U-Turn Sampling Partitioning Visited Regions

Markov chain Monte Carlo (MCMC) methods have existed for a long time and...

LSB: Local Self-Balancing MCMC in Discrete Spaces

Markov Chain Monte Carlo (MCMC) methods are promising solutions to sampl...

Design of Hamiltonian Monte Carlo for perfect simulation of general continuous distributions

Hamiltonian Monte Carlo (HMC) is an efficient method of simulating smoot...

Learning from DPPs via Sampling: Beyond HKPV and symmetry

Determinantal point processes (DPPs) have become a significant tool for ...

Entropy-based Training Methods for Scalable Neural Implicit Sampler

Efficiently sampling from un-normalized target distributions is a fundam...

On Testing of Samplers

Given a set of items ℱ and a weight function 𝚠𝚝: ℱ↦ (0,1), the problem o...