
Learning from DPPs via Sampling: Beyond HKPV and symmetry

07/08/2020
by   Rémi Bardenet, et al.

Determinantal point processes (DPPs) have become a significant tool for recommendation systems, feature selection, and summary extraction, harnessing the intrinsic ability of these probabilistic models to promote sample diversity. The ability to sample from DPPs is paramount to the empirical investigation of these models. Most exact samplers are variants of a spectral meta-algorithm due to Hough, Krishnapur, Peres and Virág (henceforth HKPV), which is in general time and resource intensive. For DPPs with symmetric kernels, scalable HKPV samplers have been proposed that either first downsample the ground set of items, or force the kernel to be low-rank, using e.g. Nyström-type decompositions. In the present work, we contribute a radically different approach from HKPV. Exploiting the fact that many statistical and learning objectives can be effectively accomplished by sampling only certain key observables of a DPP (so-called linear statistics), we invoke an expression for the Laplace transform of such an observable as a single determinant, which holds in complete generality. Combining traditional low-rank approximation techniques with Laplace inversion algorithms from numerical analysis, we show how to directly approximate the distribution function of a linear statistic of a DPP. This distribution function can then be used in hypothesis testing or to actually sample the linear statistic, as required. Our approach is scalable and applies to very general DPPs, beyond traditional symmetric kernels.
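On a finite ground set, the determinant expression for the Laplace transform of a linear statistic specializes to a standard identity: for a DPP with correlation kernel K and statistic Λ = Σ_{i∈X} f_i, one has E[e^{-tΛ}] = det(I − diag(1 − e^{-tf}) K). The sketch below checks this identity by brute-force subset enumeration for a small symmetric L-ensemble; the kernel, the statistic f, and the value of t are arbitrary illustrations, not anything from the paper:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Small L-ensemble DPP on N = 5 items: P(X = S) proportional to det(L_S).
N = 5
A = rng.standard_normal((N, N))
L = A @ A.T                                  # symmetric PSD likelihood kernel
K = L @ np.linalg.inv(np.eye(N) + L)         # correlation (marginal) kernel

f = rng.uniform(0.5, 2.0, size=N)            # linear statistic: Lambda = sum_{i in X} f_i
t = 0.7

# Single-determinant formula: E[exp(-t * Lambda)] = det(I - diag(1 - e^{-t f}) K)
lhs = np.linalg.det(np.eye(N) - np.diag(1.0 - np.exp(-t * f)) @ K)

# Brute force over all 2^N subsets, with normalization Z = det(I + L)
Z = np.linalg.det(np.eye(N) + L)
rhs = sum(
    np.linalg.det(L[np.ix_(S, S)]) / Z * np.exp(-t * f[list(S)].sum())
    for r in range(N + 1)
    for S in itertools.combinations(range(N), r)
)
print(abs(lhs - rhs) < 1e-10)  # True
```

Beyond N of a few dozen, the 2^N enumeration is of course infeasible; the point of the paper's approach is that the left-hand determinant remains cheap to evaluate at many values of t, so the distribution of Λ can be recovered by numerical Laplace inversion without ever sampling the DPP itself.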


01/20/2022

Scalable Sampling for Nonsymmetric Determinantal Point Processes

A determinantal point process (DPP) on a collection of M items is a mode...
07/01/2022

Scalable MCMC Sampling for Nonsymmetric Determinantal Point Processes

A determinantal point process (DPP) is an elegant model that assigns a p...
11/05/2017

Is Input Sparsity Time Possible for Kernel Low-Rank Approximation?

Low-rank approximation is a common tool used to accelerate kernel method...
05/30/2017

Zonotope hit-and-run for efficient sampling from projection DPPs

Determinantal point processes (DPPs) are distributions over sets of item...
06/17/2021

Taming Nonconvexity in Kernel Feature Selection—Favorable Properties of the Laplace Kernel

Kernel-based feature selection is an important tool in nonparametric sta...
08/08/2022

Statistical Properties of the Probabilistic Numeric Linear Solver BayesCG

We analyse the calibration of BayesCG under the Krylov prior, a probabil...
11/01/2018

Learning Signed Determinantal Point Processes through the Principal Minor Assignment Problem

Symmetric determinantal point processes (DPP's) are a class of probabili...