Learning Inconsistent Preferences with Kernel Methods

by Siu Lun Chau et al.

We propose a probabilistic kernel approach to preferential learning from pairwise duelling data using Gaussian processes. Unlike previous methods, we do not impose a total order on the item space, and can therefore capture more expressive latent preferential structures, such as inconsistent preferences and clusters of comparable items. Furthermore, we prove the universality of the proposed kernels, i.e. that the corresponding reproducing kernel Hilbert space (RKHS) is dense in the space of skew-symmetric preference functions. We conclude with an extensive set of numerical experiments on simulated and real-world datasets, showcasing the competitiveness of our proposed method with the state of the art.
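A standard way to obtain an RKHS of skew-symmetric functions on item pairs, of the kind the abstract describes, is to antisymmetrize a product of base kernels over the two arguments. The sketch below is illustrative only (the paper's exact kernel construction is not reproduced here); the base kernel, its bandwidth, and the function names are our own assumptions.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # Base RBF kernel on individual items (bandwidth gamma is an assumed choice).
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))

def skew_kernel(p, q, base=rbf):
    # Skew-symmetric kernel on item pairs p = (x, x'), q = (y, y'):
    #   K(p, q) = k(x, y) k(x', y') - k(x, y') k(x', y)
    # Every function g in the induced RKHS satisfies g(x, x') = -g(x', x),
    # so a GP with this covariance models preference functions directly,
    # without forcing a total order on the items.
    (x, xp), (y, yp) = p, q
    return base(x, y) * base(xp, yp) - base(x, yp) * base(xp, y)

# Skew symmetry in the first argument: swapping the pair flips the sign.
a, b, c, d = [0.0], [1.0], [0.3], [2.0]
print(skew_kernel((a, b), (c, d)) + skew_kernel((b, a), (c, d)))  # ~0.0
```

Such a kernel can be dropped into any GP library as a custom covariance over pairs; inference on duelling data then proceeds as in ordinary GP classification with a link function on the preference value.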


Symmetric and antisymmetric kernels for machine learning problems in quantum physics and chemistry

We derive symmetric and antisymmetric kernels by symmetrizing and antisy...

Learning Kernels for Structured Prediction using Polynomial Kernel Transformations

Learning the kernel functions used in kernel methods has been a vastly e...

Kernel Ridgeless Regression is Inconsistent for Low Dimensions

We show that kernel interpolation for a large class of shift-invariant k...

Kernels for Vector-Valued Functions: a Review

Kernel methods are among the most popular techniques in machine learning...

Learning Sets with Separating Kernels

We consider the problem of learning a set from random samples. We show h...

Scalable Bayesian Preference Learning for Crowds

We propose a scalable Bayesian preference learning method for jointly pr...

Addressing Dynamic and Sparse Qualitative Data: A Hilbert Space Embedding of Categorical Variables

We propose a novel framework for incorporating qualitative data into qua...
