Escaping the Curse of Dimensionality in Similarity Learning: Efficient Frank-Wolfe Algorithm and Generalization Bounds

07/20/2018
by   Kuan Liu, et al.

Similarity and metric learning provide a principled approach to constructing a task-specific similarity from weakly supervised data. However, these methods are subject to the curse of dimensionality: as the number of features grows, generalization degrades and training becomes intractable due to high computational and memory costs. In this paper, we propose a similarity learning method that can efficiently deal with high-dimensional sparse data. This is achieved through a parameterization of similarity functions by convex combinations of sparse rank-one matrices, together with a greedy approximate Frank-Wolfe algorithm that provides an efficient way to control the number of active features. We show that the convergence rate of the algorithm, as well as its time and memory complexity, are independent of the data dimension. We further provide a theoretical justification of our modeling choices through an analysis of the generalization error, which depends logarithmically on the sparsity of the solution rather than on the number of features. Our experiments on datasets with up to one million features demonstrate the ability of our approach to generalize well despite the high dimensionality, as well as its superiority over several competing methods.
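To make the idea concrete, here is a minimal sketch of a Frank-Wolfe loop of the kind the abstract describes: the bilinear similarity S(x, x') = x^T M x' is learned over the convex hull of signed, 1-sparse rank-one bases, so each iteration activates at most one new coordinate of M. This is an illustrative simplification, not the paper's implementation; the triplet hinge loss, the scale parameter `lam`, and the dense-numpy gradient are assumptions made for readability (the paper's efficiency argument relies on exploiting data sparsity, which this sketch does not do).

```python
import numpy as np

def fw_similarity(triplets, d, lam=10.0, T=50):
    """Toy Frank-Wolfe learner for a bilinear similarity x^T M x'.

    Feasible set: conv{ +/- lam * e_i e_j^T }, so M is always a convex
    combination of signed 1-sparse rank-one matrices, as in the abstract.
    `triplets` is a list of (x, x_pos, x_neg) arrays of dimension d.
    """
    M = np.zeros((d, d))
    for t in range(T):
        # Subgradient of the average hinge loss: we want the margin
        # x^T M x_pos >= x^T M x_neg + 1 for every triplet.
        G = np.zeros((d, d))
        for x, xp, xn in triplets:
            if 1.0 - x @ M @ (xp - xn) > 0.0:  # margin violated
                G -= np.outer(x, xp - xn)
        G /= len(triplets)
        # Greedy linear-minimization step: the atom lam * s * e_i e_j^T
        # minimizing <G, atom> is the entry of G with largest magnitude.
        i, j = np.unravel_index(np.argmax(np.abs(G)), G.shape)
        s = -np.sign(G[i, j])
        gamma = 2.0 / (t + 2.0)  # standard Frank-Wolfe step size
        M *= (1.0 - gamma)
        M[i, j] += gamma * lam * s
    return M
```

Because each iteration touches a single entry of M, the number of active features after T iterations is at most T, which is the mechanism behind the dimension-independent memory bound the abstract claims.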


Related research

11/10/2014 · Similarity Learning for High-Dimensional Sparse Data
A good measure of similarity between data points is crucial to many task...

07/23/2012 · Generalization Bounds for Metric and Similarity Learning
Recently, metric learning and similarity learning have attracted a large...

04/15/2021 · Sparse online relative similarity learning
For many data mining and machine learning tasks, the quality of a simila...

12/10/2015 · Boosted Sparse Non-linear Distance Metric Learning
This paper proposes a boosting-based solution addressing metric learning...

11/25/2017 · An Oracle Property of The Nadaraya-Watson Kernel Estimator for High Dimensional Nonparametric Regression
The celebrated Nadaraya-Watson kernel estimator is among the most studie...

01/03/2016 · Dimensionality-Dependent Generalization Bounds for k-Dimensional Coding Schemes
The k-dimensional coding schemes refer to a collection of methods that a...

02/07/2021 · Dimension Free Generalization Bounds for Non Linear Metric Learning
In this work we study generalization guarantees for the metric learning ...
