How to Scale Up Kernel Methods to Be As Good As Deep Neural Nets

11/14/2014
by Zhiyun Lu, et al.

The computational complexity of kernel methods has often been a major barrier for applying them to large-scale learning problems. We argue that this barrier can be effectively overcome. In particular, we develop methods to scale up kernel models to successfully tackle large-scale learning problems that are so far only approachable by deep learning architectures. Based on the seminal work by Rahimi and Recht on approximating kernel functions with features derived from random projections, we advance the state-of-the-art by proposing methods that can efficiently train models with hundreds of millions of parameters, and learn optimal representations from multiple kernels. We conduct extensive empirical studies on problems from image recognition and automatic speech recognition, and show that the performance of our kernel models matches that of well-engineered deep neural nets (DNNs). To the best of our knowledge, this is the first time that a direct comparison between these two methods on large-scale problems is reported. Our kernel methods have several appealing properties: training with convex optimization, cost for training a single model comparable to DNNs, and significantly reduced total cost due to fewer hyperparameters to tune for model selection. Our contrastive study between these two very different but equally competitive models sheds light on fundamental questions such as how to learn good representations.
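The random-projection features the abstract refers to are the random Fourier features of Rahimi and Recht: a shift-invariant kernel is approximated by the inner product of randomized cosine features, so a linear model trained on those features approximates the corresponding kernel machine. A minimal sketch of that idea for the RBF kernel follows; the function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def random_fourier_features(X, n_features=1000, gamma=1.0, seed=0):
    """Approximate the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
    with random Fourier features (Rahimi & Recht, 2007).

    Returns Z such that Z(x) . Z(y) ~= k(x, y)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies are sampled from the Fourier transform of the RBF
    # kernel, which is Gaussian with variance 2 * gamma.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# With enough features, the feature-space Gram matrix converges to the
# exact kernel matrix; a linear (convex) learner on Z then stands in
# for the kernel machine.
X = np.random.default_rng(1).normal(size=(5, 3))
Z = random_fourier_features(X, n_features=20000, gamma=0.5)
K_approx = Z @ Z.T
K_exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
```

Scaling this to the "hundreds of millions of parameters" regime the abstract describes is then a matter of using very large `n_features` and training the linear model with stochastic optimization, rather than materializing a kernel matrix.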


Related research

01/13/2017: Kernel Approximation Methods for Speech Recognition
We study large-scale kernel methods for acoustic modeling in speech reco...

03/18/2016: A Comparison between Deep Neural Nets and Kernel Acoustic Models for Speech Recognition
We study large-scale kernel methods for acoustic modeling and compare to...

07/21/2014: Scalable Kernel Methods via Doubly Stochastic Gradients
The general perception is that kernel methods are not scalable, and neur...

05/31/2017: FALKON: An Optimal Large Scale Kernel Method
Kernel methods provide a principled way to perform nonlinear, nonparame...

02/06/2023: Toward Large Kernel Models
Recent studies indicate that kernel machines can often perform similarly...

12/23/2014: Deep Networks With Large Output Spaces
Deep neural networks have been extremely successful at various image, sp...

08/28/2015: Partitioning Large Scale Deep Belief Networks Using Dropout
Deep learning methods have shown great promise in many practical applica...
