On the Similarity between the Laplace and Neural Tangent Kernels

07/03/2020
by Amnon Geifman, et al.

Recent theoretical work has shown that massively overparameterized neural networks are equivalent to kernel regressors that use the Neural Tangent Kernel (NTK). Experiments show that these kernel methods perform similarly to real neural networks. Here we show that the NTK for fully connected networks is closely related to the standard Laplace kernel. We show theoretically that, for normalized data on the hypersphere, both kernels have the same eigenfunctions and their eigenvalues decay polynomially at the same rate, implying that their Reproducing Kernel Hilbert Spaces (RKHS) include the same sets of functions. This means that both kernels give rise to classes of functions with the same smoothness properties. The two kernels differ for data off the hypersphere, but experiments indicate that when data is properly normalized these differences are not significant. Finally, we provide experiments on real data comparing the NTK and the Laplace kernel, along with the larger class of γ-exponential kernels, and show that they perform almost identically. Our results suggest that much insight about neural networks can be obtained from analysis of the well-known Laplace kernel, which has a simple closed form.
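As a minimal numerical sketch of this similarity (not the paper's experimental protocol), the snippet below compares the standard closed-form NTK of a two-layer fully connected ReLU network with the Laplace and γ-exponential kernels on data normalized to the unit sphere. The two-layer depth, the bandwidth c, and the off-diagonal correlation diagnostic are illustrative assumptions; the paper analyzes networks of general depth and compares eigenvalue decay rates directly.

```python
import numpy as np

def laplace_kernel(x, y, c=1.0):
    # Standard Laplace kernel: k(x, y) = exp(-c * ||x - y||).
    return np.exp(-c * np.linalg.norm(x - y))

def gamma_exponential_kernel(x, y, c=1.0, gamma=1.0):
    # gamma-exponential family: gamma = 1 recovers the Laplace kernel,
    # gamma = 2 the Gaussian (RBF) kernel.
    return np.exp(-c * np.linalg.norm(x - y) ** gamma)

def ntk_two_layer(x, y):
    # Closed-form NTK of a two-layer fully connected ReLU network,
    # valid for inputs on the unit sphere (||x|| = ||y|| = 1).
    u = np.clip(np.dot(x, y), -1.0, 1.0)
    kappa0 = (np.pi - np.arccos(u)) / np.pi
    kappa1 = (u * (np.pi - np.arccos(u)) + np.sqrt(1.0 - u * u)) / np.pi
    return u * kappa0 + kappa1

def gram(kernel, xs):
    # Dense Gram matrix for a small sample (illustrative, O(n^2)).
    return np.array([[kernel(xi, xj) for xj in xs] for xi in xs])

rng = np.random.default_rng(0)
xs = rng.standard_normal((200, 10))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)  # project onto the hypersphere

K_ntk = gram(ntk_two_layer, xs)
K_lap = gram(laplace_kernel, xs)

# Correlation of the off-diagonal Gram entries: a crude proxy for how
# similarly the two kernels rank pairs of points on the sphere.
off = ~np.eye(len(xs), dtype=bool)
print(np.corrcoef(K_ntk[off], K_lap[off])[0, 1])
```

Since both kernels decrease monotonically with distance on the sphere, the printed correlation is typically close to 1. This is only a sanity check; the paper's spectral argument establishes the stronger statement that the two kernels share eigenfunctions and polynomial eigenvalue decay rates.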



Related research

09/22/2020: Deep Neural Tangent Kernel and Laplace Kernel Have the Same RKHS
We prove that the reproducing kernel Hilbert spaces (RKHS) of a deep neu...

04/07/2021: Spectral Analysis of the Neural Tangent Kernel for Deep Residual Networks
Deep residual network architectures have been shown to achieve superior ...

08/07/2022: An Empirical Analysis of the Laplace and Neural Tangent Kernels
The neural tangent kernel is a kernel function defined over the paramete...

05/24/2019: What Can ResNet Learn Efficiently, Going Beyond Kernels?
How can neural networks such as ResNet efficiently learn CIFAR-10 with t...

10/28/2020: Kernel Aggregated Fast Multipole Method: Efficient summation of Laplace and Stokes kernel functions
Many different simulation methods for Stokes flow problems involve a com...

07/26/2023: Controlling the Inductive Bias of Wide Neural Networks by Modifying the Kernel's Spectrum
Wide neural networks are biased towards learning certain functions, infl...

07/30/2020: Improving Sample Efficiency with Normalized RBF Kernels
In deep learning models, learning more with less data is becoming more i...
