Deriving Neural Architectures from Sequence and Graph Kernels

05/25/2017
by Tao Lei et al.

The design of neural architectures for structured objects is typically guided by experimental insights rather than a formal process. In this work, we appeal to kernels over combinatorial structures, such as sequences and graphs, to derive appropriate neural operations. We introduce a class of deep recurrent neural operations and formally characterize their associated kernel spaces. Our recurrent modules compare the input to virtual reference objects (cf. filters in CNNs) via the kernels. Like traditional neural operations, these reference objects are parameterized and directly optimized in end-to-end training. We empirically evaluate the proposed class of neural architectures on standard tasks such as language modeling and molecular graph regression, achieving state-of-the-art results.
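The core idea, comparing an input sequence to a learned reference object through a decaying kernel recurrence, can be illustrated with a minimal numpy sketch. This is not the paper's exact formulation; the function name, the single decay factor `lam`, and the dot-product similarity are illustrative assumptions. The reference matrix `W` plays the role of the parameterized "virtual reference objects" (analogous to CNN filters) and would be trained end-to-end in practice:

```python
import numpy as np

def kernel_recurrence(X, W, lam=0.9):
    """Accumulate decayed kernel similarities between an input
    sequence and a set of reference vectors.

    X   : (T, d) array of input embeddings, one row per time step.
    W   : (n, d) array of learned reference vectors ("filters").
    lam : decay factor; older matches are down-weighted by lam
          at every step, mimicking a gap penalty in string kernels.
    Returns a (T, n) trajectory of kernel feature values.
    """
    T, _ = X.shape
    c = np.zeros(W.shape[0])       # running kernel score per reference
    states = []
    for t in range(T):
        c = lam * c + X[t] @ W.T   # decay old matches, add new similarity
        states.append(c.copy())
    return np.stack(states)
```

With `lam = 0` the recurrence reduces to a per-step similarity (a plain linear projection); with `lam = 1` it sums all matches without decay, so the recurrence interpolates between purely local and fully global comparisons against the reference objects.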


