
Deriving Neural Architectures from Sequence and Graph Kernels

by Tao Lei, et al.

The design of neural architectures for structured objects is typically guided by experimental insights rather than a formal process. In this work, we appeal to kernels over combinatorial structures, such as sequences and graphs, to derive appropriate neural operations. We introduce a class of deep recurrent neural operations and formally characterize their associated kernel spaces. Our recurrent modules compare the input to virtual reference objects (cf. filters in CNNs) via the kernels. Like traditional neural operations, these reference objects are parameterized and directly optimized in end-to-end training. We empirically evaluate the proposed class of neural architectures on standard applications such as language modeling and molecular graph regression, achieving state-of-the-art results.
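To make the idea of "comparing the input to virtual reference objects via a kernel" concrete, here is a minimal sketch of a string-kernel-style recurrence in NumPy. The reference sequences (`refs`), the decay parameter `lam`, and the specific update rule are illustrative assumptions, not the paper's exact operation: each reference plays the role of a learnable filter, and the recurrence accumulates gap-penalized substring-match scores between the input and each reference.

```python
import numpy as np

def kernel_recurrent_layer(x, refs, lam=0.5):
    """Score an input sequence against reference sequences via a
    substring-kernel-style recurrence (illustrative sketch only).

    x    : (T, d) input sequence of d-dimensional embeddings
    refs : (n, L, d) n reference sequences ("filters") of length L
    lam  : decay in (0, 1) penalizing gaps, as in substring kernels
    """
    n, L, d = refs.shape
    T = x.shape[0]
    # c[:, i] holds the partial match score for the length-i prefix
    # of each reference; the empty prefix always matches with score 1.
    c = np.zeros((n, L + 1))
    c[:, 0] = 1.0
    for t in range(T):
        # Dot-product similarity between x[t] and every reference position.
        sim = refs @ x[t]                      # shape (n, L)
        new = np.empty_like(c)
        new[:, 0] = 1.0
        # Either extend a shorter prefix match with the current token,
        # or carry the existing partial match forward with decay lam.
        new[:, 1:] = lam * c[:, 1:] + c[:, :-1] * sim
        c = new
    return c[:, L]                             # full-match score per reference
```

In an end-to-end model, `refs` would be trainable parameters updated by gradient descent, just as the abstract describes for the virtual reference objects; the loop above is differentiable in both `refs` and `x`.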




Related work:
Graph Kernels: State-of-the-Art and Future Challenges

Graph-structured data are an integral part of many application domains, ...

Recurrent Kernel Networks

Substring kernels are classical tools for representing biological sequen...

Structured Sequence Modeling with Graph Convolutional Recurrent Networks

This paper introduces Graph Convolutional Recurrent Network (GCRN), a de...

The Recurrent Neural Tangent Kernel

The study of deep networks (DNs) in the infinite-width limit, via the so...

MuFuRU: The Multi-Function Recurrent Unit

Recurrent neural networks such as the GRU and LSTM found wide adoption i...

Graph Filtration Kernels

The majority of popular graph kernels is based on the concept of Haussle...

Understanding Recurrent Neural State Using Memory Signatures

We demonstrate a network visualization technique to analyze the recurren...