Embeddings and Attention in Predictive Modeling

by Kevin Kuo et al.

We explore in depth how categorical data can be processed with embeddings in the context of claim severity modeling. We develop several models that range in complexity from simple neural networks to state-of-the-art attention-based architectures that utilize embeddings. We illustrate the utility of learned embeddings from neural networks as pretrained features in generalized linear models, and discuss methods for visualizing and interpreting embeddings. Finally, we explore how attention-based models can contextually augment embeddings, leading to enhanced predictive performance.
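To make the core idea concrete, the sketch below (a hypothetical illustration, not the authors' exact model) shows what an embedding layer does with a categorical rating factor: each level of the factor is mapped to a learned dense vector via a lookup table, and the resulting matrix of vectors can later be exported as pretrained features for a generalized linear model.

```python
import numpy as np

# Minimal sketch of an embedding lookup. All names and dimensions here are
# illustrative assumptions; in practice the matrix E is learned by a neural
# network during training rather than drawn at random.
rng = np.random.default_rng(0)

n_levels, embed_dim = 5, 3                  # e.g. 5 vehicle classes, 3-dim embeddings
E = rng.normal(size=(n_levels, embed_dim))  # the embedding (lookup) table

cat_idx = np.array([0, 4, 2])               # a mini-batch of category indices
vectors = E[cat_idx]                        # embedding lookup, shape (3, 3)

# Equivalent formulation: an embedding lookup is a one-hot encoding
# multiplied by the embedding matrix.
one_hot = np.eye(n_levels)[cat_idx]
assert np.allclose(one_hot @ E, vectors)

# After training, the rows of E are fixed vectors that can serve as
# pretrained features in a GLM, one row per category level.
print(vectors.shape)
```

Viewed this way, an embedding layer is simply a dense, low-dimensional replacement for one-hot encoding whose entries are fit jointly with the rest of the network.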




