Attention Visualizer Package: Revealing Word Importance for Deeper Insight into Encoder-Only Transformer Models

08/28/2023
by Ala Alam Falaki, et al.

This report introduces the Attention Visualizer package, which is designed to visually illustrate the significance of individual words in encoder-only transformer-based models. In contrast to other methods that center on tokens and self-attention scores, our approach examines words and their impact on the final embedding representation. Libraries like this play a crucial role in enhancing the interpretability and explainability of neural networks: they help illuminate the internal mechanisms of these models, providing a better understanding of how they operate and how they can be improved. The code and example notebooks are available at the following GitHub repository: https://github.com/AlaFalaki/AttentionVisualizer.
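To make the idea of word-level importance with respect to the final embedding concrete, the following is a minimal sketch, not the package's actual API or method. It assumes a Hugging Face encoder-only model (bert-base-uncased), mean pooling over the last hidden states, and a simple occlusion scheme: each word is scored by how far the sentence embedding moves when that word is removed.

```python
# Illustrative occlusion-style word importance (assumption: NOT the
# AttentionVisualizer package's method, just a sketch of the general idea).
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "bert-base-uncased"  # assumed encoder-only model for illustration
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

def embed(text: str) -> torch.Tensor:
    """Mean-pool the last hidden states into a single sentence embedding."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden_dim)
    return hidden.mean(dim=1).squeeze(0)

def word_importance(sentence: str) -> list[tuple[str, float]]:
    """Score each word by the cosine distance between the full-sentence
    embedding and the embedding of the sentence with that word removed."""
    words = sentence.split()
    full = embed(sentence)
    scores = []
    for i, word in enumerate(words):
        ablated = " ".join(words[:i] + words[i + 1:])
        dist = 1.0 - torch.cosine_similarity(full, embed(ablated), dim=0).item()
        scores.append((word, dist))
    return scores

if __name__ == "__main__":
    for word, score in word_importance("Attention visualization reveals word importance"):
        print(f"{word:>15s}  {score:.4f}")
```

Words whose removal shifts the pooled embedding the most receive the highest scores; a visualizer can then map these scores to color intensities over the original text.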

