Black-box language model explanation by context length probing

12/30/2022
by Ondřej Cífka, et al.

The increasingly widespread adoption of large language models has highlighted the need to improve their explainability. We present context length probing, a novel explanation technique for causal language models, based on tracking a model's predictions as a function of the length of available context, which allows differential importance scores to be assigned to different contexts. The technique is model-agnostic and does not rely on access to model internals beyond computing token-level probabilities. We apply context length probing to large pre-trained language models and offer some initial analyses and insights, including the potential for studying long-range dependencies. The source code and a demo of the method are available.
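The idea in the abstract can be sketched in a few lines: score a target token under contexts of increasing length, then take successive differences of the resulting curve as importance scores for the newly included context tokens. The sketch below is illustrative only, with a toy stand-in for the model; `toy_logprob`, `context_length_probe`, and `differential_scores` are hypothetical names not taken from the paper, and any black-box causal LM exposing token-level log-probabilities could be substituted.

```python
import math

def toy_logprob(target, context):
    # Dummy stand-in for log p(target | context) from a real causal LM:
    # more occurrences of the target in the context -> higher probability.
    matches = sum(1 for tok in context if tok == target)
    return math.log((1 + matches) / (2 + len(context)))

def context_length_probe(tokens, t):
    """Log-probability of tokens[t] as a function of context length c = 1..t."""
    return [toy_logprob(tokens[t], tokens[t - c:t]) for c in range(1, t + 1)]

def differential_scores(tokens, t):
    """Importance of tokens[t - c]: the gain in log-probability of tokens[t]
    when the context is extended from length c - 1 to length c."""
    curve = context_length_probe(tokens, t)
    return [curve[c] - curve[c - 1] for c in range(1, len(curve))]

tokens = ["the", "cat", "sat", "on", "the", "mat", "the"]
print(differential_scores(tokens, 6))
```

In this toy run, extending the context of the final token to include an earlier "the" produces a positive score, mimicking how the real method highlights context tokens whose inclusion improves the model's prediction.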


Related research

- 05/17/2023, Token-wise Decomposition of Autoregressive Language Model Hidden States for Analyzing Model Predictions
- 07/06/2023, Lost in the Middle: How Language Models Use Long Contexts
- 12/21/2022, Prompt-Augmented Linear Probing: Scaling Beyond The Limit of Few-shot In-Context Learners
- 05/24/2023, Lexinvariant Language Models
- 09/01/2023, BatchPrompt: Accomplish more with less
- 06/21/2023, Opening the Black Box: Analyzing Attention Weights and Hidden States in Pre-trained Language Models for Non-language Tasks
- 07/26/2023, Three Bricks to Consolidate Watermarks for Large Language Models
