RET-LLM: Towards a General Read-Write Memory for Large Language Models

05/23/2023
by Ali Modarressi, et al.

Large language models (LLMs) have significantly advanced the field of natural language processing (NLP) through their extensive parameters and comprehensive data utilization. However, existing LLMs lack a dedicated memory unit, limiting their ability to explicitly store and retrieve knowledge for various tasks. In this paper, we propose RET-LLM, a novel framework that equips LLMs with a general read-write memory unit, allowing them to extract, store, and recall knowledge from text as needed for task performance. Inspired by Davidsonian semantics theory, we extract and save knowledge in the form of triplets. The memory unit is designed to be scalable, aggregatable, updatable, and interpretable. Through qualitative evaluations, we demonstrate the superiority of our proposed framework over baseline approaches in question answering tasks. Moreover, our framework exhibits robust performance on temporal question answering tasks, showcasing its ability to effectively manage time-dependent information.
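The abstract gives no implementation details, but a minimal sketch of such a triplet-based read-write memory might look like the following. All names here (TripletMemory, write, read, update) are hypothetical illustrations, not the paper's actual interface:

```python
# A minimal sketch of a triplet-based read-write memory, illustrating the
# properties the abstract names: scalable (plain dict index), aggregatable
# (multiple objects per subject-relation key), updatable (overwrite), and
# interpretable (triplets are human-readable). Hypothetical API, not RET-LLM's.
from collections import defaultdict

class TripletMemory:
    """Stores knowledge as (subject, relation, object) triplets."""

    def __init__(self):
        # Index triplets by (subject, relation) so reads are O(1) lookups.
        self._index = defaultdict(set)

    def write(self, subject, relation, obj):
        """Store one triplet; repeated writes aggregate objects."""
        self._index[(subject, relation)].add(obj)

    def update(self, subject, relation, obj):
        """Replace all stored objects for (subject, relation)."""
        self._index[(subject, relation)] = {obj}

    def read(self, subject, relation):
        """Recall all objects stored under (subject, relation)."""
        return sorted(self._index.get((subject, relation), set()))


if __name__ == "__main__":
    memory = TripletMemory()
    # Knowledge extracted from text would be written as triplets.
    memory.write("Alice", "works_for", "Acme Corp")
    memory.write("Alice", "works_for", "Beta Inc")    # aggregation
    print(memory.read("Alice", "works_for"))          # ['Acme Corp', 'Beta Inc']
    memory.update("Alice", "works_for", "Gamma LLC")  # update
    print(memory.read("Alice", "works_for"))          # ['Gamma LLC']
```

In the paper's setting, the writes would be produced by the LLM extracting triplets from input text, and the reads would be issued when the model needs stored knowledge (for example, to answer a question whose answer depends on previously seen or time-dependent facts).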

Related research

05/24/2023 · Unlocking Temporal Question Answering for Large Language Models Using Code Execution
05/03/2022 · XLTime: A Cross-Lingual Knowledge Transfer Framework for Temporal Expression Extraction
08/23/2023 · Bridging the Gap: Deciphering Tabular Data Using Large Language Model
09/27/2017 · A Read-Write Memory Network for Movie Story Understanding
08/10/2023 · Encode-Store-Retrieve: Enhancing Memory Augmentation through Language-Encoded Egocentric Perception
01/10/2023 · Memory Augmented Large Language Models are Computationally Universal
09/10/2021 · Entity-Based Knowledge Conflicts in Question Answering
