
Chain of Knowledge: A Framework for Grounding Large Language Models with Structured Knowledge Bases

by Xingxuan Li et al.

We introduce Chain of Knowledge (CoK), a framework that augments large language models with structured knowledge bases to improve factual correctness and reduce hallucination. Compared to previous work that retrieves only unstructured text, CoK leverages structured knowledge bases, which support complex queries and offer more direct factual statements. To help large language models query knowledge bases effectively, we propose a query generator model trained with contrastive instruction-tuning. Because the query generator is separate from the frozen large language model, our framework is modular and thus easily adapted to various knowledge sources and models. Experiments show that our framework significantly enhances the factual correctness of large language models on knowledge-intensive tasks.
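The modular pipeline described in the abstract can be illustrated with a minimal sketch: a query generator maps a natural-language question to a structured query, the structured knowledge base returns a direct factual statement, and that statement augments the frozen model's prompt. All names here (`generate_query`, `retrieve`, `build_prompt`, the toy knowledge base) are hypothetical stand-ins, not the paper's actual components; the real framework uses a contrastively instruction-tuned generator and external knowledge bases.

```python
# Toy structured knowledge base: (subject, relation) -> object.
# A real deployment would query an external KB instead.
KNOWLEDGE_BASE = {
    ("Eiffel Tower", "located_in"): "Paris",
    ("Paris", "capital_of"): "France",
}

def generate_query(question: str) -> tuple:
    """Stand-in for the trained query generator: maps a natural-language
    question to a structured (subject, relation) query."""
    if "Eiffel Tower" in question:
        return ("Eiffel Tower", "located_in")
    raise ValueError("unsupported question in this toy sketch")

def retrieve(query: tuple) -> str:
    """Look up a direct factual statement in the structured KB."""
    subj, rel = query
    return f"{subj} {rel.replace('_', ' ')} {KNOWLEDGE_BASE[(subj, rel)]}."

def build_prompt(question: str) -> str:
    """Augment the frozen LLM's prompt with the retrieved fact. Because
    the query generator is separate from the LLM, swapping knowledge
    sources or models only changes these helper functions."""
    fact = retrieve(generate_query(question))
    return f"Knowledge: {fact}\nQuestion: {question}\nAnswer:"

print(build_prompt("Where is the Eiffel Tower?"))
```

The design point this sketch illustrates is the modularity claim: the generator and retriever sit outside the language model, so neither the model nor the KB needs retraining when the other is swapped.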
