Recent Advances in Neural Text Generation: A Task-Agnostic Survey
In recent years, much effort has been devoted to applying neural models to the task of natural language generation. The challenge is to generate natural, human-like text and to control the generation process. This paper presents a task-agnostic survey of recent advances in neural text generation. We group these advances under four headings: data construction, neural frameworks, training and inference strategies, and evaluation metrics. Finally, we discuss future directions for the development of neural text generation, including neural pipelines and the exploitation of background knowledge.