ChainCQG: Flow-Aware Conversational Question Generation
Conversational systems enable numerous valuable applications, and question answering is an important component underlying many of them. However, conversational question answering remains challenging due to the lack of realistic, domain-specific training data. Motivated by this bottleneck, we focus on conversational question generation as a means of producing synthetic conversations for training and evaluation. We present a number of novel strategies to improve conversational flow and to accommodate varying question types and overall fluidity. Specifically, we design ChainCQG as a two-stage architecture that learns question-answer representations across multiple dialogue turns using a flow propagation training strategy. ChainCQG significantly outperforms both answer-aware and answer-unaware SOTA baselines (e.g., up to a 48% improvement), and is able to generate different types of questions with improved fluidity and coreference alignment.
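The full paper describes the architecture in detail; purely as a rough illustration of the idea stated above, the sketch below shows one way a two-stage, turn-by-turn generation loop could be organized, where each new question is conditioned on all earlier question-answer pairs. Every name, class, and the string-concatenation encoding here is a hypothetical placeholder, not the authors' implementation.

```python
# Hypothetical sketch only: a two-stage, flow-aware conversational QG loop.
# Stage 1 builds an answer-aware representation that includes the dialogue so far;
# Stage 2 generates the next question from that representation.

from dataclasses import dataclass
from typing import List


@dataclass
class DialogueTurn:
    answer_span: str    # answer selected from the passage for this turn
    question: str = ""  # question generated conditioned on the flow so far


def encode_answer(passage: str, answer_span: str, history: List[DialogueTurn]) -> str:
    """Stage 1 (placeholder): concatenate prior turns with the current answer span
    so the conversational flow is visible to the generator."""
    prior = " ".join(f"Q: {t.question} A: {t.answer_span}" for t in history)
    return f"{prior} [ANS] {answer_span} [CTX] {passage}"


def generate_question(flow_repr: str) -> str:
    """Stage 2 (placeholder): a real system would call a fine-tuned LM here."""
    return f"<question conditioned on: {flow_repr[:40]}...>"


def chain_generate(passage: str, answer_spans: List[str]) -> List[DialogueTurn]:
    """Propagate context across turns: each question sees all earlier QA pairs."""
    history: List[DialogueTurn] = []
    for span in answer_spans:
        flow_repr = encode_answer(passage, span, history)
        history.append(DialogueTurn(answer_span=span, question=generate_question(flow_repr)))
    return history


if __name__ == "__main__":
    turns = chain_generate("Marie Curie won two Nobel Prizes.", ["Marie Curie", "two"])
    for t in turns:
        print(t.question, "->", t.answer_span)
```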