An Initial Investigation of Non-Native Spoken Question-Answering

07/09/2021
by Vatsal Raina, et al.

Text-based machine comprehension (MC) systems have a wide range of applications, and standard corpora exist for developing and evaluating approaches. There has been far less research on spoken question answering (SQA) systems. The SQA task considered in this paper is to extract the answer from a candidate's spoken response to a question in a prompt-response style language assessment test. Applying these MC approaches to this SQA task, rather than, for example, off-topic response detection, provides far more detailed information that can be used for further downstream processing. One significant challenge is the lack of appropriately annotated speech corpora to train systems for this task. Hence, a transfer-learning style approach is adopted in which a system trained on text-based MC is evaluated on an SQA task with non-native speakers. Two mismatches must be considered: between text documents and spoken responses, and between non-native spoken grammar and written grammar. In practical SQA, ASR systems are used, necessitating an investigation of the impact of ASR errors. We show that a simple text-based ELECTRA MC model trained on SQuAD2.0 transfers well to SQA. It is found that there is an approximately linear relationship between ASR errors and SQA assessment scores, but grammar mismatches have minimal impact.
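To make the transfer-learning setup concrete, the pipeline the abstract describes (an extractive SQuAD2.0-trained MC model applied to an ASR transcription) can be sketched with off-the-shelf tools. The sketch below uses the Hugging Face transformers library; the checkpoint name deepset/electra-base-squad2 and the question/transcript pair are illustrative assumptions, not details taken from the paper. Any ELECTRA model fine-tuned on SQuAD2.0 follows the same pattern.

    from transformers import pipeline

    # Illustrative checkpoint: any ELECTRA model fine-tuned on SQuAD2.0 would do.
    qa = pipeline("question-answering", model="deepset/electra-base-squad2")

    # Hypothetical prompt question from a language assessment test.
    question = "Where did the candidate go last weekend?"

    # The context is the ASR transcription of the candidate's spoken response;
    # it may contain recognition errors and non-native grammar.
    context = "last weekend i go to the beach with my friend and we play volleyball"

    # handle_impossible_answer=True enables the SQuAD2.0-style "no answer"
    # prediction when the response does not actually address the question.
    result = qa(question=question, context=context, handle_impossible_answer=True)
    print(result["answer"], result["score"])

Because SQuAD2.0 includes unanswerable questions, the same model can abstain on off-topic responses, which is what makes the extractive formulation strictly more informative than a binary off-topic detector.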
