MWP-BERT: A Strong Baseline for Math Word Problems

07/28/2021
by   Zhenwen Liang, et al.

Math word problem (MWP) solving is the task of transforming a natural language problem description into an executable math equation. An MWP solver not only needs to understand the complex scenario described in the problem text, but also to identify the key mathematical variables and associate the text description with the logic of the equation. Although recent sequence-modeling MWP solvers have made progress on math-text contextual understanding, pre-trained language models (PLMs) have barely been explored for MWP solving, since a PLM trained over free-form text is limited in representing how the text refers to mathematical logic. In this work, we introduce MWP-BERT to obtain pre-trained token representations that capture the alignment between the text description and the mathematical logic. Additionally, we introduce a keyword-based prompt matching method to handle MWPs that require commonsense knowledge. On the benchmark Math23K dataset and the newer Ape210k dataset, MWP-BERT outperforms the strongest baseline model by 5-10% in accuracy.
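To make the pipeline described in the abstract concrete, here is a minimal, illustrative sketch of how a BERT-style encoder produces token representations for a problem text that a downstream equation decoder would consume. The checkpoint name `bert-base-uncased` is a placeholder rather than the authors' released MWP-BERT weights, and the equation decoder (e.g., a goal-driven tree decoder) is omitted.

```python
# Sketch only: encode an MWP with a generic BERT encoder; the equation
# decoder that maps token states to an equation is not shown.
import torch
from transformers import AutoTokenizer, AutoModel

problem = ("A store sold 3 boxes of apples in the morning and 5 boxes in the "
           "afternoon. Each box holds 12 apples. How many apples were sold?")
# Target equation for this toy example: (3 + 5) * 12

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # placeholder PLM
encoder = AutoModel.from_pretrained("bert-base-uncased")        # placeholder PLM

inputs = tokenizer(problem, return_tensors="pt")
with torch.no_grad():
    # Contextual token representations: shape (1, seq_len, hidden_size)
    token_states = encoder(**inputs).last_hidden_state

# An MWP solver would pass `token_states` to an equation decoder that attends
# to the number tokens ("3", "5", "12") and emits operators and operands.
print(token_states.shape)
```

In an MWP-BERT-style setup, the encoder would additionally be pre-trained with number-aware objectives so that these token states align text spans with the mathematical logic they express, rather than relying on free-form-text pre-training alone.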


