World Models for Math Story Problems

by Andreas Opedal et al.

Solving math story problems is a complex task for students and NLP models alike, requiring them to understand the world as described in the story and reason over it to compute an answer. Recent years have seen impressive performance on automatically solving these problems with large pre-trained language models and innovative techniques to prompt them. However, it remains unclear whether these models possess accurate representations of mathematical concepts. This leads to a lack of interpretability and trustworthiness, which impedes their usefulness in various applications. In this paper, we consolidate previous work on categorizing and representing math story problems and develop MathWorld, a graph-based semantic formalism specific to the domain of math story problems. With MathWorld, we can assign world models to math story problems that represent the situations and actions introduced in the text and their mathematical relationships. We combine math story problems from several existing datasets and annotate a corpus of 1,019 problems and 3,204 logical forms with MathWorld. Using this data, we demonstrate the following use cases of MathWorld: (1) prompting language models with synthetically generated question-answer pairs to probe their reasoning and world-modeling abilities, and (2) generating new problems by using the world models as a design space.
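To make the idea of a structured "world model" for a story problem concrete, here is a minimal illustrative sketch. The container/transfer node names below are assumptions for illustration only, not MathWorld's actual schema; the paper's formalism is richer than this toy example.

```python
# Toy graph-style world model for a math story problem.
# Nodes are containers (an owner holding a quantity of some entity);
# edges are transfer actions between containers. All names here are
# illustrative assumptions, not the MathWorld formalism itself.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Container:
    entity: str              # e.g. "apples"
    owner: str               # e.g. "Alice"
    quantity: Optional[int]  # None if the quantity is unknown

@dataclass
class Transfer:
    source: str  # owner giving the amount
    target: str  # owner receiving the amount
    amount: int

def solve(containers, transfers, query_owner):
    """Apply each transfer edge to the container state, then answer a
    'how many does X have?' query over the resulting world state."""
    state = {c.owner: c.quantity for c in containers}
    for t in transfers:
        state[t.source] -= t.amount
        state[t.target] += t.amount
    return state[query_owner]

# "Alice has 5 apples. Bob has 3. Alice gives 2 apples to Bob.
#  How many apples does Bob have now?"
containers = [Container("apples", "Alice", 5), Container("apples", "Bob", 3)]
transfers = [Transfer("Alice", "Bob", 2)]
print(solve(containers, transfers, "Bob"))  # 5
```

A structured state like this is what makes the two use cases in the abstract possible: one can probe a language model by querying intermediate states, or generate new problems by sampling different graphs and rendering them back into text.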


