The Neural Network Pushdown Automaton: Model, Stack and Learning Simulations

11/15/2017
by G. Z. Sun, et al.

In order for neural networks to learn complex languages or grammars, they must have sufficient computational power or resources to recognize or generate such languages. Though many approaches have been discussed, one obvious approach to enhancing the processing power of a recurrent neural network is to couple it with an external stack memory, in effect creating a neural network pushdown automaton (NNPDA). This paper discusses the NNPDA in detail: its construction, how it can be trained, and how useful symbolic information can be extracted from the trained network. In order to couple the external stack to the neural network, an optimization method is developed which uses an error function that connects the learning of the state automaton of the neural network to the learning of the operation of the external stack. To minimize the error function using gradient descent learning, an analog stack is designed such that the action and storage of information in the stack are continuous. One interpretation of a continuous stack is the probabilistic storage of, and action on, data. After training on sample strings of an unknown source grammar, a quantization procedure extracts from the analog stack and neural network a discrete pushdown automaton (PDA). Simulations show that in learning deterministic context-free grammars (the balanced-parenthesis language, 1^n 0^n, and the deterministic palindrome language) the extracted PDA is correct in the sense that it can correctly recognize unseen strings of arbitrary length. In addition, the extracted PDAs can be shown to be identical or equivalent to the PDAs of the source grammars which were used to generate the training strings.
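
The continuous stack admits a simple probabilistic reading: each stored symbol carries a real-valued strength, pushes and pops move fractional amounts of strength, and a read returns a strength-weighted blend of the topmost symbols. The sketch below is only an illustration of that idea in Python; the class and method names (ContinuousStack, push, pop, read) are hypothetical and the details are assumptions, not the authors' exact formulation.

```python
import numpy as np

class ContinuousStack:
    """Analog stack sketch: each element is a (symbol vector, strength) pair.
    A strength in (0, 1] can be read as the probability that the element is
    'really' on the stack, so push/pop amounts are continuous quantities."""

    def __init__(self, symbol_dim):
        self.symbol_dim = symbol_dim
        self.symbols = []     # list of np.ndarray, bottom -> top
        self.strengths = []   # parallel list of floats in (0, 1]

    def push(self, symbol, strength):
        """Store `symbol` on top with a continuous strength."""
        if strength > 0.0:
            self.symbols.append(np.asarray(symbol, dtype=float))
            self.strengths.append(float(strength))

    def pop(self, strength):
        """Remove a total of `strength` units, working down from the top."""
        remaining = float(strength)
        while remaining > 0.0 and self.strengths:
            top = self.strengths[-1]
            if top <= remaining:
                remaining -= top
                self.symbols.pop()
                self.strengths.pop()
            else:
                self.strengths[-1] = top - remaining
                remaining = 0.0

    def read(self, depth=1.0):
        """Return a strength-weighted blend of the topmost `depth` units,
        i.e. the expected top symbol under the probabilistic reading."""
        result = np.zeros(self.symbol_dim)
        remaining = float(depth)
        for sym, s in zip(reversed(self.symbols), reversed(self.strengths)):
            take = min(s, remaining)
            result += take * sym
            remaining -= take
            if remaining <= 0.0:
                break
        return result

# Example: two symbols ("(" and ")") encoded as one-hot vectors.
stack = ContinuousStack(symbol_dim=2)
stack.push([1.0, 0.0], strength=0.8)   # fairly certain push of "("
stack.push([0.0, 1.0], strength=0.3)   # weak push of ")"
print(stack.read())                    # blend: 0.3*[0,1] + 0.7*[1,0]
stack.pop(0.5)                         # continuous pop of half a unit
```

In an NNPDA-style setup, the recurrent controller would emit the push/pop strengths (and the symbol to push) at every time step, so the stack contents remain a differentiable function of the network weights; after training, rounding the strengths toward 0 or 1 and quantizing the hidden states is what allows a discrete PDA to be read off, as described above.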

Related research

09/07/2019 · The Neural State Pushdown Automata
In order to learn complex grammars, recurrent neural networks (RNNs) req...

06/05/2020 · Provably Stable Interpretable Encodings of Context Free Grammars in RNNs with a Differentiable Stack
Given a collection of strings belonging to a context free grammar (CFG) ...

10/04/2022 · The Surprising Computational Power of Nondeterministic Stack RNNs
Traditional recurrent neural networks (RNNs) have a fixed, finite number...

05/04/2021 · Reservoir Stack Machines
Memory-augmented neural networks equip a recurrent neural network with a...

02/02/2019 · Parametric FEM for Shape Optimization applied to Golgi Stack
The thesis is about an application of the shape optimization to the morp...

06/18/2020 · Stability of Internal States in Recurrent Neural Networks Trained on Regular Languages
We provide an empirical study of the stability of recurrent neural netwo...

11/09/2022 · Automated MRI Field of View Prescription from Region of Interest Prediction by Intra-stack Attention Neural Network
Manual prescription of the field of view (FOV) by MRI technologists is v...
