Grokking modular arithmetic

01/06/2023
by Andrey Gromov

We present a simple neural network that can learn modular arithmetic tasks and exhibits a sudden jump in generalization known as “grokking”. Concretely, we present (i) fully-connected two-layer networks that exhibit grokking on various modular arithmetic tasks under vanilla gradient descent with the MSE loss function in the absence of any regularization; (ii) evidence that grokking modular arithmetic corresponds to learning specific feature maps whose structure is determined by the task; (iii) analytic expressions for the weights – and thus for the feature maps – that solve a large class of modular arithmetic tasks; and (iv) evidence that these feature maps are also found by vanilla gradient descent as well as AdamW, thereby establishing complete interpretability of the representations learnt by the network.
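
The paper derives the exact weights; as a rough, non-authoritative illustration of why periodic feature maps of this kind can implement modular addition, suppose the first layer computes cosine features at frequencies 2πk/p with random phases and the activation is quadratic. Squaring the summed features of the two inputs n and m produces a cross term at the sum frequency,

\[ 2\cos\!\left(\tfrac{2\pi k n}{p}+\phi_k\right)\cos\!\left(\tfrac{2\pi k m}{p}+\psi_k\right) = \cos\!\left(\tfrac{2\pi k (n+m)}{p}+\phi_k+\psi_k\right) + \cos\!\left(\tfrac{2\pi k (n-m)}{p}+\phi_k-\psi_k\right), \]

and a cosine readout at the same frequencies can then pick out the residue q = (n + m) mod p, since

\[ \sum_{k=0}^{p-1}\cos\!\left(\tfrac{2\pi k a}{p}\right) = p\,\delta_{a \bmod p,\,0}. \]

Below is a minimal training sketch in PyTorch of the setup described in point (i): a two-layer fully-connected network trained on modular addition with MSE loss and plain full-batch gradient descent, with no regularization. The modulus p = 97, hidden width, learning rate, train fraction, step count, one-hot encoding, and quadratic activation are illustrative assumptions for this sketch, not values taken from the paper.

import torch
import torch.nn.functional as F

torch.manual_seed(0)

p = 97            # modulus (assumed; a common choice in grokking experiments)
width = 512       # hidden width (assumed)
frac_train = 0.5  # fraction of all p^2 pairs used for training (assumed)
lr = 0.5          # learning rate for vanilla gradient descent (assumed)

# Every pair (n, m) labelled with (n + m) mod p; inputs are one-hot encoded.
pairs = torch.cartesian_prod(torch.arange(p), torch.arange(p))
labels = (pairs[:, 0] + pairs[:, 1]) % p
x = torch.cat([F.one_hot(pairs[:, 0], p), F.one_hot(pairs[:, 1], p)], 1).float()
y = F.one_hot(labels, p).float()

perm = torch.randperm(len(x))
n_train = int(frac_train * len(x))
train, test = perm[:n_train], perm[n_train:]

# Two-layer fully-connected network f(x) = phi(x W1) W2 with phi(z) = z^2,
# the quadratic activation that makes the cosine construction above exact.
W1 = (torch.randn(2 * p, width) / (2 * p) ** 0.5).requires_grad_()
W2 = (torch.randn(width, p) / width ** 0.5).requires_grad_()

for step in range(20001):
    loss = F.mse_loss(((x[train] @ W1) ** 2) @ W2, y[train])
    loss.backward()
    with torch.no_grad():
        for w in (W1, W2):
            w -= lr * w.grad  # plain full-batch gradient descent, no weight decay
            w.grad = None
    if step % 2000 == 0:
        with torch.no_grad():
            pred = (((x[test] @ W1) ** 2) @ W2).argmax(1)
            acc = (pred == labels[test]).float().mean().item()
        print(f"step {step:6d}  train MSE {loss.item():.4f}  test acc {acc:.3f}")

With a setup like this, the training loss typically falls long before test accuracy moves; the delayed jump in test accuracy, well after the training set is fit, is the "grokking" signature the abstract refers to.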
