A network that learns Strassen multiplication

01/26/2016
by Veit Elser, et al.

We study neural networks whose only non-linear components are multipliers, to test a new training rule in a context where the precise representation of data is paramount. These networks are challenged to discover the rules of matrix multiplication, given many examples. By limiting the number of multipliers, the network is forced to discover the Strassen multiplication rules. This is the mathematical equivalent of finding low-rank decompositions of the n × n matrix multiplication tensor, M_n. We train these networks with the conservative learning rule, which makes minimal changes to the weights so as to give the correct output for each input at the time the input-output pair is received. Conservative learning needs a few thousand examples to find the rank-7 decomposition of M_2, and 10^5 for the rank-23 decomposition of M_3 (the lowest known). High precision is critical, especially for M_3, to discriminate between true decompositions and "border approximations".
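To make the target of the learning task concrete, the sketch below hard-codes Strassen's classical rank-7 decomposition of M_2 as a bilinear circuit whose only non-linear operations are seven elementwise multiplications: each product multiplies one linear combination of A's entries by one linear combination of B's entries, and the outputs are linear combinations of the products. The coefficient matrices U, V, W here are the standard Strassen coefficients, not weights learned by the paper's conservative learning rule; finding an equivalent set of weights is exactly what the multiplier network is trained to do.

```python
import numpy as np

# Strassen's rank-7 decomposition of the 2x2 matrix multiplication tensor M_2,
# expressed as three coefficient matrices. A network whose only non-linearities
# are multipliers computes exactly this bilinear form.

# Rows of U act on vec(A) = [a11, a12, a21, a22]
U = np.array([[ 1, 0, 0,  1],
              [ 0, 0, 1,  1],
              [ 1, 0, 0,  0],
              [ 0, 0, 0,  1],
              [ 1, 1, 0,  0],
              [-1, 0, 1,  0],
              [ 0, 1, 0, -1]])

# Rows of V act on vec(B) = [b11, b12, b21, b22]
V = np.array([[ 1, 0, 0,  1],
              [ 1, 0, 0,  0],
              [ 0, 1, 0, -1],
              [-1, 0, 1,  0],
              [ 0, 0, 0,  1],
              [ 1, 1, 0,  0],
              [ 0, 0, 1,  1]])

# Rows of W map the 7 products back to vec(C) = [c11, c12, c21, c22]
W = np.array([[1,  0, 0, 1, -1, 0, 1],
              [0,  0, 1, 0,  1, 0, 0],
              [0,  1, 0, 1,  0, 0, 0],
              [1, -1, 1, 0,  0, 1, 0]])

def strassen_2x2(A, B):
    """Multiply two 2x2 matrices using only 7 scalar multiplications."""
    products = (U @ A.ravel()) * (V @ B.ravel())  # the 7 multiplier units
    return (W @ products).reshape(2, 2)

# Check against ordinary matrix multiplication on random inputs.
rng = np.random.default_rng(0)
A, B = rng.standard_normal((2, 2)), rng.standard_normal((2, 2))
assert np.allclose(strassen_2x2(A, B), A @ B)
```

In tensor language, the rows of U, V, and W are the factors u_r, v_r, w_r of a rank-7 decomposition M_2 = sum_{r=1}^{7} u_r ⊗ v_r ⊗ w_r; limiting the network to 7 multipliers forces it to recover a decomposition of this kind.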
