ReLU activated Multi-Layer Neural Networks trained with Mixed Integer Linear Programs
This paper is a case study demonstrating that, in principle, multi-layer feedforward neural networks with ReLU activations can be iteratively trained with Mixed Integer Linear Programs. To this end, two simple networks were trained with a backpropagation-like algorithm on the MNIST dataset of handwritten digits.
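To make the core idea concrete, the sketch below (not taken from the paper; the toy dataset, big-M constant, and variable bounds are illustrative assumptions) shows the standard big-M mixed-integer encoding of a ReLU activation, y = max(0, wx + b), and uses the open-source PuLP library with its bundled CBC solver to fit the weight and bias of a single neuron. Training a full multi-layer network, as the paper does, additionally requires iterating layer by layer so that each subproblem stays linear.

    # Minimal sketch: fit one ReLU neuron y = max(0, w*x + b) via MILP.
    # Dataset, big-M value, and bounds are illustrative assumptions.
    from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

    # Toy 1-D training data (assumed): targets follow max(0, 2x - 1).
    data = [(-1.0, 0.0), (0.0, 0.0), (0.5, 0.0), (1.0, 1.0), (2.0, 3.0)]
    M = 100.0  # big-M constant; must upper-bound all pre-activations

    prob = LpProblem("relu_neuron_fit", LpMinimize)
    w = LpVariable("w", -10, 10)  # weight (decision variable)
    b = LpVariable("b", -10, 10)  # bias (decision variable)

    errs = []
    for i, (x, t) in enumerate(data):
        y = LpVariable(f"y_{i}", 0)             # post-activation, y >= 0
        a = LpVariable(f"a_{i}", cat=LpBinary)  # indicator: 1 iff neuron active
        e = LpVariable(f"e_{i}", 0)             # absolute error |y - t|
        z = w * x + b                           # pre-activation, linear in w, b
        # Big-M encoding of y = max(0, z):
        prob += y >= z                 # y dominates the pre-activation
        prob += y <= z + M * (1 - a)   # if a = 1, force y = z
        prob += y <= M * a             # if a = 0, force y = 0
        # Linearization of the absolute training error:
        prob += e >= y - t
        prob += e >= t - y
        errs.append(e)

    prob += lpSum(errs)  # objective: total L1 training error
    prob.solve()
    print("w =", value(w), "b =", value(b), "loss =", value(prob.objective))

On this toy data the solver can reach zero loss (e.g. w = 2, b = -1): the binary variables select, per sample, whether the neuron is in its active (y = z) or inactive (y = 0) regime, which is exactly what makes the non-smooth ReLU expressible as linear constraints.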