ReLU activated Multi-Layer Neural Networks trained with Mixed Integer Linear Programs

08/19/2020
by Steffen Goebbels et al.

This paper is a case study demonstrating that, in principle, multi-layer feedforward neural networks with ReLU activation functions can be trained iteratively with Mixed Integer Linear Programs (MILPs). To this end, two simple networks were trained with a backpropagation-like algorithm on the MNIST dataset of handwritten digits.
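The paper's exact formulation is not reproduced here, but the core idea of encoding a ReLU activation inside a MILP can be illustrated with a small, hypothetical sketch: when the layer inputs are fixed data, the pre-activation is linear in the weights and bias, and the ReLU nonlinearity can be modeled with a binary indicator variable and big-M constraints. The variable bounds, big-M value, and L1 training objective below are illustrative assumptions, not the authors' implementation (requires numpy and pulp).

```python
# Hypothetical sketch (not the authors' implementation): fit the weights of a
# single ReLU neuron to target activations with a Mixed Integer Linear Program.
# With the inputs X fixed, the pre-activation w.x + b is linear in (w, b); the
# ReLU a = max(0, w.x + b) is encoded with a binary indicator and big-M bounds.
import numpy as np
import pulp

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))                                  # 20 toy samples, 3 inputs
y = np.maximum(0.0, X @ np.array([1.0, -2.0, 0.5]) + 0.3)     # targets from a hidden ReLU

n_samples, n_inputs = X.shape
M = 100.0                                                     # big-M bound (assumed valid)

prob = pulp.LpProblem("relu_neuron_fit", pulp.LpMinimize)
w = [pulp.LpVariable(f"w{j}", lowBound=-10, upBound=10) for j in range(n_inputs)]
b = pulp.LpVariable("b", lowBound=-10, upBound=10)
a = [pulp.LpVariable(f"a{i}", lowBound=0) for i in range(n_samples)]    # ReLU outputs
z = [pulp.LpVariable(f"z{i}", cat="Binary") for i in range(n_samples)]  # active/inactive
e = [pulp.LpVariable(f"e{i}", lowBound=0) for i in range(n_samples)]    # |a_i - y_i|

for i in range(n_samples):
    pre = pulp.lpSum(w[j] * float(X[i, j]) for j in range(n_inputs)) + b
    prob += a[i] >= pre                    # a_i >= pre_i (a_i >= 0 via lowBound)
    prob += a[i] <= pre + M * (1 - z[i])   # if z_i = 1, then a_i = pre_i
    prob += a[i] <= M * z[i]               # if z_i = 0, then a_i = 0
    prob += e[i] >= a[i] - float(y[i])     # linearised absolute error
    prob += e[i] >= float(y[i]) - a[i]

prob += pulp.lpSum(e)                      # minimise total L1 training error
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("weights:", [v.value() for v in w], "bias:", b.value())
```

Presumably, the backpropagation-like algorithm mentioned in the abstract repeats MILP solves of this kind layer by layer while propagating target values through the network; the actual iterative scheme is given in the full text.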

Related research

- ReLU Networks as Surrogate Models in Mixed-Integer Linear Programs (07/06/2019): We consider the embedding of piecewise-linear deep neural networks (ReLU...
- Clustering Learning for Robotic Vision (01/13/2013): We present the clustering learning technique applied to multi-layer feed...
- Certified Invertibility in Neural Networks via Mixed-Integer Programming (01/27/2023): Neural networks are notoriously vulnerable to adversarial attacks – smal...
- Deep Neural Networks as 0-1 Mixed Integer Linear Programs: A Feasibility Study (12/17/2017): Deep Neural Networks (DNNs) are very popular these days, and are the sub...
- Optimizing Objective Functions from Trained ReLU Neural Networks via Sampling (05/27/2022): This paper introduces scalable, sampling-based algorithms that optimize ...
- Identifying Critical Neurons in ANN Architectures using Mixed Integer Programming (02/17/2020): We introduce a novel approach to optimize the architecture of deep neura...
- A Mixed Integer Programming Approach for Verifying Properties of Binarized Neural Networks (03/11/2022): Many approaches for verifying input-output properties of neural networks...
