SkelEx and BoundEx: Natural Visualization of ReLU Neural Networks

05/09/2023
by Pawel Pukowski, et al.

Despite their limited interpretability, weights and biases are still the most popular encoding of the functions learned by ReLU Neural Networks (ReLU NNs). That is why we introduce SkelEx, an algorithm that extracts a skeleton of the membership functions learned by ReLU NNs, making those functions easier to interpret and analyze. To the best of our knowledge, this is the first work that considers linear regions from the perspective of critical points. As a natural follow-up, we also introduce BoundEx, which is the first analytical method known to us that extracts the decision boundary from the realization of a ReLU NN. Together, these methods provide a very natural visualization tool for ReLU NNs trained on low-dimensional data.
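The abstract's notion of linear regions can be illustrated with a small sketch. This is not the authors' SkelEx algorithm; it is a minimal, hypothetical example showing the standard fact the paper builds on: a ReLU network's realization is piecewise linear, and inputs that induce the same hidden-unit on/off (activation) pattern lie in the same linear region. The network sizes, weights, and grid range below are illustrative assumptions.

```python
import numpy as np

# Hypothetical tiny ReLU network: 2 inputs -> 3 hidden units -> 1 output.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((3, 2)); b1 = rng.standard_normal(3)
W2 = rng.standard_normal((1, 3)); b2 = rng.standard_normal(1)

def activation_pattern(x):
    """Binary on/off pattern of the hidden ReLUs at input x.
    Inputs sharing a pattern lie in the same linear region,
    where the network computes a single affine function."""
    pre = W1 @ x + b1
    return tuple(pre > 0)

# Sample a grid over [-2, 2]^2 and count the distinct linear
# regions the grid points fall into (at most 2**3 = 8 patterns
# are possible with 3 hidden units).
xs = np.linspace(-2.0, 2.0, 50)
patterns = {activation_pattern(np.array([x, y])) for x in xs for y in xs}
print(len(patterns))
```

Within each such region the network output is affine, so the decision boundary restricted to that region is a segment of a hyperplane; an analytical method like BoundEx can exploit this structure rather than sampling, though the actual algorithm is described in the full paper.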
