An Embedding of ReLU Networks and an Analysis of their Identifiability

07/20/2021
by   Pierre Stock, et al.

Neural networks with the Rectified Linear Unit (ReLU) nonlinearity are described by a vector of parameters θ and realized as a piecewise linear, continuous function R_θ: x ∈ ℝ^d ↦ R_θ(x) ∈ ℝ^k. Natural scaling and permutation operations on the parameters θ leave the realization unchanged, leading to equivalence classes of parameters that yield the same realization. These considerations in turn lead to the notion of identifiability – the ability to recover (the equivalence class of) θ from the sole knowledge of its realization R_θ. The overall objective of this paper is to introduce an embedding Φ(θ) for ReLU neural networks of any depth that is invariant to scalings and that provides a locally linear parameterization of the realization of the network. Leveraging these two key properties, we derive conditions under which a deep ReLU network is indeed locally identifiable from the knowledge of its realization on a finite set of samples x_i ∈ ℝ^d. We study the shallow case in more depth, establishing necessary and sufficient conditions for the network to be identifiable from a bounded subset 𝒳 ⊆ ℝ^d.
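The scaling invariance mentioned above can be illustrated concretely. The following sketch (using NumPy; the network dimensions and weights are arbitrary, not from the paper) builds a shallow ReLU network and checks that rescaling each hidden neuron's incoming weights and bias by a positive factor λ, while dividing its outgoing weights by λ, leaves the realization R_θ unchanged, since relu(λz) = λ·relu(z) for λ > 0:

```python
import numpy as np

rng = np.random.default_rng(0)

# A shallow ReLU network: R_theta(x) = W2 @ relu(W1 @ x + b1) + b2
d, h, k = 3, 5, 2                         # input, hidden, output dimensions
W1, b1 = rng.normal(size=(h, d)), rng.normal(size=h)
W2, b2 = rng.normal(size=(k, h)), rng.normal(size=k)

def realize(W1, b1, W2, b2, x):
    """Evaluate the piecewise linear realization of the network at x."""
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# Positive rescaling of each hidden neuron i: multiply its incoming
# weights and bias by lam[i] > 0, divide its outgoing weights by lam[i].
lam = rng.uniform(0.5, 2.0, size=h)
W1s, b1s = W1 * lam[:, None], b1 * lam
W2s = W2 / lam[None, :]

# The two parameter vectors are different, yet the realization agrees.
x = rng.normal(size=d)
assert np.allclose(realize(W1, b1, W2, b2, x),
                   realize(W1s, b1s, W2s, b2, x))
```

This is exactly why identifiability can only hold up to such equivalence classes: distinct parameters θ and θ′ related by positive rescalings (or by permuting hidden neurons) produce identical realizations.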

Related research

08/12/2021 · On minimal representations of shallow ReLU networks
The realization function of a shallow ReLU network is a continuous and p...

05/09/2023 · SkelEx and BoundEx: Natural Visualization of ReLU Neural Networks
Despite their limited interpretability, weights and biases are still the...

06/15/2022 · Local Identifiability of Deep ReLU Neural Networks: the Theory
Is a sample rich enough to determine, at least locally, the parameters o...

07/27/2021 · Convergence of Deep ReLU Networks
We explore convergence of deep neural networks with the popular ReLU act...

03/24/2021 · On a realization of motion and similarity group equivalence classes of labeled points in ℝ^k with applications to computer vision
We study a realization of motion and similarity group equivalence classe...

12/24/2021 · Parameter identifiability of a deep feedforward ReLU neural network
The possibility for one to recover the parameters-weights and biases-of ...
