Expressive power of binary and ternary neural networks

06/27/2022
by Aleksandr Beknazaryan, et al.

We show that deep sparse ReLU networks with ternary weights and deep ReLU networks with binary weights can approximate β-Hölder functions on [0,1]^d. Moreover, for any interval [a,b)⊂ℝ, continuous functions on [0,1]^d can be approximated by networks of depth 2 with the binary activation function 1_[a,b).
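To make the second claim concrete, below is a minimal numpy sketch (not from the paper) of a depth-2 construction for d = 1: each hidden unit with activation 1_[a,b) fires exactly on one bin of a uniform partition of [0,1], and the output layer reads off the target's value at the bin midpoint, so the sup-norm error shrinks as the number of bins grows. All names and parameters here (binary_activation, depth2_approximation, n_bins, the choice a = 0.25, b = 0.75) are illustrative assumptions, not notation from the paper, and the paper's ternary/binary-weight results for β-Hölder functions use a more involved construction not reproduced here.

import numpy as np

def binary_activation(t, a, b):
    # The activation 1_[a,b): outputs 1 if a <= t < b, else 0.
    return ((t >= a) & (t < b)).astype(float)

def depth2_approximation(f, n_bins, a, b):
    # For the partition of [0,1] into bins [k/N, (k+1)/N), the affine map
    #   t_k(x) = (b - a) * N * x + (a - (b - a) * k)
    # lands in [a, b) iff x lies in bin k, so hidden unit k indicates bin k.
    k = np.arange(n_bins)
    w = np.full(n_bins, (b - a) * n_bins)   # hidden-layer weights
    c = a - (b - a) * k                     # hidden-layer biases
    v = f((k + 0.5) / n_bins)               # output weights: f at bin midpoints
    return w, c, v

def forward(x, w, c, v, a, b):
    # Evaluate the depth-2 network at inputs x in [0, 1).
    hidden = binary_activation(np.outer(x, w) + c, a, b)   # shape (len(x), N)
    return hidden @ v

f = lambda x: np.sin(2 * np.pi * x)          # a continuous target on [0,1]
a, b = 0.25, 0.75                            # any interval [a,b) works
w, c, v = depth2_approximation(f, n_bins=64, a=a, b=b)
x = np.linspace(0.0, 0.999, 500)
err = np.max(np.abs(forward(x, w, c, v, a, b) - f(x)))
print(f"sup-norm error with 64 bins: {err:.4f}")   # shrinks as n_bins grows

Since exactly one hidden unit fires for each x, the network computes a piecewise-constant approximation of f, and the sup-norm error is bounded by the modulus of continuity of f over a bin of width 1/N.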
