Expressive power of binary and ternary neural networks
We show that deep sparse ReLU networks with ternary weights and deep ReLU networks with binary weights can approximate β-Hölder functions on [0,1]^d. We also show that, for any interval [a,b) ⊂ ℝ, continuous functions on [0,1]^d can be approximated by networks of depth 2 with the binary activation function 1_{[a,b)}.
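To make the depth-2 claim concrete, here is a minimal sketch (not the paper's construction) of how a single hidden layer with the binary activation 1_{[a,b)} can approximate a continuous function on [0,1] by a step function: hidden unit k fires exactly on the bin [k/K, (k+1)/K), and the output layer weights each bin by a sample of the target function. The names `binary_activation` and `depth2_approximation` are hypothetical, chosen for illustration.

```python
import numpy as np

def binary_activation(x, a=0.0, b=1.0):
    """Binary activation sigma(x) = 1_{[a,b)}(x): 1 if a <= x < b, else 0."""
    return ((x >= a) & (x < b)).astype(float)

def depth2_approximation(f, K):
    """Build a depth-2 network x -> sum_k f(k/K) * sigma(K*x - k).

    With sigma = 1_{[0,1)}, hidden unit k fires exactly when
    x lies in [k/K, (k+1)/K), so the network outputs f(k/K) on that
    bin -- a piecewise-constant approximation whose uniform error is
    controlled by the modulus of continuity of f at scale 1/K.
    """
    ks = np.arange(K)
    coeffs = f(ks / K)  # output-layer weights: samples of f on the grid
    def net(x):
        x = np.atleast_1d(x)
        # Hidden layer: pre-activations K*x - k, then binary activation.
        h = binary_activation(K * x[:, None] - ks[None, :])
        return h @ coeffs
    return net

# Usage: approximate f(x) = sin(2*pi*x) with K = 100 hidden units.
f = lambda x: np.sin(2 * np.pi * x)
net = depth2_approximation(f, K=100)
xs = np.linspace(0, 0.999, 1000)
print("max error:", np.max(np.abs(net(xs) - f(xs))))  # roughly 2*pi/100
```

This illustrates only the one-dimensional case under an assumed Lipschitz bound; the paper's result covers continuous functions on [0,1]^d and any interval [a,b).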