Revisit Fuzzy Neural Network: Demystifying Batch Normalization and ReLU with Generalized Hamming Network

10/27/2017
by Lixin Fan, et al.

We revisit fuzzy neural networks with the cornerstone notion of generalized Hamming distance, which provides a novel and theoretically justified framework for re-interpreting many useful neural network techniques in terms of fuzzy logic. In particular, we conjecture and empirically illustrate that the celebrated batch normalization (BN) technique actually adapts the normalized bias so that it approximates the rightful bias induced by the generalized Hamming distance. Once this due bias is enforced analytically, neither the optimization of bias terms nor sophisticated batch normalization is needed. Also in the light of generalized Hamming distance, the popular rectified linear unit (ReLU) can be treated as setting a minimal Hamming distance threshold between network inputs and weights. This thresholding scheme, on the one hand, can be improved by introducing double thresholding on both extremes of neuron outputs. On the other hand, ReLUs turn out to be non-essential and can be removed from networks trained for simple tasks such as MNIST classification. The proposed generalized Hamming network (GHN) thus not only lends itself to rigorous analysis and interpretation within fuzzy logic theory but also demonstrates fast learning, well-controlled behaviour and state-of-the-art performance on a variety of learning tasks.
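The core quantities from the abstract can be sketched in a few lines of Python, assuming the fuzzy-XOR form g(a, b) = a + b − 2ab for the generalized Hamming distance; the `double_threshold` helper and its cutoff parameters are a hypothetical illustration of the double-thresholding idea, not the authors' exact scheme.

```python
def ghd(a, b):
    """Generalized Hamming distance between two fuzzy values in [0, 1],
    assuming the fuzzy-XOR form g(a, b) = a + b - 2*a*b.
    For crisp bits (0 or 1) it reduces to the ordinary Hamming distance."""
    return a + b - 2 * a * b

def neuron_ghd(x, w):
    """Sum of element-wise GHDs between an input vector and a weight vector.

    Expanding the sum gives  -2*(w . x) + sum(x) + sum(w),  an affine
    function of the usual dot product -- which is why an analytically
    derived ("rightful") bias can stand in for a learned bias term."""
    return sum(ghd(xi, wi) for xi, wi in zip(x, w))

def double_threshold(s, t_low=0.0, t_high=1.0):
    """Hypothetical double-thresholding: clamp a neuron output at both
    extremes, instead of ReLU's single cutoff at zero."""
    return max(t_low, min(s, t_high))

# Crisp inputs recover ordinary Hamming distance:
assert ghd(0.0, 1.0) == 1.0 and ghd(1.0, 1.0) == 0.0

# The summed GHD agrees with its affine expansion:
x, w = [0.0, 1.0, 0.5], [1.0, 1.0, 0.5]
dot = sum(xi * wi for xi, wi in zip(x, w))
assert neuron_ghd(x, w) == -2 * dot + sum(x) + sum(w)
```

For fuzzy values strictly inside (0, 1), `ghd` stays in (0, 1) as well, which is what makes the dot-product-plus-bias reading of a neuron compatible with fuzzy logic.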


