Batch Normalization in the final layer of generative networks

05/18/2018
by Sean Mullery, et al.

Generative networks have shown great promise in generating photo-realistic images, but the theory surrounding them is still an active research area. Much of the useful work with generative networks relies on heuristics that tend to produce good results. One such heuristic is the advice not to use Batch Normalization in the final layer of the generator network. Many state-of-the-art generative network architectures follow this heuristic, but the reasons given for doing so are inconsistent. This paper shows that this is not necessarily a good heuristic and that Batch Normalization can be beneficial in the final layer of the generator network, either placed before the final non-linear activation (usually a tanh) or replacing the final tanh activation altogether with Batch Normalization and clipping. We show that this can lead to faster training of generator networks by matching the generator output to the mean and standard deviation of the target distribution's image colour values.
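To illustrate the two placements the abstract describes, here is a minimal PyTorch sketch. The framework choice, module names, channel counts, and kernel sizes are all illustrative assumptions, not taken from the paper; it shows one final generator block that applies Batch Normalization before the tanh, and one that replaces the tanh with Batch Normalization followed by clipping to the usual [-1, 1] image range.

    # Illustrative sketch only; layer sizes and names are assumptions,
    # not the authors' implementation.
    import torch
    import torch.nn as nn

    class FinalBlockBNTanh(nn.Module):
        """Variant 1: Batch Normalization placed before the final tanh."""
        def __init__(self, in_ch: int, out_ch: int = 3):
            super().__init__()
            self.deconv = nn.ConvTranspose2d(in_ch, out_ch, kernel_size=4,
                                             stride=2, padding=1, bias=False)
            self.bn = nn.BatchNorm2d(out_ch)  # normalizes each colour channel

        def forward(self, x):
            return torch.tanh(self.bn(self.deconv(x)))

    class FinalBlockBNClip(nn.Module):
        """Variant 2: Batch Normalization plus clipping replaces the tanh."""
        def __init__(self, in_ch: int, out_ch: int = 3):
            super().__init__()
            self.deconv = nn.ConvTranspose2d(in_ch, out_ch, kernel_size=4,
                                             stride=2, padding=1, bias=False)
            self.bn = nn.BatchNorm2d(out_ch)

        def forward(self, x):
            # Clip to the [-1, 1] image range instead of squashing with tanh.
            return torch.clamp(self.bn(self.deconv(x)), -1.0, 1.0)

In both variants, Batch Normalization's learnable per-channel scale and shift let the generator adjust the mean and standard deviation of its output channels directly, which is the mechanism the abstract credits for faster training.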
