AdaDM: Enabling Normalization for Image Super-Resolution

11/27/2021
by Jie Liu, et al.

Normalization techniques such as Batch Normalization (BN) are a milestone in deep learning: by normalizing the distributions of intermediate layers, they enable faster training and better generalization accuracy. In high-fidelity image Super-Resolution (SR), however, normalization layers are believed to discard range flexibility by normalizing the features, and so they are simply removed from modern SR networks. In this paper, we study this phenomenon quantitatively and qualitatively. We find that the standard deviation of the residual feature shrinks substantially after normalization layers, which causes the performance degradation in SR networks. The standard deviation reflects the amount of variation among pixel values; when this variation becomes smaller, edges become less discriminative for the network to resolve. To address this problem, we propose an Adaptive Deviation Modulator (AdaDM), which adaptively predicts a modulation factor to amplify the pixel deviation. With AdaDM, we apply BN in state-of-the-art SR networks for better generalization performance, while the deviation amplification strategy keeps the edge information in the features distinguishable. As a consequence, SR networks equipped with BN and our AdaDM achieve substantial performance improvements on benchmark datasets. Extensive experiments demonstrate the effectiveness of our method.
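To make the mechanism concrete, below is a minimal PyTorch sketch of how a deviation-modulated normalization layer could look. It assumes the modulation factor is predicted from the per-sample standard deviation of the pre-normalization feature through a learnable affine map on its logarithm; the class names, the log-std parameterization, and the residual-block wiring are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn


class AdaDM(nn.Module):
    """Sketch of an Adaptive Deviation Modulator (assumptions noted above).

    Normalizes the feature with BatchNorm, then rescales it by a factor
    predicted from the input's per-sample standard deviation, so that the
    pixel deviation suppressed by normalization is amplified back.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.norm = nn.BatchNorm2d(channels)
        # Learnable affine map on the log standard deviation; a single
        # scalar factor per sample (an assumption made for illustration).
        self.gamma = nn.Parameter(torch.ones(1))
        self.beta = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Per-sample std over (C, H, W); keepdim=True for broadcasting.
        std = x.std(dim=(1, 2, 3), keepdim=True)
        # Predict a strictly positive modulation factor from log-std.
        factor = torch.exp(self.gamma * torch.log(std + 1e-8) + self.beta)
        return self.norm(x) * factor


class ResBlockWithAdaDM(nn.Module):
    """Residual block that re-enables normalization via AdaDM."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            AdaDM(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.body(x)
```

Placing the modulator after the residual branch's convolutions means the amplified deviation is added back onto the identity path, and training end to end lets gamma and beta learn how much deviation to restore.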
