Multi-bin Trainable Linear Unit for Fast Image Restoration Networks

by Shuhang Gu et al.
ETH Zurich

Tremendous advances in image restoration tasks such as denoising and super-resolution have been achieved using neural networks. Such approaches generally employ very deep architectures, a large number of parameters, large receptive fields, and high nonlinear modeling capacity. Obtaining efficient and fast image restoration networks therefore requires improving on these very requirements. In this paper we propose a novel activation function, the multi-bin trainable linear unit (MTLU), which increases nonlinear modeling capacity while allowing lighter and shallower networks. We validate the proposed fast image restoration networks for image denoising (FDnet) and super-resolution (FSRnet) on standard benchmarks, achieving large improvements in both memory and runtime over the current state of the art at comparable or better PSNR.
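The abstract describes MTLU as a piecewise-linear activation with trainable parameters per bin. The following numpy sketch illustrates that general idea: the input range is split into uniform bins, and bin k applies its own affine map a_k * x + b_k. The function name, parameter names, and the exact bin-indexing scheme are illustrative assumptions, not the paper's implementation (where the slopes and biases would be learned during training).

```python
import numpy as np

def mtlu(x, slopes, biases, bin_width=0.5):
    """Illustrative multi-bin trainable linear unit (sketch, not the paper's code).

    Each input value falls into one of K uniform bins centered around zero;
    bin k applies its own affine map a_k * x + b_k. In a network, `slopes`
    and `biases` would be trainable parameters; here they are plain arrays.
    """
    num_bins = len(slopes)
    # Assign each element to a bin; inputs beyond the covered range are
    # clamped to the outermost bins, so the activation extrapolates linearly.
    idx = np.floor(x / bin_width).astype(int) + num_bins // 2
    idx = np.clip(idx, 0, num_bins - 1)
    return slopes[idx] * x + biases[idx]

# ReLU is the two-bin special case: (a, b) = (0, 0) for x < 0
# and (1, 0) for x >= 0, with one bin boundary at zero.
slopes = np.array([0.0, 1.0])
biases = np.array([0.0, 0.0])
x = np.array([-1.5, -0.2, 0.3, 2.0])
y = mtlu(x, slopes, biases, bin_width=1e6)  # huge width -> single split at 0
```

With more bins and learned parameters, the same mechanism can represent far richer nonlinearities than ReLU at negligible extra cost, which is the source of the modeling capacity the abstract refers to.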




Related papers:

- Stochastic Frequency Masking to Improve Super-Resolution and Denoising Networks
- xUnit: Learning a Spatial Activation Function for Efficient Image Restoration
- MemNet: A Persistent Memory Network for Image Restoration
- Toward DNN of LUTs: Learning Efficient Image Restoration with Multiple Look-Up Tables
- Simple Baselines for Image Restoration
- WIRE: Wavelet Implicit Neural Representations
- Rank-One Network: An Effective Framework for Image Restoration
