SAPE: Spatially-Adaptive Progressive Encoding for Neural Optimization

04/19/2021
by   Amir Hertz, et al.

Multilayer perceptrons (MLPs) are known to struggle with learning high-frequency functions, particularly those with wide frequency bands. We present a spatially-adaptive progressive encoding (SAPE) scheme for the input signals of MLP networks, which enables them to better fit a wide range of frequencies without sacrificing training stability or requiring any domain-specific preprocessing. SAPE gradually unmasks signal components of increasing frequency as a function of both time and space. The progressive exposure of frequencies is monitored by a feedback loop throughout the neural optimization process, allowing changes to propagate at different rates among local spatial portions of the signal space. We demonstrate the advantage of SAPE on a variety of domains and applications, including regression of low-dimensional signals and images, representation learning with occupancy networks, and the geometric task of mesh transfer between 3D shapes.
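The core idea, spatially masking a Fourier-feature encoding so that higher frequency bands open up gradually per location, can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the function names (`frequency_mask`, `progressive_encoding`), the linear per-band ramp, and the per-sample `progress` values standing in for the paper's feedback loop are all assumptions made for the sketch.

```python
import numpy as np

def frequency_mask(progress, num_freqs):
    """Soft gate per frequency band: progress[i] in [0, 1] tracks how far
    optimization has advanced at sample i. Band k ramps open linearly as
    progress passes k / num_freqs, so low frequencies unmask first.
    (Assumed schedule; SAPE's actual feedback-driven schedule differs.)"""
    band = np.arange(num_freqs)[None, :]                       # (1, K)
    return np.clip(progress[:, None] * num_freqs - band, 0.0, 1.0)  # (N, K)

def progressive_encoding(x, num_freqs, mask):
    """Fourier-feature encoding of 1-D coordinates x in [0, 1], with
    mask[i, k] in [0, 1] gating frequency band k at sample i."""
    freqs = (2.0 ** np.arange(num_freqs)) * np.pi              # 2^k * pi
    angles = x[:, None] * freqs[None, :]                       # (N, K)
    feats = np.concatenate([np.sin(angles), np.cos(angles)], axis=1)  # (N, 2K)
    gate = np.concatenate([mask, mask], axis=1)                # same gate for sin/cos
    return feats * gate
```

With a spatially uniform `progress` of 0.5 and four bands, only the two lowest bands are unmasked; in the spatially-adaptive setting, `progress` varies per sample, so harder regions of the signal space can expose high frequencies earlier than easy ones.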


Related research

- PINs: Progressive Implicit Networks for Multi-Scale Neural Representations (02/09/2022)
- MultiWave: Multiresolution Deep Architectures through Wavelet Decomposition for Multivariate Time Series Prediction (06/16/2023)
- Mesh Draping: Parametrization-Free Neural Mesh Transfer (10/11/2021)
- Blind Robust Video Watermarking Based on Adaptive Region Selection and Channel Reference (09/27/2022)
- Procedural band patterns (03/03/2020)
- Analyzing Stability of Convolutional Neural Networks in the Frequency Domain (11/10/2015)
- Progressive Compressed Records: Taking a Byte out of Deep Learning Data (11/01/2019)
