One Reflection Suffice

09/30/2020
by Alexander Mathiasen, et al.

Orthogonal weight matrices are used in many areas of deep learning. Much previous work attempts to alleviate the additional computational resources required to constrain weight matrices to be orthogonal. One popular approach uses *many* Householder reflections. The only practical drawback is that many reflections cause low GPU utilization. We mitigate this final drawback by proving that *one* reflection is sufficient, if the reflection is computed by an auxiliary neural network.
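
To make the idea concrete, here is a minimal sketch (not the authors' implementation; the class name `OneReflectionLinear`, the hidden size, and the auxiliary network architecture are all assumptions) of a layer that applies a single Householder reflection H(v) = I - 2 v v^T / ||v||^2, where the reflection vector v is produced by an auxiliary network:

```python
# Minimal sketch of a single-Householder-reflection layer whose reflection
# vector is predicted by an auxiliary network. Illustrative only; names,
# sizes, and the auxiliary architecture are assumptions, not the paper's code.
import torch
import torch.nn as nn

class OneReflectionLinear(nn.Module):
    def __init__(self, d, hidden=64):
        super().__init__()
        # Hypothetical auxiliary network that outputs the reflection vector v.
        self.aux = nn.Sequential(nn.Linear(d, hidden), nn.ReLU(), nn.Linear(hidden, d))

    def forward(self, x):
        # x: (batch, d). One Householder reflection per input.
        v = self.aux(x)                                   # (batch, d)
        v = v / (v.norm(dim=-1, keepdim=True) + 1e-8)     # unit reflection vector
        # H(v) x = x - 2 (v . x) v, applied without forming the d x d matrix.
        coeff = 2.0 * (x * v).sum(dim=-1, keepdim=True)   # (batch, 1)
        # For a fixed v this is the orthogonal map H(v); since v depends on x,
        # the layer is norm-preserving but nonlinear overall.
        return x - coeff * v

if __name__ == "__main__":
    layer = OneReflectionLinear(d=8)
    x = torch.randn(4, 8)
    y = layer(x)
    # A Householder reflection preserves the Euclidean norm of its input.
    print(torch.allclose(x.norm(dim=-1), y.norm(dim=-1), atol=1e-5))
```

Applying the reflection as x - 2(v . x)v avoids materializing the d x d matrix; the reflection itself costs only O(d) per example on top of whatever the auxiliary network costs.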

Related research

- 05/29/2018: Mirror, Mirror, on the Wall, Who's Got the Clearest Image of Them All? - A Tailored Approach to Single Image Reflection Removal
  Removing reflection artefacts from a single-image is a problem of both t...
- 09/25/2018: Reflection On Reflection In Design Study
  Visualization design study research methodologies emphasize the need for...
- 06/10/2020: Least-Squares Affine Reflection Using Eigen Decomposition
  This note summarizes the steps to computing the best-fitting affine refl...
- 04/18/2020: CWY Parametrization for Scalable Learning of Orthogonal and Stiefel Matrices
  In this paper we propose a new approach for optimization over orthogonal...
- 01/08/2019: Learning with Collaborative Neural Network Group by Reflection
  For the present engineering of neural systems, the preparing of extensiv...
- 10/19/2020: DeepReflecs: Deep Learning for Automotive Object Classification with Radar Reflections
  This paper presents a novel object type classification method for autom...
