On Mean Absolute Error for Deep Neural Network Based Vector-to-Vector Regression

08/12/2020
by Jun Qi, et al.

In this paper, we exploit the properties of mean absolute error (MAE) as a loss function for deep neural network (DNN) based vector-to-vector regression. The goal of this work is two-fold: (i) presenting performance bounds of MAE, and (ii) demonstrating new properties of MAE that make it more appropriate than mean squared error (MSE) as a loss function for DNN-based vector-to-vector regression. First, we show that a generalized upper bound for DNN-based vector-to-vector regression can be ensured by leveraging the known Lipschitz continuity property of MAE. Next, we derive a new generalized upper bound in the presence of additive noise. Finally, in contrast to conventional MSE, which is commonly adopted to approximate Gaussian errors in regression, we show that MAE can be interpreted as an error model following a Laplacian distribution. Speech enhancement experiments are conducted to corroborate our proposed theorems and validate the performance advantages of MAE over MSE for DNN-based regression.
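
As a brief illustration of the probabilistic interpretation mentioned above (stated here in our own notation, not the paper's), minimizing MSE corresponds to maximum-likelihood estimation under a Gaussian error model, whereas minimizing MAE corresponds to maximum-likelihood estimation under a Laplacian error model. Assuming each component of the error vector e_n = y_n - F(x_n) independently follows a zero-mean Laplacian density p(e) = \frac{1}{2b} \exp(-|e|/b), the joint negative log-likelihood over N training vectors of dimension q is

-\log \prod_{n=1}^{N} p(e_n) = \frac{1}{b} \sum_{n=1}^{N} \lVert y_n - F(x_n) \rVert_1 + Nq \log(2b),

which, up to terms independent of the regression function F, is proportional to the MAE loss \frac{1}{N} \sum_{n=1}^{N} \lVert y_n - F(x_n) \rVert_1. The analogous component-wise Gaussian assumption p(e) \propto \exp(-e^2 / (2\sigma^2)) recovers the MSE loss in the same way.

In practice, switching between the two losses is a one-line change. The sketch below is a minimal, hypothetical example (the layer sizes and the 257-dimensional vectors are placeholders, not the paper's speech-enhancement setup), showing a DNN vector-to-vector regressor trained with MAE via PyTorch's nn.L1Loss:

    # Minimal sketch: DNN vector-to-vector regression trained with MAE (L1) loss.
    # Hypothetical toy model and random data, not the paper's experimental setup.
    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(257, 1024), nn.ReLU(),
        nn.Linear(1024, 1024), nn.ReLU(),
        nn.Linear(1024, 257),              # output vector matches the target dimension
    )
    loss_fn = nn.L1Loss()                  # MAE; replace with nn.MSELoss() for the MSE baseline
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    x = torch.randn(32, 257)               # batch of noisy input vectors (random stand-ins)
    y = torch.randn(32, 257)               # corresponding clean target vectors (random stand-ins)

    optimizer.zero_grad()
    loss = loss_fn(model(x), y)            # mean absolute error averaged over all components
    loss.backward()
    optimizer.step()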

Related research

08/04/2020
Analyzing Upper Bounds on Mean Absolute Errors for Deep Neural Network Based Vector-to-Vector Regression
In this paper, we show that, in vector-to-vector regression utilizing de...

02/14/2023
Cauchy Loss Function: Robustness Under Gaussian and Cauchy Noise
In supervised machine learning, the choice of loss function implicitly a...

02/02/2021
Leveraging IoT and Weather Conditions to Estimate the Riders Waiting for the Bus Transit on Campus
The communication technology revolution in this era has increased the us...

04/24/2023
Quantum Machine Learning Approach for the Prediction of Surface Roughness in Additive Manufactured Specimens
Surface roughness is a crucial factor influencing the performance and fu...

12/17/2021
Improving evidential deep learning via multi-task learning
The Evidential regression network (ENet) estimates a continuous target a...

05/13/2023
A note on bounded distance-based information loss metrics for statistical disclosure control of numeric microdata
In the field of statistical disclosure control, the tradeoff between dat...

11/30/2020
Iterative Error Decimation for Syndrome-Based Neural Network Decoders
In this letter, we introduce a new syndrome-based decoder where a deep n...
