Activation Functions for Generalized Learning Vector Quantization - A Performance Comparison

01/17/2019
by   Thomas Villmann, et al.

An appropriate choice of the activation function (such as ReLU, sigmoid, or swish) plays an important role in the performance of (deep) multilayer perceptrons (MLPs) for classification and regression learning. Prototype-based classification learning methods such as (generalized) learning vector quantization (GLVQ) are powerful alternatives. These models also employ activation functions, but here they are applied to the so-called classifier function instead. In this paper we investigate activation functions that are successful for MLPs, apply them in GLVQ, and study their influence on performance.
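
As a rough illustration of the idea described in the abstract (not the authors' code), the sketch below computes the standard GLVQ classifier function, the relative distance difference mu(x) = (d+ - d-)/(d+ + d-), and then passes it through candidate activation functions such as sigmoid and swish. The distance metric, prototype setup, and slope/beta hyperparameters are illustrative assumptions.

import numpy as np

def classifier_function(x, prototypes, labels, x_label):
    """GLVQ classifier function mu(x) = (d_plus - d_minus) / (d_plus + d_minus),
    where d_plus is the squared distance to the closest prototype with the
    correct label and d_minus to the closest prototype with a wrong label."""
    d = np.sum((prototypes - x) ** 2, axis=1)          # squared Euclidean distances
    d_plus = d[labels == x_label].min()                # best matching correct prototype
    d_minus = d[labels != x_label].min()               # best matching incorrect prototype
    return (d_plus - d_minus) / (d_plus + d_minus)     # in [-1, 1]; negative means correct

def sigmoid(mu, slope=10.0):
    """Sigmoid activation applied to mu; 'slope' is an assumed hyperparameter."""
    return 1.0 / (1.0 + np.exp(-slope * mu))

def swish(mu, beta=1.0):
    """Swish activation f(mu) = mu * sigmoid(beta * mu)."""
    return mu / (1.0 + np.exp(-beta * mu))

# Example: per-sample cost contribution under different activations
prototypes = np.array([[0.0, 0.0], [1.0, 1.0]])
labels = np.array([0, 1])
x, y = np.array([0.2, 0.1]), 0
mu = classifier_function(x, prototypes, labels, y)
print(sigmoid(mu), swish(mu))

In GLVQ-style training, the chosen activation is applied to mu(x) inside the cost function, so its shape determines how strongly individual samples contribute to the prototype updates; this is the role the paper compares across activation candidates.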
