The Bregman-Tweedie Classification Model

07/16/2019
by Hyenkyun Woo, et al.

This work proposes the Bregman-Tweedie classification model and analyzes the domain structure of the extended exponential function, an extension of the classic generalized exponential function with an additional scaling parameter, along with related higher-level mathematical structures such as the Bregman-Tweedie loss function and the Bregman-Tweedie divergence. The base function of this divergence is the convex function of Legendre type induced from the extended exponential function. The Bregman-Tweedie loss function of the proposed classification model is the regular Legendre transformation of the Bregman-Tweedie divergence. This loss function is a polynomially parameterized function lying between the unhinged loss and the logistic loss function. The Bregman-Tweedie classification model has two sub-models: H-Bregman, with a hinge-like loss function, and L-Bregman, with a logistic-like loss function. Although the proposed classification model is nonconvex and unbounded, we observe empirically that H-Bregman and L-Bregman outperform logistic regression and the SVM in terms of the Friedman ranking, and achieve reasonable classification accuracy on binary linear classification problems.
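As a point of reference for the loss family described above, the sketch below illustrates the two endpoint losses named in the abstract, the unhinged (linear) loss 1 - m and the logistic loss log(1 + e^{-m}), together with the classic generalized (Tsallis-style) exponential exp_q(t) = [1 + (1-q)t]_+^{1/(1-q)}, which the paper's extended exponential function further extends with a scaling parameter. This is a minimal illustration under those assumptions, not the paper's exact Bregman-Tweedie formulation; the function names and the parameterization are chosen here for illustration only.

```python
import numpy as np

def gen_exp(t, q):
    """Classic generalized (Tsallis) exponential exp_q(t) = [1 + (1-q) t]_+^{1/(1-q)}.

    Recovers the ordinary exponential exp(t) in the limit q -> 1. The paper's
    extended exponential function adds a further scaling parameter; that
    extension is NOT reproduced here.
    """
    if np.isclose(q, 1.0):
        return np.exp(t)
    base = np.maximum(1.0 + (1.0 - q) * t, 0.0)
    return base ** (1.0 / (1.0 - q))

def unhinged_loss(m):
    """Unhinged (linear) margin loss: 1 - m."""
    return 1.0 - m

def logistic_loss(m):
    """Logistic margin loss: log(1 + exp(-m)), computed stably."""
    return np.logaddexp(0.0, -m)

if __name__ == "__main__":
    m = np.linspace(-2.0, 2.0, 5)   # margins y * <w, x>
    print("margin       :", m)
    print("unhinged loss:", unhinged_loss(m))
    print("logistic loss:", logistic_loss(m))
    # Sanity check: exp_q(1) -> e as q -> 1.
    for q in (0.5, 0.9, 0.99, 1.0):
        print(f"exp_q(1.0), q={q}:", gen_exp(1.0, q))
```

The endpoint losses make the interpolation claim concrete: a hinge-like member of the family behaves like the unhinged loss near the margin boundary, while a logistic-like member approaches the smooth logistic loss.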

Related research

04/05/2019 · Logitron: Perceptron-augmented classification model based on an extended logistic loss function
Classification is the most important process in data analysis. However, ...

10/04/2019 · Bregman-divergence-guided Legendre exponential dispersion model with finite cumulants (K-LED)
Exponential dispersion model is a useful framework in machine learning a...

02/12/2019 · A Tunable Loss Function for Binary Classification
We present α-loss, α ∈ [1,∞], a tunable loss function for binary classifi...

05/24/2022 · Soft-SVM Regression For Binary Classification
The binomial deviance and the SVM hinge loss functions are two of the mo...

05/15/2023 · Label Smoothing is Robustification against Model Misspecification
Label smoothing (LS) adopts smoothed targets in classification tasks. Fo...

02/22/2022 · Nonconvex Extension of Generalized Huber Loss for Robust Learning and Pseudo-Mode Statistics
We propose an extended generalization of the pseudo Huber loss formulati...

05/30/2023 · Deep Clustering with Incomplete Noisy Pairwise Annotations: A Geometric Regularization Approach
The recent integration of deep learning and pairwise similarity annotati...
