
Kernel Subspace and Feature Extraction

by Xiangxiang Xu et al.

We study kernel methods in machine learning from the perspective of feature subspaces. We establish a one-to-one correspondence between feature subspaces and kernels and propose an information-theoretic measure for kernels. In particular, we construct a kernel from the Hirschfeld–Gebelein–Rényi maximal correlation functions, coined the maximal correlation kernel, and demonstrate its information-theoretic optimality. We use the support vector machine (SVM) as an example to illustrate a connection between kernel methods and feature extraction approaches. We show that the kernel SVM with the maximal correlation kernel achieves the minimum prediction error. Finally, we interpret the Fisher kernel as a special maximal correlation kernel and establish its optimality.
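For finite alphabets, the HGR maximal correlation functions can be computed from the SVD of the matrix B[x, y] = P(x, y) / sqrt(P(x) P(y)); a kernel in the spirit of the paper's maximal correlation kernel is then k(x, x') = sum_i f_i(x) f_i(x'). The sketch below illustrates this construction; the function names and the example distribution are illustrative, not from the paper.

```python
import numpy as np

def maximal_correlation_features(P):
    """P: joint pmf over finite X x Y, a 2-D array summing to 1.
    Returns (sigmas, F): the maximal correlation coefficients and the
    feature functions f_i(x) (columns of F), excluding the trivial
    constant mode, which always has singular value 1."""
    px = P.sum(axis=1)
    py = P.sum(axis=0)
    B = P / np.sqrt(np.outer(px, py))          # divergence transition matrix
    U, S, _ = np.linalg.svd(B)
    F = U[:, 1:] / np.sqrt(px)[:, None]        # f_i(x) = u_i(x) / sqrt(P(x))
    return S[1:], F

def maximal_correlation_kernel(P):
    """Gram matrix K[x, x'] = sum_i f_i(x) f_i(x') over the X alphabet."""
    _, F = maximal_correlation_features(P)
    return F @ F.T

# Example: a doubly symmetric binary source with crossover 0.2,
# whose HGR maximal correlation is 1 - 2*0.2 = 0.6.
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])
sigmas, F = maximal_correlation_features(P)
K = maximal_correlation_kernel(P)
```

Note that the feature functions are zero-mean and unit-variance under P(x) by construction, and the resulting Gram matrix is invariant to the sign ambiguity of the SVD.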
