Using Multilevel Circulant Matrix Approximation to Speed Up Kernel Logistic Regression

08/19/2021
by Junna Zhang, et al.

Kernel logistic regression (KLR) is a classical nonlinear classifier in statistical machine learning. The Newton method, with its quadratic convergence rate, can solve the KLR problem more effectively than gradient methods. However, an obvious limitation of the Newton method on large-scale problems is its O(n^3) time complexity and O(n^2) space complexity, where n is the number of training instances. In this paper, we employ a multilevel circulant matrix (MCM) to approximate the kernel matrix, which saves storage space and accelerates the solution of KLR. Combining the structure of MCM with a careful algorithmic design, we propose an MCM-approximate Newton iterative method. We first simplify the Newton direction by exploiting the positive semi-definiteness of the kernel matrix, and then apply a two-step MCM approximation to the Newton direction. Our method reduces the time complexity of each iteration to O(n log n) via the multidimensional fast Fourier transform (mFFT), and the built-in periodicity of MCM reduces the space complexity to O(n). Experimental results on several large-scale binary and multi-class classification problems show that our method makes KLR scalable to large problems: it consumes less memory and converges in a shorter time without sacrificing test accuracy.
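
To make the complexity claims concrete, here is a minimal sketch (not the authors' implementation) of the primitive such methods are built on: a circulant matrix is fully determined by its first column, so only O(n) storage is needed, and its matrix-vector product is a circular convolution computable in O(n log n) with the FFT. The one-level case is shown for brevity; a multilevel circulant matrix applies the same identity with the multidimensional transform (e.g., numpy.fft.fftn) over a grid-shaped index set.

```python
import numpy as np

def circulant_matvec(c, v):
    """Compute C @ v, where C is the circulant matrix with first column c.

    Since C = F^{-1} diag(F c) F, the product is ifft(fft(c) * fft(v)):
    O(n) storage (only c is kept) and O(n log n) time, versus O(n^2)
    for an explicit dense product.
    """
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(v)))

# Sanity check against the explicit dense circulant matrix.
rng = np.random.default_rng(0)
n = 8
c = rng.standard_normal(n)
v = rng.standard_normal(n)
C = np.column_stack([np.roll(c, k) for k in range(n)])  # C[i, j] = c[(i - j) % n]
assert np.allclose(C @ v, circulant_matvec(c, v))
```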
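Building on that primitive, the following is a hedged sketch of a matrix-free Newton iteration for binary KLR, written so the kernel product is a pluggable callable. The objective with ridge term (lam/2) a^T K a, the labels t in {0, 1}, and the conjugate-gradient inner solver are illustrative assumptions, not the paper's exact algorithm; the abstract's "simplify the Newton direction" step is modeled here by factoring K out of the Newton system, and the paper's two-step MCM approximation is more elaborate than this.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cg(matvec, b, tol=1e-8, max_iter=200):
    """Matrix-free conjugate gradient for matvec(x) = b (SPD operator)."""
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    if np.sqrt(rs) < tol:
        return x
    for _ in range(max_iter):
        Ap = matvec(p)
        step = rs / (p @ Ap)
        x += step * p
        r -= step * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def klr_newton(kernel_matvec, t, lam=1e-2, n_newton=10):
    """Newton iterations for binary KLR with labels t in {0, 1}.

    Objective: sum_i [log(1 + exp(f_i)) - t_i f_i] + (lam/2) a^T K a,
    with f = K a. Because K is positive semi-definite, any d solving
        (W K + lam I) d = -(mu - t + lam a)
    also solves the full Newton system
        K (W K + lam I) d = -K (mu - t + lam a),
    so K is factored out once. Multiplying by W^{-1} symmetrizes this to
    (K + lam W^{-1}) d = rhs, which CG solves using only kernel matvecs --
    the step where a fast MCM/FFT product replaces the O(n^2) one.
    """
    a = np.zeros(t.shape[0])              # dual coefficients alpha
    for _ in range(n_newton):
        mu = sigmoid(kernel_matvec(a))    # predicted probabilities
        w = mu * (1.0 - mu) + 1e-12       # IRLS weights, kept positive
        rhs = -(mu - t + lam * a) / w
        a += cg(lambda v: kernel_matvec(v) + (lam / w) * v, rhs)
    return a
```

With a dense kernel matrix, kernel_matvec costs O(n^2) time and O(n^2) storage; substituting an FFT-based product such as circulant_matvec above (with a first column sampled so the resulting matrix stays positive semi-definite, e.g., from a stationary kernel) is what would bring the per-iteration cost to O(n log n) and the storage to O(n), matching the abstract's claims.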
