Least Squares Maximum and Weighted Generalization-Memorization Machines

by Shuai Wang, et al.

In this paper, we propose a new way of remembering by introducing a memory influence mechanism into the least squares support vector machine (LSSVM). Without changing the equality constraints of the original LSSVM, this mechanism allows an accurate partitioning of the training set without overfitting. The maximum memory impact model (MIMM) and the weighted memory impact model (WIMM) are then proposed. It is demonstrated that these models can degenerate into the LSSVM. Furthermore, we propose several different memory impact functions for the MIMM and WIMM. Experimental results show that our MIMM and WIMM achieve better generalization performance than the LSSVM and a significant advantage in time cost over other memory models.
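For readers unfamiliar with the baseline, the standard LSSVM that the MIMM and WIMM extend replaces the SVM's inequality constraints with equality constraints, so training reduces to solving one linear system in the dual variables. The sketch below is a minimal illustration of that baseline only (not of the paper's memory mechanism); the RBF kernel, the regularization parameter `gamma`, and the bandwidth `sigma` are illustrative choices, not values from the paper.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian RBF kernel matrix between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=1.0, sigma=1.0):
    # Solve the LSSVM dual linear system:
    #   [ 0   y^T              ] [ b     ]   [ 0 ]
    #   [ y   Omega + I/gamma  ] [ alpha ] = [ 1 ]
    # where Omega_ij = y_i * y_j * K(x_i, x_j).
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def lssvm_predict(X_train, y, alpha, b, X_test, sigma=1.0):
    # Decision function: sign( sum_i alpha_i y_i K(x_i, x) + b ).
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y) + b)

# Tiny two-class example (synthetic, for illustration only).
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
b, alpha = lssvm_fit(X, y, gamma=10.0)
pred = lssvm_predict(X, y, alpha, b, X)
```

The paper's memory models keep this equality-constrained formulation and add a memory impact term on top of it, which is why they can degenerate into the plain LSSVM above.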




