A novel extension of Generalized Low-Rank Approximation of Matrices based on multiple-pairs of transformations

08/31/2018
by Soheil Ahmadi, et al.

Dimension reduction is a key step in the learning process and plays an essential role in many applications. The most popular methods in this field, such as SVD, PCA, and LDA, can only be applied to vector data, which means that higher-order data such as matrices, or more generally tensors, must first be flattened into vectors. This flattening increases the risk of overfitting and may discard important spatial features. To address these issues, methods such as GLRAM, MPCA, and MLDA have been proposed that operate directly on the data in its native format. These methods preserve the spatial relationships within the data, reduce the risk of overfitting, and have lower time and space complexity than vector-based approaches. That said, because multilinear methods have fewer parameters, their search space for an optimal solution is much smaller than that of vector-based approaches. To overcome this drawback of multilinear methods such as GLRAM, we propose a new method that is a general form of GLRAM and, while preserving its merits, has a larger search space. Extensive experiments show that the proposed method outperforms GLRAM. Moreover, applying this approach to other multilinear dimension reduction methods such as MPCA and MLDA is straightforward.
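To make the baseline concrete, the sketch below implements classic GLRAM (after Ye, 2005), which the abstract takes as its starting point: given matrices A_1, ..., A_n, it alternately updates a single left transformation L and right transformation R with orthonormal columns so as to maximize sum_i ||L^T A_i R||_F^2. Function and parameter names (glram, k1, k2, n_iter) are illustrative assumptions, not taken from the paper, and the proposed multiple-pair extension itself is not shown, since the abstract does not give its exact formulation.

```python
import numpy as np

def glram(As, k1, k2, n_iter=50, tol=1e-7, seed=0):
    """Minimal GLRAM sketch (after Ye, 2005): find L (r x k1) and R (c x k2)
    with orthonormal columns maximizing sum_i ||L^T A_i R||_F^2.
    Names and defaults are illustrative, not taken from the paper."""
    r, c = As[0].shape
    rng = np.random.default_rng(seed)
    # Start from a random orthonormal R; L is computed in the first half-step.
    R = np.linalg.qr(rng.standard_normal((c, k2)))[0]
    prev = -np.inf
    for _ in range(n_iter):
        # Fix R, update L: top-k1 eigenvectors of sum_i A_i R R^T A_i^T.
        ML = sum(A @ R @ R.T @ A.T for A in As)
        L = np.linalg.eigh(ML)[1][:, -k1:]
        # Fix L, update R: top-k2 eigenvectors of sum_i A_i^T L L^T A_i.
        MR = sum(A.T @ L @ L.T @ A for A in As)
        R = np.linalg.eigh(MR)[1][:, -k2:]
        # Objective sum_i ||L^T A_i R||_F^2 is monotonically non-decreasing.
        obj = sum(np.linalg.norm(L.T @ A @ R, "fro") ** 2 for A in As)
        if obj - prev < tol * max(1.0, abs(prev)):
            break
        prev = obj
    # Reduced cores M_i = L^T A_i R; each A_i is reconstructed as L M_i R^T.
    Ms = [L.T @ A @ R for A in As]
    return L, R, Ms

# Example usage: reduce 20 random 30x25 matrices to 5x4 cores.
# As = [np.random.randn(30, 25) for _ in range(20)]
# L, R, Ms = glram(As, k1=5, k2=4)
```

Note that in this baseline every A_i shares one (L, R) pair; the extension described in the abstract replaces this single pair with multiple pairs of transformations to enlarge the search space while keeping the multilinear structure.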

Related research

09/06/2023: A multilinear Nyström algorithm for low-rank approximation of tensors in Tucker format
The Nyström method offers an effective way to obtain low-rank approximat...

11/07/2012: Randomized Dimension Reduction on Massive Data
Scalability of statistical estimators is of increasing importance in mod...

10/21/2021: Autonomous Dimension Reduction by Flattening Deformation of Data Manifold under an Intrinsic Deforming Field
A new dimension reduction (DR) method for data sets is proposed by auton...

03/07/2023: Sufficient dimension reduction for feature matrices
We address the problem of sufficient dimension reduction for feature mat...

03/31/2021: Dimension reduction of open-high-low-close data in candlestick chart based on pseudo-PCA
The (open-high-low-close) OHLC data is the most common data form in the ...

10/21/2022: Dimension reduction of high-dimension categorical data with two or multiple responses considering interactions between responses
This paper models categorical data with two or multiple responses, focus...

10/24/2010: Local Component Analysis for Nonparametric Bayes Classifier
The decision boundaries of Bayes classifier are optimal because they lea...
