Ternary and Binary Quantization for Improved Classification

03/31/2022
by Weizhi Lu, et al.

Dimension reduction and data quantization are two important methods for reducing data complexity. In this paper, we study the methodology of first reducing data dimension by random projection and then quantizing the projections to ternary or binary codes, a pipeline that has been widely applied in classification. Usually, quantization seriously degrades classification accuracy because of high quantization errors. Interestingly, however, we observe that quantization can provide comparable and often superior accuracy when the data to be quantized are sparse features generated with common filters. Furthermore, this quantization property is preserved in the random projections of sparse features, provided that both the features and the random projection matrices are sufficiently sparse. We validate and analyze this intriguing property through extensive experiments.
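The pipeline described above can be sketched in a few lines: project sparse features with a sparse random matrix, then quantize the projections to ternary or binary codes. This is a minimal illustration, not the paper's exact construction; the Achlioptas-style projection, the sparsity parameter `s`, and the threshold `t` are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_random_projection(d_in, d_out, s=3):
    # Achlioptas-style sparse projection matrix: entries are +1 or -1
    # with probability 1/(2s) each, and 0 with probability 1 - 1/s.
    # Larger s gives a sparser matrix.
    p = [1.0 / (2 * s), 1.0 - 1.0 / s, 1.0 / (2 * s)]
    return rng.choice([-1.0, 0.0, 1.0], size=(d_in, d_out), p=p)

def ternary_quantize(y, t):
    # Map each projection to {-1, 0, +1}: zero out small magnitudes,
    # keep only the sign of entries exceeding the threshold t.
    return np.sign(y) * (np.abs(y) > t)

def binary_quantize(y):
    # Map each projection to {-1, 0, +1} via its sign (sign(0) = 0).
    return np.sign(y)

# Example: sparse features projected and quantized before classification.
X = rng.normal(size=(5, 100))
X *= rng.random(X.shape) < 0.1          # keep ~10% of entries: sparse features
P = sparse_random_projection(100, 20)   # reduce dimension 100 -> 20
codes = ternary_quantize(X @ P, t=1.0)  # compact ternary codes for a classifier
```

A downstream classifier (e.g., nearest neighbor or a linear model) would then operate on `codes` instead of the raw features, trading precision for much cheaper storage and arithmetic.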


Related research

- Random projection trees for vector quantization (05/09/2008): A simple and computationally efficient scheme for tree-structured vector...
- Adaptive Training of Random Mapping for Data Quantization (06/28/2016): Data quantization learns encoding results of data with certain requireme...
- Taking the edge off quantization: projected back projection in dithered compressive sensing (05/11/2018): Quantized compressive sensing (QCS) deals with the problem of representi...
- Bioinspired random projections for robust, sparse classification (06/18/2022): Inspired by the use of random projections in biological sensing systems,...
- Bilinear Random Projections for Locality-Sensitive Binary Codes (06/03/2015): Locality-sensitive hashing (LSH) is a popular data-independent indexing ...
- Quantization Algorithms for Random Fourier Features (02/25/2021): The method of random projection (RP) is the standard technique in machin...
