Sparse quadratic classification rules via linear dimension reduction

11/13/2017
by Irina Gaynanova, et al.

We consider the problem of high-dimensional classification between two groups with unequal covariance matrices. Rather than estimating the full quadratic discriminant rule, we perform simultaneous variable selection and linear dimension reduction on the original data, with the subsequent application of quadratic discriminant analysis on the reduced space. The projection vectors can be efficiently estimated by solving a convex optimization problem with a sparsity-inducing penalty. The new rule performs comparably to linear discriminant analysis when the assumption of equal covariance matrices is satisfied, and improves misclassification error rates when this assumption is violated. In contrast to quadratic discriminant analysis, the proposed framework does not require estimation of precision matrices and scales linearly with the number of measurements, making it especially attractive for use on high-dimensional datasets. We support the methodology with theoretical guarantees on variable selection consistency, and an empirical comparison with competing approaches.
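The pipeline described above can be illustrated with a minimal numpy sketch. The sparse projection below is NOT the paper's convex estimator: as a hypothetical stand-in, it soft-thresholds the mean-difference direction to induce sparsity, then applies a quadratic (Gaussian log-density) rule on the one-dimensional reduced space.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 50, 200  # many measurements, moderate sample size per group

# Two groups with unequal covariance matrices; only the first 3 variables differ.
mu2 = np.zeros(p)
mu2[:3] = 2.0
X1 = rng.normal(size=(n, p))              # group 1: identity covariance
X2 = rng.normal(size=(n, p)) * 2.0 + mu2  # group 2: inflated covariance

# Illustrative sparse projection (stand-in for the paper's convex program):
# soft-threshold the mean-difference direction so most coordinates are zero.
delta = X2.mean(axis=0) - X1.mean(axis=0)
v = np.sign(delta) * np.maximum(np.abs(delta) - 0.5, 0.0)

# QDA on the reduced space: per-group Gaussian with its own mean and variance.
z1, z2 = X1 @ v, X2 @ v
m1, s1 = z1.mean(), z1.std()
m2, s2 = z2.mean(), z2.std()

def classify(X):
    """Assign each row to the group with higher reduced-space log-density."""
    z = X @ v
    ll1 = -np.log(s1) - 0.5 * ((z - m1) / s1) ** 2
    ll2 = -np.log(s2) - 0.5 * ((z - m2) / s2) ** 2
    return np.where(ll1 > ll2, 1, 2)

n_selected = int((v != 0).sum())  # variables kept by the sparse projection
err = np.mean(np.concatenate([classify(X1) != 1, classify(X2) != 2]))
```

Because the quadratic rule is fit only in the projected space, no p-by-p precision matrices are inverted; the cost of the projection and classification steps grows linearly in p.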
