Online Algorithms and Lower Bounds for Average-Case Matrix Discrepancy
We study the operator norm discrepancy of i.i.d. random matrices, initiating the matrix-valued analog of a long line of work on the ℓ^∞ norm discrepancy of i.i.d. random vectors. First, we give a new analysis of the matrix hyperbolic cosine algorithm of Zouzias (2011), a matrix version of an online vector discrepancy algorithm of Spencer (1977) studied for average-case inputs by Bansal and Spencer (2020), for the case of i.i.d. random matrix inputs. We both give a general analysis and extract concrete bounds on the discrepancy achieved by this algorithm for matrices with independent entries and positive semidefinite matrices drawn from Wishart distributions. Second, using the first moment method, we give lower bounds on the discrepancy of random matrices, in particular showing that the matrix hyperbolic cosine algorithm achieves optimal discrepancy up to logarithmic terms in several cases. We both treat the special case of the Gaussian orthogonal ensemble and give a general result for low-rank matrix distributions that we apply to orthogonally invariant random projections.
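The matrix hyperbolic cosine algorithm referenced above greedily assigns each incoming matrix a sign so as to minimize a trace-of-hyperbolic-cosine potential of the running signed sum. The following is a minimal sketch of that greedy rule, not the paper's exact algorithm; the step size `eta` and the use of a symmetric eigendecomposition to evaluate the potential are illustrative assumptions.

```python
import numpy as np

def trace_cosh(M):
    # For symmetric M, tr cosh(M) equals the sum of cosh over its eigenvalues.
    return np.cosh(np.linalg.eigvalsh(M)).sum()

def matrix_hyperbolic_cosine(matrices, eta=1.0):
    """Greedy online signing of symmetric matrices: choose each sign to
    minimize the potential tr cosh(eta * signed running sum). A sketch of
    the hyperbolic-cosine approach; eta is a tuning parameter (assumption).
    Returns the chosen signs and the final signed sum."""
    d = matrices[0].shape[0]
    S = np.zeros((d, d))
    signs = []
    for A in matrices:
        # Pick the sign whose resulting potential is smaller.
        if trace_cosh(eta * (S + A)) <= trace_cosh(eta * (S - A)):
            s = 1
        else:
            s = -1
        S = S + s * A
        signs.append(s)
    return signs, S
```

The discrepancy achieved is the operator norm of the final signed sum `S`, i.e. `np.linalg.norm(S, 2)`; the potential upper-bounds `cosh` of the extreme eigenvalues, which is what drives the analysis of such algorithms.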