Asymptotic Log-Det Rank Minimization via (Alternating) Iteratively Reweighted Least Squares

06/28/2021
by Sebastian Krämer, et al.

The affine rank minimization (ARM) problem is well known both for its applications and for the fact that it is NP-hard. One of the most successful, yet arguably underrepresented, approaches is iteratively reweighted least squares (IRLS), more specifically IRLS-0. Despite comprehensive empirical evidence that it overall outperforms nuclear norm minimization and related methods, it is still not understood to a satisfying degree. In particular, the significance of a slow decrease of the regularization parameter γ that appears in the method poses interesting questions. While the ARM problem is commonly equated with matrix recovery, we here consider it in its own right. We investigate the particular structure and global convergence property behind the asymptotic minimization of the log-det objective function on which IRLS-0 is based. We expand on local convergence theorems, now with an emphasis on the decline of γ, and provide representative examples as well as counterexamples, such as a diverging IRLS-0 sequence, that clarify theoretical limits. We present a data-sparse, alternating realization AIRLS-p (related to prior work under the name SALSA) that, along with the rest of this work, serves as a basis for and introduction to the more general tensor setting. In conclusion, numerical sensitivity experiments are carried out that reconfirm the success of IRLS-0 and demonstrate that in surprisingly many cases a slower decay of γ still leads to a solution of the ARM problem, to the point that the exact theoretical phase transition for generic recoverability can be observed. Likewise, this suggests that non-convexity is less substantial and problematic for the log-det approach than it might initially appear.
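For intuition, the IRLS-0 iteration in its standard matrix formulation (cf. Mohan and Fazel) can be sketched in a few lines of NumPy. The sketch below is a hypothetical minimal implementation, not the authors' code: each step solves the weighted least squares problem min tr(XᵀWX) subject to A vec(X) = b with weight W = (XXᵀ + γI)⁻¹, the matrix appearing in the gradient of the log-det objective log det(XXᵀ + γI) = Σᵢ log(σᵢ(X)² + γ), while γ is decreased slowly; the function name, schedule, and parameters are illustrative assumptions.

```python
import numpy as np

def irls0(A, b, m, n, gamma=1.0, decay=0.9, iters=100):
    """Minimal IRLS-0 sketch for the ARM problem min rank(X) s.t. A vec(X) = b.

    A is an s x (m*n) matrix encoding the affine measurements (column-major
    vectorization); gamma is the log-det regularization parameter, decreased
    geometrically here purely for illustration.
    """
    # start from the minimum-norm feasible point
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    X = x.reshape(m, n, order="F")
    for _ in range(iters):
        # weight matrix from the gradient of log det(X X^T + gamma I)
        W = np.linalg.inv(X @ X.T + gamma * np.eye(m))
        # tr(X^T W X) = vec(X)^T (I_n kron W) vec(X)
        H = np.kron(np.eye(n), W)
        # minimizer of vec(X)^T H vec(X) s.t. A vec(X) = b (KKT system)
        HinvAt = np.linalg.solve(H, A.T)
        x = HinvAt @ np.linalg.solve(A @ HinvAt, b)
        X = x.reshape(m, n, order="F")
        gamma *= decay  # the slow decrease of gamma is the point of interest
    return X

# toy usage: recover a random rank-1 4x4 matrix from 12 Gaussian measurements
rng = np.random.default_rng(0)
X_true = np.outer(rng.standard_normal(4), rng.standard_normal(4))
A = rng.standard_normal((12, 16))
b = A @ X_true.ravel(order="F")
X_hat = irls0(A, b, 4, 4)
print(np.linalg.norm(X_hat - X_true))
```

Practical implementations avoid the explicit Kronecker product and matrix inverse; the point here is only the structure of the iteration: a linearly constrained weighted least squares step followed by a decrease of γ.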
