Multiplicative Updates for Elastic Net Regularized Convolutional NMF Under β-Divergence
We generalize convolutional NMF by taking the β-divergence as the loss function, add an elastic-net regularizer to promote sparsity, and derive closed-form multiplicative update rules for its factors. The new update rules subsume β-NMF, the standard convolutional NMF, and sparse coding (also known as basis pursuit) as special cases. We demonstrate that the originally published update rules for the convolutional NMF are suboptimal and that their convergence rate depends on the kernel size.
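To illustrate the kind of update the abstract refers to, below is a minimal sketch of the standard multiplicative β-NMF update (the non-convolutional special case the paper's rules embed), with an elastic-net penalty on the activations folded into the denominator. The function name and the penalty weights `l1`, `l2` are illustrative choices, not the paper's notation, and this is not the paper's convolutional update itself.

```python
import numpy as np

def beta_nmf_update(V, W, H, beta=1.0, l1=0.0, l2=0.0, eps=1e-12):
    """One multiplicative update of W and H minimizing the beta-divergence
    D_beta(V || WH), with an elastic-net penalty l1*||H||_1 + l2*||H||_2^2
    on the activations H added to the denominator of H's update.
    beta=2 is squared Euclidean, beta=1 KL, beta=0 Itakura-Saito."""
    WH = W @ H + eps
    # H update: numerator W^T((WH)^(beta-2) * V), denominator W^T (WH)^(beta-1)
    H = H * (W.T @ (WH ** (beta - 2) * V)) / (
        W.T @ WH ** (beta - 1) + l1 + 2 * l2 * H + eps)
    WH = W @ H + eps
    # W update is symmetric; no sparsity penalty on the dictionary here
    W = W * ((WH ** (beta - 2) * V) @ H.T) / (WH ** (beta - 1) @ H.T + eps)
    return W, H

# Usage: factorize a small nonnegative matrix with KL loss (beta=1)
rng = np.random.default_rng(0)
V = rng.random((8, 20))
W, H = rng.random((8, 3)), rng.random((3, 20))
for _ in range(200):
    W, H = beta_nmf_update(V, W, H, beta=1.0, l1=0.01, l2=0.01)
```

Because the updates are multiplicative, nonnegative initializations stay nonnegative throughout, which is the property that makes this family of rules attractive in the first place.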