Bayesian Inference of Random Dot Product Graphs via Conic Programming
We present a convex cone program to infer the latent probability matrix of a random dot product graph (RDPG). The optimization problem maximizes the Bernoulli log-likelihood augmented with a nuclear norm regularization term. The dual problem takes a particularly simple form, closely related to the well-known semidefinite programming relaxation of the MaxCut problem. Using the primal-dual optimality conditions, we bound the entries and rank of the primal and dual solutions. Furthermore, we bound the optimal objective value and prove asymptotic consistency of the probability estimates of a slightly modified model under mild technical assumptions. Our experiments on synthetic RDPGs not only recover natural clusters, but also reveal the underlying low-dimensional geometry of the original data. We also demonstrate that the method recovers latent structure in the Karate Club Graph and synthetic U.S. Senate vote graphs and is scalable to graphs with up to a few hundred nodes.
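To illustrate the kind of estimator the abstract describes, the sketch below formulates a nuclear-norm-regularized Bernoulli maximum-likelihood problem for the probability matrix of a graph using CVXPY. This is not the authors' code: the variable names, the regularization weight `lam`, the entrywise bounds, and the choice of solver are illustrative assumptions; only the general formulation (Bernoulli log-likelihood plus a nuclear norm penalty) follows the abstract.

```python
# A minimal sketch (assumed formulation, not the authors' implementation) of
# nuclear-norm-regularized Bernoulli maximum-likelihood estimation of an
# RDPG-style probability matrix, solved as a convex program with CVXPY.
import numpy as np
import cvxpy as cp
import networkx as nx

# Example data: the Karate Club graph mentioned in the abstract.
A = nx.to_numpy_array(nx.karate_club_graph())
n = A.shape[0]
lam = 1.0  # assumed regularization weight

# Symmetric probability-matrix variable.
P = cp.Variable((n, n), symmetric=True)

# Bernoulli log-likelihood over the upper triangle (no self-loops).
mask = np.triu(np.ones((n, n)), k=1)
loglik = cp.sum(cp.multiply(mask * A, cp.log(P)) +
                cp.multiply(mask * (1 - A), cp.log(1 - P)))

# Maximize the penalized log-likelihood; the nuclear norm encourages low rank.
objective = cp.Maximize(loglik - lam * cp.normNuc(P))
constraints = [P >= 1e-6, P <= 1 - 1e-6]  # keep logs well defined

prob = cp.Problem(objective, constraints)
prob.solve(solver=cp.SCS)  # SCS handles the exponential and semidefinite cones

# A low numerical rank of the estimate suggests low-dimensional latent geometry.
print("estimated rank:", np.linalg.matrix_rank(P.value, tol=1e-3))
```

In this sketch the nuclear norm penalty plays the role of the regularizer discussed in the abstract, trading likelihood fit against the rank of the estimated probability matrix; larger values of `lam` produce lower-rank estimates.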