A Generic Approach for Accelerating Belief Propagation based DCOP Algorithms via A Branch-and-Bound Technique

06/17/2019

by Ziyu Chen et al.

Belief propagation approaches, such as Max-Sum and its variants, are an important class of methods for solving large-scale Distributed Constraint Optimization Problems (DCOPs). However, on problems with n-ary constraints these algorithms face a serious challenge, since their computational complexity scales exponentially with the number of variables a constraint function holds. In this paper, we present a generic and easy-to-use method based on a branch-and-bound technique, called Function Decomposing and State Pruning (FDSP), to address this issue. We theoretically prove that FDSP provides monotonically non-increasing upper bounds and speeds up belief propagation based DCOP algorithms without affecting solution quality. Our empirical evaluation further indicates that FDSP prunes at least 97% of the search space and effectively accelerates Max-Sum compared with the state of the art.
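The abstract does not spell out FDSP's construction, but the general idea it builds on can be sketched. The Python sketch below is a generic illustration of branch-and-bound pruning applied to the factor-to-variable message of Max-Sum, not the authors' FDSP algorithm: the helper `factor_ub` is a hypothetical oracle that returns an upper bound on the factor's value over all completions of a partial assignment (in FDSP this role would be played by bounds derived from the decomposed function), and the per-variable message maxima supply the rest of the optimistic bound.

```python
def bnb_factor_to_variable(factor, factor_ub, domains, in_msgs, target):
    """
    Compute the Max-Sum message from an n-ary factor to `target`:
        m(x) = max over assignments a with a[target] = x of
               factor(a) + sum of in_msgs[v][a[v]] for v != target.
    Branch-and-bound prunes any partial assignment whose optimistic
    upper bound cannot beat the incumbent for the current target value.
    `factor_ub(partial, rest)` is an assumed upper-bound oracle, NOT
    part of the paper; an exhaustive search would enumerate all
    assignments instead.
    """
    others = [v for v in domains if v != target]
    message = {}
    for x in domains[target]:
        incumbent = float("-inf")
        # Depth-first search over the remaining variables.
        # Stack entries: (partial assignment, depth, accumulated message sum).
        stack = [({target: x}, 0, 0.0)]
        while stack:
            partial, depth, msg_sum = stack.pop()
            if depth == len(others):
                # Complete assignment: evaluate the true utility.
                incumbent = max(incumbent, factor(partial) + msg_sum)
                continue
            rest = others[depth:]
            # Optimistic bound: assumed factor upper bound for all
            # completions of `partial`, plus the best possible message
            # contribution of each still-unassigned variable.
            bound = factor_ub(partial, rest) + msg_sum + sum(
                max(in_msgs[v].values()) for v in rest)
            if bound <= incumbent:
                continue  # prune: this subtree cannot improve the incumbent
            v = others[depth]
            for val in domains[v]:
                child = dict(partial)
                child[v] = val
                stack.append((child, depth + 1, msg_sum + in_msgs[v][val]))
        message[x] = incumbent
    return message
```

The tighter the assumed `factor_ub` oracle, the more subtrees are cut before any leaf is evaluated; with a trivial bound of +infinity the search degenerates to the exhaustive enumeration whose exponential cost the paper targets.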
