Graph filtering over expanding graphs
Our capacity to learn representations from data is related to our ability to design filters that can leverage the coupling between the data and the underlying domain. Graph filters are one such tool for network data and have been used in a myriad of applications. But graph filters operate on a fixed number of nodes, despite the expanding nature of practical networks. Learning filters in this setting is challenging not only because of the increased dimensions but also because the connectivity is known only up to an attachment model. We propose a filter learning scheme for data over expanding graphs that relies only on such a model. By characterizing the filter stochastically, we develop an empirical risk minimization framework inspired by multi-kernel learning to balance the information inflow and outflow at the incoming nodes. We particularize the approach for denoising and semi-supervised learning (SSL) over expanding graphs and show near-optimal performance compared with baselines that rely on the exact topology. For SSL, the proposed scheme uses the information at the incoming nodes to improve the task on the existing ones. These findings lay the foundation for learning representations over expanding graphs by relying only on the stochastic connectivity model.
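As a rough illustration of the setting, the sketch below applies a polynomial graph filter on a graph expanded by one incoming node whose links are known only through attachment probabilities; the filter is evaluated on the expected shift operator built from that model. This is a minimal sketch under stated assumptions (a polynomial filter, a single incoming node, and an expected-adjacency construction); the function names, the weight `w`, and the coefficients `h` are illustrative and not taken from the paper, where the coefficients would be learned via the proposed empirical risk minimization.

```python
import numpy as np

def graph_filter(S, x, h):
    """Apply a polynomial graph filter sum_k h[k] * S^k to a signal x."""
    y = np.zeros_like(x, dtype=float)
    Sk_x = x.astype(float)
    for hk in h:
        y += hk * Sk_x
        Sk_x = S @ Sk_x
    return y

def expected_expanded_shift(S, p, w):
    """Expected shift operator after one incoming node attaches.

    S : (N, N) shift operator (e.g., adjacency) of the existing graph
    p : (N,) attachment probabilities from the stochastic connectivity model
    w : edge weight assumed for the incoming node's links (illustrative)
    """
    N = S.shape[0]
    S_bar = np.zeros((N + 1, N + 1))
    S_bar[:N, :N] = S
    S_bar[N, :N] = w * p   # expected inflow to the incoming node
    S_bar[:N, N] = w * p   # expected outflow from the incoming node
    return S_bar

# Usage: filter a signal on the expanded graph without knowing the true new links.
rng = np.random.default_rng(0)
N = 5
A = (rng.random((N, N)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T                 # existing undirected adjacency
p = rng.dirichlet(np.ones(N))                  # assumed attachment probabilities
S_bar = expected_expanded_shift(A, p, w=1.0)
x = np.append(rng.standard_normal(N), 0.0)     # incoming node's value unobserved, set to 0
h = [0.5, 0.3, 0.2]                            # filter coefficients (learned via ERM in the paper)
y = graph_filter(S_bar, x, h)
print(y)
```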