Provably Powerful Graph Neural Networks for Directed Multigraphs
Béni Egressy, Luc von Niederhäusern, et al.
AAAI 2024
Variational Bayes has allowed the analysis of Bayes’ rule in terms of gradient flows, partial differential equations (PDEs), and diffusion processes. Mean-field variational inference (MFVI) is a version of approximate Bayesian inference that optimizes over product distributions. In the spirit of variational Bayes, we represent the MFVI problem in three different ways: a gradient flow in a Wasserstein space, a system of quasilinear PDEs, and a McKean–Vlasov diffusion process. Furthermore, we show that a time-discretized coordinate ascent variational inference algorithm in the product Wasserstein space of measures yields a gradient flow in the small-time-step limit. A similar result is obtained for the associated densities, with the limit given by a system of quasilinear PDEs. We illustrate how the tools provided here can be used to guarantee convergence of algorithms, and can be extended to a variety of approaches, old and new, to solve MFVI problems.
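For orientation, the MFVI setup this abstract refers to can be sketched in its standard textbook form (a hedged sketch of the usual formulation, not equations taken from the paper itself):

```latex
% MFVI minimizes the KL divergence to the target posterior \pi
% over product (mean-field) distributions q = q_1 \otimes \cdots \otimes q_d:
\min_{q_1,\dots,q_d} \; \mathrm{KL}\!\left( \textstyle\prod_{i=1}^d q_i \,\Big\|\, \pi \right)

% Coordinate ascent variational inference (CAVI) updates one factor
% at a time, holding the others fixed:
q_i(x_i) \;\propto\; \exp\!\left( \mathbb{E}_{q_{-i}}\big[ \log \pi(x_1,\dots,x_d) \big] \right)

% In the small-time-step limit, each marginal follows a Wasserstein
% gradient flow of the KL objective, i.e. a Fokker--Planck-type PDE:
\partial_t q_i \;=\; \nabla_{x_i} \!\cdot\! \Big( q_i \, \nabla_{x_i} \big( \log q_i - \mathbb{E}_{q_{-i}}[\log \pi] \big) \Big)
```

The abstract's three representations correspond, roughly, to reading the last display as a gradient flow in the product Wasserstein space, as a system of quasilinear PDEs, or as the law of a McKean–Vlasov diffusion.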
Raúl Fernández Díaz, Lam Thanh Hoang, et al.
ICLR 2025
Michael Glass, Nandana Mihindukulasooriya, et al.
ISWC 2017
Wojciech Ozga, Do Le Quoc, et al.
IFIP DBSec 2021