The Unified Propagation and Scaling Algorithm

Part of Advances in Neural Information Processing Systems 14 (NIPS 2001)



Yee Teh, Max Welling


In this paper we will show that a restricted class of constrained minimum divergence problems, named generalized inference problems, can be solved by approximating the KL divergence with a Bethe free energy. The algorithm we derive is closely related to both loopy belief propagation and iterative scaling. This unified propagation and scaling algorithm reduces to a convergent alternative to loopy belief propagation when no constraints are present. Experiments show the viability of our algorithm.
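For context, the following is a minimal sketch of standard sum-product loopy belief propagation on a small pairwise Markov random field (a three-node binary cycle). This is not the paper's unified propagation and scaling algorithm; it illustrates the plain loopy BP scheme that the unified algorithm relates to in the unconstrained case. The graph, potentials, and iteration count here are all invented for illustration.

```python
import numpy as np

# Hypothetical example: a 3-node binary cycle with random positive potentials.
nodes = [0, 1, 2]
edges = [(0, 1), (1, 2), (2, 0)]

rng = np.random.default_rng(0)
# Unary potentials psi_i(x_i) and pairwise potentials psi_ij(x_i, x_j).
unary = {i: rng.uniform(0.5, 1.5, size=2) for i in nodes}
pair = {e: rng.uniform(0.5, 1.5, size=(2, 2)) for e in edges}

def neighbours(i):
    out = []
    for (a, b) in edges:
        if a == i:
            out.append(b)
        elif b == i:
            out.append(a)
    return out

# Messages m[(i, j)] from node i to node j, initialised uniform.
msgs = {}
for (i, j) in edges:
    msgs[(i, j)] = np.ones(2) / 2
    msgs[(j, i)] = np.ones(2) / 2

for _ in range(100):
    new = {}
    for (i, j) in list(msgs):
        # Product of the unary potential and all incoming messages except j's.
        prod = unary[i].copy()
        for k in neighbours(i):
            if k != j:
                prod = prod * msgs[(k, i)]
        # psi indexed as [x_i, x_j]; transpose if the edge is stored reversed.
        psi = pair[(i, j)] if (i, j) in pair else pair[(j, i)].T
        m = psi.T @ prod            # sum over x_i
        new[(i, j)] = m / m.sum()   # normalise for numerical stability
    msgs = new

# Beliefs b_i(x_i) proportional to psi_i(x_i) times all incoming messages.
beliefs = {}
for i in nodes:
    b = unary[i].copy()
    for k in neighbours(i):
        b = b * msgs[(k, i)]
    beliefs[i] = b / b.sum()

print(beliefs[0])
```

On a graph with cycles like this one, plain loopy BP carries no convergence guarantee; the abstract's point is that the unified algorithm provides a convergent alternative in this unconstrained setting.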