Noam Shental, Assaf Zomet, Tomer Hertz, Yair Weiss
Significant progress in clustering has been achieved by algorithms that are based on pairwise affinities between the datapoints. In particular, spectral clustering methods have the advantage of being able to divide arbitrarily shaped clusters and are based on efficient eigenvector calculations. However, spectral methods lack a straightforward probabilistic interpretation, which makes it difficult to automatically set parameters using training data. In this paper we use the previously proposed typical cut framework for pairwise clustering. We show an equivalence between calculating the typical cut and inference in an undirected graphical model. We show that for clustering problems with hundreds of datapoints exact inference may still be possible. For more complicated datasets, we show that loopy belief propagation (BP) and generalized belief propagation (GBP) can give excellent results on challenging clustering problems. We also use graphical models to derive a learning algorithm for affinity matrices based on labeled data.
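To make the notion of pairwise affinities concrete, the sketch below builds a Gaussian (RBF) affinity matrix, one standard choice in the pairwise-clustering literature. The function name, the scale parameter `sigma`, and the toy data are illustrative assumptions, not taken from the paper; `sigma` is exactly the kind of free parameter that the learning algorithm mentioned above would set from labeled data.

```python
import numpy as np

def affinity_matrix(points, sigma=1.0):
    """Gaussian affinity matrix: W[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2)).

    A common (illustrative) choice of pairwise affinity; sigma is a free
    scale parameter. Returns a symmetric matrix with ones on the diagonal.
    """
    points = np.asarray(points, dtype=float)
    # Squared Euclidean distances between all pairs, via broadcasting.
    sq_dists = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

# Two tight groups of 1-D points: within-group affinities stay near 1,
# between-group affinities are much smaller.
X = [[0.0], [0.1], [5.0], [5.1]]
W = affinity_matrix(X, sigma=1.0)
```

Both spectral methods and the typical cut framework take such a matrix `W` as input; they differ in how they turn it into a partition.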