Clustering with Bregman Divergences
February 18, 2009
Bregman divergences are a broad class of distance functions that includes the squared Euclidean ($\ell_2^2$) distance and the Kullback-Leibler (KL) divergence, and they are widely used in applications such as machine learning and information theory.
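For concreteness, every strictly convex, differentiable function $\phi$ generates a Bregman divergence (a standard definition, recalled here for context):
$$d_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla \phi(y),\, x - y \rangle.$$
Taking $\phi(x) = \|x\|_2^2$ recovers the squared Euclidean distance, while $\phi(x) = \sum_i x_i \log x_i$ on the probability simplex recovers the KL divergence.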
However, Bregman divergences in general lack symmetry and the triangle inequality, making it difficult to apply techniques from Euclidean geometry. For example, given a Bregman divergence, how can we design algorithms with provable guarantees for clustering problems such as k-means? Most previous proofs rely crucially on symmetry and the triangle inequality.
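To see the asymmetry concretely (the numbers below are chosen purely for illustration): for the distributions $p = (0.9, 0.1)$ and $q = (0.5, 0.5)$, one gets $\mathrm{KL}(p \,\|\, q) \approx 0.368$ nats but $\mathrm{KL}(q \,\|\, p) \approx 0.511$ nats, so the divergence depends on the order of its arguments, and standard metric arguments do not directly apply.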
In this talk, we will give a short introduction to Bregman divergences, survey previous work on clustering in Euclidean space, and show how clustering algorithms designed for Euclidean spaces can be used to cluster under Bregman divergences.
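One concrete point of reference: Banerjee et al. (JMLR'05) show that the arithmetic mean of a cluster minimizes the total Bregman divergence from the cluster's points to the center, so the familiar Lloyd iteration for k-means extends to any Bregman divergence. Below is a minimal sketch of that iteration, assuming NumPy; the function names are illustrative, not taken from the talk or the papers.

```python
import numpy as np

def kl_divergence(x, c):
    """KL divergence d(x, c) for points on the probability simplex."""
    return np.sum(x * np.log(x / c))

def bregman_kmeans(points, k, divergence, iters=100, seed=0):
    """Lloyd-style Bregman hard clustering (sketch).

    Banerjee et al. (JMLR'05) show the arithmetic mean minimizes the
    total Bregman divergence from a cluster's points to its center,
    so the usual assign-then-average loop still applies.
    """
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # Assignment: send each point to the center of smallest
        # divergence, measured from point to center.
        labels = np.array(
            [np.argmin([divergence(x, c) for c in centers]) for x in points]
        )
        # Update: the optimal center of each cluster is its mean;
        # keep the old center if a cluster goes empty.
        new_centers = np.array(
            [points[labels == j].mean(axis=0) if np.any(labels == j)
             else centers[j] for j in range(k)]
        )
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels
```

With `divergence=kl_divergence` this clusters probability vectors by relative entropy; with the squared Euclidean distance it reduces to classical k-means.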
This talk will be based on papers by Chaudhuri and McGregor (COLT'08), Banerjee et al. (JMLR'05), and Ackermann and Blömer (SODA'09), as well as on our preliminary work with Guy Blelloch and Anupam Gupta.