Distance-Divergence Inequalities
Presenter(s)

2013 Shannon Lecture
Katalin Marton (Alfréd Rényi Institute of Mathematics, Hungarian Academy of Sciences)

Abstract

Let (X, d, q) be a metric probability space. We consider distances W between measures p and q that are defined by optimal couplings. Given a distance W, a distance-divergence inequality for q means that W(p, q) can be bounded from above in terms of the (informational) divergence of p with respect to q. There are many interesting cases in which such inequalities hold. The technique built on such inequalities was originally motivated by the aim of simplifying proofs of strong converses in Shannon theory. Moreover, it yields an effective way to prove measure concentration and deviation inequalities. Distance-divergence inequalities are also closely related to the ergodic properties of q, and there are connections with the rate of decrease of the divergence under the Gibbs sampler.
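For readers unfamiliar with the setup, the sketch below records the standard coupling definition of W and the generic shape of a distance-divergence (also called transportation-cost) inequality. The particular constant C and the Pinsker example are standard illustrations chosen here for concreteness, not statements taken from the lecture itself.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Coupling definition of the transportation distance W: the infimum of the
% expected distance over all joint distributions with marginals p and q.
\[
  W(p,q) \;=\; \inf_{\pi \in \Pi(p,q)}
  \int_{X \times X} d(x,y)\, \pi(\mathrm{d}x,\mathrm{d}y),
\]
where $\Pi(p,q)$ denotes the set of couplings of $p$ and $q$.

% Generic shape of a distance-divergence inequality: the constant C > 0
% depends on q and on the metric d (this specific form is an illustrative
% assumption; the lecture considers such inequalities in greater generality).
\[
  W(p,q) \;\le\; \sqrt{2C\, D(p\,\|\,q)}
  \qquad \text{for every measure } p,
\]
where $D(p\,\|\,q)$ is the informational divergence (relative entropy).

% Prototypical instance: with the trivial metric d(x,y) = 1\{x \ne y\},
% W(p,q) equals the total variation distance, and the inequality holds
% with C = 1/4; this is Pinsker's inequality (divergence in nats):
\[
  \|p - q\|_{\mathrm{TV}} \;\le\; \sqrt{\tfrac{1}{2}\, D(p\,\|\,q)}.
\]

\end{document}
```

Further well-known cases in which such bounds hold include Marton's transportation inequality for product measures (with a Hamming-type distance) and Talagrand's inequality for the Gaussian measure (with the quadratic Wasserstein distance), both of which yield Gaussian-type measure concentration.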