
Information Contraction and Decomposition
Anuran Makur
Ph.D. Dissertation, Massachusetts Institute of Technology, June 2019
Abstract

Thesis summary:

Information contraction is one of the most fundamental concepts in information theory as evidenced by the numerous classical converse theorems that utilize it. This dissertation studies several problems aimed at better understanding this notion, broadly construed, within the intertwined realms of information theory, statistics, and discrete probability theory.

In information theory, the contraction of f-divergences, such as Kullback-Leibler (KL) divergence, χ²-divergence, and total variation (TV) distance, through channels (or the contraction of mutual f-information along Markov chains) is quantitatively captured by the well-known data processing inequalities. These inequalities can be tightened to produce “strong” data processing inequalities (SDPIs), which are obtained by introducing appropriate channel-dependent or source-channel-dependent “contraction coefficients.”
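For concreteness, the data processing inequality for an f-divergence and its strong form can be sketched as follows; this is a standard formulation consistent with the description above, not an excerpt from the thesis. For a channel W = P_{Y|X}, a reference input distribution P_X, and any other input distribution Q_X,
\[
D_f(Q_X W \,\|\, P_X W) \;\le\; D_f(Q_X \,\|\, P_X) \quad \text{(DPI)},
\qquad
D_f(Q_X W \,\|\, P_X W) \;\le\; \eta_f(P_X, W)\, D_f(Q_X \,\|\, P_X) \quad \text{(SDPI)},
\]
where Q_X W denotes the output distribution induced by the input Q_X, and
\[
\eta_f(P_X, W) \;\triangleq\; \sup_{Q_X \,:\, 0 < D_f(Q_X \| P_X) < \infty} \frac{D_f(Q_X W \,\|\, P_X W)}{D_f(Q_X \,\|\, P_X)} \;\le\; 1
\]
is the contraction coefficient of the source-channel pair (P_X, W).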

The thesis first proves various properties of contraction coefficients of source-channel pairs, and derives linear bounds on specific classes of such contraction coefficients in terms of the contraction coefficient for χ²-divergence (or the Hirschfeld-Gebelein-Rényi maximal correlation). Next, the thesis adopts a more statistical and machine learning perspective in elucidating the elegant geometry of SDPIs for χ²-divergence by developing modal decompositions of bivariate distributions based on singular value decompositions of conditional expectation operators.
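As an illustrative sketch (not code from the dissertation), the χ²-divergence contraction coefficient and the modal decomposition of a finite bivariate distribution can both be obtained numerically from the singular value decomposition of the "divergence transition matrix" B[x, y] = P_{X,Y}(x, y) / sqrt(P_X(x) P_Y(y)): its largest singular value is 1, its second singular value equals the Hirschfeld-Gebelein-Rényi maximal correlation ρ(X;Y), and ρ(X;Y)² equals the χ²-contraction coefficient of the source-channel pair. The example joint distribution below is an arbitrary choice for illustration.

import numpy as np

def maximal_correlation(joint_pmf):
    """Second singular value of the divergence transition matrix of a joint pmf."""
    p_x = joint_pmf.sum(axis=1)                      # marginal P_X (rows)
    p_y = joint_pmf.sum(axis=0)                      # marginal P_Y (columns)
    B = joint_pmf / np.sqrt(np.outer(p_x, p_y))      # B[x, y] = P(x, y) / sqrt(P_X(x) P_Y(y))
    s = np.linalg.svd(B, compute_uv=False)           # singular values in descending order; s[0] == 1
    return s[1]                                      # maximal correlation rho(X; Y)

# Illustrative example (assumed, not from the thesis): uniform source through a
# binary symmetric channel with crossover probability 0.1.
joint = np.array([[0.45, 0.05],
                  [0.05, 0.45]])
rho = maximal_correlation(joint)
print("maximal correlation:", rho)                   # 0.8 for this example
print("chi^2 contraction coefficient:", rho ** 2)    # 0.64

The full set of singular vectors of B (beyond the second one used here) yields the modal decomposition referred to above, expressing the bivariate distribution in terms of orthonormal feature functions of X and Y.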