Abstract
We establish the first known upper bound on the exact and Wyner's common information of n continuous random variables in terms of the dual total correlation between them (which is a generalization of mutual information). In particular, we show that when the pdf of the random variables is log-concave, there is a constant gap of $n^2 \log e + 9n \log n$ between this upper bound and the dual total correlation lower bound that does not depend on the distribution. The upper bound is obtained using a computationally efficient dyadic decomposition scheme for constructing a discrete common randomness variable W from which the n random variables can be simulated in a distributed manner. We then bound the entropy of W using a new measure, which we refer to as the erosion entropy.
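To make the claimed sandwich bound explicit, here is a hedged restatement in notation the abstract itself does not fix (we write $D(X_1, \dots, X_n)$ for the dual total correlation and $G(X_1, \dots, X_n)$ for the exact common information; both symbols are our choice, not the paper's). For random variables with a log-concave joint pdf, the abstract asserts
$$
D(X_1, \dots, X_n) \;\le\; G(X_1, \dots, X_n) \;\le\; D(X_1, \dots, X_n) + n^2 \log e + 9n \log n,
$$
where one standard definition of the dual total correlation (again an assumption here, since the abstract does not reproduce it) is
$$
D(X_1, \dots, X_n) \;=\; h(X_1, \dots, X_n) \;-\; \sum_{i=1}^{n} h\bigl(X_i \,\big|\, X_{\{1,\dots,n\} \setminus \{i\}}\bigr),
$$
with $h$ denoting differential entropy. The key point is that the additive gap depends only on $n$, not on the distribution.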