Dmitry Gavinsky, Pavel Pudlák
On the joint entropy of $d$-wise-independent variables

Comment. Math. Univ. Carolin. 57, 3 (2016), 333–343.

Abstract: How low can the joint entropy of $n$ $d$-wise independent (for $d\geq 2$) discrete random variables be, subject to given constraints on the individual distributions (say, no value may be taken by a variable with probability greater than $p$, for $p< 1$)? This question was posed and partially answered in a recent work of Babai [{\it Entropy versus pairwise independence\/} (preliminary version), {\tt http://people.cs.uchicago.edu/~laci/papers/13augEntropy.pdf}, 2013]. In this paper we improve some of his bounds, prove new bounds in a wider range of parameters, and show matching upper bounds in some special cases. In particular, we prove tight lower bounds for the min-entropy (as well as the entropy) of pairwise and 3-wise independent balanced binary variables for infinitely many values of $n$.
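To see why the joint entropy can be much smaller than $n$, consider the standard parity (XOR) construction: from $k$ i.i.d. uniform bits one obtains $n = 2^k - 1$ pairwise-independent balanced bits whose joint entropy is only $k = \log_2(n+1)$. The sketch below is an illustration of this classical example, not code from the paper; the names (`k`, `subsets`, `outcomes`) and the brute-force pairwise check are purely illustrative.

```python
from itertools import product, combinations
from math import log2

# Classical parity construction (illustrative, not the paper's own method):
# from k i.i.d. uniform bits, define one variable per nonempty subset S of
# {0,...,k-1} as the XOR of the seed bits indexed by S.  This yields
# n = 2^k - 1 pairwise-independent balanced bits with joint entropy k bits.

k = 3
subsets = [S for r in range(1, k + 1) for S in combinations(range(k), r)]

# Joint distribution: each of the 2^k equally likely seeds maps to one
# outcome vector of the n = 2^k - 1 parity variables.
outcomes = {}
for seed in product([0, 1], repeat=k):
    vec = tuple(sum(seed[i] for i in S) % 2 for S in subsets)
    outcomes[vec] = outcomes.get(vec, 0) + 1 / 2**k

# Joint Shannon entropy: the seed -> outcome map is injective, so H = k bits,
# i.e. log2(n + 1), far below the n bits of fully independent balanced bits.
H = -sum(p * log2(p) for p in outcomes.values())
print(f"n = {len(subsets)} variables, joint entropy = {H:.3f} bits (k = {k})")

# Pairwise independence check: every pair of distinct variables is
# uniform on {0,1}^2.
for a, b in combinations(range(len(subsets)), 2):
    pair = {}
    for vec, p in outcomes.items():
        pair[(vec[a], vec[b])] = pair.get((vec[a], vec[b]), 0) + p
    assert all(abs(q - 0.25) < 1e-12 for q in pair.values())
print("all pairs uniform on {0,1}^2 -> pairwise independent")
```

The paper's lower bounds address the converse direction: how much joint entropy (or min-entropy) such $d$-wise independent families must have, and for which $n$ constructions like the one above are optimal.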

Keywords: $d$-wise-independent variables; entropy; lower bound

DOI: 10.14712/1213-7243.2015.169
AMS Subject Classification: 60C05
