Getting closer to $k$ min-entropy using summation
A distribution $D$ over $\Lambda$ has min-entropy $k$ if the largest probability mass assigned to any element of $\Lambda$ is $2^{-k}$ (i.e., $D(a) \leq 2^{-k}$ for all $a \in \Lambda$, with equality for some $a$). We denote this by $H_\infty(D) = k$. Let $X$ and $Y$ be two independent distributions over $\{0,1\}^n$, each of which is $\epsilon$-close in statistical distance to a distribution with min-entropy $k$. Let $Z = X + Y$ denote the distribution over $\{0,1\}^n$ obtained by sampling $x \sim X$ and $y \sim Y$ and outputting $x + y$, where the sum is coordinate-wise addition modulo $2$. Prove that $Z$ is $\epsilon^2$-close in statistical distance to a distribution with min-entropy at least $k$.
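
For intuition, here is a short sketch of the standard lemma this kind of claim usually rests on (my assumption about the intended route, not part of the exercise): adding an independent distribution coordinate-wise modulo $2$ cannot decrease min-entropy. The name $X'$ below is hypothetical, standing for any distribution with $H_\infty(X') \geq k$, with $Y$ independent of $X'$:
$$
\begin{align*}
\Pr[X' + Y = z]
  &= \sum_{y \in \{0,1\}^n} \Pr[Y = y]\,\Pr[X' = z + y] && \text{(independence)} \\
  &\leq \sum_{y \in \{0,1\}^n} \Pr[Y = y]\, 2^{-k} && \text{(since } H_\infty(X') \geq k\text{)} \\
  &= 2^{-k},
\end{align*}
$$
so $\Pr[X' + Y = z] \leq 2^{-k}$ for every $z \in \{0,1\}^n$, i.e., $H_\infty(X' + Y) \geq k$. The $\epsilon^2$ in the target bound then plausibly comes from the fact that this lemma applies whenever at least one of $X$ and $Y$ behaves like its nearby min-entropy-$k$ distribution, so only the event where both deviate simultaneously needs to be charged to the distance.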