ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR16-190 | 21st November 2016 06:30

Trading information complexity for error





Authors: Yuval Dagan, Yuval Filmus, Hamed Hatami, Yaqiao Li
Publication: 28th November 2016 15:57
Downloads: 942


Abstract:

We consider the standard two-party communication model. The central problem studied in this article is how much one can save in information complexity by allowing an error of $\epsilon$.
For arbitrary functions, we obtain lower and upper bounds showing that the gain is of order $\Omega(h(\epsilon))$ and $O(h(\sqrt{\epsilon}))$, respectively, where $h$ denotes the binary entropy function. We analyze the case of the two-bit AND function in detail to show that for this function the gain is $\Theta(h(\epsilon))$. This answers a question of [M. Braverman, A. Garg, D. Pankratov, and O. Weinstein, From information to exact communication (extended abstract), STOC'13].
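For concreteness (the formula is standard, though not spelled out in the abstract), the binary entropy function is
$$h(\epsilon) = \epsilon \log_2 \frac{1}{\epsilon} + (1 - \epsilon) \log_2 \frac{1}{1 - \epsilon}, \qquad \epsilon \in [0, 1],$$
with the convention $h(0) = h(1) = 0$. Since $h(\epsilon) \approx \epsilon \log_2(1/\epsilon)$ for small $\epsilon$, it tends to $0$ more slowly than $\epsilon$ itself, which is what makes an $\Omega(h(\epsilon))$ gain nontrivial even for small error.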
We obtain sharp bounds for the set disjointness function of order $n$. For the case of distributional error, we introduce a new protocol that achieves a gain of $\Theta(\sqrt{h(\epsilon)})$ provided that $n$ is sufficiently large. We apply these results to answer another question of Braverman et al. regarding the randomized communication complexity of the set disjointness function.
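For orientation, under one standard convention the set disjointness function of order $n$ is
$$\mathrm{DISJ}_n(x, y) = \bigvee_{i=1}^{n} (x_i \wedge y_i), \qquad x, y \in \{0, 1\}^n,$$
i.e., the players decide whether the sets encoded by $x$ and $y$ intersect. It is an OR of $n$ copies of the two-bit AND function, which is why the detailed analysis of AND is a natural stepping stone to the disjointness bounds.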
Answering a question of [Mark Braverman, Interactive information complexity, STOC'12], we apply our analysis of the set disjointness function to establish a gap between the two different notions of the prior-free information cost. This implies that amortized randomized communication complexity is not necessarily equal to the amortized distributional communication complexity with respect to the hardest distribution.
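Roughly (the precise definitions appear in Braverman's paper), the two prior-free notions being separated are
$$\mathrm{IC}(f, \epsilon) = \inf_{\pi} \max_{\mu} \mathrm{IC}_\mu(\pi) \qquad \text{versus} \qquad \max_{\mu} \mathrm{IC}_\mu(f, \epsilon) = \max_{\mu} \inf_{\pi} \mathrm{IC}_\mu(\pi),$$
where on the left the infimum ranges over protocols $\pi$ with worst-case error at most $\epsilon$, and on the right, for each distribution $\mu$ separately, over protocols with distributional error at most $\epsilon$ under $\mu$. A gap between the two means that the order of $\inf$ and $\max$ matters, which is what yields the stated separation between the two amortized communication quantities.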



ISSN 1433-8092