ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)

Paper:

TR11-164 | 9th December 2011 02:51

A discrepancy lower bound for information complexity

Authors: Mark Braverman, Omri Weinstein
Publication: 9th December 2011 04:42
Downloads: 4137


Abstract:

This paper provides the first general technique for proving information lower bounds on two-party
unbounded-rounds communication problems. We show that the discrepancy lower bound, which
applies to randomized communication complexity, also applies to information complexity. More
precisely, if the discrepancy of a two-party function $f$ with respect to a distribution $\mu$ is $Disc_\mu f$,
then any two-party randomized protocol computing $f$ must reveal at least $\Omega(\log(1/Disc_\mu f))$ bits
of information to the participants. As a corollary, we obtain that any two-party protocol
for computing a random function on $\{0,1\}^n\times\{0,1\}^n$ must reveal $\Omega(n)$ bits of information to
the participants. The proof develops a new simulation result that may be of independent interest.
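
For orientation, the following sketch records how the stated bound reads under the standard definition of discrepancy; the paper's exact normalization, error parameter, and definition of information complexity are not reproduced here and may differ. The discrepancy of $f:\{0,1\}^n\times\{0,1\}^n\to\{0,1\}$ with respect to $\mu$ is
$$Disc_\mu(f) \;=\; \max_{R = A\times B}\;\Bigl|\,\Pr_{(x,y)\sim\mu}\bigl[f(x,y)=0 \wedge (x,y)\in R\bigr] \;-\; \Pr_{(x,y)\sim\mu}\bigl[f(x,y)=1 \wedge (x,y)\in R\bigr]\,\Bigr|,$$
where the maximum ranges over combinatorial rectangles $R = A\times B \subseteq \{0,1\}^n\times\{0,1\}^n$. The main result then asserts that the information complexity of $f$ with respect to $\mu$ satisfies
$$IC_\mu(f) \;\geq\; \Omega\bigl(\log(1/Disc_\mu(f))\bigr).$$
Since a uniformly random function has discrepancy $2^{-\Omega(n)}$ under the uniform distribution with high probability, the $\Omega(n)$ corollary follows from this bound.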



ISSN 1433-8092