ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)


Paper:

TR12-177 | 19th December 2012 23:47

Information lower bounds via self-reducibility

Authors: Mark Braverman, Ankit Garg, Denis Pankratov, Omri Weinstein
Publication: 20th December 2012 06:24


Abstract:

We use self-reduction methods to prove strong information lower bounds on two of the most studied functions in the communication complexity literature: Gap Hamming Distance (GHD) and Inner Product (IP). In our first result we affirm the conjecture that the information cost of GHD is linear even under the uniform distribution, strengthening the linear lower bound recently shown by Kerenidis et al. and answering an open problem posed by Chakrabarti et al. In our second result we prove that the information cost of the Inner Product function is arbitrarily close to the trivial upper bound as the permitted error tends to zero, again strengthening the linear lower bound recently proved by Braverman and Weinstein.
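A schematic rendering of these two statements, included for orientation only: the notation IC_mu(f, eps) for the eps-error information complexity of f under input distribution mu, and the exact error regimes and rates, are assumptions of this sketch rather than quotations from the paper.

\[
  \mathsf{IC}_{U}(\mathrm{GHD}_n, \epsilon) \;=\; \Omega(n),
  \qquad
  \mathsf{IC}(\mathrm{IP}_n, \epsilon) \;\ge\; \bigl(1 - \delta(\epsilon)\bigr)\, n
  \quad\text{with } \delta(\epsilon) \to 0 \text{ as } \epsilon \to 0,
\]

where U denotes the uniform distribution and n the input length.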
Our proofs demonstrate that self-reducibility makes the connection between information complexity and communication complexity lower bounds two-way. Whereas numerous results in the past (Chakrabarti et al., Bar-Yossef et al. and Barak et al.) used information complexity techniques to derive new communication complexity lower bounds, we explore a generic way in which communication complexity lower bounds imply information complexity lower bounds in a black-box manner.
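One way to read this black-box direction, as a sketch only: assume f_n is self-reducible in the sense that an n-bit instance decomposes into n/m independent m-bit instances (plus cheap aggregation), and that low-information protocols can be amortized over many copies. A communication lower bound CC_eps(f_n) >= c*n then forces an information lower bound for f_m:

\[
  c \cdot n \;\le\; \mathrm{CC}_{\epsilon}(f_n)
  \;\le\; \frac{n}{m}\,\mathrm{IC}(f_m, \epsilon) + o(n)
  \quad\Longrightarrow\quad
  \mathrm{IC}(f_m, \epsilon) \;\ge\; c\, m - o(m).
\]

The o(n) term stands in for the cost of the decomposition and amortization steps; the actual quantitative form is the one established in the paper, not the one shown here.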


