This paper provides the first general technique for proving information lower bounds on two-party
unbounded-rounds communication problems. We show that the discrepancy lower bound, which
applies to randomized communication complexity, also applies to information complexity. More
precisely, if the discrepancy of a two-party function $f$ with respect to a distribution $\mu$ is $\mathrm{Disc}_\mu f$,
then any two-party randomized protocol computing $f$ must reveal at least $\Omega(\log(1/\mathrm{Disc}_\mu f))$ bits
of information to the participants. As a corollary, we obtain that any two-party protocol
for computing a random function on $\{0,1\}^n \times \{0,1\}^n$ must reveal $\Omega(n)$ bits of information to
the participants. The proof develops a new simulation result that may be of independent interest.
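For reference, here is a minimal sketch of the two quantities involved, assuming the standard definitions of discrepancy and internal information cost (the exact normalizations used in the paper may differ; the symbols $X$, $Y$, $\Pi$, and $\mathrm{IC}_\mu$ are introduced here for illustration only). Below, $R$ ranges over combinatorial rectangles $R = A \times B$ with $A, B \subseteq \{0,1\}^n$, and $\Pi$ denotes the transcript of a protocol $\pi$ on inputs $(X,Y) \sim \mu$:
\[
  \mathrm{Disc}_\mu f \;=\; \max_{R = A \times B}
    \Bigl|\Pr_{(x,y)\sim\mu}\bigl[f(x,y)=0 \wedge (x,y)\in R\bigr]
    \;-\; \Pr_{(x,y)\sim\mu}\bigl[f(x,y)=1 \wedge (x,y)\in R\bigr]\Bigr|,
\]
\[
  \mathrm{IC}_\mu(\pi) \;=\; I(X;\Pi \mid Y) \;+\; I(Y;\Pi \mid X).
\]
In this notation, the main result says that any randomized protocol $\pi$ computing $f$ (with, say, constant error over $\mu$) satisfies $\mathrm{IC}_\mu(\pi) = \Omega(\log(1/\mathrm{Disc}_\mu f))$.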