ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)




Reports tagged with entropy:
TR98-030 | 9th June 1998
Stasys Jukna, Stanislav Zak

On Branching Programs With Bounded Uncertainty

We propose an information-theoretic approach to proving
lower bounds on the size of branching programs (b.p.). The argument
is based on Kraft-McMillan type inequalities for the average amount of
uncertainty about (or entropy of) a given input during various
stages of the computation. ... more >>>
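
As background for this kind of argument, here is a minimal Python sketch of the classical Kraft-McMillan inequality for binary codeword lengths and of the way Shannon entropy lower-bounds the average codeword length; the functions and the example code are illustrative and not taken from the report:

    import math

    def kraft_sum(lengths):
        # Kraft-McMillan sum for binary codeword lengths: sum of 2^(-l).
        return sum(2.0 ** (-l) for l in lengths)

    def shannon_entropy(probs):
        # Shannon entropy in bits of a probability vector.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A prefix-free binary code (0, 10, 110, 111) and a matching source distribution.
    lengths = [1, 2, 3, 3]
    probs = [0.5, 0.25, 0.125, 0.125]

    assert kraft_sum(lengths) <= 1.0            # Kraft-McMillan inequality holds
    avg_len = sum(p * l for p, l in zip(probs, lengths))
    print(shannon_entropy(probs), avg_len)      # entropy (1.75 bits) lower-bounds average length (1.75)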


TR04-059 | 21st June 2004
Beatrice List, Markus Maucher, Uwe Schöning, Rainer Schuler

Randomized Quicksort and the Entropy of the Random Number Generator

The worst-case complexity of an implementation of Quicksort depends
on the random number generator that is used to select the pivot
elements. In this paper we estimate the expected number of
comparisons of Quicksort as a function of the entropy of the random
source. We give upper and lower bounds ... more >>>
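
A small Python sketch of the setup, assuming a toy comparison-counting Quicksort whose pivot positions are drawn from a supplied random source; the names rand_index, uniform and biased are hypothetical:

    import random

    def quicksort(xs, rand_index):
        # Sort xs, drawing each pivot position from rand_index(n) in range(n),
        # and count one comparison per element tested against a pivot.
        comparisons = 0
        def sort(lst):
            nonlocal comparisons
            if len(lst) <= 1:
                return lst
            pivot = lst[rand_index(len(lst))]
            less, equal, greater = [], [], []
            for x in lst:
                comparisons += 1
                if x < pivot:
                    less.append(x)
                elif x > pivot:
                    greater.append(x)
                else:
                    equal.append(x)
            return sort(less) + equal + sort(greater)
        return sort(list(xs)), comparisons

    data = list(range(500))                    # already-sorted input
    uniform = lambda n: random.randrange(n)    # full-entropy pivot source
    biased = lambda n: 0                       # zero-entropy source: always the first element

    print(quicksort(data, uniform)[1])         # roughly n*log(n) comparisons in expectation
    print(quicksort(data, biased)[1])          # Theta(n^2) comparisons on this sorted input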


TR10-160 | 28th October 2010
Zeev Dvir, Dan Gutfreund, Guy Rothblum, Salil Vadhan

On Approximating the Entropy of Polynomial Mappings

We investigate the complexity of the following computational problem:

Polynomial Entropy Approximation (PEA):
Given a low-degree polynomial mapping
$p : F^n\rightarrow F^m$, where $F$ is a finite field, approximate the output entropy
$H(p(U_n))$, where $U_n$ is the uniform distribution on $F^n$ and $H$ may be any of several entropy measures.

... more >>>
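
For tiny parameters the quantity in question can be computed exactly by brute force. A Python sketch over the prime field F_3 with an illustrative degree-2 mapping follows; the instance is made up, and the enumeration takes time q^n, which is precisely the cost an efficient approximation algorithm would have to avoid:

    import math
    from collections import Counter
    from itertools import product

    def output_shannon_entropy(poly_map, q, n):
        # Exact Shannon entropy in bits of p(U_n), where U_n is uniform on F_q^n,
        # computed by enumerating all q^n inputs.
        counts = Counter(poly_map(x) for x in product(range(q), repeat=n))
        total = q ** n
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    q, n = 3, 2
    # A degree-2 polynomial mapping p : F_3^2 -> F_3^2.
    p = lambda x: ((x[0] * x[1]) % q, (x[0] + x[1] * x[1]) % q)

    print(output_shannon_entropy(p, q, n))     # at most n * log2(q) = 2 * log2(3) bits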

TR11-141 | 2nd November 2011
Salil Vadhan, Colin Jia Zheng

Characterizing Pseudoentropy and Simplifying Pseudorandom Generator Constructions

Revisions: 3

We provide a characterization of pseudoentropy in terms of hardness of sampling: Let $(X,B)$ be jointly distributed random variables such that $B$ takes values in a polynomial-sized set. We show that $B$ is computationally indistinguishable from a random variable of higher Shannon entropy given $X$ if and only if there ... more >>>
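
The information-theoretic quantity appearing in the statement, the Shannon entropy of B given X, can be computed directly from a joint distribution table. A small Python sketch with a made-up toy distribution (this illustrates only the entropy calculation, not the computational indistinguishability notion):

    import math
    from collections import defaultdict

    def conditional_entropy(joint):
        # H(B | X) in bits, where joint[(x, b)] is the probability of (X, B) = (x, b).
        px = defaultdict(float)
        for (x, b), p in joint.items():
            px[x] += p
        return -sum(p * math.log2(p / px[x]) for (x, b), p in joint.items() if p > 0)

    # Toy joint distribution of (X, B) on {0,1} x {0,1}.
    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
    print(conditional_entropy(joint))   # about 0.72 bits, below H(B) = 1 bit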


TR13-050 | 1st April 2013
Venkatesan Guruswami, Patrick Xia

Polar Codes: Speed of polarization and polynomial gap to capacity

Revisions: 1

We prove that, for all binary-input symmetric memoryless channels, polar codes enable reliable communication at rates within $\epsilon > 0$ of the Shannon capacity with a block length, construction complexity, and decoding complexity all bounded by a *polynomial* in $1/\epsilon$. Polar coding gives the *first known explicit construction* with rigorous ... more >>>
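
For the binary erasure channel the basic polar transform has an exact closed form: a channel with erasure probability z splits into channels with erasure probabilities 2z - z^2 and z^2. A short Python sketch of this standard recursion, shown only as background; the report's polynomial bounds concern general binary-input symmetric channels:

    def polarize_bec(eps, levels):
        # Erasure probabilities of the 2**levels synthetic channels obtained by
        # recursively applying the polar transform to BEC(eps).
        zs = [eps]
        for _ in range(levels):
            zs = [w for z in zs for w in (2 * z - z * z, z * z)]
        return zs

    eps, levels = 0.5, 12
    zs = polarize_bec(eps, levels)
    good = sum(1 for z in zs if z < 1e-6) / len(zs)
    print(good)   # fraction of nearly noiseless channels; tends to capacity 1 - eps = 0.5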


TR14-100 | 4th August 2014
Salman Beigi, Omid Etesami, Amin Gohari

The Value of Help Bits in Randomized and Average-Case Complexity

"Help bits" are some limited trusted information about an instance or instances of a computational problem that may reduce the computational complexity of solving that instance or instances. In this paper, we study the value of help bits in the settings of randomized and average-case complexity.

Amir, Beigel, and Gasarch ... more >>>




ISSN 1433-8092