ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)

Paper:

TR16-113 | 22nd July 2016 12:21

Time-Space Hardness of Learning Sparse Parities


Abstract:

We define a concept class ${\cal F}$ to be time-space hard (or memory-samples hard) if any learning algorithm for ${\cal F}$ requires either a memory of size super-linear in $n$ or a number of samples super-polynomial in $n$, where $n$ is the length of one sample.
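In symbols (a paraphrase of the informal definition above, not the paper's formal statement): ${\cal F}$ is time-space hard if every algorithm that learns an unknown $f \in {\cal F}$ from a stream of labeled samples $(x_1, f(x_1)), (x_2, f(x_2)), \ldots$, with each $x_t \in \{0,1\}^n$, must use either $\omega(n)$ bits of memory or $n^{\omega(1)}$ samples.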

A recent work shows that the class of all parity functions is time-space hard [Raz16]. Building on [Raz16], we show that the class of all sparse parities of Hamming weight $\ell$ is time-space hard, as long as $\ell \geq \omega(\log n / \log \log n)$. Consequently, linear-size DNF formulas, linear-size decision trees, and logarithmic-size juntas are all time-space hard. Our result is more general and provides time-space lower bounds for learning any concept class of parity functions.
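Concretely, writing $\chi_S(x) = \bigoplus_{i \in S} x_i$ for the parity over a set $S \subseteq [n]$ (this notation is used here only for illustration), the class in our main result is $\{\chi_S : |S| = \ell\}$: the learner observes a stream of pairs $(x, \chi_S(x))$ with $x$ uniform over $\{0,1\}^n$ and must identify the unknown set $S$.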

We give applications of our results in the field of bounded-storage cryptography. For example, for every $\omega(\log n) \leq k \leq n$, we obtain an encryption scheme that requires a private key of length $k$ and time complexity of $n$ per encryption/decryption of each bit, and is provably and unconditionally secure as long as the attacker uses at most $o(nk)$ memory bits and the scheme is used at most $2^{o(k)}$ times. Previously, this was known only for $k=n$ [Raz16].
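To make the shape of such a scheme concrete, the following Python sketch shows the natural sparse-parity construction in the spirit of [Raz16]; the function names and parameter choices below are illustrative assumptions, not the paper's exact construction. The private key names a set $S$ of size $\ell$ (a set of size $\ell$ can be described with roughly $\ell \log n$ bits, which plays the role of the key length $k$ in this sketch); each plaintext bit is masked by the parity $\bigoplus_{i \in S} a_i$ of fresh public randomness $a$, so an attacker who cannot learn the sparse parity from the stream of pairs $(a, \bigoplus_{i \in S} a_i)$ learns nothing about the encrypted bits.

    # Illustrative sketch only (hypothetical names and parameters), in the spirit
    # of the parity-based scheme of [Raz16]: the shared key names a sparse parity
    # chi_S, and each plaintext bit is masked by chi_S evaluated on fresh public
    # randomness.
    import secrets

    def keygen(n: int, ell: int) -> frozenset:
        """Shared private key: a uniformly random subset S of [n] with |S| = ell."""
        S = set()
        while len(S) < ell:
            S.add(secrets.randbelow(n))
        return frozenset(S)

    def chi(S: frozenset, a: list) -> int:
        """The sparse parity chi_S(a): XOR of the coordinates of a indexed by S."""
        mask = 0
        for i in S:
            mask ^= a[i]
        return mask

    def encrypt_bit(S: frozenset, n: int, b: int) -> tuple:
        """Draw fresh public randomness a and send (a, b XOR chi_S(a))."""
        a = [secrets.randbelow(2) for _ in range(n)]
        return a, b ^ chi(S, a)

    def decrypt_bit(S: frozenset, a: list, c: int) -> int:
        """The receiver recomputes chi_S(a) from the key and unmasks the bit."""
        return c ^ chi(S, a)

    # Usage: both parties hold S; every encrypted bit uses fresh randomness a.
    n, ell = 64, 8
    S = keygen(n, ell)
    a, c = encrypt_bit(S, n, 1)
    assert decrypt_bit(S, a, c) == 1

The security guarantee for the actual scheme is the one stated above: unconditional, as long as the attacker uses at most $o(nk)$ memory bits and the scheme is used at most $2^{o(k)}$ times.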



ISSN 1433-8092