Revision #1 Authors: Venkatesan Guruswami, Atri Rudra

Accepted on: 8th October 2007 00:00

Downloads: 1018

We present error-correcting codes that achieve the information-theoretically best possible trade-off between the rate and error-correction radius. Specifically, for every $0 < R < 1$ and $\eps > 0$, we present an explicit construction of error-correcting codes of rate $R$ that can be list decoded in polynomial time up to a fraction $(1-R-\eps)$ of {\em worst-case} errors. At least theoretically, this meets one of the central challenges in algorithmic coding theory.

Our codes are simple to describe: they are {\em folded Reed-Solomon codes}, which are in fact {\em exactly} Reed-Solomon (RS) codes, but viewed as a code over a larger alphabet by careful bundling of codeword symbols. Given the ubiquity of RS codes, this is an appealing feature of our result, and in fact our methods directly yield better decoding algorithms for RS codes when errors occur in {\em phased bursts}.
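The folding operation itself is elementary: group every $m$ consecutive symbols of an RS codeword into a single symbol over the larger alphabet $\F_q^m$. The following sketch illustrates this over a toy field $\F_{17}$ with generator $\gamma = 3$ and folding parameter $m = 4$ (illustrative parameters only, not those used in the paper):

```python
# A minimal sketch of folding a Reed-Solomon codeword. The parameters
# (q = 17, gamma = 3, m = 4) are toy choices for illustration.
q = 17          # field size
gamma = 3       # a generator of the multiplicative group of F_17
m = 4           # folding parameter
n = q - 1       # block length of the unfolded RS code (here 16)

def rs_encode(message):
    """Evaluate the message polynomial at gamma^0, gamma^1, ..., gamma^(n-1)."""
    points = [pow(gamma, i, q) for i in range(n)]
    return [sum(c * pow(x, j, q) for j, c in enumerate(message)) % q
            for x in points]

def fold(codeword, m):
    """Bundle m consecutive RS symbols into one symbol over F_q^m."""
    return [tuple(codeword[i:i + m]) for i in range(0, len(codeword), m)]

msg = [5, 1, 2]                      # a degree-2 message polynomial
folded = fold(rs_encode(msg), m)     # block length n/m = 4 over alphabet F_17^4
```

Note that folding does not change the codeword's contents at all; it only changes the unit at which symbols (and hence errors) are counted, which is what makes the larger-alphabet view possible.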

The alphabet size of these folded RS codes is polynomial in the block length. We are able to reduce this to a constant (depending on $\eps$) using ideas concerning ``list recovery'' and expander-based codes. Concatenating the folded RS codes with suitable inner codes also gives us polynomial-time constructible binary codes that can be efficiently list decoded up to the Zyablov bound, i.e., up to twice the radius achieved by the standard GMD decoding of concatenated codes.

TR05-133 Authors: Venkatesan Guruswami, Atri Rudra
Publication: 17th November 2005 01:55

Downloads: 838

For every $0 < R < 1$ and $\eps > 0$, we present an explicit construction of error-correcting codes of rate $R$ that can be list decoded in polynomial time up to a fraction $(1-R-\eps)$ of errors. These codes achieve the ``capacity'' for decoding from {\em adversarial} errors, i.e., they achieve the {\em optimal} trade-off between rate and error-correction radius. At least theoretically, this meets one of the central challenges in coding theory.

Prior to this work, explicit codes achieving capacity were not known for {\em any} rate $R$. In fact, our codes are the first to beat the error-correction radius of $1-\sqrt{R}$, which was achieved for Reed-Solomon codes in \cite{GS}, for all rates $R$. (For rates $R < 1/16$, a recent breakthrough by Parvaresh and Vardy improved upon the $1-\sqrt{R}$ bound; for $R \to 0$, their algorithm can decode a fraction $1-O(R \log(1/R))$ of errors.)
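To see how much the radii differ, one can evaluate the formulas above at a few rates, alongside the classical unique-decoding radius $(1-R)/2$ (a standard baseline, not stated in the abstract):

```python
import math

# Error-correction radii from the formulas in the text; the capacity
# line omits the epsilon slack, and (1-R)/2 is the classical
# unique-decoding radius included as a baseline.
def radii(R):
    return {"unique": (1 - R) / 2,
            "GS": 1 - math.sqrt(R),   # Guruswami-Sudan radius for RS codes
            "capacity": 1 - R}        # list-decoding capacity (up to eps)

for R in (0.1, 0.5, 0.9):
    r = radii(R)
    print(f"R={R}: unique={r['unique']:.3f}, "
          f"GS={r['GS']:.3f}, capacity={r['capacity']:.3f}")
```

For example, at rate $R = 1/2$ the $1-\sqrt{R}$ bound gives roughly $0.293$, while capacity allows correcting close to a $0.5$ fraction of errors.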

Our codes are simple to describe --- they are certain {\em folded Reed-Solomon codes}, which are in fact {\em exactly} Reed-Solomon (RS) codes, but viewed as a code over a larger alphabet by careful bundling of codeword symbols. Given the ubiquity of RS codes, this is an appealing feature of our result, since the codes we propose are not too far from the ones in actual use.

The main insight in our work is that some carefully chosen folded RS codes are ``compressed'' versions of a related family of Parvaresh-Vardy codes. Further, the decoding of the folded RS codes can be reduced to list decoding the related Parvaresh-Vardy codes.

The alphabet size of these folded RS codes is polynomial in the block length. This can be reduced to a (large) constant using ideas concerning ``list recovering'' and expander-based codes. Concatenating the folded RS codes with suitable inner codes also gives us polynomial-time constructible binary codes that can be efficiently list decoded up to the Zyablov bound.