ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR01-045 | 26th April 2001

Neural Networks with Local Receptive Fields and Superlinear VC Dimension


Abstract:

Local receptive field neurons include such well-known and widely
used unit types as radial basis function neurons and neurons with
center-surround receptive fields. We study the Vapnik-Chervonenkis
(VC) dimension of feedforward neural networks with one hidden layer
of these units. For several variants of local receptive field
neurons we show that the VC dimension of these networks is
superlinear. In particular, we establish the bound $\Omega(W\log k)$
for any reasonably sized network with $W$ parameters and $k$ hidden
nodes. This bound is shown to hold for discrete center-surround
receptive field neurons, which are physiologically relevant models
of cells in the mammalian visual system, for neurons computing a
difference of Gaussians, which are popular in computational vision,
and for standard radial basis function (RBF) neurons, a major
alternative to sigmoidal neurons in artificial neural networks. The
result for RBF neural networks is of particular interest since it
answers a question that has been open for several years. The results
also give rise to lower bounds for networks with fixed input
dimension. In terms of the constant factors, all bounds are larger
than those previously known for similar architectures with sigmoidal
neurons. The
superlinear lower bounds contrast with linear upper bounds for
single local receptive field neurons also derived here.
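
For concreteness, the following is a minimal sketch (not taken from the paper) of the one-hidden-layer RBF architecture the $\Omega(W\log k)$ bound refers to, assuming Gaussian units with adjustable centers and widths and a linear threshold output node. Under this particular parameterization, a network with $k$ hidden units on $d$ inputs has $W = k(d+2)+1$ adjustable parameters; the paper's exact parameter count may differ.

import numpy as np

def rbf_network(x, centers, widths, out_weights, out_bias):
    """Forward pass of a one-hidden-layer Gaussian RBF network.

    x           : input vector, shape (d,)
    centers     : hidden-unit centers, shape (k, d)
    widths      : hidden-unit widths, shape (k,)
    out_weights : output-layer weights, shape (k,)
    out_bias    : output-layer bias, scalar

    Adjustable parameters under this (assumed) parameterization:
    W = k*d centers + k widths + k output weights + 1 bias = k*(d+2) + 1.
    """
    # Gaussian radial response of each hidden unit to the input
    sq_dist = np.sum((centers - x) ** 2, axis=1)
    hidden = np.exp(-sq_dist / (2.0 * widths ** 2))
    # Linear output node; thresholding its value at 0 yields the
    # binary classifier whose VC dimension is being measured
    return out_weights @ hidden + out_bias

# Example: d = 4 inputs, k = 8 hidden RBF units
rng = np.random.default_rng(0)
d, k = 4, 8
y = rbf_network(
    rng.standard_normal(d),
    centers=rng.standard_normal((k, d)),
    widths=np.ones(k),
    out_weights=rng.standard_normal(k),
    out_bias=0.0,
)
label = int(y >= 0)  # hypothesis evaluated on one input point
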


