
arXiv stat.ML · Research Paper · Audience: machine learning theorists, deep learning researchers, mathematicians · 2 months ago

Does the Barron space really defy the curse of dimensionality?

📄 Abstract

The Barron space has become famous in the theory of (shallow) neural networks because it seemingly defies the curse of dimensionality. While the Barron space and its generalizations do indeed defy the curse of dimensionality from the point of view of classical smoothness, we herein provide some evidence for the idea that they do not defy it under a nonclassical notion of smoothness which relates naturally to "infinitely wide" shallow neural networks. Just as the Bessel potential spaces are defined via the Fourier transform, we define so-called ADZ spaces via the Mellin transform; these ADZ spaces encapsulate the nonclassical smoothness alluded to above. (38 pages; to appear in the author's dissertation.)
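For orientation, here is a minimal sketch of the analogy the abstract draws. The two definitions below are the standard ones for Bessel potential spaces and the Mellin transform; the precise ADZ norm is the paper's own construction and is not reproduced here:

```latex
% Bessel potential space H^{s,p}(\mathbb{R}^d), defined via the
% Fourier transform \mathcal{F}: f \in H^{s,p} iff the norm below is finite.
\|f\|_{H^{s,p}} \;=\;
  \bigl\| \mathcal{F}^{-1}\!\bigl[ (1+|\xi|^2)^{s/2}\,\mathcal{F}f \bigr] \bigr\|_{L^p(\mathbb{R}^d)}

% Mellin transform of a suitable function f on (0,\infty); the paper's
% ADZ spaces are built from this transform in place of the Fourier one.
(\mathcal{M}f)(s) \;=\; \int_0^\infty x^{\,s-1} f(x)\, dx
```

The structural point is that a smoothness scale is obtained by weighting a transform by a power-type multiplier and pulling back to an $L^p$ norm; swapping the Fourier transform for the Mellin transform yields a smoothness notion adapted to the dilation structure of ridge functions in shallow networks.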

Key Contributions

This paper provides evidence that the Barron space, while defying the curse of dimensionality in a classical sense, does not defy it with a nonclassical notion of smoothness related to infinitely wide shallow neural networks. It introduces ADZ spaces, defined via the Mellin transform, to capture this nonclassical smoothness.
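As background for the claim above, the classical sense in which the Barron space "defies" the curse of dimensionality is Barron's 1993 approximation bound; one common variant of the spectral Barron norm and the resulting rate (with $C$ an absolute constant, stated here from the standard literature rather than from this paper) is:

```latex
% Spectral Barron norm (one common variant):
\|f\|_{\mathcal{B}} \;=\; \int_{\mathbb{R}^d} |\xi|\,|\hat{f}(\xi)|\, d\xi

% Barron's bound: if \|f\|_{\mathcal{B}} < \infty, then a shallow
% network f_n with n neurons achieves, for a probability measure \mu
% on a bounded domain,
\|f - f_n\|_{L^2(\mu)} \;\le\; \frac{C\,\|f\|_{\mathcal{B}}}{\sqrt{n}}
```

The rate $n^{-1/2}$ does not degrade with the dimension $d$, which is the classical, Fourier-based picture; the paper's thesis is that measuring smoothness through the Mellin-based ADZ scale instead reveals a hidden dimension dependence.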

Business Value

Deepens the theoretical understanding of neural network expressivity and generalization, potentially leading to the design of more efficient and powerful network architectures.