📄 Abstract
We prove that the list replicability number of $d$-dimensional
$\gamma$-margin half-spaces satisfies \[ \frac{d}{2}+1 \le
\mathrm{LR}(H^d_\gamma) \le d, \] which grows with dimension. This resolves
several open problems:
• Every disambiguation of infinite-dimensional large-margin
half-spaces to a total concept class has unbounded Littlestone dimension,
answering an open question of Alon, Hanneke, Holzman, and Moran (FOCS '21).
• Every disambiguation of the Gap Hamming Distance problem in the
large-gap regime has unbounded public-coin randomized communication complexity.
This answers an open question of Fang, Göös, Harms, and Hatami (STOC '25).
• There is a separation of $O(1)$ vs $\omega(1)$ between randomized
and pseudo-deterministic communication complexity.
• The maximum list-replicability number of any finite set of points
and homogeneous half-spaces in $d$-dimensional Euclidean space is $d$,
resolving a problem of Chase, Moran, and Yehudayoff (FOCS '23).
• There exists a partial concept class with Littlestone dimension $1$
such that all its disambiguations have infinite Littlestone dimension. This
resolves a problem of Cheung, H. Hatami, P. Hatami, and Hosseini (ICALP '23).
Our lower bound follows from a topological argument based on a local
Borsuk-Ulam theorem. For the upper bound, we construct a list-replicable
learning rule using the generalization properties of SVMs.
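A list-replicable learning rule must, with high probability over independent training samples, output one of only finitely many hypotheses. A minimal toy sketch of one standard way to achieve this, rounding a learned weight vector to a fixed grid (the grid width `eps`, the sample weight vectors, and the function name are illustrative assumptions; the paper's actual rule relies on SVM generalization properties rather than naive coordinate rounding):

```python
import numpy as np

def round_to_grid(w, eps):
    """Round each coordinate of w to the nearest multiple of eps.

    If a learner's output concentrates near one grid point, rounding
    collapses nearby outputs to a single representative, so the rule
    emits one of finitely many hypotheses -- the essence of list
    replicability.  (Toy illustration only.)
    """
    return eps * np.round(np.asarray(w, dtype=float) / eps)

# Two runs of a hypothetical learner on independent samples yield
# slightly different weight vectors; rounding makes the output identical.
w_run1 = np.array([0.501, -0.248, 0.126])
w_run2 = np.array([0.498, -0.252, 0.131])
eps = 0.05
r1 = round_to_grid(w_run1, eps)
r2 = round_to_grid(w_run2, eps)
```

Here both runs round to the same hypothesis, illustrating why stability of the learned half-space (here assumed; in the paper, derived from margin-based generalization) yields a finite output list.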
Authors (5)
Ari Blondal
Hamed Hatami
Pooya Hatami
Chavdar Lalov
Sivan Tretiak
Key Contributions
This paper proves upper and lower bounds on the list replicability number of large-margin half-spaces, resolving several open problems in learning theory and communication complexity. In particular, it shows that every disambiguation of infinite-dimensional large-margin half-spaces has unbounded Littlestone dimension, that every disambiguation of the Gap Hamming Distance problem (in the large-gap regime) has unbounded public-coin randomized communication complexity, and that randomized and pseudo-deterministic communication complexity are separated by $O(1)$ vs $\omega(1)$.
Business Value
The results are primarily theoretical. Their most direct practical relevance is to replicability: bounds of this kind clarify when learning algorithms and communication protocols can be made to produce stable, reproducible outputs across independent runs, a property increasingly demanded of deployed machine-learning systems.