Poster Session 2 · Wednesday, December 3, 2025 4:30 PM → 7:30 PM
#3016 Spotlight
Robust learning of halfspaces under log-concave marginals
Abstract
We say that a classifier is adversarially robust to perturbations of norm r if, with high probability over a point x drawn from the input distribution, there is no point within distance r from x that is classified differently. The boundary volume is the probability that a point falls within distance r of a point with a different label. This work studies the task of learning a hypothesis with small boundary volume, where the input is distributed as a subgaussian isotropic log-concave distribution over ℝ^d.
Linear threshold functions are adversarially robust; they have boundary volume proportional to r. Such concept classes are efficiently learnable by polynomial regression, which produces a polynomial threshold function (PTF), but PTFs in general may have boundary volume Ω(1), even for vanishingly small r. We give an algorithm that agnostically learns linear threshold functions and returns a classifier with boundary volume O(r) at radius of perturbation r. The time and sample complexity of the algorithm matches the complexity of polynomial regression.
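To make the "boundary volume proportional to r" claim concrete, here is an illustrative Monte Carlo check (not from the paper): for a halfspace sign(w·x) under the standard Gaussian, a subgaussian isotropic log-concave distribution, a point x is within distance r of an oppositely labeled point exactly when its distance to the hyperplane, |w·x|/‖w‖, is at most r. Since w·x/‖w‖ is a standard normal, this probability is roughly 2r/√(2π) for small r, i.e., linear in r.

```python
import numpy as np

def boundary_volume(w, r, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the boundary volume at radius r of the
    halfspace sign(w.x) under the standard Gaussian on R^d."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_samples, len(w)))
    # Distance from each sample to the separating hyperplane {w.x = 0}.
    dist = np.abs(x @ w) / np.linalg.norm(w)
    return np.mean(dist <= r)

w = np.array([3.0, -1.0, 2.0, 0.5])  # an arbitrary halfspace
for r in (0.05, 0.1, 0.2):
    est = boundary_volume(w, r)
    pred = 2 * r / np.sqrt(2 * np.pi)  # small-r linear approximation
    print(f"r={r:<5} estimate={est:.4f}  ~2r/sqrt(2*pi)={pred:.4f}")
```

The estimates track the linear prediction closely, matching the claim that halfspaces have boundary volume O(r).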
Our algorithm augments the classic approach of polynomial regression with three additional steps:
- performing the ℓ1-error regression under noise sensitivity constraints,
- a structured partitioning and rounding step that returns a Boolean classifier with low error and low noise sensitivity simultaneously, and
- a local corrector that "smooths" a function with low noise sensitivity into a function that is adversarially robust.
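The abstract does not specify the local corrector's construction; a standard smoothing device in this spirit (the names and sampling scheme below are illustrative assumptions, not the paper's exact corrector) is to relabel each point by the majority vote of the base classifier over random nearby points. If the base classifier has low noise sensitivity, i.e., it rarely disagrees on correlated pairs, the majority vote changes slowly as the query point moves, which is the kind of stability adversarial robustness requires.

```python
import numpy as np

def smoothed(g, x, radius, n_votes=101, seed=0):
    """Majority vote of classifier g (returning 0/1 labels) over random
    Gaussian perturbations of x at scale `radius` -- an illustrative
    smoothing-based corrector, not the paper's specific algorithm."""
    rng = np.random.default_rng(seed)
    votes = sum(g(x + radius * rng.standard_normal(x.shape))
                for _ in range(n_votes))
    return 1 if 2 * votes > n_votes else 0

# Usage with a hypothetical base classifier (a halfspace):
w = np.array([1.0, 2.0, -1.0])
g = lambda x: int(x @ w > 0)
x = np.array([0.5, 0.5, 0.1])
print(smoothed(g, x, radius=0.05))
```

Points well inside one side of the boundary receive a near-unanimous vote, so their label cannot flip under a small perturbation of x; only points in the thin boundary region remain contested.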