Poster Session 4 · Thursday, December 4, 2025 4:30 PM → 7:30 PM
#3908

The Cost of Robustness: Tighter Bounds on Parameter Complexity for Robust Memorization in ReLU Nets

NeurIPS OpenReview

Abstract

We study the parameter complexity of robust memorization for ReLU networks: the number of parameters required to interpolate any dataset with ε-separation between differently labeled points, while ensuring predictions remain consistent within a μ-ball around each training example.
We establish upper and lower bounds on the parameter count as a function of the robustness ratio ρ = μ/ε.
Unlike prior work, we provide a fine-grained analysis across the entire range of ρ and obtain tighter upper and lower bounds that improve upon existing results. Our findings reveal that the parameter complexity of robust memorization matches that of non-robust memorization when ρ is small, but grows as ρ increases.
As a special case, when the input dimension is comparable to or exceeds the dataset size, our bounds become tight (up to logarithmic factors) across the entire range of ρ.
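The two conditions in the abstract can be made concrete with a small numerical sketch. The snippet below is purely illustrative, assuming standard definitions: a dataset is ε-separated if differently labeled points are at least distance ε apart, and a network μ-robustly memorizes it if its prediction is constant on the μ-ball around each training point. The toy network `tiny_relu_net` and its weights are hypothetical choices for this example, not a construction from the paper, and robustness is checked here by random sampling rather than exactly.

```python
import math
import random

def relu(z):
    return [max(0.0, v) for v in z]

def tiny_relu_net(x):
    # One-hidden-layer ReLU net on R^2 with hand-picked weights
    # (hypothetical, for illustration only).
    h = relu([x[0] + x[1] - 1.0, -x[0] - x[1] + 1.0])
    return 1 if h[0] > h[1] else 0

def dist(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def is_eps_separated(points, labels, eps):
    # Every pair of differently labeled points is at least eps apart.
    return all(dist(p, q) >= eps
               for p, yp in zip(points, labels)
               for q, yq in zip(points, labels)
               if yp != yq)

def is_mu_robust(net, points, labels, mu, trials=200):
    # Monte-Carlo check: sample perturbations of norm <= mu around
    # each training point and verify the prediction never changes.
    rng = random.Random(0)
    for p, y in zip(points, labels):
        for _ in range(trials):
            d = [rng.gauss(0.0, 1.0) for _ in p]
            n = math.sqrt(sum(v * v for v in d)) or 1.0
            r = mu * rng.random() ** (1.0 / len(p))
            q = [pi + r * di / n for pi, di in zip(p, d)]
            if net(q) != y:
                return False
    return True

points = [(0.0, 0.0), (2.0, 2.0)]
labels = [0, 1]
eps = 2.0   # separation between differently labeled points
mu = 0.5    # robustness radius; ratio rho = mu / eps = 0.25
print(is_eps_separated(points, labels, eps))            # True
print(is_mu_robust(tiny_relu_net, points, labels, mu))  # True
```

Here the decision boundary of the toy net is the line x + y = 1, which lies at distance 1/√2 ≈ 0.71 from both training points, so perturbations of norm up to μ = 0.5 cannot flip either prediction.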