3 papers across 2 sessions
We prove the first asymptotically tight generalization bound for large-margin halfspaces.
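For context, margin-based bounds for halfspaces typically take the following generic form (a standard statement for illustration, not this paper's result): for a unit-norm halfspace w, data of radius R, margin parameter \gamma > 0, and n i.i.d. samples, with probability at least 1 - \delta,

  \Pr[\, y \langle w, x \rangle \le 0 \,] \;\le\; \widehat{\mathrm{err}}_\gamma(w) \;+\; O\!\left(\sqrt{\tfrac{R^2}{\gamma^2 n}}\right) \;+\; O\!\left(\sqrt{\tfrac{\log(1/\delta)}{n}}\right),

where \widehat{\mathrm{err}}_\gamma(w) is the fraction of training points with margin at most \gamma. The paper's contribution, per the summary above, is a bound of this flavor that is additionally asymptotically tight.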
We construct an exact reproducing kernel Banach space (RKBS) model for neural networks of arbitrary width, depth, and topology, and use this model to derive tight bounds on their Rademacher complexity.
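For reference, the quantity being bounded is the empirical Rademacher complexity of the network class \mathcal{F} on a sample S = (x_1, \dots, x_n); this is the standard definition, independent of the paper's RKBS construction:

  \widehat{\mathfrak{R}}_S(\mathcal{F}) \;=\; \mathbb{E}_{\sigma}\!\left[\, \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(x_i) \right], \qquad \sigma_i \ \text{i.i.d. uniform on } \{\pm 1\},

which controls generalization through bounds of the form \mathrm{err}(f) \le \widehat{\mathrm{err}}(f) + 2\,\widehat{\mathfrak{R}}_S(\mathcal{F}) + O\!\left(\sqrt{\log(1/\delta)/n}\right).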
The paper derives generalization bounds for selective state-space models (SSMs) via connections to self-attention, showing that spectral properties of the state matrix influence generalization.
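A sketch of why the state matrix's spectrum matters, using a generic selective-SSM recurrence with input-dependent matrices (not necessarily the paper's exact parameterization):

  h_t = \bar{A}_t h_{t-1} + \bar{B}_t x_t, \qquad y_t = C_t h_t,

so unrolling gives h_t = \sum_{s=1}^{t} \left( \prod_{r=s+1}^{t} \bar{A}_r \right) \bar{B}_s x_s, and \|h_t\| \le \sum_{s=1}^{t} \rho^{\,t-s} \|\bar{B}_s x_s\| whenever \sup_t \|\bar{A}_t\|_2 \le \rho. Keeping the spectral norm \rho below 1 prevents the unrolled sum from growing with sequence length, which is the kind of spectral dependence such generalization bounds typically exhibit.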