Poster Session 4 · Thursday, December 4, 2025 4:30 PM → 7:30 PM
#3104 Spotlight
Sample-Adaptivity Tradeoff in On-Demand Sampling
Abstract
We study the tradeoff between sample complexity and round complexity in on-demand sampling, where the learning algorithm adaptively draws samples from multiple distributions over a limited number of rounds of interaction. In the realizable setting of Multi-Distribution Learning (MDL), we characterize how the optimal sample complexity of an r-round algorithm scales as a function of r.
For the general agnostic case, we present an algorithm that achieves near-optimal sample complexity within a polynomial number of rounds.
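To make the round-limited interaction model concrete, here is a minimal, purely illustrative Python sketch of an r-round on-demand sampling protocol: in each round the learner commits to a batch of sample requests across the k distributions, observes all results, and only then adapts its allocation. The function name, the proportional budget split, and the multiplicative reweighting rule are our own assumptions for illustration, not the paper's algorithm.

```python
import random

def on_demand_sampling(distributions, rounds, budget_per_round, seed=0):
    """Illustrative r-round on-demand sampling loop (hypothetical).

    Each round, the learner allocates a fixed sample budget across
    the k distributions in proportion to its current weights, draws
    the whole batch, then updates the weights -- so adaptivity is
    limited to `rounds` batched interactions.
    """
    rng = random.Random(seed)
    k = len(distributions)
    weights = [1.0] * k          # learner's preference over distributions
    history = []                 # log of (round, distribution index, sample)
    for t in range(rounds):
        total = sum(weights)
        # commit to this round's requests: split the budget by weight,
        # requesting at least one sample from every distribution
        alloc = [max(1, round(budget_per_round * w / total)) for w in weights]
        for i, n_i in enumerate(alloc):
            for _ in range(n_i):
                history.append((t, i, distributions[i](rng)))
        # adaptive step (a stand-in update rule): upweight distributions
        # whose samples have a higher empirical mean, mimicking a learner
        # that requests more data from "harder" distributions
        for i in range(k):
            xs = [x for (_, j, x) in history if j == i]
            weights[i] *= 1.0 + sum(xs) / len(xs)
    return history, weights
```

For example, calling this with two toy distributions, `rounds=3`, and `budget_per_round=10` produces a sample log whose round indices are exactly {0, 1, 2}, with every distribution sampled in every round.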
Of independent interest, we introduce a new framework, Optimization via On-Demand Sampling (OODS), which abstracts the sample-adaptivity tradeoff and captures most existing MDL algorithms. We establish nearly tight bounds on the round complexity in the OODS setting.
The upper bounds directly yield the round-efficient algorithm for agnostic MDL, while the lower bounds imply that achieving sub-polynomial round complexity would require fundamentally new techniques that bypass the inherent hardness of OODS.