Poster Session 1 · Wednesday, December 3, 2025 11:00 AM → 2:00 PM
#610

The Adaptive Complexity of Minimizing Relative Fisher Information

NeurIPS OpenReview

Abstract

Non-log-concave sampling from an unnormalized density is fundamental in machine learning and statistics.
As datasets grow larger, computational efficiency becomes increasingly important, particularly in reducing adaptive complexity, namely the number of sequential rounds a sampling algorithm requires. In this work, we initiate the study of the adaptive complexity of non-log-concave sampling within the framework of relative Fisher information introduced by Balasubramanian et al. (2022).
To drive the relative Fisher information to the target distribution below a prescribed accuracy threshold, we propose a novel algorithm that reduces the adaptive complexity by leveraging parallelism. Furthermore, we show our algorithm is optimal in a specific regime of the problem parameters. Our algorithm builds on a diagonally parallelized Picard iteration, while the lower bound is based on a reduction from the problem of finding stationary points.
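The parallelized Picard iteration mentioned above can be illustrated with a minimal sketch. The idea (common to parallel sampling schemes) is to fix the Brownian increments of a discretized Langevin diffusion and repeatedly apply the Picard fixed-point map to the whole trajectory at once: each sweep evaluates all gradients in a single parallel round, so the adaptive complexity is the number of sweeps rather than the number of discretization steps. The function names (`grad_log_pi`, `parallel_picard`) and the standard-Gaussian target are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def grad_log_pi(x):
    # Illustrative target: standard Gaussian, so grad log pi(x) = -x.
    return -x

def parallel_picard(x0, n_steps, h, n_sweeps, rng):
    """Picard iteration for the Euler-discretized Langevin diffusion.

    x0:       initial point, shape (d,)
    n_steps:  number of discretization steps
    h:        step size
    n_sweeps: number of Picard sweeps (the adaptive rounds)
    """
    d = x0.shape[0]
    # Brownian increments are sampled once and shared across all sweeps.
    xi = rng.standard_normal((n_steps, d)) * np.sqrt(2 * h)
    # Initial trajectory guess: the constant path at x0.
    X = np.tile(x0, (n_steps + 1, 1))
    for _ in range(n_sweeps):
        # One sweep: all n_steps gradients evaluated in parallel,
        # then the trajectory is rebuilt by a prefix sum.
        drift = grad_log_pi(X[:-1])
        incr = h * drift + xi
        X = np.vstack([x0, x0 + np.cumsum(incr, axis=0)])
    return X[-1]  # approximate sample from the target
```

With exact Picard convergence this reproduces the sequential Euler-Maruyama trajectory, but using only `n_sweeps` adaptive rounds of gradient queries instead of `n_steps`.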