PhD student, Technical University of Munich
1 paper at NeurIPS 2025
We propose a stochastic federated learning framework with inherent communication regularization and principled compression via remote source generation, achieving 5–32× communication savings backed by theoretical guarantees.