We propose a stochastic federated learning framework with inherent communication regularization and principled compression via remote source generation. It achieves 5–32× communication savings, underpinned by theoretical guarantees.