Full Professor, University of Texas Health Science Center at Houston
1 paper at NeurIPS 2025
We propose DictPFL, a framework for efficient and private federated learning (FL): each client encrypts only the gradients it shares and keeps most gradients local, while still preserving the performance of global gradient aggregation.
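The core idea of sharing only an encrypted subset of gradients can be illustrated with a minimal sketch. This is not the paper's actual protocol: the layer names, the choice of shared keys, and the use of zero-sum additive masks as a stand-in for homomorphic encryption are all illustrative assumptions; the point is only that the unshared gradients never leave the client, yet the server still recovers the correct aggregate of the shared part.

```python
import numpy as np

def partition(grads, shared_keys):
    """Split a gradient dict into a shared part (encrypted and sent to
    the server) and a local part that never leaves the client."""
    shared = {k: v for k, v in grads.items() if k in shared_keys}
    local = {k: v for k, v in grads.items() if k not in shared_keys}
    return shared, local

def aggregate_shared(masked_updates):
    """Server-side: average the masked shared gradients. The zero-sum
    masks cancel, so the result equals the plain average."""
    keys = masked_updates[0].keys()
    return {k: np.mean([u[k] for u in masked_updates], axis=0) for k in keys}

# --- toy run with 3 clients and hypothetical layer names ---
rng = np.random.default_rng(0)
n_clients = 3
client_grads = [
    {"head.w": rng.normal(size=4), "body.w": rng.normal(size=8)}
    for _ in range(n_clients)
]
shared_keys = {"head.w"}  # assumption: only this layer's gradient is shared

# Zero-sum additive masks stand in for encryption here: each masked
# update hides the individual gradient, but the masks cancel when the
# server sums across clients.
masks = rng.normal(size=(n_clients, 4))
masks -= masks.mean(axis=0)

masked, locals_kept = [], []
for grads, m in zip(client_grads, masks):
    shared, local = partition(grads, shared_keys)
    masked.append({"head.w": shared["head.w"] + m})
    locals_kept.append(local)  # "body.w" stays on the client

agg = aggregate_shared(masked)
plain_avg = np.mean([g["head.w"] for g in client_grads], axis=0)
print(np.allclose(agg["head.w"], plain_avg))  # masks cancel under averaging
```

A real deployment would replace the additive masks with a proper homomorphic encryption scheme, but the partition-then-aggregate structure stays the same.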