Assistant Professor, Florida State University
4 papers at NeurIPS 2025
We propose DictPFL, a framework for efficient and private federated learning (FL) that encrypts only the gradients that must be shared and keeps the rest local, while preserving the performance of global gradient aggregation.
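To make the split concrete, here is a minimal sketch of the shared/local gradient division under stated assumptions: `encrypt`/`decrypt` stand in for a real homomorphic-encryption backend (e.g., CKKS), and `shared_idx` is a hypothetical name for the small slice of gradients that leaves each client. An illustration of the idea, not the DictPFL code.

```python
# A minimal sketch, not the DictPFL implementation: encrypt/decrypt are
# placeholders for a real homomorphic-encryption backend (e.g., CKKS), and
# shared_idx is a hypothetical name for the slice of gradients that is shared.
import numpy as np

def encrypt(x):
    return x.copy()  # placeholder: a real HE scheme returns a ciphertext

def decrypt(x):
    return x.copy()

def client_share(grad, shared_idx):
    # Only this small slice ever leaves the client, and only encrypted.
    return encrypt(grad[shared_idx])

def server_aggregate(ciphers):
    # Additively homomorphic schemes allow this sum over ciphertexts.
    return sum(ciphers) / len(ciphers)

def client_merge(grad, shared_idx, agg_cipher):
    merged = grad.copy()
    merged[shared_idx] = decrypt(agg_cipher)  # globally aggregated part
    return merged                             # the rest never left the client

# Toy round: 3 clients, 10 parameters, 3 of which are shared.
rng = np.random.default_rng(0)
shared_idx = np.arange(3)
grads = [rng.normal(size=10) for _ in range(3)]
agg = server_aggregate([client_share(g, shared_idx) for g in grads])
merged = [client_merge(g, shared_idx, agg) for g in grads]
```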
An all-in-one framework that significantly speeds up diffusion models by dynamically creating pruned temporal experts during fine-tuning.
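As a rough illustration of the idea (not the paper's architecture), the sketch below routes each diffusion timestep to one of several pruned copies of a toy denoiser; `width_mults` is a made-up stand-in for whatever pruning pattern the fine-tuning procedure actually produces.

```python
# Hypothetical sketch of timestep-routed pruned experts; all names and the
# pruning scheme are assumptions, not the paper's code.
import torch
import torch.nn as nn

class ToyDenoiser(nn.Module):
    def __init__(self, dim=64, width_mult=1.0):
        super().__init__()
        hidden = max(1, int(dim * width_mult))  # pruning = narrower hidden layer
        self.net = nn.Sequential(nn.Linear(dim + 1, hidden), nn.SiLU(),
                                 nn.Linear(hidden, dim))

    def forward(self, x, t):
        t_feat = torch.full((x.shape[0], 1), float(t))
        return self.net(torch.cat([x, t_feat], dim=-1))

class TemporalExperts(nn.Module):
    """Routes each timestep to the expert owning its range of the schedule."""
    def __init__(self, num_steps=1000, width_mults=(1.0, 0.75, 0.5, 0.25)):
        super().__init__()
        self.experts = nn.ModuleList(ToyDenoiser(width_mult=w) for w in width_mults)
        self.steps_per_expert = num_steps // len(width_mults)

    def forward(self, x, t):
        idx = min(int(t) // self.steps_per_expert, len(self.experts) - 1)
        return self.experts[idx](x, t)

model = TemporalExperts()
eps = model(torch.randn(8, 64), t=900)  # late timestep -> most pruned expert
```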
CSCR embeds both prompts and LLMs into a shared space using fast logit or perplexity fingerprints, and trains that space with a cost-banded InfoNCE loss to balance quality against cost. It generalizes to unseen models and out-of-distribution prompts.
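A minimal sketch of what a cost-banded InfoNCE objective could look like, assuming positives are models that answer a prompt well and whose cost falls inside an admissible band; the names, thresholds, and positive-pair rule are illustrative assumptions, not the CSCR training code.

```python
# Illustrative only; names, thresholds, and the positive-pair rule are
# assumptions, not the CSCR training code.
import torch
import torch.nn.functional as F

def cost_banded_infonce(prompt_emb, model_emb, quality, cost, band, tau=0.07):
    """prompt_emb: (B, d); model_emb: (N, d); quality: (B, N) in [0, 1];
    cost: (N,); band: (low, high) cost range counted as admissible."""
    sim = F.normalize(prompt_emb, dim=-1) @ F.normalize(model_emb, dim=-1).T
    logits = sim / tau                                # (B, N)
    in_band = (cost >= band[0]) & (cost <= band[1])   # (N,)
    pos = (quality > 0.5) & in_band.unsqueeze(0)      # positives: good AND affordable
    # Standard InfoNCE over all models, averaging log-probabilities of positives.
    log_prob = logits - torch.logsumexp(logits, dim=-1, keepdim=True)
    per_prompt = -(log_prob * pos.float()).sum(-1) / pos.sum(-1).clamp(min=1)
    return per_prompt[pos.any(-1)].mean()  # prompts with no admissible model are skipped

loss = cost_banded_infonce(torch.randn(32, 128), torch.randn(10, 128),
                           quality=torch.rand(32, 10), cost=torch.rand(10),
                           band=(0.2, 0.8))
```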
We propose RBD, a plug-in module that detects and corrects biased LLM evaluations through structured reasoning, significantly improving accuracy, consistency, and scalability across multiple bias types and evaluator models.
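One way such a plug-in could be wired, sketched with a placeholder `call_llm` client and illustrative prompts; the actual RBD bias checks and reasoning structure are the paper's, not reproduced here.

```python
# Hypothetical detect-then-correct wrapper around an LLM judge (not the RBD
# implementation). call_llm is a stand-in for any chat-completion client.
def call_llm(prompt: str) -> str:
    raise NotImplementedError("plug in your LLM client here")

def debiased_judge(question: str, answer_a: str, answer_b: str) -> str:
    verdict = call_llm(
        f"Which answer is better?\nQ: {question}\nA: {answer_a}\nB: {answer_b}"
    )
    check = call_llm(
        "Does the verdict below rely on superficial cues (length, position, "
        f"style) rather than correctness? Answer yes/no.\nVerdict: {verdict}"
    )
    if check.strip().lower().startswith("yes"):
        # Re-evaluate with explicit step-by-step criteria before deciding.
        verdict = call_llm(
            "Re-judge the answers by first listing correctness criteria, "
            "scoring each answer against them, then giving a final verdict.\n"
            f"Q: {question}\nA: {answer_a}\nB: {answer_b}"
        )
    return verdict
```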