PhD student, Department of Computer Science, ETH Zurich
2 papers at NeurIPS 2025
We propose PoLAR, a polar-decomposition-based parameterization for efficient fine-tuning of LLMs. PoLAR mitigates the low stable rank observed in LoRA updates, provably accelerates convergence on a canonical LoRA problem, and improves accuracy on real-world tasks.
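The two notions in the abstract — stable rank and the polar decomposition — can be illustrated with a small NumPy sketch. This is my own illustrative code, not PoLAR's actual parameterization: the function names and the random low-rank example are assumptions made for exposition.

```python
import numpy as np

def stable_rank(W):
    """Stable rank ||W||_F^2 / ||W||_2^2: a smooth, scale-invariant rank proxy."""
    s = np.linalg.svd(W, compute_uv=False)
    return float((s ** 2).sum() / s[0] ** 2)

def polar_factors(W):
    """Polar decomposition W = Q @ H via SVD: Q has orthonormal columns, H is PSD."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    Q = U @ Vt                    # direction factor (orthonormal columns)
    H = Vt.T @ (s[:, None] * Vt)  # magnitude factor (symmetric positive semidefinite)
    return Q, H

# A rank-r LoRA-style update B @ A: its stable rank is at most r by definition,
# and is often considerably smaller than the nominal rank r in practice.
rng = np.random.default_rng(0)
r = 16
B = rng.standard_normal((256, r))
A = rng.standard_normal((r, 128))
dW = B @ A
print("stable rank:", stable_rank(dW))  # always <= r = 16
Q, H = polar_factors(dW)
print("Q H reconstructs dW:", np.allclose(Q @ H, dW))
```

Separating the update into an orthonormal "direction" factor and a PSD "magnitude" factor is the core idea a polar-decomposition-based parameterization builds on; the sketch only verifies the factorization, not PoLAR's training procedure.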