3 papers across 3 sessions
We prove generalization bounds for neural networks that exploit approximate low-rank structure in the weights.
The paper proposes Core Space Merging, a method to efficiently merge LoRA-adapted models by aligning them in a shared low-rank subspace, achieving higher accuracy and major speedups over prior merging techniques.
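The mechanism described, aligning LoRA updates in a shared low-rank subspace before merging, can be illustrated with a toy sketch. This is not the paper's actual Core Space Merging algorithm; all dimensions, the SVD-based shared basis, and the simple averaging step are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: merge two LoRA weight updates by projecting them
# into a shared low-rank subspace, averaging there, and mapping back.
rng = np.random.default_rng(0)
d, k, r = 64, 32, 4  # toy weight dims and LoRA rank

# Two LoRA-adapted updates, each delta_W = B @ A with rank r.
deltas = [rng.standard_normal((d, r)) @ rng.standard_normal((r, k))
          for _ in range(2)]

# Build a shared low-rank basis from the stacked updates via SVD.
stacked = np.concatenate(deltas, axis=1)           # (d, 2k)
U, _, _ = np.linalg.svd(stacked, full_matrices=False)
U = U[:, :2 * r]                                   # shared column subspace

# Align each update by projecting into the shared subspace, then average.
cores = [U.T @ dW for dW in deltas]                # (2r, k) each
merged_core = np.mean(cores, axis=0)
merged_delta = U @ merged_core                     # back to weight space

print(merged_delta.shape)  # (64, 32)
```

Because the shared basis here spans both updates exactly, the projection is lossless in this toy case; the point is only to show the project-align-average-reconstruct pattern that subspace merging methods follow.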