We introduce 3BASiL-TM, a highly efficient one-shot post-training method for Sparse plus Low-Rank decomposition of LLMs that narrows the WikiText2 perplexity gap to the dense model by over $30\%$ compared to prior methods.
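The sparse-plus-low-rank idea can be sketched generically: approximate a weight matrix $W \approx S + L$, where $L$ is low-rank and $S$ is sparse. The sketch below uses a truncated SVD for $L$ and magnitude selection on the residual for $S$; this is a minimal illustration of the decomposition target, not the 3BASiL-TM procedure itself, and the `rank`/`density` split is an assumed parameterization.

```python
import numpy as np

def sparse_plus_low_rank(W, rank, density):
    """Illustrative one-shot split W ~ S + L (not the 3BASiL-TM algorithm).

    L: best rank-`rank` approximation via truncated SVD.
    S: largest-magnitude entries of the residual, keeping a `density`
       fraction of all entries nonzero.
    """
    # Low-rank component from the truncated SVD.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    L = (U[:, :rank] * s[:rank]) @ Vt[:rank]

    # Sparse component: threshold the residual at the k-th largest magnitude.
    R = W - L
    k = int(density * R.size)
    thresh = np.partition(np.abs(R).ravel(), -k)[-k]
    S = np.where(np.abs(R) >= thresh, R, 0.0)
    return S, L

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))
S, L = sparse_plus_low_rank(W, rank=8, density=0.1)
# Relative reconstruction error of the combined S + L approximation.
rel_err = np.linalg.norm(W - (S + L)) / np.linalg.norm(W)
```

In practice, one-shot LLM compression methods choose $S$ and $L$ jointly to minimize layer output error rather than weight error, which is where methods differ; the SVD-then-prune ordering above is only the simplest baseline.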