PhD student, The Chinese University of Hong Kong
1 paper at NeurIPS 2025
We introduce UniGist, a unified gist token-based long-context compression method that requires no chunk-wise training and significantly improves long-context retention and efficiency through a hardware-aligned design.