MS student, Zhejiang University
1 paper at NeurIPS 2025
We propose a confidence-separation self-distillation strategy and train spiking neural networks (SNNs) with rate-coded inputs, achieving strong performance with minimal training overhead.
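As a minimal illustration of the rate-coding input scheme mentioned above (a generic Poisson rate-coding sketch, not the paper's actual code; `rate_encode` and its parameters are hypothetical names):

```python
import numpy as np

def rate_encode(x: np.ndarray, T: int, seed: int = 0) -> np.ndarray:
    """Poisson rate coding: map intensities in [0, 1] to a binary
    spike train of length T, firing with probability equal to the
    intensity at each timestep. Illustrative helper only."""
    rng = np.random.default_rng(seed)
    x = np.clip(x, 0.0, 1.0)
    # Compare a uniform sample per timestep against the intensity.
    return (rng.random((T,) + x.shape) < x).astype(np.float32)

# Average firing rate over time approximates the input intensity.
img = np.array([0.0, 0.5, 1.0])
spikes = rate_encode(img, T=200)
print(spikes.shape)          # (200, 3)
print(spikes.mean(axis=0))   # roughly [0.0, 0.5, 1.0]
```

Longer time windows T trade latency for a closer approximation of the underlying intensity, which is the usual accuracy/efficiency knob in rate-coded SNN training.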