Professor, Yale University
1 paper at NeurIPS 2025
We present a theoretical framework for mask-based pretraining grounded in high-dimensional statistics, and introduce R²MAE, a novel pretraining scheme that improves self-supervised learning across diverse data domains.