Staff Machine Learning Scientist, Layer 6 AI
3 papers at NeurIPS 2025
We introduce TabDPT, a tabular foundation model that delivers highly accurate predictions on unseen tabular datasets with no further training or hyperparameter tuning, and we demonstrate scaling in both model size and pre-training dataset size.
We apply conformal prediction to provide statistical guarantees that an automatically generated summary captures all important information in a long text.
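The guarantee above rests on the standard split-conformal quantile rule. A minimal, generic sketch of that rule (illustrative only; the function and scores here are hypothetical, not the paper's actual nonconformity measure):

```python
# Minimal split-conformal sketch (generic; not the paper's method).
# Given nonconformity scores from a held-out calibration set, compute
# the threshold tau such that a fresh example's score falls below tau
# with probability at least 1 - alpha (exchangeability assumed).

import math

def conformal_threshold(cal_scores, alpha):
    """Return the finite-sample-corrected (1 - alpha) empirical quantile."""
    n = len(cal_scores)
    # Correction: take the ceil((n + 1) * (1 - alpha))-th smallest score.
    k = math.ceil((n + 1) * (1 - alpha))
    if k > n:
        # Too few calibration points for this alpha: vacuous threshold.
        return float("inf")
    return sorted(cal_scores)[k - 1]

# Toy example: 9 calibration scores, target 90% coverage.
scores = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
tau = conformal_threshold(scores, alpha=0.1)
# A new example is accepted as "covered" when its score is <= tau.
```

In the summarization setting, the score would measure how much source information a candidate summary misses; the threshold then certifies coverage of important content at the chosen confidence level.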
CausalPFN is a pre-trained transformer that amortizes causal effect estimation: trained once on simulated data-generating processes, it outputs calibrated effect estimates for new observational datasets with no per-dataset tuning.