We propose GOOD, a training-free framework that leverages off-the-shelf classifiers to guide diffusion models toward generating diverse, informative out-of-distribution (OOD) samples—improving outlier exposure without requiring external datasets or embedding alignment.
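To make the idea concrete, here is a minimal toy sketch of classifier-guided sampling, not the paper's actual method: a 2-D "reverse diffusion" update is nudged by the gradient of a classifier's predictive entropy, so samples drift toward uncertain (boundary) regions, which is one plausible reading of "informative OOD". The linear softmax classifier, the entropy objective, and all function names (`entropy_grad`, `guided_reverse_step`) are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for an off-the-shelf classifier:
# a linear softmax head on 2-D inputs with 3 classes.
W = rng.normal(size=(2, 3))

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

def predictive_entropy(x):
    # H(p(y|x)) of the toy classifier at input x.
    p = softmax(x @ W)
    return -(p * np.log(p + 1e-12)).sum()

def entropy_grad(x, eps=1e-4):
    # Central-difference gradient of predictive entropy w.r.t. x
    # (a real implementation would use autograd instead).
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (predictive_entropy(x + d) - predictive_entropy(x - d)) / (2 * eps)
    return g

def guided_reverse_step(x, scale=0.5, noise=0.05):
    # One toy reverse-diffusion update: a denoising pull toward the
    # data mean (here 0, standing in for the learned score) plus
    # classifier guidance toward high-entropy regions.
    denoise = -0.1 * x
    guide = scale * entropy_grad(x)
    return x + denoise + guide + noise * rng.normal(size=x.shape)

x = rng.normal(size=2)
for _ in range(50):
    x = guided_reverse_step(x)

print(np.round(softmax(x @ W), 3))
```

The key design point the sketch illustrates is that guidance only needs gradients *through* a frozen classifier at sampling time, so no diffusion-model retraining or auxiliary outlier dataset is involved.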