Researcher, Flatiron Institute
2 papers at NeurIPS 2025
We develop a score-based variational inference method that learns a product-of-t-experts model via a Feynman-identity latent-variable formulation, reducing inference to a sequence of convex quadratic programs with provable convergence.
Adam with equal betas (β₁ = β₂) works well in practice, and its update rule simplifies in a way that yields further insight.
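One concrete simplification (a standard observation, not necessarily the one the note refers to): when β₁ = β₂ = β, the first and second moments are exponential averages with identical weights, so Cauchy–Schwarz gives m_t² ≤ (1 − βᵗ)·v_t, and the bias-corrected update m̂_t/√v̂_t is bounded by 1 elementwise. Equal-beta Adam therefore behaves like a smoothed sign-descent method. A minimal NumPy sketch (the function name and gradient sequence are illustrative, not from the source):

```python
import numpy as np

def adam_equal_beta_updates(grads, beta=0.9, eps=0.0):
    """Bias-corrected Adam updates m_hat / sqrt(v_hat) over a scalar
    gradient sequence, with beta1 = beta2 = beta (no learning rate)."""
    m = v = 0.0
    updates = []
    for t, g in enumerate(grads, start=1):
        m = beta * m + (1 - beta) * g          # first moment (EMA of g)
        v = beta * v + (1 - beta) * g * g      # second moment (EMA of g^2)
        m_hat = m / (1 - beta**t)              # bias correction
        v_hat = v / (1 - beta**t)
        updates.append(m_hat / (np.sqrt(v_hat) + eps))
    return np.array(updates)

rng = np.random.default_rng(0)
grads = rng.normal(scale=50.0, size=1000)      # scale is irrelevant: the bound is scale-free
updates = adam_equal_beta_updates(grads)
# Cauchy-Schwarz bound: every update has magnitude at most 1.
assert np.all(np.abs(updates) <= 1.0 + 1e-12)
```

Because the bound holds for every gradient scale, the update magnitude never exceeds the learning rate, which is one way the equal-beta form makes Adam's behavior easier to analyze.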