Full Professor, Universität Stuttgart
5 papers at NeurIPS 2025
A new learning scheme for the multi-modal LLM LLaVA
Optimizing a 3D point cloud transformer model for large-scale processing
We introduce Adaptive Constrained Equivariance, a homotopy-inspired constrained optimization approach for training equivariant neural networks.
We reduce training variance in equivariant generative models using a low-variance gradient estimator, improving stability and performance across molecular, crystal, and protein generation tasks.
We introduce CALM-PDE, a model based on continuous convolutions for discretization-agnostic solving of time-dependent partial differential equations, which achieves strong performance in fluid dynamics simulations.