Researcher, Amazon
1 paper at NeurIPS 2025
Training LLMs with tensor parallelism without fully synchronizing activations, to accelerate training and inference.
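To make the synchronization in question concrete, here is a minimal NumPy sketch of standard (Megatron-style) tensor parallelism, where the final all-reduce over partial activations is the step the research direction above seeks to relax. The two-shard setup and all names are illustrative assumptions, not the author's method.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))      # batch of input activations
W1 = rng.standard_normal((8, 16))    # first linear layer
W2 = rng.standard_normal((16, 8))    # second linear layer

# Reference: single-device forward pass (nonlinearity omitted for brevity).
ref = X @ W1 @ W2

# Tensor parallelism across 2 hypothetical devices:
# W1 is split column-wise, W2 row-wise, so each device holds one shard of each.
W1_shards = np.split(W1, 2, axis=1)  # each shard is (8, 8)
W2_shards = np.split(W2, 2, axis=0)  # each shard is (8, 8)

# Each device computes its partial output entirely locally...
partials = [(X @ W1_s) @ W2_s for W1_s, W2_s in zip(W1_shards, W2_shards)]

# ...and the all-reduce (summing partial activations across devices) is the
# synchronization point that full tensor parallelism normally requires.
out = sum(partials)

assert np.allclose(out, ref)
```

The sum over `partials` stands in for the cross-device all-reduce; avoiding or approximating it removes a communication barrier at every sharded layer.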