MS student, University of Chinese Academy of Sciences
1 paper at NeurIPS 2025
We systematically investigate the design space and scaling properties of native Multimodal Large Language Models (MLLMs) and introduce a novel model that achieves performance competitive with existing MLLMs.