PhD student, The Chinese University of Hong Kong
1 paper at NeurIPS 2025
We systematically investigate the design space and scaling properties of native multimodal large language models (MLLMs) and introduce a novel MLLM that performs competitively against existing MLLMs.