I am a PhD student (since September 2024) supervised by Prof. Chenhui Li at the Shanghai Institute of AI Education, East China Normal University, and I also work with the InternLM team led by Dr. Qipeng Guo at the Shanghai Artificial Intelligence Laboratory and the Shanghai Innovation Institute. I completed my academic Master's degree in Computer Science and Technology at East China Normal University, supervised by Prof. Chenhui Li and Prof. Changbo Wang (2021-2024), and my undergraduate studies in Mathematics & Applied Mathematics and Computer Science at East China University of Science and Technology (2017-2021). My research focuses on cross-modal LLM agents, multimodal image editing, and computer graphics. I have served as a reviewer for top-tier conferences, including IEEE VIS and CVPR.
🔥 News
- 2025.05: 🎉 One paper accepted to ICML 2025.
- 2025.05: 🎉 One paper accepted to ACL 2025 (main conference).
- 2024.03: 🎉 Received an admission offer through the Spring Camp selection at the Shanghai Innovation Institute; expected to enroll in September 2025.
- 2024.09: 🎓 Started my PhD at the Shanghai Institute of AI Education, East China Normal University.
- 2024.05: 💼 Joined the InternLM group at the Shanghai Artificial Intelligence Laboratory as an LLM Research Intern.
📝 Publications
TextCenGen: Attention-Guided Text-Centric Background Adaptation for Text-to-Image Generation
Tianyi Liang†, Jiangqi Liu†, Yifei Huang, Shiqi Jiang, Jianshen Shi, Changbo Wang, and Chenhui Li*.
ICML 2025, Accepted (CCF A)
arXiv | Code | Project Page
💼 Internships
- 2024.05 - Present: Shanghai Artificial Intelligence Laboratory, LLM Algorithm Research Intern
- Worked on the OpenCompass GAOKAO evaluation and the InternLM2-WQX-VL-20B model.
- Developed a Verifier-Guided workflow for quality control of web data and PDF corpora in the InternLM3 project, and proposed a reflection-based criteria optimization method that improved the bad-case detection F1 score from 66% to 86%. More details can be found in CritiQ.
- Conducting research on LLM-as-Judge evaluation and synthetic data generation for large language model pre-training. Implemented domain filtering and Verifier-Guided Rephrase techniques, reducing the required training data by 85% while maintaining model performance.
🎖 Honors and Awards
- 2021.07 Shanghai Outstanding Graduate & Outstanding Thesis Award
- 2019.12 National College Student Mathematical Modeling Competition, Shanghai Region First Prize
- 2019.12 National Scholarship
- 2018.12 Shanghai Scholarship