Yiming Cui is a principal scientist at iFLYTEK Research. He received his doctoral degree from Harbin Institute of Technology (HIT), supervised by Prof. Ting Liu. Before that, he received his M.S. (supervised by Prof. Tiejun Zhao) and B.S. degrees from Harbin Institute of Technology, majoring in computer science. His research interests include Large Language Models (LLM), Pre-trained Language Models (PLM), Machine Reading Comprehension (MRC), Question Answering (QA), AI for Science (AI4S), etc.

His mother tongues are Chinese and Korean (he belongs to the Korean ethnic minority in China). His Chinese name is 崔一鸣 (pronunciation: /tsui i ming/), and his Korean name is 최일명 (Choi Ilmyeong). He can also speak English and Japanese (JLPT N1).

BTW, his favorite drinks are Starbucks® Caffè Americano and Hazelnut Flavored Latte ☕️.

News

[Dec, 2025] Our paper “Evaluating Large Language Models on Multimodal Chemistry Olympiad Exams” is published in Communications Chemistry. [x.com] [WeChat]

[Dec, 2025] I am honored to serve as a Senior Area Chair for ACL 2026.

[Nov, 2025] I am honored to serve as an Associate Editor for IEEE Transactions on Audio, Speech, and Language Processing (TASLPRO).

[Sep, 2025] I am honored to be included in the “World Top 2% Scientists List” for the third consecutive year.

[Aug, 2025] I am honored to serve as a Senior Area Chair for LREC 2026.

[Aug, 2025] Our paper “You Might Not Need Attention Diagonals” is accepted by IEEE Signal Processing Letters.

[Aug, 2025] Our paper “Chart2Code53: A Large-Scale Diverse and Complex Dataset for Enhancing Chart-to-Code Generation” is accepted by EMNLP 2025 (main).

[May, 2025] Our book is selected as one of the “Top 1% Highly Cited Books (2019-2023)” by CNKI. See more: [link]

[Mar, 2025] Our new book “Natural Language Processing: A Large Language Model Approach” is on sale!

