I am currently a postdoctoral researcher at the Department of Electronic Engineering (EE), Tsinghua University. My research focuses on applications of large language models (LLMs).
- Gan, H., Li, Y., Li, W., & Tang, W. (2025, October). Aligned or Apart? Multi-Agent Insights into Consumer and Brand Messaging Discrepancies. In Proceedings of the 33rd ACM International Conference on Multimedia (pp. 6558-6566).
- Li, Y., Hou, X., Zheng, D., Shen, L., & Zhao, Z. FLIP-80M: 80 Million Visual-Linguistic Pairs for Facial Language-Image Pre-Training. In Proceedings of the 32nd ACM International Conference on Multimedia. [paper][code]
- Xie, J., Ye, K., Li, Y., et al. Learning Visual Prior via Generative Pre-Training. In Advances in Neural Information Processing Systems. [paper]
- Li, Y., Feng, Y., Zhou, W., Zhao, Z., et al. Dynamic Data Sampler for Cross-Language Transfer Learning in Large Language Models. In Proceedings of the 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). [paper][code]
- Zhao, Z., Li, Y., Hou, C., et al. TencentPretrain: A Scalable and Flexible Toolkit for Pre-training Models of Different Modalities. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics. [paper][code]
- Li, Y., Hou, X., Zhao, Z., et al. Talk2Face: A Unified Sequence-Based Framework for Diverse Face Generation and Analysis Tasks. In Proceedings of the 30th ACM International Conference on Multimedia. [paper][code]
- Li, Y., Zhang, Y., Zhao, Z., et al. CSL: A Large-scale Chinese Scientific Literature Dataset. In Proceedings of the 29th International Conference on Computational Linguistics (pp. 3917-3923). [paper][code]
- Hou, X., Zhang, X., Li, Y., et al. TextFace: Text-to-Style Mapping Based Face Generation and Manipulation. IEEE Transactions on Multimedia, 2022, 25: 3409-3419. [paper]
- Xu, L., Hu, H., Zhang, X., Li, L., Cao, C., Li, Y., et al. CLUE: A Chinese Language Understanding Evaluation Benchmark. In Proceedings of the 28th International Conference on Computational Linguistics. [paper][code]




